Real-time data queries designed to support decision making
Today’s managers face two challenges that make business increasingly complex, especially against the backdrop of volatile market conditions. The first is that the amount of data doubles roughly every 18 months, the very foundation of information overload: data exchange keeps increasing, driving up both the pace and the number of processes. It is therefore no trouble at all to find information; the real challenge is weeding out the right information in order to build a useful body of knowledge. The second challenge is that innovation cycles are constantly shortening, which makes it difficult for virtually every industry simply to keep up: what is state of the art today can be obsolete tomorrow. This is why the speed at which decisions are made is paramount.
Bottom line: Decision makers need to run accurate queries in real time to avoid the traps of volatility. This facilitates decision making both in daily business and in identifying long-term trends.
Yet there are many hindrances, such as information overload and unstable market conditions that render query results invalid, meaning the output is obsolete by the time it arrives. There is, however, an antidote: in-memory computing. In-memory computing is a quantum leap in making both hardware and software ready for real-time business analyses, which can also be used for simulations.
What can be inferred?
Speed coupled with optimized decisions
Queries that used to require days can now be carried out in minutes. This means executives can ask follow-up questions. A classic example: a manager wants to identify the worst-selling product, and the answer is Product A. The next query asks why Product A is not selling the way it should, and the answer is supply chain problems caused by bad weather. All of this happens in real time. If a manager has access to this information within minutes, they can take immediate action to circumvent the shortcomings, for example by arranging alternative approaches that avoid the weather obstacles.
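The drill-down described above can be sketched as two chained queries over order data. This is a minimal illustration in plain Python; the data and field names are hypothetical, not taken from a real SAP system.

```python
# Hypothetical drill-down: find the worst-selling product, then ask why.
orders = [
    {"product": "A", "units": 10, "delay_cause": "weather"},
    {"product": "A", "units": 12, "delay_cause": "weather"},
    {"product": "B", "units": 90, "delay_cause": None},
    {"product": "C", "units": 75, "delay_cause": None},
]

# Query 1: which product sells worst?
totals = {}
for o in orders:
    totals[o["product"]] = totals.get(o["product"], 0) + o["units"]
worst = min(totals, key=totals.get)

# Query 2 (follow-up): why is that product underperforming?
causes = {o["delay_cause"] for o in orders
          if o["product"] == worst and o["delay_cause"]}

print(worst)   # A
print(causes)  # {'weather'}
```

The value of real-time analytics lies in how quickly this second, follow-up query can be answered once the first result is known.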
How does in-memory computing work?
Manage data in columns in main memory
In-memory computing enables users to analyze enormous volumes of data quickly and cost-efficiently while reducing overall IT-landscape complexity. Under normal conditions, obtaining detailed insights into data would take far too long; in-memory computing makes it possible to perform flexible analyses within minutes, compared with the traditional, rigid database approach (a classic BW system). This is possible because of parallelization, column orientation, and main memory; furthermore, complex data-warehouse data compression is not necessary.
Parallelization – breaks tasks into pieces and processes them in parallel (multi-threading).
Columns – in-memory computing manages data in columns instead of rows only. This reduces data handling and minimizes field redundancies: fewer fields are touched, sticking to the essentials, which yields processing speeds 10 to 50 times faster.
Redundant content (for example, a repeated date) is stored once, and all records refer to the same memory address. The result is less data carrying the same information.
Main memory – in-memory computing manages data in main memory instead of only on hard disk. This further improves processing power, since access to main memory is about 120 times faster than access to hard disk; compressed data combined with main memory also has a low cost/size ratio, which makes the approach even more attractive. The hard disk is not eliminated, however: it is still required to save everything. The division of labor is simple: data is saved on hard disk (the persistency layer, which ensures data integrity) and accessed via main memory for analysis. Case in point: one million orders are placed in the system on 01.01.2000. Only one date entry is created in memory instead of one million on the hard disk, and the connection between the involved data remains valid because all orders reference the same memory address. When a query runs, a filter has to look at only one entry in memory instead of one million on hard disk.
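The one-million-orders case above can be sketched as a dictionary-encoded column: the repeated date is stored once, and each order holds only a small integer code referencing it. This is a minimal illustration of the general technique in Python, not how SAP HANA is actually implemented.

```python
# Minimal sketch of dictionary encoding for a column store (illustrative only).
# One million orders share the date "01.01.2000"; the column stores the date
# once in a dictionary and keeps one small integer code per order.
orders_dates = ["01.01.2000"] * 1_000_000  # row-style: the date repeated 1M times

dictionary = []   # distinct values, each stored exactly once
codes = []        # one small code per row, referencing the dictionary
positions = {}    # value -> its position in the dictionary
for value in orders_dates:
    if value not in positions:
        positions[value] = len(dictionary)
        dictionary.append(value)
    codes.append(positions[value])

# A filter on the date now resolves the value against a single dictionary
# entry instead of comparing one million strings.
code = positions["01.01.2000"]

print(len(dictionary))  # 1 -> the date exists once in memory
print(len(codes))       # 1000000 -> one tiny reference per order
```

The equal-information, less-data effect comes from replacing a million identical strings with a million small codes pointing at one stored value.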
Bottom line: The data compression (columns instead of rows only), together with managing data in main memory instead of on hard disk, provides remarkable improvements. Yet row-oriented and column-oriented approaches are not at odds; rather, they complement each other to form the most suitable solution: BW systems offer advantages for enormous data amounts within complex data integration, whereas in-memory offers advantages for scenarios that require quick responses to business analytics queries. Furthermore, in-memory covers both analytical measures for business information: OLAP (Online Analytical Processing) and data mining.
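The parallelization point above can be sketched with Python’s standard thread pool: a scan over a large column is split into chunks that are aggregated on separate worker threads, and the partial results are then combined. This only illustrates the decomposition pattern; real in-memory engines parallelize in native code across CPU cores.

```python
from concurrent.futures import ThreadPoolExecutor

# Illustrative parallel scan: split a column into chunks, aggregate each
# chunk on its own worker thread, then combine the partial results.
column = list(range(1_000_000))  # stand-in for a column of order values

def partial_sum(chunk):
    return sum(chunk)

n_workers = 4
size = len(column) // n_workers
chunks = [column[i * size:(i + 1) * size] for i in range(n_workers)]

with ThreadPoolExecutor(max_workers=n_workers) as pool:
    total = sum(pool.map(partial_sum, chunks))

print(total == sum(column))  # True
```

Note that in CPython the global interpreter lock limits true parallelism for pure-Python work; the sketch shows how a query is broken into independent pieces, which is what allows multi-core hardware to be exploited.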
What is in it for customers?
Speed, better decision making, and lowered costs
✔Real-time analyses at the speed of thought
✔Faster/optimized decision making
✔More agility and flexibility for business analytics and simulations
“We are witnessing the dawn of a new era in enterprise business computing, defined by the near instantaneous availability of critical information that will drive faster decision making, new levels of business agility, and incredible personal productivity for business users.” Bill McDermott (Co-CEO, SAP)
What is in it for SAP?
Performance breakthrough with improved business application calculation processes
SAP’s in-memory computing technology is called SAP HANA. SAP HANA offers SAP a significant advantage: the ability to follow a parallelism (multi-threading) approach. Until now, SAP’s dominant platform, ABAP, has offered a very stable solution for its customers, but it is database-independent and therefore not technologically designed to process parallelized in-memory analytics. With SAP HANA, SAP can exploit its full potential and take its new software developments to the next level. This will enable SAP to collaborate with its customers more agilely and flexibly and to launch even better solutions. SAP HANA can therefore be the foundation of a sustainable strategy that builds on multi-threading to leverage the multi-core hardware trend.
Bottom line: SAP and its customers will greatly benefit from SAP HANA, particularly when this new technology joins the SAP portfolio.
What are industry-specific examples of in-memory computing?
Healthcare: Health insurance marketers want to identify patient patterns so that preventive measures for certain target groups can be carried out.
Consumer: Telesales marketers segment their customers by purchase frequency and amount. The result is a ranking that determines individual services, including cooperation marketing. Tailored promotions can then be implemented to boost sales, and this can be done frequently rather than only at the end of the year.
Automotive: Car marketers seek to identify when most people visit car dealerships. If the answer is Saturday and Sunday, then outdoor displays and other measures can generate useful customer insights even after opening hours (click analysis of the displays).
Food: Food marketers strive to determine why the price of whey protein is increasing. The answer is weather problems in the supplying countries combined with the fact that Asian families have started feeding their children baby food, whose main ingredient is whey: higher demand means a higher price.
Manufacturing: A manufacturing marketer wants to investigate machine failures by identifying common properties shared by subsets of machines with a higher failure rate. This feedback can be integrated into new product development.
Banking: Banking marketers compare current financial scenarios with the most similar historic ones in order to forecast accurately, for example when a company publishes its first-quarter results. Subsequently, what-if simulations can be calculated and compared.
Public: Public-sector planners want to identify the likelihood of traffic jams and accidents caused by traffic lights. Several scenarios can then be compared in order to find the most suitable solution.