The disruptive impact of in-memory computing

As hardware pushes ever closer to the physical limits of signal speed, previously unseen bottlenecks are surfacing. For a couple of years now, storage vendors have been selling systems that place SSDs (solid-state drives, essentially large-capacity flash memory units with higher throughput) at critical points in the data path. As SSD prices continue to drop, expect that trend to accelerate.
 
Speed gains now appear across the technology stack. Processors ramp up their clock speeds, and multi-core designs multiply throughput. It all adds up to more data to move, store and analyze.
 
Flash memory is now used to improve the performance of many devices and applications. Gartner VP David Cearley expects to see a huge jump in the use of flash memory in consumer devices, entertainment equipment and other embedded IT systems.
 
"In addition, flash offers a new layer of the memory hierarchy in servers and client computers that has key advantages - space, heat, performance and ruggedness among them," said Cearley. "Unlike RAM, the main memory in servers and PCs, flash memory is persistent even when power is removed. In that way, it looks more like disk drives where we place information that must survive power-downs and reboots, yet it has much of the speed of memory, far faster than a disk drive."
 
Cearley added that software is critical to unlock these advantages. "As lower-cost - and lower-quality - flash is used in the data center, software that can optimize the use of flash and minimize the endurance cycles becomes critical," he said. "Users and IT providers should look at in-memory computing as a long-term technology trend that could have a disruptive impact comparable to that of cloud computing."
  
In-memory as a technology has come into focus over the last two to three years. However, there is nothing new about it, said Surya Mukherjee, lead analyst for Ovum's information management team. "Any memory faster than a disk drive, which has moving parts, will speed the process of data storage, retrieval and analysis," he said.
 
"As RAM and SSDs become cheaper, it's now possible for organizations to load an entire database onto fast memory and benefit from low-latency transactions," Mukherjee explained. "The popularity of in-memory analytics is closely linked to advances in hardware technology, such as 64-bit computing, multi-core processor, and improvements in processor speed. Technical advancements in these areas help vendors optimize the use of memory and speed up data processing performance."
 
Mukherjee said that "software applications using the memory should be optimized to deliver desired features within the application tier and work with the data in-memory. Also, the compression used within in-memory databases and applications should be optimized to perform read/write without the need for significant decompression of data."
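One common technique behind that requirement is dictionary encoding: a column's distinct values are stored once, each row holds a small integer code, and predicates are evaluated against the codes directly, so a scan never has to decompress the column. The sketch below illustrates the general idea; it is not a description of any particular vendor's implementation.

```python
# Dictionary encoding: store each distinct value once and keep the column
# itself as small integer codes. A filter can then run on the codes alone,
# with no need to decompress the column back into full strings.

column = ["DE", "US", "US", "FR", "DE", "US", "FR", "DE"]

# Build the dictionary (distinct values) and the encoded column.
dictionary = sorted(set(column))                 # ['DE', 'FR', 'US']
code_of = {value: code for code, value in enumerate(dictionary)}
encoded = [code_of[value] for value in column]   # [0, 2, 2, 1, 0, 2, 1, 0]

# The predicate "country = 'US'" becomes one integer comparison per row:
# the predicate value is encoded once, and the scan never touches strings.
target = code_of["US"]
matching_rows = [i for i, code in enumerate(encoded) if code == target]
print(matching_rows)  # [1, 2, 5]
```

Because the comparisons run on integers rather than strings, the encoded scan is both smaller in memory and cheaper per row, which is the "read without significant decompression" property Mukherjee describes.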
 
Which vendors are racing to bring their in-memory products to market?
  
In December 2010, SAP launched HANA: software built around an in-memory computing engine that holds data in RAM instead of reading it from disk or flash storage, thus providing a performance boost. SAP intends for HANA boxes to be attached to its own ERP systems, sucking in and analyzing transactional data in real time. However, because HANA's data access is source-agnostic, it can work with any information source.
 
In a research note last October, Gartner's Massimo Pezzini and Daniel Sholler wrote: "SAP has started a new generation of application infrastructure and architecture efforts focused on cloud and in-memory. The new vision will force the competition to respond; but addressing its potentially disruptive impact on SAP's most-conservative customers' requirements will be a key challenge."
 
The Gartner analysts described SAP's strategy as "bold" and "very ambitious," adding that, "The HANA architecture is a work in progress, and will undergo several significant changes before it is completed. This potentially exposes SAP users to challenges for migrating to, and integrating with, different technology generations."
 
Although the plan is still rough around the edges and incomplete, the two analysts said, "SAP's strategy is likely to shake up the application infrastructure market and put extraordinary competitive pressure on the megavendors, including IBM, Microsoft and Oracle, as well as on application infrastructure specialists, such as Red Hat-JBoss, Tibco Software, Software AG and VMware."
 
In various public statements, SAP has made it clear that the long-term goal is for HANA to replace other databases - especially rival Oracle's offering - that are now running its applications, including the flagship Business Suite.
 
Constellation Research analyst Ray Wang agreed that SAP's strong play targeting the database market will hurt Oracle, but added that the latter should not worry just yet.
 
"If you're really serious about undercutting Oracle, you make a database," Wang said. "But Oracle's got so big because of all the acquisitions, you can't just cut them off on databases, although that's one big piece. So this is just natural competition between the tech vendors."
  
Oracle sets sail
 
Oracle CEO Larry Ellison hasn't always been a supporter of in-memory technology. According to CIO Magazine, at an event in early 2010, Ellison said: "This is nonsense. There is no in-memory technology anywhere near ready to take the place of a relational database. It's a complete fantasy on [SAP's] part."
  
That was then. Oracle now intends to ship its own in-memory-powered appliance, Exalytics, later this year. Both Exalytics and HANA incorporate in-memory databases, providing a performance boost over systems that read and write data from disks. The Exalytics In-Memory Machine X2-4 consists of a single server with 1TB of RAM and four Intel Xeon E7-4800 processors, each with 10 cores, according to an Oracle whitepaper. Exalytics machines can also be clustered together.
 
Exalytics also "supports the broad portfolio of Oracle BI and EPM applications right out of the box," and customers who have existing applications built with Oracle BI Enterprise Edition and Essbase can migrate them to Exalytics without changes, according to the company. But requisite support and software isn't priced in.
 
Oracle's second-quarter results, which showed a 14% drop in hardware systems product revenue, may have accelerated the release schedule for both Exalytics and the Big Data Appliance. Oracle is under pressure from investors to boost the hardware business, and its leadership is no doubt looking to give sales teams as many products as possible to push as its fiscal year draws to a close.
 
Meanwhile, Oracle shares many customers with SAP and some of those may be evaluating HANA. While SAP plans to position HANA over time for transactional as well as analytic workloads, the rivalry between the two companies' products is difficult to deny.
 
Most software vendors in the business analytics and database market will incorporate in-memory technology somewhere in their product range, said Sharon Tan, IDC Asia-Pacific's research manager for information management.
 
She noted that in-memory offerings include SAP's HANA and SAS's High-Performance Analytics.
 
The IDC analyst said that the higher price of flash memory would initially hold back user growth. "IDC expects that smaller vendors offering software that incorporates niche analytic capabilities will be seen as acquisition targets as large vendors seek to beef up their portfolio with complementary and proven capabilities," she said.
 
Tan said that SAP has announced plans to expand its reach in the retail industry by integrating products from SAF Simulation, Analysis and Forecasting AG, a vendor it acquired in January. "SAF's software provides automated ordering and forecasting that helps enable more accurate demand prediction and automates the ordering process, mainly for the retail industry," she said.
 
Stefan Hammond is technology editor of the enterprise group at Questex Media Asia.