Big data is getting bigger, with some estimates suggesting that 90 percent of all data was created in the last two years alone. That staggering figure can lead to analysis paralysis for some organizations, but those that can sift through, analyze and act on information faster than their rivals will have a competitive advantage. Irfan Khan, the CTO at Sybase, explained at the Structure Big Data conference strategies that can reduce data latency and increase the value of faster decisions. Khan says the challenge lies mainly in three latency areas: data, analysis and decision making all take time between the initial trigger event and the final action event. And as more time passes at each of these hotspots, the value of any decision made from the data is reduced. As if that weren't bad enough, IT shops are slowed by other challenges such as the adoption of employee-owned mobile devices, the mobile commerce revolution and pressure to boost worker productivity.
What's needed to respond to these challenges, according to Khan, are new technology stacks, new deployment models and shifts in both data and application paradigms. For starters, the cost of faster DRAM is now approaching that of high-end storage and should be considered for data. Using column stores can shrink data by several factors, while the adoption of complex event processing (CEP) engines can bring real-time analytics. Data analysis also needs to be pushed down closer to where the data is actually stored (a tighter merging of application stores and data tiers, says Khan), and the use of in-memory analytics can further diminish information latency, leading to faster decision making in the enterprise.
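To see why a column layout can shrink data by several factors, consider dictionary encoding, one common column-store technique. The sketch below is a hypothetical back-of-the-envelope comparison (not Sybase's implementation): a repetitive string column stored value-by-value, as a row store effectively must, versus the same column stored as a small dictionary of distinct values plus one-byte codes.

```python
# Hypothetical sample: a "city" column repeated across three million rows.
rows = ["New York", "London", "Tokyo"] * 1_000_000

# Row-store-style cost: every row carries its own copy of the string.
row_bytes = sum(len(v.encode()) for v in rows)

# Column-store-style cost: each distinct value stored once in a dictionary,
# plus a compact one-byte code per row pointing into that dictionary.
dictionary = sorted(set(rows))
code_of = {v: i for i, v in enumerate(dictionary)}
codes = bytes(code_of[v] for v in rows)
col_bytes = sum(len(v.encode()) for v in dictionary) + len(codes)

print(f"row layout:    {row_bytes:,} bytes")
print(f"column layout: {col_bytes:,} bytes")
print(f"reduction:     {row_bytes / col_bytes:.1f}x")
```

On this toy column the encoded layout is roughly six times smaller; real columnar engines combine dictionary encoding with run-length and bit-packing schemes, which is where the "several factors" of savings come from.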