How streaming can fit into the big data toolbox

While NoSQL databases have gotten their share of attention in recent years, that doesn’t mean developers should rule out running good old SQL queries to get real-time insights from large volumes of incoming data without spending an arm and a leg.

Damian Black, CEO of SQLstream, made the pitch for this streaming approach his company employs at GigaOM’s Structure conference in San Francisco on Wednesday.

When Black talks about streaming, he means running continuous queries over data coming in and shooting out results that change in response to input.
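Conceptually, a continuous query is a standing computation that re-emits an updated result each time a new row arrives, rather than scanning a finished table once. A minimal sketch of the idea in Python (this is illustrative only, not SQLstream’s SQL dialect; the function name and sliding-window semantics are assumptions for the example):

```python
from collections import deque

def continuous_avg(stream, window_size):
    """Emit a running average over the last `window_size` values,
    recomputed as each new value arrives -- the streaming analogue
    of a windowed SQL aggregate that never 'finishes'."""
    window = deque(maxlen=window_size)  # old values fall off automatically
    for value in stream:
        window.append(value)
        yield sum(window) / len(window)

# Results update continuously as input flows in.
readings = [10, 20, 30, 40, 50]
print(list(continuous_avg(readings, window_size=3)))
# -> [10.0, 15.0, 20.0, 30.0, 40.0]
```

In a real streaming engine the input would be an unbounded feed (sensor readings, log events) and consumers would react to each emitted result, but the shape is the same: the query stays resident and the answers change in response to input.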

This method permits lower latency than Hadoop and conventional relational databases, Black said. “But it also uses a lot less hardware and a lot less infrastructure. It’s able to process very high volumes of data (without exorbitantly high costs).”

As Black noted, these options are not necessarily intended for the same use cases. Rather, they can be thought of as complementary. “Stream processing is increasingly important as an extra tool,” Black said.

Check out the rest of our Structure 2013 live coverage here, and a video embed of the session follows below:

[protected-iframe id="e0e786f582e6e38d937737efe59f321b-14960843-61002135" info="" width="640" height="360" frameborder="0" scrolling="no"]
A transcription of the video follows on the next page.