Report: How to resolve cloud migration challenges in physical and virtual applications

Our library of 1,700 research reports is available only to our subscribers. We occasionally release select reports for our broader audience to benefit from; this is one such report. If you would like access to our entire library, please subscribe here. Subscribers will have access to our 2017 editorial calendar, archived reports and video coverage from our 2016 and 2017 events.
How to resolve cloud migration challenges in physical and virtual applications, by Paul Miller:
Enterprise IT infrastructure largely predates the emergence of cloud computing as a viable choice for hosting mission-critical applications. Although large organizations are now showing real signs of adopting cloud computing as part of their IT estate, most cloud-based deployments still tend to be either for new and self-contained projects or to meet the needs of traditional development and testing functions.
Compatibility, interoperability, and performance concerns have kept IT administrators from being completely comfortable with the idea of moving their complex core applications to the cloud. And without a seamless application migration blueprint, the project can seem like more of a headache – and risk – than it’s worth. This report highlights, for systems administrators, IT directors, cloud architects, and decision-makers at software-as-a-service (SaaS) companies and cloud service providers, the different approaches they can take in moving existing applications to the cloud.
To read the full report, click here.

BlueTalon raises $5M, sets sights on securing Hadoop

BlueTalon, a database security startup that launched last year around the goal of enabling secure data collaboration, has shifted its focus to Hadoop and is set to release its first product. The company has also raised an additional $5 million in venture capital, it announced on Wednesday, from Signia Venture Partners, Biosys Capital, Bloomberg Beta, Stanford-StartX Fund, Divergent Ventures, Berggruen Holdings and seed investor Data Collective.

Eric Tilenius, BlueTalon’s CEO, told me during an interview that the company decided to pivot and focus on Hadoop because it kept running into a “gaping hole” while speaking with potential customers. They had a pressing issue: how to put more data into Hadoop to take advantage of cheap storage and processing, without simultaneously opening themselves up to data breaches or regulatory issues. One large company, he said, had a team in place to manually vet every request to access data stored in Hadoop.

Hadoop is an unstoppable force but security is the immovable object. Companies want to do away with the untold millions or billions of dollars in enterprise data warehouse contracts “that shouldn’t and don’t need to be there,” Tilenius said, but there hasn’t yet been a way to do it easily or efficiently from a security standpoint.


BlueTalon’s software works by letting administrators set up policies via a graphical user interface, and then enforcing those policies across a range of storage engines. Policies can be as fine-grained as the cell level, and can govern both who may access data and how that data may be used. Depending on who’s trying to access what, the software might deny access, mask certain content or perhaps issue a token. It also audits system activity and can let security personnel or administrators see who has been accessing, or trying to access, information.
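
To make the idea concrete, here is a minimal sketch of that kind of policy enforcement. BlueTalon has not published its policy API, so every name and rule below is hypothetical; the point is simply a single policy layer that decides, per user role and per cell, whether to allow, mask, tokenize or deny.

```python
import hashlib

# Hypothetical policy table: (role, column) -> action. Anything not
# listed is denied by default.
POLICIES = {
    ("analyst", "ssn"): "mask",
    ("analyst", "email"): "tokenize",
    ("auditor", "ssn"): "allow",
}

def enforce(role: str, column: str, value: str) -> str:
    """Apply the policy for this role/column; default is deny."""
    action = POLICIES.get((role, column), "deny")
    if action == "allow":
        return value
    if action == "mask":
        # Replace every character so the shape survives but not the data.
        return "*" * len(value)
    if action == "tokenize":
        # A stable token lets joins and counts still work without
        # exposing the raw value.
        return hashlib.sha256(value.encode()).hexdigest()[:12]
    raise PermissionError(f"{role} may not read {column}")

print(enforce("analyst", "ssn", "123-45-6789"))  # masked: all asterisks
print(enforce("auditor", "ssn", "123-45-6789"))  # allowed: raw value
```

A real engine would evaluate these decisions inline against Hive, Impala or JDBC/ODBC traffic rather than in application code, but the deny/mask/tokenize triage is the same shape the article describes.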

The BlueTalon Policy Engine, as the product is called, will be released within the next month and will support Hive, Impala, and ODBC and JDBC environments. The company hopes to add support for the Hadoop Distributed File System by the end of the second quarter.

BlueTalon is hardly the first company or technology to tackle security and data governance in Hadoop, but Tilenius thinks his company has the easiest and most fine-grained approach available. There are numerous open-source projects and both Cloudera and Hortonworks have made security acquisitions recently, but much of that work is focused around higher-level policies or encryption.

However they choose to get it, though, it’s undeniable that companies — especially large ones — are going to want some sort of improved security for Hadoop. We’ll be sure to ask more about it at our Structure Data conference next month, which features technology leaders from Goldman Sachs, Lockheed Martin and more, as well as executives from across the big data landscape.

Update: This post was corrected at 11:21 a.m. PT to correct the day on which BlueTalon announced its funding.

Another big data breach, this time at insurance company Anthem

Anthem, the nation’s second-largest insurance provider, was hit by hackers who stole lots of customer data, including names, birth dates, medical IDs, Social Security numbers, snail-mail and email addresses, and employment information — but allegedly no credit card or medical information, the company said. Although with all that other information out there, that may not be much comfort.

In a letter to customers, Anthem CEO Joseph Swedish acknowledged that his own information was stolen but said there is no evidence that credit card or medical information was compromised. Anthem, formerly known as WellPoint, posted more information here for customers.

Little is known about which of the company’s databases or applications were hijacked, but Anthem said all of its businesses were affected. And there was the usual butt-covering: Swedish said the company “immediately made every effort to close the security vulnerability, contacted the FBI and began fully cooperating with their investigation.” Anthem also characterized the breach as the result of “a very sophisticated external cyber attack.” But, seriously, what else would they say? As a couple of wiseguys on Twitter put it: “It’s better than saying you left the front door open.” Or the keys on the visor.

Anthem also said it hired Mandiant, a sort of cybersecurity SWAT team, to assess its systems and recommend solutions. Cybersecurity specialist Brian Krebs has more on the potential impact.

The topic of the breach came up earlier today during a call in which the White House discussed its interim report on big data opportunities with reporters. The gist was that Anthem appeared to have notified authorities within 30 days of finding the problem, which is what the White House would stipulate in bills it is formulating.

The security of healthcare data is of particular concern — and preserving patient privacy was the impetus behind HIPAA and other regulations. But, as Gigaom pointed out earlier this year, that data security may be as much fiction as fact.

Consolidating digital patient data in one place so that a patient or her doctors can access it spells convenience for authorized users, but that conglomeration of data also offers a compelling target for bad guys.

At this point it would be natural for a given consumer to feel both spooked and jaded by these security snafus. Last year alone, there were major breaches at Target, Home Depot, and JPMorgan Chase, affecting hundreds of millions of people in aggregate.

Hortonworks and 3 large users launch a project to secure Hadoop

Hadoop vendor Hortonworks, along with customers Target, Merck and Aetna, and software vendor SAS, has started a new group designed to ensure that data stored inside Hadoop systems is only used how it’s supposed to be used and seen by whom it’s supposed to be seen. The effort, called the Data Governance Initiative, will function as an open source project and will address the concerns of enterprises that want to store more data in Hadoop but fear the system won’t match industry regulations or stand up to audits.

The group is similar in spirit to the Open Compute Project, which launched in 2011. Facebook spearheaded that effort and drove a lot of early innovation, but the project has seen lots of contributions and involvement from technology companies such as Microsoft and end-user companies such as Goldman Sachs. Tim Hall, Hortonworks’ vice president of product management, said Target, Merck and Aetna will be active contributors to the new Hadoop organization — sharing their business and technical expertise in the markets in which they operate, as well as developing and deploying code.

Among the rationales for creating the Data Governance Initiative were questions about the sustainability of the Hortonworks open-source business model, some of which were brought to light by the revenue numbers it published as part of its initial public offering process, Hall acknowledged. The idea is that this group will demonstrate Hortonworks’ commitment to enterprise concerns and work with large companies to solve them. It will also show how Hortonworks can drive Hadoop innovation without abandoning its open source model.

“We want to make sure folks understand it’s not just these software companies we can work with,” Hall said, referencing the initial phases of Hadoop development led by companies such as Yahoo and Facebook.

A high-level view of Apache Falcon.

Hortonworks plans to publish more information about the Data Governance Initiative’s technical roadmap and early work in February, but the Apache Falcon and Apache Ranger projects that Hortonworks backs will be key components, and there will be an emphasis on maintaining policies as data moves between Hadoop and other data systems. Code will be contributed back to the Apache Software Foundation.
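
For a sense of what "maintaining policies" means in practice, here is a simplified, hypothetical rendering of the kind of table-level access policy Apache Ranger manages for Hive. Ranger's real REST/JSON schema is considerably richer than this; the field names below are illustrative only.

```python
# A hypothetical, stripped-down policy of the sort Apache Ranger
# administers: a resource, plus policy items granting access to groups.
policy = {
    "service": "hive",
    "resources": {"database": "sales", "table": "orders"},
    "policy_items": [
        {"groups": ["finance"], "accesses": ["select"]},
        {"groups": ["etl"], "accesses": ["select", "update"]},
    ],
}

def is_allowed(policy: dict, group: str, access: str) -> bool:
    """Return True if any policy item grants `access` to `group`."""
    return any(
        group in item["groups"] and access in item["accesses"]
        for item in policy["policy_items"]
    )

print(is_allowed(policy, "finance", "select"))  # True
print(is_allowed(policy, "finance", "update"))  # False
```

The governance problem the initiative targets is keeping a definition like this authoritative as the data it covers moves between Hadoop and other systems, rather than re-stating access rules in each engine.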

Hall said any companies are welcome to join — including Hadoop rivals such as MapR and Cloudera, which has its own pet projects around Hadoop security — but, he noted, “It’s up to the other vendors to recognize the value that’s being created here.”

“There’s no reason why Cloudera couldn’t wire up their [Apache] Sentry project to this,” Hall added. “. . . We’d be happy to have them participate in this once it goes into incubator status.”

Of course, Hadoop competition being what it is, he might well suspect that won’t happen anytime soon. Cloudera actually published a well-timed blog post on Wednesday morning touting the security features of its Hadoop distribution.

You can hear all about the Hadoop space at our Structure Data conference in March, where Hortonworks CEO Rob Bearden, Cloudera CEO Tom Reilly and MapR CEO John Schroeder will each share their visions of where the technology is headed.

Update: This post was updated at 12:20 to correct the name of the organization. It is the Data Governance Initiative, not the Data Governance Institute.

Smog computing: Why tech biz should learn from past regulations

Watching the President’s State of the Union and his calling upon Congress to enact legislation to better protect our personal information, I started wondering how much good we’ve actually gotten from regulatory compliance standards like HIPAA and Sarbanes-Oxley already in place. I knew I couldn’t dismiss them as useless, but I wanted to look for a model of regulatory compliance that’s been steeping longer, from which I could draw some parallels to the current state of information privacy.

California’s air quality law — and subsequent state and federal legislation — was an acknowledgement that progress comes at a price. But environmental regulations are themselves an admission that we cannot completely eliminate the dangers associated with modernity; they are instead an attempt to mitigate the risks.

Los Angeles was still something of a sleepy outpost on America’s burgeoning West Coast when oil was discovered there in the 1890s. According to the 1900 U.S. Census not many more than 100,000 people called L.A. home. The oil boom and a growing film industry attracted enough people through the 1910s that the population quintupled to over a half-million by 1920. Growth continued during the next two decades when the concurrent Great Depression and Dust Bowl catastrophes touched off a westward migration that saw hundreds of thousands of people move from America’s Midwest and relocate to California in search of work.

In 1943, emissions from industry and automobiles produced the first reported smog in the city. By 1947, Los Angeles’ toxic air had become so problematic that Governor Earl Warren signed the Air Pollution Control Act, thus beginning the age of environmental law.

You might say the Air Pollution Control Act was the first piece of regulation that endeavored to protect people from “the cloud.”

Information and cloud security can take a lesson (and solace) from the pages of environmental law. Despite early attempts at regulation, air quality grew far worse before it got better. We’ve learned more about the dangers and how to better reduce their effects, and so it is with protecting and managing data.

Protecting data in the cloud

As with environmental stewardship, there should be laws in place that create incentives for implementing strong data security practices in the context of cloud adoption. Once again California took the lead with the passage of the landmark data breach notification law, SB 1386, in 2002. More state and federal laws and industry standards have followed, including HIPAA/HITECH, Gramm-Leach-Bliley, Sarbanes-Oxley, Massachusetts 201 CMR 17, PCI DSS and others.

The problem is that the legislative process unfolds based on the lessons of the past while technology advances with an eye toward the future. Attempts to write novel law that anticipates and remedies the unknown and adverse effects of technological innovation can have unforeseen and detrimental consequences, such as discouraging further innovation or the adoption of needed innovations.

Reliance on regulation can also have the effect of directing resources inefficiently — to appease auditors rather than address problems that need solving. In fact, the nature of the third party audit trade itself would seem to reward a pursuit of repeat business rather than effective compliance.

At the same time, CIOs are paradoxically chartered to be both compliant (ticking the box) and innovative (thinking outside the box). Software-as-a-service (SaaS) adoption is a good example of this double standard. Business units demand tools like Salesforce.com, Marketo and SuccessFactors, but compliance teams and auditors raise red flags over lack of data governance and unclear privacy accountability. To outsiders looking in, the need to keep up with progress is self-evident, but doing so in the context of our current regulatory environment puts those who follow the rules at a clear disadvantage.

Looking back at air pollution regulations, California’s compliance standards actually made it more difficult for Californians to buy low-emission diesel cars in the previous decade, because the idea of a low-emission diesel vehicle was not considered by those who wrote the law. As a result, it was illegal in California to sell cars that polluted less than their standard gasoline-powered equivalents simply because they were diesel-powered. Regulators eventually caught up to the times in 2012, when California passed the LEV (Low Emission Vehicle) III regulations.

It is urgent that CIOs regularly examine the impact of regulation on productivity because, unlike California, enterprise IT can’t afford to wait a decade for compliance to catch up to the needs of the business.

Ionic Security rakes in $40.1M to encrypt your documents

As large-scale data breaches become more commonplace, Ionic Security is betting that encryption is the way to go for enterprises to protect themselves, and it’s taken in a $40.1 million series C funding round to prove its point. The startup has now raised a total of $78.1 million.

The basic premise behind Ionic Security is to secure company data — regardless of file type — through encryption so that an organization’s information can remain safely scrambled from prying eyes in case of a break-in or document leak, explained Ionic Security founder and CTO Adam Ghetti.

The Atlanta-based startup’s technology platform is “all about protecting data in such a way so unauthorized users shouldn’t be able to do unauthorized things,” said Ghetti.

Ghetti didn’t go into details on exactly how Ionic Security’s technology does this, saying that the company is still in stealth mode and plans to explain more of its platform once the product hits general availability and formally launches in the first half of 2015.

Ionic CTO and founder Adam Ghetti

He did say that Ionic Security is different from other encryption-centric security startups out there like Veradocs (which took in $14 million in a November funding round) in that the company wants to make sure that its encryption technology can play nice with the majority of document and file types in existence; organizations should be able to use Ionic Security’s encryption platform regardless of what operating system or device they want to safeguard.

“Ionic is a platform that doesn’t care about the construct of the data itself,” said Ghetti.

Users have to install a small on-premise component that houses their encryption keys (Ionic doesn’t hold those items), and the rest of the encryption technology is delivered as a software-as-a-service, Ghetti explained.
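
That key-separation model can be sketched in a few lines. Ionic hasn't disclosed its design, so everything below is hypothetical, and the "cipher" is a toy HMAC keystream for illustration only; a real product would use vetted AEAD ciphers. The point is the architecture: keys live in an on-premise component, and only ciphertext (plus a key ID) ever reaches the hosted service.

```python
import hashlib
import hmac
import secrets

class OnPremKeyStore:
    """Keys never leave this object; callers handle key IDs, not keys."""

    def __init__(self):
        self._keys = {}

    def new_key(self) -> str:
        key_id = secrets.token_hex(8)
        self._keys[key_id] = secrets.token_bytes(32)
        return key_id

    def _keystream(self, key_id: str, nonce: bytes, length: int) -> bytes:
        # Toy keystream: HMAC-SHA256 in counter mode. Illustrative only.
        key, out, counter = self._keys[key_id], b"", 0
        while len(out) < length:
            out += hmac.new(key, nonce + counter.to_bytes(4, "big"),
                            hashlib.sha256).digest()
            counter += 1
        return out[:length]

    def encrypt(self, key_id: str, plaintext: bytes):
        nonce = secrets.token_bytes(16)
        ks = self._keystream(key_id, nonce, len(plaintext))
        return nonce, bytes(a ^ b for a, b in zip(plaintext, ks))

    def decrypt(self, key_id: str, nonce: bytes, ciphertext: bytes) -> bytes:
        ks = self._keystream(key_id, nonce, len(ciphertext))
        return bytes(a ^ b for a, b in zip(ciphertext, ks))

store = OnPremKeyStore()
kid = store.new_key()
nonce, blob = store.encrypt(kid, b"patient record 42")
# Only `blob` (and the key ID) would be handed to the cloud service;
# the key itself never leaves the on-premise store.
print(store.decrypt(kid, nonce, blob))  # b'patient record 42'
```

Because the service only ever holds opaque blobs, a breach on the SaaS side yields nothing readable without also compromising the customer's on-premise key store.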

Meritech Capital Partners led the funding round along with Kleiner Perkins Caufield & Byers, Google Ventures, Tech Operators and Jafco Ventures. Meritech managing partner Mike Gordon will take a seat on Ionic Security’s board.

Yup, 2014 was a big year in cloud

2014 was the year in which both Microsoft and Google got serious about their public clouds, taking on Amazon Web Services directly with their own infrastructure-as-a-service offerings and associated services.

That both of these companies have extremely deep pockets is not lost on market leader AWS, which continues to roll out new and higher-level services frequently. If you’re a cloud deployer or would-be cloud deployer, AWS re:Invent is a must-attend event.

Long story short: Google and Microsoft have made huge strides, but AWS, with its eight-year head start, remains the cloud to beat. It’s a good time to be a cloud customer, provided you can track the dueling product releases and price cuts and can manage to keep yourself out of the vendor lock-in that afflicted many IT shops in the past few decades.

Cybersecurity fears grow

The counterpoint to all of the above is that 2014 was also a year that saw scary security breaches, including the latest: Anonymous is claiming credit for filching 13,000 passwords and credit card records of users with Sony PlayStation, Microsoft Xbox Live, and Amazon accounts. And if you don’t think these attacks put even more fear of God (and cloud) in corporate IT buyers, you have another think coming.

Data security concerns remain the biggest inhibitor to cloud adoption. This is true even though most IT folks, if they’re being honest, would admit privately that their own on-premises server rooms are hardly paragons of security. But perception is reality, and people are wary of putting valuable data in a cloud they can’t control. This is a problem that will only grow in the new year.

As General Catalyst Managing Partner Steve Herrod wrote recently:

… As bad as 2014 has been, and it has been bad, we’re just seeing the tip of the iceberg. Given the steady increase in value going through our systems (credit card numbers, personal information, IP), organized crime and nation-sponsored attacks will continue to rise in quantity and sophistication.

Cloud turf war

This is worrisome for Jane Q. Consumer, but even more so for big IT vendors, all of whom are trying to woo corporate customers to their respective clouds. Legacy giants IBM, HP, Oracle, VMware, Dell and Microsoft all want to keep their existing customers in house and (dare they hope?) win new customers as well. Their well-founded fear — other than that security fiascos will keep people away from cloud altogether — is that a ton of those workloads are flowing to AWS, which is somehow both an IT upstart and the industry leader in cloud.

Google is the wild card here. The company, which knows a little something about building massively scaled services, is showing itself to be serious about wooing enterprise customers to its cloud as well — although there was some skepticism on that front, which it has allayed somewhat with new peering agreements and VPN options; the hiring of enterprise-savvy execs, most notably former Red Hat CTO Brian Stevens; and a Microsoftian-looking partner program.

Brian Johnson onstage at Google Cloud Platform Live.

The biggest personnel move in cloud this year was the ascension of Satya Nadella to Microsoft CEO after a very public and somewhat painful six-month search. Now even some Microsoft haters see the company as an up-and-comer in cloud. To be fair, most of that hard work was accomplished on former CEO Steve Ballmer’s watch, but Nadella is seen as more pragmatic and much less doctrinaire than his predecessor, who exhibited an almost pathological hatred of all things Apple or Google. Nadella, after all, broke tradition to bring Office to non-Windows devices, a huge departure for the company.

Microsoft CEO Satya Nadella

Microsoft already has enterprise customers and partners in spades, which could help in its hybrid cloud push. Ditto VMware, HP and IBM. But public cloud kingpin AWS is making a push into that hybrid scenario with products targeting VMware admins and their Windows counterparts.

Structure Show: Docker, Docker, Docker

We didn’t do a show on this holiday week, but check out our last podcast with Docker CEO Ben Golub if you haven’t already. Golub addresses how the competitive landscape has shifted with the CoreOS decision to launch Rocket, a container runtime of its own.


Watchdog group urges FTC to scrutinize latest Oracle deal

The Center for Digital Democracy wants the Federal Trade Commission to really look over Oracle’s proposed acquisition of Datalogix, announced Monday morning.

The combined consumer data gathered by Datalogix (via partnerships with Facebook, Twitter and other sources), along with Oracle’s earlier buyout of BlueKai, another data broker, may give Oracle just a little too much consumer data for the public’s own good, according to the CDD.

In a quick phone interview, CDD Executive Director Jeff Chester said the regulatory agency needs to look at data privacy issues when considering the competitive aspects of such M&A activity. The CDD has a list of M&A deals that it says indicates a hastening consolidation of data-aggregating companies, including Acxiom’s buy of LiveRamp in May; Adobe Systems’ acquisition of Neolane in June 2013; and Oracle’s acquisitions of Eloqua in December 2012 and Responsys a year later.

In an earlier emailed statement, Chester said:

“Through the data it gathers on what we buy, and with its relationship with Facebook and other powerful marketers, Datalogix consists of an online treasure trove of data on Americans. The Oracle deal announced today follows its recent acquisition as well of Bluekai, which holds reams of information on consumers.”

The CDD also pointed out that under a previous settlement with the FTC, Facebook agreed to obtain consumers’ permission before sharing their data. The specter of that agreement surfaced when Facebook bought WhatsApp.

Chester also said:

“[Given] the FTC’s 20-year consent decree with Facebook, and the role that Datalogix plays with the social network, it also must review whether the deal requires additional safeguards under that decree. The growing consolidation of information on every American and whatever we do — regardless of location, time of day, whether we are online or off — should trigger action, as well as soul searching by both policymakers and the public.”

In August the CDD asked the FTC to look into the practices of both data brokers — including Datalogix and Acxiom — and marketing software companies like Salesforce.com, to make sure they were complying with the Safe Harbor framework between the U.S. and European Union. That safe harbor provision lets these U.S. vendors “self-certify” that they adhere to strong data protection standards.

Oracle hopes to use that data to inform its marketing automation software and “data cloud.” In that arena it competes with Salesforce.com, Adobe Systems and others.

Oracle had no comment for this story.

Note: this story was updated at 11:25 a.m. PST with Jeff Chester’s comment and again at 11:51 a.m. PST with CDD’s list of data aggregator-linked mergers and acquisitions.

Apple pays on time

Contrary to its popular image, Apple’s successes have more often come from fixing what’s broken than from building anew. And that puts today’s credit card payment system right in Apple’s wheelhouse.