Webcast and Easier Tools Aim to Demystify Hadoop

by Ostatic Staff - Jul. 22, 2014

Hadoop is steadily making its way into many enterprises, thanks to its ability to surface unique insights from very large data sets. Its power and success as an open source platform stem directly from the fact that it can perform analytics beyond the reach of traditional analytics platforms. All of this came to the fore at the Hadoop Summit held recently in San Jose, California.

At the same time, there is a real need for a better understanding of how the Hadoop platform works, and of how the tools in the Hadoop ecosystem fit together. These topics are on the agenda for a July 23 webcast from the CTOs of Cloudera and Splunk.

You can sign up for the webcast here, and here are more details on new bridges to understanding and working with Hadoop.

In a recent announcement, Cloudera, Dell and Intel said they will launch a dedicated Dell In-Memory Appliance for Cloudera Enterprise, to be known as Dell Engineered Systems for Cloudera Enterprise. It's an integrated appliance solution intended to make advanced Hadoop-driven analytics easy to implement in data centers. These moves and the upcoming webcast with Splunk are part of Cloudera's effort to make Hadoop more approachable for enterprises.

“The market opportunity for companies to gain insight and build transformative applications based on Hadoop is tremendous,” said Tom Reilly, CEO of Cloudera, in a recent statement. “Clearly, demand is accelerating and the market is poised for growth."

As GigaOM notes, regarding the upcoming webcast:

"The CTOs of Cloudera and Splunk will talk about how their companies are working together to make it easier for everyone in an organization to rapidly explore, analyze and visualize raw unstructured data in Hadoop. You don’t need a Ph.D. in math or computer science to get started and be successful in quickly turning raw big data into refined business insights."

Enterprises interested in demystifying Hadoop may want to look into Splunk's Hunk offering. According to the company:

"Whether you're using Hadoop or NoSQL data stores, getting value and insights out of your data is difficult. Traditional analytics tools aren't designed for the diversity and size of big data sets. And your data's becoming too big to move to separate in-memory analytics stores. In short, gaining meaningful insight can often take months and require specialized skills."

"Hunk goes beyond typical data analysis methods and gives you the power to explore, analyze and visualize data, all from one integrated platform that you can use in minutes, not months. With Hunk you can detect patterns and find anomalies across terabytes or petabytes of raw data in Hadoop and NoSQL data stores without specialized skills, fixed schemas or months of development.
Big data doesn't have to be a science project."

"Whether this is your first Hadoop or NoSQL cluster, or your organization runs production clusters with thousands of nodes, Hunk delivers dramatic improvements in the speed and simplicity of getting insights from big data. To find out more, visit the Hunk tutorial."

Approachable front ends for working with Hadoop, which make it easier to sift through large data sets, are also appearing. Talend, which offers a number of open source middleware solutions, has one, and Microsoft is making it easier to work with Hadoop from the Excel spreadsheet.

Talend Open Studio for Big Data, which provides a front end for easily working with Hadoop to mine large data sets, is available under an Apache license. Meanwhile, Hortonworks has worked with Microsoft to link the Excel spreadsheet to Hadoop, according to Computerworld.

If you or your organization have been interested in working with Hadoop, the tools for doing so are becoming more varied and more approachable. As we noted here, Hadoop skills are highly valued in the tech job market at this point, and we have also written about Hortonworks University, which focuses on teaching Hadoop skills. You can find a class near you and register here.