
The Big Promise of Big Data

By Joab Jackson, IDG News Service on Mar 15, 2012

For Twitter, making sense of its mountains of user data was a big enough problem that it purchased another company just to help get the job done. Twitter's success depends entirely on how well it exploits the data its users generate, and it has a lot of data to work with: more than 200 million accounts, which generate 230 million Twitter messages a day.

Last July, the social networking giant purchased BackType, whose Storm software could parse live data streams such as millions of Twitter feeds. After the acquisition, Twitter released Storm's source code, having no interest in commercializing the product itself.

Storm is valuable for Twitter's own operations because it can identify emerging topics as they unfold, in real time, on the company's service. For instance, Twitter uses the software to calculate, in real time, how widely Web addresses are shared across multiple Twitter users.

Such a job "is a really intense computation, which could involve thousands of database calls and millions of follower records," said Nathan Marz, Twitter lead engineer for Storm, who explained the technology in December at a New York conference held by Big Data software vendor DataStax.

Using a single machine, computing the reach of a Web address could take up to 10 minutes. But spread across 10 machines, Marz explained, it could execute in as little as a few seconds. For a company that makes money selling ads against emerging trends, the faster operation can be crucial.
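
Conceptually, the reach calculation Marz describes boils down to finding the distinct followers of everyone who shared a given URL, a job that parallelizes naturally across workers. The following is a minimal single-process sketch of that idea in Python; the lookup tables and function names are hypothetical stand-ins for Twitter's real follower data, not Storm's actual code.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical in-memory stand-ins for Twitter's real data services.
tweeters_of = {"http://example.com/story": ["alice", "bob", "carol"]}
followers_of = {"alice": {"dan", "erin"}, "bob": {"erin", "frank"}, "carol": {"gina"}}

def partial_reach(users):
    """Union the follower sets for one partition of the people who shared the URL."""
    reach = set()
    for user in users:
        reach |= followers_of.get(user, set())
    return reach

def reach(url, partitions=4):
    """Reach = number of distinct followers across everyone who shared the URL.
    Each partition could run on a different machine; a Storm topology does the
    same kind of split-and-merge, but continuously and across a cluster."""
    users = tweeters_of.get(url, [])
    chunks = [users[i::partitions] for i in range(partitions)]
    with ThreadPoolExecutor(max_workers=partitions) as pool:
        partials = pool.map(partial_reach, chunks)
    return len(set().union(*partials))

print(reach("http://example.com/story"))  # -> 4 distinct followers reached
```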

Like Twitter, many organizations are finding that they have a great deal of data on hand, and that the data could potentially be used to maximize profits and improve efficiency -- if they can organize and analyze it quickly enough. This pursuit, made possible by a number of new, mostly open source technologies, is often referred to as big data.

"It absolutely gives us a competitive advantage if we can better understand what people care about and better use the data we have to create more relevant experiences," said Aaron Batalion, chief technology officer for online shopping service LivingSocial, which uses technologies such as the Apache Hadoop data processing platform to glean more information about what their users want.

"The days are over when you build a product once and it just works," Batalion said. "You have to take ideas, test them, iterate them, use data and analytics to understand what works and what doesn't in order to be successful. And that's how we use our big data infrastructure."

Big data getting bigger

Last May, consulting firm McKinsey & Company issued a report that anticipated how organizations would be deluged with data in the years to come. The firm also predicted that a number of industries -- including health care, the public sector, retail, and manufacturing -- would benefit from analyzing their rapidly growing mounds of data.

Collecting and analyzing transactional data will give organizations more insight into their customers' preferences. It can be used to better inform the creation of products and services, and allow organizations to remedy emerging problems more quickly.

"The use of big data will become a key basis of competition and growth for individual firms," the report concluded. "The use of big data will underpin new waves of productivity growth and consumer surplus."

Of course, Teradata, IBM and Oracle, among many others, have been offering terabyte-scale data warehouses for more than a decade. These days, however, data tends to be collected and stored in a wider variety of formats and can be processed in parallel across multiple servers -- a necessity given the amounts of information being analyzed. In addition to exhaustively maintained transactional data from databases and carefully culled data residing in data warehouses, organizations are also reaping untold amounts of server log data and other machine-generated data, customer comments from internal and external social networks, and other sources of loose, unstructured data.

"Traditional data systems simply don't handle big data very well, either because they can't handle the variety of data -- today's data is much less structured because it evolves very quickly, and because [such systems] just cannot scale at the rate it which they must ingest data," said Eric Baldeschwieler, chief technology officer of Hortonworks, a Yahoo spinoff company that offers a Hadoop distribution.

Such data is growing at an exponential rate, thanks to Moore's Law, pointed out Curt Monash of Monash Research. Moore's Law states that the number of transistors that can be placed on an integrated circuit doubles approximately every 18 months, making each new generation of processors roughly twice as powerful as its predecessor. Not surprisingly, the power of new servers doubles on roughly the same schedule, which means their activities generate correspondingly larger datasets as well.

The big data approach represents a major shift in how data is handled, said Jack Norris, vice president of marketing for MapR. Before, carefully culled data was piped through the network to a data warehouse, where it could be further examined. With increasing amounts of data, however, "the network becomes the bottleneck," he said. Distributed systems such as Hadoop allow the analysis to occur where the data resides.

Instead of creating a clean subset of user data to place in a data warehouse, where it can be queried in only a limited number of predetermined ways, big data software collects all the data an organization generates and lets administrators and analysts worry about how to use it later. In this sense, such systems are more scalable than traditional databases and data warehouses.
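
Hadoop's MapReduce model is the usual way this "analyze it where it sits" approach is put into practice: small map and reduce programs are shipped to the nodes that hold the data. Below is a minimal sketch in the style of Hadoop Streaming, where the mapper and reducer are plain scripts reading standard input; the log layout (tab-separated fields with a URL in the third position) is an assumption for illustration, not a real Twitter or Yahoo format.

```python
#!/usr/bin/env python3
# Minimal Hadoop Streaming-style mapper and reducer: count shares per URL.
# In a real job each function lives in its own script and reads sys.stdin.
import io

def mapper(lines):
    """Emit 'url<TAB>1' for every log line that contains a URL field."""
    for line in lines:
        fields = line.rstrip("\n").split("\t")
        if len(fields) >= 3:
            yield f"{fields[2]}\t1"

def reducer(lines):
    """Hadoop sorts map output by key, so all counts for a URL arrive together."""
    current, total = None, 0
    for line in lines:
        url, _, n = line.partition("\t")
        if url != current:
            if current is not None:
                yield f"{current}\t{total}"
            current, total = url, 0
        total += int(n)
    if current is not None:
        yield f"{current}\t{total}"

if __name__ == "__main__":
    # Tiny local simulation of the map -> sort -> reduce pipeline.
    logs = io.StringIO("u1\t2012-03-15\thttp://a.example\n"
                       "u2\t2012-03-15\thttp://a.example\n"
                       "u3\t2012-03-15\thttp://b.example\n")
    for out in reducer(sorted(mapper(logs))):
        print(out)  # http://a.example 2, http://b.example 1
```

In an actual Hadoop Streaming job the two scripts would be handed to the streaming jar with its -mapper and -reducer options, and Hadoop would run them on the nodes holding the data blocks instead of pulling the raw logs across the network to a central warehouse.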

How the Internet spurred big data

In many ways, giant online service providers such as Google, Amazon, Yahoo, Facebook and Twitter have been on the cutting edge of learning how to make the most of such large data sets. Google and Yahoo, among others, had a hand in developing Hadoop. Facebook engineers first developed the Apache Cassandra distributed database, also open source.

Hadoop got its start from Google research papers, most notably a 2004 white paper describing MapReduce, the infrastructure Google built to process data in parallel across many different servers. Google kept the software for internal use, but Doug Cutting, a developer who had already created the Lucene open source search engine, built an open source implementation, naming the technology after his son's stuffed elephant.

One early adopter of Hadoop was Yahoo. The company hired Cutting and started dedicating large amounts of engineering work to refining the technology around 2006. "Yahoo had lots of interesting data across the company that could be correlated in various ways, but it existed in separate systems," said Cutting, who now works for Hadoop distribution provider Cloudera.

Yahoo is now one of Hadoop's biggest users, deploying it on more than 40,000 servers. The company uses the technology in a variety of ways. Hadoop clusters hold massive log files of what stories and sections users click on. Advertisement activity is also stored on Hadoop clusters, as are listings of all the content and articles Yahoo publishes.

"Hadoop is a great tool for organizing and condensing large amounts of data before it is put into a relational database," Monash said. The technology is particularly well suited for searching for patterns across large sets of text.

Another big data technology that got its start at an online service provider was the Cassandra database. Cassandra can store millions of columns in a single row, making it handy for appending more data onto existing user accounts without knowing ahead of time how the data will be structured.

Using a Cassandra database is also advantageous in that it can be spread across multiple servers, which lets organizations easily scale their databases beyond a single machine, or even a small cluster of machines.
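
The append-as-you-go pattern looks roughly like the sketch below, written against CQL using the open-source DataStax Python driver. The keyspace, table and column names are hypothetical, and in CQL terms a "wide row" is a partition that accumulates many clustering rows under one key.

```python
from cassandra.cluster import Cluster  # pip install cassandra-driver

# Connect to one local node; the driver discovers the rest of the cluster.
session = Cluster(["127.0.0.1"]).connect("demo")  # the 'demo' keyspace is assumed to exist

# One partition per user; each new event simply appends under that partition,
# with no need to predefine which attributes will show up later.
session.execute("""
    CREATE TABLE IF NOT EXISTS user_activity (
        user_id    text,
        event_time timeuuid,
        attribute  text,
        value      text,
        PRIMARY KEY (user_id, event_time, attribute)
    )
""")

session.execute(
    "INSERT INTO user_activity (user_id, event_time, attribute, value) "
    "VALUES (%s, now(), %s, %s)",
    ("user42", "last_search", "running shoes"),
)

# Everything stored for a user comes back from a single partition,
# and the partition keeps growing as new kinds of data are appended.
rows = session.execute(
    "SELECT attribute, value FROM user_activity WHERE user_id = %s", ("user42",)
)
for row in rows:
    print(row.attribute, row.value)
```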

Cassandra was developed by social-networking giant Facebook, which needed a massive distributed database to power the service's inbox search, said Jonathan Ellis, the Apache Cassandra project chairman and cofounder of DataStax, a company that now offers professional support for Cassandra.

Like Yahoo, Facebook wanted to use the Google Bigtable architecture, which provided a column-and-row-oriented database structure that could be spread across a large number of nodes. Bigtable's limitation, however, was its master-node-oriented design: the whole operation depended on a single node to coordinate read and write activities across all the other nodes. If the head node went down, the whole system was unusable.

"That's not the best design. You want one where if one machine goes down, the others keep going," Ellis said.

So Ellis and his peers built Cassandra using a distributed architecture developed by Amazon, called Dynamo, which Amazon engineers described in a 2007 paper. Amazon first developed Dynamo to keep track of what its millions of online customers were putting in their shopping carts.

The Dynamo design is not dependent on any one master node. Any node can accept data for the whole system, as well as answer queries. Data is replicated across multiple hosts.
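
A toy way to illustrate that masterless design is a consistent-hash ring: every key maps to a point on a ring and is stored on the next few distinct nodes found clockwise from that point, so any single machine can fail without taking down a central coordinator. This is a conceptual sketch, not Dynamo's or Cassandra's actual code.

```python
import bisect
import hashlib

def ring_position(value: str) -> int:
    """Hash a node name or data key onto a fixed-size ring."""
    return int(hashlib.md5(value.encode()).hexdigest(), 16) % (2 ** 32)

class HashRing:
    def __init__(self, nodes, replicas=3):
        self.replicas = replicas
        self.ring = sorted((ring_position(n), n) for n in nodes)

    def nodes_for(self, key):
        """Replica set for a key: the next `replicas` distinct nodes clockwise."""
        positions = [p for p, _ in self.ring]
        i = bisect.bisect(positions, ring_position(key)) % len(self.ring)
        chosen = []
        while len(chosen) < min(self.replicas, len(self.ring)):
            node = self.ring[i % len(self.ring)][1]
            if node not in chosen:
                chosen.append(node)
            i += 1
        return chosen

ring = HashRing(["node-a", "node-b", "node-c", "node-d"])
print(ring.nodes_for("user42:cart"))  # the key lives on 3 of the 4 nodes
# If one replica is down, the other two still serve reads and writes;
# there is no master whose failure takes the whole system offline.
```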

To the enterprise

The good news is that many of the tools first developed by these online service providers are becoming available to enterprises as open source software. These days, big data tools are being tested by a wider range of organizations beyond the large online service providers. Financial institutions, telecommunications providers, government agencies, utilities, retailers, and energy companies are all testing big data systems, Baldeschwieler noted.

"There is an air of inevitability" with Hadoop and big data implementations, he said. "It's applicable to a huge variety of customers."

So how does an organization start to use its heaps of machine-generated and social networking data?

Perhaps surprisingly, setting up the infrastructure will not be the biggest challenge for the CIO. Vendors such as Cloudera, Hortonworks and MapR are commercializing big data technologies, in effect making them easier to deploy and manage.

Rather, finding the right talent to analyze the data will be the biggest hurdle, according to Forrester Research analyst James Kobielus.

Organizations will "have to focus on data science," Kobielus said. "They have to hire statistical modelers, text mining professionals, people who specialize in sentiment analysis."

Big data relies on solid data modeling, Kobielus said. "Statistical predictive models and text analytic models will be the core applications you will need to do big data," he said.

Many are predicting that big data will bring about an entirely new sort of professional, the data scientist. This would be someone with a deep understanding of mathematics and statistics who also knows how to work with big data technologies.

These people may be in short supply. By 2018, the United States alone could face a shortage of 140,000 to 190,000 people with deep analytical skills, as well as 1.5 million managers and analysts with the know-how to use big data analysis to make effective decisions, McKinsey & Company estimated.

Despite these challenges, organizations need to forge ahead just to stay competitive and efficient, said MapR's Norris. As an example, he pointed to Google, which entered the field of Internet search years after its competitors, only to dominate the market within two years.

"A lot of that was due to the advantages of Google's back-end architecture," Norris said. Big data "is a big paradigm shift that has the potential to change industries."

