
Big Data in the Zettabyte Era

Issue: North America I 2014
Article no.: 14
Topic: Big Data in the Zettabyte Era
Author: Mike Hummel
Title: Co-Founder & CEO
Organisation: ParStream

About author

Mike Hummel is the co-founder and CEO of ParStream. He previously co-founded Empulse, a portal solutions and software consulting company now specialising in Web 2.0 projects. Mike began his career at Accenture in Germany, managing large-scale software integration projects for logistics organisations.

He holds a degree in Electrical and Electronic Engineering from the University of Hatfield, UK; a degree in Computer Integrated Manufacturing from Cranfield University, UK; and a diploma in Technical Computer Science from Esslingen, Germany.

Article abstract

“Forget size, speed is what matters” – fast data means not just swift downloads, but also timely information drawn from dynamic data. Big data analysis has been hampered by slow tools and procedures that transfer data to central repositories instead of analysing it locally, in real time. Performing analytics at each cell tower, for example, avoids transporting the vast volume of accumulated records and provides fast responses to current network events. Such timely analytics offer fresh insights into users’ behaviour and needs, enabling communications service providers to compete for every customer by improving their experience.

Full Article

Benjamin Franklin said that there are only two certainties in life: death and taxes. That famous quote may need to be updated for the 21st century. One sure-fire certainty in today’s ‘always connected’ world is that there’s always data being generated. And lots of it. Harnessing the power of this data can create amazing opportunities for businesses, but it can also be their hardest challenge yet.

Here’s a quick snapshot of what happens online in 60 seconds: 2,000,000 Google searches; 680,000 Facebook updates; 300,000 tweets; and American consumers spending an estimated US$272,000 shopping online. If all of that happens within the span of a minute, imagine the amount of data being generated over weeks, months and years.

The rise and rise (and rise) of data

The plethora of devices and gadgets connected to the Internet means that every click, swipe and tap produces data. No wonder, then, that some estimate the world already holds about 1.8 zettabytes of data. To put that into perspective, it is roughly 250 billion DVDs’ worth. Happy viewing!

It has also been estimated that around 90% of the data in existence today was generated in the last two years, so the volume generated over the next couple of years will dwarf today’s. In fact, it is predicted that the amount of global data could grow by as much as 50% year-over-year. Needless to say, all of this presents organisations with major opportunities – if they know how to leverage the power of the data. Along with the opportunities come challenges. The good news is that many businesses appreciate the mountain of data they are sitting on and are looking to analyse it in order to gain a competitive advantage.
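To see how quickly such a growth rate compounds, here is a rough, purely illustrative sketch. It takes the 1.8 zettabyte estimate above as a starting point and assumes a steady 50% year-over-year increase; the exact figures are back-of-the-envelope, not forecasts.

```python
# Back-of-the-envelope projection: 1.8 ZB growing at an assumed 50% per year.
start_zb = 1.8          # estimated global data volume today, in zettabytes
growth = 1.5            # 50% year-over-year growth (illustrative assumption)

volume = start_zb
for year in range(1, 6):
    volume *= growth
    print(f"Year {year}: ~{volume:.1f} ZB")

# At this rate the total roughly doubles every ~1.7 years (ln 2 / ln 1.5),
# and more than septuples within five years.
```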

Telecommunications in zettabytes

One industry that certainly has vast amounts of data is telecommunications. From texting and phone calls to people’s online shopping habits, and from how we interact with social media to watching streaming videos and downloading music, mobile devices have provided communications service providers (CSPs) with a wealth of readily-available data from their networks. Mining this information can help CSPs drive new revenue streams, reduce churn and maximise operational efficiencies.

CSPs have to manage the hundreds of terabytes of data generated in their networks each day, so analysing it all in real time is easier said than done. Operators collect vast amounts of data at cell towers, yet most currently transfer that data to a central data warehouse for analysis, which is both time- and resource-consuming. Because traditional databases provide limited import bandwidth, operators can only “sample” the data, accessing and analysing a fraction of it at any one time. Traditional database platforms can only analyse a finite amount of data within a given period, making real-time responses and analysis of newly imported data virtually impossible. That is because traditional database tools were simply not designed for the zettabyte era.
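A rough calculation shows why centralising that data is so demanding. The 200 terabytes per day used below is an assumed figure chosen only to illustrate the order of magnitude, not a number from any specific operator.

```python
# Illustrative only: sustained bandwidth needed to centralise 200 TB of
# network data per day (the daily volume is an assumption for the example).
terabytes_per_day = 200
bytes_per_day = terabytes_per_day * 10**12
seconds_per_day = 24 * 60 * 60

bits_per_second = bytes_per_day * 8 / seconds_per_day
print(f"~{bits_per_second / 10**9:.1f} Gbit/s sustained, around the clock")
# -> roughly 18.5 Gbit/s, before retransmissions, peaks or any other traffic.
```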

When every millisecond counts

CSPs have multiple sources of structured and unstructured data. Structured data is data organised in a pre-defined way: there is a systematic method for recording and accessing it, so it has the advantage of being easy to query and analyse. Unstructured data is information that does not fit a row-column format. Most organisations naturally hold semi-structured data – a mix of structured and unstructured elements – and this is the main source of actionable intelligence.
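As a simple illustration (the field names and values are hypothetical, not taken from any real system), a structured call record fits a fixed row-column schema, while a semi-structured event combines fixed fields with free-form content:

```python
# Hypothetical records, for illustration only.

# Structured: every record has the same pre-defined columns (easy to query).
structured_cdr = ("2014-03-01 09:15:02", "+1-555-0100", "+1-555-0199", 184)
# (timestamp, calling number, called number, duration in seconds)

# Semi-structured: fixed keys alongside free-form, variable content.
semi_structured_event = {
    "timestamp": "2014-03-01 09:16:40",
    "subscriber": "+1-555-0100",
    "channel": "support_chat",
    "message": "My downloads have been crawling all morning near the stadium.",  # unstructured text
    "tags": ["complaint", "data_speed"],   # optional field, varies per record
}
```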

CSPs are able to derive incremental value by gaining insights from analysing massive amounts of seemingly unrelated information. From customer activity such as churn and cross-selling, to merchant activities such as mobile marketing campaigns, to infrastructure events such as dynamic bandwidth control and network monitoring, gathering insights from the data drives revenue, decreases costs and ultimately improves profit.

It might sound cliché, but in telecommunications, time is money. Today’s CSPs operate in an ultra-competitive marketplace where every customer and every second matters. That’s why it is more important than ever for CSPs to have access to the latest data to support their decision-making.

Thankfully, it is no longer the norm to submit a query and wait hours for results. Data scientists can now analyse data while new data is continuously imported, producing real-time results. It is worth bearing in mind that most existing big data analytics platforms cannot import and analyse data at the same time. Future platforms need to combine what they do today – analysing stored data – with real-time analytics to get a truly clear picture of what the data is saying.
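A minimal sketch of the idea follows. It assumes a toy in-memory store – real platforms use specialised indexes and parallel import paths – but it shows the essential point: records are appended continuously while queries run against the same, constantly growing data set, with no separate “load, then analyse” phase.

```python
from collections import deque
from threading import Lock

class StreamingStore:
    """Toy illustration: ingest and query share one live data set."""

    def __init__(self, window=10_000):
        self.records = deque(maxlen=window)   # keep only the most recent records
        self.lock = Lock()

    def ingest(self, record):
        # Called continuously as new data arrives from the network.
        with self.lock:
            self.records.append(record)

    def query(self, predicate):
        # Runs against whatever has been ingested so far.
        with self.lock:
            return [r for r in self.records if predicate(r)]

store = StreamingStore()
store.ingest({"tower": "A12", "dropped_calls": 3})
print(store.query(lambda r: r["dropped_calls"] > 0))
```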

One area from which CSPs can learn how to manage and analyse data is science, where fast data is crucial because data can be generated in vast quantities very quickly. A prime example is CERN (the European Organization for Nuclear Research): over the last 20 years CERN has generated more than 100 petabytes of data, 75% of it in the last three years. Uncovering the mysteries of the universe is going to require both big data and fast data analytics. The tools required at CERN will eventually find their way into other areas of science – imagine, for instance, how cancer treatment could benefit from analysing data from thousands of patients, and the breakthroughs that could bring. It is therefore not only CSPs that can benefit from fast data.

Forget size, speed is what matters

Big data, as fast data, is clearly of great importance to science, but it also offers many advantages with a day-to-day impact on people’s lives. As Google has suggested, more people have access to a mobile phone than to a toothbrush, so our mobile lives are extremely important. How, then, can CSPs use Fast Data to drive their revenues and improve users’ experience?

The fact is that if operators turned their data into analytical insights faster, they could generate significant premium revenues and optimise costs. For example, Fast Data can provide a telecom operator with OTT (over-the-top) revenues through premium analytical services for business partners, used in geo-fencing, re-targeting and more. Analysing data locally at each cell tower or radio access controller, rather than sending it all to a central data warehouse for analysis, makes the data available earlier and frees up network bandwidth.
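A minimal sketch of that local-first approach, under stated assumptions (the tower ID, field names and record volumes are invented for the example): each tower reduces its raw session records to a compact summary per interval, and only the summary travels upstream to the warehouse.

```python
# Illustrative sketch: aggregate raw per-tower records locally and ship only
# the summary to the central data warehouse. All field names are invented.

def aggregate_at_tower(tower_id, raw_records):
    """Reduce many raw session events to one small summary per interval."""
    total_bytes = sum(r["bytes"] for r in raw_records)
    dropped = sum(1 for r in raw_records if r["dropped"])
    return {
        "tower": tower_id,
        "sessions": len(raw_records),
        "gigabytes": round(total_bytes / 10**9, 3),
        "dropped_sessions": dropped,
    }

raw = [
    {"bytes": 52_000_000, "dropped": False},
    {"bytes": 7_500_000, "dropped": True},
    {"bytes": 120_000_000, "dropped": False},
]
summary = aggregate_at_tower("A12", raw)
print(summary)   # a few hundred bytes sent upstream instead of the raw stream
```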

Many operators choose to analyse technical network data, such as performance and capacity figures. Adding customer data and correlating it with network data can yield vital insights. Customer data also lets operators provide customer service in real time. An operator that uses Fast Data to analyse customer behaviour has instant information about its customers’ current needs and the quality of service being delivered, and can react immediately if there is a misalignment. Customers demand instant satisfaction – Fast Data gives operators the means to provide it.

For network users, the advantages show up as improved service: operators learn their usage habits and tailor the network to match their needs. That could mean discovering that one area needs more cells than another to cope with rising usage, or that at certain times the network needs optimising because it is under greater strain. Anyone who has visited a major city and tried to use their data allowance has come across painfully slow download and upload speeds. Big data and fast data can help solve this by highlighting areas that need attention and giving the operator the best possible picture of how its network is performing. In the future, operators will be able to use this information to offer an improved service at the lowest possible cost, and network users will benefit.

In data we trust

No wonder some people say that data never sleeps. Interestingly, it is not just organisations that place their faith in data to answer complicated questions. The famous statistician W. Edwards Deming is quoted as saying: “In God we trust. All others must bring data”.

In the 21st century we will have to update Benjamin Franklin’s observation: not only death and taxes, but also data, is a certainty in life. It is time for CSPs to harness the full potential of data to power their futures.
