Topic: Network innovation: the age of evolution
Authors: Marvin (Eddie) S. Edwards, Jr & Morgan Kurk
Titles: President & CEO / SVP, Wireless
Morgan Kurk serves as Senior Vice President of the Enterprise Intelligent Building Business Unit at CommScope. Mr. Kurk holds a BSEE from Brown University, an MSEE from The University of Michigan, and an MBA from the Kellogg School of Management at Northwestern University.
Eddie Edwards is President and CEO of CommScope. Mr. Edwards has previously served as President of Alcatel North America Cable Systems and President of Radio Frequency Systems.
The next generation of mobile networks will need not simply to adjust to greater capacity demands; it will also need to address a multi-user environment. Networks will have to support both the services that human subscribers demand and the needs of M2M and the ‘Big Data’ generated by connected devices. The multitude of technologies on the horizon will serve an ever-evolving population of devices connected to the Internet.
Mobile telephony dates back to the 1940s; however, implementation of the first cellular-style systems did not happen until the 1970s. Since then, the technology has gone through four major decade-long development and implementation cycles, driven by a relentless increase in consumer demand for capacity on a relatively limited amount of wireless spectrum.
After the launch of the first fully automatic mobile network by Nordic Mobile Telephone in 1981, a variety of cellular systems began to be deployed around the globe. The first iteration (or 1G) was based on analogue modulation, and standardized under names such as Total Access Communications System (TACS) and Advanced Mobile Phone System (AMPS). The major challenge for operators in the 1980s was providing wide area outdoor network coverage primarily for people in transit.
1G provided basic voice services with reasonable quality, but as usage increased exponentially, operators became desperate to use their spectrum more efficiently. In the 1990s this led to the development of second generation (2G) digital standards such as the Global System for Mobile Communications (GSM) and Code Division Multiple Access (CDMA).
As mobile phones attained consumer ubiquity, and networks saw their first real data demands, operators again looked to improve the network protocol. Third generation (3G) built on 2G’s CDMA-based digital technology but improved its capacity via wider bandwidths so that new services, such as mobile video calls and true web browsing, would be possible. As with all technologies, every improvement carries some disadvantages. While increasing the data rate and improving the air-interface efficiency of 3G made the system highly desirable, it also decreased the coverage area of cells and limited their ability to penetrate indoors. This resulted in a densification effort outdoors and the beginning of significant public-area site builds.
Bill Gates’ famous, but possibly apocryphal statement that “640K ought to be enough [memory] for anybody” is a great example of how difficult it is to predict the future of technology. The evolution of mobile networks is no exception to this, but it is possible to examine the forces that are driving their development and, perhaps, also learn some lessons from history.
Smaller, faster, cheaper?
On average, a new mobile generation has appeared every ten years since the first 1G system was introduced in Scandinavia in 1981. A new protocol is typically standardized close to the end of one decade and reaches general acceptance worldwide by the middle of the next. We might, therefore, expect to hear details about the next generation of mobile networks towards the end of this decade. Steps towards this can already be seen, with Ofcom stating that it will clear the UK’s 5G airwaves by 2018 and launch the spectrum auction as soon as it can after that, “probably within the next couple of years”, according to Ed Richards, its Chief Executive. Meanwhile, the UK’s University of Surrey recently secured £35m for a new 5G research centre, jointly funded by a consortium of international mobile operators and infrastructure providers, including Huawei, Samsung, Telefonica and Fujitsu.
It is too early to speculate as to the exact technologies and frequency bands that will be deployed in the next generation of mobile networks as much of the world has yet to deploy the current implementation of 4G. Nevertheless, we can discuss the factors that will be fuelling their future evolution.
User demands and competition rather than obsolescence will continue to be the primary drivers for network evolution. As new devices with new functionality arrive, the network must increase its capabilities, which span from greater bandwidth to greater cell density to new technology.
Subscriber data demands continue to sprint ahead. The number of Internet users is forecast to reach 2.7 billion by 2015, representing over 40 percent of the world’s population with more than half being mobile. With all services collapsing into a single data pipe and high-bandwidth applications like video and mobile gaming becoming ever more popular, operators see a tsunami coming and are looking to develop networks to support greater and greater amounts of data.
Deploying new services can help bolster operator ARPUs by opening up new revenue streams. Many consumers are willing to pay a premium for a good data experience, and recent research by Accenture has revealed that 69 percent of all Internet access now takes place via mobile devices. The evolution of future networks continues to be driven by tariff structures. Much as SMS adoption was driven by its low cost relative to a comparable phone call, fees to watch direct-to-mobile live broadcasts of events such as football matches or concerts are likely to enhance operators’ bottom lines and simultaneously drive network change.
It’s also clear that new mobile devices will push networks to deliver higher-bandwidth, lower-latency content as HD video screens, high-quality unified communications and video conferencing arrive. Meanwhile, other developments that will become integral to handsets, such as mobile wallet services, will tax the network in new ways, creating demand for new protocols and security measures.
Rise of the machines
Historically it has been the demands of subscribers that have pushed mobile operators to provide better capacity and coverage. This may be about to change, with many commentators seeing the greatest pressure on future mobile networks as coming from machine-to-machine communications (M2M).
This is a technology trend that has gone by many names – for some it’s M2M, for others, it’s all about ‘embedded’ or ‘connected’ devices and ‘the internet of things’. Whatever the terminology, this technology allows devices to share data with each other without direct human intervention. It is now possible to link almost any type of remote machine or device to critical information systems and gather real-time field intelligence that can be used to improve efficiency, reduce costs, introduce new services and gain competitive advantage. Devices as diverse as utility meters, signs, cameras, remote sensors, laptops and domestic appliances can be connected to support a variety of new uses and achieve increased efficiencies.
While individual connected devices do not, in themselves, transmit large volumes of data, there are likely to be tens or hundreds of connected devices for every human subscriber in the long-term. In the European market the number of connected devices grew by 60 percent over the past year and Gartner sees the M2M market as continuing to grow at a rate of 30-40 percent per year. Thanks largely to the rise of M2M, mobile data traffic is forecast to grow tenfold between 2011 and 2016.
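As a back-of-the-envelope check on these forecasts, a tenfold increase over the five years from 2011 to 2016 implies a compound annual growth rate of roughly 58 percent, comfortably above even the upper end of the quoted 30-40 percent M2M growth band. A minimal sketch of that arithmetic:

```python
# Compound annual growth rate (CAGR) implied by a tenfold
# increase in mobile data traffic over five years (2011-2016).
def cagr(growth_factor: float, years: int) -> float:
    """Return the annual growth rate implied by total growth over `years`."""
    return growth_factor ** (1 / years) - 1

rate = cagr(10, 2016 - 2011)
print(f"Implied annual growth: {rate:.1%}")  # roughly 58.5% per year
```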
Thus the next generation of mobile networks will need not simply to adjust to greater capacity demands; it will also need to address a multi-user environment. Networks will have to support both the services that human subscribers demand and the needs of M2M and the ‘Big Data’ generated by connected devices. A crucial challenge in this will be prioritising different types of traffic in ubiquitous networks – human to human, human to machine, machine to machine, emergency transmissions and so on.
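To make the prioritisation challenge concrete, it can be sketched as a strict-priority scheduler that always forwards emergency transmissions before subscriber traffic, and subscriber traffic before routine M2M telemetry. The traffic classes and their ordering below are illustrative assumptions for this article, not part of any standard:

```python
import heapq

# Hypothetical priority levels -- lower number = forwarded first.
PRIORITY = {
    "emergency": 0,
    "human-to-human": 1,
    "human-to-machine": 2,
    "machine-to-machine": 3,
}

class TrafficScheduler:
    """Toy strict-priority scheduler for mixed human and M2M traffic."""

    def __init__(self):
        self._queue = []
        self._counter = 0  # tie-breaker preserves FIFO order within a class

    def enqueue(self, traffic_class: str, payload: str) -> None:
        heapq.heappush(self._queue, (PRIORITY[traffic_class], self._counter, payload))
        self._counter += 1

    def dequeue(self) -> str:
        return heapq.heappop(self._queue)[2]

sched = TrafficScheduler()
sched.enqueue("machine-to-machine", "meter reading")
sched.enqueue("human-to-human", "voice call")
sched.enqueue("emergency", "crash alert")
print(sched.dequeue())  # crash alert
print(sched.dequeue())  # voice call
print(sched.dequeue())  # meter reading
```

Real networks use far subtler schemes (weighted queues rather than strict priority, so low-priority M2M traffic is never starved entirely), but the sketch shows why traffic must carry a class marking at all.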
Over the next ten years we’re going to see a dramatic breakthrough in M2M applications as organisations realise the untapped potential they represent for revenue generation and improved customer satisfaction. Power companies are already rolling out smart meters that can feed real-time information on domestic and commercial power consumption back to a central database. Similarly, automotive firms are rolling out connected cars to support services like vehicle monitoring and recovery, accident notification and location positioning. Yet, the rollout of these devices is only the beginning. Once they are deployed ubiquitously, their potential is greatly magnified. For example, universal smart meters will allow power companies to intelligently match energy creation to demand, improving services while simultaneously lowering costs and driving efficiencies – giving birth to the ‘smart grid’.
Such revolutions in individual industries will become increasingly common in the next decade and may be considered part of the larger ‘smart city’ transition. In essence, this term predicts the convergence of smart information and communication technologies, intended to improve the efficiency and effectiveness of urban systems and services. Representative technologies that will enhance the intelligence and connectedness of the city include smart meters, sensor networks, fibre optic and wireless communication networks, software to provide data analytics for city services and a myriad of other hardware and software components. This will be a huge influence on mobile networks, with investment in smart city technology infrastructure expected to total US$108 billion from 2010 to 2020.
The multitude of technologies on the horizon will serve an ever-evolving population of devices connected to the Internet. While today’s connectivity demands – similar to the data needs of SMS – only require simple data-collection mechanisms (e.g. a vending machine reporting that it is out of stock), we must be aware of the ever-changing landscape. The emergence of innovations such as remotely managed consumer devices could dramatically change data loads and network architectures; that transition would need to be carefully managed to deliver the most effective user experience possible.
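A minimal sketch of the kind of lightweight, collection-only message such a device might upload today. The field names and the JSON encoding are illustrative assumptions, not any particular M2M protocol:

```python
import json
from datetime import datetime, timezone

def stock_report(machine_id: str, slot: str, remaining: int) -> str:
    """Build a tiny status message a connected vending machine might upload."""
    return json.dumps({
        "machine_id": machine_id,   # illustrative field names, not a real schema
        "slot": slot,
        "remaining": remaining,
        "out_of_stock": remaining == 0,
        "reported_at": datetime.now(timezone.utc).isoformat(),
    })

print(stock_report("VM-0042", "B3", 0))
```

A few dozen bytes per report is trivial; the pressure on the network comes from multiplying such messages by tens or hundreds of devices per subscriber, and from richer two-way remote management replacing one-way collection.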
Today’s subscribers have heightened expectations and will continue to drive focus on customer experience. The more innovative and personalised the technology or service, the better the quality of service required. By planning, optimising and managing networks, operators can ensure a high level of performance in a scalable environment for the long term.