Issue: Africa and the Middle East 2014
Topic: Why software-defined matters
Title: President & CEO, Co-founder
George Teixeira is the CEO and President of DataCore Software
Mr. Teixeira creates and executes the overall strategic direction and vision for DataCore Software, a leader in software-defined storage. Mr. Teixeira co-founded the company and has served as CEO and President of DataCore Software since 1998. Prior to that time, Mr. Teixeira served in a number of executive management positions including Worldwide VP of Marketing and GM of the Product Business Group at Encore Computer Corporation, where he also played a major role as the team leader of OEM marketing and sales to Amdahl, IBM, and DEC. His work culminated in the $185 million sale of Encore’s storage control business to Sun Microsystems in 1997. He also held a number of senior management positions at the Computer Systems Division of Gould Electronics.
Cloud computing and the move to virtualize everything have caused a major revolution. Virtual data centres and new cloud delivery models have changed business expectations with regard to service levels and how to improve productivity. Infrastructure has evolved, giving us more physical resources to utilize, but a major part of the advancements is due to virtualization breakthroughs at the software layer.
These advances have led us to software-defined technologies of every shape and form. The challenge is to understand the different components of this new revolution, and how they work and fit together.
The Software-Defined revolution:
The hottest topics in IT technology today are Software-Defined Networking (SDN), Software-Defined Storage (SDS), and Software-Defined Data Centre (SDDC). These are all part of a broader trend that the industry is now calling ‘software-defined everything’. The movement towards a software-defined infrastructure is about decoupling the underlying hardware from the software layer that provides the management and functionality. The promise is that by separating the intelligence from the computers, storage and networks, the underlying hardware can become cheaper (commodity buying power) and interchangeable (avoiding vendor lock-in) while the overarching software becomes more feature-rich and flexible.
Software-Defined Data Centre and Software-Defined Everything:
Software-defined data centres hold the promise of improving performance and dramatically lowering costs.
The first wave of improvements was based on the revolution caused by virtualization. Today we are already living in a virtual server world; just look around and see how virtual machines and server hypervisors have changed our views on productivity, and on provisioning and deploying new systems and applications rapidly. Where does this all go? The next step is obvious: we can do for storage and networking what has already happened at the server level.
Software-Defined Storage (SDS) can be simply defined as the abstraction and automation of storage services from the physical storage hardware.
Software is the basis for flexibility, and smart storage virtualization and management software can improve the utilization of storage resources so that you optimize and right-size to meet your needs. Hardware-defined is by definition rigid and inflexible; it therefore leads to purchasing more than you need, since you don’t want to underestimate your requirements. Software can also allow the latest innovations, such as Flash-memory SSDs, to be easily incorporated into your infrastructure without having to ‘rip and replace’ your existing storage investments.
In other words, hardware-defined is the mantra of storage hardware vendors who want you to ‘buy more hardware’ and repeat the same process every year, rather than getting the most value from your investments and ‘future-proofing’ your infrastructure. Software-defined means optimizing what you already have, whereas ‘Hardware-defined = Over Provisioning and Oversizing.’
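The abstraction behind this argument can be sketched in a few lines of code: a virtual pool presents one provisioning interface over devices from any vendor, so capacity is consumed where it exists rather than over-purchased per box. The class and device names below are illustrative assumptions, not DataCore’s actual API.

```python
# Minimal sketch of storage virtualization: heterogeneous devices
# are aggregated behind a single logical pool. Names are hypothetical.

class Device:
    def __init__(self, name, capacity_gb):
        self.name = name
        self.capacity_gb = capacity_gb
        self.used_gb = 0

class VirtualPool:
    """Aggregates devices from any vendor into one logical pool."""
    def __init__(self, devices):
        self.devices = devices

    @property
    def free_gb(self):
        return sum(d.capacity_gb - d.used_gb for d in self.devices)

    def provision(self, size_gb):
        """Allocate a volume, spreading it across devices as needed."""
        if size_gb > self.free_gb:
            raise ValueError("pool exhausted")
        remaining, allocation = size_gb, []
        for d in self.devices:
            take = min(remaining, d.capacity_gb - d.used_gb)
            if take:
                d.used_gb += take
                allocation.append((d.name, take))
                remaining -= take
            if remaining == 0:
                break
        return allocation

# A 600 GB volume transparently spans two vendors' arrays:
pool = VirtualPool([Device("dell-array", 500), Device("hp-array", 500)])
print(pool.provision(600))
```

The point of the sketch is that the caller asks the pool, not a specific array, for capacity — which is exactly what makes the underlying hardware interchangeable.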
Software is what endures beyond hardware devices that ‘Come and Go’
Think about it – why would you want to lock yourself into this year’s hardware solution, or have to buy a specific device just to get a software feature you need? This is old thinking; before virtualization, this was how the server industry worked: the hardware decision drove the architecture. Today, with software-defined computing exemplified by VMware or Hyper-V, you think about how to deploy virtual machines, not about whether they are running on a Dell, HP, Intel or IBM system. Storage is going through this same transformation, and it will be smart software that makes the difference in a ‘software-defined’ world.
Interestingly, while the marketing terms may have changed, the vision and passion we had for the concept of ‘software-defined’ were what led us to found DataCore Software in 1998. We dedicated ourselves to building a pure Software-Defined Storage solution. The original founders and I believed that software-defined was an inevitable proposition. Today, we have over 25,000 software licenses deployed at over 10,000 customer sites around the world, so we know the vision has turned to reality. Likewise, the other elements of a software-defined world are also evolving.
Industry leaders such as Cisco and VMware have already made huge acquisitions of SDN players. IDC claims SDN will grow to US$3.7 billion in three years from 2013’s US$360 million level. What does this mean?
Networking architectures are rapidly evolving towards an SDN reality, and this will eventually completely revamp how we think of a network. Today, building, configuring, or reconfiguring networks involves dealing with each switch manually; under SDN, this will be automated. Server administrators will be able to accomplish network management remotely.
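The automation described above boils down to a simple shift: one definition of intent, pushed by software to every switch, instead of a manual login to each one. The following sketch assumes a hypothetical controller API — the `Switch` class and its `apply()` method are stand-ins for illustration, not any real vendor’s interface.

```python
# Hedged sketch of the SDN model: a controller loop applies the
# desired state to an entire fabric, replacing per-switch manual CLI work.

class Switch:
    """Hypothetical stand-in for a controller-managed switch."""
    def __init__(self, name):
        self.name = name
        self.vlans = set()

    def apply(self, config):
        self.vlans.update(config["vlans"])

desired = {"vlans": {10, 20, 30}}            # intent, defined once
fabric = [Switch(f"sw{i}") for i in range(48)]

for sw in fabric:                            # automated rollout to 48 switches
    sw.apply(desired)

print(all(sw.vlans == {10, 20, 30} for sw in fabric))  # → True
```

Reconfiguring the network then means editing `desired` and re-running the loop — which is why administrators can manage it remotely and at scale.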
SDN is so important because it addresses today’s traffic patterns, based on public and private clouds. It adds flexibility to management and access on demand. In comparison, today’s highly complex networks cannot scale sufficiently to support the explosion of traffic. They are also vendor-dependent, which further challenges the standards, interfaces, and very speed of innovation that will be necessary to keep up.
With all the promise of ‘Software-Defined Everything’, what are the biggest obstacles?
First, the mindset – most people think about hardware and systems rather than an architecture flexible enough to meet future needs. Hardware vendors tease that you can just ‘buy another box’ and your problems are over, but the costs and complexity lead to great inefficiencies. Architecture matters.
Simply put, software-defined everything is about using software to bridge the technological gaps between the ‘separate islands’ or silos that arise from separate vendor offerings and hardware-specific functionality, and instead to create an underlying infrastructure – a flexible architecture that can endure to meet future needs – that can be managed holistically as part of the business. Rather than as individual elements (compute, storage, and networking), infrastructure will be treated as a set of resources required for specific workloads. In this world, the application, the end user, and hopefully the business are the ultimate masters.
But these changes are more than technological, they are also organizational.
Traditionally, teams work in separate domains or departments with different functional organizations – they become ‘separate islands’. Now technology comes along and causes change, and the teams also have to change how they work together to take full advantage of the new technologies. This means that things like buying decisions, budgets, reporting structures, service-level metrics and processes must be re-thought.
Who participates in the purchasing decision? For example, if your company is buying new storage, in the past it was likely that the storage team made the decision. Similarly, compute folks made the server decisions. But if these must work together, it is likely that people from different disciplines must get involved. Also, in a software-defined world the hardware matters less. In the past, prior to server virtualization, the name brand was critical; today, if an organization decides on a ‘software-defined’ approach and chooses, for example, a software server hypervisor from VMware or Microsoft, then the choice of hardware vendor (Dell, HP, IBM, Intel, etc.) becomes more an issue of price or personal preference than of a specific hardware-defined feature.
What is the reporting structure? Who has budget control? Are priorities universal? How are priorities communicated? Budgets, for instance, create organizational conflicts. If the networking team owns the budget, what happens if input from the compute team suggests that a more expensive solution should be purchased?
Who determines the right Service Level Agreements and performance metrics? In a world of ‘separate islands’, service levels and performance metrics tend to be tied more to availability than to end-user experience. For example, networking teams might be measured on total downtime in a given period. However, downtime is a poor measure of productivity, response times and user experience. In a software-defined world, performance metrics need to be reconsidered and applied broadly across teams.
IT is moving towards a software-defined future. It can take years to transform an organization and people’s mindsets. Getting the people and processes in place to help drive this kind of change will be critical for any company looking to take advantage of these next-generation technologies. Software-defined everything demands a rethinking of not just technology but about how to organize and redefine processes to achieve optimal business productivity.
SDDC technologies can clearly help optimize data centres, cloud, and overall IT infrastructures – but it takes more than technology to do so; architecture, new mindsets and organizational rethinking all matter.
Software-defined is no longer a promise, but a real-world reality!
Today we see a growing number of companies in the Middle East relying on software-defined architectures with a storage virtualization layer that empowers those organizations to manage and scale their data storage, delivering massive performance gains at a fraction of the cost of solutions offered by legacy storage hardware. Companies like eHosting DataFort, the Mohammed Bin Rashid Housing Establishment and the Dubai-based shipping colossus United Arab Shipping Company (UASC) are good examples and successful adopters of the software-defined approach.
UASC, the world’s third-largest shipping organization, has combined the enterprise feature set and management capabilities of the virtualization software layer with the power of fast Flash-based Solid State Disks (SSDs). Ashraf Jamal, UASC’s Data Centre Manager, Dubai, observes: “Whilst SSDs can be up to 100 times faster than SAS hard disk drives, there is a high price tag for this performance – up to 20 times higher cost per GB.” So the company decided to use SSDs alongside legacy SATA disk drives, while the software-defined storage layer optimizes usage and performance of the different hardware devices through its integrated auto-tiering capabilities. UASC saves by requiring significantly less storage hardware to house, manage and cool. According to Mr. Jamal: “Now our High Performance Computing (HPC) environment is affordable, secure and easy to maintain, and our critical and core applications certainly run faster and are more manageable through our unified storage management and virtualization layer…”
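The auto-tiering behaviour described here can be sketched as a simple placement policy: the most frequently accessed (“hot”) blocks go to the small, expensive SSD tier, and everything else stays on cheap SATA capacity. The block names, counts, and ranking heuristic below are assumptions for illustration, not DataCore’s actual algorithm.

```python
# Illustrative sketch of auto-tiering: rank blocks by access frequency
# and place the hottest ones on the limited SSD tier.

def retier(access_counts, ssd_slots):
    """Place the most-accessed blocks on SSD, the rest on SATA."""
    ranked = sorted(access_counts, key=access_counts.get, reverse=True)
    hot = set(ranked[:ssd_slots])
    return {blk: ("ssd" if blk in hot else "sata") for blk in access_counts}

# Hypothetical workload: a busy database index and VM boot volume,
# plus rarely touched logs and archive data.
counts = {"db-index": 900, "archive": 3, "logs": 40, "vm-boot": 700}
placement = retier(counts, ssd_slots=2)
print(placement["db-index"], placement["archive"])  # → ssd sata
```

Run periodically against fresh access statistics, a policy like this keeps the expensive tier small while the working set still gets SSD-class speed — which is the economics Mr. Jamal describes.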
To stay competitive, companies must become more adaptive to deal with constant change. Information technology and IT infrastructures have to reflect this change – hence the drive towards a software-defined model that allows greater agility, flexibility and responsiveness. This, along with productivity gains, is accelerating the move to a software-defined world; soon it will no longer be a choice but a ‘must-have’ and a critical competitive advantage.