“Software-defined” refers to the ability to control some or all of the functions of a system through software. The term is sometimes dismissed as a buzzword or marketing jargon, when in fact it has a clear meaning that organizations looking to keep pace with change need to understand.
When technologies become software-defined, there are major systemic benefits for organizations that use them, including lower costs, higher quality products and services, and less risk.
At the same time, software-defined technologies require major organizational changes for incumbent enterprises to adopt and use effectively. This often involves expensive and risky transformation projects that reengineer the value stream to take advantage of decoupled components, reduced dependencies and new management capabilities.
Today we will look at the origins of the “software-defined” concept and how its application presents both opportunities and challenges to the enterprise.
The Beginning: ‘Software-defined Radio’
The software-defined concept comes to us from the evolution of radio transmission technology. A traditional radio communications system uses physically connected components that can only be modified through physical intervention. The antenna connects to the amplifier, which connects to the modulator, and so on. Operators are locked into the specifications of the components, the order in which they are connected, and whatever controls they expose. It’s an extremely inflexible technology and changes are best done by simply buying a new system.
As you can imagine, for businesses that operate large-scale radio deployments such as wireless telecom providers, technology decisions are hugely impactful. They can last decades and demand large upfront planning and capital costs. Keeping pace with change is extremely expensive and difficult.
In the mid-eighties, however, researchers began to take specific components of the radio and make them digital, implementing functions like oscillators, mixers, amplifiers and filters in software running on a computer. By emulating these functions in software, the system becomes adaptive and programmable, and can be configured according to the needs and requirements of the operator rather than the specifications of the manufacturer.
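To make this concrete, here is a minimal sketch in Python with NumPy of how an oscillator and a mixer – once physical components – can be reduced to a few lines of code. The sample rate, frequencies and signals are arbitrary values chosen for illustration, not taken from any real radio:

```python
import numpy as np

fs = 1_000_000                    # sample rate in Hz (illustrative value)
t = np.arange(0, 0.01, 1 / fs)    # 10 ms worth of samples

# Software "oscillator": a numerically generated carrier wave.
carrier_freq = 100_000            # Hz
oscillator = np.cos(2 * np.pi * carrier_freq * t)

# Baseband signal to transmit (a simple 1 kHz tone for illustration).
baseband = np.cos(2 * np.pi * 1_000 * t)

# Software "mixer": multiplying the two signals shifts the baseband
# up to the carrier frequency -- the same job an analog mixer does.
mixed = baseband * oscillator

# "Retuning" the radio is now just a matter of changing a variable and
# re-running the code; no hardware is touched.
new_carrier = np.cos(2 * np.pi * 150_000 * t)
retuned = baseband * new_carrier
```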
In 1995, the term Software-Defined Radio (SDR) was coined to describe the commercialization of the first digital radio communication system, a development that changed how these products and services could be delivered.
On the technical side, becoming software-defined removes many functional limitations from radio systems. For example, by simply reprogramming the software, a device can have its frequency spectrum changed, allowing it to communicate with different devices and perform different functions. This has enabled a quick succession of technical advances that were previously the domain of theory and imagination, such as ultrawideband transmission, adaptive signalling, cognitive radio and practical mitigation of the “near-far” problem.
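For instance, with an inexpensive RTL-SDR dongle and the open-source pyrtlsdr package, retuning a receiver to a different part of the spectrum is a one-line configuration change rather than a hardware swap. This is a minimal sketch; the specific frequencies are illustrative, and other SDR frameworks expose similar controls:

```python
from rtlsdr import RtlSdr   # pip install pyrtlsdr

sdr = RtlSdr()
sdr.sample_rate = 2.048e6   # samples per second
sdr.center_freq = 100.3e6   # tune to an FM broadcast station
sdr.gain = 'auto'

fm_samples = sdr.read_samples(256 * 1024)

# "Reprogramming" the radio: the same hardware now listens to a
# completely different part of the spectrum.
sdr.center_freq = 162.4e6   # e.g. a NOAA weather radio frequency
wx_samples = sdr.read_samples(256 * 1024)

sdr.close()
```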
On the business side, the changes are equally profound, with a significant impact on the value streams of enterprises throughout the wireless and radio industry, and on the shape of the industry itself. A wireless telecom provider employing software-defined radio can easily add new features to its network, adapt its systems to take advantage of new spectrum bands, or reconfigure itself when a new handset technology like 4G LTE becomes available. A provider able to reconfigure its infrastructure by deploying software updates rather than buying new hardware can realize huge operational savings while avoiding large capital expenses.
SDR therefore provides a significant strategic advantage to these businesses, introducing adaptability, modularity and agility where the organization was previously rigid and inflexible.
Taking advantage of SDR, however, is a long, transformational process, requiring significant capital and a major departure from the status quo. Not only does it require changing all infrastructure over to the new technology, it also requires the business to think differently and reengineer its value stream to take advantage of the new capabilities.
Software-defined Infrastructure
The IT industry has also been deeply impacted by the advent of software-defined technologies. The following examples have created industries and enabled a generation of evolved products and services:
- Hypervisors – A hypervisor is a specialized software layer that runs virtual machines, such as VMware ESXi or Microsoft Hyper-V. It runs directly on the physical machine, abstracting the hardware resources and distributing them to any number of virtual machines. This has undoubtedly been one of the largest and most impactful advances in IT in the last 20 years, ushering in the era of point-and-click server deployment and changing the way we manage and deliver IT services.
- Software-defined Networking (SDN) – Traditionally, operating a network means managing the lower-level infrastructure that allows devices to connect, communicate with each other, and figure out where to send their packets. These switching devices – called “layer 2 devices” – each need to maintain their own state and configuration information, and make forwarding decisions based only on limited, locally available information. SDN abstracts layer 2 networking by moving those decisions into a central software controller, and it is the ‘secret sauce’ behind cloud computing – a critical capability for all public cloud services, including AWS, Azure and OpenStack-based providers. It allows the service provider to centralize routing and switching, and provides the orchestration required for large-scale multi-tenancy, i.e. the ability to create and manage millions of logically isolated, secure networks (the sketch after this list illustrates the centralized-controller idea).
- Network-function virtualization (NFV) – Building upon SDN, NFV allows services like load balancers, firewalls, intrusion detection systems, accelerators and CDNs to be deployed and configured quickly and easily. Without NFV, operating infrastructure at scale requires a lot of capital investment and an experienced team of highly specialized network engineers. NFV makes it easy to deploy, secure and manage these functions without having to understand the complexities under the hood.
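To make the SDN pattern concrete, the following is a deliberately simplified sketch of the centralized-controller idea. The class names, rule structure and tenant labels are invented for illustration – real controllers are far more involved – but the point stands: the switches hold no state of their own, while the software layer owns all configuration and makes the forwarding decisions.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class Controller:
    """Central software layer: owns all state and configuration."""
    # tenant -> {destination MAC -> output port}
    rules: Dict[str, Dict[str, int]] = field(default_factory=dict)

    def add_rule(self, tenant: str, dst_mac: str, out_port: int) -> None:
        self.rules.setdefault(tenant, {})[dst_mac] = out_port

    def lookup(self, tenant: str, dst_mac: str) -> Optional[int]:
        return self.rules.get(tenant, {}).get(dst_mac)

@dataclass
class Switch:
    """A stateless functional component: it keeps no rules of its own."""
    name: str
    controller: Controller

    def forward(self, tenant: str, dst_mac: str) -> str:
        port = self.controller.lookup(tenant, dst_mac)
        if port is None:
            return f"{self.name}: no rule for {dst_mac}, dropping packet"
        return f"{self.name}: sending {dst_mac} out port {port}"

controller = Controller()
controller.add_rule("tenant-a", "aa:bb:cc:dd:ee:01", out_port=3)

edge = Switch("edge-1", controller)
print(edge.forward("tenant-a", "aa:bb:cc:dd:ee:01"))  # rule found centrally
print(edge.forward("tenant-b", "aa:bb:cc:dd:ee:01"))  # isolated tenant: dropped
```

Because the state lives in one place, adding a tenant network or changing a forwarding rule is a software operation against the controller, not a box-by-box reconfiguration.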
“Software-defined” Defined
Having looked at where the concept came from and a few examples of modern software-defined technologies, I propose the following definition for what it means to be “software-defined”:
Software-defined means some or all of the functions of a system can be managed and controlled through software.
Some key attributes of a software-defined technology:
- The functions are abstracted – Software-definition strives for stateless functions, i.e. functions that do not maintain their own configuration or state. State and configuration information is maintained outside the function, in the software layer. By decoupling state and configuration from the function and centralizing them, we gain adaptability, resilience, and visibility at scale.
- Software controls functionality – No direct operator or human intervention is required for the function to operate; functions are managed solely through software. Management and administration are therefore decoupled from the function, and we gain the ability to automate processes and activities, and to manage the system independently of functional limitations.
- Functional components are modular – The software layer does not depend on any particular functional component, so the components themselves can be commoditized, modular and scalable. We can easily change or replace these components without disrupting the system (the sketch following this list shows all three attributes together).
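To tie the three attributes together, here is a small, purely illustrative sketch. The names (Firewall, Manager and the vendor implementations) are invented for this article rather than taken from any product: the function is abstracted behind an interface, its state and configuration live in the managing software, and the concrete implementation can be swapped without disturbing the rest of the system.

```python
from abc import ABC, abstractmethod

# Attribute 1: the function is abstracted behind an interface.
class Firewall(ABC):
    @abstractmethod
    def allows(self, port: int, config: dict) -> bool:
        ...

# Attribute 3: modular, interchangeable implementations of the function.
class VendorAFirewall(Firewall):
    def allows(self, port: int, config: dict) -> bool:
        return port in config["open_ports"]

class VendorBFirewall(Firewall):
    def allows(self, port: int, config: dict) -> bool:
        return port not in config.get("blocked_ports", set())

# Attribute 2: state and configuration live in the software layer,
# not inside the functional component itself.
class Manager:
    def __init__(self, firewall: Firewall, config: dict):
        self.firewall = firewall
        self.config = config              # centralized state/configuration

    def check(self, port: int) -> bool:
        return self.firewall.allows(port, self.config)

    def replace_component(self, firewall: Firewall) -> None:
        # Swap the functional component without losing any state.
        self.firewall = firewall

manager = Manager(VendorAFirewall(), {"open_ports": {22, 443}})
print(manager.check(443))                 # True
manager.replace_component(VendorBFirewall())
print(manager.check(443))                 # still answered from central config
```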
Adoption through Transformation
On the face of it, software-defined technologies are better-faster-stronger, and companies that use them will have a competitive advantage over those that do not. They lead to lower costs, higher quality and less risk for the business. Smaller organizations building products and services that leverage these technologies can use them to disrupt incumbent enterprises.
For those enterprises, however, especially those locked in the middle of a legacy lifecycle, software-defined technologies present a significant challenge. Adoption requires rethinking the value stream and integrating with legacy systems. As stated in the book Lean Thinking,
“We are all born into a mental world of ‘functions’ and ‘departments,’ a commonsense conviction that activities ought to be grouped by type so they can be performed more efficiently and managed more easily” (p. 25)
Not only do software-defined technologies present a threat to the enterprise in the hands of startups, they also explicitly change the way functions and activities are organized and managed. Adoption demands rethinking how, where, when and by whom functions should operate in the value stream. This entails changing culture, reorganizing roles and team structures, and reengineering the value stream. This kind of change must be driven through risky and expensive transformation projects.
Traditional enterprises face major challenges with developing competencies for new software-defined systems. Leaders in these organizations will need to be highly flexible and open to a paradigm shift in how they think about their work.
Becoming software-defined is not limited to hardware, however. In my next article, we will look at an example of how software-defined technology can be used to emulate processes, and use the concepts we have learned here to gain a better understanding of the impact this will have on an organization.