Predicting technology trends is a challenging task—even more so coming out of a year in which the world was upended in unforeseen ways. In 2020, a year that was anything but typical, circumstances compressed years’ worth of digital transformation into mere months, with a profound and sustained impact on how we live and work. COVID-19 has expedited digital adoption in every business across every industry. It has also highlighted the critical role digital infrastructure and technology play in enabling business success.
Technologists across Equinix recently sat down to discuss what we believe are the biggest technology predictions and digital infrastructure trends for the coming year and beyond. Here’s what we think will happen.
Cloud-native infrastructure will dominate
It’s no secret that traditional infrastructure was not designed to meet the demands of today’s digital business.
Digital business is increasingly powered by modern software stacks and extensive use of open-source and cloud-native technologies. Put simply, cloud-native is a software development approach that promotes the use of cloud computing technologies and tenets such as microservices, API-first, containers and DevOps, as well as related capabilities such as container orchestration (e.g., Kubernetes), service mesh (e.g., Istio) and immutable infrastructure.
Together, these technologies empower organizations to rapidly build, run and orchestrate scalable applications that can be distributed and deployed globally, typically leveraging a hybrid multicloud architecture. These distributed deployments have increasingly stringent requirements in terms of latency, availability, performance and agility. They rely heavily on infrastructure that offers self-provisioning, autoscaling and self-healing capabilities through software.
Digital infrastructure matters more than ever. Years of digital transformation have essentially taken place in mere months, and the trend will only accelerate. IDC predicts that by the end of 2021, “80 per cent of enterprises will put a mechanism in place to shift to cloud-centric infrastructure and applications twice as fast as before the pandemic.”
This shift represents a fundamental change to how traditional infrastructure was designed.
Modern digital infrastructure (data centre, network and hardware) should be fully abstracted through APIs and orchestrated through software. This approach empowers application developers to deploy and manage distributed infrastructure at software speed, so they can focus on what’s important—innovating and building great apps. Such abstractions require building real-time observability into infrastructure state and developing programmatic interfaces through which the desired state can be defined declaratively, for any component, or any combination of components, end-to-end, from the edge to multicloud.
With such a framework in place, deploying and managing distributed infrastructure comes down to building a closed-loop, adaptive distributed system. The only way this can be achieved effectively at scale is through software and open technologies.
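To make this concrete, here is a minimal sketch of what such a declarative, closed-loop control pattern could look like: the desired state is declared once, the observed state is read back through an API, and any difference drives corrective action. The state fields and the provider functions (get_observed_state, apply_change) are hypothetical placeholders, not an actual Equinix or Kubernetes interface.

```python
# A minimal sketch of declarative, closed-loop infrastructure control.
# The provider functions below are hypothetical placeholders.
from dataclasses import dataclass

@dataclass(frozen=True)
class DesiredState:
    metro: str              # edge location identifier, e.g. "FR"
    bare_metal_nodes: int   # how many nodes should exist there
    network_gbps: int       # provisioned interconnection bandwidth

def get_observed_state(metro: str) -> DesiredState:
    """Placeholder for querying the provider's telemetry/inventory API."""
    return DesiredState(metro=metro, bare_metal_nodes=2, network_gbps=10)

def apply_change(diff: dict) -> None:
    """Placeholder for calling the provider's provisioning API."""
    print(f"reconciling: {diff}")

def reconcile(desired: DesiredState) -> None:
    """One pass of the observe -> diff -> act loop; run continuously in practice."""
    observed = get_observed_state(desired.metro)
    diff = {
        field: {"observed": getattr(observed, field), "desired": getattr(desired, field)}
        for field in ("bare_metal_nodes", "network_gbps")
        if getattr(observed, field) != getattr(desired, field)
    }
    if diff:
        apply_change(diff)

reconcile(DesiredState(metro="FR", bare_metal_nodes=4, network_gbps=25))
```

The same pattern generalizes from a single metro to an end-to-end, edge-to-multicloud topology; the hard part is the observability and the programmatic interfaces that feed the loop, which is exactly what the abstraction above presumes.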
Our vision for software-defined infrastructure is that anything that can be automated should be automated through software. The ability to virtualize and/or containerize and abstract workloads from underlying physical devices has given rise to shifting paradigms such as infrastructure as code and immutable infrastructure, permitting rapid deployment of infrastructure resources and faster implementation timeframes, especially in a hybrid multicloud environment.
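As a small illustration of the immutable-infrastructure tenet, the hypothetical sketch below replaces a node with a freshly provisioned, versioned one rather than patching it in place; deploy_image, shift_traffic and retire stand in for whatever provisioning API is actually in use.

```python
# Immutable-infrastructure pattern, sketched with placeholder provisioning calls:
# never mutate a running node; build a new versioned one and swap it in.
def deploy_image(version: str) -> str:
    node_id = f"node-{version}"
    print(f"provisioning {node_id} from image {version}")
    return node_id

def shift_traffic(to_node: str) -> None:
    print(f"routing traffic to {to_node}")

def retire(node_id: str) -> None:
    print(f"deprovisioning {node_id}")

def rollout(current_node: str, new_version: str) -> str:
    """Blue/green-style replacement: the old node is retired, not modified."""
    new_node = deploy_image(new_version)
    shift_traffic(new_node)
    retire(current_node)
    return new_node

rollout("node-v1.0.3", "v1.0.4")
```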
2021 will bring a proliferation and an accelerated adoption of cloud-native technologies across virtually every layer of the infrastructure stack, as well as for digital infrastructure orchestration from the edge to multicloud.
Edge-first paradigm will fuel innovation
According to Gartner distinguished VP analysts Nick Jones and David Cearley, more than 50 per cent of enterprise-generated data will be created and processed outside the data centre or cloud by 2023, up from less than 10 per cent in 2019. In a world that is increasingly living and working at the edge, computing continues to move—at an unprecedented pace—away from centralized data centres to a distributed, interconnected infrastructure positioned at edge locations proximate to data creation and consumption sources.
Whether it’s video conferencing, collaboration tools, streaming, gaming or ridesharing, today’s modern applications are increasingly architected from the ground up for automated and elastic deployment at the edge. There, vast amounts of data originating from multiple sources must be processed quickly. The edge is also where many applications and microservices must interconnect with low latency to deliver the best possible user experience. Deploying distributed applications across multiple edge locations and infrastructure tiers, from the edge to multicloud, requires a thorough understanding and evaluation of architectural tradeoffs, including the design of availability zones, distributed service meshes, traffic management, data pipelines, security, caching and state management (stateless versus stateful)—to name a few.
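One of those tradeoffs, latency-aware placement, can be sketched very simply: pick the lowest-latency site that meets a latency budget and still has capacity headroom. The site names, round-trip times and capacity figures below are illustrative assumptions only.

```python
# Illustrative edge-placement heuristic; all sites and figures are assumptions.
EDGE_SITES = {
    "metro-a": {"rtt_ms": 4, "capacity_free": 0.15},
    "metro-b": {"rtt_ms": 9, "capacity_free": 0.60},
    "region-core": {"rtt_ms": 35, "capacity_free": 0.90},
}

def place_workload(latency_budget_ms: int, min_free_capacity: float = 0.25) -> str:
    """Prefer the lowest-latency site that meets the budget and has headroom."""
    candidates = [
        (site["rtt_ms"], name)
        for name, site in EDGE_SITES.items()
        if site["rtt_ms"] <= latency_budget_ms
        and site["capacity_free"] >= min_free_capacity
    ]
    if not candidates:
        raise RuntimeError("no site satisfies the latency and capacity constraints")
    return min(candidates)[1]

print(place_workload(latency_budget_ms=10))  # "metro-b": metro-a is closer but lacks headroom
```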
Moreover, as compute and data shift to the edge, new edge-specific infrastructure constraints will arise. These include capacity and availability requirements related to footprint, power, network, compute and storage hardware, as well as needs for modularity and extensibility, multi-tenancy, fully automated operations (NoOps) and availability zones spanning several data centres. These, and more, will need to be well understood and optimized concurrently.
In 2021, we will see continued momentum in edge-first deployments and a wave of technology innovations across the infrastructure stack to address the increased complexity of reliably scaling and orchestrating distributed infrastructure at the edge.
5G will be powered by optimally placed edge infrastructure
5G represents a major technological inflection point. We predict that over time, 5G will be to wireless what broadband was to wireline.
2021 will be the year enterprises consider 5G in their infrastructure deployment planning. High-performance 5G capabilities require physical infrastructure that optimally extends into the edge. By placing applications and “fixed-end” IT environments proximate to 5G access and core functions in cloud-adjacent, richly interconnected data centres, enterprises can reap the benefits of this powerful new technology. The combination of new digital infrastructure and existing macro-edge data centres will form a powerful architectural model characterized by enormous amounts of data and distributed computing resources available at lower latencies. This enables novel computational paradigms for new use cases that couldn’t previously take advantage of such advances.
As an access network technology, 5G will provide wider area coverage, greater reliability, higher bandwidth and better security. It will offer an always-on, ubiquitous experience, with significant improvements in capacity and performance—including 100 times faster data rates (multi-Gbps), very low radio-access-network latency (down to 1ms) and high device density. Such capabilities will open new opportunities and novel possibilities for robotics, drones, autonomous vehicles, telemedicine and tactile internet, to name a few.
Delivering on the true vision and promise of 5G won’t be easy. As global momentum for 5G buildouts grows, success will rely increasingly on creating a rich digital ecosystem of producers and consumers, as well as on optimizing the multivariable function of the underlying infrastructure substrate, including spectrum, radio access network, edge data centres, transport networks, hardware and interconnection.
Today, approximately 80 per cent of the U.S. urban/metropolitan population can be connected to Equinix data centres within a 10-millisecond network round trip time. These macro-edge data centre campuses are richly interconnected to public clouds, network providers, content providers and enterprises, producing a framework that allows 5G traffic to directly and locally break out. This enables the “fixed-side” ecosystem to efficiently connect to the 5G network.
A particularly exciting 5G capability that can enable new business models and use cases is network slicing, which lets architects create and manage multiple virtual networks on the same physical network, each tuned to specific requirements such as latency, throughput or security. 5G-enabled applications will need to interconnect to resources that span the internet, public or private clouds, and edge compute workloads, preferably placed in proximity to the user plane function (UPF).
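As a rough illustration of how slice selection could be expressed in software, the sketch below matches an application’s latency, throughput and isolation needs against a small catalogue of slices. The slice names and guarantee figures are hypothetical examples, not a carrier API or a 3GPP-defined interface.

```python
# Illustrative matching of application requirements to hypothetical network slices.
from dataclasses import dataclass

@dataclass(frozen=True)
class Slice:
    name: str
    max_latency_ms: float
    min_throughput_mbps: int
    isolated: bool

CATALOGUE = [
    Slice("embb-default", max_latency_ms=50, min_throughput_mbps=1000, isolated=False),
    Slice("urllc-robotics", max_latency_ms=1, min_throughput_mbps=50, isolated=True),
    Slice("massive-iot", max_latency_ms=200, min_throughput_mbps=1, isolated=False),
]

def pick_slice(latency_ms: float, throughput_mbps: int, needs_isolation: bool) -> Slice:
    """Return the first slice whose guarantees cover the application's needs."""
    for s in CATALOGUE:
        if (s.max_latency_ms <= latency_ms
                and s.min_throughput_mbps >= throughput_mbps
                and (s.isolated or not needs_isolation)):
            return s
    raise LookupError("no slice satisfies these requirements")

print(pick_slice(latency_ms=5, throughput_mbps=20, needs_isolation=True).name)  # urllc-robotics
```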
Advanced 5G use cases will require a fundamental change in underlying infrastructure before they can scale. For Equinix, making 5G a reality begins with leveraging existing infrastructure footprint and capabilities, while identifying and planning for future use cases that can benefit in a meaningful way from potential expansion of neutral, multi-tenant physical infrastructure deployments farther out to the edge. We believe that the data centre and physical infrastructure supporting 5G should be modern, scalable, flexible, interconnected, neutral and multi-tenant.
AI will be distributed and move toward the edge
AI is certainly not a new concept, but with advances in both machine and deep learning, AI is poised to transform virtually every industry, just as electricity did some 100 years ago. According to IDC, by 2022, 80 per cent of organizations that shift to a hybrid business model will spend 4x more on AI-enabled and secure edge infrastructure to deliver business agility and insights in real time.
The amount of compute used in the largest AI training runs has increased exponentially; in fact, it’s doubling almost every three-and-a-half months, while AI algorithmic efficiency is doubling every 16 months. Both rates eclipse the two-year doubling period of Moore’s Law. Such significant rates of improvement in both hardware and algorithmic efficiency enable more AI workloads to be executed with less hardware and fewer processing resources. This rate of change will only increase in 2021 and beyond, as the adoption and pervasiveness of AI expands across every industry and organization.
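The gap between those doubling periods is easy to quantify. Converted to annual growth factors (taking the figures above at face value), a 3.5-month doubling period implies roughly 10.8x growth per year, a 16-month period about 1.7x and Moore’s Law’s two years about 1.4x, as the short calculation below shows.

```python
# Annual growth factor implied by a doubling period expressed in months.
def annual_factor(doubling_period_months: float) -> float:
    return 2 ** (12 / doubling_period_months)

print(f"compute used in the largest training runs: ~{annual_factor(3.5):.1f}x per year")
print(f"algorithmic efficiency:                    ~{annual_factor(16):.2f}x per year")
print(f"Moore's Law baseline:                      ~{annual_factor(24):.2f}x per year")
```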
In a typical AI workflow, large amounts of data are collected and pre-processed for modeling. The trained models are then used for prediction or inference and can be iteratively fine-tuned. Public clouds have traditionally been an attractive place to deploy AI, as training works best with large datasets and compute clusters that can auto-scale. However, for an increasing number of use cases, AI needs to be deployed in a distributed manner, and at the edge.
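Schematically, that workflow reduces to a pipeline of collect, pre-process, train, infer and fine-tune stages, as in the toy sketch below; every function body is a placeholder standing in for real data sources, feature engineering and models.

```python
# Toy skeleton of the AI workflow described above; all bodies are placeholders.
def collect() -> list:
    return [{"feature": 0.1, "label": 1}, {"feature": 0.9, "label": 0}]

def preprocess(raw: list) -> list:
    return [r for r in raw if r["feature"] is not None]

def train(dataset: list) -> dict:
    # The compute-heavy step that benefits most from auto-scaling cloud clusters.
    return {"threshold": sum(r["feature"] for r in dataset) / len(dataset)}

def infer(model: dict, feature: float) -> int:
    # The step most often pushed to the edge, close to users and data sources.
    return 1 if feature < model["threshold"] else 0

def fine_tune(model: dict, feedback: list) -> dict:
    # Iterative refinement as new labelled data arrives.
    return train(feedback) if feedback else model

model = train(preprocess(collect()))
print(infer(model, 0.2))          # -> 1
model = fine_tune(model, [])      # no new feedback yet; model unchanged
```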
In these cases, an additional set of stringent requirements related to latency, performance, privacy and security require that some AI data and processing—for both inference and training—be proximate to users and sources of data creation and consumption.
When it comes to running AI training workloads at the edge, there are considerations and trade-offs that must be taken into account. These include power, performance, data privacy, data security, data gravity and aggregation, and simplicity. Similarly, for inference at the edge, considerations include latency, availability, device resources, data privacy, data security and aggregation.
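One way to picture those considerations is as a simple placement policy: given a workload’s latency budget, data sensitivity and data volume, decide whether it should run at the edge or in the cloud. The thresholds and attribute names below are illustrative assumptions, not a recommendation.

```python
# Illustrative placement policy for AI workloads; thresholds are assumptions.
def place_ai_workload(kind: str, latency_budget_ms: float,
                      data_is_sensitive: bool, dataset_gb: float) -> str:
    if kind == "inference":
        # Tight latency budgets or privacy constraints keep inference at the edge.
        if latency_budget_ms < 20 or data_is_sensitive:
            return "edge"
        return "cloud"
    # Training: data gravity pulls large or sensitive datasets toward edge
    # aggregation points; otherwise elastic cloud clusters are a better fit.
    if data_is_sensitive or dataset_gb > 500:
        return "edge-metro"
    return "cloud"

print(place_ai_workload("inference", latency_budget_ms=10,
                        data_is_sensitive=False, dataset_gb=0.0))   # -> edge
print(place_ai_workload("training", latency_budget_ms=1000,
                        data_is_sensitive=False, dataset_gb=50.0))  # -> cloud
```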
In 2021, we will see an accelerated pace of AI deployments at the edge for both AI training and inference, along with enhanced as-a-service capabilities for infrastructure deployment automation and orchestration of hybrid multicloud AI environments.
Data centres will shift toward grid-positive
As the global climate crisis deepens, leading organizations are shifting corporate sustainability goals from avoiding negative impact toward creating positive change. At a minimum, companies are crafting strategies aligned with the goals of the Paris Accord and recognizing the urgent need to decarbonize global economies.
IDC predicts that by 2025, “90 per cent of G2000 companies will mandate reusable materials in IT hardware supply chains, carbon-neutrality targets for providers’ facilities and lower energy use as prerequisites for doing business.”
To date, many data centre companies have managed their energy consumption and corresponding carbon emissions through design innovations and energy efficiency measures. Several operators have gone a step further by committing to 100 per cent renewable energy and carbon neutrality, with some aligning to the European Union’s (EU) Green Deal, which calls for carbon-neutral data centres by 2030. In 2021, the digital economy is expected to continue expanding and accelerating, placing data centres in a key position, with a responsibility to drive a positive environmental impact.
In 2021, we will see movement toward the first major “grid-and-sustainability”-positive data centre projects.
There are a variety of ways that data centres can make an environmentally positive impact. At one end are indirect opportunities, such as influencing the development of hosting platforms that can accurately combine weather forecasts, patterns of usage and demand, and capabilities such as load shedding. At the other are specific actions, including leveraging large on-site energy storage solutions as flexible, instantaneous power sources, or utilizing waste heat to displace local energy demand.
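To make one of those actions concrete, here is a toy sketch that combines a renewable-supply forecast, a demand forecast and the charge level of on-site storage into an hourly sourcing decision. All figures, names and thresholds are hypothetical.

```python
# Illustrative hourly power-sourcing decision for a data centre; all values are hypothetical.
def plan_next_hour(forecast_renewable_kw: float, forecast_demand_kw: float,
                   storage_charge_kwh: float) -> str:
    surplus_kw = forecast_renewable_kw - forecast_demand_kw
    if surplus_kw >= 0:
        return "run on renewables and charge storage with the surplus"
    deficit_kwh = -surplus_kw  # energy shortfall over the coming hour
    if storage_charge_kwh >= deficit_kwh:
        return "cover the deficit from on-site storage"
    return "shed or defer flexible load, then draw the remainder from the grid"

print(plan_next_hour(forecast_renewable_kw=800, forecast_demand_kw=1000,
                     storage_charge_kwh=500))  # -> cover the deficit from on-site storage
```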
The move from neutrality to positive impact will require a technology-driven approach. It will also require building both global and local ecosystems of highly interested parties. Long term, the convergence of this trend and the emergence of next-generation applications requiring ultra-low roundtrip latencies will result in a change in data centre locations. The next generation of data centres will be decentralized and integrated into communities, serving as resilient ecosystems for compute, connectivity, power and heat.
With an increased focus on sustainability comes a shift toward open data centre infrastructure standards, from design and operation to power management, next-generation fuel cells and cooling. This trend will accelerate data centre innovation and play a vital role in achieving grid positivity by reducing the considerable barriers that equipment providers face in developing platforms to serve mission-critical data centre facilities.
A confluence of factors—including advancement in enabling technologies (in part through more open hardware platforms and better interoperability between vendors); increasing urgency in resolving climate change through the development of renewable energy sources and integration into wholesale power markets; the associated challenges of storage and new platforms requiring ever-lower, end-to-end latency; and the need to drive compute and network resources closer to the edge—will spawn a new generation of grid-positive data centre projects.
Shaping digital infrastructure for the future
While we hope 2021 will be more predictable and less surprising than 2020 was, it’s quite clear that there’s no going back to the way things were. Digital growth and acceleration are here to stay, and with that realization comes the need for digital leaders to embrace the technologies and trends that will give their organizations a clear advantage. Business and technology leaders who understand and embrace these macro trends will be better prepared to contribute to our ever-changing future.
Learn more about these and other trends driving digital growth, and how to turn them to your advantage. Download the Global Interconnection Index Volume 4 now.