Building AI From the Ground Up: Why Network Transformation Is the Critical First Step to Successful AI Adoption

May 1, 2025

AI is founded on access to data—and lots of it. As adoption grows, AI will force a reshaping of corporate networks, since legacy infrastructure simply can’t provide the flexibility, capacity, and performance that this data-hungry technology demands. But with deployments coming in so many shapes and sizes, how can enterprises build a network that not only supports AI today but is also ready for the demands of future AI models?

It’s hard to think of a technology that’s generated as much hype as AI. More than two-thirds (69%) of businesses are preparing to adopt AI or are already using it at scale, though how they’re deploying it is likely to continue evolving.

The focus for the past few years has been on GenAI, and many enterprises have begun exploring the possibilities of large language models, which are a subset of GenAI and focus on language-based tasks.

These neural networks are trained on large datasets and can generate content, summarize text, and create or review programming code, helping to boost productivity and provide better insights for decision-making.

They’re gaining momentum in the enterprise world, but we may soon see a shift to small language models, which are simpler to manage, more cost-effective, and faster to train.

However, GenAI isn’t making decisions or acting alone, and that’s why autonomous agentic AI is now garnering interest. Gartner predicts that one-third of enterprise software applications will include agentic AI by 2028, compared to less than 1% in 2024.

Agentic AIs are systems that can independently solve problems and make decisions to achieve certain goals, even across longer and more complex workflows.

In retail, for example, an agentic AI could identify where stock is running low, establish that the normal supplier is out of stock or has increased its prices, source alternative suppliers with acceptable costings, and then make the order and update logistics, tracking, and inventory systems.
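The retail workflow above can be sketched in a few lines of Python. This is a purely illustrative toy: the inventory figures, supplier list, thresholds, and the `restock_agent` function are all hypothetical, and a real agentic system would call live retail, procurement, and logistics APIs rather than in-memory dictionaries.

```python
# Illustrative sketch of the agentic restocking workflow described above.
# All data, names, and thresholds are made up for this example.

REORDER_THRESHOLD = 20   # reorder when stock falls below this level
MAX_UNIT_PRICE = 5.00    # highest acceptable price per unit

inventory = {"sku-123": 12}  # units currently on hand
suppliers = [
    {"name": "usual-supplier", "in_stock": False, "unit_price": 4.50},
    {"name": "alt-supplier-a", "in_stock": True,  "unit_price": 4.80},
    {"name": "alt-supplier-b", "in_stock": True,  "unit_price": 6.10},
]

def restock_agent(sku):
    """Decide whether and where to reorder a SKU, returning the order placed."""
    if inventory[sku] >= REORDER_THRESHOLD:
        return None  # stock is fine; no action needed
    # The usual supplier is out of stock, so source alternatives
    # that have stock at an acceptable price.
    candidates = [s for s in suppliers
                  if s["in_stock"] and s["unit_price"] <= MAX_UNIT_PRICE]
    if not candidates:
        return None  # a production agent would escalate to a human here
    chosen = min(candidates, key=lambda s: s["unit_price"])
    order = {"sku": sku,
             "qty": REORDER_THRESHOLD - inventory[sku],
             "supplier": chosen["name"]}
    # In practice the agent would also update logistics, tracking,
    # and inventory systems at this point.
    return order

print(restock_agent("sku-123"))
```

The point of the sketch is the chain of independent decisions: check stock, rule out the usual supplier, compare alternatives, then act—no human in the loop at any step.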

Few CIOs Believe Their Networks Are Ready for AI

AI can only realize its full potential if it’s underpinned with robust connectivity—but only one in seven CIOs think their enterprise networks are fully ready for that.

The main reasons that enterprises say their networks are unprepared for AI are:

  • Network inability to scale flexibly on demand: 38%
  • Network performance (e.g., application responsiveness or latency): 38%
  • Lack of bandwidth for very large data transfers: 29%
  • Inadequate connectivity to the cloud or between clouds: 27%

But what demands does AI place on connectivity that leave the enterprise network so unprepared?

AI Can Touch Almost All Parts of Enterprise Infrastructure

An AI model only offers value if it has access to data.

This data might be generated in a variety of locations—within the enterprise, in the data center or cloud, or at the edge. Think of all the information, for example, gathered by customer service applications running in the cloud, intelligent traffic sensors in a smart city, or predictive maintenance sensors in a factory. This data then needs to be stored somewhere too.

This is only the beginning of the story, though.

Training AI models typically demands a large amount of computing power, so it’s often based in the cloud. Once an AI model has been trained, it goes live and is applied to real-world scenarios (this is called inferencing).

Again, it may be deployed within the cloud, at the edge, or on premises (this option is particularly attractive for organizations that need to comply with strict data privacy and security regulations).

What all these stages have in common is connectivity. For AI to deliver on its promises, the network needs to be able to move data smoothly between all these different sources and destinations, and keep it safe and secure at all times.

Different AI Models Demand Different Network Characteristics

There are so many variables in the demands AI might place on the network that there’s no single “ideal” network setup in terms of bandwidth, performance, and latency.

Data might be processed all at once in large batches—a retail store might analyze customer browsing data overnight to create more personalized recommendations, for example.

Depending on where this information is gathered and stored, this model might create regular, very large bursts of network traffic that demand a lot of bandwidth, but latency is less important, as the analysis isn’t happening in real time.

Real-time inferencing, on the other hand, harnesses new data instantly for immediate insights and decisions. AI-powered banking fraud detection systems, for example, need real-time inferencing to identify fraudulent transactions before they’re completed. Here, high network reliability and low latency are vital.
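To see why low latency matters so much for real-time inferencing, consider a back-of-the-envelope latency budget for a fraud check. Every figure below is an illustrative assumption, not a vendor benchmark: the idea is simply that whatever time the model and data lookups consume comes out of a fixed window, and the network must fit in what remains.

```python
# Back-of-the-envelope latency budget for real-time fraud scoring.
# All figures are illustrative assumptions, not measured values.

total_budget_ms = 200      # assumed window before the transaction completes
model_inference_ms = 40    # assumed scoring time on the inference host
feature_lookup_ms = 30     # assumed feature-store/database reads

# Whatever remains is all the network gets, end to end.
network_budget_ms = total_budget_ms - model_inference_ms - feature_lookup_ms
print(f"Total network budget: {network_budget_ms} ms")

# With two round trips (client -> edge, edge -> scoring service),
# each one must stay under roughly:
per_trip_ms = network_budget_ms / 2
print(f"Per-round-trip budget: {per_trip_ms:.0f} ms")
```

Under these assumptions the network gets about 130 ms in total; a congested or high-jitter legacy link can blow through that on a single round trip, which is why reliability and latency, not raw bandwidth, dominate this class of workload.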

If AI models are deployed on edge devices, like smartphones and IoT sensors, network traffic is kept to a minimum. However, there will still be flows of data from these devices to centralized systems, so connectivity is still important.

Cloud-to-cloud connectivity plays a part too. The majority (94%) of enterprises are already using multiple clouds, and it’s reasonable to assume that they’ll continue to harness a multicloud approach for AI. But connecting multiple different clouds can lead to complexity and poor visibility, impacting network performance.

Legacy networks simply can’t provide the performance, security, and simplified multicloud connectivity that effective AI deployments demand. Instead, enterprises need to adopt a flexible unified network architecture model.

SD-WAN and SASE Provide Centralization and Consistency

Software-defined wide area networking (SD-WAN) provides centralized control over a virtualized network architecture. This simplifies enterprise infrastructure, significantly improves visibility and performance, and allows consistent network and security policies to be rolled out across the whole network, irrespective of the access technologies being used.

SD-WANs are simpler to manage compared to traditional WANs and provide much greater flexibility in response to the dynamic needs of AI-driven applications. For example, an SD-WAN can dynamically adjust routing and bandwidth and prioritize time-sensitive and mission-critical apps, enabling consistent performance even when there are peaks in traffic or the AI system is using a lot of capacity.
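The path-selection idea behind that traffic steering can be sketched as a simple policy check. This is a minimal illustration, not any vendor's SD-WAN logic: the link metrics, application classes, and the `select_link` function are all invented for the example, and a real controller would measure link health continuously and re-route on the fly.

```python
# Minimal sketch of the path-selection idea behind SD-WAN traffic steering.
# Link metrics and application policies are made up for illustration.

links = {
    "mpls":      {"latency_ms": 20, "free_mbps": 50},
    "broadband": {"latency_ms": 45, "free_mbps": 400},
}

# Per-application policy: maximum tolerable latency and minimum bandwidth.
policies = {
    "fraud-scoring": {"max_latency_ms": 30,  "min_mbps": 10},
    "batch-sync":    {"max_latency_ms": 500, "min_mbps": 200},
}

def select_link(app):
    """Pick the lowest-latency link that satisfies the app's policy."""
    policy = policies[app]
    ok = [(name, m) for name, m in links.items()
          if m["latency_ms"] <= policy["max_latency_ms"]
          and m["free_mbps"] >= policy["min_mbps"]]
    if not ok:
        return None  # a real controller might shape traffic or queue instead
    return min(ok, key=lambda kv: kv[1]["latency_ms"])[0]

print(select_link("fraud-scoring"))  # latency-sensitive traffic
print(select_link("batch-sync"))     # bandwidth-hungry traffic
```

With these invented metrics, the latency-sensitive app lands on the low-latency link while the bulk transfer takes the high-capacity one—the same decision re-evaluated continuously is what lets an SD-WAN keep mission-critical apps consistent through traffic peaks.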

Built-in network security is essential, too, to keep data, users, and applications secure from cyber threats and make sure regulatory requirements are met.

Secure access service edge (SASE) combines SD-WAN with a range of robust next-generation security services, delivering consistent security coverage across all users, locations, and connectivity types.

One of the key benefits of these software-defined architectures is that they’re highly adaptable and can accommodate new AI use cases as they evolve and begin to be adopted.

Diverse Connectivity Options Are Needed for Diverse Use Cases

Having access to a broad range of connectivity options helps to accommodate the varying needs of different AI deployments.

For example, a factory might use AI-powered quality control systems, which harness edge computing to look for defects at inspection points throughout the process, analyze issues, and identify remedial action.

These systems might then use private 5G to securely communicate any necessary adjustments to individual pieces of equipment or to broader workflows (such as adjusting operating temperatures or slowing conveyor belts to allow for issues further ahead in the workflow).

This process takes place at the edge—in other words, within the facility itself—but high-level performance insights might then be communicated to head office via 5G or wired connectivity to help with decisions about efficiency and productivity.

A Future-Ready Network Doesn’t Come Ready-Made

These network solutions aren’t one-size-fits-all. A successful implementation demands a broad range of security and networking skills and expertise, plus the ability to marry these with a comprehensive view of an organization’s existing technology, ways of working, and business-level goals and strategies.

This can be a complex process, so many enterprises are relying on technology partners to guide them through the maze of network transformation. This approach relieves organizations of the burden of finding the skills, experience, and resources to build an AI-ready network and allows them to focus instead on realizing the true value of this indispensable technology.