26 September 2023

How AI is changing data infrastructure


Arthur Cole* says AI will have an impact on how data infrastructure is consumed by new generations of smart applications and services.


The impact artificial intelligence (AI) is having on enterprise data processes and workloads is well-documented, as is its capability to monitor and manage complex systems.

But what is not widely recognised at this point is how AI will change data infrastructure itself, not just in design and architecture but also in how it is consumed by new generations of smart applications and services.

While infrastructure may seem immutable, the fact is that even the physical plant is highly dynamic, right down to the processing capabilities in servers and networking devices and the media used for storage.

Virtualization has only added to this dynamism, to the point where infrastructure can be quickly tailored to meet the needs of any workload.

Changing containers

The latest twist on virtualization is containers, and as The Enterprisers Project’s Kevin Casey showed in a recent report, running AI at scale requires some retooling at the container level.

For one thing, AI workloads require a lot of data gathering and processing up front, before you even get to the training.

Once the model hits production, it must be supported with performance monitoring, metrics and a host of other services.

Containers can certainly streamline these processes, but they must be optimized for consistency and repeatability to provide the maximum benefit to the workload.
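
To make that concrete, here is a minimal sketch, in Python, of the kind of entrypoint a containerised AI job might run. Every name in it is illustrative rather than drawn from any particular framework; the point is simply that pinning seeds and fingerprinting input data is what turns a container from merely portable into genuinely repeatable.

```python
# A minimal sketch of an AI container entrypoint, aimed at the
# consistency and repeatability noted above: every run pins its random
# seed and records a fingerprint of its input data, so the same image
# plus the same data yields the same result. All names (prepare_data,
# SEED, etc.) are illustrative assumptions, not a real framework's API.

import hashlib
import json
import random

SEED = 42  # fixed seed so repeated runs are comparable

def fingerprint(records: list[str]) -> str:
    """Hash the input data so each run records exactly what it saw."""
    digest = hashlib.sha256()
    for record in sorted(records):  # sort for order-independence
        digest.update(record.encode("utf-8"))
    return digest.hexdigest()

def prepare_data(records: list[str]) -> list[str]:
    """Deterministic shuffle: same seed + same data -> same order."""
    rng = random.Random(SEED)
    shuffled = list(records)
    rng.shuffle(shuffled)
    return shuffled

if __name__ == "__main__":
    data = ["sample-a", "sample-b", "sample-c"]
    run_manifest = {
        "seed": SEED,
        "data_fingerprint": fingerprint(data),
        "order": prepare_data(data),
    }
    # Emitting the manifest makes each container run auditable.
    print(json.dumps(run_manifest, indent=2))
```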

At the same time, it’s important to note that containers cannot fix a flawed process.

By themselves, they can’t do anything to correct bias in training data, nor can they produce a desired outcome from a poorly designed algorithm.

All they can do is speed up certain aspects of the workflow.

We can also expect to see some changes in the cloud as AI picks up steam.

Tech designer Kivanic Uslu noted on Towards Data Science recently that cloud delivery models like PaaS, SaaS and IaaS, which were created to handle extremely heavy data loads, are now evolving around the needs of AI.

This means CPU, memory, networking and other resources are becoming available at even greater scale, and with less lead time, than on earlier cloud platforms.

The cloud is also delivering AI capabilities just like any other service, opening them up to an ever-widening pool of users.
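
From the consumer’s side, that looks refreshingly ordinary: the model sits behind an HTTP endpoint and is called like any other web service, with no GPUs or ML stack on the caller’s end. The sketch below assumes a hypothetical hosted endpoint; the URL and payload shape are placeholders, not any real provider’s API.

```python
# A sketch of "AI as just another cloud service": the model sits behind
# an HTTP endpoint and the caller needs no ML infrastructure of its own.
# The endpoint URL and payload here are hypothetical placeholders.

import json
import urllib.request

ENDPOINT = "https://example.com/v1/sentiment"  # hypothetical hosted model

def classify(text: str) -> dict:
    """POST text to the hosted model and return its JSON response."""
    payload = json.dumps({"text": text}).encode("utf-8")
    request = urllib.request.Request(
        ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)

if __name__ == "__main__":
    print(classify("Edge computing is moving fast."))
```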

But most of the real action is happening out on the edge, so much so that observers like TechTalks’ Ben Dickson claim the cloud is actually the bottleneck to greater AI implementation.

When we begin to fathom the Internet of Things and its always-on, always-connected, always-running service environment, even the cloud will have trouble keeping up.

This is why start-ups like Seattle’s Xnor.ai are working toward edge AI (aka fog computing), which seeks to delink the cloud from the AI data chain.

The expectation is that, by doing so, AI will produce new generations of services even as the edge itself becomes faster and less costly, since it won’t require expensive networking equipment at each node.
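
The pattern is straightforward to sketch: run inference on the device itself and defer only the genuinely ambiguous cases to the cloud, so most data never leaves the node. The stand-in model and confidence threshold below are illustrative assumptions, not Xnor.ai’s actual design.

```python
# A sketch of the edge-AI pattern described above: classify locally and
# fall back to the cloud only when the on-device model is unsure, so
# most data never incurs a network round-trip. local_model and
# CONFIDENCE_FLOOR are illustrative stand-ins, not a real product.

CONFIDENCE_FLOOR = 0.8  # below this, defer to the cloud

def local_model(reading: float) -> tuple[str, float]:
    """Stand-in for on-device inference: returns label plus confidence."""
    if reading > 0.7:
        return "anomaly", 0.95
    if reading < 0.3:
        return "normal", 0.9
    return "normal", 0.55  # ambiguous middle band

def classify_at_edge(reading: float) -> str:
    label, confidence = local_model(reading)
    if confidence >= CONFIDENCE_FLOOR:
        return f"{label} (decided on-device)"
    # Only the ambiguous minority of readings goes back to the cloud.
    return f"{label}? (deferred to cloud)"

if __name__ == "__main__":
    for r in (0.9, 0.1, 0.5):
        print(r, "->", classify_at_edge(r))
```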

Backhaul overhaul

That delinking won’t diminish the network, though. On the contrary, it’s precisely why we’ll see significant investment in edge-to-cloud networking, says Jason Carolan, chief innovation officer at managed network services provider Flexential.

No matter how much data remains on the edge, vast volumes of it will still need to return to the cloud and the core for the enterprise to capture its value.

As 5G, Wi-Fi 6 and other formats emerge on the edge, we can expect this backhaul network to increase dramatically in both speed and scale, with backbone capacity pushing into the terabit level.

At the same time, highly dynamic architectures will be needed to handle the considerable spikes in traffic the edge is expected to produce.
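
One common way to tame those spikes, sketched below with assumed batch sizes and summary fields, is for edge nodes to aggregate raw readings locally and backhaul compact summaries on a schedule rather than streaming every data point.

```python
# A sketch of edge-side aggregation before backhaul: raw readings are
# collapsed into compact summaries so the edge-to-cloud link carries one
# record per batch instead of every data point. The batch size and
# summary fields are illustrative assumptions.

from statistics import mean

BATCH_SIZE = 1000  # readings aggregated per backhaul message (assumed)

def summarize(batch: list[float]) -> dict:
    """Collapse a batch of raw readings into one compact record."""
    return {
        "count": len(batch),
        "mean": mean(batch),
        "min": min(batch),
        "max": max(batch),
    }

def backhaul(batches):
    # In a real deployment this would be a network send; here we just
    # show the volume reduction: N readings become one summary each.
    for batch in batches:
        yield summarize(batch)

if __name__ == "__main__":
    readings = [i / 10 for i in range(3000)]
    batches = [readings[i:i + BATCH_SIZE]
               for i in range(0, len(readings), BATCH_SIZE)]
    for summary in backhaul(batches):
        print(summary)
```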

In a connected world, of course, nothing exists in a vacuum.

Changes to one layer of the digital stack will invariably produce changes in the others.

The difference with AI is that it has the capacity to change everything all at once, which can leave some of us struggling to find our footing in the new reality.

Ultimately, of course, we can expect AI-driven changes to infrastructure to be net positive.

Physical architectures should become more streamlined so they perform at higher levels and consume fewer resources.

At the same time, this will enable us to do more things more quickly and accurately, and maybe give the human workforce a little more time for leisure and abstract thinking.

*Arthur Cole is a technology journalist and enthusiast who has been covering the high-tech industry for more than 30 years.

This article first appeared at venturebeat.com.
