
UK & EMEA firms cut AI, IT budgets as cloud costs soar

Tue, 15th Jul 2025

Businesses across the UK and EMEA are experiencing increased cloud infrastructure costs and difficulties in realising measurable returns from artificial intelligence investments, according to a recent study conducted by Akamai.

The research highlights broad concern about the disconnect between rising cloud and AI spend and organisations' ability to track tangible returns on those investments. The study, which surveyed 750 IT decision-makers in the UK, France, and Germany, shows that this financial strain is prompting companies to cut AI, cybersecurity, and IT staffing budgets to offset the rising cost of cloud services.

Cloud costs and provider satisfaction

Findings indicate that only 35% of EMEA businesses stay with their current cloud provider because they are satisfied with it, leaving the majority open to alternatives. Nonetheless, 67% of respondents expect cloud costs to keep rising over the coming year, and 42% expect those increases to exceed 10%. The biggest contributors to these rising expenses include cloud storage, analytics services (39%), and AI-related services (37%).

Despite this dissatisfaction and concern about rising costs, the prospect of migrating data and applications is seen as daunting: 41% believe the cost and complexity of migration outweigh the potential benefits of switching providers.

Impact on IT and innovation

Rising cloud costs are causing over two-thirds (68%) of businesses to reduce spending in other IT areas. As AI demands fuel further cloud spend, budgets for new AI initiatives (26%), cybersecurity (26%), and IT staffing (24%) are being reduced. One in five organisations described their cloud computing expenses as "unmanageable."

"Cloud spending is growing fast - exponentially for some - and it's holding businesses back from investing in growth and innovation. This is especially true with AI, where businesses are struggling to squeeze ROI out of their investments. Against this backdrop, cloud hyperscalers continue with contract lock-in and egress pricing, which means keeping cloud costs under control is impossible for many," said James Kretchmar, Global CTO, Cloud Technology at Akamai Technologies.

AI investment versus ROI

While 65% of organisations intend to increase AI investment in the next year, most have yet to adopt robust strategies for tracking AI returns. A notable 82% of those surveyed reported lacking a method for measuring the return on investment (ROI) for AI projects. Only 11% claim their AI initiatives are currently self-sustaining through measurable cost or productivity gains, and just 25% say their AI budgets fully support the scope of projects they wish to pursue.

Kretchmar commented further: "Leaders need to take a hard look at where they're spending and what outcomes they expect. Traditional ROI models don't map neatly to AI - productivity alone isn't enough. Companies have to prioritise the quality of outcomes and use the right tool for the job. This includes looking beyond the legacy cloud providers to those architected for performance-sensitive applications like inference."

Other findings

The study also reports that only 14% of participating organisations are exploring advanced AI applications or have fully integrated AI across their operations. The current geopolitical landscape is shaping provider choice, with 67% searching for cloud partners offering greater data sovereignty features. In response to regulatory changes such as the EU AI Act, 57% are planning to invest in AI governance and compliance solutions, prioritising EU-based suppliers.

AI inference and edge computing

AI inference, the process of applying trained AI models to real-time data, is cited as the fastest-growing area of artificial intelligence. It has become increasingly important to businesses seeking automation, predictive analytics, and timely decision-making. Unlike AI training, which builds a model from historical data, inference applies the learned patterns to live data in real operational settings.

To better manage performance and costs, organisations are turning to distributed, edge-native infrastructure. Running AI inference closer to where data is generated, rather than relying on centralised cloud platforms, helps reduce latency and aligns investments more directly with operational needs.
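As a rough illustration of that architectural choice (not drawn from the Akamai study), the Python sketch below contrasts running inference against a model deployed on an edge node with sending the same request to a centralised cloud endpoint; the model file and the endpoint URL are placeholders assumed for the example.

# Illustrative only: compares local (edge) inference with a remote cloud call.
# "model.onnx" and the endpoint URL are placeholders, not real artefacts.
import time

import numpy as np
import onnxruntime as ort   # pip install onnxruntime
import requests             # pip install requests

EDGE_MODEL_PATH = "model.onnx"                                  # assumed local model export
CLOUD_ENDPOINT = "https://inference.example.com/v1/predict"     # hypothetical central service


def predict_at_edge(features: np.ndarray) -> np.ndarray:
    # The trained model runs next to where the data is generated: no network hop.
    session = ort.InferenceSession(EDGE_MODEL_PATH)
    input_name = session.get_inputs()[0].name
    return session.run(None, {input_name: features.astype(np.float32)})[0]


def predict_in_cloud(features: np.ndarray):
    # The same features travel over the network to a centralised platform.
    response = requests.post(CLOUD_ENDPOINT, json={"features": features.tolist()}, timeout=5)
    response.raise_for_status()
    return response.json()["prediction"]


if __name__ == "__main__":
    sample = np.random.rand(1, 4)   # stand-in for a real-time sensor or event reading

    start = time.perf_counter()
    print("edge prediction:", predict_at_edge(sample))
    print(f"edge latency: {(time.perf_counter() - start) * 1000:.1f} ms")

    start = time.perf_counter()
    print("cloud prediction:", predict_in_cloud(sample))
    print(f"cloud latency (including network): {(time.perf_counter() - start) * 1000:.1f} ms")

The difference between the two timings is essentially the network round trip, which is the latency the edge-native approach described above is designed to remove.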

The Akamai study underscores the changing priorities of IT leaders as they navigate escalating cloud costs, the complexity of AI ROI, and new regulatory demands.
