
Alert fatigue drives UK IT outages & rising burnout

Tue, 27th Jan 2026

Recent Splunk research reveals that three-quarters of UK IT teams have suffered outages resulting from overlooked critical alerts, a failure largely attributed to alert fatigue, tool sprawl, and ambiguous incident ownership.

The findings, derived from a survey of 300 British ITOps and engineering professionals as part of a wider study on the state of observability, highlight a sector under significant pressure. Respondents identified the sheer volume of notifications, a high frequency of false positives, and fragmented tooling as the primary catalysts for heightened stress levels within IT operations departments.

UK respondents reported a higher-than-average tendency to ignore alerts. Splunk said 15% of UK participants admitted to deliberately ignoring or suppressing alerts, compared with a global average of 13%.

False Alerts

False alerts have emerged as a significant organisational challenge rather than a purely technical grievance. Over half of UK respondents, 54%, reported that false alerts are actively damaging team morale.

When identifying the specific drivers of workplace stress, tool sprawl was ranked as the most significant factor by 61% of British professionals. This was followed by false alerts at 54%, while the sheer volume of notifications was cited as a primary stressor by 34% of those surveyed.

The research linked these pressures to operational resilience. Respondents said missed alerts have real-world consequences in the form of outages and disruption for customers. Splunk said 75% of UK teams had suffered outages as a direct result of missing critical alerts.

Incident Ownership

Splunk also highlighted a gap in incident response practices. Only 21% of UK respondents said they regularly isolate incidents to a specific team. Another 36% said they rarely isolate incidents.

The data points to a lack of clarity over who owns an incident when an alert fires, particularly in environments where multiple teams handle infrastructure, applications and security. Splunk said this ambiguity increases the risk that important security alerts go unaddressed.
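
One way to make ownership explicit, not described in the Splunk research but illustrative of the gap it identifies, is a routing table that maps alert attributes to a single owning team, with anything unmatched escalated to a visible triage queue rather than left ownerless. The Python sketch below is a minimal illustration; the rule fields, team names and alert shape are all hypothetical.

    from dataclasses import dataclass

    @dataclass
    class Alert:
        source: str      # originating tool, e.g. "prometheus"
        service: str     # affected service, e.g. "checkout-api"
        category: str    # "infrastructure", "application" or "security"

    # Ordered routing rules: the first matching predicate decides ownership.
    ROUTING_RULES = [
        (lambda a: a.category == "security", "security-oncall"),
        (lambda a: a.category == "infrastructure", "platform-team"),
        (lambda a: a.service == "checkout-api", "payments-team"),
    ]

    def route(alert: Alert) -> str:
        """Return the owning team, escalating explicitly when no rule matches."""
        for predicate, team in ROUTING_RULES:
            if predicate(alert):
                return team
        # Unmatched alerts land in a visible triage queue rather than being
        # left with no owner -- the failure mode the survey describes.
        return "unowned-triage-queue"

    print(route(Alert("waf", "checkout-api", "security")))       # security-oncall
    print(route(Alert("datadog", "search-api", "application")))  # unowned-triage-queue

The fallback matters as much as the rules themselves: an alert that matches nothing is still assigned somewhere a person will see it, instead of sitting between teams.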

One implication concerns security monitoring, where missing an alert can have consequences beyond service availability. Splunk said unclear ownership, combined with a high volume of alerts, raises the risk of security issues being overlooked while teams triage operational noise.

Splunk framed the issue as both a tooling and organisational challenge. It said IT teams face constant interruptions, and it described stress and burnout as factors shaping how teams respond to alerts.

Petra Jenner, SVP & General Manager, EMEA at Splunk, said the problem starts with volume and a lack of context. "IT teams are drowning in noise. Every day they're hit with alerts, but without the right context or ownership, it's almost impossible to know which ones really matter. This lack of clarity puts a lot of pressure on teams and slows response times," she said.

Jenner added that outages and customer disruption can escalate into wider business impact. "When critical alerts get lost in that noise, organisations risk downtime and customer disruption, which can quickly translate into revenue loss and lasting reputational damage," she said.

Tool Sprawl

The survey results placed tool sprawl at the centre of the stress picture. Respondents said multiple monitoring and observability tools contribute to workload and complicate the process of understanding what an alert means and who should act on it.
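
A common response to tool sprawl, offered here as a general pattern rather than anything Splunk prescribes, is to normalise alerts from each tool into one shared schema so responders work from a single queue instead of one interface per product. In the sketch below the tool names refer to real products, but the payload shapes are simplified assumptions, not the tools' actual formats.

    # Fold alerts from several monitoring tools into one shared schema.
    def normalise(tool: str, payload: dict) -> dict:
        if tool == "prometheus":
            return {"service": payload["labels"]["service"],
                    "severity": payload["labels"]["severity"],
                    "summary": payload["annotations"]["summary"]}
        if tool == "cloudwatch":
            return {"service": payload["ServiceName"],
                    "severity": payload["Severity"].lower(),
                    "summary": payload["AlarmDescription"]}
        raise ValueError(f"no adapter registered for {tool}")

    queue = [
        normalise("prometheus", {"labels": {"service": "checkout", "severity": "critical"},
                                 "annotations": {"summary": "p99 latency above SLO"}}),
        normalise("cloudwatch", {"ServiceName": "search",
                                 "Severity": "WARNING",
                                 "AlarmDescription": "High CPU on search fleet"}),
    ]
    for alert in queue:
        print(alert)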

Splunk's findings also suggest that incident response maturity matters alongside tooling. Teams that consistently isolate incidents to a specific group may reduce ambiguity during a fast-moving outage or security event.

Jenner linked alert fatigue to staff wellbeing and the practicalities of how tools present information and actions. "To build resilience and combat alert fatigue, organisations need to consider the psychological wellbeing of their IT staff and ensure the tools they use genuinely support them. This means observability tools that accurately triage alerts, understand context, suggest clear remediation paths, and reduce the number of interfaces already-stressed teams are required to work with. With the right systems in place, alongside better cross-departmental co-ordination, teams can act quickly, with confidence and avoid the pitfalls of alert fatigue," she said.
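
As a concrete illustration of the triage behaviour Jenner describes, the sketch below scores alerts by severity, how often the source has produced false positives, and whether the affected service is customer-facing, so noisy signals sink down the queue. The field names and weights are invented for illustration and are not drawn from any Splunk product.

    # Score alerts so the queue surfaces what most likely matters.
    def triage_score(alert: dict) -> float:
        severity = {"info": 0.2, "warning": 0.5, "critical": 1.0}[alert["severity"]]
        # Discount sources that have cried wolf before.
        reliability = 1.0 - alert.get("false_positive_rate", 0.0)
        # Customer-facing services outrank internal ones.
        blast_radius = 1.0 if alert.get("customer_facing") else 0.4
        return severity * reliability * blast_radius

    alerts = [
        {"id": "a1", "severity": "critical", "false_positive_rate": 0.8, "customer_facing": False},
        {"id": "a2", "severity": "warning", "false_positive_rate": 0.05, "customer_facing": True},
    ]

    # Highest priority first: the reliable, customer-facing warning (a2)
    # outranks the flaky internal critical (a1).
    for alert in sorted(alerts, key=triage_score, reverse=True):
        print(alert["id"], round(triage_score(alert), 2))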

Security Alignment

Splunk also pointed to closer working between observability and security functions as a factor that can reduce missed alerts. It said stronger collaboration improves how ownership is defined across teams.

Across the wider global research sample, Splunk said 64% of respondents reported that stronger collaboration between observability and security teams reduces customer-impacting incidents. The company said organisations that tighten cross-team co-ordination and improve observability practices report better resilience outcomes.

Splunk said the UK results indicate a higher level of reported alert fatigue than the global average. It said incident ownership, tooling choices and collaboration between operational and security teams will remain areas of focus for organisations that want to reduce outages linked to missed alerts.