AI can help deliver public safety at the Glasgow 2026 Commonwealth Games and beyond, writes Nicola Leuchars
When Glasgow hosts the Commonwealth Games this summer, it will welcome thousands of athletes and hundreds of thousands of spectators over an 11-day programme of sport and culture. Delivering an event of this scale is a major undertaking. Behind the scenes, organisers must manage complex challenges around crowd safety, transport, digital infrastructure and resilience while maintaining a positive experience for visitors, businesses and communities.
To meet these demands, event organisers and public authorities are increasingly turning to artificial intelligence (AI). From monitoring crowd movements to protecting critical systems from cyber threats, AI is becoming central to how large events are planned and delivered. While the technology offers clear advantages, it also raises important questions around privacy, ethics and trust that businesses cannot afford to ignore.
Among AI’s most visible uses at major sporting and cultural events worldwide is crowd monitoring. AI-enabled cameras can analyse live footage to detect unusual movement, congestion or behaviour that could indicate a developing risk, such as a brewing fight. Rather than relying solely on human monitoring, these systems flag issues early, allowing security teams to intervene before problems escalate.
Beyond fixed cameras, AI is being deployed through aerial surveillance. At large UK football matches, police have used AI-integrated drones to assess crowd density around stadiums and transport hubs, identifying bottlenecks or surges in footfall to support faster, better-informed decisions and prevent dangerous overcrowding.
Cyber security is another area where AI can help. Major events rely on complex digital systems including ticketing, accreditation and communications networks, all of which can be targets for cyberattacks. Machine-learning tools can analyse vast volumes of network data in real time, detecting anomalies that signal an attempted breach and allowing technical teams to respond quickly. For events on a global stage, digital resilience is just as important as on-the-ground security.
Facial recognition technology is another developing area. Internationally, some venues use it to streamline entry or prevent known troublemakers from entering. In the UK, its use remains sensitive, but police have increasingly deployed it at Premier League matches, in a shift towards data-driven security.
These developments extend beyond sporting bodies and law enforcement. Many of the challenges faced at major events are shared across sectors. Retailers, venue operators, transport providers, technology suppliers and sponsors all have a stake in how AI is used, particularly where customers, employees or the public are affected.
AI can offer businesses greater efficiency, improved safety and better insights. However, adopting these tools without planning can expose organisations to legal, ethical and reputational risks.
Regulatory compliance is critical. Many AI systems rely on processing personal data, which brings data protection obligations into play. Businesses must be clear about why AI is being used, what personal data it will process and whether that processing is lawful. They must also ensure that processing is necessary, proportionate and transparent, and put appropriate safeguards in place.
Ethical considerations are equally important. Surveillance technologies, in particular, can undermine public trust if used without clear justification or communication. Organisations must balance security benefits with respect for individual rights. Equality and bias should also be addressed. High-profile legal cases have shown the risks of deploying AI without understanding how systems are trained or whether they produce biased outcomes.
The risks associated with AI can be managed. Key steps include working with reputable suppliers, understanding how systems operate, ensuring human oversight and regularly reviewing performance.
Nicola Leuchars is Managing Associate, TLT. This article was jointly written by Caroline Loudon, Partner, TLT