
Keeping Innovation Investable: Why Simplicity Wins in Gaming Tech
22 October, 2025
Winna Media Opinions & Editorial
AI can enhance and customise physical and online gaming experiences.
As we count down to Winna Media’s AI in Gaming Investment Seminar on 25 November in Bangkok, it’s time to remember that human beings remain at the heart of any discussion.
Operators are increasingly looking to Artificial Intelligence (AI) to monitor, measure and personalise players’ experiences. Their focus falls into a few key areas: understanding habits and preferences, drawing insights from the way customers act, providing customised experiences and monitoring potentially harmful behaviours.
However, regulators, the media and consumer advocates have also highlighted the downsides: chiefly, how the massive volumes of personal data collected are used, or potentially abused, and how to tread the fine line between personalising an experience and manipulating it.
AI has also been seen to exhibit biases stemming from the data used to build a platform or tool. That could mean players are unfairly classed as low value or high risk. Worse, these decisions can be opaque in their reasoning, which makes them hard to challenge even when they are wrong.
Finally, in an industry in which the regulatory framework changes by the day in one jurisdiction or another, lawmakers are constantly reassessing and usually tightening the rules around AI and gambling.
Let’s return to personalisation.
On the positive side, AI can analyse players’ habits, betting patterns and the size of their wagers, including whether they use autoplay and whether they chase or cut losses. The analytics tools and machine learning algorithms that collect this data can process it almost instantly.
Similarly, in physical casinos facial recognition will ID players while smartcards and wearable technology will track their betting activity. From there, it is a tiny step to clustering individual player data into groups that can be modelled by preferences, predicted behaviour and even their likely emotional responses.
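The clustering step described above can be sketched in a few lines. The feature names, sample values and choice of plain k-means below are illustrative assumptions, not a description of any operator's actual system:

```python
# Hypothetical sketch: grouping players into behavioural segments with a
# minimal k-means. Features and values are invented for illustration.
import random


def dist2(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))


def kmeans(points, k, iters=50, seed=0):
    """Cluster feature vectors into k groups; returns (centroids, labels)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        labels = [min(range(k), key=lambda c: dist2(p, centroids[c]))
                  for p in points]
        # Move each centroid to the mean of its assigned points.
        for c in range(k):
            members = [p for p, lab in zip(points, labels) if lab == c]
            if members:
                centroids[c] = tuple(sum(col) / len(members)
                                     for col in zip(*members))
    return centroids, labels


# Toy player features: (average stake, session minutes, loss-chasing ratio)
players = [
    (2.0, 15, 0.1), (2.5, 20, 0.1),      # low-intensity players
    (50.0, 180, 0.8), (60.0, 200, 0.9),  # high-intensity players
]
centroids, labels = kmeans(players, k=2)
```

In practice the clusters, not the labels, are the product: each segment can then be modelled for preferences, predicted behaviour and likely responses, which is exactly where the commercial upside and the regulatory risk both begin.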
The plus side for operators is in the creation of customised gambling experiences in which the games and even the physical environment can be adjusted to minimise diversions. Individual players can get tailored offers, bonuses and rewards, as well as recommended games that resonate with their preferences. Equally, by keeping a close eye on the players, early signs of problem behaviours can be identified and automated interventions like warnings and enforced cooling-off periods invoked.
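The automated-intervention side of that paragraph often starts as simple rules over session statistics. The thresholds and field names below are invented for illustration; real detection systems combine many more signals and are tuned per jurisdiction:

```python
# Hypothetical sketch of a rule-based early-warning check on one session.
from dataclasses import dataclass


@dataclass
class Session:
    minutes: int               # length of the session
    net_loss: float            # currency units lost this session
    deposits_after_loss: int   # top-ups made straight after a losing bet


def risk_flags(session: Session) -> list[str]:
    """Return the list of risk signals this session trips, if any."""
    flags = []
    if session.minutes > 240:
        flags.append("marathon_session")
    if session.deposits_after_loss >= 3:
        flags.append("possible_loss_chasing")
    if session.net_loss > 500:
        flags.append("high_loss")
    return flags


# A flagged session might trigger a warning message or a cooling-off period.
print(risk_flags(Session(minutes=300, net_loss=650.0, deposits_after_loss=4)))
```

The appeal of explicit rules like these is that they are easy to explain to a regulator; the trade-off is that ML-based detectors catch subtler patterns but are harder to audit.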
Any industry professional can share their personal experience of when something they see as a positive becomes a negative in the opinion of customers, regulators and other stakeholders.
The personalisation cited earlier can be positioned as manipulative if it is felt that subtle nudges are gently, but inexorably, keeping players betting, or that the physical environment, from music to lighting, is being used to build engagement.
If a regulator feels players are being taken down this path, it can attempt to make a case against an operator on the grounds that behavioural insights are being used to maximise profits at the expense of player welfare.
Regulators are also demanding transparency around how algorithms reach their conclusions, and more clarity about how at-risk players can opt out or be identified as problem gamblers by operators’ detection systems.
The fact is that the gaming industry is no different to so many others when it comes to developing and operating best practices around AI and their customers.
Safeguarding measures are well known and relatively straightforward to put into place, including:
- Being open and clear about how players’ data is being used
- Clearly dividing the analytics for engaging players from those used to monitor risk
- Independently auditing the AI platforms for fairness and compliance
- Ensuring human beings continue to oversee the overall operation in a transparent way
As Earle G. Hall, co-chair of the AI and Cybersecurity Policy Committee with the Las Vegas Chamber of Commerce and President of AXES.ai, said in his recent article ‘The Future of Human Value in the Age of AI’: “Interpersonal skills – empathy, communication, emotional intelligence, negotiation – will become the ultimate differentiators. As automation handles the ‘what,’ humans will be valued for the ‘how’: how we influence, how we collaborate, how we inspire action in others.”


