Speed Read (Get the headlines here, and read on below)
AI chatbots aren’t just for customer service. New research shows that chatbot communication styles can impact how people report information to the police.
Employed the right way, chatbots can also support sports gambling and integrity efforts, from responsible gambling to incident reporting to post-incident investigations.
The research found that, for chatbots, perceived empathy, friendliness, and more “natural” communication styles can positively impact user experiences.
Chatbots that can consider real-time gambling context can both make users more likely to return to a site or app and build in important safety and security protocols from the start.
In my role as a Research-to-Practice fellow at the University of Nebraska at Omaha’s National Counterterrorism Innovation, Technology, and Education Center (NCITE), I am fortunate to work with some very smart researchers and get a front-row seat to their interesting work. While their focus is not strictly on the sports world or the gambling world, there are always real-world applications for their work. One of those research projects is my subject today.
In a report soon to be published in the academic journal Law and Human Behavior, an NCITE research team tested how AI chatbot communication styles impacted outcomes for reporting information to the police. Many of us, myself included, are familiar with using chatbots for things like customer service. A small box asking if I need any help seems to be a ubiquitous part of the desktop internet experience these days. This is certainly true for sports teams and leagues, along with betting platforms like DraftKings and FanDuel.
At first glance, this research project may not seem applicable at all to the sports gambling world, but I think it actually presents an interesting data point for all members of the sports ecosystem to use if they want to build their tools not just for revenue generation, but also with security in mind. As I have previously written, many investments in sports teams and facilities can have a dual purpose of revenue generation and security. As we look at the research, you’ll see that this is another prime example.
The research starts from the fact that, in real-world interactions between police or other public safety officials and the public, certain styles and ways of interacting generate better results. It also finds that certain societal factors create a barrier to people voluntarily sharing information with the police, or with the government in general. The researchers find that these same barriers largely exist with chatbots, especially as chatbots improve and “conversations” with them become more life-like.
Sports businesses obviously want to tailor their chatbots to their customer base to increase use and reduce human-resources costs. The more likely fans or bettors are to use a chatbot to solve their problems quickly, the more likely they are to have a positive experience with the product, and the lower the need for human customer-service interaction. But the same applies for security and safety reasons. If certain patterns of behavior can be associated with problematic gambling, whether early signs of addiction, indications that someone may be using inside information, or signs that someone is trying to distort a betting market for their own gain, chatbots could be the canary in the coal mine, detecting those patterns in the digital world before they manifest in the physical world.
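To make the "canary in the coal mine" idea concrete, here is a minimal sketch of what a rule-based risk screen over a bettor's recent activity might look like. Everything here is a hypothetical illustration: the event fields, thresholds, and flag names are my own assumptions, not anything drawn from the NCITE research or from any real platform's detection logic.

```python
from dataclasses import dataclass

@dataclass
class BetEvent:
    stake: float   # amount wagered, in dollars (hypothetical field)
    market: str    # market identifier, e.g. "player_props" (hypothetical)
    hour: int      # hour of day the bet was placed, 0-23

def flag_risk_signals(events: list[BetEvent],
                      stake_spike: float = 5.0,
                      late_night_share: float = 0.5) -> list[str]:
    """Return illustrative risk flags for a user's recent betting activity."""
    flags = []
    if len(events) < 2:
        return flags

    stakes = [e.stake for e in events]
    # Flag a sudden jump in stake size relative to the user's own baseline,
    # a pattern sometimes associated with chasing losses.
    baseline = sum(stakes[:-1]) / (len(stakes) - 1)
    if baseline > 0 and stakes[-1] / baseline >= stake_spike:
        flags.append("stake_spike")

    # Flag heavy late-night activity (midnight to 5 a.m.), often cited as
    # a marker worth a responsible-gambling check-in.
    late = sum(1 for e in events if 0 <= e.hour < 5)
    if late / len(events) >= late_night_share:
        flags.append("late_night_pattern")

    # Flag repeated concentration in a single niche market, which could
    # merit an integrity review for possible inside information.
    if len({e.market for e in events}) == 1 and len(events) >= 5:
        flags.append("single_market_concentration")

    return flags
```

A chatbot with access to flags like these would not need to accuse anyone; it could simply adjust its tone, surface responsible-gambling resources, or route the session to a human integrity team for review.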
As the research paper points out, several technological advances allow for anonymity, convenience, and no direct interaction with the authorities. These positively correlate with better information and better communications outcomes. And those are just the advances from webforms before even contemplating GPT-based chatbots. But there may be drawbacks as well. In particular, the researchers point out that, “[u]nlike chatbots developed for entertainment or customer service, a chatbot used to report information to law enforcement must consider the potentially high stress and emotionally laden context in which the user may find themselves.” They go on to say that things like perceived empathy and friendliness, and more “natural” communication styles can positively impact user experiences, at least for e-commerce chatbots.
This seems like a real opportunity for the broader sports industry, but especially the sports gambling and prediction markets industry. Using AI chatbots that can consider the specific gambling context and take an appropriate tone or style with gamblers can achieve two things at once: first, it can improve the user experience, leaving customers happier and more likely to return to the app or website and continue placing bets or trading event futures. Second, it can also build in important safety and security measures from the start that will improve the product, improve trust in the platform’s dedication to customer safety, and improve sport integrity protections.
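One way to sketch the "appropriate tone or style" idea is a simple mapping from a detected conversation context to a response-style profile the chatbot applies. The context labels and style parameters below are purely hypothetical stand-ins; a real platform would derive both from its own classifiers and UX research.

```python
def choose_style(context: str) -> dict:
    """Map a detected conversation context to a chatbot tone profile.

    Context labels and style values are illustrative assumptions only.
    """
    styles = {
        # Routine account or betting questions: keep it light.
        "routine_support": {"tone": "friendly", "formality": "casual"},
        # Signs of distress or risky play: lead with empathy, per the
        # research finding that perceived empathy improves outcomes.
        "responsible_gambling": {"tone": "empathetic", "formality": "neutral"},
        # Reporting a possible integrity incident: empathetic but formal,
        # reflecting the high-stress reporting context the paper describes.
        "integrity_report": {"tone": "empathetic", "formality": "formal"},
    }
    # Fall back to a neutral profile for unrecognized contexts.
    return styles.get(context, {"tone": "neutral", "formality": "neutral"})
```

The design point is the separation: the same underlying chatbot can serve revenue-side customer service and security-side reporting, with only the style profile changing between them.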
As businesses in the sports industry race to find uses for AI to improve their revenue streams or reduce costs, security teams should be simultaneously making the case to their board that security use-cases can similarly save time and money.