
Robot Student Counsellor


LawBot is a chatbot that helps users determine whether they have been the victim of a crime. The online tool covers 26 different criminal offences and takes users through a series of questions designed to guide their next steps, whether that is consulting a solicitor or contacting the police.

The service was set up by a group of Cambridge law students earlier this year, following on from a volunteer sexual consent class they ran in a school, and LawBot was designed and built in just six weeks.

The program is still in beta, but Managing Director Ludwig Bull hopes it will help victims of crime, at Cambridge and beyond, to get justice. “A victim can talk to our artificially intelligent chatbot, receive a preliminary assessment of their situation, and then decide which available actions to pursue,” he said.

Bull explains that he was motivated by alarming figures on sexual assault in the UK, where an estimated two thirds of offences go unreported, according to the Rape, Abuse and Incest National Network (RAINN). The problem is especially urgent at universities, with the scale of abuse recently likened to the cases of Jimmy Savile or the Catholic Church.

“Sexual assault was the first kind of offence we dealt with,” he says. “Much confusion surrounds the nature of consent. Individuals may feel as though they have been assaulted, but not actually be sure of their legal position.”

The program uses randomised, “empathetic” language choices, written in consultation with real-life therapists and psychologists. But the bot is unlikely to put flesh-and-blood criminal defence lawyers out of work any time soon. It has more in common with DoNotPay, the chatbot lawyer that helps with parking ticket appeals, designed by 19-year-old Joshua Browder. One difference, however, is that LawBot’s creators aim to address a wide range of criminal offences.

Given the complexity of the law, and the preference for simplicity when it comes to AI programming, it’s an ambitious project. “We’d like to expand to other areas of civil law, and we’re already in touch with German universities,” Bull says. “But we’re not out to make a program that provides a too-complex analysis. We really want to keep it as a starting point for victims.”

As Rebecca Agliolo, the Marketing Director, points out, “At present, the majority of legal tech’s applications are for the benefit of large law firms (such as IBM Watson’s Ross). LawBot has revealed an as-yet untapped component of this emerging market: the application of legal technology for the benefit of not only individuals, but victims of crime.”

In the four weeks since its launch in September, the site has hosted more than 38,000 bot interactions.

How does it work?

LawBot runs on an AIML (Artificial Intelligence Markup Language) script. Based on the user’s request, LawBot determines which question to ask next, and this process repeats until the script finishes and a result is reached. In that sense, LawBot is comparable to a decision tree.

The difference between an AIML chatbot and a plain decision tree is that AIML allows a greater scope of interaction. This is reflected in the current ability to say “define x” at any point, and similar features can be added in the future. LawBot is, in effect, an interactive decision tree.
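The interplay between a fixed question flow and wildcard requests like “define x” can be sketched as follows. This is a minimal illustration in Python, not LawBot’s actual AIML script; the categories, questions, and definition text are hypothetical stand-ins.

```python
# Hypothetical AIML-style matching: exact patterns advance the decision
# tree, while a "DEFINE *" wildcard answers definition requests mid-flow.
CATEGORIES = {
    # pattern -> (affirmation, next question in the tree)
    "YES": ("I'm sorry to hear that.", "Did the incident involve physical contact?"),
    "NO": ("Thank you for clarifying.", "Did you feel threatened at any point?"),
}

DEFINITIONS = {
    "CONSENT": ("Agreement by choice, with the freedom and capacity "
                "to make that choice."),
}

def respond(user_input: str) -> str:
    text = user_input.strip().upper()
    if text.startswith("DEFINE "):          # wildcard pattern: "DEFINE *"
        term = text[len("DEFINE "):]
        return DEFINITIONS.get(term, "No definition stored for that term.")
    if text in CATEGORIES:                  # exact pattern match
        affirmation, next_question = CATEGORIES[text]
        return f"{affirmation} {next_question}"
    return "Sorry, I didn't understand. Please answer yes or no."
```

The wildcard branch is what distinguishes this from a rigid decision tree: the user can step outside the question flow for a definition and then continue.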

The AIML script is coupled with the interface’s command system, which allows the site to display non-text elements such as pictures, maps, and hyperlinks. At certain questions, the command system intercepts the bot’s response to the user’s request and returns the desired element.
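One way to picture this interception is a rendering layer that scans each bot response for a command token and swaps in a rich element. The token name, URL, and element shape below are assumptions for illustration only.

```python
# Hypothetical command layer: bot responses may embed a token such as
# "{SHOW_POLICE_LINK}", which the interface replaces with a rich element.
COMMANDS = {
    "SHOW_POLICE_LINK": {
        "type": "hyperlink",
        "url": "https://www.police.uk/",      # illustrative URL
        "label": "Contact your local police",
    },
}

def render(bot_response: str) -> dict:
    for token, element in COMMANDS.items():
        marker = "{" + token + "}"
        if marker in bot_response:
            # Strip the token from the text and attach the element.
            text = bot_response.replace(marker, "").strip()
            return {"text": text, "element": element}
    return {"text": bot_response, "element": None}
```

Responses without a token pass through unchanged, so plain questions in the decision tree are unaffected by the command system.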

LawBot is also able to correct common typos automatically and recognise synonyms for the usual user requests. For example, LawBot will understand that “yup” means “yes.” It also uses basic word-stemming: “define psych harm” is understood as “define psychological harm.” Both techniques are important for user-friendliness, although LawBot’s current ability to apply them is limited.
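A simple normalisation pass covering both techniques might look like the following. The lookup tables here are invented examples, not LawBot’s actual synonym or stemming lists.

```python
# Illustrative input normalisation: map synonyms to canonical tokens,
# then expand clipped stems to full words (assumed tables, not LawBot's).
SYNONYMS = {"yup": "yes", "yeah": "yes", "nope": "no"}
STEMS = {"psych": "psychological"}

def normalise(user_input: str) -> str:
    words = user_input.lower().split()
    words = [SYNONYMS.get(w, w) for w in words]   # synonym substitution
    words = [STEMS.get(w, w) for w in words]      # stem expansion
    return " ".join(words)
```

Running the decision-tree matcher on the normalised string rather than the raw input is what lets “yup” trigger the same branch as “yes.”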

LawBot also uses randomised responses to ensure that conversations feel unique. Depending on where the user currently is in the decision tree, the bot’s response includes an affirmation or, sometimes, an emotive reaction to the request.
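This kind of variation can be sketched by drawing a prefix at random from a phrase pool before the next question. The phrase pools below are hypothetical; the article notes LawBot’s actual wording was written in consultation with therapists and psychologists.

```python
import random

# Hypothetical phrase pools for randomised, "empathetic" responses.
AFFIRMATIONS = ["I understand.", "Thank you for telling me.", "Okay."]
EMPATHETIC = ["That sounds distressing.", "I'm sorry that happened."]

def build_response(question: str, emotive: bool = False) -> str:
    # Pick an emotive reaction at sensitive points in the tree,
    # otherwise a neutral affirmation.
    prefix = random.choice(EMPATHETIC if emotive else AFFIRMATIONS)
    return f"{prefix} {question}"
```

Because the prefix is sampled per turn, two users following the same path through the tree still see slightly different conversations.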

LawBot does not use machine learning, and it is not artificially intelligent in the sense of learning from its interactions. It is better described as an interactive decision tree that helps victims reach the next step of the legal process.

What does the future hold?

The LawBot team is expanding into new areas of expertise, with contract and family law currently in focus and more to come. “The possibilities are endless, and we are learning more about this market and our users’ needs every day. LawBot, and initiatives like it, are here to stay,” said Agliolo.