Earlier this year, Joshua Browder, CEO of the AI startup DoNotPay, attempted to introduce a robot lawyer into a California courtroom, despite knowing that the practice was prohibited in nearly all 50 states.
DoNotPay markets itself as the "world's first robot lawyer" and aims to democratize legal information and self-help services. Its offerings target people who cannot afford traditional legal help, covering tasks such as contesting medical bills, disputing bank fees, and challenging credit reports. The company says it has helped more than 160,000 people contest parking tickets in New York and London.
According to Gillian Hadfield, a law professor and director of the Schwartz Reisman Institute for Technology and Society at the University of Toronto, the robot lawyer was barred from the California courthouse because, in every state except Utah, only lawyers with bar licenses are authorized to provide legal assistance.
Andrew Perlman, the dean and a law professor at Suffolk University Law School, notes that Browder's attempt with DoNotPay is indicative of what is on the horizon. He believes that certain legal services, especially routine legal tasks, can be automated and delivered using tools such as LegalZoom, which is already available to consumers.
Many people believe that such assistance is desperately needed, especially in the United States, where a study by the Legal Services Corporation (2022) found that low-income Americans do not receive sufficient legal assistance for 92% of their civil legal issues. The study revealed that nearly half of those surveyed do not seek help due to high legal costs, and 53% doubt their ability to find an affordable lawyer if they require one.
Perlman calls this "access-to-justice gap" a critical issue and says automated tools can play a crucial role in closing it.
The use of AI in the courtroom may be inevitable, and it could eliminate human biases and errors from the legal system, according to Terence Mauri, an AI expert and founder of Hack Future Lab. The result, he suggests, could be a new and more equitable form of digital justice, in which human emotions and biases no longer influence legal outcomes.
For Hadfield, the most exciting aspect of AI is its potential to democratize legal services. She suggests that while AI can already lower the cost of legal services in the corporate sector, its real impact will be in addressing the significant access-to-justice gap that exists today.
However, there is still work to be done before AI can be widely used in the courthouse. Technical errors are not tolerated in the legal field, and the stakes are too high. John McGinnis, a law professor at Northwestern University, explains that while AI like ChatGPT can often summarize the law correctly, it still makes mistakes, which could be problematic in a courtroom setting.
Hadfield has been working in Utah and other areas to establish regulatory regimes for licensing providers other than lawyers to offer some legal services. She believes that consumer access to legal services is crucial for fairness and is becoming increasingly feasible with the rapid advancement of technology. The goal, according to Hadfield, is to establish standards to ensure that AI tools like DoNotPay are licensed in a way that benefits users.
The potential of AI to provide affordable and accessible legal services may not be limited to the US alone. The developing world, in particular, could benefit from AI-powered solutions. According to a study by Boston Consulting Group (BCG) on the use of AI in government, people in less developed economies with higher levels of corruption are more likely to support the use of AI. Countries such as India, China, and Indonesia showed the strongest support for government AI applications, while Switzerland, Estonia, and Austria showed the weakest.
Chesterman notes that human lawyers and judges will still be necessary for more complex legal issues in the near future. The BCG survey also revealed that most global respondents did not approve of using AI for sensitive decisions related to the justice system, such as making recommendations for parole boards or sentencing.