AI in Law (Courtesy of the Jacksonville Business Journal) — Miami-Dade Public Defender Carlos Martinez incorporated artificial intelligence (AI) into his office’s day-to-day workflow out of necessity while facing staff shortages due to low wages and the Covid-19 pandemic.

More than five years later, AI’s use in law is evolving as both seasoned veterans and a new generation of professionals learn to leverage it for research, case preparation, electronic discovery and more.

The emergence of new AI tools is proving to be both helpful and concerning, experts say. The legal industry faces a long-term shift toward technology in a sector that historically has relied on humans to interpret laws and serve as trial lawyers, judges and jurors.

“Imagine if we can use AI to prevent wrongful convictions, rather than somebody being convicted [and] then we figure out we screwed up in trial after they spend 30 years in prison,” Martinez told Orlando Business Journal. “Imagine if you can do it proactively while in court to review the work we’re doing, the work police have done and the things witnesses have said to try to prevent wrongful convictions.

“That’s a big help. Not just to people who work in the criminal justice system, but for society to be able to make sure that we didn’t make any human error,” he said.

The Miami-Dade Public Defender’s Office is recognized as the first government agency in the United States to adopt CoCounsel, an AI legal assistant built on large language models (LLMs) rooted in Westlaw legal research, offered by Toronto-based Thomson Reuters.

The earliest instance of AI use by Martinez’s office centered on the clerical processing of tens of thousands of documents previously compiled manually. Paperwork now arrives via email, and machine learning automatically routes each document to the corresponding client’s folder.

But it hasn’t been entirely easy to navigate for law firms. The risk of relying on AI, specifically incorrect data it can sometimes provide users, came to a head during a case litigated in part by an Orlando firm earlier this year — one that resulted in a court sanction warning attorneys “to not blindly rely on AI platforms’ citations.” Such risks make learning to properly incorporate these types of tools in the practice of law paramount.

Kristen Chittenden (pictured above), vice president and deputy general counsel at plagiarism-detection software company Turnitin, is among those working to educate students on the pros and cons. Chittenden developed a course on AI in law at Stetson University in DeLand.

“For the majority of cases, AI is good, it’s efficient, it’s a tool we want to be using,” Chittenden said. “As lawyers we’re risk averse, so a lot of times there is slow adoption, but I’m pushing my students to use it because I use it in my practice and because they’re going to have to use it moving forward, it’s not going away.”

Strengths

On the surface, experts note that commonly used general AI models, including OpenAI’s ChatGPT, Microsoft Copilot and Google’s Gemini, are beneficial for tasks such as creating summaries, outlines and drafts of non-legal documents.

Tools such as Lexis+AI and Vincent AI by vLex are law-specific AI models, like CoCounsel, that can “enhance productivity while maintaining high ethical standards and protecting client confidentiality,” according to The Florida Bar’s Special Committee on AI Tools & Resources.

“We’ve moved from book training to online research where we can look up cases using Boolean search terms,” Chittenden said. “From a technology perspective, going from rule-based programming to natural language and now shifting to prompting [AI tools] has been an interesting progression in a really short period of time for how lawyers are actually doing law.”

Martinez sees the use of AI as leveling the playing field, “because right now as the system works the prosecution has [lawyers] and their employees and the vast numbers of police who are helping with the investigation, so they have unseen resources in terms of what they’re reliant on.”

AI assistance goes beyond analyzing and summarizing lengthy documents, he said. It gives lawyers the equivalent of a paralegal capable of suggesting questions for witnesses and potential jurors, and it helps build a more robust case record for clients and their attorneys.

Additionally, the Miami-Dade Public Defender’s Office is saving time on compiling electronic evidence from police body-worn video and audio recordings with the help of Axon AI. The software allows recordings from multiple cameras to be produced as a simultaneous transcript for investigative purposes.

“Even though crime overall is down, the work that has to be done on cases has been exponentially higher because of all the digital evidence,” Martinez said. “This tool has the ability to sync all the videos, so an attorney is looking at the whole thing at once.”

The various benefits of leveraging AI in law are not lost on private practices throughout the state, many of which compete for the same business.

Steven Hicks-Safra, general counsel at Miami-based Cole, Scott & Kissane, told OBJ it is important to identify trusted platforms that ensure protection of client confidentiality, privacy and privilege. The litigation firm has more than 650 lawyers in 12 offices throughout the state, including one in Orlando.

“We try to be progressive in our approach to technology implementation or integration and stay ahead of the curve and find ways in which we can make the attorneys’ day-to-day practice easier and then more efficient,” he said.

Weaknesses

Costs associated with securing new AI tools and concerns over their reliability have caused some hesitancy when it comes to implementation.

Hicks-Safra recommends partnering with an AI platform that comes with a legal research component and ensures a firm’s information is only accessible by its employees. “You need to be vigilant on the front end and proactive in having proper risk prevention policies and procedures in place,” he said.

That includes confirming sensitive information uploaded via a document or prompt query is not used for machine learning and that an experienced professional verifies any output before it is presented in court.

Known as “hallucinations,” inaccurate output from AI tools has included citations and quotes for judicial cases that do not exist.

That issue emerged in February when a personal injury lawsuit garnered attention after the plaintiff’s lawyers, including two from Orlando-based Morgan & Morgan P.A. and another from a different firm, were fined for citing nine cases without verifying their accuracy, according to a court order filed in the U.S. District Court in Wyoming. Eight of the cited cases, the order said, turned out to be fictitious, generated by an in-house AI system.

The lawyers were quick to realize the issue and remedied the situation, including “[i]mplementing policies, safeguards, and training to prevent another occurrence in the future,” according to the court order, which also fined the three attorneys a total of $5,000. Representatives with Morgan & Morgan did not respond to requests for comment as of press time.

“Litigators are beginning to make the jump from those databases into the world of artificial intelligence. When done right, AI can be incredibly beneficial for attorneys and the public,” the court’s Feb. 24 sanction read.

The order added: “As attorneys transition to the world of AI, the duty to check their sources and make a reasonable inquiry into existing law remains unchanged.”

Hallucinations are among the issues that make Hunter Hagood-James of Orlando-based firm Hagood & Hagood reluctant to go all-in on AI aside from using it to review large quantities of documents, which he says has been beneficial.

But some of his other experiences have fallen short of expectations and have raised concerns about the risk of malpractice if the technology is not properly used.

“I’ve had it provide me with a synopsis that would have been wonderful to use in response to a motion, but after checking it multiple times, [the platform] finally admitted the reference cases weren’t real,” Hagood-James said. “AI sort of skips a step there. It might have been faster just to do research through Westlaw or Lexis.”

It’s not uncommon for clients to use ChatGPT or other services for legal advice, “because they think they’re going to save cost” before ultimately seeking professional services, he said.

“There’s a growing perception that if you’re not implementing AI into your practice, you’re going to be crushed by those who do,” Hagood-James said. “That’s why firms are feeling a lot of pressure to do it.”

While accuracy is of utmost importance, the price tag of legal AI tools, roughly $150 to $1,000 per user per month, presents another obstacle.

For Martinez, whose office handled 75,000 cases in the last fiscal year with a staff of about 350 employees, including 230 attorneys, the financial investment is one of the biggest limitations.

“The reality is there are a lot of very good tools and valuable tools, but there are challenges with cost, and in particular in the criminal justice arena, as to what is funded [and] what is not funded,” he said.

Opportunities

Martinez envisions a future “probably two or three years down the road” where an attorney, judge and other individuals are using AI while reviewing details of a case as it’s being tried in court.

“Imagine if the judge and everyone else is equipped with an AI that is actually hearing everything that is said, and then it’s providing feedback in real time of what’s happening in terms of things that appear to be in error,” he said.

It’s an idea that at one point seemed unrealistic, but now, with the continued development and advancement of AI tools, offers many possibilities.

“Some of that happens now because obviously the lawyers, both prosecution and defense, are trained to be able to spot issues and to bring up case law to the judge, but those are advocates on either side pushing for their particular angle,” Martinez said.

Martinez noted the contribution of mitigation specialists who assist attorneys with compiling client records of mental illness, substance abuse or any other relevant issues as especially useful in helping explain and contextualize a person to the prosecutor or judge.

Those examples of AI and humans working in unison are what Chittenden stresses to her students at Stetson when discussing concerns about job displacement as the technology continues to evolve.

“I’ve been telling my students that AI is not going to be able to fully replace humans,” Chittenden said. “It doesn’t have a consciousness and full reasoning capabilities and the emotions or real-world experience or nuance of humans.”

But what is necessary, she said, is understanding the intricacies of prompting AI “because you’ve got to know how to work this machine the right way,” in order to get maximum results.

Chittenden cited AI’s use in the civil legal aid sector, which provides representation for people who cannot otherwise afford it, as most impactful when it comes to utilizing time and resources.

Hicks-Safra said continued education at law firms and the hiring of at least one individual who specializes in AI use are important first steps in unleashing its full potential in the years ahead.

“There is less risk and less vulnerability when you have not just jumped in. View this as a marathon and not a sprint,” he said. “In terms of specific tasks on a day-to-day basis, it’s important to know what the trusted bounds are of the technology today and that enhancement of AI is going to continue.”

Threats

No matter how the use of AI in law evolves from here, expect to see a fragmented user base as options and intentions differ.

“Trust in AI by our legal professionals has rightly been slower with usage than I anticipated, which is a good thing for the future,” Hicks-Safra said. “It’ll allow for natural progressions with hopefully decreasing costs to allow for more widespread usage within large firms and availability to smaller firms.”

According to The Florida Bar, Florida was the first state to issue an ethics opinion regarding the use of AI in law when it did so in January 2024.

The non-binding opinion read in part, “lawyers using generative AI must take reasonable precautions to protect the confidentiality of client information, develop policies for the reasonable oversight of generative AI use, ensure fees and costs are reasonable, and comply with applicable ethics and advertising regulations.”

Other issues soon emerged as reliance on AI grew in the legal community.

The Bar announced in August that more guardrails are being sought to address potential risks as “appellate judges were growing concerned after seeing Florida lawyers file AI-generated court documents that contain citations to non-existent judicial opinions.”

That came after the release of OpenAI’s GPT-5, which was found to produce mistakes and then defend them by generating fake evidence to support its findings.

In Miami-Dade County, Martinez saw AI models that use predictive analytics show bias against defendants from underserved communities. Such models forecast a person’s future behavior based on prior incidents tied to their neighborhood.

“Anytime AI is used for predictive analytics, that is where you exacerbate the problems because we now know how many injustices have been perpetrated here over the decades,” Martinez said. “That is a legitimate concern.”

Complicating matters more are instances of defendants not being granted a public defender, being unable to afford other representation and entering courtrooms without counsel to face a prosecuting team equipped with potentially misleading AI output.

“That happens in Florida frequently on misdemeanor cases where the individual is indigent and they don’t know how to defend themselves,” Martinez said.

“As humans we are limited with how much information we process,” he said. “So the challenge here is going to be that AI and the things AI [facilitates] don’t get ahead of human capacity.”

Photo of Kristen Chittenden of Turnitin and Stetson University courtesy of Stetson University