
How AI and Automation Are Changing Court Reporting – and How to Adapt

Updated: Apr 2


The AI hype cycle

If you’re worried about substitute service providers and existing companies breaking into your livelihood, read on. Stop asking whether AI will take your job. It can’t, given its inherent physical and technical limitations and the high cost of training LLMs. Instead, ask how soon someone could use AI to augment their capabilities and take your job. Understand how you differentiate yourself (or don’t) from services that are trying to whittle away your customer base. AI also has ethical and environmental considerations that may lead governments and companies to moderate its use and throttle its energy consumption through pricing.


On October 14th, I was invited to present at the annual conference of the Certified Court Reporters Association of New Jersey (CCRA-NJ). I was honored to be included in a series of presentations that revolved around the impact of AI on reporters’ jobs, plus a timely and apt session on leveraging social media to raise the profile of reporting agencies and freelancers.


The overriding theme of the day really was AI, AI, and AI. The sessions, in order, were: Better Than AI; Steno, Social Media, and Self-Care; my session on how AI is already changing court reporting; and finally, Addressing Ethical Issues Related to AI. The last one opened my eyes to some of the urgent broader issues that AI raises.


My message to the CCRA-NJ members was, to steal a phrase, “DON’T PANIC.” And I would say the same is largely true about the ethical issues.


  1. AI is not a substitute for human thought or a source of official records, but a valuable augmentation when used correctly. We know that generative AI, all the rage since November 2022, is neither a panacea nor a substitute for the original thought, research, and analysis that attorneys must perform.

  2. When it comes to using speech recognition software to capture argument and testimony for court proceedings, that ship has sailed. Many states are content to substitute STT (not necessarily AI) for low-stakes hearings and certainly for non-criminal matters. That doesn’t mean that courts and private reporting clients don’t want live reporters – it just means that there aren’t enough of them to go around. There are straightforward solutions to remedy the court reporter shortage. All states could make accredited court reporting schools free for certificate students. California has already moved to recognize voice writing as an official recording method for proceedings. Crazy thought: reporting schools or certifying bodies could introduce new levels of certification that remove input and proofing/editing speed barriers – allowing a new input modality – and focus exclusively on accuracy. Slower transcript turnaround, but also lower cost and high accountability.

  3. Government regulation and laws are still a barrier to entry. For high-stakes matters, not only do some states mandate certified court reporters, but there are also technical and physical limitations that demand the perception of nuance, the judgment, and the unsurpassed adaptability of a trained human being. I listed some of the built-in limitations of all STT products: speaker attribution, internet connectivity, the idiosyncrasies of formatting, running proceedings, stopping parties when appropriate (such as when they talk over each other), sound quality and noise, and software issues.

  4. AI is entrenched in many legaltech solutions, albeit in far less volatile form than GPT. Freelancers, agencies, and other service providers have already been using AI in various contexts: as a backup transcription aid, as a proofreading/scoping assistant, for video synchronization, and more. It accelerates transcript turnaround. It’s not perfect, but it’s good enough for many circumstances. (A minimal sketch of this draft-plus-review workflow appears just after this list.)

  5. The media coverage of AI is 75% hype. High-profile stories of people being sanctioned or otherwise penalized for cheating via generative AI echo stories from 30 years ago, when “the World Wide Web” was the insidious accomplice of all sorts of crimes. Generative AI apps like ChatGPT are alluring tools that magnify our impulses and character, for good or for bad. ChatGPT and assorted flavors thereof really have bupkes to do with court reporting. The 25% you should be concerned about is the set of valuable augmentations that legal professionals and service providers are capitalizing on, along with the ethical concerns I’ll relate below. Whether you’re a 30-year veteran stenographer or a 25-year-old student at a community college, you should be actively learning about the full breadth of tech tools available to you, not just AI, and about how legaltech solutions in the litigation space contribute to or consume your work product, the official record.
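
Point 4 in practice: here’s a minimal, hedged sketch of the draft-plus-review idea, using the open-source Whisper speech-to-text model to produce a backup transcript and flag low-confidence passages for a human reporter. The audio file name and the confidence threshold are illustrative assumptions, and this is a sketch of the general workflow, not a description of any particular vendor’s product.

```python
# Minimal sketch: generate a DRAFT transcript with open-source Whisper and flag
# low-confidence segments for human review. The audio path and the -0.5
# log-probability cutoff are illustrative assumptions, not a standard.
import whisper

model = whisper.load_model("base")                 # small, CPU-friendly model
result = model.transcribe("deposition_audio.wav")  # hypothetical recording

REVIEW_THRESHOLD = -0.5  # assumed cutoff; tune against your own proofed transcripts

for seg in result["segments"]:
    needs_review = seg["avg_logprob"] < REVIEW_THRESHOLD
    marker = "  <-- REVIEW: the model was guessing here" if needs_review else ""
    print(f"[{seg['start']:7.1f}s - {seg['end']:7.1f}s] {seg['text'].strip()}{marker}")
```

The human editing and certification step is the whole point – the draft only speeds things up; it never signs the record.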

As part of the presentation I described a handy tool, Gartner’s Hype Cycle, one version of which, of course, covers AI. Click through to see it. As of this writing, Generative AI sits squarely at the “peak of inflated expectations,” while more mature, field-tested AI services are on the “slope of enlightenment.”


Other presenters delivered valuable content, but the final session, presented by Irina Raicu of the Internet Ethics Program at the Markkula Center for Applied Ethics at Santa Clara University, was particularly apt. I’ll paraphrase two of her major points, hopefully faithfully:

  1. Generative AI raises ethical concerns. Studies and reports abound of the built-in biases that LLMs contain because they are built on existing data – data that includes latent racial, cultural, religious, and gender biases baked into everyday discourse. Generative AI has a pernicious, automatic tendency to perpetuate those biases. A tangential example: AI-driven transcription often makes guesses when it isn’t entirely sure about a word or phrase. Those guesses come from whatever large corpus of existing speech the model’s creators point it at, supplemented by the creators’ training. (A small illustration of how such guesses arise follows this list.) What’s the risk of latent bias being introduced into official records?

    1. Suppose unqualified, uncertified service providers use AI-driven transcription tools and simply accept that output as the truth.

    2. Suppose a certified reporter makes the fatal mistake of relying on a backup transcription instead of authoritatively editing and proofing a transcript. The AI flubs its transcription guesses frequently enough that it impacts outcomes.

    3. Game over.

  2. While data centers already consume enormous resources, driving greenhouse gas emissions greater than those of the entire airline industry, emissions (and water pollution) are increasing due to the use of AI models.

    1. However, it’s my personal opinion that as more AI users come online, government and private entities will see that they have a critical stake in putting throttles on AI and data centers in general. Companies don’t want their electricity and cooling costs to skyrocket, so they’ll pass operational and regulatory compliance costs on to users, and governments are beginning to flex their capacity to hold data center/cloud providers accountable for environmental standards.
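
To make the “guessing” concern in point 1 concrete (this example is mine, not Ms. Raicu’s): a general-purpose masked language model fills an ambiguous slot with whatever is most frequent in its training corpus, which is precisely where latent bias can ride into a transcript. The prompt below is invented, and the choice of model is an assumption for illustration only.

```python
# Illustrative only: a masked language model "guesses" an inaudible word from
# training-corpus statistics, not from what was actually said. Requires the
# Hugging Face `transformers` library and downloads a public model on first run.
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")

# Hypothetical fragment, as if one word were unintelligible on the recording.
for guess in fill("The witness said the [MASK] was driving the car.", top_k=5):
    print(f"{guess['token_str']:>12}  (score: {guess['score']:.3f})")
```

The point is not that any court-reporting product works exactly this way, but that any model trained on a large corpus inherits that corpus’s statistics – and its biases – whenever it has to guess.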

Takeaways

AI in all its manifestations is not a substitute for human beings who offer scarce skills and talents. AI-supported products (like Cloud Court’s Gibson!) are an “exosuit,” like the one worn by Tony Stark as Iron Man, that augments our abilities. But AI is here to stay, so we need to adapt: learn, evaluate, and use appropriate tools; raise awareness of its pros and cons; and understand and differentiate human-only capabilities from tasks that automation can (and should or should not) handle.


Notes

Want a solid, yet accessible primer on AI that’s also entertaining? Subscribe to The ABCs of AI by Maria Pere-Perez at Databricks.

Here’s a link to a good list of court reporting tools from the National Network Reporting Company.

Almost prescient Forbes article from 2020 about the environmental impact of AI. Relevant now more than ever.

Litigation pros: I recommend Alex Su’s blog Off the Record for, among many other things, his thoughts on future-proofing legal careers and how AI will cross the legal chasm.
