AI & Access to Justice: Opportunities, Risks, and Next Steps workshop

As more lawyers, court staff, and justice system professionals learn about the new wave of generative AI, there’s increasing discussion about how AI models & applications might help close the justice gap for people struggling with legal problems.

Could AI tools like ChatGPT, Bing Chat, and Google Bard help get more people crucial information about their rights & the law?

Could AI tools help people efficiently and affordably defend themselves against eviction or debt collection lawsuits? Could they help people fill in paperwork, create strong pleadings, prepare for court hearings, or negotiate good resolutions?

The Stakeholder Session

In Spring 2023, the Stanford Legal Design Lab collaborated with the Self Represented Litigation Network to organize a stakeholder session on artificial intelligence (AI) and legal help within the justice system. We conducted a one-hour online session with justice system professionals from various backgrounds, including court staff, legal aid lawyers, civic technologists, government employees, and academics.

The purpose of the session was to gather insights into how AI is already being used in the civil justice system, identify opportunities for improvement, and highlight potential risks and harms that need to be addressed. We documented the discussion with a digital whiteboard.

An overview of what we covered in our stakeholder session with justice professionals.

The stakeholders discussed three main areas where AI could enhance access to justice and provide more help to individuals with legal problems.

  1. How AI could help professionals like legal aid or court staff improve their service offerings
  2. How AI could help community members & providers do legal problem-solving tasks
  3. How AI could help executives, funders, advocates, and community leaders better manage their organizations, train others, and develop strategies for impact

Opportunity 1: for Legal Aid & Court Service Providers to deliver better services more efficiently

The first opportunity area focused on how AI could assist legal aid providers in improving their services. The participants identified four ways in which AI could be beneficial:

  1. Helping experts create user-friendly guides to legal processes & rights
  2. Improving the speed & efficacy of tech tool development
  3. Strengthening providers’ ability to connect with clients & build a strong relationship
  4. Streamlining intake and referrals, and improving the creation of legal documents

Within each of these zones, participants had many specific ideas.

Opportunities for legal aid & court staff to use AI to deliver better services

Opportunity 2: For People & Providers to Do Legal Tasks

The second opportunity area focused on empowering people and providers to better perform legal tasks. The stakeholders identified five main ways AI could help:

  1. understanding legal rules and policies,
  2. identifying legal issues and directing a person to their menu of legal options,
  3. predicting likely outcomes and facilitating mutual resolutions,
  4. preparing legal documents and evidence, and
  5. aiding in the preparation for in-person presentations and negotiations.

How might AI help people understand their legal problem & navigate it to resolution?

Each of these five opportunity areas is full of detailed examples. Professionals had extensive ideas about how AI could help lawyers, paraprofessionals, and community members do legal tasks in better ways.

Opportunity 3: For Org Leadership, Policymaking & Strategies

The third area focused on how AI could assist providers and policymakers in managing their organizations and strategies. The stakeholders discussed three ways AI could be useful in this zone:

  1. training and supporting service providers more efficiently,
  2. optimizing business processes and resource allocation, and
  3. helping leaders identify policy issues and create impactful strategies.

AI opportunities to help justice system leaders

The stakeholders shared ideas for better training, onboarding, volunteer capacity, management, and strategizing.

Possible Risks & Harms of AI in Civil Justice

While discussing these opportunity areas, the stakeholders also addressed the risks and harms associated with the increased use of AI in the civil justice system. Concerns raised include:

  • over-reliance on AI without assessing its quality and reliability,
  • the provision of inaccurate or biased information,
  • the potential for fraudulent practices,
  • the influence of commercial actors over the public interest,
  • the lack of empathy or human support in AI systems,
  • the risk of reinforcing existing biases, and
  • unequal access to AI tools.

The whiteboard from the professionals’ first round of brainstorming about possible risks to mitigate in a future of AI in the civil justice system

This list of risks is not comprehensive, but it offers a first typology that future research & discussions (especially with other stakeholders, like community members and leaders) can build upon.

Infrastructure & initiatives to prioritize now

Our session closed with a discussion of practical next steps. What can our community of legal professionals, court staff, academics, and tech developers be doing now to build a better future in which AI helps close the justice gap — and where the risks above are mitigated as much as possible?

The stakeholders proposed several infrastructure and strategy efforts that could lead to this better future. These include:

  • ethical data sharing and model building protocols,
  • the development of AI models specifically for civil justice, using trustworthy data from legal aid groups and courts to train the model on legal procedure, rights, and services,
  • the establishment of benchmarks to measure the performance of AI in legal use cases,
  • the adoption of ethical and professional rules for AI use,
  • recommendations for user-friendly AI interfaces that can ensure people understand what the AI is telling them & how to think critically about the information it provides, and
  • the creation of guides for litigants and policymakers on using AI for legal help.

Thanks to all the professionals who participated in the Spring 2023 session. We look forward to a near future where AI can help increase access to justice & effective court and legal aid services, while being held accountable and having its risks mitigated as much as possible.

We welcome further thoughts on the opportunity, risk, and infrastructure maps presented above — and suggestions for future events to continue building towards a better future of AI and legal help.
