
AI News

18 Nov 2025

13 min read

How AI curriculum for law students builds practical skills

UChicago Law's AI curriculum trains law students in practical AI tools, boosting their readiness for modern practice.

The AI curriculum for law students at UChicago Law blends rigorous legal training with hands-on tech practice. Students first build core research and writing skills, then learn how to use and supervise AI tools. Clinics, electives, and an AI Lab add real projects and ethical guardrails, preparing students to meet law firm expectations.

Artificial intelligence is changing how lawyers research, draft, and serve clients. The University of Chicago Law School is responding with a structured, practical program that helps students master judgment while learning modern tools. Faculty guide students through when to use AI, when to avoid it, and how to check its output the way they would check a junior colleague's work. By weaving an AI curriculum for law students into core courses, clinics, and a new lab, the school builds real-world skills without shortcutting the learning process.

Why future lawyers need AI literacy

Law firms expect competence with modern tools

Large law firms are adopting AI for research, document review, and drafting support. New lawyers who can use these tools safely and effectively have an edge. They also need to understand model limits, confidentiality risks, and how to check AI output against binding law.

Tools assist, judgment leads

UChicago Law leaders stress a simple idea: AI helps, but it cannot replace human judgment. Students learn to treat AI as a support system, not as an expert. They must verify facts, confirm citations, and own the final work. That professional mindset is a key outcome of the program.

Inside the AI curriculum for law students

Orientation: AI and the legal profession

First-year students begin with a clear policy talk. They hear how generative AI can aid learning, where it can mislead, and why lawyering still depends on careful analysis and ethics. The message is practical: know the tools, respect their limits, and take responsibility for results.

Required AI modules in the first quarter

The Law School is rolling out self-paced modules that set a baseline for AI literacy. Students who already know the basics can move faster. The modules focus on:
  • What generative AI does well and poorly
  • Risks: hallucinations, bias, and confidentiality
  • Legal workstreams where AI can help
  • Safer tools for professional use

These units act as building blocks for later classes, clinics, and projects.

The Bigelow Program: a phased approach to skill-building

During the fall, 1Ls do not use AI in legal writing and research. They learn the fundamentals the hard way first. In winter, they may use AI under guidance to support research and drafting. This staged plan teaches the right order: master the basics, then add tools that enhance speed and quality without replacing thinking.

Teaching methods that protect learning

Clear rules and disclosure

In clinics and upper-level courses, many professors allow AI use, but only with transparency. Students must disclose when and how they used AI. That way, teachers and students can review the output, debate its quality, and improve prompts and verification steps. This culture of openness mirrors real practice, where lawyers must be able to explain their process and defend their choices.

When to restrict AI to build foundations

Some courses ban AI to keep focus on core craft. For example, in contract drafting, students write clauses and agreements themselves. They build a feel for language, risk, and structure. Only once students can judge quality can they safely bring AI into the workflow. This protects clients and fosters strong judgment.

When to permit AI for practice-ready tasks

Other courses fully permit AI within clear guardrails. In a litigation clinic, students can draft discovery requests, research, or outline arguments with AI assistance. They must still verify legal cites, check facts, and ensure the work meets ethical duties. The disclosure rule turns each assignment into a learning lab for supervision skills.

Building, not just using, legal tech: the AI Lab

A hands-on course with real deliverables

The AI Lab is a standout feature. Students do not just click prompts—they build a legal tool. This year, the project centers on renters’ rights across the United States. The class creates a vetted database of state rules and common questions, then connects it to an AI interface.

Human-centered research and scope

Students interview potential users to learn what people actually need. They study where renters get stuck, what terms confuse them, and how to phrase guidance. This step teaches scoping, data quality, and product sense—skills that matter to firms and legal tech teams alike.

Impact and access to justice

The final output is a public-facing AI assistant, more reliable than a general chatbot because it uses a curated legal database. It will help people who cannot easily reach a lawyer. The project shows students how careful design, verified content, and narrow scope can produce safe, helpful tools.
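The article does not describe how the Lab's assistant is implemented, but the design principle it highlights (answers grounded only in a vetted database, with a narrow scope) can be sketched in a few lines. Everything below is illustrative: the data, function names, and matching logic are my own stand-ins, not the Lab's actual tool.

```python
# Illustrative sketch of a curated-database assistant. The key idea from
# the article: answers come only from vetted content, and the tool refuses
# rather than guessing when the database has no entry.

# A tiny hypothetical stand-in for the Lab's vetted database of state rules.
VETTED_RULES = {
    ("illinois", "security deposit"): "Hypothetical entry: Illinois landlords must ...",
    ("illinois", "notice to vacate"): "Hypothetical entry: required notice periods ...",
}

def answer(state: str, question: str) -> str:
    """Return guidance only if the curated database covers the topic."""
    state = state.lower()
    matches = [
        text
        for (st, topic), text in VETTED_RULES.items()
        if st == state and topic in question.lower()
    ]
    if not matches:
        # Refusing is safer than letting a general model improvise on the law.
        return "No vetted entry found; please consult a lawyer or legal aid."
    return " ".join(matches)

print(answer("Illinois", "What are the rules on my security deposit?"))
```

A production system would pair a retrieval step like this with a language model that rephrases the vetted text, but the guardrail is the same: the model may only restate what the curated database contains.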

What skills do students gain?

Core legal abilities

  • Research depth: finding, reading, and comparing cases and statutes
  • Writing clarity: framing issues, organizing arguments, and citing correctly
  • Judgment: choosing sources, weighing risk, and deciding what to trust

AI-era competencies

  • Prompt design: asking clear, testable questions
  • Verification: checking quotes, citations, and factual claims
  • Model awareness: spotting hallucinations and vague answers
  • Confidentiality: avoiding risky data sharing and preserving privilege
  • Workflow design: knowing where AI helps and where it hurts

Product and teamwork skills

  • Scoping: defining a useful, narrow problem
  • Data curation: building clean, accurate legal datasets
  • User feedback: interviewing, testing, and iterating
  • Collaboration: splitting roles and managing deadlines

Practical tips for students

Use AI as a support, not a substitute

  • Draft an outline yourself, then use AI to surface counterarguments
  • Ask for sample structures, but write the text in your own words
  • Use AI to brainstorm issues to research, not to decide the answer
  • Always confirm cites in primary sources before submitting work

Practice ethical habits daily

  • Never paste confidential facts into public tools
  • Keep a log of your prompts and what you used from the output
  • Disclose AI use when rules require it
  • Assume you are responsible for every word you submit
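The log-keeping habit above can be as lightweight as an append-only file. A minimal sketch, assuming a JSON-lines file; the file name and field names are my own choices, not something the article prescribes:

```python
import json
from datetime import datetime, timezone

def log_ai_use(path: str, prompt: str, used: str) -> None:
    """Append one JSON line recording a prompt and what was kept from the output."""
    entry = {
        "time": datetime.now(timezone.utc).isoformat(),
        "prompt": prompt,             # what you asked the tool
        "used_from_output": used,     # what, if anything, you carried into your work
    }
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")

# Example entry after brainstorming with an AI tool (hypothetical).
log_ai_use("ai_use_log.jsonl", "List counterarguments to claim X", "two issue headings only")
```

An append-only log like this makes later disclosure easy: each line shows when a tool was used and exactly what survived into the final work.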

Practical tips for law schools

Set guardrails that encourage learning

  • Ban AI early in core writing, then introduce guided use later
  • Require disclosure so instructors can coach better
  • Use short, self-paced modules to level up basic literacy

Expand real-world projects

  • Create labs where students build narrow, verified tools
  • Invite legal tech founders and firm partners to teach
  • Offer clinics that test AI in live-client settings, with supervision

Risks, guardrails, and ethics

Supervise AI like a junior colleague

AI can write clean sentences and still be wrong on the law. Students learn to treat AI as a helper that needs supervision. They check quotes, verify holdings, and ensure the reasoning matches jurisdiction and facts. This habit reduces risk and raises quality.

Protect confidentiality and privilege

Public AI tools can log prompts and data. Students learn to avoid entering sensitive information, use safer platforms where available, and follow school and client policies. Protecting privilege is non-negotiable.

Own your work

In every course, the author is accountable. If an AI suggestion is wrong, the student still owns the outcome. This clear line mirrors professional rules and helps build trust with future clients and partners.

How success can be measured

Student outcomes and employer feedback

Schools can track:
  • Writing and research quality in 1L and upper-level courses
  • Externship and clinic performance with and without AI tools
  • Employer surveys on readiness and professionalism
  • Bar passage, clerkship placement, and early career milestones

Product impact and community value

  • Public usage of student-built tools
  • User satisfaction and error rates
  • Time saved on routine tasks without loss of quality
  • Access-to-justice gains for people without lawyers

Why this approach stands out

The program balances discipline and exploration. Students first learn to think like lawyers. Then they learn to use AI to work faster and smarter. They build tools that solve real problems and study the ethics that make those tools safe. This mix helps graduates meet firm expectations and lead change.

Final take

Legal education works best when it builds strong minds and practical habits. UChicago Law shows how an AI program can do both: defend core writing and research while adding careful, supervised tool use. An AI curriculum for law students should train judgment first, then layer in skills that make work precise, efficient, and client-ready.

(Source: https://news.uchicago.edu/story/law-school-updates-curriculum-prepare-uchicago-students-ai-era)


FAQ

Q: What is the goal of UChicago Law’s AI curriculum for law students?
A: The AI curriculum for law students at UChicago Law aims to equip students with AI competency while preserving mastery of legal research, analysis, and judgment. It balances encouraging exploration of new tools with avoiding pedagogical shortcuts.

Q: How does UChicago Law introduce AI to incoming students?
A: UChicago Law introduces AI in fall orientation through a session called “AI and the Legal Profession” that explains school policies and the technology’s limits and risks. The AI curriculum for law students also includes self-paced modules launching in early 2026 to establish baseline generative AI literacy and guide students to safer professional tools.

Q: When are 1L students allowed to use AI in the Bigelow Program?
A: The AI curriculum for law students phases AI use in the Bigelow Program, with no AI allowed in the Autumn Quarter so students learn fundamentals, and guided AI use permitted in Winter. This staged plan teaches foundational skills first and introduces tools later to enhance, not replace, students’ judgment.

Q: How do professors supervise or restrict AI use in coursework?
A: The AI curriculum for law students emphasizes flexible faculty approaches that combine transparency and supervision. Students who use AI typically must disclose that use so instructors can review outputs, and some foundational courses ban AI while clinics permit it under clear guardrails.

Q: What is the AI Lab and what do students build there?
A: The AI curriculum for law students includes the AI Lab, a hands-on course where students build legal tools, focused this year on renters’ rights. Students create a vetted database of state rules, connect it to an AI chatbot intended for public release, and learn scoping, interviewing, and product design.

Q: How does the curriculum teach ethics and confidentiality around AI?
A: The curriculum teaches ethical habits such as never pasting confidential facts into public tools, using safer platforms when available, and keeping a log of prompts. Students must disclose AI use where required and are held accountable for every word they submit, reflecting the duty to supervise AI like a junior colleague.

Q: What specific skills do students gain from the program?
A: Through the AI curriculum for law students, students gain core legal abilities (research depth, writing clarity, and judgment) alongside AI-era competencies such as prompt design, verification, spotting hallucinations, and protecting confidentiality. They also learn product and teamwork skills like scoping, data curation, user testing, and collaboration in labs and clinics.

Q: How will the Law School measure the success of its AI program?
A: The AI curriculum for law students will be measured by student outcomes and employer feedback, including writing and research quality, externship and clinic performance, employer surveys on readiness, and career milestones such as bar passage and placements. The school will also track product impact for student-built tools, such as public usage, user satisfaction, error rates, time saved, and access-to-justice gains.
