
37 Legal research using Generative Artificial Intelligence (AI)

The application of AI technologies in the legal sector is increasing and evolving rapidly as AI capabilities improve. With AI functionality now incorporated into all the main legal research databases, this chapter will focus specifically on the responsible use of generative AI tools for legal research.

Please note this page is based on information known to the authors at the time of writing (October 2025).

On 28 August 2025, the Law Institute of Victoria (LIV) published guidelines for the ethical and responsible use of artificial intelligence by legal practitioners and their support staff.

On 6 December 2024, the Law Society of NSW, the Legal Practice Board of Western Australia, and the Victorian Legal Services Board and Commissioner issued a joint statement on the use of artificial intelligence in Australian legal practice, and in September 2025 the Supreme Court of Queensland issued a practice direction on Accuracy of Reference in Submissions. Ethical professional practice is the core guiding principle, with emphasis on legal practitioners' obligations to maintain client confidentiality and privacy, to provide accurate, trustworthy and independent advice, and to ensure fair and reasonable costs. Transparency is also important: lawyers should disclose their use of AI tools where professional or court rules require it.

UQ students should refer to the AI Student Hub for guidance on the appropriate use and attribution of AI for study while at the University. Additionally, the diagram below describes seven factors to consider for the ethical use of generative AI, including the concepts of transparency, reliability, bias, equity, sustainability, data sovereignty, and privacy.

Know your AI. Evaluate and understand its limitations. There is no substitute for human expertise, judgement, and reasoning. Ethical AI is the intersection of:

  • Privacy. Take appropriate measures to protect sensitive data: your own, and others'. If you do not pay for the service, you are the service.
  • Data sovereignty. Whose data is it? Are you allowed to use it? Is it appropriate for you to use it? This is particularly important in relation to Indigenous data.
  • Sustainability. LLMs and image-based models can use vast amounts of energy and water to generate computations. They may also have high carbon emissions and create e-waste. Do these environmental issues impact your decision to use the AI? Do you need to use AI?
  • Equity. Does everyone have the same access? Is it fair to use the AI?
  • Bias. AI models can perpetuate societal bias and discrimination. Evaluate for fairness, equity and access to justice.
  • Reliability and accuracy. Are you confident the data is true? Cross-reference or 'triangulate' your information with reliable sources.
  • Transparency and academic integrity. Where did the data come from? Verify and acknowledge your sources.
“Know Your AI” by Kate Thompson, shared under CC BY-NC 4.0.

Key Considerations

Legal researchers who use generative AI must apply caution and critical judgement to outputs: fact-check for accuracy and verify every information source, case and piece of legislation cited. Generative AI tools should be viewed as tools that augment routine tasks, not as a replacement for legal interpretation and analysis. They can provide a starting point for research, but every output requires careful verification. Note: this is not an endorsement of the accuracy or efficacy of generative AI for legal research.

Generative AI and Legal Research

Some ways in which generative AI can be used in legal research include:

  • summarising complex texts, concepts and legal cases into plain language to aid understanding of topics and facilitate further research,
  • providing broad answers to common legal questions and general information relating to legal topics,
  • conducting broad secondary legal research to guide primary research into case law and legislation,
  • helping to identify key research concepts and create basic search strings,
  • brainstorming and generating ideas for further research.

Australian court guidelines on using generative AI tools in litigation

Australian courts have recognised the attractiveness of using generative AI technologies, particularly for self-represented litigants. Just as it is important to be aware of court practice notes and procedural guidelines, it is essential to keep up to date with court guidelines relating to the use of generative AI technologies.

Guidelines have not yet been issued by all Australian courts. Currently, guidelines exist from the following courts:

The predominant directive is that litigants have a clear duty to the court to ensure all information provided to a court or tribunal is accurate, including any information generated by AI tools, and to fulfil their obligations under the legal profession uniform laws and rules. The guidelines outline key risks and issues associated with using AI and suggest ways to minimise them.

The Law Society of New South Wales has a Court protocol on AI webpage which contains links to other court and law society websites. It covers the common law jurisdictions of Australia, New Zealand, Canada, the United Kingdom and the United States.

Some examples of Law Society and Law Reform Commission guidelines include:

Generative AI tools for Australian legal research (current as of October 2025):

As a rule, AI tools developed for the legal profession are more accurate and reliable for legal research than general-purpose tools such as ChatGPT, Google Gemini, or Microsoft Copilot. This is because legal-specific tools have been further trained and configured by experienced professionals to prioritise authoritative legal resources and to structure outputs for the requirements of the legal context.

See Mitchell Adams’ GenAI for Legal Practice (2025) for in-depth information on specific tools.

The following AI tools have been developed for the Australian legal context. Please note: This is not an endorsement of their accuracy or efficacy.

Many of the major legal research databases now include AI features, which may or may not be available to UQ students and staff. While specific functionality differs across platforms and tools, some fundamental skills apply to using AI effectively.

How to use generative AI tools effectively

A generative AI tool produces content in response to a prompt. When entering your prompt, provide as much context, detail, and boundaries as possible. The prompt can specify:

  • the points you want addressed (be clear and specific),
  • the role or perspective from which the text should be written (e.g. a legal practitioner),
  • the focus, format, style, intended audience, and text length,
  • specific requirements, such as instructing the tool to state when information is not known rather than invent it, and to respond accurately rather than creatively.

Prompting is an iterative process. Responses generally improve when additional questions are asked or rephrased.
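For readers who interact with AI tools programmatically rather than through a chat interface, the same prompt structure can be assembled from reusable parts. The sketch below is purely illustrative: the function and field names are the author's assumptions, not part of any specific AI tool's API, and the legal topic used is a placeholder.

```python
# Minimal sketch: assembling a structured legal-research prompt from
# reusable components (role, task, jurisdiction, constraints), mirroring
# the elements listed above. Hypothetical helper, not a real tool's API.

def build_prompt(role: str, task: str, jurisdiction: str,
                 constraints: list[str]) -> str:
    """Combine the prompt components into a single prompt string."""
    lines = [
        f"Act as {role}.",
        f"{task} in {jurisdiction}.",
    ]
    lines += constraints  # accuracy and anti-hallucination instructions
    return " ".join(lines)

prompt = build_prompt(
    role="a legal practitioner who is advising a client",
    task="Conduct legal research on adverse possession",  # placeholder topic
    jurisdiction="Queensland",
    constraints=[
        "Please be concise and accurate.",
        "Do not hallucinate or respond creatively.",
    ],
)
print(prompt)
```

Keeping the components separate makes it easy to iterate: swap the jurisdiction or tighten the constraints and regenerate the prompt, rather than rewriting it from scratch each time.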

Basic prompt ideas

For legal research and analysis

Prompt: Act as a legal practitioner who is advising a client. Conduct legal research on [legal issue or topic]. Summarise the relevant case law, statutes, and delegated legislation in [jurisdiction]. Provide analysis and conclusions based on your research. Please be concise and accurate. Do not hallucinate or respond creatively.

To uncover precedents

Prompt: Act as a legal practitioner who is advising a client. Provide an overview of the legal precedents in [area of law]. List the leading cases that exist in this area of law, and the key legal arguments for each case. Provide analysis and conclusions based on your research. Please be concise and accurate. Do not hallucinate or respond creatively.

Define legislation and regulations for [legal issue]

Prompt: Act as a legal practitioner who is preparing a case. Which are the governing statutes and delegated legislation relating to [legal issue] in [jurisdiction]? How do these compare with the corresponding statutes and delegated legislation in other Australian States and Territories? List the leading cases that have cited this legislation, and the key legal arguments for each case. Please be concise and accurate. Do not hallucinate or respond creatively.

The University of Arizona has a useful resource with ideas on writing prompts for legal research using generative AI. Their guide discusses the RICE (Role, Instructions, Context, Expectations) elements that can be incorporated into a prompt.

Risks and benefits of using generative AI for legal research

Some of the key risks and benefits of using generative AI for legal research are outlined below:

Potential risks

  • Users need expertise to understand law and specific areas of the law in order to develop good prompts, ask effective questions, and evaluate the output for accuracy.
  • Propensity of gen AI tools to hallucinate and produce inaccurate, non-factual information, including citing case law and legislation inaccurately.
  • Output may be unrelated to the jurisdiction in question and not give enough depth or analysis for the specific legal topic.
  • Presence of bias or mis-/disinformation in outputs due to the generic, global nature of information the tools draw from and/or malicious actors.
  • Negligent use or upload of proprietary, confidential or copyrighted materials to AI tools without applying professional judgement, or the necessary gen AI tool settings to meet professional and client obligations.
  • Possible negative impact on human capability to ideate, analyse, interpret, and synthesise information.
  • Lack of user awareness of how AI and algorithms work, leading to over-reliance on gen AI output.
  • Gen AI tools can use substantial amounts of energy and water to complete computations, raising serious sustainability issues.

Potential benefits

  • Gen AI tools may suggest relevant case law that was missed during manual legal research.
  • Gen AI tools may flag discrepancies between jurisdictions that might initially be missed.
  • Specific search skills, such as applying Boolean techniques, are not needed in gen AI tools.
  • Iterative prompting facilitates deeper diving into legal topic areas.
  • Gen AI tools can summarise large amounts of information to distil the critical matters faster.

At The University of Queensland, in the majority of scholarly journals, and under the Australian Copyright Act 1968 (Cth), AI is not considered an “author” and has none of the rights or responsibilities of authorship. If you use AI tools or AI-generated outputs in your work, you are the author of that work and are ultimately responsible for its accuracy and authenticity. Failure to thoroughly check the output and acknowledge the use of AI will reflect poorly on you and may damage your professional reputation.

Visit the chapter on Academic Misconduct for more information.

Further resources

Australian Law Journal: The September 2024 issue of the Australian Law Journal was a special issue on ‘AI and the law’. See: (2024) 98 Australian Law Journal 631.

Adams, M. (2025). GenAI for Legal Practice. Swinburne University of Technology.
“This essential resource equips legal professionals, law students, and legal educators with the knowledge and skills necessary to work effectively, efficiently, ethically, and safely with AI tools while maintaining the highest professional standards.”

License

Icon for the Creative Commons Attribution-NonCommercial 4.0 International License

Legal Research Essentials Copyright © 2023 by The University of Queensland is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License, except where otherwise noted.