Using Generative AI

If you are considering using Generative Artificial Intelligence (AI) to prepare court or tribunal documents, it is important for you to understand how AI tools work and how their use may cause issues in your case.

Queensland courts and tribunals have prepared detailed information about Generative AI to assist non-lawyers (including self-represented litigants, McKenzie friends and lay advocates) who represent themselves or others:

The use of Generative Artificial Intelligence (AI): Guidelines for responsible use by non-lawyers (PDF, 303.1 KB)

The above document is briefly summarised below:

Understand Generative AI and its applications

Generative AI chatbots, such as ChatGPT, Microsoft Copilot and Google Gemini, have no real understanding of language and no concept of truth, so they cannot reliably answer questions that require specialised knowledge.

Chatbots may help you prepare for your case in some ways. For example, they can:

  • identify and explain laws and legal principles that might be relevant to your situation
  • organise the facts into a clearer structure
  • suggest suitable headings
  • format documents
  • suggest improvements to grammar, tone, vocabulary and writing style
  • plan a speech and produce an outline of potential speaking points.

Generative AI chatbots cannot:

  • understand the unique facts in your case
  • understand your cultural and emotional needs
  • understand the broader Australian social and legal context
  • predict the chance of success or the outcome of your case
  • be trusted to always provide legal or other information that is relevant, accurate, complete, up-to-date and unbiased.

Uphold confidentiality, suppression, and privacy

Do not enter any private, confidential, suppressed or legally privileged information into a Generative AI chatbot as it could become publicly known. This could result in you unintentionally breaching suppression orders, or accidentally disclosing your own or someone else’s private or confidential information.

Ensure accuracy

You are responsible for ensuring that all information you rely on or provide to the court or tribunal is accurate.

Generative AI chatbots can:

  • make up fake cases, citations and quotes
  • refer to legislation, articles or legal texts that do not exist
  • provide incorrect or misleading information about the law or how it might apply in your case
  • provide information based on overseas law and court procedure that does not apply in Australia
  • get facts wrong.

Fake material can seem like it has been taken from a real source even when it has not.

Do not rely on Generative AI as your sole or main source of legal information. It is not an alternative to seeking tailored legal advice and cannot replace a qualified lawyer.

Check the accuracy of any information you get from a Generative AI chatbot before using that information in your case – with a lawyer (if possible) or through publicly available legal resources.

You can apply for legal aid through Legal Aid Queensland.

Be aware of ethical issues

The use of Generative AI chatbots may raise copyright and plagiarism issues. For example:

  • a summarised portion of a textbook or other intellectual property could breach the author’s copyright
  • any summarised content needs to be carefully reviewed to ensure it carries the same meaning as the original
  • depending on the context, the source may need to be acknowledged and citations added.

Maintain security

Follow general cyber-safety best practices to maintain your own security and that of the courts and tribunals.