As artificial intelligence (AI) continues to revolutionise various industries, the legal sector is taking a cautious stance. In New South Wales, the state’s Chief Justice, Andrew Bell, has raised concerns about the growing influence of AI and its potential impact on the legal system, particularly as restrictions on its use are set to take effect from February 3, 2025.
The Power and Risks of AI
Chief Justice Bell’s caution stems from the power held by the big tech companies behind generative AI tools, such as ChatGPT. He highlighted the potential for these companies to manipulate data and information, calling into question the accuracy and integrity of AI-generated content. “The power of those who control big tech and the ability to influence what underlying data is and is not included in the system causes me to think that we should be cautious,” he said.
One of the core issues is the increasing sophistication of AI tools, which can produce content that closely resembles authentic human-written documents. This capability poses a risk to the legal process, particularly when it comes to the generation of documents like affidavits, witness statements, and character references. In response, the Chief Justice has implemented a partial ban on the use of AI to create or alter such documents, a move he describes as “hard and conservative.”
The Risk of AI in Legal Evidence
The key concern is that AI tools can generate what appears to be legitimate evidence, such as witness statements or legal arguments, which may not be based on real events or information. Chief Justice Bell warned that this could undermine the credibility of court proceedings. “AI can be used to produce what looks like a person’s evidence. But if it’s not the person’s actual evidence, that undermines the whole process,” he explained.
The regulations also restrict the use of AI in preparing legal documents where there is a risk of distortion or fabrication. Lawyers and self-represented litigants are still permitted to use AI for research and drafting, but they must verify the accuracy of the citations and content it generates. That verification matters because generative AI can produce plausible but fictitious information, often referred to as “hallucinations”. Sydney barrister Victor Kline pointed out that AI tools sometimes produce cases that don’t exist, posing a danger to the integrity of legal submissions.
Generative AI’s Role in the Legal Profession
Although AI is seen as a valuable tool in some areas of legal work, its potential to mislead or confuse makes it problematic in others. The Supreme Court of Victoria, for example, has only advised “particular caution” when using AI, unlike NSW, which has taken a more conservative approach.
Chief Justice Bell’s concerns are not limited to the risk of fake evidence but also extend to the broader ethical and legal implications of using AI in decision-making. The judge stressed that the role of the judiciary is inherently human, stating, “The task of judging in our society is a human one. If that were to be abdicated to machines, a very important part of our democratic fabric would be lost.”
Ensuring Integrity and Privacy
The NSW guidelines, which come into effect in February, aim to balance the use of AI with the need to preserve the integrity of the legal process. One of the main revisions to the initial practice note involves allowing sensitive information, such as documents subject to suppression orders, to be fed into bespoke AI programs—provided that this data is not made publicly available or used to train broader AI models.
The guidelines also stress the importance of confidentiality and privacy, which Chief Justice Bell believes must be preserved whenever AI programs are used. However, some, like Victor Kline, have raised concerns about the security of confidential data, particularly when it is shared with large law firms.
Looking Ahead
As AI continues to evolve rapidly, the guidelines set by Chief Justice Bell will be regularly reviewed to adapt to new developments in technology. While AI presents both opportunities and challenges for the legal system, it’s clear that NSW is taking a measured approach to ensure that the technology does not undermine the fairness and integrity of legal proceedings.
In summary, while AI may enhance certain aspects of legal practice, its use in critical areas like evidence generation and judicial decision-making requires strict regulation to prevent abuse and ensure trust in the justice system. The NSW Chief Justice’s cautious stance reflects a broader concern about the power of AI and the need to preserve human judgment in the legal process.