Earlier this year, the Supreme Court of British Columbia released a costs decision in the case of Zhang v. Chen, flowing from a family law dispute relating to parenting time.

But that’s not why I am writing about it this weekend.

In seeking costs on the outcome of the application, Nina Zhang also sought special costs against her former spouse’s lawyer for including two non-existent cases that were later discovered to have been invented by ChatGPT.

After receiving Chen’s Notice of Application in December 2023, Zhang’s counsel advised that they could not locate two cases cited in it. Chen’s counsel apologized, indicating they would look into it. Zhang’s counsel continued to demand copies of the two cases referenced in the notice of application. They were never produced.

To confirm the cases didn’t exist at all, Zhang’s counsel even hired a legal researcher to seek out the two cases. The researcher, too, determined they didn’t exist.

On the date of the hearing of the application, Chen’s counsel provided an email to the court admitting to using ChatGPT “without verifying the source of information.”

She also said: “I had no idea these two cases could be erroneous.”

While I have previously written about a New York lawyer who was outed for relying on ChatGPT research in a legal brief, this is the first Canadian example I have found of ChatGPT making a dangerous example of itself.

In responding to the request for special costs against her, the lawyer at issue swore an affidavit, where she deposed, “I am now aware of the dangers of relying on AI generated materials.”

In deciding if costs should be awarded against her, the court found that citing fake cases in court filings and other materials handed up to the court is “an abuse of process and is tantamount to making a false statement to the court.”

The court noted, though, that Zhang had a well-resourced legal team, that the cases were withdrawn before the hearing, and that “there was no chance here that the two fake cases would have slipped through.”

While the court didn’t order special costs against Chen’s lawyer, it did find that additional expense and effort had been incurred, and ordered that she personally bear those costs.

This is a very public example of what is likely happening in every industry, even among trained and educated professionals.

And while one can sympathize with the lawyer here in some respects, I can’t help but reflect on what the impact would have been had the two fictitious cases slipped through.

Our Canadian legal system would be seriously imperiled if fake ChatGPT cases were quoted in courts and accepted by less vigilant counsel and judges. On second thought, it likely already has happened.

So what’s the call to action?

The call is to people like you, fair reader, to take the reins of your own AI use. Use it judiciously, not always. Treat AI as you would a brilliant young child – one that can see the world much differently than most, but still needs help crossing the street.

Chen’s lawyer was remorseful and obviously naive to the pitfalls of using generative AI.

Have a workplace question? Maybe I can help! Email me at sunira@worklylaw.com and your question may be featured in a future column.

The content of this article is general information only and is not legal advice.


CHAUDHRI: ChatGPT works its way into Canadian courtrooms - Sunira Chaudhri

06.04.2024

© Winnipeg Sun