‘Deepfake’ warning over online courts


Facial recognition: Issue for online courts

Video manipulation software, including ‘deepfake’ technology, poses problems for remote courts in verifying evidence and in confirming that litigants or witnesses are who they say they are, a report has warned.

Not only could successful deepfakes find their way into evidence, “potentially condemning the innocent or exonerating the guilty”, it said, but the mere existence of deepfakes allowed litigants and their lawyers “to cast doubt on video or audio that is legitimate”.

The report on ‘virtual justice’ by New York-based privacy group Surveillance Technology Oversight Project (STOP) noted that parties to online court proceedings may be asked to verify their identity by providing sensitive personal information, biometric data, or facial scans – in the state of Oregon, judges sign into their virtual court systems using facial recognition.

It said: “Distrust around digital records has persisted with the advent and ease of photoshopping. Altered evidence can still be introduced if the authenticating party is itself fooled or is lying.

“In the coming years, courts must also be mindful of emerging AI technology around deepfakes, which allows a user to manipulate images and audio in real time. While this technology is nascent today, it is rapidly advancing and may soon pose a potent threat to trust in online communication.”

STOP said programs such as Avatarify superimpose another person’s face onto a user in real time and are already being used on conferencing platforms.

“While faceswap technologies like Avatarify use an algorithm trained on another’s image, usually requiring several photos of the person’s face that you’re trying to animate, technology like First Order Motion approaches deepfakes inversely, manipulating a user’s photo by way of video of another person without any prior training on the target image.

“AI software companies like SenseTime can create deepfakes from audio sources by using a third party’s audio clip and video of the user to generate footage of the user saying the words from the recording.

“This can not only allow a person to fabricate their identity but can allow a litigant or witness to use their own voice to make the claim that they said something different than what the opposing party claims.”

The report said courts could learn from China, where the Beijing Internet Court requires litigants to set up an online account using their national identity cards and a facial recognition system before bringing a case remotely.

More broadly, the STOP report warned that online courts “may transform the digital divide into a justice divide, as the lack of computer access and broadband internet robs low-income litigants of their day in court”.

It also highlighted privacy and due process concerns with online court software and the growing role of private vendors, including the lack of clear rules on how confidential data is collected, stored, and accessed, as well as the inability of lawyers and clients to communicate confidentially.

The report also noted that courts could not monitor for unauthorised recordings of proceedings.




    Readers Comments

  • Andy Clarke says:

    My concern is that there are legitimate therapy companies who offer services to people with skin/scar problems – ‘skin camouflage services’ and ‘restore therapy’.
    They could unwittingly get drawn into a situation where someone wishes to disguise their appearance.


