
Real-time face-swapping technology goes viral, fueling fears of identity fraud


A hot potato: As deepfake technology continues to evolve, so does the potential for misuse. Current tools still require users to mimic a target's mannerisms, voice, and other details, but advances in voice cloning and video synthesis could soon make real-time digital doppelgängers far more convincing.

In the past few days, a new software package called Deep-Live-Cam has been making waves on social media, drawing attention for the ease with which it creates real-time deepfakes. The software takes a single photo of a person and applies their face to a live webcam feed, matching the on-camera subject's pose, lighting, and expressions. While the results are not flawless, the technology's rapid advancement underscores how much easier it has become to deceive others with AI.

Ars Technica notes that the Deep-Live-Cam project has been in development since late last year, but it only recently attracted viral attention after example videos began circulating online. These clips show individuals imitating prominent figures like Elon Musk and George Clooney in real time. The sudden surge in popularity briefly propelled the open-source project to the top of GitHub's trending repositories list. The software is free to download from GitHub, making it accessible to anyone with a basic understanding of programming.
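To give a sense of just how low that barrier is, the rough sketch below shows the general idea of swapping a single reference photo onto a live webcam feed. It is not Deep-Live-Cam's actual code; it assumes Python, the open-source insightface library (whose pretrained face-analysis pipeline and inswapper_128 swapping model are widely used by hobbyist face-swap projects), and OpenCV for camera capture.

    # Conceptual sketch only: swap one reference face onto a live webcam feed.
    # Assumes the insightface and opencv-python packages and a separately
    # downloaded inswapper_128.onnx model; this is not Deep-Live-Cam's code.
    import cv2
    import insightface
    from insightface.app import FaceAnalysis

    # Pretrained pipeline that detects and aligns faces in each frame.
    analyzer = FaceAnalysis(name="buffalo_l")
    analyzer.prepare(ctx_id=0, det_size=(640, 640))

    # Pretrained model that re-renders a detected face with a new identity.
    swapper = insightface.model_zoo.get_model("inswapper_128.onnx")

    # A single photo supplies the identity to be applied.
    source_img = cv2.imread("reference_photo.jpg")
    source_face = analyzer.get(source_img)[0]

    cap = cv2.VideoCapture(0)  # live webcam feed
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Replace every detected face, keeping the target's pose, lighting,
        # and expression while changing the identity.
        for target_face in analyzer.get(frame):
            frame = swapper.get(frame, target_face, source_face, paste_back=True)
        cv2.imshow("swapped", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()

A naive loop like this is neither as fast nor as polished as Deep-Live-Cam, but it conveys how little code now stands between a single photo and a live impersonation.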

Some experiments - it works almost flawlessly and it's totally real-time. Took me 5 minutes to install. pic.twitter.com/Ow0QRF7WOj

– joao (@jay_wooow) August 9, 2024

The potential misuse of Deep-Live-Cam has sparked concern among tech observers. Illustrator Corey Brickley quipped that many of the latest breakthrough technologies seem ripe for abuse.

"Weird how all the major innovations coming out of tech lately are under the Fraud skill tree," Brickley tweeted, adding, "Nice remember to establish code words with your parents everyone."

While Brickley's comment is intentionally sardonic, it highlights the potential for bad actors to use such tools for deception. Considering the prevalence and accessibility of deepfake technologies, setting up a safe word to confirm your identity to family and friends is not that crazy an idea.

Face-swapping technology itself is not new. The term "deepfake" dates back to 2017, when a Reddit user posting under that handle shared pictures and videos that swapped celebrities' faces onto porn performers. At the time, the technology was slow, expensive, and far from real-time. Those primitive techniques have since improved to an incredible degree: projects like Deep-Live-Cam are smarter and faster and have lowered the barrier to entry, allowing anyone with a standard PC to create deepfakes using free software.

The potential for abuse is already becoming well documented.

In February, scammers in China impersonated company executives, including the CFO, on a video call and tricked an employee into making more than $25 million in transfers; the employee was the only real person on the call. In a similar case, someone in the US cloned Joe Biden's voice in an attempt to dissuade people from voting in the New Hampshire primary. With the rise of real-time deepfake software like Deep-Live-Cam, remote video fraud may become more common, affecting not just public figures but ordinary individuals as well.

Source: techspot.com
