You can’t do anything to stop AI from stealing your voice.

According to a Consumer Reports investigation, most popular AI voice cloning programs, the majority of which are free, lack meaningful safeguards to prevent people from impersonating others' voices without their consent.

Thanks to impressive advances in voice cloning AI in recent years, several services can now accurately duplicate a person's voice from just a few seconds of sample audio. A flashpoint came during last year's Democratic primary, when voters received robocalls featuring a fake Joe Biden urging them not to cast their ballots. The Federal Communications Commission has since banned AI-generated robocalls, and the political consultant who admitted to orchestrating the scheme was fined $6 million.

According to the investigation, five of the six most prominent publicly available AI voice cloning tools have safeguards that are easy to bypass, making it straightforward to clone a person's voice without their permission. Deepfake audio detection software, meanwhile, often struggles to distinguish synthetic speech from the real thing.

There are few federal regulations governing generative artificial intelligence (AI), the rapidly developing field of technology that replicates human characteristics including appearance, writing, and voice. Most safety and ethical standards across the industry are self-imposed. In 2023, Biden signed an executive order on AI that included some safety requirements, but President Donald Trump rescinded it when he took office.


Voice cloning technology works by extrapolating a person's voice from an audio sample of them speaking and using it to generate synthetic audio files. Without safeguards in place, anyone who creates an account can simply upload audio of a person speaking, such as a clip from a YouTube or TikTok video, and have the service imitate that voice.
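
To make that workflow concrete, here is a minimal sketch of the typical clone-then-synthesize loop. The endpoint URLs, field names, and API key below are hypothetical placeholders for illustration, not any specific vendor's API; real services differ in the details, but the two-step shape is the same.

```python
# Hypothetical sketch of a typical voice-cloning API flow.
# The base URL, endpoints, and field names are invented
# placeholders; no specific vendor's API is implied.
import requests

API_BASE = "https://api.example-voice-service.com/v1"  # hypothetical
API_KEY = "sk-..."  # account credential; most services require one

headers = {"Authorization": f"Bearer {API_KEY}"}

# Step 1: upload a short audio sample of the target voice.
# Note: at many services, "consent" is nothing more than
# the uploader checking a box.
with open("sample.mp3", "rb") as f:
    resp = requests.post(
        f"{API_BASE}/voices",
        headers=headers,
        files={"sample": f},
        data={"consent_confirmed": "true"},  # often just a checkbox
    )
voice_id = resp.json()["voice_id"]

# Step 2: synthesize arbitrary new speech in the cloned voice.
resp = requests.post(
    f"{API_BASE}/synthesize",
    headers=headers,
    json={"voice_id": voice_id, "text": "Any text the user types."},
)
with open("cloned_output.mp3", "wb") as out:
    out.write(resp.content)
```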

Four of the services, ElevenLabs, Speechify, PlayHT, and Lovo, require only that users check a box attesting that the person whose voice is being cloned has given permission.

Another service, Resemble AI, requires users to record audio in real time rather than upload a recording. But Consumer Reports was able to circumvent that restriction simply by playing an audio clip from a computer during the recording step.

The sixth service, Descript, was the only one with a reasonably effective safeguard: it requires the would-be cloner to record a specific consent statement, which is difficult to fake except by cloning it through another service.
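
For illustration, here is a minimal sketch of how a consent-statement check like that could work in principle. This is an assumption about the general technique, not Descript's actual implementation, and transcribe() is a hypothetical stand-in for any speech-to-text backend.

```python
# Sketch of a consent-statement safeguard: the service generates
# a unique sentence, asks the would-be cloner to record it, and
# checks that the transcript matches. Not any vendor's real code.
import secrets

def make_consent_challenge() -> str:
    # A random nonce makes prerecorded or scraped audio useless,
    # since the exact sentence cannot be known in advance.
    nonce = secrets.token_hex(4)
    return f"I consent to cloning my voice. Code {nonce}."

def verify_consent(recording_path: str, challenge: str, transcribe) -> bool:
    # transcribe(path) -> str is a placeholder; plug in any
    # speech-to-text call here.
    heard = transcribe(recording_path).strip().lower()
    return challenge.lower() in heard
```

As the article notes, the remaining weakness is that the consent recording itself can be faked by cloning the voice through a different provider first.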

All six services are publicly accessible through their websites. Only ElevenLabs and Resemble AI charge to create a custom voice clone, at $5 and $1 respectively; the others are free.

Some of the companies acknowledged that misuse of their tools could cause real harm.


In an emailed comment to NEXT TECH PLUS News, a Resemble AI official said, “We have put strong safeguards in place to prevent the creation of deep fakes and protect against voice impersonation because we recognize the potential for misuse of this powerful tool.”

AI voice cloning has legitimate applications, including assisting people with disabilities and producing audio translations for speakers of other languages. But there is also enormous potential for harm, said Sarah Myers West, co-executive director of the AI Now Institute, a think tank that studies the consequences of AI policy.

As West told NEXT TECH PLUS News, “this could obviously be used for fraud, scams, and disinformation, for example, impersonating institutional figures.”

There is little research into how often AI is used in audio-based scams. In so-called grandparent scams, a criminal calls a victim claiming an emergency involving a family member, such as being kidnapped, arrested, or injured. The Federal Trade Commission has warned that such schemes may use AI, even though the scams predate the technology.

Cloned voices have also been used to make music without the permission of the artist being impersonated, as happened with a viral 2023 song falsely attributed to Drake and the Weeknd, and musicians have struggled to maintain control over their likenesses when others release music using their voices.

By Next Tech Plus
