ElevenLabs’ AI voice generation ‘highly likely’ used in a Russian influence operation
Generative AI has a number of well-documented negative uses, from ghostwriting academic papers to copying artists. And now, it appears to be playing a growing role in state-backed influence operations.
One recent campaign was “very likely” aided by commercial AI voice generation products, including technology publicly released by the buzzy startup ElevenLabs, according to a recent report from the Massachusetts-based threat intelligence company Recorded Future.
The report describes a Russian-led campaign designed to undermine European support for Ukraine, dubbed “Operation Undercut,” that prominently used AI-generated voiceovers in fake or misleading “news” videos.
The videos, which were aimed at European audiences, attacked Ukrainian politicians as corrupt or questioned the usefulness of military aid to Ukraine, among other themes. For example, one video claimed that “even jammers can’t save American Abrams tanks,” referring to the devices US tanks use to deflect incoming attacks, reinforcing the point that sending high-tech weaponry to Ukraine is pointless.
The report says the videos’ creators “very likely” used AI voice generation, including ElevenLabs’ technology, to make their content appear more legitimate. To confirm this, Recorded Future researchers submitted the clips to ElevenLabs’ own AI Speech Classifier, which lets anyone check whether an audio clip was created using ElevenLabs, and found a match.
ElevenLabs did not respond to requests for comment. Although Recorded Future noted that several commercial AI voice generation tools were likely used, ElevenLabs was the only one it named.
The campaign’s orchestrators inadvertently demonstrated how useful AI voice generation was: they released some videos with voiceovers from real people speaking “with a discernible Russian accent.” By contrast, the AI-generated voiceovers spoke multiple European languages, including English, French, German, and Polish, without any telltale foreign accent.
According to Recorded Future, AI also allowed the misleading clips to be released quickly in multiple languages spoken in Europe, including English, German, French, Polish, and Turkish (presumably, all languages supported by ElevenLabs).
Recorded Future attributed the campaign to the Social Design Agency, a Russia-based organization that the US government sanctioned in March for running “a network of more than 60 websites posing as real news organizations in Europe” and using fake social media accounts to amplify the spoofed sites’ misleading content. All of this was done “on behalf of the Government of the Russian Federation,” the US State Department said at the time.
The campaign’s overall impact on public opinion in Europe was minimal, Recorded Future concluded.
This is not the first time ElevenLabs’ products have been singled out for alleged misuse. The company’s technology was behind a robocall impersonating President Joe Biden that urged voters not to go to the polls during a January 2024 primary election, a voice fraud detection company concluded, according to Bloomberg. In response, ElevenLabs said it rolled out new safety features, such as automatically blocking the voices of politicians.
ElevenLabs prohibits “unauthorized, harmful, or deceptive impersonation” and says it uses a variety of tools to enforce this, including both automated and human moderation.
ElevenLabs has experienced phenomenal growth since its founding in 2022. It recently grew its annual recurring revenue to $80 million, up from $25 million less than a year earlier, and could soon be valued at $3 billion, TechCrunch previously reported. Its investors include Andreessen Horowitz and former GitHub CEO Nat Friedman.