Voice cloning technology has opened up a world of possibilities, from enhancing productivity tools to aiding in creative marketing strategies. However, a recent study by Consumer Reports has shed light on a concerning issue plaguing the industry: the lack of safeguards against scammers.
According to the study, several companies developing generative AI tools for voice cloning have failed to implement sufficient barriers to prevent malicious use of their technology. In fact, four of the six companies examined lacked meaningful measures to prevent individuals' voices from being cloned without their consent.
Consumer Reports' testing revealed that it was alarmingly easy to create voice clones from publicly available audio with tools from these companies. Merely checking a box claiming the legal right to create a voice clone was all it took; no further consent verification or technical safeguard stood in the way.
While companies like Descript and Resemble AI fared better with some safeguards in place, even these measures were not foolproof. The study’s findings have prompted calls for the voice-cloning industry to adopt stricter norms and standards to mitigate the risk of fraudulent activities.
The implications of unchecked voice cloning technology are severe. Scammers are increasingly using this technology for social engineering tactics, such as impersonating close relatives or friends to extract money or sensitive information from unsuspecting victims. The FBI has even issued alerts about the use of voice-cloning schemes in financial fraud.
In response to these challenges, organizations like Starling Bank are advising customers to agree on unique verification phrases with family members and to be cautious about sharing personal information on social media. Meanwhile, productivity tools like Synthesia and D-ID are being employed in enterprises for various purposes, such as creating realistic digital avatars for marketing and presentations.
To address potential misuse of its technology, Synthesia has implemented an ethics and AI policy that includes biometric checks to prevent non-consensual cloning. The company also enforces stringent content moderation to ensure harmful content is not generated on its platform.
As the voice-cloning industry continues to evolve and expand, it is crucial for companies to prioritize security and ethical considerations. By establishing robust safeguards and adhering to industry best practices, developers can help prevent the misuse of this powerful technology and protect users from potential harm.