Truecaller's AI assistant can now clone your voice to answer calls

TL;DR

  • Truecaller’s AI Assistant offers features like automatic call answering, message taking, and spam detection.
  • Users can now train the AI Assistant to answer calls using a digital clone of their voice.
  • Previously, users had to choose from seven pre-set digital voices for the assistant.

Truecaller, the well-known dialer and spam-blocking app, has rolled out a major enhancement to its AI Assistant feature. Now, paid users can create a digital replica of their voice to manage incoming calls. This intriguing yet slightly unsettling advancement stems from Truecaller's partnership with Microsoft, leveraging the latest Personal Voice technology from Azure AI Speech.

Truecaller introduced its AI Assistant in September 2022, offering a suite of AI-powered features like automatic call answering, call screening, message taking, and call recording. The assistant also interacts with callers to ascertain the reason for their call, effectively filtering out spam with an accuracy of over 90%. Initially, users could select from seven pre-set digital voices for their assistant.

Users can now create a personalized digital voice clone for the same purpose. After consenting, they record themselves reading a short script to generate the digital copy. To ensure transparency, Truecaller limits customization of the initial greeting when the personal voice option is used, making it clear to callers that they are interacting with a “digital” version of the user.

The Personal Voice feature is rolling out gradually by region, starting with the USA, Canada, Australia, South Africa, India, Sweden, and Chile, with more countries planned soon.

This development offers a new level of personalization but also raises concerns about the potential misuse of voice cloning technology. Numerous AI voice-generating apps are already available, and incidents of scammers in India using AI to impersonate the voices of acquaintances to defraud people are becoming more frequent.

As the distinction between real and synthetic content diminishes, there is growing apprehension that AI advancements might be progressing too quickly, outstripping our ability to fully understand the potential consequences.
