Updated: 21-08-2025 at 3:30 PM
Indian authorities have recently alerted citizens about emerging phone scams that leverage AI to imitate people’s voices. These attacks clone the voices of relatives and friends to sound desperate for help before tricking victims into online transactions.
Fraudsters first gather personal information about potential targets through social media sites, matrimonial profiles, or data leaks. They then use AI deepfake techniques to clone the target’s voice, or that of a family member, from audio captured in earlier calls or recordings.
The scammers make up elaborate stories about urgent needs. With a few personal details woven in, victims believe they are speaking to an actual relative and make online payments. The money is routed through intermediate accounts to the scammers and disappears without a trace.
Some fraudsters have even replicated bank managers’ voices to extract OTPs or to activate instant loans in victims’ names. The cloned voices sound natural, which makes the fakery harder to detect, especially for senior citizens.
Read on for answers to questions such as: What is voice cloning? How does AI voice cloning work? Is AI voice cloning legal? And how can you detect it?
The table below summarises the key points of this article.
| Question | Answer |
|---|---|
| What is voice cloning? | The use of AI software to mimic a person’s voice; fraudsters use it to impersonate victims’ relatives or acquaintances for personal gain. |
| How does AI voice cloning work? | AI models analyse a short sample of a person’s voice and then generate new speech that imitates it, often through readily available voice cloning apps. |
| Is AI voice cloning legal? | Voice cloning itself is not expressly regulated in India, but using a cloned voice for fraud or impersonation is illegal. |
| How to detect AI voice cloning? | Verify the caller’s identity, never share sensitive information over a phone call, and trust your judgement if anything feels doubtful. |
Voice cloning refers to AI systems that can mimic almost any human voice after analysing a sample of it. The cloned voice can then be used to generate new, fake utterances aimed at fraud.
In India, a sharp rise in voice cloning apps has enabled scammers to impersonate identities in three main ways:
Social engineering frauds: Scammers call victims while mimicking the voices of known bank managers or customer care agents and ask for confidential information such as passwords or OTPs. The familiarity created by the vocal similarity tricks victims into complying.
Celebrity endorsement scams: Synthesised celebrity voices are used to spread fake endorsements of products, investments, or donation drives among fans, who are swayed by the perceived credibility.
Deepfake social media profiles: Cloned voices are paired with realistic AI-generated images on fake social media accounts, which are then used to cheat large numbers of targets through messages and posts.
Read more: How To Report Cybercrime And Online Fraud In India?
The major concerns surrounding AI voice impersonation scams are detailed below:
Increased perceived credibility: AI-mimicked voices create high levels of familiarity and trust, causing people to lower their defences and share sensitive data on the assumption that the speaker’s identity is genuine.
Difficult technical detection: Unlike manipulated images, vocal deepfakes are still hard to detect reliably.
No legal accountability frameworks: India still lacks comprehensive laws addressing voice cloning used to execute fraud, and the anonymity of blockchain-based cloning apps adds to the problem.
Rapid proliferation on the dark web: Cheap voice-cloning services are available on the dark web, making it easy for scammers anywhere in the world to target Indians.
Limited user awareness: Most people are unaware of how advanced AI voice cloning has become, which leaves them vulnerable to psychological manipulation through such scams.
Read more: Quishing: The New Threat Of QR Code Phishing And How You Can Stay Safe!
The potential for misuse of voice cloning is at the heart of the AI voice cloning debate. The real-world incidents below show why:
In 2021, the voice of a popular celebrity was cloned to make fake promotional calls to thousands of people, promoting an e-commerce platform that turned out to be fraudulent and caused financial losses.
A Mumbai-based professor received a call that mimicked her bank manager’s voice and asked for an OTP for urgent validation. The sensitive information she shared with the cloned voice led to a ₹3 lakh fraudulent transaction that was detected later.
Over 2,000 people lost money in a cryptocurrency wallet scam that used the synthesised voice of a well-known business tycoon to gain credibility and coax victims.
As the risks grow, public awareness, law-enforcement capacity building around deepfakes and counter-technologies, and strong legal deterrence are crucial to combating the threats posed by AI voice-cloning frauds. Fintech companies also need technological safeguards against spoofed voices being used to authorise transactions.
As AI improves, cloned voices will become more refined and will spread across scam channels, especially messaging apps and websites. Tracing these scams back to the fraudsters will also become harder for law enforcement. The best defence for citizens is to stay cautious and think rationally, keeping the following precautions in mind:
Never share sensitive financial or personal information with unknown callers.
Verify the emergency claims of relatives through other contacts before initiating any transfers.
Use security questions or video calls to authenticate the caller’s identity if suspicions arise.
Limit the family information you share on social media, and avoid sharing it altogether if your account is public.
Notify local authorities immediately upon detecting a voice-cloning scam.
Trust your rational judgement first instead of panicking.
AI voice impersonation scams are rising day by day, so it is extremely important to stay educated and aware of the techniques fraudsters use and the ways we can protect ourselves from them.
At Jaagruk Bharat, we’re committed to helping you stay safe online. If you’ve been a victim of fraud or suspect it, visit our Cyber Crime Complaint Page to report the incident and seek guidance. Our team ensures your complaint is directed to the right authorities with clarity and speed.
You can also reach out to Jaagruk Bharat through our Community Page with any questions that you might have.