Voice Cloning Scams



Recommended Posts

Heard on the news: In some countries, "voice cloning" is being used to impersonate relatives and ask for money. In the story, the technology was compared to the so-called "voice ID" technology used in some countries for voice verification with banks, etc., using voice passcodes such as "my voice is my password".

This story made me realize that those who demand "voice verification" in Second Life could have altogether different reasons for wanting to capture your voice in our new "Age of AI".

Any thoughts on this?

  • Like 1

That’s quite unnerving, Love, and I read that too: all done with AI voice cloning (it’s amazing, btw, if you get a chance to play with it).

Like all good things, some people find a way to exploit them for nefarious purposes. Like the screwdriver or the text box 🙈

My thoughts on the exploitation of voice cloning using samples captured in SL: people are naive (to use a kind word) to go for voice verification in the first place. Why would anyone do that? It isn’t as if that verification is provided by LL, which makes it even more suspicious in my view. You are exposing your real self to a self-appointed “voice verificator” (perv) for what purposes exactly… 😂

  • Like 1

5 minutes ago, Krystina Ferraris said:

You are exposing your real self to a self-appointed “voice verificator” (perv) for what purposes exactly… 😂

Perhaps they find it... gratifying. So to speak.

But to take the logical leap: 1) Other user demands voice "verification". 2) User also uncovers your RL info (bonus: using bots!!1!!). 3) User steals your RL identity and monies using your cloned voice from #1. 4) You blame LL because it must be their fault. 5) The forces of evil win.

 

  • Like 2

Conclusion: Never agree to use voice verification for important things like bank accounts, and never use voice on the Internet, where even your friends are de facto total strangers, as in SL or other online games.

BTW: No business here in NL has ever asked me to verify my account with my voice.
 

Edited by Sid Nagy
Time for another coffee.
  • Like 3

3 hours ago, Love Zhaoying said:

Heard on the news: In some countries, "voice cloning" is being used to impersonate relatives and ask for money. In the story, the technology was compared to the so-called "voice ID" technology used in some countries for voice verification with banks, etc., using voice passcodes such as "my voice is my password".

This story made me realize that those who demand "voice verification" in Second Life could have altogether different reasons for wanting to capture your voice in our new "Age of AI".

Any thoughts on this?

No matter who asks me ... email, phone, and all other online and non-personal contacts ... and even in RL ;) ... the answer is NO.

The "voice verification"in SL has only to do with hearing you'r indeed a guy or girl.

Edited by Alwin Alcott
  • Like 5

You can't clone a voice from just a sentence or two. I don't know the exact number, but you need A LOT of spoken lines from someone in order to use these new AI tools.

The point, rather, is that "voice verification" will effectively become useless in a few years, when EVERYONE will be able to realistically alter their voice.

Edited by xDancingStarx
  • Like 1

4 hours ago, Ardy Lay said:

Yeah, and never say anything if you answer a telephone.

 

4 hours ago, Sid Nagy said:

Conclusion: Never agree to use voice verification for important things like bank accounts, and never use voice on the Internet, where even your friends are de facto total strangers, as in SL or other online games.

BTW: No business here in NL has ever asked me to verify my account with my voice.
 

And then there is this...

When doing business on the phone with many different types of companies, you are told "this call is being recorded," and at some point, if you agree to some "service" during the call, they record your voice "agreement" to the service (with its recurring fees, etc.). I'm not saying those recordings are or can be abused, just that there is a natural connection between these concepts of your voice being "used for something".

  • Like 2
  • Thanks 1

56 minutes ago, Love Zhaoying said:

... you are told "this call is being recorded," and at some point, if you agree to some "service" during the call, they record your voice

Exactly this. It has also happened here a few times (I read it on the news, and it was reported internally in our vet corporate) that third parties doing various types of customer support for banks or credit card companies were found guilty of selling customers’ PI (and credit card details) on the black market.

  • Like 2
  • Thanks 1

7 minutes ago, Krystina Ferraris said:

Exactly this. It has also happened here a few times (I read it on the news, and it was reported internally in our vet corporate) that third parties doing various types of customer support for banks or credit card companies were found guilty of selling customers’ PI (and credit card details) on the black market.

I assume that scam "cold robo-calls" also do this a lot.  If they can get you talking, they can catch you agreeing to something and record it.

  • Like 1

3 minutes ago, Love Zhaoying said:

I assume that scam "cold robo-calls" also do this a lot.  If they can get you talking, they can catch you agreeing to something and record it.

Yes, and this was really an issue between mid-2020 and 2021; we were absolutely under siege with these fake bot calls here in Ireland. I used to get five or six a day at their peak. This happened after our health service was hacked and the personal data of millions of citizens was sold on the dark web. It was never recovered.

Most people wouldn’t answer, or would just hang up right away, but some clearly didn’t, as thousands reported getting badly scammed. The typical bot would claim to be the Revenue calling about tax you owed.

 

  • Like 1

3 hours ago, xDancingStarx said:

You can't clone a voice from just a sentence or two. I don't know the exact number, but you need A LOT of spoken lines from someone in order to use these new AI tools.

The point, rather, is that "voice verification" will effectively become useless in a few years, when EVERYONE will be able to realistically alter their voice.

It seems that these days, with modern AI techniques, they can already clone a voice from just a few seconds of recording.


20 minutes ago, Zeta Vandyke said:

It seems that these days, with modern AI techniques, they can already clone a voice from just a few seconds of recording.

I think it depends on context. I'd easily believe they can clone a voice well enough to fool the other AI that checks voices for 'bank verification' or whatever, but I don't think it's to the point where it can fool someone listening to it for a full five-minute 'my wallet was stolen by a Nigerian prince' scam.
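
(For a sense of why a clone only has to be "close enough": below is a minimal sketch of how automated voice verification is commonly described as working, not any bank's actual system. The embed() function is a hypothetical stand-in for a real speaker-embedding model; everything here is illustrative only.)

# Minimal sketch: automated "voice ID" checks typically reduce a clip to a
# speaker-embedding vector and compare it to the enrolled reference with
# cosine similarity. A good clone produces a nearby embedding, which is why
# such checks can be fooled even when a human listener might not be.
import numpy as np

def embed(audio: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for a real speaker-embedding model."""
    # A real system would run a neural encoder here; we just derive a
    # deterministic fake vector from the raw samples for illustration.
    rng = np.random.default_rng(abs(hash(audio.tobytes())) % (2**32))
    return rng.standard_normal(192)

def verify(enrolled: np.ndarray, attempt: np.ndarray, threshold: float = 0.7) -> bool:
    """Accept the caller if the two embeddings are similar enough."""
    cos = float(np.dot(enrolled, attempt) /
                (np.linalg.norm(enrolled) * np.linalg.norm(attempt)))
    return cos >= threshold

# In practice these would be decoded audio arrays from the enrollment call
# and the incoming call; anything scoring above the threshold passes,
# whether it came from the real speaker or a cloned voice.
enrolled_clip = np.ones(16000, dtype=np.float32)
incoming_clip = np.ones(16000, dtype=np.float32)
print(verify(embed(enrolled_clip), embed(incoming_clip)))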

  • Like 1

4 minutes ago, Quistess Alpha said:

I think it depends on context. I'd easily believe they can clone a voice well enough to fool the other AI that checks voices for 'bank verification' or whatever, but I don't think it's to the point where it can fool someone listening to it for a full five-minute 'my wallet was stolen by a Nigerian prince' scam.

Well, it seems they have made quite some advances:

"In January, Microsoft unveiled an AI that can clone a speaker’s voice after hearing them talk for just three seconds. While this system, VALL-E, was far from the first voice cloning AI, its accuracy and need for such a small audio sample set a new bar for the tech.

Microsoft has now raised that bar again with an update called “VALL-E X,” which can clone a voice from a short sample (4 to 10 seconds) and then use it to synthesize speech in a different language, all while preserving the original speaker’s voice, emotion, and tone."

From this article: https://www.freethink.com/robots-ai/voice-cloning

With 4 to 10 seconds of audio it clones a voice and can speak multiple languages with it. That's quite amazing, and a bit scary :)

  • Like 2
  • Thanks 1

7 minutes ago, Quistess Alpha said:

I think it depends on context. I'd easily believe they can clone a voice well enough to fool the other AI that checks voices for 'bank verification' or whatever, but I don't think it's to the point where it can fool someone listening to it for a full five-minute 'my wallet was stolen by a Nigerian prince' scam.

That's kind of what the news story implied: the voice could fool a relative (who you'd think would be hard to fool) into sending money. I don't recall whether it was a "conversation" or just a recorded message. But since the context was, IIRC, a "kidnapping ransom" scam, you'd think a recording would suffice to scare your relatives into paying.


3 minutes ago, Zeta Vandyke said:

Well, it seems they have made quite some advances:

"In January, Microsoft unveiled an AI that can clone a speaker’s voice after hearing them talk for just three seconds. While this system, VALL-E, was far from the first voice cloning AI, its accuracy and need for such a small audio sample set a new bar for the tech.

Microsoft has now raised that bar again with an update called “VALL-E X,” which can clone a voice from a short sample (4 to 10 seconds) and then use it to synthesize speech in a different language, all while preserving the original speaker’s voice, emotion, and tone."

From this article: https://www.freethink.com/robots-ai/voice-cloning

With 4 to 10 seconds of audio it clones a voice and can speak multiple languages with it. That's quite amazing, and a bit scary :)

The AIs that write news stories can probably already lie or exaggerate about the abilities of AIs in the story details!

Soon, we won't know what to believe in the AI-generated news stories about AIs.

  • Like 1

11 hours ago, Love Zhaoying said:

Heard on the news: In some countries, "voice cloning" is being used to impersonate relatives and ask for money. In the story, the technology was compared to the so-called "voice ID" technology used in some countries for voice verification with banks, etc., using voice passcodes such as "my voice is my password".

This story made me realize that those who demand "voice verification" in Second Life could have altogether different reasons for wanting to capture your voice in our new "Age of AI".

Any thoughts on this?

Are you suggesting that perhaps the voice cloning originates from online games in general?

Aside from those who say they use voice to verify other people's genders, I have often thought people on voice might be trying to avoid keyloggers by using voice.

But I'm wondering: where are people getting the voices to clone, if one takes online games out of the equation? Did your news story say where the voice was cloned from, and how?

Edited by EliseAnne85

I have no idea if this is related or not, but there's an entire category of "record voice samples" work on freelance platforms, though you see much, much more of it on the low tier sites. There's tons of it in all kinds of languages, as well - not just English. Zero chance I'd ever bother with it as it's underpaid as all hell (pennies to a few dollars, at most, for sometimes up to 75+ spoken phrases). It's likely just being used to help train voice assistants and whatnot, but anyone can jump on there and set up projects without revealing who they actually are, so who even knows where some of that stuff ends up.

  • Like 1

23 minutes ago, Ayashe Ninetails said:

but anyone can jump on there and set up projects without revealing who they actually are

Time to go PIOF for everything, then. Although, as some are saying or suggesting in many "scam-related" threads, PIOF only makes it worse.

I'm going to call my real bank today and talk to them about this and see what they have to say.


28 minutes ago, Ayashe Ninetails said:

I have no idea if this is related or not, but there's an entire category of "record voice samples" work on freelance platforms, though you see much, much more of it on the low tier sites. There's tons of it in all kinds of languages, as well - not just English. Zero chance I'd ever bother with it as it's underpaid as all hell (pennies to a few dollars, at most, for sometimes up to 75+ spoken phrases). It's likely just being used to help train voice assistants and whatnot, but anyone can jump on there and set up projects without revealing who they actually are, so who even knows where some of that stuff ends up.

Perhaps those samples are re-used by earlier AI software...? If so, they could be part of the earlier "proto-voice cloning" movement.


4 minutes ago, EliseAnne85 said:

Time to go PIOF for everything, then. Although, as some are saying or suggesting in many "scam-related" threads, PIOF only makes it worse.

I'm going to call my real bank today and talk to them about this and see what they have to say.

Well, they do need to verify their payment with the platform in order to pay freelancers through the system. I'm just saying there's no way for a freelancer to know exactly who they're providing their voice to and what it's being used for (the platform may know the company's and project manager's names, but anyone doing the work isn't necessarily told).

 

2 minutes ago, Love Zhaoying said:

Perhaps those samples are re-used by earlier AI software...? If so, they could be part of the earlier "proto-voice cloning" movement.

Very possible. But much of that work is just handed over blind, so who even knows. I used to think it was for voice assistants, but there's so much of it and the pay is always so suspect. I've seen the occasional project like that come through from the company I currently freelance with, but I know they're heavily involved in the AI and NLP space. Still won't do it, but the pay was actually far more sane.

As far as SL is concerned, I wouldn't really think that's what anyone's using voice verification for. Though the idea of a guy hanging around in adult sims trying to rip voices from people for financial scams is a bit funny - but unlikely.


1 hour ago, EliseAnne85 said:

Are you suggesting that perhaps the voice cloning originates from online games in general?

No.

1 hour ago, EliseAnne85 said:

But I'm wondering: where are people getting the voices to clone, if one takes online games out of the equation? Did your news story say where the voice was cloned from, and how?

I could not find today's BBC story, and I did not listen closely enough to remember the details (I assume it was voice clips + AI). Here is a YouTube video published today by "Good Morning America" which gives some details:

 


1 hour ago, EliseAnne85 said:

Time to go PIOF for everything, then. Although, as some are saying or suggesting in many "scam-related" threads, PIOF only makes it worse.

I'm going to call my real bank today and talk to them about this and see what they have to say.

Upwork and sites like that tell you whether the person or company behind an ad has a paid subscription, whether they have posted before, and what their reputation is. It’s easy enough to spot dodgy stuff without adding anything else to it.


1 hour ago, Ayashe Ninetails said:

As far as SL is concerned, I wouldn't really think that's what anyone's using voice verification for. Though the idea of a guy hanging around in adult sims trying to rip voices from people for financial scams is a bit funny - but unlikely.

Voice verifying with some rando exposes enough of your RL to give those with unsavoury intentions plenty of info. Some places ask that, along with voice verification, you also submit a picture of yourself holding a sign with a specific phrase on it, or that you cam with the person.

Why anyone would do that or think it’s a great idea is beyond me, but to each their own.

  • Like 1
  • Thanks 1

1 hour ago, Ayashe Ninetails said:

As far as SL is concerned, I wouldn't really think that's what anyone's using voice verification for. Though the idea of a guy hanging around in adult sims trying to rip voices from people for financial scams is a bit funny - but unlikely.

I agree, I was just putting it out there. People DO get doxxed, so who knows what else can happen?

  • Like 1
