
      How AI Scam Calls Imitating Familiar Voices Are a Growing Problem | RRD’s Opinion

Scam calls using AI to mimic the voices of people you might know are being used to exploit unsuspecting members of the public. These calls use what is known as generative AI, which refers to systems capable of creating text, images, or other media such as video, based on prompts from a user.

Deepfakes have gained popularity over the last few years with a number of high-profile incidents, such as actress Emma Watson’s likeness being used in a series of suggestive adverts that appeared on Facebook and Instagram.

There was also the widely shared video from 2022 in which Ukrainian president Volodymyr Zelensky appeared to tell Ukrainians to “lay down arms”.

      The technology to create an audio deepfake, a realistic copy of a person’s voice, is becoming increasingly common.

      To create a realistic copy of someone’s voice you need data to train the algorithm.

That means having lots of audio recordings of your intended target’s voice.

      The more examples of the person’s voice that you can feed into the algorithms, the better and more convincing the eventual copy will be.

      Many of us already share details of our daily lives on the internet.

That means the audio data required to create a realistic copy of a voice could be readily available on social media.

      But what happens once a copy is out there?

      A deepfake algorithm could enable anyone in possession of the data to make “you” say whatever they want.

      In practice, this can be as simple as writing out some text and getting the computer to say it out loud in what sounds like your voice.


      This AI risks increasing the prevalence of audio misinformation and disinformation.

It can be used to try to influence international or national public opinion, as seen with the “videos” of Volodymyr Zelensky.

      But the ubiquity and availability of these technologies pose significant challenges at a local level too, particularly in the growing trend of “AI scam calls”.

Many of us will have received a scam or phishing call telling us, for example, that our computer has been compromised and we must immediately log in, potentially giving the caller access to our data.

It is often very easy to spot that such a call is a scam, especially when the caller makes requests that someone from a legitimate organisation would not.

      So, now imagine that the voice on the other end of the phone is not just a stranger, but sounds exactly like a friend or loved one.

      This injects a whole new level of complexity, and panic, for the recipient.

A recent story highlights an incident in which a mother received a call from an unknown number.

When she answered the phone, it was her daughter.

The daughter had allegedly been kidnapped and was phoning her mother to pass on a ransom demand.

      In fact, the girl was safe and sound.

      The scammers had made a deepfake of her voice.

This is not an isolated incident, with variations of the scam including a supposed car accident, in which the victim calls their family for money to help them out after a crash.

This is not a new scam in itself; the term “virtual kidnapping scam” has been around for many years.

It can take many forms, but a common approach is to trick victims into paying a ransom to free a loved one they believe is being threatened.


The scammer tries to establish unquestioning compliance in order to get the victim to pay a quick ransom before the deception is discovered.

      So, the dawn of powerful and available AI technologies has upped the ante significantly and made things more personal.

      It is one thing to hang up on an anonymous caller, but it takes real confidence in your judgment to hang up on a call from someone sounding just like your child or partner.

There is software that can be used to identify deepfakes; it creates a visual representation of the audio call known as a spectrogram.

When you are listening to the call it might seem impossible to tell it apart from the real person, but voices can be distinguished when spectrograms are analysed side by side.

At least one group has offered detection software for download, though such solutions may still require some technical knowledge to use.
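For readers comfortable with a little code, the sketch below shows roughly what such a side-by-side spectrogram comparison looks like. It is a minimal illustration, assuming Python with the librosa and matplotlib libraries installed; the two WAV file names are placeholders, not output from any particular detection tool.

```python
# Minimal sketch: compare two voice recordings as log-scaled spectrograms,
# plotted side by side. File names below are placeholders for illustration.
import librosa
import librosa.display
import matplotlib.pyplot as plt
import numpy as np

def plot_spectrogram(path, ax, title):
    # Load the audio at its native sample rate and compute a dB-scaled
    # magnitude spectrogram from the short-time Fourier transform.
    y, sr = librosa.load(path, sr=None)
    spec = librosa.amplitude_to_db(np.abs(librosa.stft(y)), ref=np.max)
    img = librosa.display.specshow(spec, sr=sr, x_axis="time", y_axis="log", ax=ax)
    ax.set(title=title)
    return img

fig, axes = plt.subplots(1, 2, figsize=(12, 4), sharey=True)
plot_spectrogram("known_real_voice.wav", axes[0], "Known real voice")
img = plot_spectrogram("suspect_call.wav", axes[1], "Suspect call")
fig.colorbar(img, ax=axes, format="%+2.0f dB")
plt.show()
```

Inconsistencies between the two plots, such as unnatural harmonics or missing background detail, are the kind of cues an analyst would look for, but interpreting them still takes practice.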

Most people will not be able to generate spectrograms, so what can you do when you are not certain what you are hearing is the real thing?

As with any other form of media you might come across, be skeptical.

      If you receive a call from a loved one out of the blue and they ask you for money or make requests that seem out of character, call them back or send them a text to confirm you really are talking to them.

      As the capabilities of AI expand, the lines between reality and fiction will increasingly blur.

      And it is not likely that we will be able to put the technology back in the box.


      That means people will need to become more cautious. 

Rahul Ram Dwivedi (RRD) is a senior journalist at 2YoDoINDIA.

NOTE: Views expressed are personal.
