ARTIFICIAL intelligence lovers could be “weaponized” by cyber-criminals – and used to steal from you.
Security experts have told The U.S. Sun how the dangers of AI boyfriends and girlfriends are growing.
Plenty of AI chatbot apps now allow people to create virtual romantic partners.
Even if an AI service isn’t advertised as offering the feature, regular chatbots can often be convinced to role-play as partners just by asking the right questions.
It might sound like harmless fun, but experts have told us that there are serious hidden dangers.
“Deepfake technology has come on leaps and bounds in the past few years,” said Jamie Akhtar, Co-Founder and CEO at CyberSmart, speaking to The U.S. Sun.
“And, although avatars of lost loved ones or virtual romantic partners can still appear robotic and a little uncanny, the technology is improving all the time.
“The problem is that this technology can be used for malicious ends.”
“Cybercriminals have long weaponized human emotions.
“Most successful social engineering attacks rely on an emotional component, whether creating a sense of urgency or tugging on the victim’s heartstrings with a fictional moving story.
“Deepfakes and griefbots simply take this up a level.”
Like AI lovers, griefbots impersonate a human – typically a loved one who has passed away.
What both have in common is that an ordinary person can become emotionally invested in an AI.
That’s not necessarily a huge problem if you’re talking to a safe chatbot, but experts say it can make AI bots run by criminals more effective.
“It’s not hard to imagine a scenario in which cybercriminals use a griefbot or AI boyfriend/girlfriend to extort money from a victim or trick them into downloading malicious software,” Jamie explained.
“In fact, there are already examples of this kind of attack.
“Earlier this year, a finance worker at a multinational firm was tricked into paying out $25 million to cybercriminals posing as the company’s chief financial officer using deepfake technology.
Cyber expert’s advice on spotting a romance scam
A cyber expert with knowledge of romance scams spoke to The U.S. Sun and warned users of red flags to look out for. Roger Grimes, a defense evangelist at cyber-firm KnowBe4, listed 8 main questions to ask yourself:
- Is the person trying to move you off the dating site to an unmonitored app?
- Are they asking lots of questions before revealing info about themselves? This could be a bid to establish common ground and build trust.
- Are they avoiding phone calls and video chats and coming up with excuses?
- Do they say they travel a lot or that they’re not in the same country? This often sets up the scam, as they will later claim to need money to come and see you.
- Have they said they are traveling to meet you on a certain day, only for something unforeseen to happen, meaning they need money to get there?
- Do they ask for deep, dark secrets, or incriminating or nude photos? This could be used for blackmail further down the line.
- Have they said they have been scammed before? This is sometimes a ploy to build trust and convince you that they’re not a scammer.
- Do they have a sob story? Like their spouse died suddenly or left unfairly? Or they’ve been left with kids or massive bills to pay?
“As the technology improves and becomes more affordable to lower-level criminals, we expect to see it being used more and more against individuals as well as multinational businesses.”
So, if you’re speaking to chatbots, make sure that you’re using official, well-known, and well-reviewed apps.
If you’re downloading chatbots from third-party app stores or using suspicious websites, you don’t know who runs the AI behind the scenes.
And that’s not all – even official chatbots aren’t necessarily secure.
Experts have long warned that you shouldn’t share too much private info with any chatbot – even on well-known apps like OpenAI’s ChatGPT or Google Gemini.
Chatbots can leak information that you’ve shared.
And if this info ends up in the wrong hands, it could be used against you.
This is a problem, as chatbots can seem so human-like that they lull you into a false sense of security.
It’s especially dangerous if you’re talking to a chatbot created by a mysterious third party.
“There are several security and privacy concerns about AI girlfriends/boyfriends and ‘griefbots’,” said Chris Hauk, Consumer Privacy Advocate at Pixel Privacy, speaking to The U.S. Sun.
“These apps gather a wealth of information about their users and use trackers that send information back to companies in China and Russia, as well as to other data-hungry companies, including Facebook and Google.
“Many of these apps do not make it clear what data is shared with third parties, nor are they clear about the AI they use.
AI ROMANCE SCAMS – BEWARE!
Watch out for criminals using AI chatbots to hoodwink you…
The U.S. Sun recently revealed the dangers of AI romance scam bots – here’s what you need to know:
AI chatbots are being used to scam people looking for romance online. These chatbots are designed to mimic human conversation and can be difficult to spot.
However, there are some warning signs that can help you identify them.
For example, if the chatbot responds too quickly and with generic answers, it’s likely not a real person.
Another clue is if the chatbot tries to move the conversation off the dating platform and onto a different app or website.
Additionally, if the chatbot asks for personal information or money, it’s almost certainly a scam.
It’s important to stay vigilant and use caution when interacting with strangers online, especially when it comes to matters of the heart.
If something seems too good to be true, it probably is.
Be skeptical of anyone who seems too perfect or too eager to move the relationship forward.
By being aware of these warning signs, you can protect yourself from falling victim to AI chatbot scams.
“As users get more comfortable with talking to an AI bot, they will likely become more chatty, revealing additional information about themselves.”
So resist the temptation to overshare, even when talking to regular chatbots.
Don’t hand over any personal info or anything identifying exactly who you are, and certainly don’t agree to send any money.
Treat chatbots like strangers online – and don’t make mistakes that might benefit cybercriminals.