CHATBOTS powered by artificial intelligence are on the rise – but there are some things you should never tell them.
These bots use AI to talk as if they were real humans, but don’t be fooled: There’s a big hidden danger.
AI chatbots might seem like your friend – but be careful what you say to them
Chatbots are seemingly everywhere, with millions of people using Google’s Gemini, OpenAI’s ChatGPT, and Microsoft’s Bing AI.
And there are countless more out there, taking note of everything you say to them.
Sadly, you have to be extremely careful with what you say to an AI chatbot.
The U.S. Sun spoke to cybersecurity expert Dr. Martin J. Kraemer, who revealed the dangers of giving too much away.
“Never share any sensitive information with a chatbot,” said Dr. Kraemer, security awareness advocate at KnowBe4.
“You might have to share your flight booking code or parts of your address with an airline chatbot, but that should be the exception.
“You can always call instead of using the chatbot. Generally, never ever share your password or other authentication credentials with a chatbot.
“Do not share your personal thoughts and intimate details either. It is safe to assume that someone else will gain access to them.
“The bot will not keep everything to itself. Equally, do not share business information.”
CHATTING AWAY
It’s easy to end up sharing too much info with a chatbot.
Their humanlike conversational style can lull you into a false sense of security.
But one simple reason to be wary of what you share is the risk of your account being broken into.
Similarly, unencrypted chats could be spied on by savvy hackers.
And chatbot apps also risk having conversations leaked in cyber-breaches.
That’s not all: Your conversations could end up being pulled back into the AI’s own systems.
“Never share any private or personally identifying information with an AI chatbot,” said Paul Bischoff, consumer privacy advocate at Comparitech, speaking with The U.S. Sun.
AI ROMANCE SCAMS – BEWARE!
Watch out for criminals using AI chatbots to hoodwink you...
The U.S. Sun recently revealed the dangers of AI romance scam bots – here’s what you need to know:
AI chatbots are being used to scam people looking for romance online. These chatbots are designed to mimic human conversation and can be difficult to spot.
However, there are some warning signs that can help you identify them.
For example, if the chatbot responds too quickly and with generic answers, it’s likely not a real person.
Another clue is if the chatbot tries to move the conversation off the dating platform and onto a different app or website.
Additionally, if the chatbot asks for personal information or money, it’s definitely a scam.
It’s important to stay vigilant and use caution when interacting with strangers online, especially when it comes to matters of the heart.
If something seems too good to be true, it probably is.
Be skeptical of anyone who seems too perfect or too eager to move the relationship forward.
By being aware of these warning signs, you can protect yourself from falling victim to AI chatbot scams.
“Your information could become part of the AI’s training data, which means anyone who uses the AI could theoretically access it.
“In more sinister cases, the AI might be designed to collect personal information to be used later in scams and fraud.”
You should be especially wary if you’re downloading artificial intelligence chatbots from untrusted sources.
It’s best to stick to official app stores and avoid chatbots with very few (or poor) reviews.
If you do decide to speak to a chatbot, keep in mind that it’s not a real person – and that the info you share may not be safe or secure.
Google’s Gemini is just one of a range of increasingly popular AI chatbots