
A group of cybersecurity experts in Australia is developing a chatbot that can impersonate a human and sit on a scam phone call to waste a fraudster’s time.

Researchers at Macquarie University in Sydney are creating the chatbot system to act as a “honeypot” that lures scammers into 40-minute-long conversations that amount to nothing.   

“Our model ties them up, wastes their time, and reduces the number of successful scams,” says Macquarie University professor Dali Kaafar. “We can disrupt their business model and make it much harder for them to make money.”

The project started after Kaafar received a spam call and kept the scammer on the line for 40 minutes while entertaining his kids at lunch. Others have done the same, sometimes even pranking the caller. But even though it can be fun or gratifying to turn the tables on a scammer, doing so takes time.

“Then I started thinking about how we could automate the whole process and use Natural Language Processing to develop a computerized chatbot that could have a believable conversation with the scammer,” says Kaafar.

The result is Apate, a chatbot named after the Greek goddess of deceit. It essentially takes ChatGPT-style technology and pairs it with voice cloning to create a dummy human designed to hold long and convincing conversations with a scammer. Kaafar’s team has been training Apate on transcripts of real-world scam conversations, including phone calls, emails, and social media messages, so the bot can generate human-like responses when it answers a scam call.
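The article doesn’t disclose how Apate is actually built, but the general pattern it describes, a persona, the running transcript of the call, and a language model that generates the next time-wasting reply, can be sketched roughly as below. Everything in the sketch (the `generate_reply` stub, the persona text, the turn limit) is an illustrative assumption, not Apate’s real code; a production system would replace the stub with a model fine-tuned on scam transcripts and wrap it in speech-to-text and cloned-voice output.

```python
# Rough sketch of a scam-baiting chatbot loop: keep a persona and the running
# transcript, and generate stalling replies turn by turn. The LLM call is a
# placeholder; the speech components (transcription, voice cloning) are omitted.

PERSONA = (
    "You are 'Margaret', a chatty 72-year-old retiree. Be polite and interested, "
    "ask lots of clarifying questions, but never give out real personal details."
)

def generate_reply(persona: str, transcript: list[str]) -> str:
    """Placeholder for the language-model call (hypothetical)."""
    # A fine-tuned model would condition on the persona and transcript here;
    # this stub just cycles through canned delaying tactics.
    stalling_lines = [
        "Oh dear, hold on, let me find my glasses first.",
        "Sorry, could you explain that part again, a bit slower?",
        "Which company did you say you were calling from?",
    ]
    return stalling_lines[(len(transcript) // 2) % len(stalling_lines)]

def handle_call(scammer_turns: list[str], max_turns: int = 20) -> list[str]:
    """Run the bot against a sequence of scammer utterances."""
    transcript: list[str] = []
    for turn in scammer_turns[:max_turns]:
        transcript.append(f"Scammer: {turn}")
        reply = generate_reply(PERSONA, transcript)
        transcript.append(f"Bot: {reply}")
    return transcript

if __name__ == "__main__":
    demo = [
        "Hello, I'm calling about your compromised bank account.",
        "We need you to confirm your account number.",
    ]
    for line in handle_call(demo):
        print(line)
```

The value Kaafar describes comes from keeping that loop going for tens of minutes per call, which is where the language model, rather than canned lines, does the work.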

[Figure: Apate deployment model. (Credit: Apate)]

According to the university, the team has been testing Apate on real scam calls through a prototype, which can assume a wide range of personalities. To prompt calls, “we’ve put these ‘dirty’ numbers all around the internet, getting them into some spam apps, or publishing them on web pages and so on, to make them more likely to receive scam calls,” Kaafar says.


The goal is to make the bot smart enough to trick a scammer into a 40-minute conversation. Currently, Apate is only averaging 5 minutes. But Kaafar notes: “We found the bots react pretty nicely to some tricky situations that we were not expecting to get away with, with scammers asking for information that we didn’t train the bots for—but the bots are adapting, and coming up with very believable responses.”

Apate’s development comes as scammers are already exploiting AI technologies, such as voice cloning and deepfakes, to further their own schemes. So in a sense, Kaafar is exploring whether the same technologies could be used to fight back.

“I suggest the ultimate meta-scenario might see scammers adopting AI themselves, training their own scam chatbots, which are then diverted into speaking to chatbots owned by the telecommunications providers,” he says, noting his team is already in talks with a number of telecommunications providers about Apate.
