Artificial intelligence is actually supposed to improve our world. But more and more criminals are now using tools like ChatGPT to commit fraud, and the AI makes crime as easy as pie.

The past has already shown us one thing: every new technology can take us great strides forward and improve our world. At the same time, however, it carries the risk of abuse, turning into a powerful tool for crime and fraud. This apparently also applies to the AI chatbot ChatGPT.

OpenAI released the algorithm to make the world a little better and to support us. But what curious users can use to research a school presentation is just as helpful for tracking down illegal information. At least that is what a report by Europol makes clear.

Crime and fraud: ChatGPT gives instructions for making crack

Criminals apparently use the algorithm to find such instructions on the Internet. In addition to plans for building pipe bombs, ChatGPT also provides information on how the drug crack is produced. Although OpenAI built certain filter mechanisms into the AI to prevent this, they already appear to be easy to bypass.

This makes it easier than ever to obtain such illegal information. Europol emphasizes that the content was already available on the Internet, but ChatGPT now makes finding it significantly faster. The next crime seems to be just a chat message away.

Is ChatGPT the ideal tool for criminals?

ChatGPT also seems to be an ideal tool for scams. In phishing attacks, for example, scammers can often be caught out by their spelling and grammatical mistakes. But when an AI writes the email about a lost package or a family member in distress, spotting the fraud quickly becomes difficult, as the sketch below illustrates.
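
The weakness of that heuristic is easy to demonstrate. The following is a minimal sketch, not taken from the article or from any real spam filter, of a naive check that flags an email when too many of its words fail a dictionary lookup; the word list and threshold are toy assumptions for illustration only.

```python
# Toy phishing heuristic (illustrative assumption, not a real filter):
# flag an email if too many of its words are misspelled.

KNOWN_WORDS = {
    "your", "package", "could", "not", "be", "delivered", "please",
    "confirm", "address", "to", "receive", "it", "we", "need", "you",
}

def looks_suspicious(email_text: str, max_error_rate: float = 0.2) -> bool:
    """Return True if the share of unknown words exceeds the threshold."""
    words = [w.strip(".,!?").lower() for w in email_text.split()]
    if not words:
        return False
    errors = sum(1 for w in words if w not in KNOWN_WORDS)
    return errors / len(words) > max_error_rate

# A typo-ridden scam mail trips the filter ...
print(looks_suspicious("Youre pakage could nott be delifered"))  # True
# ... but a fluent, AI-written one sails straight through.
print(looks_suspicious("Your package could not be delivered"))   # False
```

An AI-generated message with flawless spelling passes this kind of check without issue, which is exactly the problem Europol points to.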

So what can be done? OpenAI should urgently improve its filter mechanisms and create a reporting system that lets users flag responses that cross legal boundaries. Only then can ever-improving algorithms be kept from undermining our security as well.


Source: https://www.basicthinking.de/blog/2023/03/30/chatgpt-betrug/
