Hackers have apparently managed to bypass a chat bot's restrictions to generate malware and phishing emails. ChatGPT is a chat bot you can use to write simple code and the like, and just like most chat bots it has restrictions so you can't go asking it to write code to hack into NASA or something. But as the old saying goes, we can't have nice things, and that's where the hackers come in: they've managed to bypass those restrictions and are now selling access to the bypass for around $5.50 per 100 queries. (Note: I am not advertising this, nor do I condone this type of stuff.)

https://arstechnica.com/information...e-that-uses-chatgpt-to-generate-malware/