With talk of AI integration in the cryptocurrency industry mostly focusing on how AI can help fight scams, experts are paying little attention to the fact that it could have the completely opposite effect. In fact, Meta recently warned that hackers appear to be taking advantage of OpenAI’s ChatGPT in attempts to gain access to users’ Facebook accounts.
Meta reported blocking more than 1,000 malicious links masquerading as ChatGPT extensions in March and April alone, and the platform has gone so far as to call ChatGPT “the new crypto” in the eyes of scammers. Additionally, searching for the keywords “ChatGPT” or “OpenAI” on DEXTools, an interactive crypto trading platform that tracks a number of tokens, reveals more than 700 token trading pairs that mention one keyword or the other. This shows that scammers are using the hype around the AI tool to create tokens, even though OpenAI has not announced any official entry into the blockchain world.
Social media platforms have become popular channels for promoting new scam coins online. Scammers take advantage of the wide reach and influence of these platforms to attract a large number of followers in a short time. By leveraging AI-powered tools, they can further amplify their reach and build a seemingly loyal fanbase of thousands. These fake accounts and interactions can be used to create an illusion of credibility and popularity around their scam schemes.
Related: Think AI tools aren’t harvesting your data? Guess again
Much of crypto operates on social proof, which suggests that if a cryptocurrency or project seems popular and has a large following, it must be popular for a reason. Investors and new buyers tend to trust projects with larger and more loyal online followings, assuming others have done enough research before investing. However, the use of AI can challenge this assumption and undermine social proof.
Now, just because something has thousands of seemingly genuine likes and comments doesn’t necessarily mean it’s a legitimate project. This is just one attack vector, and AI will create many more. One such example is “pig butchering” scams, where an AI instance may spend several days befriending someone, usually an elderly or vulnerable person, only to end up scamming them. The advancement of artificial intelligence technologies has allowed scammers to automate and scale fraudulent activities, potentially targeting vulnerable people in the cryptosphere.
Scammers can use AI-driven chatbots or virtual assistants to engage with individuals, dispense investment advice, promote fake tokens and initial coin offerings, or offer high-yield investment opportunities. Such AI scams can also be very dangerous, as they are capable of mimicking human-like conversations to a T. Additionally, by taking advantage of social media platforms and AI-generated content, scammers can orchestrate elaborate pump-and-dump schemes, artificially inflating the value of tokens before selling their holdings for large profits and leaving many investors with losses.
Related: Don’t be surprised if the AI tries to sabotage your crypto
Investors have long been warned to beware of deepfake crypto scams, which use artificial intelligence technologies to create highly realistic online content that swaps faces in videos and photos or even alters audio to give the impression that influencers or other well-known personalities are endorsing scam schemes.
A very prominent deepfake that affected the crypto industry was a video of former FTX CEO Sam Bankman-Fried directing users to a malicious website promising to double their crypto.
In March 2023, the so-called AI project Harvest Keeper defrauded its users of around $1 million. Around the same time, projects began to emerge on Twitter under the name “CryptoGPT.”
However, on a more positive note, AI also has the potential to automate the boring and monotonous aspects of crypto development, acting as a great tool for blockchain experts. Tasks required by every project, such as setting up Solidity environments or generating base code, are simplified through the use of AI tooling. Eventually, the barrier to entry will be significantly lowered, and the crypto industry will be less about development skills and more about whether an idea has real utility.
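As a rough illustration of the kind of boilerplate such tools can generate, below is a minimal sketch of a Hardhat project configuration for a Solidity environment. It is not drawn from any specific project; the compiler version, network name and environment variables are assumptions chosen for the example.

```typescript
// hardhat.config.ts — a minimal sketch of the Solidity-environment boilerplate
// an AI assistant might generate on request. The network entry, RPC URL variable
// and private-key variable are illustrative assumptions, not from the article.
import { HardhatUserConfig } from "hardhat/config";
import "@nomicfoundation/hardhat-toolbox";

const config: HardhatUserConfig = {
  // Compiler settings applied to the project's Solidity sources
  solidity: {
    version: "0.8.19",
    settings: {
      optimizer: { enabled: true, runs: 200 },
    },
  },
  networks: {
    // Hypothetical testnet entry; credentials are read from environment variables
    sepolia: {
      url: process.env.SEPOLIA_RPC_URL ?? "",
      accounts: process.env.PRIVATE_KEY ? [process.env.PRIVATE_KEY] : [],
    },
  },
};

export default config;
```

Generating this sort of scaffolding by hand is exactly the repetitive work the paragraph above describes; whether produced by a developer or an AI assistant, the result still needs to be reviewed before anything is deployed.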
In some niche cases, AI will have a surprising way of democratizing processes we currently assume are reserved for an elite class: in this case, well-studied senior developers. But with everyone having access to advanced developer tools and crypto launchpads, the sky is the limit. At the same time, as AI makes it easier to launch scam projects, users should exercise caution and due diligence before investing in a project, such as watching out for suspicious URLs and never investing in anything that seems to come out of nowhere.
Felix Roemer is the founder of Gamdom. He briefly attended ILS Fernstudium in Germany before founding Gamdom in 2016 at the age of 22 – after investing in crypto, playing poker and making money from the game RuneScape.
This article is for general informational purposes and is not intended to be and should not be considered legal or investment advice. The views, thoughts and opinions expressed herein are those of the author alone and do not necessarily reflect or represent the views and opinions of Cointelegraph.