    AI Integration in Cryptocurrency May Result in Increased Scams

    You may recall Harvest Keeper, the AI-based project that deceived its users and swindled roughly $1 million. Analogous AI-based scams are expected to surge in the coming years.

    While there is much talk about how artificial intelligence (AI) can help the cryptocurrency industry combat scams, experts are overlooking the fact that it could just as easily facilitate fraud. This was exemplified by Meta’s recent warning that hackers were exploiting OpenAI’s ChatGPT to gain entry into users’ Facebook accounts. In March and April, Meta blocked more than 1,000 malicious links disguised as ChatGPT extensions, and it described ChatGPT as “the new crypto” in the eyes of scammers.

    DEXTools, an interactive crypto trading platform, lists over 700 token trading pairs that mention “ChatGPT” or “OpenAI,” indicating that scammers are exploiting the hype around AI to create tokens, despite OpenAI not having announced an official entry into the blockchain world.

    Social media platforms have become popular channels for promoting new scam coins, with scammers leveraging AI-powered tools to create a seemingly loyal fanbase of thousands of people. These fake accounts and interactions lend an illusion of credibility and popularity to scam projects. The use of AI thus undermines social proof: the assumption that a cryptocurrency or project is popular for a reason no longer holds when engagement can be manufactured.

    AI opens up many other attack vectors, such as “pig butchering” scams, in which an AI instance can spend days befriending a victim, often an elderly or vulnerable person, before scamming them. Scammers may also use AI-driven chatbots or virtual assistants to dispense investment advice, promote fake tokens, and offer high-yield investment opportunities. By leveraging social media platforms and AI-generated content, scammers can orchestrate elaborate pump-and-dump schemes, artificially inflating the value of tokens before selling off their holdings for significant profits and leaving investors with losses.

    Investors have been warned to remain vigilant against deepfake crypto scams, which utilize AI technologies to create highly realistic online content that swaps faces in images and videos or even alters audio content to make it appear as though influencers or other high-profile individuals are endorsing fraudulent projects. A notable example was a deepfake video of former FTX CEO Sam Bankman-Fried directing users to a malicious website offering to double their crypto.

    In March 2023, the AI project Harvest Keeper scammed its users out of around $1 million, and around the same time, Twitter saw the emergence of projects calling themselves “CryptoGPT.”

    On the other hand, AI can also automate the tedious and repetitive aspects of crypto development, making it an invaluable tool for blockchain experts. Tasks such as writing base code or setting up Solidity environments become easier with AI assistance. This will eventually lower the barrier to entry, making the crypto industry less about development skill and more about the utility of one’s ideas.

    In some instances, AI can democratize processes once reserved for a select group of senior developers. But with everyone now able to access advanced development tools and launchpads in crypto, there is a greater risk of fraudulent projects reaching the market. Users must exercise caution and due diligence before investing in a project, for example by checking for suspicious URLs and avoiding projects that appear out of nowhere.
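    Some of the due-diligence advice above can be partially automated. The following is a minimal, hypothetical Python sketch of the kind of heuristic checks an investor might script: flagging token names that ride AI hype (as with the 700+ “ChatGPT”/“OpenAI” pairs seen on DEXTools) and spotting common red flags in project URLs. The keyword list and red-flag rules here are illustrative assumptions, not an exhaustive or authoritative screening method, and passing these checks is no guarantee a project is legitimate.

    ```python
    # Hypothetical due-diligence heuristics; keyword list and rules are
    # illustrative assumptions, not a real vetting tool.
    import re
    from urllib.parse import urlparse

    HYPE_KEYWORDS = {"gpt", "openai", "chatgpt"}  # assumed hype terms

    def hype_score(token_name: str) -> int:
        """Count AI-hype keywords appearing in a token name."""
        name = token_name.lower()
        return sum(1 for kw in HYPE_KEYWORDS if kw in name)

    def suspicious_url(url: str) -> list:
        """Return a list of simple red flags found in a URL."""
        flags = []
        parsed = urlparse(url)
        host = parsed.hostname or ""
        if parsed.scheme != "https":
            flags.append("no HTTPS")
        if re.fullmatch(r"\d{1,3}(\.\d{1,3}){3}", host):
            flags.append("raw IP address instead of a domain")
        if host.count(".") >= 3:
            flags.append("unusually many subdomains")
        if re.search(r"(airdrop|double|giveaway|claim)", url.lower()):
            flags.append("giveaway-style wording")
        return flags

    print(hype_score("CryptoGPT"))  # matches "gpt" -> 1
    print(suspicious_url("http://claim-double.example.com"))
    print(suspicious_url("https://example.com"))  # no flags -> []
    ```

    Heuristics like these only surface obvious red flags, such as the malicious "double your crypto" link in the Sam Bankman-Fried deepfake; they cannot replace researching a project's team, code, and audit history.
    
    
    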
