Nation-state hackers from Russia, Iran, North Korea, and China are using AI for basic tasks only


Tech companies and government officials keep warning that artificial intelligence could become a dangerous weapon in the hands of foreign adversaries, but new evidence tells a different story. Recent reports show that state-sponsored groups are using the same publicly available tools as ordinary internet users, and often struggling just as much.

OpenAI recently shared details about how government-linked groups tried to use their platforms.

The most notable case came from a Chinese influence campaign that got exposed by accident when a Chinese law enforcement official used ChatGPT like a personal diary. The official wrote about an operation targeting Chinese critics living in other countries.

The campaign involved hundreds of operators and thousands of fake social media accounts, according to OpenAI.

The operation tried to impersonate United States immigration officers to intimidate a dissident by falsely claiming their public statements violated American law. In other cases, operators used forged documents purporting to come from a county court to try to get critics’ social media accounts taken down.

Fake obituaries and forged documents part of harassment campaign

The operators created a fake obituary and gravestone photos to spread false rumors about one dissident’s death. Those rumors did appear online in 2023, a Chinese-language Voice of America article confirmed. Ben Nimmo, who leads investigations at OpenAI, called the effort “industrialized harassment” aimed at critics of the Chinese Communist Party across multiple channels.

Using ChatGPT as a record-keeping tool is what ultimately exposed the operation. The operative treated ChatGPT as a journal for tracking the covert network, while other tools generated most of the content that was spread through social media. OpenAI banned the user after discovering the activity.

OpenAI investigators matched descriptions from the ChatGPT user with real online activity. The user described faking a Chinese dissident’s death by creating a phony obituary and gravestone photos for posting online.

In another case, the user asked ChatGPT to draft a plan for damaging the reputation of incoming Japanese Prime Minister Sanae Takaichi by stirring up anger over American tariffs. ChatGPT refused. But in late October, when Takaichi took power, hashtags attacking her and complaining about tariffs appeared on a popular forum for Japanese graphic artists.

The OpenAI report also covered several scam operations from Cambodia that used the platform for romance and investment fraud, plus influence campaigns linked to Russia targeting Argentina and Africa.

Microsoft report shows similar basic usage patterns

Microsoft published a separate report, produced jointly with OpenAI, examining how nation-state actors from Russia, North Korea, Iran, and China are experimenting with large language models to support cyber attack operations. The two companies disrupted five state-affiliated actors by terminating their accounts.

The Microsoft report found these actors mainly used the services for simple jobs: searching publicly available information, translating content, fixing coding errors, and running basic programming tasks. No significant or novel attacks using the models have been identified so far.

This gap between fear and reality comes amid intense competition between Washington and Beijing over control of the technology, whose role in military and economic power has become a central point of contention. The Pentagon recently told another company, Anthropic, that it has until Friday to remove certain safety features from its model or risk losing a defense contract.

Microsoft said it is developing principles to reduce the risks posed by nation-state groups and criminal organizations misusing these tools. The principles include identifying and blocking malicious users, notifying other service providers, collaborating with other stakeholders, and maintaining transparency.
