A US AI safety body has admitted that a researcher from Chinese-owned ByteDance was mistakenly added to an online discussion channel for artificial intelligence (AI) experts.
The worker, employed by the parent company of TikTok, was granted access to a Slack group chat used by members of the US National Institute of Standards and Technology (NIST).
It is an embarrassing incident for the organization, given that TikTok is currently embroiled in a high-profile national debate in the US over whether the popular video-hosting app could be used by the Chinese government to spy on, or influence, a significant number of American citizens.
TikTok is said to have almost 150 million users in the US, its most important market. The company is preparing a strong defense of its status after the House of Representatives passed a bill that could see the platform banned in the US within six months unless ByteDance divests its ownership.
The proposed legislation will now proceed to the Senate, where the outcome is less predictable.
A spokesman for the Beijing government insisted the US has “never found evidence that TikTok threatens national security,” warning that the prohibitive action “will inevitably come back to bite the United States itself.”
How has NIST responded to the news?
The researcher, reportedly based in California, was added to a Slack channel for members of NIST’s US Artificial Intelligence Safety Institute Consortium, which is said to include around 850 users.
An email from the safety body set out how it reacted to the mishap.
“Once NIST became aware that the individual was an employee of ByteDance, they were swiftly removed for violating the consortium’s code of conduct on misrepresentation,” it read.
The working group on AI safety developments is a multi-agency collaboration, set up to address concerns about, and measure the risks of, the evolving technology. The consortium’s membership includes major American big tech firms, researchers, startups, NGOs, and others.