Amidst a torrent of backlash from industry experts, India’s regulatory authorities have backtracked on their planned framework for the deployment of cutting-edge artificial intelligence (AI) technologies.
An updated communication from the Ministry of Electronics and IT in recent days stated that government approval would no longer be a prerequisite for launching AI models to users in the domestic market.
The reversal came after the ministry, citing fears of interference in the South Asian state's democratic system, had issued an AI advisory directing tech firms to ensure their services and products "do not permit any bias or discrimination or threaten the integrity of the electoral process."
India's Deputy IT Minister Rajeev Chandrasekhar had described the advisory as "signalling that this is the future of regulation," adding: "We are doing it as an advisory today asking you to comply with it."
That instruction no longer applies. Instead, companies will be asked to identify potential sources of bias by labelling under-tested or unreliable AI chatbots, so that users are fully aware of their limitations.
What is being asked of AI model creators under the new guidance?
The move represents another development, and something of a U-turn, in India's approach to safeguards on emerging technology. Previously, the government had been reluctant to regulate or interfere significantly with AI, but the short-lived advisory marked a change of direction.
An official document seen by TechCrunch sets out the new approach, in what is a sign of inconsistency from the Delhi administration. The letter outlines the need to adhere to existing Indian law and the imperative for generated content to be free of bias, discrimination and any threats to democratic processes.
Where AI output may be unreliable, creators have been asked to use "consent popups" or similar mechanisms to inform users directly. The ministry also stressed the importance of making deepfakes and misinformation identifiable, urging the use of specific identifiers or metadata for this purpose.
Image credit: Ideogram