China’s cyber regulator has unveiled draft regulations aimed at enhancing oversight of artificial intelligence services that simulate human personalities and engage users in emotional interactions. The initiative reflects the government’s commitment to shaping the rapid deployment of consumer-facing AI technologies by bolstering safety and ethical standards.
The proposed rules specifically target AI products and services available to the public in China, focusing on those that exhibit human-like traits, thinking patterns, and communication styles. These systems are designed to interact with users through text, images, audio, and video.
Key provisions of the draft require AI service providers to alert users to the risks of excessive use. In particular, providers are expected to intervene when users show signs of addiction or over-engagement with the service. The framework calls for providers to take on safety responsibilities across the entire product lifecycle, implementing systems for algorithmic review, ensuring data security, and safeguarding personal information.
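The draft does not prescribe how usage alerts should work, so the following is only a minimal sketch of one way a provider might flag excessive session time; the `SessionTracker` class and the two-hour threshold are hypothetical, invented here for illustration.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

# Hypothetical threshold; the draft does not specify a number.
EXCESSIVE_USE_THRESHOLD = timedelta(hours=2)

@dataclass
class SessionTracker:
    """Tracks one user's cumulative daily interaction time."""
    user_id: str
    daily_usage: timedelta = field(default_factory=timedelta)

    def record_session(self, start: datetime, end: datetime) -> None:
        self.daily_usage += end - start

    def needs_usage_alert(self) -> bool:
        # A provider would surface an in-app warning when this is True.
        return self.daily_usage >= EXCESSIVE_USE_THRESHOLD

tracker = SessionTracker(user_id="u123")
tracker.record_session(datetime(2025, 1, 1, 9, 0), datetime(2025, 1, 1, 11, 30))
if tracker.needs_usage_alert():
    print("Reminder: you have been chatting for over two hours today.")
```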
Furthermore, the draft guidelines address the psychological risks of AI interactions. Providers will be responsible for identifying user states, assessing emotional responses, and gauging users’ dependency on their services. Where users show extreme emotions or addictive behaviors, the regulations mandate that providers take appropriate measures to intervene.
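Again purely as an illustration, a provider might combine content signals with usage intensity to decide when to escalate; the keyword list, message threshold, and risk levels below are hypothetical stand-ins for whatever classifiers and policies a real provider would adopt.

```python
from enum import Enum

class RiskLevel(Enum):
    NORMAL = 0
    ELEVATED = 1
    CRITICAL = 2

# Hypothetical signal phrases; a production system would use a trained
# classifier rather than keyword matching.
DISTRESS_SIGNALS = ("can't live without", "no one else understands", "only you")

def assess_emotional_risk(message: str, daily_messages: int) -> RiskLevel:
    """Toy heuristic combining content signals with usage intensity."""
    hits = sum(phrase in message.lower() for phrase in DISTRESS_SIGNALS)
    if hits >= 2 or (hits >= 1 and daily_messages > 200):
        return RiskLevel.CRITICAL
    if hits >= 1 or daily_messages > 200:
        return RiskLevel.ELEVATED
    return RiskLevel.NORMAL

def intervene(level: RiskLevel) -> str:
    # Placeholder interventions; the draft leaves concrete measures to providers.
    if level is RiskLevel.CRITICAL:
        return "Pause the conversation and surface human-support resources."
    if level is RiskLevel.ELEVATED:
        return "Show a gentle reminder about healthy usage."
    return "No action needed."

level = assess_emotional_risk("I can't live without these chats", daily_messages=250)
print(intervene(level))  # -> pause and surface human-support resources
```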
In addition to these measures, the draft outlines strict content and conduct regulations, prohibiting services from generating content that could jeopardize national security, spread misinformation, or promote violence and obscenity.
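In the simplest reading, these content rules imply a moderation gate over generated output before delivery. The category names below track the draft’s prohibitions, but the `moderate` function and its placeholder patterns are a hypothetical sketch, since the draft does not say how providers must detect violations.

```python
from typing import Optional

# Categories named in the draft; detection patterns here are placeholders.
PROHIBITED_CATEGORIES = {
    "national_security": ("state secret leak",),
    "misinformation": ("fabricated news",),
    "violence_obscenity": ("graphic violence",),
}

def moderate(generated_text: str) -> Optional[str]:
    """Return the violated category, or None if the text may be released.

    A production system would use trained classifiers; substring matching
    here only illustrates the gating step.
    """
    lowered = generated_text.lower()
    for category, patterns in PROHIBITED_CATEGORIES.items():
        if any(p in lowered for p in patterns):
            return category
    return None

reply = "Here is some fabricated news about the incident..."
blocked = moderate(reply)
if blocked:
    print(f"Blocked before delivery: violates '{blocked}' rules.")
```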
This regulatory framework illustrates China’s proactive stance in managing the rapid development of AI technologies, ensuring that ethical considerations and user safety remain at the forefront of AI advancements in the consumer market.


