This photo, taken on Feb. 2, 2024, shows Lu Yu, head of Product Management and Operations of Wantalk, an artificial intelligence chatbot created by Chinese tech company Baidu, displaying a virtual girlfriend profile on her phone at the Baidu headquarters in Beijing. (Jade Gao | AFP | Getty Images)

BEIJING — China plans to restrict artificial intelligence-powered chatbots from influencing human emotions in ways that could lead to suicide or self-harm, according to draft rules released Saturday.

The proposed regulations from the Cyberspace Administration of China target what the agency calls "human-like interactive AI services," according to a CNBC translation of the Chinese-language document.

The measures, once finalized, will apply to AI products or services offered to the public in China that simulate human personality and engage users emotionally through text, images, audio or video. The public comment period ends Jan. 25.

Beijing's planned rules would mark the world's first attempt to regulate AI with human or anthropomorphic characteristics, said Winston Ma, adjunct professor at NYU School of Law.
The latest proposals come as Chinese companies have rapidly developed AI companions and digital celebrities.

Ma said that, compared with China's 2023 generative AI regulation, this version "highlights a leap from content safety to emotional safety."

The draft rules propose that:

- AI chatbots cannot generate content that encourages suicide or self-harm, or engage in verbal violence or emotional manipulation that damages users' mental health.
- If a user explicitly raises suicide, the provider must have a human take over the conversation and immediately contact the user's guardian or a designated individual.
- AI chatbots must not generate gambling-related, obscene or violent content.
- Minors must have guardian consent to use AI for emotional companionship, with time limits on usage.
- Platforms should be able to determine whether a user is a minor even if the user does not disclose their age, and, in cases of doubt, apply minor-specific settings while allowing for appeals.

Additional provisions …