China mandates that AI must follow “core values of socialism”
China has released new guidelines on generative AI services, limiting their public use while encouraging industrial development.
Reuters reported that the Cyberspace Administration of China (CAC) has softened its stance compared with the draft rules it published in April. The new interim regulations take effect on August 15th and apply only to organizations offering generative AI services to the public; entities developing the same technology for purposes other than mass-market use are not covered by the measures.
The rules (translated via Google Translate) retain some wording from the April proposal. They continue to mandate that generative AI services “adhere to core values of socialism” and not attempt to overthrow state power or the socialist system. CNN reported that the new rules removed potential fines of up to 100,000 yuan ($13,999) for violations.
China has been looking for ways to strengthen its generative AI offerings and hopes to become the leading provider, toppling the US’s current dominance.
But this has not come easily for China, a country that famously controls internet access and the spread of information within its borders. The government told its tech giants not to offer access to ChatGPT for fear of the chatbot giving “uncensored replies,” even though the tool is not available in China. Authorities have also cracked down on citizens using ChatGPT, arresting a man who allegedly used the chatbot to write fake articles.
China’s generative AI rules also address intellectual property rights in training data and prohibit the use of “algorithms, data, platforms, and other advantages to implement monopoly and unfair competition.” All training data must come from sources the government deems legitimate, and service providers must accept requests from individuals to review or correct information gathered for AI models.
The Chinese government said it would encourage the development of generative AI, including supporting infrastructure and public training.