Google and OpenAI Ready to Create “Responsible” AI

Major players in artificial intelligence (AI) headquartered in the United States, including Microsoft, Google, and OpenAI, are taking significant steps toward developing responsible AI technologies.

The commitment comes as the White House places a strong emphasis on ensuring AI companies develop their technology responsibly, so that it benefits society without compromising safety, rights, or democratic values.

According to a draft document seen by Bloomberg, the tech companies will agree to eight suggested steps regarding safety, security, and social responsibility.

These steps include allowing independent experts to test AI models for potentially harmful behavior, investing in cybersecurity, and encouraging third parties to identify security vulnerabilities.

As quoted from Gizmochina on Friday (21/07/2023), in order to address social risks, including bias and inappropriate use, the companies will focus on thoroughly researching the implications of their technology.

They will also share trust and security information with other companies and governments, driving a collaborative approach to responsible AI development.

Additionally, they plan to flag AI-generated audio and visual content to prevent misuse or misinformation, while committing to using advanced AI systems, known as frontier models, to address significant challenges facing society.

The voluntary nature of this agreement clearly demonstrates the difficulty legislators face in keeping up with the rapid development of artificial intelligence, even as Congress has introduced several bills to regulate AI.

As AI technology advances, this collaborative effort between governments and tech giants becomes crucial in shaping a fruitful future for humanity. Hopefully, this commitment will be followed by many other AI companies so that the use of AI technology becomes more beneficial for humankind.
