OpenAI has successfully met the Italian Garante’s requirements, lifting Italy’s nearly month-long ChatGPT ban. The company made several improvements to its services, including clarifying personal data usage, to comply with European data protection legislation.
The resolution of this issue comes as the European Union moves closer to enacting the Artificial Intelligence Act, which aims to regulate AI technology and may impact generative AI tools in the future.
OpenAI Meets Garante Requirements
In a statement, the regulator said: “#GarantePrivacy acknowledges the steps forward made by #OpenAI to reconcile technological advancements with respect for the rights of individuals and it hopes that the company will continue in its efforts to comply with European data protection legislation.”
To comply with the Garante’s request, OpenAI made several changes to its service, including clarifying how it uses personal data.
While OpenAI has resolved this complaint, it isn’t the only regulatory hurdle AI companies face in the EU.
AI Act Moves Closer To Becoming Law
Before ChatGPT gained 100 million users in two months, the European Commission proposed the EU Artificial Intelligence Act as a way to regulate the development of AI.
This week, almost two years later, members of the European Parliament reportedly agreed to move the EU AI Act into the next stage of the legislative process. Lawmakers will now work out the details before the act goes to a vote, expected within the next couple of months.
The Future of Life Institute publishes a bi-weekly newsletter covering the latest EU AI Act developments and press coverage.
A recent FLI open letter calling on all AI labs to pause AI development for six months has received over 27,000 signatures. Notable names supporting the pause include Elon Musk, Steve Wozniak, and Yoshua Bengio.
How Could The AI Act Impact Generative AI?
Under the EU AI Act, AI technology would be classified by risk level. Tools that could impact human safety and rights, such as biometric technology, would have to comply with stricter regulations and government oversight.
Generative AI tools would also have to disclose the use of copyrighted material in their training data. Given the pending lawsuits over the open-source code and copyrighted art used to train GitHub Copilot, Stable Diffusion, and other tools, this would be a particularly interesting development.
As with most new legislation, AI companies will incur compliance costs to ensure their tools meet regulatory requirements. Larger companies can absorb these additional costs or pass them along to users more easily than smaller companies can, potentially leading to fewer innovations from entrepreneurs and underfunded startups.
Featured image: 3rdtimeluckystudio/Shutterstock