OpenAI CEO says possible to get regulation wrong, but should not fear it

Technology

Many countries are planning AI regulation, and Britain is hosting a global AI safety summit in November

TAIPEI (Reuters) - The CEO of ChatGPT maker OpenAI said on Monday that it was possible to get regulation wrong, but that it was important and should not be feared, amid global concerns about rapid advances in artificial intelligence, or AI.

Many countries are planning AI regulation, and Britain is hosting a global AI safety summit in November, focusing on understanding the risks posed by the frontier technology and how national and international frameworks could be supported.

Sam Altman, CEO and public face of the startup OpenAI, which is backed by Microsoft Corp (MSFT.O), said during a visit to Taipei that although he was not especially worried about government over-regulation, it could happen.

"I also worry about under-regulation. People in our industry bash regulation a lot. We've been calling for regulation, but only of the most powerful systems," he said.

"Models that are like 10,000 times the power of GPT-4, models that are like as smart as human civilization, whatever, those probably deserve some regulation," added Altman, speaking at an AI event hosted by the charitable foundation of Terry Gou, the founder of major Apple (AAPL.O) supplier Foxconn (2317.TW).

Altman said the tech industry has a "reflexive anti-regulation thing".

"Regulation has been not a pure good, but it's been good in a lot of ways. I don't want to have to make an opinion about every time I step on an airplane how safe it's going to be, but I trust that they're pretty safe and I think regulation has been a positive good there," he said.

"It is possible to get regulation wrong, but I don't think we sit around and fear it. In fact we think some version of it is important."

Gou, currently running as an independent candidate to be Taiwan's next president, sat in the audience, but did not speak at the forum.