CNN —

Microsoft joined a sprawling global debate on the regulation of artificial intelligence Thursday, echoing calls for a new federal agency to control the technology’s development and urging the Biden administration to approve new restrictions on how the US government uses AI tools.

In a speech in Washington attended by multiple members of Congress and civil society groups, Microsoft President Brad Smith described AI regulation as the challenge of the 21st century, outlining a five-point plan for how democratic nations could address the risks of AI while promoting a liberal vision for the technology that could rival competing efforts from countries such as China.

The remarks highlight how one of the largest companies in the AI industry hopes to influence the fast-moving push by governments, particularly in Europe and the United States, to rein in AI before it causes major disruptions to society and the economy.

In a roughly hour-long appearance that was equal parts product pitch and policy proposal, Smith compared AI to the printing press and described how it could streamline policymaking and lawmakers’ constituent outreach, before calling for “the rule of law” to govern AI at every part of its lifecycle and supply chain.

Regulations should apply to everything from the data centers that train large language models to the end users such as banks, hospitals and others that may apply the technology toward making life-altering decisions, Smith said.

For decades, “the rule of law and a commitment to democracy has kept technology in its proper place,” Smith said. “We’ve done it before; we can do it again.”

In his remarks, Smith joined calls made last week by OpenAI — the company behind ChatGPT, in which Microsoft has invested billions — for the creation of a new government regulator that could oversee a licensing system for cutting-edge AI development, combined with testing and safety standards and government-mandated disclosure rules.

Microsoft vice chair and president Brad Smith speaks at the Semafor World Economic Summit on April 12, 2023 in Washington, DC.

Whether a new federal regulator is needed to police AI is quickly emerging as a focal point of the debate in Washington. Opponents such as IBM have argued, including in an op-ed Thursday, that AI oversight should instead be baked into the existing federal agencies, which already understand the sectors they oversee and how AI is most likely to transform them.

Smith also called for President Joe Biden to develop and sign an executive order requiring federal agencies that procure AI tools to implement a risk management framework developed and published this year by the National Institute of Standards and Technology. That framework, which Congress first ordered with legislation in 2020, covers ways that companies can use AI responsibly and ethically.

Such an order would leverage the US government’s immense purchasing power to shape the AI industry and encourage the voluntary adoption of best practices, Smith said.

Microsoft itself plans to implement the NIST framework “across all of our services,” Smith added, a commitment he described as the direct outgrowth of a recent White House meeting with AI CEOs in Washington. Smith also pledged to publish an annual AI transparency report.

As part of Microsoft’s proposal, Smith said any new rules for AI should include revamped export controls tailor-made for the AI age to prevent the technology from being abused by sanctioned entities.

And, he said, the government should mandate redundant AI “circuit breakers” that would allow critical infrastructure providers to shut off algorithms, or allow them to be shut off from within the data centers those systems depend on.

Smith’s remarks, and a related policy paper, come a week after Google released its own proposals calling for global cooperation and common standards for artificial intelligence.

“AI is too important not to regulate, and too important not to regulate well,” Kent Walker, Google’s president of global affairs, said in a blog post unveiling the company’s plan.
