While chatbots have caught the world’s imagination, should we be more worried about “slaughterbots”?

The first international conference on responsible military uses of Artificial Intelligence (AI) is being held in the Netherlands next week.

The United States and China are among around 50 countries that will attend, with hopes of producing a declaration at the end of the meeting in The Hague on February 15 and 16.

Russia has not been invited because of its invasion of Ukraine.

“We truly see this as a key shaping moment in the future of AI in the military,” Dutch Foreign Minister Wopke Hoekstra told a small group of journalists on Thursday.

“In a field that is really about life and death, you want to make sure that humans, regardless of all the flaws baked into our DNA, are part of the decision-making process.”

Militarily, AI is already used for reconnaissance, surveillance and situational analysis.

While one of the conference sessions on the future of war is called “Regulating Slaughterbots”, the prospect of fully independent killing machines remains far off.

But AI with the potential to autonomously pick targets could be just over the horizon.

Areas of concern include so-called drone swarms and the use of AI in nuclear command and control systems.

The conference aims to take a first step towards international rules on “what is acceptable, what is not acceptable” in military uses of AI, Hoekstra said.

“You already see that AI is being used in the war that Russia is waging against Ukraine.”

He compared the debate to the one surrounding AI chatbots such as ChatGPT, which have beneficial uses but have also been exploited by students to cheat on essays.

“It is not something that should only scare us,” the minister said.

China was invited to the conference as a key player in tech and AI, Dutch officials said.

Ministers and high-ranking diplomats will be at the so-called REAIM (Responsible AI in the Military Domain) summit along with tech firms and experts.
