
In 1965, the political scientist and Nobel laureate Herbert Simon declared: “Machines will be capable, within 20 years, of doing any work a man can do.” Today, in what is increasingly referred to as the fourth industrial revolution, the arrival of artificial intelligence (AI) in the workplace is igniting similar concerns.

The European Union’s forthcoming Artificial Intelligence Act is likely to deem the use of AI across education, law enforcement and worker management to be “high risk”. Geoffrey Hinton, known as the “godfather of AI”, recently resigned from his position at Google, citing concerns about the technology’s impact on the job market. And, in early May, striking members of the Writers Guild of America promised executives: “AI will replace you before it replaces us.”

Yet, according to Philip Torr, professor of engineering science at the University of Oxford, the fallibility of AI tools – which are driven not by emotion but by data and algorithms – means that the presence of humans in the workplace will remain essential.

“Industrial revolutions in the past have typically led to more employment, not less,” says Torr. “I think that we’ll see the types of jobs changing, but that’s just a natural progression.”

Torr, an award-winning research fellow at the Alan Turing Institute in London, compares the impact of large language models (LLMs) such as ChatGPT to the advent of the word processor: an extremely useful tool that will fundamentally change the way we work.

He is generally optimistic that humans can coexist productively alongside such technologies – and he is not alone in this view. Many experts in the field believe that, with the right education and legislation, automation could have a positive impact on the workplace.

There are, of course, those who predict a darker future in which workers are appraised by algorithms and replaced by automation. But there is one broad area of consensus: for better or worse, a growing number of industries are likely to be permanently and structurally altered by the march of AI.

[Illustration: a stethoscope in the style of a circuit board]

Healthcare

Until now, the use of AI in medicine has centred on MRI scans, X-rays and the identification of tumours, says Torr. Research is also being conducted into dementia diagnosis via smartphone. Apps could track the length of time it takes a user to complete a routine task such as finding a contact, and flag an increase in this time as a possible sign of the syndrome.
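To make the idea concrete, here is a minimal, hypothetical sketch of the kind of check such an app might run: it compares recent completion times for a routine task, such as finding a contact, against the user’s own earlier baseline and flags a sustained slowdown. The function name, window sizes and 1.5x threshold are illustrative assumptions rather than details of any real diagnostic app.

```python
# Hypothetical sketch: flag a sustained slowdown in a routine phone task.
# All names and thresholds are illustrative assumptions, not a real product.
from statistics import mean

def flag_slowdown(task_times_seconds, baseline_window=10, recent_window=5,
                  slowdown_ratio=1.5):
    """Return True if recent completion times are markedly slower than baseline."""
    if len(task_times_seconds) < baseline_window + recent_window:
        return False  # not enough history to compare against
    baseline = mean(task_times_seconds[:baseline_window])   # earliest recordings
    recent = mean(task_times_seconds[-recent_window:])      # latest recordings
    return recent > slowdown_ratio * baseline

# Example: seconds taken to find a contact over successive days
history = [4.2, 3.9, 4.1, 4.0, 4.3, 4.1, 3.8, 4.2, 4.0, 4.1,  # baseline period
           6.8, 7.2, 6.9, 7.5, 7.1]                            # recent slowdown
print(flag_slowdown(history))  # True -> worth flagging for follow-up
```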

Each of these applications could save valuable time for doctors and other medical staff. However, Torr says that, in the future, it is LLMs that will have the biggest impact on patients and practitioners.

He gives the example of arriving at a hospital, answering a set of questions and then being moved to another room, only to be asked the same set of questions. Instead, he explains, answers could be logged via an AI-driven app, which would then pass each patient’s information to the relevant staff.

Torr acknowledges, however, that despite its efficiency, diagnosis by algorithm – or indeed automated surgery, which he also considers a likely development – may not prove popular with patients. “You can imagine making some sort of robotic salesman,” he says. “But people would still want to see the real thing.”

Where the technology could be more welcome, however, is among health service central planners. With large, complex organisations to run and targets to meet, they could be helped by AI suggesting plans and schedules to ease the mounting pressures faced by medical services worldwide.

[Illustration: an open book with its paragraphs in the style of a circuit board]

Education

AI is already used in schools, colleges and universities, albeit in limited ways. However, as automation makes its way further into the classroom, Rose Luckin, professor of learner-centred design at University College London’s Knowledge Lab, says the choices we make now will determine its future impact.

“There’s a dystopian version where you hand over far too much to the AI,” she says. “And you end up with an education system that’s much cheaper, where you have a lot of the delivery done by AI systems.”

In this future, teachers assisted by LLMs with marking and lesson planning would gain much-needed time to focus on other elements of their work. However, in a bid to cut costs, the “teaching” of lessons could also be delegated to machines, robbing teachers and students of human interaction.

“Of course, that will be for the less well-off students,” Luckin says. “The more well-off students will still have lots of lovely one-to-one human interactions, alongside some very smartly integrated AI.”

Luckin instead advocates a future in which technology eases teachers’ workloads but does not disrupt their pastoral care – or disproportionately affect students in poorer areas. “That human interaction is something to be cherished, not thrown out,” she says.

[Illustration: faces in profile wearing headsets, in the style of a circuit board]

Call centres

Known for their high staff turnover, call centres are often stress-filled environments in which staff spend much of their day attempting to calm angry customers. For this reason, explains Peter Mantello, professor of media and cyber-politics at Ritsumeikan Asia Pacific University, the centres will increasingly become a popular home for what is known as emotional AI.

Using voice-tone recognition, such tools allow staff and managers to gauge the emotional state of their customers and workers. This means that staff can better assist callers, and managers can take better care of staff. Mantello warns, however, that the technology is also a form of surveillance.

“Surveillance is about social control and shaping people’s behaviours,” he says. “And so in the workplace, this idea of being positive, authentic and happy is going to be more and more linked to productivity.”

Mantello’s concerns stem from the possibility that the data AI generates could be misused by those in power, for example by a manager using data showing poor productivity to dismiss a worker they dislike, or making a purely statistical judgment on an individual’s value.

The growth of such technology has implications for those working across other sectors, too. From public relations to bartending, presenting a positive demeanour has long been a part of certain roles, but Mantello says: “I think we’re going to see emotion play an even more important part in creating or measuring the idea of a good worker.”

[Illustration: fields with bales of hay in the style of a circuit board]

Agriculture

According to Robert Sparrow, professor of philosophy at Monash University’s Data Futures Institute in Australia, many areas of agriculture will prove resistant to increased automation. While farmers already benefit from the application of AI in climate forecasting and pest and disease modelling, he says that for the technology to cause real disruption, there would need to be significant progress in robotics.

“I can get ChatGPT to write better essays than many of my students,” he says. “But if you asked a robot to walk into this room and empty the wastepaper basket or make me a cup of coffee, it simply couldn’t do that.”

This lack of dexterity and inability to cope with unpredictable spaces or tasks, combined with the cost of such technology, makes robots unlikely to replace agricultural workers in the near future, he believes.

However, Sparrow describes agriculture as a technologically progressive industry. Food often travels across the world to reach consumers, and he sees logistics as an element of farming in which AI has real potential to increase efficiency – although this would not come without risks for human workers.

“All the people currently working to determine which pallets need to go on which truck, to get to which ship, to get to market on time – if they all lost their jobs because of improvements in AI, it’s not at all obvious that they will find jobs elsewhere,” he says.

[Illustration: crosshairs in the style of a circuit board]

Military

Sparrow says military investment in AI is high, and the belief that it will drive the future of warfare is common. However, despite the introduction of semi-autonomous drones, tanks and submarines, the technology is used less than one might imagine.

This, however, is likely to change – particularly for those who serve at sea or in the air. “I’m not alone in thinking that, in the future, human beings won’t be able to survive air combat,” he says. “Flying without a pilot can be lighter, faster, more manoeuvrable and also more expendable.”

Sparrow also believes that commands could eventually be delivered by AI, rather than by senior officers. Although humans would remain involved in decision-making, the possibility of automation bias – the human tendency to defer to machines – raises concerns.

He gives the example of a battalion sent into heavy enemy fire by an AI general – something that he acknowledges human generals might also need to do. “You know those people are going to be killed,” he says, “but that’s harder to stomach if a machine gave the order.”

Autonomous warfare conducted from a distance could also lead to changes in military culture and the way in which working in the sector is perceived. While traits such as courage, mercy and compassion are often attributed to soldiers, Sparrow says that AI-driven fighting would “make it very hard to maintain these illusions”.

Changes in public opinion aside, the positives of removing military personnel from the dangers of direct combat are clear. However, Sparrow still holds serious concerns about a future in which humans play a lesser role than technology in warfare, and believes that automated weapons systems could one day be capable of drawing humans into war.

He is similarly sceptical about the future of AI across all workplaces. “The idea that these tools will leave the core of the job intact is often a marketing pitch,” he says. “If the technology is genuinely better than a person at the role, why would we employ people?”
