If an autonomous machine kills someone, who is responsible?

Source: guardian.co.uk
The Royal Academy of Engineering has published a report exploring the social, legal and ethical implications of ceding control to autonomous systems.


The supercomputer HAL in Stanley Kubrick's 2001: A Space Odyssey embodies our worst fears about autonomous machines. Photograph: RGA

Within a decade, we could be routinely interacting with machines that are truly autonomous – systems that can adapt, learn from their experience and make decisions for themselves. Free from fatigue and emotion, they would perform better than humans in tasks that are dull, dangerous or stressful.

Already, the systems we rely on in our daily lives are being given the capacity to operate autonomously. On the London Underground, Victoria line trains drive themselves between stations, with the human "driver" responsible only for spotting obstacles and closing the doors. Trains on the Copenhagen Metro run without any driver at all. While our cars can't yet drive themselves, more and more functions are being given over to the vehicle, from anti-lock brakes to cruise control. Automatic lighting and temperature control are commonplace in homes and offices.

The areas of human existence in which fully autonomous machines might be useful – and the potential benefits – are almost limitless. Within a decade, robotic surgeons may be able to perform operations much more reliably than any human. Smart homes could keep an eye on elderly people and allow them to be more independent. Self-driving cars could reduce congestion, improve fuel efficiency and minimise the number of road accidents.

But automation can create hazards as well as removing them. How reliable does a robot have to be before we trust it to do a human's job? What happens when something goes wrong? Can a machine be held responsible for its actions?

"It's a very difficult area for the law because the idea that a machine might be responsible for something is not an easy concept at all," says Chris Elliott, a systems engineer, barrister and visiting professor at Imperial College London.

"If you take an autonomous system and one day it does something wrong and it kills somebody, who is responsible? Is it the guy who designed it? What's actually out in the field isn't what he designed because it has learned throughout its life. Is it the person who trained it?

"If we can't resolve all these things about who's responsible, who's charged if there's an accident and also who should have stopped it, we deny ourselves the benefit of using this stuff."

Aside from the legal implications, there are questions that arise from our personal reactions to these technologies. Would you want to live in a home that monitored your movements and called for help if you didn't take your medicine? If your loved one died on the operating table, would you feel differently if the surgeon was a robot?

To help society prepare for the arrival of such systems, the Royal Academy of Engineering has published a report on the social, legal and ethical issues surrounding autonomous systems. Elliott, one of the report's contributors, believes that engaging with the public early is critical to managing people's expectations and ensuring that an appropriate regulatory framework is in place.

"Part of my concern is that when we start seeing these things emerging, we're going to suddenly find that the people who could bring benefits to us won't because they're scared of the legal uncertainty," he said. "So one of the things we're trying to promote is a debate about the rights and wrongs – the ethics – and that should inform the law afterwards."

Article from: Guardian.co.uk