Headline Nov 01, 2014

'''YO-HO''' MACHINE!!


IN the classic science-fiction film ''2001'', the ship's computer, HAL, faces a dilemma.

His instructions require him both to fulfil the ship's mission (investigate an artefact near Jupiter) and to keep the mission's true purpose secret from the ship's crew.

To resolve the contradiction, he tries to kill the crew.

As robots become more autonomous, the notion of computer-controlled machines facing ethical decisions is moving out of the realm of science fiction into the real world.

Society needs to find ways to ensure that such machines are better equipped to make moral judgements than HAL was.

MILITARY TECHNOLOGY, unsurprisingly, is at the forefront of the march towards self-determining machines.

Its evolution is producing an extraordinary variety of species.

The Sand Flea can leap through a window or onto a roof, filming all the while. It then rolls along on wheels until it needs to jump again.

RISE, a six-legged cockroach, can climb walls.

LS3, a dog-like robot, trots behind a human over rough terrain, carrying up to 180kg of supplies.

SUGV, a briefcase-sized robot, can identify a man in a crowd and follow him.

There is a flying surveillance drone the weight of a wedding ring, and one that carries 2.7 tonnes of bombs.

Robots are spreading in the civilian world, too, from the flight deck to the operating theatre. Passenger aircraft have long been able to land themselves.

Driverless trains are commonplace.

Volvo's new V40 hatchback essentially drives itself in heavy traffic. It can brake when it senses an imminent collision, as can Ford's B-Max minivan. Fully self-driven vehicles are being tested around the world.

Google's driverless cars have clocked up more than 250,000 miles in America, and Nevada has become the first state to regulate such trials on public roads. In Barcelona, just over two years ago, Volvo demonstrated a platoon of autonomous cars on a motorway.

As they become smarter and more widespread, autonomous machines are bound to end up making life-or-death decisions in unpredictable situations, thus assuming (or at least appearing to assume) moral agency.

Weapon systems currently have human operators ''in the loop'', but as they grow more sophisticated, it will be possible to shift to ''on the loop'' operation, with machines carrying out orders autonomously under human supervision.

As that happens, they will be presented with ethical dilemmas:

Should a drone fire on a house where a target is known to be hiding, which may also be sheltering civilians? 

Should a driverless car swerve to avoid pedestrians if that means hitting other vehicles or endangering its occupants?

Should a robot involved in disaster recovery tell people the truth about what is happening if that risks causing a panic?

Such questions have led to the emergence of the field of ''machine ethics'', which aims to give machines the ability to make such choices appropriately.

In other words, to tell right from wrong.
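One simple family of approaches in machine ethics is consequentialist: score each available action by the harm it is expected to cause and pick the least harmful one. The toy sketch below applies that idea to the driverless-car dilemma above; the action names, harm scores, and probabilities are purely illustrative assumptions, not any real vehicle's control logic.

```python
# Hypothetical toy sketch of a minimise-expected-harm decision rule.
# All numbers and action names are made up for illustration.

def expected_harm(outcomes):
    """Sum of probability-weighted harm scores for one action."""
    return sum(p * harm for p, harm in outcomes)

def choose_action(options):
    """Return the action whose outcomes carry the least expected harm."""
    return min(options, key=lambda name: expected_harm(options[name]))

# Each action maps to a list of (probability, harm score) outcomes.
options = {
    "brake_straight": [(0.7, 8), (0.3, 0)],   # may strike pedestrians
    "swerve_left":    [(0.9, 3), (0.1, 0)],   # likely hits another vehicle
    "swerve_right":   [(0.5, 5), (0.5, 2)],   # endangers the occupants
}

print(choose_action(options))  # → swerve_left
```

Even this crude sketch shows where the real difficulty lies: the hard part is not the arithmetic but deciding who assigns the harm scores, and on what moral basis.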

The Honour and Serving of this operational research continues. Thank you for reading and don't miss the next one.

With respectful dedication to all the Scientists, Thinkers and labourers who brought the world to this point. See Ya all on !WOW!  -the World Students Society Computers-Internet-Wireless:

''' Excellence Pursuer '''

'''Good Night and God Bless'''

SAM Daily Times - the Voice of the Voiceless

