11-27-2012, 09:37 AM
#3
Person who doesn't update the user title
Join Date: Jun 2010
Location: Bottom lands of the Missoula floods
Posts: 6,402
It depends on your (relative) POV... a lot!
Daily Mail Online
Damien Gayle
11/27/12
A human will always decide when a robot kills you:
Pentagon's 'reassurance' over fears of machine apocalypse
Quote:
Department of Defense policy directive calls for autonomous weapons
to be designed so they always need human authorisation to open fire
Promise comes after Human Rights Watch report warned that
killer robots could be deployed on the battlefield within 20 years<snip>

Automatic death: Samsung's machine gun sentry robot, which is already
in use in South Korea, can spot unusual activity, challenge intruders and,
when authorised by a human controller, open fire<snip>
Steve Goose, arms division director at Human Rights Watch, added:
Quote:
'A number of governments, including the United States, are very excited
about moving in this direction, very excited about taking the soldier off the battlefield
and putting machines on the battlefield and thereby lowering casualties.'<snip>

However, the [Pentagon's] directive does leave the way completely open
for increased autonomy in a range of military robots that aren't intended as killing machines.
It '[d]oes not apply to autonomous or semi-autonomous cyberspace systems
for cyberspace operations; unarmed, unmanned platforms; unguided munitions;
munitions manually guided by the operator (e.g., laser- or wire-guided munitions);
mines; or unexploded explosive ordnance,' the policy says.
That means the Pentagon does not need to apply similar safeguards
when developing computer viruses, bugs or surveillance drones.
As Wired puts it: 'While everyone’s worried about preventing the Rise of the Machines,
the machines are getting a pass to spy on you, under their own power.'
And if you believe this policy will be followed the world over...