How I learned to stop worrying and love “killer robots”

June 20, 2013 By Jason Lomberg, Technical Editor

Stars and Stripes recently ran an article from the Pittsburgh Tribune-Review with the provocative title “Militaries’ growing use of ground robots raises ethics concerns.” It rehashes old concerns about “killer robots” — lack of accountability, ethical responsibility — but its thesis is a fear of modern technology.

This irrational fear assigns higher moral standards to robots than to flesh-and-blood troops, slowing technological progress and putting lives at risk. We should never downplay anything with the power to kill, but fear of the unknown shouldn’t paralyze us.

Unmanned weapons systems (a.k.a. “killer robots”) have the power to reduce collateral damage and save lives. We should support and encourage their development, not preemptively ban them or hold them to disproportionately high ethical standards as a condition of their deployment.

And these tricky ethical concerns, responsibility and accountability among them, won’t go away anytime soon. The war on terror, in Afghanistan and Waziristan in particular, has dramatically raised the profile of unmanned weapons systems. Since 2004, we’ve launched 354 UAV strikes in the vicinity of northwest Pakistan, and the frequency has shot up under the present administration.


A wide swath of human rights groups, political factions, and interested parties has charged these artificial warriors with collateral damage rates of up to 35%. Human Rights Watch has called for a preemptive ban. But drones are almost certainly more humane than 20th-century tools of warfare, which killed up to three civilians for every enemy soldier.

Let’s be clear — every civilian death is tragic; collateral damage always has the same result, regardless of the source. But shouldn’t we emphasize weaponry that causes fewer civilian casualties?

In the minds of critics, dying a noble death at the hands of a human is preferable to suffering the indignity of losing your life to a cold, unfeeling robot. Here’s a hint: the poor sap is dead either way.

This quote from Steve Goose, Arms Division director at Human Rights Watch, says it all: “Giving machines the power to decide who lives and dies on the battlefield would take technology too far.” He added, “Human control of robotic warfare is essential to minimizing civilian deaths and injuries.”

How does Mr. Goose know that humans are more adept at preventing collateral damage? Why would unmanned weapons systems — operating with robotic precision — cause more civilian casualties?

If Human Rights Watch and other NGOs were truly concerned with reducing collateral damage, they would support the development of military robots.

Robots aren’t prone to mental distractions and have far more information at their disposal than we do. Their identification friend or foe (IFF) programs are infinitely more sophisticated than the fallible judgment of a human.

IEEE Spectrum nails it: “A robot can use high-resolution cameras, infrared imaging, ultraviolet imaging, radar, LIDAR, data feeds from other robots, and anything else you can think of all at once to determine very quickly how tall a person is, how much they weigh, and whether they’re holding an ice cream made of ice cream or a gun made of metal.”
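
To make the sensor-fusion idea concrete, here is a minimal, purely illustrative sketch of how a machine might combine several independent sensor readings into one threat assessment. The sensor names, scores, weights, and threshold below are invented for illustration; they do not describe any real weapon system.

```python
# Purely illustrative sketch of multi-sensor threat assessment.
# Every sensor name, score, weight, and threshold here is hypothetical.

from dataclasses import dataclass

@dataclass
class SensorReading:
    source: str          # e.g., "camera", "infrared", "radar", "lidar"
    threat_score: float  # 0.0 (clearly benign) to 1.0 (clearly threatening)
    confidence: float    # how much this sensor can be trusted right now

def fuse(readings: list) -> float:
    """Confidence-weighted average of the per-sensor threat scores."""
    total_weight = sum(r.confidence for r in readings)
    if total_weight == 0.0:
        return 0.0  # no usable data: default to "do not engage"
    return sum(r.threat_score * r.confidence for r in readings) / total_weight

# A human, not the machine, sets the engagement policy.
THREAT_THRESHOLD = 0.8

readings = [
    SensorReading("camera",   threat_score=0.2, confidence=0.9),  # looks like an ice cream cone
    SensorReading("infrared", threat_score=0.1, confidence=0.7),
    SensorReading("radar",    threat_score=0.3, confidence=0.8),
]

print("engage" if fuse(readings) > THREAT_THRESHOLD else "hold fire")  # -> hold fire
```

The point is simply that many independent inputs, weighed all at once, give a classifier far more to work with than a single pair of tired human eyes.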

Robots don’t get fatigued. They don’t experience stress. They won’t succumb to post-traumatic stress disorder (PTSD) and the associated dangers. And humans define their technical and moral parameters.

Forget Hollywood’s killer-robot hyperbole. Real autonomous military robots, like the Navy’s Phalanx Close-In Weapon System (CIWS), operate in a highly restrictive environment and can only make reactive “decisions.” The CIWS, for example, detects incoming projectiles and autonomously defeats them.

Their “autonomy” extends only to scenarios their programmers have anticipated. They can’t formulate their own strategies, and they won’t plot the enslavement of mankind. Even autonomous robots, which act without human input, cannot help but follow their programming. They have no choice in the matter.
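
As a rough sketch of what “reactive decisions” means in practice, consider a purely defensive system whose entire “judgment” is a fixed rule applied to each detected track. The track fields and engagement limits below are hypothetical, not the CIWS’s actual logic; the point is that such a machine can only do what its programming says.

```python
# Hypothetical sketch of a purely reactive engagement rule.
# The fields and limits are invented; this is not any real system's logic.

from dataclasses import dataclass

@dataclass
class Track:
    inbound: bool     # is the object closing on the defended asset?
    speed_mps: float  # measured speed in meters per second
    range_m: float    # current distance in meters

# Engagement envelope chosen in advance by human designers.
MIN_SPEED_MPS = 300.0
MAX_RANGE_M = 2000.0

def should_engage(track: Track) -> bool:
    """Reactive rule: engage only fast, inbound objects inside the envelope."""
    return (track.inbound
            and track.speed_mps >= MIN_SPEED_MPS
            and track.range_m <= MAX_RANGE_M)

# The system applies the same fixed rule to every track it detects.
for track in [Track(True, 680.0, 1500.0), Track(False, 15.0, 900.0)]:
    print(track, "->", "engage" if should_engage(track) else "ignore")
```

Nothing in that loop can “decide” anything the predicate doesn’t already encode, which is the whole point.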

As The Spectrum points out, calling a robot “killer” ascribes a sinister motive to it, but robots have no will of their own.

So why the irrational fear of “killer” robots? Fear of the unknown — of advanced technology that critics don’t fully understand — is the biggest obstacle. And we should never let fear stand in the way of progress.
