Saturday, November 15, 2014

Moral Machines

This is an assignment calling for a reaction to an article presented and discussed in class regarding driver-less cars.  The article, "Moral Machines," was written by Gary Marcus and published on The New Yorker website on November 24, 2012.  In it, the author looks ahead to a time when autonomous vehicles are the norm and asks whether, at such a time, driving yourself will be thought of as immoral.

The idea that driver-less cars are the future, and that driving a car yourself would someday become uncommon or possibly illegal, is a frightening prospect.  As someone who works in Information Technology, I have seen first-hand multiple examples of a piece of software, or an update to it, failing to perform as expected, either overall or in specific cases the manufacturer never anticipated.  Yet in the article, Marcus claims that driver-less cars will someday be “able to drive better, and more safely than you can.”

Many of us who rely on computers in our day-to-day work know Microsoft Patch Tuesday very well. According to the SearchSecurity website, Patch Tuesday “is the second Tuesday of each month, when Microsoft releases the newest fixes for its Windows operating system and related software applications.”  Anyone with a Windows computer who pays attention to the Windows Update function is well aware of the number of updates Microsoft releases every month.  While the process is mostly unobtrusive and does increase the security and reliability of your computer, I have personally seen applications stop working correctly with new versions of Windows, and updates that went as far as causing an entire company’s machines to reboot over and over until an administrator intervened.

Microsoft is not alone in these embarrassing flaws in the software engineering process. As discussed by Thomson, a flaw in Apple Maps left users as far as 40 miles from their intended destination. Police in one affected area called it a “potentially life-threatening issue.”  Imagine for a second your driver-less car being navigated by flawed software such as this.  Your car would be driving you to the wrong destination, and even if you realized what was happening, you might be powerless to do anything about it.

Driver-less cars may well be the future of travel, and most would agree that this is a promising and exciting technology.  But the idea, as proposed by Marcus, that driving your car yourself would ever be considered illegal or immoral is laughable at best and irresponsible at worst. Trusting your well-being to a piece of software should never be a mandatory decision, and anyone who uses a computer today should understand why.