
The Autonomous Report: Self-drivers spotted in Toronto; Walmart tests grocery delivery

Walmart is testing a new grocery delivery service with the help of Uber in Dallas and Orlando. Testing in autonomous mode is expected to begin by the end of 2017.

By Jeff Sanford

Toronto, Ontario — August 27, 2017 — In this week’s Autonomous Report, the University of Toronto gets into the self-driving race, the German government releases ethical rules for autonomous vehicles (AVs), Walmart tests Uber AVs for grocery delivery and much, much more!

– A report by the Toronto Star notes that two autonomous vehicles have begun operating in manual mode around the University of Toronto campus. The tests began on Tuesday and continued through the week. According to the report, “manual mode” means that although the cars have self-driving capability, “… they will be operated by human drivers … The cars aren’t available for rides: they will be conducting mapping tasks.”

– Walmart is testing a new grocery delivery service with the help of Uber. The world’s largest retailer is rolling out a grocery delivery service in Dallas and Orlando. Users shop online, and then pay an extra $9.95 to have the items delivered. According to a story on TechCrunch about the experiment, Uber will begin to, “… test the cars in autonomous mode by the end of 2017.”

– A recent survey finds that the key to crushing the fear of self-driving cars is, “… riding in one.” According to a CNBC report on a study carried out by AAA, 78 percent of drivers are, “… afraid to ride in a self-driving car.” But after a ride in the backseat, “… the participants all said they were happy to allow robots to roll them around town.” Some of the other feedback: “… passengers felt like they didn’t need to be shown everything the car was doing, as it was too distracting. Also, having the steering wheel moving by itself in the car actually added to the anxiety, and the riders said it would be better if it were removed altogether.”

One other interesting idea from the story: “The company also learned that voice communication was extremely important, both as a way to control the car and to get feedback from the vehicle.” Another idea from the story: According to advocates, “…technology will actually make driving safer, since statistics indicate human behaviour is the major cause of most auto crashes. But many consumers familiar with the tendency of other electronic devices to sometimes malfunction or perform erratically still seem to have trouble accepting the idea of being held in a vehicle that could fail.” The story winds up claiming that, “The big issue is that people are afraid of being in vehicles where they cannot exercise some control.”

– A headline this week in the Wall Street Journal notes what many have suspected: “Tesla’s Push to Build a Self-Driving Car Sparked Dissent Among Its Engineers.” Many have criticized Elon Musk for being a little aggressive in promoting his car as a self-driving vehicle, even though Teslas don’t have LiDAR and navigate using only sensors and cameras. It was only a year ago that Musk was pushing the idea that his cars should be considered AVs. He promised then that by the end of 2017 a Tesla would drive itself across America. But we’re not hearing that sort of talk much today. A couple of accidents and some lawsuits later, Musk is no longer promoting the Tesla as the world’s first AV. The gist of the WSJ story is that engineers at Tesla were uncomfortable with Musk’s aggressive posturing on the AV status of current Teslas.

– A story this week in the Detroit News notes that, “… cash-strapped states are considering taxing self-driving cars, as they look for ways to replace revenue lost from gas tax collections that have dwindled as cars have become more fuel efficient.” The report goes on to note that state lawmakers in Massachusetts have, “… introduced legislation that would impose a 2.5 cents-per-mile tax on self-driving cars. A similar measure that would establish a 1 cent-per-mile fee for self-driving cars, and a 2.6 cent-per-mile fee for autonomous trucks that have more than two axles has been approved by the state Senate in Tennessee.”

Arguably, there are no true AVs on the road yet, and governments are already talking about taxing them. Seems about right.
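To put the cited rates in perspective, here is a minimal back-of-the-envelope sketch of how such per-mile fees would add up. The rate names, the helper function and the 12,000-mile figure are illustrative assumptions, not anything from the actual bills.

```python
# Illustrative only: rough per-mile AV fees using the rates cited in the
# Detroit News story. Names and the helper function are hypothetical, not
# taken from any state's legislation or tax code.

# Proposed rates, in dollars per mile
MA_SELF_DRIVING_CAR = 0.025   # Massachusetts bill: 2.5 cents per mile
TN_SELF_DRIVING_CAR = 0.01    # Tennessee measure: 1 cent per mile
TN_AUTONOMOUS_TRUCK = 0.026   # Tennessee: 2.6 cents per mile, trucks with more than two axles

def per_mile_fee(miles_driven: float, rate_per_mile: float) -> float:
    """Return the total fee owed for the miles driven at the given rate."""
    return miles_driven * rate_per_mile

# Example: a self-driving car covering 12,000 miles in a year
print(per_mile_fee(12_000, MA_SELF_DRIVING_CAR))  # 300.0 under the Massachusetts proposal
print(per_mile_fee(12_000, TN_SELF_DRIVING_CAR))  # 120.0 under the Tennessee measure
```

In other words, the proposed fees would run a typical car somewhere in the range of a hundred to a few hundred dollars a year, roughly in line with what heavier gas-tax payers contribute today.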

– A report by Business Insider states that a corporate spy has revealed that, “The development of Apple’s autonomous driving technology is at about the stage where Google’s self-driving car project ‘was three years ago,’ according to a person who has seen Apple’s tech and is familiar with the technology of several other autonomous car front runners.” According to an unnamed source cited in the story, “Apple is just trying to play catch up.” Google’s project, now spun out into its own company called Waymo, is by many accounts the furthest along in self-driving technology.

– A report from Wired says Waymo has developed microphones that let robocars, “… hear sounds twice as far away as previous sensors while also letting them discern where the sound is coming from.” The company recently spent a day, “… testing the system with emergency vehicles from … police and fire departments. Police cars, ambulances, fire trucks, and even unmarked cop cars chased, passed, and led the Waymo vans through the day and into the night. Sensors aboard the vans recorded vast quantities of data that will help create a database of all the sounds emergency vehicles make, so in the future, Waymo’s driverless cars will know how to respond.”
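The Wired piece doesn’t describe how Waymo’s audio system localizes sounds, but here is a minimal sketch of one generic approach to the problem: estimating a bearing from the arrival-time difference between a pair of microphones. Every name and number in it is an assumption for illustration, not anything from Waymo.

```python
import math

# Generic illustration of time-difference-of-arrival (TDOA) direction finding,
# one common way a microphone pair can discern where a sound is coming from.
# This is NOT Waymo's system; all values here are illustrative assumptions.

SPEED_OF_SOUND = 343.0  # metres per second, in air at roughly 20 C

def bearing_from_time_delay(delay_s: float, mic_spacing_m: float) -> float:
    """Estimate a distant source's angle relative to the microphone pair.

    delay_s: arrival-time difference between the two microphones (seconds).
    mic_spacing_m: distance between the microphones (metres).
    Returns the bearing in degrees (0 = straight ahead of the pair,
    90 = along the line joining the microphones).
    """
    # For a far-away source, path difference = spacing * sin(angle)
    path_difference = delay_s * SPEED_OF_SOUND
    ratio = max(-1.0, min(1.0, path_difference / mic_spacing_m))
    return math.degrees(math.asin(ratio))

# Example: mics 0.5 m apart, sound arrives 0.7 ms earlier at one of them
print(round(bearing_from_time_delay(0.0007, 0.5), 1))  # roughly 28.7 degrees
```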

The Register reported on the ethical guidelines for AVs recently released by the German government. German Transport Minister Alexander Dobrindt was quoted in a statement as saying, “The interaction of humans and machines is throwing up new ethical questions in the age of digitalization and self-learning systems. The ministry’s ethics commission has pioneered the cause and drawn up the world’s first set of guidelines for automated driving.” Fourteen scientists came up with 20 rules. The report summarized a few of the most basic ones:

1. “The protection of human life always has top priority. If a situation on the road goes south, and it looks as though an accident is going to happen, the vehicle must save humans from death or injury even … if it means wrecking property or mowing down other creatures.”

2. “If an accident is unavoidable, the self-driving ride must not make any choices over who to save. It can’t wipe out an elderly person to save a kid … Decisions could not be made according to age, sex, race, disabilities and so on. All human lives matter.”

3. “A surveillance system should be in place—such as a black box—that records the steps leading to an accident so that it’s obvious who was driving at the time and who is therefore at fault: the human behind the wheel or the computer. The identity of the driver should also be documented. It should be entirely possible to apportion blame accurately, essentially.” (A rough sketch of what such a recorder might look like follows this list.)

4. “Drivers should have full control over what personal information is collected from their vehicles. This would basically stop tech giants taking location data on the down-low to customize advertising, for example. Ultimately, drivers will still bear responsibility if their autonomous (vehicle) crashes, unless it was caused by a system failure, in which case the manufacturer is on the hook.”
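As flagged under rule 3 above, here is a minimal, purely illustrative sketch of a control-logging “black box” that records who was driving at each moment so blame can be apportioned after an incident. The class and field names are hypothetical assumptions, not drawn from the German guidelines or any real vehicle system.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

# Illustrative sketch only: a tiny event log of who is in control of the
# vehicle. All names here are assumptions made for the example.

@dataclass
class ControlEvent:
    timestamp: str   # ISO-8601 UTC time of the event
    controller: str  # "human" or "computer"
    driver_id: str   # identity of the registered driver
    note: str = ""   # optional context, e.g. "handover requested"

@dataclass
class BlackBox:
    events: List[ControlEvent] = field(default_factory=list)

    def record(self, controller: str, driver_id: str, note: str = "") -> None:
        """Append a control-state entry stamped with the current UTC time."""
        self.events.append(ControlEvent(
            timestamp=datetime.now(timezone.utc).isoformat(),
            controller=controller,
            driver_id=driver_id,
            note=note,
        ))

# Example: the computer drives, then hands control back to the human
box = BlackBox()
box.record("computer", driver_id="DL-1234567", note="autonomous mode engaged")
box.record("human", driver_id="DL-1234567", note="driver took the wheel")
print(len(box.events))  # 2 entries available for post-incident review
```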
