Deleted
Deleted Member
Posts: 0
Post by Deleted on Mar 29, 2018 8:09:19 GMT -5
Quote: "100 percent human error. (Blame it on Louise.)"

Thanks, BOGC! I guess this means that even if self-driving cars become the norm, any accident will be ultimately due to human error. Oh, the irony of it all!
Post by BOGC on Mar 29, 2018 9:27:00 GMT -5
Quote: "Thanks, BOGC! I guess this means that even if self-driving cars become the norm, any accident will be ultimately due to human error. Oh, the irony of it all!"

If you recall that movie and the next... it was human error in a sense: not realizing that the secret nature of the mission would cause contradictions for HAL, which led to his faking the malfunction and killing off the crew (except for Bowman). Human error may be present in various roles: operator, but also designer, programmer, etc.

Just about nothing is hack-proof, and even if a system is otherwise reasonably sound, that's scary. Non-self-driving cars probably involve even more human error, but in one sense they're more robust: drivers come in all skill levels, but a small number of hacks could cause major pileups everywhere. Short of nerve gas, you can't hack brains on that massive a scale. :-)
Post by agog on Mar 29, 2018 23:12:54 GMT -5
Quote (BOGC): "... a small number of hacks could cause major pileups everywhere. Short of nerve gas, you can't hack brains on that massive a scale. :-)"

It's all computerized, BOGC. Nothing can go wrong.
Post by Socal Fan on Mar 30, 2018 0:25:34 GMT -5
Quote (BOGC): "a small number of hacks could cause major pileups everywhere."

But if someone wanted to kill a lot of people by hacking, they would hack into nuclear power plants or our power grid.
Post by BOGC on Mar 30, 2018 10:27:43 GMT -5
Quote (Socal Fan): "But if someone wanted to kill a lot of people by hacking, they would hack into nuclear power plants or our power grid."

I don't know about nuke plants, except that I'd hope they could mostly manage a safe shutdown even then; although demonstrably, if their backup power went out, they'd have a problem. I wouldn't rule out the grid: there are thought to be lots of vulnerabilities, at least rumors of prior attempted attacks (if not large-scale ones), and plenty of unfriendly probing.

But it doesn't take the resources of a nation to cause problems. The motivation might not be something you'd anticipate, so neglecting security is always a mistake. There are those who will do such things just because they can, and sometimes have it get out of control (look up the Morris Internet worm). Or they might just be incredibly oblivious to consequences.

Vehicles and medical devices are among the products where harm to life is possible, yet defense against attack is seldom even contemplated, and regular updates are far from assured. Vehicles are numerous: even a small number of vulnerable models could mean thousands of vehicles, with possible consequences worse than years of fatalities, injuries, and damage caused by incompetent human drivers.
Post by Socal Fan on Apr 2, 2018 18:39:35 GMT -5
Quote: "Uber settles with family of woman killed by self-driving car"

I very much hope it was not a big settlement, because the primary fault was with the victim and Uber was only secondarily at fault. I've read numerous articles about the incident and nobody, but nobody, ever asked: "Aren't pedestrians supposed to look both ways before crossing the road?"
Post by Deleted on Apr 2, 2018 19:15:46 GMT -5
Quote (Socal Fan): "I very much hope it was not a big settlement. ... nobody, but nobody, ever asked: 'Aren't pedestrians supposed to look both ways before crossing the road?'"

Hi socalfan. No contest here. But I would bet somebody asked that question, although we may never see it in print, especially since there was no crosswalk for pedestrians at the site of the crash.

Big settlement? I would bet over a million bucks. But that's chump change compared to some settlements.

From the article, on what the future may hold: "The fatality also presents an unprecedented liability challenge because self-driving vehicles, which are still in the development stage, involve a complex system of hardware and software often made by outside suppliers."

And I still feel sorry for the driver: "... footage showed the human driver who was behind the wheel mostly looking down and not at the road in the seconds before the incident." That might be the real reason behind the settlement.
Post by Deleted on Apr 6, 2018 13:35:14 GMT -5
More navigator mishaps:

CBS News, April 6, 2018, 9:13 AM
L.A. residents complain GPS app Waze is creating "insanity" on their street
www.cbsnews.com/news/los-angeles-baxter-street-accidents-waze-traffic/

It's bumper-to-bumper traffic on Baxter Street as Los Angeles commuters make their way home. Jeff Hartman has lived here for 20 years and said he's never seen it this bad. He thinks GPS apps are responsible for the traffic. The app most people on the street blame is Waze, reports CBS News correspondent Carter Evans. When you put in an address across town, the normal route is full of traffic, so the app routes drivers through Baxter Street.

But Baxter Street looks more like a roller coaster track, with a 30 percent grade making it one of the steepest streets in the country. It's even steeper than San Francisco's world-famous Lombard Street.

"And are they prepared for it when they get here?" Carter asked.

"I don't think so," Hartman said. "When you get to the top, you can't see the hill on the other side, or the street, so people tend to stop. And that's where a lot of the problems come."

Hartman has seen it all: cars flipped into his neighbor's yard, or stalled and slipping down the hill in the rain. His neighbors have documented a number of dangerous accidents.

"They took out my trellis, my retaining wall, my picket fence... it looked like a plane crashed through my front yard," one neighbor said.

"He lost control of the car and ended up rolling over two driveways," another said.

It's even worse for bigger vehicles, which can get stuck at the top when their wheels lose traction. And it's not just Los Angeles: GPS-based apps are creating problems in cities across the country.

"People will do whatever the app tells them to and it's scary sometimes," said Tom Rowe, police chief in Leonia, New Jersey. Leonia solved the problem by restricting side streets to residents only during rush hour, which caused Waze to remove the shortcuts.

As for Baxter Street in Los Angeles, Waze told CBS News that since "the city has placed a public road there... it should be considered usable within Waze." Neighbors say Waze needs to do a better job of warning drivers and say the problem can't be curbed soon enough. "Screeching and honking and sirens and insanity all the time," Diana Wagman said.

Los Angeles officials said making this a one-way street or restricting turns could rule out Baxter as a shortcut. Waze said it encourages drivers to report any hazardous conditions, including steep inclines, and it often takes that information into account when it refines its maps.

© 2018 CBS Interactive Inc. All Rights Reserved.
Post by Deleted on Apr 6, 2018 13:47:28 GMT -5
"People will do whatever the [GPS] app tells them to and it's scary sometimes," said Tom Rowe, police chief in Leonia, New Jersey. Yup.
Post by Socal Fan on Apr 6, 2018 15:50:40 GMT -5
Quote: "Big settlement? I would bet over a million bucks. But that's chump change compared to some settlements."

Here's the irony: the family of the victim really lucked out. If the car had been driven by a human, the verdict would have been the victim's fault (she crossed outside a crosswalk, at night, in front of the car) and no compensation. Now they get a few million.
Post by Socal Fan on Apr 6, 2018 15:57:21 GMT -5
Interesting. My 2013 car was recalled by Chevrolet because the catalytic converter might, under certain circumstances, operate at too high a temperature, reducing its effectiveness and longevity. I brought my car to the dealer a few days ago and they fixed the problem by loading updated software. Auto mechanics just isn't what it used to be.
In the future when cars drive themselves, their driving skills can be improved by appropriate software fixes.