|
Post by Socal Fan on Mar 20, 2018 13:56:19 GMT -5
The problem with gut feelings is that they are as likely to be wrong as right. So gut feelings may avoid some accidents but cause some others.
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on Mar 20, 2018 16:55:28 GMT -5
The problem with gut feelings is that they are as likely to be wrong as right. So gut feelings may avoid some accidents but cause some others.

My own hunch, or gut feeling, is that the car in question was speeding. If so, it made the very same mistake that human drivers seem to make on a regular basis. But regardless of 'who' may be at fault, a person is dead. Consider this: if we use your logic about gut feelings, the odds are that human-driven and self-driven cars will have the same number of accidents involving fatalities. And because human beings love to scapegoat, which of the two types of drivers do you think will bear an unfair amount of the blame? Shareholders of stock in self-driving cars may be in for a bumpy ride!
|
|
|
Post by Socal Fan on Mar 20, 2018 17:16:19 GMT -5
If we use your logic about gut feelings, the odds are that human-driven and self-driven cars will have the same number of accidents involving fatalities.

Human-driven and self-driven cars are likely to have the same number of accidents specifically with respect to the gut-feelings factor. But when it comes to other factors (e.g., response time, distractions, senses, alcohol), self-driven cars have a big advantage.
|
|
|
Post by Deleted on Mar 20, 2018 17:48:33 GMT -5
If we use your logic about gut feelings, the odds are that human-driven and self-driven cars will have the same number of accidents involving fatalities. Human-driven and self-driven cars are likely to have the same number of accidents specifically with respect to the gut-feelings factor. But when it comes to other factors (e.g., response time, distractions, senses, alcohol), self-driven cars have a big advantage.

Duly noted, socalfan. What say you about ghosts in the machine? I have plenty of them in my computer. Sometimes it will abruptly and simply stop working correctly, for no apparent reason. Although we may need to come to accept occasional (or frequent) computer errors in self-driving vehicles, accidents will remain on the horizon of human ambition whether they originate from human or computer error... which leads me to this question: how many computer-generated fatalities are acceptable?
|
|
|
Post by Socal Fan on Mar 20, 2018 19:47:34 GMT -5
What say you about ghosts in the machine?

It turns out that modern cars are already largely run by computers, which in cars are known as Electronic Control Units (ECUs). Some modern motor vehicles have up to 80 ECUs. Embedded software in ECUs continues to increase in line count, complexity, and sophistication. en.wikipedia.org/wiki/Electronic_control_unit

So a self-driving car is merely an expansion in the use of automotive computers, not a new use. So far, ghosts appear not to be a major problem.
|
|
|
Post by Deleted on Mar 20, 2018 19:56:19 GMT -5
What say you about ghosts in the machine? It turns out that modern cars are already largely run by computers, which in cars are known as Electronic Control Units (ECUs). Some modern motor vehicles have up to 80 ECUs. Embedded software in ECUs continues to increase in line count, complexity, and sophistication. en.wikipedia.org/wiki/Electronic_control_unit So a self-driving car is merely an expansion in the use of automotive computers, not a new use. So far, ghosts appear not to be a major problem.

Underscore "so far"...
|
|
|
Post by Deleted on Mar 20, 2018 23:17:00 GMT -5
Socalfan, are you familiar enough with self-driving cars to tell us whether they are analogous to aircraft on autopilot?
|
|
|
Post by agog on Mar 21, 2018 3:50:33 GMT -5
Very few pedestrians are run over by airplanes.
|
|
|
Post by BOGC on Mar 21, 2018 5:10:47 GMT -5
It turns out that modern cars are already largely run by computers, which in cars are known as Electronic Control Units (ECUs). Some modern motor vehicles have up to 80 ECUs. Embedded software in ECUs continues to increase in line count, complexity, and sophistication. en.wikipedia.org/wiki/Electronic_control_unit So a self-driving car is merely an expansion in the use of automotive computers, not a new use. So far, ghosts appear not to be a major problem. Underscore so far...

I think there have been (unreleased) remote (cellular or WiFi) hacks of non-self-driving cars that were sufficient to engage auto-braking or collision avoidance, or otherwise interfere with actual driving. Security is notoriously difficult, and often neglected because it's cheaper to pay off losses than to do it right; it would be all the more critical for a self-driving vehicle.

Also, IMO, for the foreseeable future the real flaw in socalfan's statement is that it's only statistically true that a self-driving car would be safer than a human driver. It would be safer than a marginally competent human driver, or a human driver who is exhausted or in conditions beyond their experience. It would not be safer than a very competent and widely experienced human driver operating well within their high-functioning range (not tired, not with low blood sugar, not sick, acutely distracted, or upset, etc.). So it's true in the same sense that a vaccine might protect vastly more people than it harms (and at sufficient usage, even those who don't get vaccinated are safer); but there will be a few bad reactions, a few people harmed, who would not have been without the vaccine.

The conclusion is that eventually the collectivists would want to require only the collectively (statistically) safest practice, notwithstanding that it denies people the option of having more control over their own outcomes; or, in the case of vaccines, undercuts Darwin by harming those with otherwise irrelevant vulnerabilities while keeping alive more of those with relatively common vulnerabilities.
|
|
|
Post by BOGC on Mar 21, 2018 5:20:49 GMT -5
Socalfan, are you familiar enough with self-driving cars to tell us whether they are analogous to aircraft on autopilot?

Aside from auto-landing (which has a lot of ground-based support), autopilots have a much easier job than self-driving cars. Why? Because even ordinary driving is like the Blue Angels or Thunderbirds flying in tight formation. Rest assured, they aren't on autopilot! Although an aircraft is much faster than a car, the distances between aircraft are so much greater, and you have an additional dimension for avoidance (it's like an infinitely wide road, where you can't get trapped between traffic in the next lane and the side of the road), so most of the time the available reaction time is much smaller in a car. Also, aircraft with more sophisticated autopilots (some are pretty primitive, even mechanical) already have collision-avoidance features, etc. Finally, there are LOTS more cars in the tiny space of roads than there are airplanes in the much larger available airspace, except during or near takeoff and landing, which are usually the most difficult and dangerous parts of routine flying (by routine, I mean not combat, formation, aerobatic, or otherwise complicated; just basic transport).

So in a practical sense, IMO, getting a self-driving car right is far more challenging than getting an autopilot right. And for now there's always a pilot (or a drone pilot) in control, even if they're just waiting to deal with whatever problems arise; not like a fully autonomous self-driving car that doesn't even have controls. (BTW, AFAIK the recent accident involved a car that had a human safety driver on board, and they likely couldn't have spotted the pedestrian in time either. There will always be a few situations that even the combination of the best available driver AND self-driving system can't avoid.)
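The reaction-time point above can be sketched with a quick back-of-the-envelope calculation. All distances and speeds below are rough illustrative assumptions, not measurements:

```python
# Illustrative sketch: seconds available to react before a potential
# collision, for a car in traffic vs. an aircraft en route.
# All numbers are assumptions chosen for illustration only.

def time_to_conflict(separation_m: float, closing_speed_mps: float) -> float:
    """Seconds until two converging vehicles would meet at a constant closing speed."""
    return separation_m / closing_speed_mps

# Car in traffic: assume ~30 m of following distance and ~15 m/s
# closing speed if the vehicle ahead brakes hard.
car_margin = time_to_conflict(30, 15)

# Aircraft en route: assume ~9 km of separation and ~250 m/s closing speed.
aircraft_margin = time_to_conflict(9_000, 250)

print(f"car:      {car_margin:.1f} s to react")
print(f"aircraft: {aircraft_margin:.1f} s to react")
```

Even with the aircraft closing an order of magnitude faster, the much larger separations leave far more time to react, which is the heart of the autopilot-vs-car comparison.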
|
|
|
Post by wilbur on Mar 21, 2018 10:00:50 GMT -5
It's not the outdated software, it's an uploaded virus that I would be worried about.
|
|
|
Post by Deleted on Mar 21, 2018 10:13:26 GMT -5
After reading members' comments this morning, my questions have been answered (thank you) except one: How many computer-generated fatalities are acceptable?

Answer: Apparently, any number of deaths will be acceptable if there is no way (or willingness) to prevent them. That is, if we accept the reality of this new development and its consequences.
|
|
|
Post by Socal Fan on Mar 21, 2018 11:30:14 GMT -5
Apparently, any number of deaths will be acceptable if there is no way (or willingness) to prevent them.

For me, any number of deaths is acceptable as long as it is less than with human-driven cars. Given all the human frailties, I think that's a low bar, so I think it is very likely it can be achieved.
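That "low bar" can be made concrete with a toy calculation. The human baseline below is roughly the order of magnitude often cited for US roads (about 1 fatality per 100 million vehicle miles); the self-driving rate and the annual mileage figure are assumptions made purely for illustration:

```python
# Toy comparison of expected annual road deaths under two fatality rates.
# HUMAN_RATE is roughly the order of magnitude cited for US roads;
# AV_RATE and ANNUAL_MILES are illustrative assumptions, not data.

HUMAN_RATE = 1.1       # fatalities per 100 million vehicle miles (approximate)
AV_RATE = 0.5          # hypothetical self-driving rate (assumed)
ANNUAL_MILES = 3.2e12  # rough annual US vehicle miles traveled (assumed)

def expected_fatalities(rate_per_100m_miles: float, miles: float) -> float:
    """Expected deaths for a given fatality rate and total miles driven."""
    return rate_per_100m_miles * miles / 1e8

human = expected_fatalities(HUMAN_RATE, ANNUAL_MILES)
av = expected_fatalities(AV_RATE, ANNUAL_MILES)

print(f"human-driven: ~{human:,.0f} expected deaths/year")
print(f"self-driven:  ~{av:,.0f} expected deaths/year")
print(f"bar cleared:  {av < human}")
```

Under these assumed numbers, even a modest improvement in the per-mile rate would translate to thousands of lives per year, which is why "fewer deaths than human drivers" is a statistical claim rather than a promise about any individual trip.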
|
|
|
Post by Socal Fan on Mar 21, 2018 11:34:29 GMT -5
It's not the outdated software, it's an uploaded virus that I would be worried about.

Agreed. But that goes far beyond self-driven cars. It may be that the Russians or Chinese can hack into and destroy our electrical grid, telecommunications systems, and financial systems.
|
|
|
Post by Socal Fan on Mar 21, 2018 11:42:33 GMT -5
Socalfan, are you familiar enough with self-driving cars to tell us whether they are analogous to aircraft on autopilot? IMO getting a self-driving car right is far more challenging than getting an autopilot right.

Agreed. In the air, everything is rigidly controlled, and there are no traffic lights, four-way stop signs, pedestrians, lane merges, yields, or rights of way. On the ground, it is a free-for-all where anything can happen.
|
|