Deleted
Deleted Member
Posts: 0
|
Post by Deleted on Apr 6, 2018 16:33:52 GMT -5
Big settlement? I would bet over a million bucks. But that's chump change compared to some settlements. Here's the irony. The family of the victim really lucked out. If the car had been driven by a human, the verdict would have been: victim's fault (she crossed outside a crosswalk, at night, in front of the car), no compensation. Now they get a few million. All things considered, the verdict seems fair. The irony for me is that the car was 'self-driven' precisely so that accidents like this would be avoided.
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on Apr 6, 2018 16:51:00 GMT -5
Interesting. My 2013 car was recalled by Chevrolet because the catalytic converter might, under certain circumstances, operate at too high a temperature, reducing its effectiveness and longevity. I brought my car to the dealer a few days ago and they fixed the problem by loading a fix to the relevant software. Auto mechanics just isn't what it used to be. In the future when cars drive themselves, their driving skills can be improved by appropriate software fixes. Just wondering: Do you think that there will be a time when cars have no human driver at all? Just passengers? No human 'safety' back-up?
|
|
|
Post by Socal Fan on Apr 6, 2018 16:59:59 GMT -5
Just wondering: Do you think that there will be a time when cars have no human driver at all? Just passengers? No human 'safety' back-up?

Absolutely. Machines are imperfect but humans are even more so. So it makes sense to eliminate the weakest link.
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on Apr 6, 2018 21:46:06 GMT -5
Just wondering: Do you think that there will be a time when cars have no human driver at all? Just passengers? No human 'safety' back-up? Absolutely. Machines are imperfect but humans are even more so. So it makes sense to eliminate the weakest link.

Perhaps the 'weakest link' will always be the programmer/developer for software.
|
|
|
Post by Socal Fan on Apr 6, 2018 22:25:35 GMT -5
Perhaps the 'weakest link' will always be the programmer/developer for software.

Perhaps. But, as I've said before, the software only has to be better than a human driver, which is a very low bar to clear.
|
|
|
Post by agog on Apr 6, 2018 23:06:13 GMT -5
In the future when cars drive themselves, their driving skills can be improved by appropriate software fixes.

It'll be so much fun in a decade or so when self-driving cars are the norm if not the totality of autos allowed on public roads. Some mischievous 14yo Chinese hacker floors the gas on all the cars in America. That'll be a hoot. Can't wait.
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on Apr 6, 2018 23:07:28 GMT -5
Perhaps the 'weakest link' will always be the programmer/developer for software. Perhaps. But, as I've said before, the software only has to be better than a human driver, which is a very low bar to clear.

Our conversation reminds me of these words: ...in this world, people make gods and worship what they have created. It would be more fitting for gods to worship people.
|
|
|
Post by Socal Fan on Apr 7, 2018 0:16:54 GMT -5
Some mischievous 14yo Chinese hacker floors the gas on all the cars in America.

But it would be so much more fun to wipe out all the balances in the Federal Reserve Banking system. Turn Bill Gates and Warren Buffett into paupers overnight.
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on May 15, 2018 21:03:29 GMT -5
www.msn.com/en-us/money/companies/teslas-autopilot-engaged-during-utah-crash/ar-AAxh9VQ

Tesla's Autopilot engaged during Utah crash
By JULIAN HATTEM, Associated Press © The Associated Press

[Photo caption: In this Friday, May 11, 2018, photo released by the South Jordan Police Department shows a traffic collision involving a Tesla Model S sedan with a Fire Department mechanic truck stopped at a red light in South Jordan, Utah. Witnesses indicated the Tesla Model S did not brake prior to impact. Police Sgt. Samuel Winkler said the car's air bags were activated and that the Tesla's 28-year-old driver suffered a broken right ankle, while the driver of the mechanic truck didn't require treatment. Police in a Salt Lake City suburb say it's not immediately known whether a Tesla Model S sedan's semi-autonomous Autopilot driving system was in use when it rear-ended a truck apparently without braking before impact at approximately 60 mph. (South Jordan Police Department via AP)]

SALT LAKE CITY — The driver of a Tesla electric car had the vehicle's semi-autonomous Autopilot mode engaged when she slammed into the back of a Utah fire truck over the weekend, in the latest crash involving a car with self-driving features.

The 28-year-old driver of the car told police in suburban Salt Lake City that the system was switched on and that she had been looking at her phone before the Friday evening crash.

Tesla's Autopilot system uses radar, cameras with 360-degree visibility and sensors to detect nearby cars and objects. It's built so cars can automatically change lanes, steer, park and brake to help avoid collisions. The auto company markets the system as the "future of driving" but warns drivers to remain alert while using Autopilot and not to rely on it to entirely avoid accidents. Police reiterated that warning Monday. A Tesla spokesperson did not comment following the disclosure about the use of the feature.
On Twitter, co-founder Elon Musk said it was "super messed up" that the incident was garnering public attention, while thousands of accidents involving traditional automobiles "get almost no coverage."

South Jordan police said the Tesla Model S was going 60 mph (97 kph) when it slammed into the back of a fire truck stopped at a red light. The car appeared not to brake before impact, police said. The driver, whom police have not named, was taken to a hospital with a broken foot. The driver of the fire truck suffered whiplash and was not taken to a hospital.

"What's actually amazing about this accident is that a Model S hit a fire truck at 60 mph and the driver only broke an ankle," Musk tweeted. "An impact at that speed usually results in severe injury or death."

The National Transportation Safety Board has not opened an investigation into the crash, spokesman Keith Holloway said, though it could decide to do so. Over the past two months, federal officials have opened investigations into at least two other crashes involving Tesla vehicles. Last week, the NTSB opened a probe into an incident in which a Model S caught fire after crashing into a wall in Florida. Two 18-year-olds were trapped in the vehicle and killed in the flames. The agency has said it does not expect the semi-autonomous system to be a focus of that investigation.

The NTSB and the National Highway Traffic Safety Administration are also looking into the performance of the company's Autopilot system in the March crash of a Tesla Model X on a California highway. The driver in that incident died. In March, an Arizona pedestrian was killed by a self-driving Uber car, in the first death of its kind. A driver was behind the wheel of the test vehicle in that case but failed to halt in time.

The investigation into the crash in Utah is ongoing, police said.
The driver of the Tesla may face charges for failing to maintain the safety of her vehicle, which would be a traffic infraction, according to police spokesman Sgt. Samuel Winkler.
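The article describes Autopilot as braking automatically to help avoid collisions, apparently without doing so here. The basic idea behind such forward-collision logic can be sketched with a simple time-to-collision check. This is an illustrative sketch only, with made-up names and thresholds; it is not Tesla's actual Autopilot code.

```python
# Illustrative sketch of a forward-collision check (not Tesla's logic).
# If the gap to the object ahead, divided by the closing speed, falls
# below a threshold, command emergency braking.

def time_to_collision(gap_m: float, own_speed_mps: float,
                      lead_speed_mps: float) -> float:
    """Seconds until impact; infinity if the gap is not closing."""
    closing = own_speed_mps - lead_speed_mps
    if closing <= 0:
        return float("inf")
    return gap_m / closing

def should_brake(gap_m: float, own_speed_mps: float,
                 lead_speed_mps: float, threshold_s: float = 2.0) -> bool:
    """Trigger emergency braking when time-to-collision is too short."""
    return time_to_collision(gap_m, own_speed_mps, lead_speed_mps) < threshold_s

# A car doing 60 mph (~26.8 m/s) approaching a stopped truck 40 m ahead
# has a TTC of about 1.5 s, inside the braking threshold:
print(should_brake(40.0, 26.8, 0.0))  # True
```

In a real system the gap and closing speed would come from the radar and cameras the article mentions, and the threshold would vary with road conditions; the check itself is this simple in principle.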
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on May 31, 2018 15:04:11 GMT -5
www.rt.com/usa/428195-tesla-crash-laguna-police/

Tesla 'on autopilot' smashes into parked police car

A Tesla Model S crashed into a parked police vehicle while reportedly driving on autopilot in Laguna Beach, California, injuring the Tesla driver and badly damaging the cop car. Laguna Beach Sergeant Jim Cota said on Twitter that the cop car was unmanned at the time of the impact, while the Tesla driver sustained minor injuries. The crash happened soon after 11am Tuesday on Laguna Canyon Road. Cota told the LA Times the “police car is totalled.” The driver told police the car was operating on autopilot at the time of the crash.

Tesla’s autopilot system uses sensors, cameras, and radar to study the vehicle’s surroundings and carry out automatic functions which include emergency braking. The company describes it as being a “driver assistance system.” “When using Autopilot, drivers are continuously reminded of their responsibility to keep their hands on the wheel and maintain control of the vehicle at all times,” Tesla said. “Tesla has always been clear that Autopilot doesn’t make the car impervious to all accidents.”

According to Cota, a Tesla on autopilot crashed into a semi-truck at the same location last April. Police are examining whether something about the area’s lane markings or topography is creating a problem for the cars’ technology, KTLA reports.

Tesla cars have been involved in a number of recent crashes involving autopilot mode. In March, a Model X hit a highway divider while on autopilot in California, killing the driver. In Utah, a Model S in autopilot crashed into a fire truck stopped at a red light in May. Two people were injured in the crash. According to the vehicle’s data, it sped up just before hitting the truck and the driver hit the brakes seconds before impact. In January, a Model S crashed into a parked fire truck in California while on autopilot.
Tesla founder Elon Musk complained about the press coverage Tesla crashes are receiving earlier this month. “It’s super messed up that a Tesla crash resulting in a broken ankle is front page news and the ~40,000 people who died in US auto accidents alone in past year get almost no coverage,” Musk wrote about the Utah crash on Twitter.
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on May 31, 2018 19:23:25 GMT -5
I found at least two items in the above article noteworthy. First, Tesla says, “When using Autopilot, drivers are continuously reminded of their responsibility to keep their hands on the wheel and maintain control of the vehicle at all times.” Such a statement raises the question: why have autopilot in the first place?

Secondly, this: "According to Cota, a Tesla on autopilot crashed into a semi-truck at the same location last April. Police are examining whether something about the area’s lane markings or topography is creating a problem for the cars’ technology."

Computers can neither anticipate a possible situation nor use intuition when driving. Nor can they extrapolate from what they 'see' in the immediate surroundings when things are not as normally expected and proper adjustments must be made. One can only wonder how many variables a self-driving car can handle on any given road. A computer cannot possibly handle all of the possibilities that might lead to an accident. Of course, neither can we humans. But I still believe that we are much better at it.
|
|
|
Post by agog on Jun 14, 2018 22:31:12 GMT -5
It'll be so much fun in a decade or so when self driving cars are the norm if not the totality of autos allowed on public roads. Some mischievous 14yo Chinese hacker floors the gas on all the cars in America. That'll be a hoot. Can't wait.

*************************************************************************

THIS JUST IN!: Experts at the Department of Homeland Security have indicated it is only a “matter of time” before hackers manage to interfere with a commercial airliner, leading to a potentially “catastrophic disaster,” according to a report from CBS News on Tuesday.

Things just keep on getting better.
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on Jun 30, 2018 13:42:31 GMT -5
This self-driving car will sacrifice itself to save pedestrians
The vehicles will take part in a pilot program with Kroger in the fall.
by Rob Verger, Popular Science, June 29, 2018
www.popsci.com/self-driving-grocery-car-nuro?dom=prime&src=syn

[Photo caption: The sensors on this autonomous car include a LIDAR unit, at top.]

Most self-driving cars are designed to carry people. That’s what Google’s autonomous vehicles do in Arizona, for example. But a company called Nuro has created a diminutive ‘bot that can drive itself down streets toting eggs, toilet paper, hotdogs, and other kinds of groceries to people’s homes. And interestingly, when humans aren’t your cargo, different ways of handling safety present themselves.

A pilot program involving Nuro and the grocery chain Kroger is scheduled to kick off this fall in a to-be-announced city, meaning that in the autumn, people in a test urban area should be able to order groceries by app, then have them delivered by a little independent car. Here’s how the vehicle, which was designed from the ground up just for cargo, works.

How does it work?

In terms of size, the Nuro car is “pretty similar to a big guy on a motorbike,” says Nuro co-founder Dave Ferguson, who previously worked with Google’s self-driving car project, now called Waymo. The vehicle is about 6 feet high, 8 feet long, and 3.6 feet wide—about half a Toyota Corolla’s width. It’s electric-powered, with a battery system designed to last all day without a recharge, Ferguson says. It weighs around 1,500 pounds. The produce-schlepper will go just 25 miles per hour, but will increase its max speed in the future. This generation of the vehicle has two main compartments, each of which can hold six grocery bags. In a future version, those two compartments will be able to hold 10 bags each—enough for some serious barbeque supplies. Obviously, there’s no human on board to steer it around.
For that, this car, like other self-driving vehicles, needs a perception system to see the world. A spinning laser unit at the top, called a LIDAR, uses light to measure the distance from the vehicle to objects. Cameras and radar units also have a view 360 degrees around the Nuro, the latter of which can also measure the velocity of moving objects near it. It has access to map data to know where it is, as well as GPS. All of that means the car can see the world around it. And since GPS might not be totally precise, the car’s sensors can help localize it, too. “It can effectively compare what it’s seeing with what it expects in the map, and it can use that to correct its position,” Ferguson says.

The little transporter will be driving around with goods onboard like eggs and hamburger buns, and not people, meaning Nuro can think about safety differently. That’s an even more important subject in the self-driving car world since one of Uber’s autonomous cars killed an Arizona pedestrian, despite the fact that a human safety driver was behind the wheel at the time. “If you’re no longer trying to protect an occupant above all else, and in fact you’re trying to protect the most vulnerable road users—a pedestrian, cyclists—at all costs, then you can do things like self-sacrificing the vehicle,” he says. Given a situation where the car has to decide between hitting a person or a tree, Ferguson imagines, “we will always drive into the tree.” Or even, he says, a parked car. “We’ve designed the whole front end of the vehicle to collapse,” he adds, “and to try to absorb as much energy as possible, in the case of any collision.”

How will the robot grocery system work?

Once the pilot program kicks into gear, it should work this way: a person in the delivery area will order the groceries they want via a Nuro app, or the Kroger website (the grocery chain already has a delivery service).
After Kroger staff stock a Nuro vehicle with the goods, the car will set off for the customer’s house; the grocery customer will be able to follow the whereabouts of the car, like it’s an Uber. Once it arrives, the customer gains access to the vehicle via the app, or by entering a code on a touchscreen. When they open the cargo door, they’ll have access to only their own groceries, Ferguson says, so no one swipes anyone else’s bananas.

Ride-sharing companies like Uber and Lyft are exploring self-driving car tech because ultimately, it could save money—you don’t need to pay a driver. Ferguson says that the idea behind this service is that it could bring groceries to people wherever they are: “We’re talking about being able to provide a service that is so low cost that people can use it that currently don’t use any services,” he says. Ultimately, the proof will be in the robot-delivered pudding during the pilot program this fall.
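Ferguson's map-correction idea, comparing what the sensors see against what the map says should be there, can be sketched in a few lines. This is a minimal illustration of the principle only; the function name, the coordinate setup, and the simple averaging are my assumptions, not Nuro's actual software.

```python
# Sketch of map-based position correction: each recognized landmark,
# seen by the sensors at some offset from the vehicle, implies where
# the vehicle must be (map position of the landmark minus that offset).
# Averaging the implied positions corrects a noisy GPS estimate.
# Names and numbers are illustrative, not Nuro's actual system.

def correct_position(observed_rel, mapped):
    """observed_rel[i]: landmark position measured relative to the
    vehicle; mapped[i]: the same landmark's position in the map.
    Returns the map-corrected vehicle position (x, y)."""
    assert observed_rel and len(observed_rel) == len(mapped)
    implied = [(mx - ox, my - oy)
               for (ox, oy), (mx, my) in zip(observed_rel, mapped)]
    n = len(implied)
    return (sum(p[0] for p in implied) / n,
            sum(p[1] for p in implied) / n)

# Two landmarks seen at offsets (3,4) and (-1,2); the map puts them at
# (13,24) and (9,22). Both imply the vehicle sits at (10, 20):
print(correct_position([(3.0, 4.0), (-1.0, 2.0)],
                       [(13.0, 24.0), (9.0, 22.0)]))  # (10.0, 20.0)
```

Production localizers do essentially this with far more machinery: probabilistic weighting of landmarks, outlier rejection, and fusion with GPS and wheel odometry rather than a plain average.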
|
|
|
Post by Socal Fan on Jul 1, 2018 0:31:05 GMT -5
Computers can neither anticipate a possible situation nor use intuition when driving. Nor can they extrapolate from what they 'see' in the immediate surroundings when things are not as normally expected, demanding that proper adjustments be made. One can only wonder how many variables a self-driving car can handle when on any road. A computer cannot possibly handle all of the possibilities that might lead to an accident. Of course, neither can we humans. But I still believe that we are much better at it.

Only when we are not drunk, sleepy, high on drugs, taking prescription medication, texting our friends, playing video games, talking on our cell phones, distracted, depressed, suicidal, etc. So my money is still on the computer.
|
|
|
Post by rickolsen on Jul 1, 2018 1:45:11 GMT -5
The computer chips in my cars regularly failed, so they wouldn't start. It cost $350 to replace a $20 computer chip. I'd never trust a computer to drive a car. If they fail, you're dead.
|
|