Video of AZ Uber crash

Bruce

Grand Admiral
May 23, 2017
520
1,889
1,350
RSI Handle
ABAP
What you're saying is that all of these systems failed, and that's OK, because the woman wasn't in a crosswalk. Look I get it, I said she had some responsibility, but don't deny Uber theirs.
No, what I'm saying is that we need to see what was visible to those systems and how it was classified before making any assumptions about their state and capabilities. I've worked on some real-time decision-making systems for very fast-moving objects, and I can tell you it is very tough for machines and people to properly recognize an object that isn't behaving as expected. A car flying at an altitude of 10,000 feet, for instance, would confuse everyone for a long time; an unconstrained neural network might eventually figure out that it is a car, but I wouldn't bet on it being able to do so within seconds.
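The point about recognition taking time can be illustrated with a toy confidence filter: a detection only becomes actionable after several consecutive frames above a threshold. The function names and numbers here are entirely hypothetical, not Uber's actual pipeline:

```python
# Toy sketch: a detection is only "confirmed" after N consecutive frames
# whose classifier confidence exceeds a threshold. Names and numbers are
# hypothetical; real perception stacks are far more involved.
def confirmed(confidences, threshold=0.8, frames_needed=5):
    streak = 0
    for c in confidences:
        streak = streak + 1 if c >= threshold else 0
        if streak >= frames_needed:
            return True
    return False

# An object behaving unexpectedly may hover below threshold for many frames:
print(confirmed([0.4, 0.5, 0.6, 0.7, 0.75, 0.79]))  # False
print(confirmed([0.85, 0.9, 0.88, 0.92, 0.95]))     # True
```

An unusual object (a pedestrian walking a bicycle mid-block at night) could sit in that low-confidence regime for a long time before the system commits to a classification.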
 
Reactions: Sraika and Bambooza

Bambooza

Space Marshal
Donor
Sep 25, 2017
5,778
18,296
2,875
RSI Handle
MrBambooza
What you're saying is that all of these systems failed, and that's OK, because the woman wasn't in a crosswalk. Look I get it, I said she had some responsibility, but don't deny Uber theirs.
This. While the woman wasn't in a crosswalk, I do wonder whether the automated car system Uber is using wouldn't have failed even if she had been in one.
 

Bruce

Grand Admiral
May 23, 2017
520
1,889
1,350
RSI Handle
ABAP
This. While the woman wasn't in a crosswalk, I do wonder whether the automated car system Uber is using wouldn't have failed even if she had been in one.
Most probably it wouldn't have. People on shoulders and in crosswalks are identified, and the system is trained to err on their side. But "some" object with the shape of a car (wide at the bottom because of the bike), in the lane of cross traffic and outside the normal pedestrian movement areas, is quite a different story. Also, given the angle of the roads, some slow movement toward the Uber's lane could have been considered fine and not a crossing. Again, without a complete review of the data it is quite difficult to say what exactly the reason was.
 
Reactions: Bambooza and Sraika

Vavrik

Space Marshal
Donor
Sep 19, 2017
5,476
21,988
3,025
RSI Handle
Vavrik
This. While the woman wasn't in a crosswalk, I do wonder whether the automated car system Uber is using wouldn't have failed even if she had been in one.
That is a good question, you know. The word "proven" is the big topic here. Level 2 certified automation systems (where we are now, the state of the art) have proven they are able to stay in a lane and follow the speed limit. They have not proven any other capability, even though they might have some aptitude. The software might behave differently near a crosswalk, but that cannot be, or has not been, proven. That's precisely why they require an operator/driver to monitor the system's performance and override it if needed. They require this until Level 5 is achieved, and we're only at Level 2.

What gets me in all of this is that there are companies putting fully autonomous vehicles on public roads right now, and states that are allowing it, despite the lack of certification proving the systems are capable. It's all due to the hype surrounding AI, but the truth is, AI is nothing close to what we (collectively) think it is.
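The automation levels mentioned above can be sketched as a small lookup. The descriptions are paraphrased from the SAE J3016 level definitions; this is an illustration, not the official wording:

```python
# Minimal sketch of the SAE J3016 driving-automation levels.
# Descriptions are paraphrased and simplified for illustration.
SAE_LEVELS = {
    0: "No automation: the human does everything",
    1: "Driver assistance: steering OR speed support",
    2: "Partial automation: steering AND speed, human must monitor",
    3: "Conditional automation: human must take over on request",
    4: "High automation: no human needed within a defined domain",
    5: "Full automation: no human needed anywhere",
}

def requires_human_monitor(level: int) -> bool:
    """At Level 2 and below, a human must actively supervise the system."""
    return level <= 2

print(SAE_LEVELS[2])
print(requires_human_monitor(2))  # True
```

The gap the post is pointing at is exactly the jump from `requires_human_monitor(2) == True` to Level 5, where no supervision is needed.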
 
Reactions: Bambooza

Sirus7264

Space Marshal
Donor
Apr 5, 2017
3,364
11,195
2,800
RSI Handle
Sirus7264
I'm sorry, but I can't blame the car at all on this. The driver, on the other hand, I believe is most at fault. He is too busy playing on his cellphone or falling asleep at the wheel. If I were that driver, my first reaction would have been to turn right off the side of the road, which was highly possible. Not only would this have reduced how much of an impact the victim received, it would have put a big part of the inertia in a different direction. I don't think an autonomous car would have known to do that. Also, if you engage the emergency brake and your normal brakes at the same time, you can stop much faster and harder. With that said, there is blame on the victim too, as she was crossing at a no-crossing area, with no reflective gear, at night, not paying attention (most likely earbuds; I bet this is becoming an issue here in Japan). Technology isn't perfect; that's why there is a driver there. Especially when technology like this is brand new, there are bound to be bugs and imperfections.
 
Reactions: Sraika

Bambooza

Space Marshal
Donor
Sep 25, 2017
5,778
18,296
2,875
RSI Handle
MrBambooza
Such a non-clear-cut case. If the driver had been fully engaged and paying attention, it would have been easy to say they "didn't see her in time to make any correction to avoid hitting the pedestrian." In this case some of the blame falls directly on the driver for clearly being distracted, though whether being attentive would have changed the outcome is hard to tell from that clip.
Most of the blame falls on the pedestrian, who was clearly somewhere she should not have been, and has paid the ultimate price.
As for the AI, while it seems to be receiving most of the blame, and indeed it clearly failed at this point in time, it should be allowed to fail; as @Vavrik pointed out, it's still in the early stages of development.

Still I would love to see the recorded playback of what the AI saw and was reacting to.
 

Thalstan

Space Marshal
Jun 5, 2016
2,084
7,400
2,850
RSI Handle
Thalstan
The problem with self-driving cars that rely on a human to react and take over in an unidentified situation is that the driver will inevitably disengage from that duty. Heck, even in cars today, where the person is supposed to be driving, we have people texting, reading, putting on make-up, combing their hair, etc. Now take that same person, give them less responsibility, and see how much they zone out.

Things that make me wonder: it looks like only the low beams were on. Based on the traffic and lighting, I am pretty sure I would have had my high beams on to give me better visibility. That would give me greater warning of people or animals. Often it's not what's in the lane in front of me, but what is all around me.

IMO, self-driving cars that rely on humans to act in emergencies are a BAD IDEA. A human will never be as focused on the task as they would be if they were driving, and when time to react is measured in milliseconds, a human cannot refocus, evaluate, and decide on a corrective action in time.

Second, cars will never learn patterns and sense when things just aren't right.

One last thing: why didn't it detect the person? If it did, why did it not slow down? If its logic was "I see this issue, but I have right of way," that is a problem; it should anticipate people doing stupid things. If it did not detect the person, why not? Because if it did not detect her, then this could happen again when the person does have right of way.
 
Reactions: Bambooza

Tealwraith

Heresy detector
Donor
May 31, 2017
1,056
4,822
2,650
RSI Handle
Tealwraith
Every state has different laws for driving, roads, pedestrians, etc. One thing that doesn't change is that if you don't get in front of a car, it won't hit you. I realize it's out of fashion to talk about this, but I'm so old that my parents taught me to look both ways before crossing a street, and never to put myself in danger by allowing my body to be in the path of a vehicle.

Now you may return to arguing over who was responsible for what and how things should have happened.
 
Reactions: Bambooza

Vavrik

Space Marshal
Donor
Sep 19, 2017
5,476
21,988
3,025
RSI Handle
Vavrik
One last thing: why didn't it detect the person? If it did, why did it not slow down? If its logic was "I see this issue, but I have right of way," that is a problem; it should anticipate people doing stupid things. If it did not detect the person, why not? Because if it did not detect her, then this could happen again when the person does have right of way.
The simple answer is, the technology is not as good as the hype and expectations surrounding it claim it is. It might have detected the person at some level, and it certainly had the sensor technology to do so, but it doesn't necessarily follow that the software knows what to do with the sensor information. So yes, it could potentially happen again, even when the person does have the right of way.

Why did this happen now? It's like cherry picking: that woman was low-hanging fruit. People crossing at crosswalks usually do so on a green light, or when traffic has stopped, and they look for traffic. She wasn't even near a crosswalk, didn't look for traffic, and did a bunch of other really stupid things. Quite honestly, it was simply a matter of time for her. But people, even in a crosswalk, should be aware of the traffic around them. The laws of physics are not suspended just because you have the right of way. That's a good thing to keep in mind.
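The gap between sensing something and acting on it can be sketched as a pipeline where a raw detection is discarded unless it maps to a class the planner knows how to act on. The class names and logic here are hypothetical, purely to illustrate the idea:

```python
# Hypothetical sketch: a sensor return only triggers braking if the
# classifier maps it to a class the planner knows how to act on.
ACTIONABLE = {"pedestrian", "vehicle", "cyclist"}

def plan_action(detection_class):
    # "unknown" detections fall through: sensed, but not acted on.
    return "brake" if detection_class in ACTIONABLE else "continue"

print(plan_action("pedestrian"))  # brake
print(plan_action("unknown"))     # continue
```

In a sketch like this, "the sensors saw her" and "the software braked" are two separate questions, which is exactly the distinction being drawn above.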
 
Reactions: Bambooza

Crymsan

Space Marshal
Mar 10, 2016
954
2,964
1,550
RSI Handle
Crymsan
This crash highlights one of the problems with driverless cars: who is responsible in the event of anything bad happening? The human backup driver is very much acting as a passenger (which is not overly surprising, really, and very much what you would expect, given that something else is driving).

It is very easy to say you should pay attention as if you are driving ready to take control at any moment, but can people really do that all the time whilst a computer programme drives?
 
Reactions: Bambooza