The Cellar

The Cellar (http://cellar.org/index.php)
-   Current Events (http://cellar.org/forumdisplay.php?f=4)
-   -   Uber Killing (http://cellar.org/showthread.php?t=33418)

xoxoxoBruce 03-22-2018 10:58 PM

Uber Killing
 
You've probably heard that Sunday night an Uber self-driving car, with a driver who is supposed to intercede if the car fucks up, killed a pedestrian walking her bike across the road.
They've been telling us the big advantage of self-drivers is lidar, which will see in the dark and avoid things people can't.
The fortunate thing about these cars is they have full-time video rolling inside and out.


xoxoxoBruce 03-23-2018 01:39 AM

1 Attachment(s)
The only successful self-driving car.

Griff 03-23-2018 06:22 AM

Robots take an early lead 1-0.

glatt 03-23-2018 07:23 AM

Can you blame them, after all the shit Boston Dynamic does to their robots to test them?

Clodfobble 03-23-2018 11:17 AM

Can anyone really claim they would have seen that woman, as a human driver? Maybe the camera makes it less clear than it was to normal eyes at the time, but she came out of freaking nowhere.

xoxoxoBruce 03-23-2018 01:25 PM

At 3 seconds into the video a driver might have swerved or at least braked. The car, which is supposed to use lidar to see her when the driver can't, hit her at full speed.

Griff 03-23-2018 01:29 PM

This is the same week Cadillac begins pimping their highway snooze technology.

Pico and ME 03-23-2018 02:55 PM

Quote:

Originally Posted by Clodfobble (Post 1006126)
Can anyone really claim they would have seen that woman, as a human driver? Maybe the camera makes it less clear than it was to normal eyes at the time, but she came out of freaking nowhere.

I agree, she shouldn't have been there in the first place. Never trust traffic, ever.

And...If Tempe is anything like Tucson, the roads are very dark at night.

glatt 03-23-2018 03:19 PM

Quote:

Originally Posted by Clodfobble (Post 1006126)
Can anyone really claim they would have seen that woman, as a human driver? Maybe the camera makes it less clear than it was to normal eyes at the time, but she came out of freaking nowhere.

I saw a picture yesterday where somebody went to the scene and took a handheld picture with their phone in HDR mode to show more closely what a human eye would have seen. This is that photo. You can see multiple shadows of the photographer, who wasn't using a tripod.

I think that video footage is not terribly accurate at showing what a human would see. Much darker. The road is divided with vegetation in the middle, so there were no headlights of oncoming traffic blinding a driver. I think a human would have seen the woman much earlier than it seems in the video.

The woman was jaywalking at night in front of a car, but I think I would have seen her based on the phone camera photo.

tw 03-23-2018 03:56 PM

Quote:

Originally Posted by glatt (Post 1006148)
I saw a picture yesterday where somebody went to the scene and took a handheld picture with their phone in HDR mode to show more closely what a human eye would have seen.

Does not matter what a human eye would see. Technical answers will discuss what is seen at different frequencies - many that no human eye would see. List all frequencies (with amplitude) to know what the vision system saw.

Assumption is that the item (victim) was not seen. Nonsense. Autonomous vehicles only at stage 2 will see things and still be confused. Too many reasons exist - all are suspect. Darkness should be the least likely suspect.

Why did a Tesla in complete daylight run into the side of an 18 wheeler? Vision problems. Pattern recognition defect? Far too much is involved and unknown to make any conclusion. But that too was only a stage 2 system. It also must be supervised by the human who is responsible for a car's actions.

Only one conclusion is possible. A 2nd stage system is too experimental. Nowhere near as sophisticated as Google's stage 4 systems that still require human supervision.

That means a human supervisor who does not constantly pay attention should be prosecuted for criminally negligent homicide. Because he was no less guilty than a driver who was drunk.

85% of all problems are directly traceable to top management. In a stage 2 system, that is clearly a required human supervisor. That guy in the driver's seat is criminally negligent.

xoxoxoBruce 03-23-2018 10:42 PM

Quote:

Originally Posted by glatt (Post 1006148)
I saw a picture yesterday where somebody went to the scene and took a handheld picture with their phone in HDR mode to show more closely what a human eye would have seen. This is that photo. You can see multiple shadows of the photographer, who wasn't using a tripod.

I think that video footage is not terribly accurate at showing what a human would see. Much darker. The road is divided with vegetation in the middle, so there were no headlights of oncoming traffic blinding a driver. I think a human would have seen the woman much earlier than it seems in the video.

The woman was jaywalking at night in front of a car, but I think I would have seen her based on the phone camera photo.

That's definitely the spot, right where the white line on the right turns from solid to broken. The phone camera and onboard video are using very different lenses. The video lens makes the building look much further away and much darker. Of course the car doesn't use what the video camera sees, but I agree the phone picture makes me think if the driver was paying attention he would have seen her much sooner and at least scrubbed off some speed, if not swerved around her.

tw 03-25-2018 06:55 PM

From the NY Times of 23 Mar 2018:
Quote:

Uber's Self-Driving Cars Were Struggling Before Arizona Crash

Uber's robotic vehicle project was not living up to expectations months before a self-driving car operated by the company struck and killed a woman in Tempe, Ariz.

The cars were having trouble driving through construction zones and next to tall vehicles, like big rigs. And Uber's human drivers had to intervene far more frequently than the drivers of competing autonomous car projects.

Waymo, formerly the self-driving car project of Google, said that in tests on roads in California last year, its cars went an average of nearly 5,600 miles before the driver had to take control from the computer to steer out of trouble. As of March, Uber was struggling to meet its target of 13 miles per "intervention" in Arizona, according to 100 pages of company documents obtained by The New York Times and two people familiar with the company’s operations in the Phoenix area but not permitted to speak publicly about it. ...


And there also was pressure to live up to a goal to offer a driverless car service by the end of the year and to impress top executives.
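For scale, a quick back-of-the-envelope comparison of the two intervention rates quoted in that article (rough figures as reported, not exact data):

```python
# Miles per driver intervention, as quoted in the NYT article above.
waymo_miles_per_intervention = 5600  # "nearly 5,600 miles" in California tests
uber_miles_per_intervention = 13     # Uber's Arizona target, per company documents

# How many times farther Waymo's cars went between interventions.
ratio = waymo_miles_per_intervention / uber_miles_per_intervention
print(f"Waymo went roughly {ratio:.0f}x farther between interventions")
```

That's a gap of more than two orders of magnitude, even against Uber's own target rather than its actual performance.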

footfootfoot 03-26-2018 10:11 AM

Quote:

Originally Posted by Clodfobble (Post 1006126)
Can anyone really claim they would have seen that woman, as a human driver? Maybe the camera makes it less clear than it was to normal eyes at the time, but she came out of freaking nowhere.

Not sure about your vision, but human eyes in general have yet to be surpassed by cameras. I also highly doubt the equipment on this car was even close to top-quality, state of the art. That last part is supposition.

tw 03-26-2018 07:57 PM

Quote:

Originally Posted by footfootfoot (Post 1006237)
Not sure about your vision, but human eyes in general, have yet to be surpassed by cameras.

That is not true. Human eyes cannot see in the dark. Cameras can. Motion detectors also do. That conclusion must be tempered by so many parameters including frequencies, motion, recognition, radiation intensity, etc.

xoxoxoBruce 03-26-2018 11:18 PM

They don't see, they detect, in this case patterns.


All times are GMT -5. The time now is 12:53 PM.

Powered by: vBulletin Version 3.8.1
Copyright ©2000 - 2024, Jelsoft Enterprises Ltd.