
Uber Self-Driving Vehicle Fatally Strikes Pedestrian

Clay
Jun 03, 2019
In Tech Talks
3/23/18 - Last updated 9/12/18

Uber, one of the most notable and popular ride-share services, has recently come under a lot of heat over its self-driving vehicles. While testing at night in Tempe, Arizona, one of its vehicles struck a 49-year-old woman named Elaine Herzberg as she was crossing the road; she later died in the hospital. This has sparked much controversy in the autonomous-car sector, and that is what we are here to talk about. At this moment we do not have the full case, but we have enough to discuss. We are not going to dwell on the fact that she was crossing the road illegally; this post is about what happened and whether the car should have stopped.

In the dash-cam footage of the Uber self-driving vehicle released by the local police department, we can see the car driving toward a dark spot where Elaine was crossing the road with her bicycle, and at the last second she appears. This may look like a pure visibility issue, but several factors show it should never have ended as badly as it did.

Analyzing the video & scene

This is the first area that raises some eyebrows. Many people have gone out on their own and filmed this exact spot at the same time of night (a few days after the accident, so the lighting was the same), and the issue is clear: all of their cameras, presumably smartphones, can plainly see the whole area. Admittedly, I have seen some footage that did look dark, but a vehicle like this should have the best cameras possible (the released footage was not from the self-driving system itself). Beyond that, in modern autonomous vehicles the cameras mostly read signs, lights and lane markings; they are a secondary system that supplements radar and LIDAR for obstacle detection. If the system were camera-only, we could see where it might have gone wrong, but that is not the case.

We also cannot forget the "driver" in the vehicle, who would have, and should have, seen her crossing long before, but was not paying attention. This is a main city street, not some back road; between the street lighting and the headlights of a 2018(?) Volvo XC90, lighting would not be an issue for the human eye. That is a big strike against this ever happening, and it is exactly why laws restrict these test vehicles to Level 2 autonomy, where the driver is responsible for the actions of the vehicle no matter what.

What really happened?

On May 24th, America's National Transportation Safety Board (NTSB) issued its preliminary report into the crash. What caused the accident, and what does it say about the safety of autonomous vehicles (AVs) more broadly?

The computer systems that drive cars like these consist of three modules. The first is the perception module, which takes information from the car's sensors and identifies relevant objects nearby. The Uber car, a modified Volvo XC90, was equipped with cameras, radar and LIDAR (a variant of radar that uses invisible pulses of light). Cameras can spot features such as lane markings, road signs and traffic lights. Radar measures the velocity of nearby objects. LIDAR determines the shape of the car's surroundings in fine detail, even in the dark. The readings from these sensors are combined to build a model of the world, and machine-learning systems then identify nearby cars, bicycles, pedestrians and so on. The second module is the prediction module, which forecasts how each of those objects will behave in the next few seconds. Will that car change lane? Will that pedestrian step into the road? Finally, the third module uses these predictions to determine how the vehicle should respond (the so-called "driving policy"): speed up, slow down, or steer left or right.
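To make the three-module structure concrete, below is a minimal sketch in Python of how such a pipeline hangs together. Every name in it (Detection, perceive, driving_policy, the lane-width constant) is invented for illustration, and the constant-velocity prediction and lane-overlap check stand in for what are, in real AV stacks, large learned models; treat it as a reading aid, not as how Uber's system actually worked.

```python
from dataclasses import dataclass

# Minimal AV pipeline sketch. All names and numbers are invented for
# illustration; real systems use learned models at every stage.

@dataclass
class Detection:
    label: str                     # e.g. "car", "bicycle", "unknown"
    position: tuple[float, float]  # (x, y) metres, vehicle frame, x ahead
    velocity: tuple[float, float]  # (vx, vy) m/s, estimated from radar

def perceive(sensor_frame: dict) -> list[Detection]:
    """Perception: fuse camera, radar and LIDAR readings into labelled
    objects. Stubbed out here; this is where ML detectors would run."""
    return sensor_frame.get("detections", [])

def predict(det: Detection, horizon_s: float = 3.0) -> tuple[float, float]:
    """Prediction: naive constant-velocity guess at where the object
    will be `horizon_s` seconds from now."""
    x, y = det.position
    vx, vy = det.velocity
    return (x + vx * horizon_s, y + vy * horizon_s)

def driving_policy(dets: list[Detection], lane_halfwidth_m: float = 2.0) -> str:
    """Driving policy: brake if any object is predicted to end up in our
    lane (small |y|) somewhere ahead of us (positive x)."""
    for det in dets:
        fx, fy = predict(det)
        if fx > 0 and abs(fy) < lane_halfwidth_m:
            return "brake"
    return "maintain_speed"

# A cyclist 40 m ahead, crossing from the left at 1.5 m/s:
frame = {"detections": [Detection("bicycle", (40.0, -6.0), (0.0, 1.5))]}
print(driving_policy(perceive(frame)))  # -> "brake"
```

The point of the structure is visible even at this scale: if perceive mislabels an object, everything downstream, prediction and policy alike, inherits the error.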
Of these three modules, the most difficult to build is the perception module, says Sebastian Thrun, a Stanford professor who used to lead Google's autonomous-vehicle effort. The hardest things to identify, he says, are rarely seen items such as debris on the road, or plastic bags blowing across a highway. In the early days of Google's AV project, he recalls, "our perception module could not distinguish a plastic bag from a flying child."

According to the NTSB report, the Uber vehicle struggled to identify Elaine Herzberg as she wheeled her bicycle across the four-lane road. Although it was dark, the car's radar and LIDAR detected her six seconds before the crash. But the perception system got confused: it classified her as an unknown object, then as a vehicle, and finally as a bicycle whose path it could not predict. Just 1.3 seconds before impact, the self-driving system realised that emergency braking was needed. But the car's built-in emergency braking system had been disabled to prevent conflicts with the self-driving system; instead, a human safety operator in the vehicle was expected to brake when needed. The safety operator, who had been looking down at the self-driving system's display screen, failed to brake in time. Ms Herzberg was hit by the vehicle and subsequently died of her injuries.
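To get a feel for how little time 1.3 seconds is at road speed, here is a rough back-of-the-envelope check. The travel speed and braking deceleration below are assumptions chosen for illustration, not figures from the report; the 1.3-second and six-second intervals are the ones described above.

```python
# Stopping-distance arithmetic for the two intervals in the NTSB report.
# speed_mph and decel are ASSUMED values for illustration only.

speed_mph = 40.0        # assumed travel speed
decel = 7.0             # m/s^2: hard braking on dry asphalt (assumed)
t_window = 1.3          # s: "emergency braking needed" to impact (report)
t_detect = 6.0          # s: first radar/LIDAR detection to impact (report)

v = speed_mph * 0.44704          # ~17.9 m/s

stop_dist = v**2 / (2 * decel)   # ~22.8 m to stop completely
stop_time = v / decel            # ~2.6 s to stop completely

gap_window = v * t_window        # ~23.2 m to the impact point at 1.3 s out
gap_detect = v * t_detect        # ~107 m to the impact point at 6.0 s out

print(f"stopping needs {stop_dist:.1f} m over {stop_time:.1f} s")
print(f"margin if braking starts 1.3 s out: {gap_window - stop_dist:.1f} m")
print(f"margin if braking starts 6.0 s out: {gap_detect - stop_dist:.1f} m")
```

Under these assumptions, braking the instant the system called for it leaves a margin of well under a metre, which any actuation delay, wet pavement or slightly higher speed erases; braking when the sensors first detected her leaves more than eighty metres to spare. The distance between those two margins is the combined cost of the misclassification, the disabled emergency braking, and the inattentive safety operator.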
The cause of the accident therefore has many elements, but it is ultimately a system-design failure. When its perception module gets confused, an AV should slow down. But unexpected braking can cause problems of its own: confused AVs have in the past been rear-ended by human drivers after slowing suddenly. Hence the delegation of responsibility for braking to human safety drivers, who are there to catch the system when an accident seems imminent. In theory, adding a safety driver to supervise an imperfect system ensures that the system is safe overall, but that only works if they are paying attention to the road at all times.

Uber is now revisiting its procedures and has suspended all testing of its AVs; it is unclear when, or even if, it will be allowed to resume. Other AV-makers, having analysed video from the Tempe accident, say their systems would have braked to avoid a collision. In the long term, AVs promise to be much safer than ordinary cars, given that 94% of accidents are caused by driver error. But right now the onus is on Uber and the other AV-makers to reassure the public that they are doing everything they can to avoid accidents on the road to a safer future.

Source: NTSB preliminary report