A person lying on the ground in a crosswalk was likely never something the team considered including in their training data
I didn’t bother reading any further than this. The person was on the crosswalk when both cars started moving. Neither car should have been moving while anyone was still on the crosswalk.
That was the exact moment I called bullshit as well. You’d damn well better plan for people tripping and falling. It happens all the time, but generally is pretty minor if not exacerbated by being run over. This is like saying they didn’t train it on people holding canes or in wheelchairs.
It’s not about the ability to recognise someone lying in the road (although they obviously do need to be able to recognise something like that).
She was still walking, upright, on the crosswalk when both cars started moving. No car, driverless or otherwise, should be moving forward just because the lights changed.
That's the whole point of their comment. The car did not recognize that anyone was on the crosswalk because it was never trained to look for people lying in the crosswalk.
And that’s fine. But if it can’t recognize that there’s any object in the road at all, it’s not fit for purpose. The fact that the object was a person just makes it so much worse.
Agreed. I’m not defending Cruise at all. They should have humans in the car if they are testing. Or at least a drone-style driver sitting in a room watching a camera feed. I wonder if the car thought there was just a speed bump ahead. Some speed bumps are striped similarly to crosswalks. I can see situations where the autopilot can’t determine whether something is a speed bump or a genuine obstruction (either a false positive or a false negative).
They are 100% trained on bodies lying prone on the ground.
She was standing up when the cars started moving.