Anyone got any thoughts on this?
If autopilot was engaged this is very worrying.
If it wasn’t engaged, it’s driver error, as in lots of other accidents.
- Very sad what’s happened, but the basic, undeniable reality of the situation is that the driver is always responsible. The same as in a plane when autopilot is enabled: no pilot would get away with an incident by saying it was the AP’s fault. Nope, you’re responsible for the vehicle, end of. The drivers of these cars chose not to pay attention and are responsible for the lives of two people being lost. Tesla make it quite clear that it is beta software and you are required to pay attention and be ready to take over. This doesn’t make the system useless, it massively reduces fatigue, but you still have to be aware of what’s going on around you.
- I’ve had times in the past where I’ve not been confident it’s seen a vulnerable road user, i.e. a pedestrian or cyclist. In those situations I’ve taken control and it’s not been an issue. The car did its job - it tried its best to drive autonomously - and I did mine by supervising the situation and intervening as I saw fit. That said, since getting a newer model Tesla, I’ve had no such cases; in fact it is more confidence-inspiring now, and it clearly shows vulnerable road users and issues alerts for situations it deems unsafe (which you want). The takeaway is, it’s always improving, and maybe the hardware in newer vehicles helps improve detection performance.
- I’m on the fence regarding the computer-vision only decision.
- Sad discussion aside, it is a truly wonderful system and is improving all the time with software updates every month or two. It massively reduces fatigue, stress and I think on balance, it makes me a safer driver (I do quite a bit of long distance driving for work). I’ve had it detect incidents that I’ve not yet anticipated or been aware of, i.e. vehicles merging into me, someone slamming on their brakes two cars ahead, etc. It’s almost prescient at times.
I genuinely expect to be killed by someone using such a system.
The over-hyping of it being “autopilot” really doesn’t help in the consumer mind. The reliance on computer vision alone, without a secondary system to cross-check against (and disengaging if the two disagree), is a huge oversight in my opinion. It appears that the Tesla autopilot doesn’t fail safe.
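The kind of cross-check being argued for above could be sketched like this. To be clear, this is a hypothetical illustration of the fail-safe principle, not Tesla’s actual design; the sensor names, ranges and tolerance are all made up.

```python
# Hypothetical sketch of a two-sensor fail-safe: if the vision and radar
# range estimates disagree beyond a tolerance, the system hands control
# back to the driver rather than trusting either sensor alone.
# All names and thresholds are illustrative, not any real vendor's design.

def cross_check(vision_range_m: float, radar_range_m: float,
                tolerance_m: float = 5.0) -> str:
    """Return 'engaged' when the sensors agree, 'disengage' on conflict."""
    if abs(vision_range_m - radar_range_m) > tolerance_m:
        return "disengage"  # fail safe: alert the driver and hand back control
    return "engaged"

print(cross_check(42.0, 41.2))  # sensors agree -> engaged
print(cross_check(42.0, 12.0))  # large disagreement -> disengage
```

The point of the design is that a disagreement never gets silently resolved in favour of one sensor; it always escalates to the human.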
It seems like another excuse not to pay attention and drivers don’t need another one of those. The general perception is that the car drives itself.
Sooner or later kids will start jumping in front of cars for fun because they know they’ll stop. And the criminal element will probably do the same to rob people.
That’s our problem right there. You are obviously a responsible driver and do what you are supposed to do, good for you, but even with non-autonomous vehicles (normal cars) we have a pandemic of drivers who should be concentrating on the road but are instead spending their attention on YouTube and WhatsApp. Everyone knows this is wrong, but everyone (most) does it. Just filter a bike past any queue of cars to see.
Once AVs become common, this same breed of driver will not only watch YouTube but will actually doze off at the wheel.
I agree with the above. I passed a guy earlier who was watching football on his phone while driving through Wandsworth at rush hour. Giving people an autopilot option is clearly going to lead to them paying less (or no) attention.
I think autopilot should at the least come with some kind of technology inside the car that checks the driver is engaged and looking where they are going. It can’t be a technological challenge if they can make the car drive itself.
That’s my biggest concern. People paying no attention because they think they don’t have to any longer.
We’re the only species that protects its idiots. We’re heading towards becoming the only species that arms its idiots with a way to easily kill others, with few checks and balances.
It does. Newish models have a camera that checks where your eyes are looking. You don’t have to keep bug-eyed on the road, but if you look away for too long, it reminds you to pay attention. If you do not respond it warns you more and more, until eventually it disengages AP.
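That escalation (reminder, then stronger warnings, then disengagement) is essentially a small state machine keyed on how long the driver’s gaze has been off the road. A rough sketch, with entirely made-up thresholds since the real ones aren’t public:

```python
# Illustrative sketch of the escalating attention-monitor behaviour
# described above. The thresholds are invented for illustration only.

def attention_response(seconds_eyes_off_road: float) -> str:
    """Map time with eyes off the road to an escalating response."""
    if seconds_eyes_off_road < 3:
        return "ok"                    # brief glances are tolerated
    if seconds_eyes_off_road < 6:
        return "visual reminder"       # nag on the screen
    if seconds_eyes_off_road < 10:
        return "audible warning"       # louder, repeated alerts
    return "disengage autopilot"       # driver unresponsive: hand back control

print(attention_response(1.0))
print(attention_response(12.0))
```

The key property is that the system never jumps straight to disengagement; the driver gets graded chances to respond first.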
Sounds like they’ve based the design on the GF…
I wonder if the cars involved in the fatalities had that feature? If so, it doesn’t seem like it goes far enough.
From the video, it seemed that Tesla had a problem with the radar and the cameras not agreeing. In that situation, surely the fix is for the driver to step in, rather than for Tesla to get rid of the radar?
If a car barrelled into the back of a bike with 30 seconds to intervene then the driver clearly wasn’t paying attention.