Anyone got any thoughts on this?
If autopilot was engaged this is very worrying.
If it wasn’t engaged it’s driver error as in lots of other accidents.
- Very sad what’s happened, but the basic, undeniable reality of the situation is that the driver is always responsible. The same as in a plane when autopilot is enabled: no pilot would get away with an incident by saying it was the AP’s fault. Nope, you’re responsible for the vehicle, end of. The drivers of these cars chose not to pay attention and are responsible for the lives of two people being lost. Tesla make it quite clear that it is beta software and you are required to pay attention and be ready to take over. This doesn’t make the system useless; it massively reduces fatigue, but you still have to be aware of what’s going on around you.
- I’ve had times in the past where I’ve not been confident it’s seen a vulnerable road user, e.g. a pedestrian or cyclist. In those situations I’ve taken control and it’s not been an issue. The car did its job - it tried its best to drive autonomously - and I did mine by supervising the situation and intervening as I saw fit. That said, since getting a newer model Tesla, I’ve had no such cases. In fact it’s more confidence-inspiring now: it clearly shows vulnerable road users and issues alerts for situations it deems unsafe (which you want). The take-away is that it’s always improving, and maybe the hardware in newer vehicles helps improve detection performance.
- I’m on the fence regarding the computer-vision-only decision.
- Sad discussion aside, it is a truly wonderful system and is improving all the time, with software updates every month or two. It massively reduces fatigue and stress, and I think on balance it makes me a safer driver (I do quite a bit of long-distance driving for work). I’ve had it detect incidents that I’ve not yet anticipated or been aware of, e.g. vehicles merging into me, someone slamming on their brakes two cars ahead, etc. It’s almost prescient at times.
I genuinely expect to be killed by someone using such a system.
The over-hyping of it being “autopilot” really doesn’t help in the consumer mind. Relying on computer vision alone, with no secondary system to cross-check against (and disengaging if the two disagree), is a huge oversight in my opinion. It appears that Tesla’s Autopilot doesn’t fail safe.
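To make that fail-safe idea concrete, here’s a minimal sketch of a sensor cross-check of the kind described above. Everything here is hypothetical - the function names, the threshold, and the two sensor sources are invented for illustration, not taken from Tesla’s actual stack:

```python
# Illustrative only: compare two independent range estimates (say camera vs.
# radar) for the same object, and hand control back to the driver if they
# disagree by more than a tolerance. Threshold is an invented figure.

DISAGREEMENT_THRESHOLD_M = 2.0  # max tolerated gap between the two estimates

def cross_check(camera_range_m: float, radar_range_m: float) -> str:
    """Return 'engaged' if the sensors agree, 'disengage' (fail safe,
    alert the driver) if they do not."""
    if abs(camera_range_m - radar_range_m) > DISAGREEMENT_THRESHOLD_M:
        return "disengage"  # sensors disagree -> fail safe
    return "engaged"

print(cross_check(42.0, 41.3))  # estimates agree -> stay engaged
print(cross_check(42.0, 12.0))  # estimates disagree -> disengage
```

The point of the sketch is the shape of the logic, not the numbers: with only one sensing modality there is nothing to compare against, so this kind of check isn’t possible.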
It seems like another excuse not to pay attention and drivers don’t need another one of those. The general perception is that the car drives itself.
Sooner or later kids will start jumping in front of cars for fun because they know they’ll stop. And the criminal element will probably do the same to rob people.
That’s our problem right there. You are obviously a responsible driver and do what you are supposed to do, good for you, but even with non-autonomous vehicles (normal cars) we have an epidemic of drivers who should be concentrating on the road but are instead spending their attention on YouTube and WhatsApp. Everyone knows this is wrong, but everyone (well, most) does it. Just filter a bike past any queue of cars to see.
Once AVs become common, this same breed of driver will not only watch YouTube but will actually doze off at the wheel.
I agree with the above. I passed a guy earlier who was watching football on his phone while driving through Wandsworth at rush hour. Giving people an autopilot option is clearly going to lead to them paying less (or no) attention.
I think autopilot should at the least come with some kind of technology inside the car that checks the driver is engaged and looking where they are going. It can’t be a technological challenge if they can make the car drive itself.
That’s my biggest concern: people paying no attention because they think they don’t have to any longer.
We’re the only species that protects its idiots. We’re heading towards becoming the only species that arms its idiots with a way to easily kill others, with few checks and balances.
It does. Newish models have a camera that checks where your eyes are looking. You don’t have to keep bug-eyed on the road, but if you look away for too long, it reminds you to pay attention. If you do not respond it warns you more and more, until eventually it disengages AP.
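The escalation described in that post can be sketched as a simple tiered response. This is a toy illustration of the behaviour as described in the thread - the time thresholds and action names are invented, not Tesla’s actual parameters:

```python
# Hypothetical sketch of a gaze-based attention monitor: the longer the
# driver's eyes are off the road, the stronger the response, ending in
# Autopilot disengaging. All thresholds are made up for illustration.

def monitor_response(eyes_off_road_s: float) -> str:
    if eyes_off_road_s < 3:
        return "ok"                   # brief glances away are fine
    if eyes_off_road_s < 6:
        return "visual reminder"      # gentle on-screen nag
    if eyes_off_road_s < 10:
        return "audible warning"      # escalating alert
    return "disengage autopilot"      # driver unresponsive -> hand back control

for t in (1, 4, 8, 15):
    print(t, monitor_response(t))
```

A real system would track gaze continuously and reset the timer when attention returns; the sketch only shows the escalation ladder itself.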
Sounds like they’ve based the design on the GF…
I wonder if the cars involved in the fatalities had that feature? If so, it doesn’t seem like it goes far enough.
From the video, it seemed that Tesla had a problem with the radar and the cameras not agreeing. In that situation, surely the answer is for the driver to step in, rather than for Tesla to get rid of the radar?
If a car barrelled into the back of a bike with 30 seconds to intervene then the driver clearly wasn’t paying attention.
BBC News - Elon Musk’s Tesla recalls two million cars over Autopilot defect
To put that in perspective, Tesla has manufactured three million cars.
Two thirds of their total output to be recalled.
Over the air software update tho innit. Just another feature being deployed.
The feature is to nag people more to remind them they’re responsible for a car running a beta software feature (Autopilot).
It’s a great feature. I use it ALL THE TIME. It’s a huge safety boon the vast majority of the time, but it’s not perfect and you have to be ready to intervene once in a while. The same way a pilot of a plane may use autopilot, but they’re still responsible for overseeing things and ensuring everyone’s safety. Unfortunately some people are dumb and either ignore the risks or try to game the system.
I’d like to know whether the Teslas commuting into London are likely to be using autopilot or not. Teslas are by some margin the worst and least considerate drivers I personally come across. It is almost every one of them, despite Teslas still being a small proportion of the cars on the road.
Often I’ll test whether it’s the driver or autopilot by riding up to the front passenger door. If you did this to the driver of a non-autonomous car who is, for example, using their phone, they’d usually swerve away when you’re less than 1cm from their door, but the Teslas don’t budge.
I would have hoped sensors would be binging and bonging inside the car, but there’s never any reaction or acknowledgement to your presence. Anyone been inside a Tesla to verify what happens in this case?
When cycling around London I find Tesla drivers very inconsiderate. I doubt they are on autopilot in central London - although maybe they are!
A generalisation maybe, but it is true of every Tesla I’ve encountered, and it must be into the thousands now: the drivers appear to have decided amongst themselves that they are no longer responsible for driving the car, (mis?)placing that responsibility onto the car’s built-in safety features. I’ll let you know how I get on with testing their bugfix. I have tried and failed to make them swerve to avoid a side impact, but they don’t budge; maybe it weighed me up and decided I wouldn’t harm the vehicle’s structure on collision.
I have a Tesla. I use Autopilot everywhere.
From my experience of using it for tens of thousands of miles over the last six years, the things it’s GREAT at are:
- Dual carriageways
- Main roads
- Wide single track roads
Things it’s not so great at YET:
- Very twisty small roads
- Dealing with intrusions into the road, e.g. cyclists and cars over the line, or too far out from the kerb. It spots them (you can see all surrounding pedestrians and road users on the screen) and will stop to avoid hitting them, but it won’t yet gracefully move around them or give them a wide berth as a human would. I dare say that’s coming in a future update.
It’s software, it’s in development, and they’re always improving it, releasing updates over the air every month or so. I wouldn’t be without it now. It massively reduces driving fatigue and increases safety. There have been numerous times over the six years I’ve been using AP when it’s performed emergency manoeuvres to avoid things. I’d like to say I’d have seen and responded to them without its help, but I don’t know, and I’m okay with that.
I dare say self driving cars will not be programmed to accommodate bikers. That’s a human choice, i.e. moving over to allow filtering. We cannot expect that. On the up-side, they are more predictable than human drivers!
The recent change to the Highway Code requires 1.5m passing distance at <30mph and greater passing distance at >30mph.
Are you saying the programming does NOT follow the Highway Code?
I have certainly experienced my fair share of scary close-passes by Teslas.
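The passing-distance rule quoted above is simple enough to write down directly. A minimal sketch, assuming the figures as stated in the thread - the Highway Code specifies at least 1.5m when passing at up to 30mph and only says “more” above that, so the 2.0m figure for faster passes is an assumption here, not from the Code:

```python
# Sketch of the passing-distance rule as quoted in the thread:
# at least 1.5 m when passing a cyclist at up to 30 mph, more above that.
# The 2.0 m value for higher speeds is an assumed placeholder.

def min_passing_distance_m(speed_mph: float) -> float:
    return 1.5 if speed_mph <= 30 else 2.0

print(min_passing_distance_m(20))  # 1.5
print(min_passing_distance_m(50))  # 2.0
```

Whether any manufacturer’s planner actually encodes this rule is exactly the open question being asked above.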