I didn’t intend to become a clearinghouse for self-driving car and taxi updates, but I suppose it shows that my attention span is improving with all the recent reading and writing, so I’ll choose to be optimistic about that and plow ahead.
I am still not optimistic about the future of self-driving cars, and it seems that a growing sense of alarm might be more appropriate.
While Elon Musk has been warned by Tesla’s technical and legal people not to claim that self-driving is here, it seems likely that a few Tesla owners have followed his nods and winks rather than the caveats that his handlers have added. The following article notes that some of the bad behaviors tracked by the regulators have been addressed by a recent software update.
There is so much about the wording of this article that raises alarms for me.
The software being updated is known as “Full Self-Driving” (FSD), an aspirational title, but one that now at least gets enclosing quotation marks.
I assume that Tesla had to show the regulators that the specific issues described are fixed by the software update, so that’s good. But Elon carps about having to call this a safety recall, since it does not require bringing the car into a dealer for a fix.
“The word ‘recall’ for an over-the-air software update is anachronistic and just flat wrong!” – Elon Musk
In fact, that saves Tesla money, so I’m sure he’s happy about that. What he’s undoubtedly not happy about is the very visible shaming of software that does not do what he claimed. And as the article notes, there is a death toll associated with this ongoing saga, so the alarms set off by the term “recall” are entirely appropriate.
Moving on to self-driving taxi trials, a friend forwarded me a link to the following article, and also pointed out the most alarming aspect of what it says, so I am posting this for Bob D:
Especially note the following couple of paragraphs from the article:
All companies testing their vehicles on public roads in the state of California are required to report every time their system disengages or whenever a human driver has to take over for the autonomous system while driving, usually due to safety concerns or software issues.
Zoox doesn’t even refer to these incidents as disengagements, but rather as cases where the vehicle needs support or guidance, so does not report them to the state.
It looks like the regulators hurt some feelings here too, with the term “disengagement”, but I am pretty sure that they would want to know whenever a vehicle needs support or guidance. I hope that they are reading the same news links that I am and will send some clarifying instructions to Zoox and the rest.