Totally Off Topic - Comma 3 Open Source Driving System Superior to Tesla

Saying there’s no official support doesn’t necessarily mean there’s no support at all.

I’m not familiar with this project, but popular open source projects typically provide a lot of support from the developers, contributors, and community members even though there are no overt promises or guarantees. They’re probably saying there’s no “official” support for CYA legal reasons, especially if they’re based in the US.

It should also be noted that the “official” support you get from commercial, closed source software isn’t necessarily good or responsive. Often it’s farmed out to low-paid customer service reps with minimal technical training and skills.

I’d be more inclined to agree if it weren’t for the fact that:

The government asked the company for information regarding testing and safety protocols, and in response they scrapped the whole project - then reinvented it in such a way as to skirt regulation. That’s where the “open source” and “no official support” part comes in.

Given that, I’m highly suspicious. Their official legal position - of necessity - is that they’re not responsible for anything done by the software that they provide.

I think the response by comma.ai was pretty rational. At the time I believe they had under 10 employees. Responding to the NHTSA inquiry probably would have taken multiple engineers years of full-time work to get through approval - and in the interim the company would have been on hold.

In truth, the pathway NHTSA attempted to initiate would have been more appropriate for a company proposing permanent modification of a car - which does not apply to comma.ai.

The response comma.ai took - which was to adopt a design philosophy based on an “equivalent safety” design - is not at all unusual. Big companies - like Apple - do the same thing. Apple did not take its ECG detection or pulse oximetry through full FDA approval as formal medical devices; it instead used “workaround” rules establishing that they pose no additional harm beyond existing devices in the industry.

The same approach is used in legally marketing all sorts of everyday products - including band-aids, wheelchairs, walkers, pulse oximeters, tampons, condoms, and many other items.

The puzzling question I have is - what specific harm do you fear from OpenPilot? The CAN bus, and third-party devices connecting to it through a car’s OBD port, have been around for quite some time. The auto industry has long had design principles which limit what devices connected to an OBD port can do.
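
To make that concrete, here is a minimal sketch - assuming a Linux SocketCAN interface and the python-can library, with the channel name made up for illustration - of the kind of passive bus access a third-party device gets through an OBD port:

```python
# A hedged sketch of passively reading CAN frames, e.g. via an OBD-II adapter.
# Assumes Linux SocketCAN with an interface named "can0" and python-can installed.
import can

def watch_bus(channel: str = "can0") -> None:
    bus = can.interface.Bus(channel=channel, interface="socketcan")
    try:
        while True:
            msg = bus.recv(timeout=1.0)  # returns None if nothing arrives in time
            if msg is not None:
                print(f"id=0x{msg.arbitration_id:03X}  data={msg.data.hex()}")
    finally:
        bus.shutdown()

if __name__ == "__main__":
    watch_bus()
```

Reading like this is entirely passive; what a device is allowed to *send* is the part those industry design principles constrain.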

So while it is true that a tweak to OpenPilot might cause some feature to not work, there is no chance that the driver is going to lose control of the car or that OpenPilot will execute steering or acceleration beyond safe limits before the driver resumes control.
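
To illustrate the kind of mechanism behind that claim - a simplified sketch in the spirit of comma.ai’s panda safety firmware, not its actual code, and with hypothetical constants - the safety layer between the software and the car’s actuators can clamp every steering command to a fixed envelope:

```python
# Hedged sketch of an actuator safety envelope. The constants are hypothetical;
# the point is that limits are enforced below the driving software itself.
MAX_TORQUE = 150   # absolute steering-torque command limit (arbitrary units)
MAX_DELTA = 5      # maximum change allowed per control tick

def clamp_torque(requested: int, last_sent: int) -> int:
    """Bound both the magnitude and rate of change of a steering command."""
    # Rate limit relative to the last command actually sent to the car ...
    limited = max(min(requested, last_sent + MAX_DELTA), last_sent - MAX_DELTA)
    # ... then an absolute cap, no matter what upstream software requested.
    return max(min(limited, MAX_TORQUE), -MAX_TORQUE)
```

Within an envelope like that, a buggy tweak upstream can at worst produce a weak, slowly changing input that the driver can easily overpower.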

What sort of scenario concerns you that could occur with OpenPilot but would not have a similar risk with Tesla Autopilot/FSD?

I’ll never understand why the USA doesn’t regulate driving more. I don’t have any real/modern numbers ready, but I think the state of Texas alone had 100 times more traffic deaths a year than my entire country (Norway) the last time I was there.
Why don’t you ask more of your government? Why don’t you ask for the protections other countries give you?
Cars are INCREDIBLY dangerous, and both the people who drive them and the people who make them need to be tightly controlled.

And no, I don’t want to discuss weapons laws.

No argument here. In March my mom was killed by a driver who went through a red light at 90 MPH while on meth. The month before, his mother had brought him to the police station and begged police to arrest him. His list of prior offenses was so long it boggles the mind why he was not in jail, let alone why he had a valid driver’s license.

But that said - the issues there are our drivers - not the safety of our cars or our driving software.

I’m sorry for your loss.
But I include drivers in the problem. It’s all bad.

You can’t ever have been responsible for debugging and fixing real production software AND still make such an absolute statement about what a software system will or won’t do.

OK - fair point. I did not state that well.

There is no reason to believe the risks of that sort of adverse event are any greater in OpenPilot than in Tesla Autopilot/FSD or in Ford Copilot 360 Plus.

In proprietary software, the risk of an Easter egg or a latent zero-day bug is higher than in open source, because fewer eyes are looking at the code, especially post-release.

The question remains - what specific sort of scenario do you fear may happen while using OpenPilot?

Related question - do you think OBD ports should be removed from cars entirely?

The fact that there are risks, whether in OpenPilot or in Tesla’s system, is sufficient reason for me not to use such things.

There are 1,000 or more software risks in any modern car.

Do you drive a 1975 Chevy Impala to avoid those risks?

Remove OBD ports? Now you are being ridiculous to make a point.

What I want is dangerous drivers removed from the roads and alert drivers at the wheel. I’m shocked that we seem to have accepted the premise that so-called self-driving cars are acceptable and safe. Just as I am shocked when I hear that drivers with multiple convictions for driving under the influence are still on the roads.

And yet that premise achieves exactly what you want!

Same for ADAS - safer across the board.

I could not agree more!

Tesla markets its product as “Full Self-Driving” with a small footnote saying otherwise. The result is that we have YouTube videos of stupid-driver-tricks where “drivers” sleep or read books while on the road.

The OpenPilot project is adamant in many ways that its software is only a “driver assist” system. They aggressively discourage any suggestion that their software can or should be used autonomously. Not only do they have a camera to confirm the driver is alert and looking at the road, but in situations the software identifies as beyond its capability (such as a tight turn) it also alerts the driver to take control of steering.
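
As a rough illustration of that behavior - a sketch under stated assumptions, not openpilot’s actual code; the thresholds and inputs are hypothetical - the control loop can combine the driver-monitoring signal with a capability check and escalate to a take-over alert when either fails:

```python
# Hedged sketch of a driver-assist alerting policy. The thresholds and the
# inputs (eyes-off-road time, planned lateral acceleration) are hypothetical.
from typing import Optional

MAX_LATERAL_ACCEL = 3.0   # m/s^2 the lateral controller is trusted to handle
DISTRACTION_LIMIT = 2.0   # seconds of eyes-off-road before an alert

def required_alert(eyes_off_road_s: float,
                   planned_lateral_accel: float) -> Optional[str]:
    """Return the alert the system should raise, or None if all is well."""
    if planned_lateral_accel > MAX_LATERAL_ACCEL:
        # The upcoming curve exceeds what the system will attempt.
        return "TAKE CONTROL: turn exceeds system capability"
    if eyes_off_road_s > DISTRACTION_LIMIT:
        # The driver-monitoring camera says the driver is not watching the road.
        return "PAY ATTENTION: eyes off road"
    return None
```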

Here is an example of a feature that can be implemented with one version of OpenPilot. Many cars currently have “lane centering” that works with adaptive cruise. However, if you turn the steering wheel too much or hit the brake, the lane centering turns off. In practice that makes it somewhat impractical to use on many roads.

What I can do with one of the versions of OpenPilot is turn on lane centering independent of adaptive cruise. And I can set it so that I can easily overpower the software’s lane centering at any time, and it will resume when I let go.

So it’s basically “collaborative steering” - the car assists with steering when I let go or reduce my grip, but it’s 100% mine as soon as I intervene. That makes it easy for me to monitor the system on either a curving road or a long straightaway; in either case I am expected to, and able to, monitor continuously and correct the software instantly at any time.
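
A sketch of that override behavior - assuming a measured steering-wheel torque signal, with a made-up threshold - looks roughly like this:

```python
# Hedged sketch of "collaborative steering": the driver's input always wins,
# and assistance resumes as soon as they relax. The threshold is hypothetical.
DRIVER_OVERRIDE_NM = 1.5  # wheel torque (N*m) that counts as the driver steering

def assist_torque(lane_center_cmd: float, driver_torque_nm: float) -> float:
    """Pick the assist torque to apply on this control tick."""
    if abs(driver_torque_nm) > DRIVER_OVERRIDE_NM:
        return 0.0            # driver is steering: back off completely
    return lane_center_cmd    # driver relaxed: resume lane centering
```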

This seems like a win-win to me as long as I understand the above design and remain alert/involved 100% of the time.

Frankly it’s not all that much different from flying with an airplane autopilot; the autopilot enables me to monitor the big picture of safety and reduces the moment-to-moment physical workload, but I need to be in the loop and ready to take over at any time.

Waymo is interesting - and I am more than glad to discuss it.

But it’s important to realize Waymo is indeed “driverless” whereas comma.ai/OpenPilot/Tesla are simply “driver assist” systems that still require a full-time alert driver.

Not ridiculous. My point is that 3rd party access to OBD ports has long been accepted. That’s all OpenPilot does. Why do you object to that?

See my amended post - I added a link to stats for ADAS systems similar to the Waymo stats.

Thank you - and the statistics are impressive.

Moreover, Consumer Reports compared OpenPilot against all of the manufacturer-installed ADAS systems and concluded that OpenPilot/comma.ai was #1 overall:

https://data.consumerreports.org/wp-content/uploads/2020/11/consumer-reports-active-driving-assistance-systems-november-16-2020.pdf

Maybe. That will be a good thing, if true. The Verge takes pains to delve into some of the statistical acrobatics that underlie the pleasant headline.

For example, Waymo’s vehicles operate in geofenced areas in the three cities where they drive, which excludes highways. Human drivers don’t avoid these types of roads.

It is an inarguable fact that there are far fewer AVs on the road than human-driven vehicles …

Sure. OK, how about Tesla’s ‘FSD’ stats?

We could improve the human driving stats by taking away licenses and actually removing careless, negligent, and under-the-influence drivers from the roads.

It is bad enough when a human being runs over and kills a bicyclist walking their bike across the street or strikes and drags a pedestrian. That human, at least theoretically, could be held accountable.

Self-driving vehicles killing human beings in the name of convenience, lower costs, and making a buck is unconscionable.
