As an experienced developer, would you ever trust a self-driving car?
Posted by DeepBlueWanderer@reddit | ExperiencedDevs | 62 comments
Personally, whether it's AI or something programmed by developers, based on my experience with both, I would never trust it over myself to drive the car autonomously.
AI cannot be trusted. Simple as that; I would never let it make decisions as important as driving a car with me and my family inside.
And no matter the technology I've used, developed by software engineers, there are always bugs, and those are in far more contained scenarios. Imagine all the edge cases and situations that can happen in real life while driving a car. No way I would ever trust software to handle that for me; I have seen way too much bad development in my life to ever trust it over myself. I may not be the best driver in the world, but I still trust myself more than what other people may predict.
CNNSOS@reddit
I disagree completely. Fully autonomous systems controlling every car on the road is the future. Now, I would prefer to drive most of the time because I like driving, but autonomous cars have the potential to be much safer and quicker.
All cars can communicate with each other, traffic lights won’t be necessary, they won’t get distracted, won’t get drunk etc.
If a child runs in front of it, then I would trust sensors and code to react much faster than any human could. I already feel comfortable having autopilot do most of the work on planes, and I feel pretty confident that cars will be the same in the future.
Equivalent-You-5375@reddit
Never gonna happen but nice in theory
CNNSOS@reddit
Self driving is absolutely gonna happen
opx22@reddit
Planes have human pilots onboard who can take over if needed/handle takeoff and landing
rogorak@reddit
I think you're right, but car-to-car communication may take a while because of competing standards/approaches. Once there is a ubiquitous standard system, assuming it's hard to hack, not only will it be safer, but traffic will also diminish, because human drivers do a lot of things that exacerbate traffic.
No-Economics-8239@reddit
All this. Plus, if it ever makes a mistake, people will collectively lose their minds and get out their torches and Frankenstein rakes to burn the technological witch. And if the company survives the rabble mob, the cars will all be patched and/or upgraded so that particular problem doesn't happen again. It will continually and incrementally improve.
The system doesn't have to be perfect. It just has to be better than us. We're at around forty thousand road fatalities a year in the US. I don't have to trust the car or the company. I just need to trust the statistics.
Designer_Holiday3284@reddit
As an experienced adult, I don't trust humans to drive cars
Ch3t@reddit
Would it be safer if we coupled these self-driving cars together in a chain, or something that sounds like "chain"? The lead car would be the only one with an engine to provide locomotion. Then we could place them all on rails. A road of rails, if you will. No, that's way too sci-fi to ever take off.
sfbay_swe@reddit
I use Waymos all the time in SF. They’ve gotten good enough that I absolutely feel safer in a Waymo than in an Uber/Lyft.
Parents here are sending their kids to school alone in Waymos because they trust them more than other options.
Ch3t@reddit
I was in SF last October. I had just left the airport and was stopped at a traffic light. A Waymo pulled up beside me. The rear passenger-side window was down. There was a dog sitting on the back seat. I didn't see any people in the car.
etcre@reddit
Surprised to hear this. Neat.
double_en10dre@reddit
I also love that they actually let me cross the street. As a pedestrian I feel much safer around Waymos than humans
sfbay_swe@reddit
I also love how you can wave them forward if you changed your mind about crossing or want them to go first
Any-Ring6621@reddit
Came here to say this. My first ten minutes in a Waymo were like "holy shit". After that, I'd take them over an Uber every time.
MissinqLink@reddit
You could say they trust them Waymore
IMovedYourCheese@reddit
I'd trust a Waymo. Would absolutely not trust a Tesla.
0Iceman228@reddit
I trust automation that is proven to work, like automated trains and planes. Cars currently have way too many variables to actually work the way everybody wants them to. The other problem is the approach to solving traffic issues: public transport is the only good solution for traffic, not having only automated cars.
cran@reddit
Yes and no. They are clearly safer than humans overall. But an attentive, sane driver’s brain is capable of processing many more situations than current models. As a whole, yes. Versus a known good human driver? No.
zero2g@reddit
As someone who worked in the AV space for 5+ years, I would only trust Waymo, and not because of their technology but rather their operations.
I firmly believe that, purely on the tech side, the end-to-end machine learning approach Tesla is touting is not there yet and might not even be possible. What makes Waymo different is their superb end-to-end operations: developing a safe foundational baseline (thousands of modules handling rule-based behavior and planning) as well as smooth teleop. I don't know how good their machine learning is by itself, but operationally they definitely overfit for each city they decide to deploy in (road rules, driving behavior, context mapping, etc.).
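To make the "rule-based baseline wrapped around a learned component" idea concrete, here's a minimal Python sketch. Everything in it (the Plan type, the rules, the thresholds) is hypothetical and only illustrates the layering described above, not Waymo's actual stack.

```python
from dataclasses import dataclass

@dataclass
class Plan:
    speed_mps: float        # proposed speed in metres per second
    min_clearance_m: float  # closest predicted distance to any tracked object

def speed_limit_rule(plan: Plan, limit_mps: float) -> bool:
    return plan.speed_mps <= limit_mps

def clearance_rule(plan: Plan, min_allowed_m: float = 1.5) -> bool:
    return plan.min_clearance_m >= min_allowed_m

def safe_fallback() -> Plan:
    # Conservative default: crawl speed, no obstacle assumed nearby.
    return Plan(speed_mps=2.0, min_clearance_m=float("inf"))

def gate(learned_plan: Plan, limit_mps: float) -> Plan:
    """Accept the learned planner's proposal only if every hard rule passes."""
    checks = [
        speed_limit_rule(learned_plan, limit_mps),
        clearance_rule(learned_plan),
    ]
    return learned_plan if all(checks) else safe_fallback()

# A learned planner proposing 15 m/s in a 13 m/s zone gets overridden.
print(gate(Plan(speed_mps=15.0, min_clearance_m=3.0), limit_mps=13.0))
```

The point is just that the learned component proposes and the hand-written rules dispose.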
ObeseBumblebee@reddit
I would be happy to trust a self driving car.
... On a perfectly sunny day, on a well-traveled route, with the CEO of the company and his daughter riding with me, in the middle of a Tuesday afternoon while everyone is at work.
opx22@reddit
🤣🤣🤣
kondorb@reddit
Trust? Yes. Definitely more than a random Uber driver who can’t be trusted to wipe his ass straight, even when well rested, let alone after a 40-hour shift.
Enjoy? Nope. Give me my vintage Mazda any day of the week.
bravopapa99@reddit
No, never.
Possibly-Functional@reddit
I think the definition of trust here matters, because I don't trust any driver human or autonomous to be infallible.
When it comes to my decision, it's all about data for me. If it's proven to be safer than a human driver, I would choose the autonomous driver. Perceived safety matters less to me than actual safety.
As driving controllers, humans are honestly pretty shit. It's not a matter of skill, though that plays a part as well; we just don't have the sustained focus, reaction time, or complete awareness of surroundings that would be preferred. So it probably won't be difficult to beat human drivers on safety, because the bar is so low.
eloel-@reddit
Absolutely. As much as I don't trust AI, I trust random humans a lot less.
Experts in their field are better than AI, but 99% of drivers on the roads are not experts in driving. Most of them are, for lack of a better word, morons, at least when it comes to driving.
JamesWjRose@reddit
Yes, because the inverse of this is to trust ALL humans, and they are the problem, not AI driving.
monkey_work@reddit
I also trust my girlfriend to drive our car. A self driving one can't drive much worse.
TheInquisitiveLayman@reddit
I would trust them, yes. The math is on their side even considering unpredictability (if not now, further in the future)
Fyren-1131@reddit
No, not at all.
A car is a persistent life-or-death situation, and as a developer I know just how many oversights happen. Cocky software development practices in control of my life is not something I'm comfortable with.
Kindly_Climate4567@reddit
Development processes in automotive are much more rigorous than for regular software.
the300bros@reddit
Are we talking about one of those 500 pound clown cars or a Mad Max style six wheeled truck with solid steel scoop on the front, roll cage and tires that can climb over other cars? Makes a diff
Sensitive-Ear-3896@reddit
Yes, because my brain has bugs too
rincewinds_dad_bod@reddit
I trust banks and airplanes all the time.
tybit@reddit
Why would you trust your gut on how safe software is when you can look at the data and make a rational decision?
It's crazy to me how many people here are discussing their feelings on this subject when there's plenty of data to evaluate how safe each company's self-driving tech is compared to human drivers.
It's a prime example of engineer's syndrome.
DeepBlueWanderer@reddit (OP)
I'm talking about my personal experience, not gut feeling, and about letting it drive instead of myself. I'm not talking about the general population or anything like that. It's highly possible that roads would be safer if we swapped human drivers for machines.
08148694@reddit
I would trust a good AI over the average driver
It can never be 100% trustworthy, but it will never get tired or distracted and it will have far better spatial awareness than any pair of eyes
youassassin@reddit
Yes, the problem with self-driving cars is other human drivers.
funbike@reddit
tl;dr: only if it's a dedicated sub-system under formal methods, and with liability placed onto the software company.
Throughout the history of computer science, there has been tons of research on formal methods to make verifiably correct software, yet many industries that need that level of assurance don't use any of it. I've worked with the power grid, and was shocked at the low quality of code running critical systems.
To answer your question, "no", but if a unicorn high-quality system was created, I might answer "yes".
I want to see multiple systems. The system the human deals with doesn't do the actual driving. Whatever is doing the actual driving should be one or two dedicated systems with lots of failsafes. There should be a driver agent and a monitoring agent (like a driver's-ed teacher), separately developed. Perhaps the driver could be neural-based AI, but the monitoring agent should be conventional heuristic programming watching for mistakes and hazards. Formal methods should be used to ensure the software is correct. There should also be tons of simulated driving tests, using physics engines and video footage.
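A very rough sketch of that driver-agent/monitor-agent split, with hypothetical interfaces and made-up thresholds (this illustrates the idea, it is not from any real AV codebase):

```python
from dataclasses import dataclass

@dataclass
class Command:
    steering: float  # radians, positive = left
    throttle: float  # 0..1
    brake: float     # 0..1

class DriverAgent:
    """Stand-in for the (possibly neural) driving policy."""
    def decide(self, obstacle_distance_m: float) -> Command:
        # Toy policy: cruise unless something is close, then brake.
        if obstacle_distance_m < 10.0:
            return Command(steering=0.0, throttle=0.0, brake=0.8)
        return Command(steering=0.0, throttle=0.3, brake=0.0)

class MonitorAgent:
    """Separately developed heuristic watchdog, like a driver's-ed teacher."""
    def veto(self, cmd: Command, obstacle_distance_m: float) -> bool:
        accelerating_toward_hazard = obstacle_distance_m < 5.0 and cmd.throttle > 0.0
        conflicting_pedals = cmd.throttle > 0.0 and cmd.brake > 0.0
        return accelerating_toward_hazard or conflicting_pedals

def control_step(driver: DriverAgent, monitor: MonitorAgent,
                 obstacle_distance_m: float) -> Command:
    cmd = driver.decide(obstacle_distance_m)
    if monitor.veto(cmd, obstacle_distance_m):
        # Failsafe: full brake, no steering input.
        return Command(steering=0.0, throttle=0.0, brake=1.0)
    return cmd

print(control_step(DriverAgent(), MonitorAgent(), obstacle_distance_m=4.0))
```

The key property is that the monitor stays simple enough to reason about (and potentially verify with formal methods), independent of however the driver agent was built.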
Require software development to be out in the open, or at least the QA part of it (i.e. automated testing, formal methods used, architecture).
I think insurance companies should be involved. Require companies developing this technology to get liability insurance from the same companies that sell auto insurance. If an accident occurs due to a software issue, the software company is held liable (not the human driver). We can let premiums help self-regulate the industry. The insurance companies should have the right to commission code audits (with appropriate NDAs).
farox@reddit
Not without LIDAR. Relying only on optical sensors is just not safe enough, IMO.
But you don't seem to want an actual answer
WeHaveTheMeeps@reddit
I’ve been in a self-driving car once. While the trip was rather uneventful, the car did take me unexpectedly into the wrong lane.
FWIW I’m a pilot and we’ve had self-flying planes for decades. The automation is rather trustworthy, but it’s always good to have a human in the loop for when things inevitably go wrong.
(Driving is a much harder problem than flying and there’s stuff that goes wrong with autopilot all the time)
ljsv8@reddit
We all have limitations. "Experienced dev" doesn't mean anything here because this field evolves too fast. Per your reasoning, since humans make mistakes all the time while driving, would you never take an Uber or Lyft?
elperroborrachotoo@reddit
Trust? No. Use? yes.
Sometimes it's just easier to be wrong with everyone else, rather than being right on your own.
siqniz@reddit
They probably used builder.ai for most of it
ceirbus@reddit
Humans suck at driving; if a computer made as many mistakes, it would be terrifying. Driving is the most dangerous thing I do. I would forgo my own ability to drive if it meant taking it away from the least skilled 10%.
patient-palanquin@reddit
Waymo yes, Tesla no.
SkullLeader@reddit
I would, IF all vehicle traffic were self-driving cars with some sort of centralized control system. With self-driving cars sharing the road with human-driven cars, I will never trust it.
Tough-Leader-6040@reddit
Yes, I have... and it drives better and more safely than me 😂
ButchersBoy@reddit
I was thinking this just the other day. No.
Wooden-Glove-2384@reddit
fuck no
gdinProgramator@reddit
I do not trust 99% of human drivers.
That car has gone through rigorous testing. If a company that knows it will get sued into the ground for the slightest mistake wants to put it on the street, I trust it.
Former_Dark_4793@reddit
Autopilot in my Tesla has been pretty good so far for long drives. I have been trusting it; it takes the stress out of long driving.
DisjointedHuntsville@reddit
Can you finish that sentence? “Trust a self driving car . . . Over the alternative of having a potentially tired or sub par human driver behind the wheel”
In a situation where split second decisions are needed to avoid a potentially fatal crash, it has been proven over and over again that even Formula One drivers don’t stand a chance against a decent autonomous system.
Yes, it’s always good to be skeptical and not take things at face value, but this seems to be a bit alarmist to say you don’t trust AI at all.
Antique-Stand-4920@reddit
Nope.
Both humans and automated drivers can make mistakes, but with a human I at least know that they have the same interest in getting to the destination in one piece as I do.
Thomase-dev@reddit
Fair take,
I’ve lived in SF for 22 years, and I’ve seen how much testing these things went through. For years I saw these cars driving around under human supervision.
The testing is extremely rigorous, and that makes me trust them more.
Just like with software that I build, I sleep way better at night knowing I’ve got solid unit + integration + e2e tests passing.
I’ve taken to using them in SF. Overall a great experience, and I am not worried.
Although, driving in SF is pretty easy compared to something like NYC.
Temperance_Lee@reddit
lol no, because I used to work at a company writing the software that goes in them. I saw how the sausage was made. I quit. It was crazy.
thecodingart@reddit
As an experienced developer who’s worked closely with self-driving teams at multiple OEMs:
Lvl 2&3 autonomy - sure
Lvl 4&5 - hell no
And Tesla software is hands down untrustworthy
Izacus@reddit
After trying a Waymo and comparing it to Bay Area Uber drivers (which regularly try to kill me)... self-driving car every. single. time. And it's not close.
muntaxitome@reddit
As an experienced human, would you ever trust a human-driven car? Honestly, for the average trip, if I had to pick between a Waymo and a random Uber driver, I'd expect the Waymo to be safer.
You deal with programmed safety systems that work well enough to keep you safe every single day without realizing it.
lurkin_arounnd@reddit
not until it’s significantly more mature than it is today, no
kaisean@reddit
I would trust the technology itself to a certain degree. I'd prefer to be able to sit in the driver seat and turn the self-driving off.
I don't trust the legal and insurance system to have my back when the self-driving inevitably falters and leaves me hanging
AppointmentDry9660@reddit
At a baseline, I generally trust humans even less to drive reasonably :)
Still don't want to be picked up in a Waymo
I'm moving somewhere where I don't need to commute with a vehicle
tetryds@reddit
As an SDET I don't trust the overwhelming majority of systems I am forced to interact with daily. At least cars have some sort of regulation.