Xiaomi Car with Driver Assistance Crashes, Three Reported Dead (bloomberg.com)

32 points by xqcgrek2 a day ago

diggan a day ago

> Local media reported that three people died in the incident that’s likely to spark scrutiny over the smart driving software deployed in many of today’s cars.

How come these things are "scrutinized" afterwards instead of before? Governments are severely dragging their feet and should clearly require certification of these "smart" systems before they're deployed, not afterwards.

Bananas how we're letting these companies use public roads with real humans as testing grounds for driver assistance software.

bayarearefugee a day ago

Even ignoring the FSD aspects of these electric vehicles (which are problematic enough at their current level of technology), the fact that they are allowed to do things like hide the mechanical release handles (required to get out if power is lost, which is very likely in a crash) in weird, barely accessible spots inside the door well seems absolutely fucking bonkers to me.

And this is common; some of Tesla's releases are actually harder to locate than the ones in this Xiaomi.

qzw a day ago

Yes that just boggles my mind. Why is there a desire for such designs on a fundamental and safety-critical part of a vehicle? Aesthetic minimalism? But why?! A car door is not a phone screen. There’s plenty of real estate on a car door and it has no other function. It doesn’t need a hamburger menu. Just put a handle there, like we’ve been doing forever. Why does that need to be changed? A decision to hide the handles on a car makes me question and doubt every other design decision that went into the thing, because the designers have obviously lost their minds.

xnx a day ago

dkjaudyeqooe a day ago

Or just ban non-mechanical interior releases? What's the justification for electronic releases?

My cheap car has electronic locks on all doors, but the mechanical interior releases defeat them (except on the rear doors).

A neat feature I noticed the other day is that operating the release once won't actually open the door; it only partially unlatches it, and to open the door you have to hold the lever and push. Great safety design.

toast0 a day ago

gruez a day ago

qzw a day ago

AlotOfReading a day ago

There are two main systems in use globally:

1. The US system, where regulators define a set of tests and manufacturers self-certify compliance.

2. Homologation-based systems, like China's and the EU's, where those tests are performed by an accredited third party to receive type approval.

But testing is ultimately reactive, not proactive. It's hard to write appropriate tests for areas of new technology that don't impose unnecessary or silly constraints. These aren't the kinds of regulations that are easy to put out or roll back either. It takes upwards of a year in the US to go through the NPRM process, and several years for gradual rollout into force.

As someone involved in this space, I'm also extremely skeptical that there's any way to develop these kinds of systems without significant testing on public roads. That testing should be monitored and come with explicitly strict development guidelines, but alternative approaches have been tried many times and have not panned out when deployed in practice.

maxdo a day ago

1 incident; how many drunk human drivers, or drivers watching their phones, are on the road?

These systems are able to solve this problem once and for all: ban humans from driving. Humans are extremely bad at driving.

Unless they produce more fatal incidents than humans, it's OK to have such crashes.

giancarlostoro a day ago

I agree. Where I live in Florida, sometimes I look at the road and there are easily a half dozen Teslas, and I rarely see them in accidents; the few owners I know personally who have been in accidents were hit by other drivers. You can't make it perfect, but you can definitely get insanely safer. Airplanes are kind of the same, but we don't freak out over them.

qzw a day ago

gruez a day ago

>How come these things are "scrutinized" afterwards instead of before? Governments are severely dragging their feet and should clearly require certification of these "smart" systems before they're deployed, not afterwards.

What makes you think this wasn't scrutinized by Chinese government before release?

diggan a day ago

> Local media reported that three people died in the incident that’s likely to spark scrutiny over the smart driving software deployed in many of today’s cars.

This part makes me believe it wasn't scrutinized before, because otherwise the wording would have been different. Maybe something like "further scrutinized" or something else that indicates that it's actually been inspected and approved before.

gruez a day ago

nirui a day ago

> Bananas how we're letting these companies use public roads with real humans as testing grounds for driver assistance software.

Do you know the word "cronyism"?

Xi was in a meeting with a few Chinese tech leaders just a few weeks ago (https://www.reuters.com/world/china/chinas-xi-attends-sympos...), including Xiaomi's leader, probably trying to establish bi-directional support and connections.

Given the current desperate climate in China (unemployment is creeping up, marriage and childbirth rates are down...), as well as the fact that Xi likes flashy and fancy stuff, I don't think there is any incentive for the government to put limits on how those companies develop. I mean... it's not like the leftist undemocratic communists have any incentive to do things appropriately for people of different orientations anyway.

In fact, right now in China, criticizing such "high-tech" companies may lead to serious trouble for the critic, given how easy it is for powerful companies to send critics to prison for reasons such as "disturbing public peace", "hacking", and/or "spreading rumors and lies".

Also, let's not forget Teslas also crash and burn. So it is really tricky to explain to the communists why they must do better.

garrettjoecox a day ago

Are there any useful statistics yet regarding accidents/deaths per million miles driven in “self driving” vehicles?

It always comes off as click/rage bait to me when people report on these deaths when there are literally hundreds per day that don’t involve an autonomous vehicle.

xnx a day ago

Yes. Waymo is safer than other drivers and makes the roads safer for everyone. https://waymo.com/blog/2024/09/safety-data-hub

No other company is even close (i.e. 5-10 years behind) to where Waymo is on self driving maturity.

lm28469 a day ago

> No other company is even close (i.e. 5-10 years behind) to where Waymo is on self driving maturity.

Not too hard when you stay inside like three or four cities with good weather, straight roads, &c.

jwagenet a day ago

xnx a day ago

belter a day ago

"Lies, Damn Lies, and Waymo Statistics" - https://ojoyoshidareport.com/lies-damn-lies-and-waymo-statis...

jajko a day ago

Well, if that's the tip of the spear of FSD tech, we're fucked; no way will I take a robotaxi to work before I retire. It's an extremely limited environment, well under control, almost always sunny, and they've been doing the same area for what, 10 or 15 years?

Can it drive in rain and snow on narrow, unmarked roads, then join traffic jams (or not) on a highway at 120 km/h, then enter a city and navigate obscure construction works, crazy aggressive cyclists and scooters, and get me where I need to go, 100% reliably? Or let's say >99.995%; that's roughly the level of a frequent human driver.

That's what I am willing to pay for, either as a shared taxi or our own car, nothing less. Anything less means me doing all the driving with full attention, and I already have that in dumb cars.

AlotOfReading a day ago

aredox a day ago

It is not easy to compare, as there are lots of confounding variables: self-driving is not activated at random or all the time, but typically on highways, which are less accident-prone.

These systems are also deactivated in difficult conditions, such as bad weather, which are also hard for human drivers. You can imagine a future where all cars are equipped with a self-driving system that always "passes the buck" to a human when conditions degrade; of course the system will have fewer accidents than humans! The statistics will even show human drivers being worse than before the advent of self-driving!
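This selection effect is easy to sketch with numbers. All figures below are made up for illustration (nothing is from a real dataset): the assist system runs only on highways, humans absorb all the dangerous city miles, and the assist system looks safer overall even though it is worse in the one condition it actually handles.

```python
# Hypothetical crash counts showing how a system that hands control
# back to the human in hard conditions can look safer in aggregate.

# Exposure in millions of miles, by (driver, road type).
miles = {
    ("assist", "highway"): 90.0,   # assist runs almost only on highways
    ("assist", "city"):     0.0,   # assist disengages here entirely
    ("human",  "highway"): 50.0,
    ("human",  "city"):    50.0,
}
# Crash counts for the same exposure buckets.
crashes = {
    ("assist", "highway"): 108,    # 1.2 crashes per M miles: WORSE than humans
    ("human",  "highway"):  50,    # 1.0 per M miles
    ("human",  "city"):    250,    # 5.0 per M miles: city driving is hard
}

def overall_rate(driver):
    """Headline crashes-per-million-miles, ignoring road type."""
    m = sum(v for (d, _), v in miles.items() if d == driver)
    c = sum(v for (d, _), v in crashes.items() if d == driver)
    return c / m

print(f"assist: {overall_rate('assist'):.1f} crashes per M miles")  # 1.2
print(f"human:  {overall_rate('human'):.1f} crashes per M miles")   # 3.0
```

The headline comparison (1.2 vs. 3.0) flatters the assist system purely because the human fleet includes all the city miles; condition-by-condition, the assist system is actually worse here.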

lm28469 a day ago

Idk, but Tesla has the highest fatality rate despite bragging about all their "smart" safety features: https://www.forbes.com/sites/stevebanker/2025/02/11/tesla-ag...

gruez a day ago

But the most dangerous car model is the Hyundai Venue[1], which also brags about all its Hyundai SmartSense safety features[2]. I'm sure the next few cars down the list do the same. Maybe your ire should be directed at them as well?

[1] https://www.carpro.com/blog/list-of-the-most-dangerous-cars-...

[2] https://www.hyundaiusa.com/us/en/vehicles/venue

ivewonyoung a day ago

Your comment says fatality rate while the 'article' says accident rate.

Those are two very different things.

Even the accident rate is below that of other cars if you adjust for miles driven.

https://news.ycombinator.com/item?id=42151851

lm28469 a day ago

bilbo0s a day ago

It kind of says something when it turns out that Volvo, with their old-timey ‘dumb’ safety features, seems to be outperforming all the sexier brands on safety.

Maybe focusing on the dumb stuff brings a lot more bang for the buck than the sparkly new ‘smart’ safety widgets?

ahartmetz a day ago

PeterStuer a day ago

ronnier a day ago

> The study is based on QuoteWizard by LendingTree insurance inquiries from Jan. 1, 2024, through Dec. 31, 2024. They analyzed the 30 brands with the most inquiries in this period.

QuoteWizard. Based on inquiries. I don't trust this.

honeybadger1 a day ago

The number of miles driven on Tesla FSD with no crashes is so significantly higher that it's laughable to even draw the comparison; this data is even publicly available via the API.

ronnier a day ago

pwagland a day ago

toast0 a day ago

No, the statistics I've seen haven't really been useful.

It's typically a comparison of cars in whatever autonomous mode vs. all cars operating within a country/state. But the autonomous modes don't operate in all conditions, so it's not a good comparison.

There's concern about making sure the control group is appropriate too; comparing against a representative subset of the population is important.

I think there's some reasonable data for automatic emergency braking, in that I think I've seen it compared as just cars with AEB equipped vs. cars without, looking at the number/severity of injuries for all collisions, and there's enough data to show a difference.

nickthegreek a day ago

The best you can do is try to compare accidents/deaths for similar events in the same areas during the same weather. But we don't collect proper stats on this stuff to make true apples-to-apples comparisons. They aren't driving in the same scenarios as most people yet.

undefined a day ago

[deleted]

viraptor a day ago

There are a few, which I'm not going to link to but will instead warn about. They're often in the "lies, damned lies" category.

For example, comparisons of self-driving to average accident rates often miss: non-self-driving cars having worse equipment (lack of collision warning, adaptive cruise control, etc.), comparison to all roads (self-driving is activated mostly in known, well-mapped areas and on open highways), unknown accounting of self-driving status (Teslas try to give back control just before the crash), and many other issues.

Unless some actually independent third party runs the numbers with a lot of explanations about the methodology, I'm ignoring them.

jqpabc123 a day ago

maxglute a day ago

TIL frameless car doors have an emergency/manual door release. I naively thought you could push hard enough to force the door open even if it broke the glass. Is this common knowledge?

dkjaudyeqooe a day ago

Aren't we beyond the point where makers of these systems should be required to prominently state that these are level 2 driving systems? Playing word games ("Full Self Driving"!) is arguably killing people.

srmatto a day ago

Car companies need to either sell feature-complete autopilot trims with LIDAR or sell trims not equipped with autopilot at all. They should not sell middle-ground, budget trims with less-effective autopilot that compromises safety.

sleepyguy a day ago

https://archive.ph/77nxC

>According to Xiaomi’s initial report, the car’s advanced driver assistance function had been engaged less than 20 minutes before the crash. Alerts were issued because the driver apparently wasn’t holding on to the steering wheel. Seconds after another warning was sent about obstacles in the road and the driver then retook control of the wheel, the car crashed into concrete fencing on the side of the road.

>According to the company’s marketing materials, Xiaomi’s Navigate on Autopilot function can change lanes, speed a car up or down, make turns or brake with minimal human input. However, the company advises drivers stay alert to traffic conditions and reminds them that “smart-driving” isn’t the same as “self-driving.”

>It’s illegal in China for drivers to take their hands off the steering wheel, even if advanced driver assistance is engaged.

slaw a day ago

> , steering wheel turned 22.0625 degrees left, brake pedal pressed 31%

The driver, instead of braking, tried to avoid the obstacle and hit the concrete barrier at 97 km/h.

https://carnewschina.com/2025/04/01/first-fatal-accident-inv...

kasey_junk a day ago

I was _taught_ to do this in driving school.

dkjaudyeqooe a day ago

I never went to driving school, but isn't it obvious to both brake and steer away from the obstacle?

Not that long ago I had some idiot driving the wrong direction, in my lane, and speeding, and I was about to have a head-on collision at about 200 km/h. I credit my survival to pulling hard to the right while slamming on the brakes, with the ABS allowing me to steer freely. It put as much space between me and the idiot as possible.

vel0city a day ago

ninalanyon a day ago

In Norway you are taught never to attempt avoiding action in a crisis. Just brake hard. Attempting avoiding action often puts more people at risk.

Perhaps this is more applicable to our typical driving conditions where we have rain, ice, snow, or all three, on roads for substantial fractions of the year and a sudden change of direction will result in loss of control.

Most of us do not practice the kind of precision high speed reactions necessary to control a car in such situations.

llm_nerd a day ago

There was one second between the driver taking over and the collision, so it was likely a panic reaction to an imminent crash.

Which is fundamentally the problem with self-driving technologies. If it isn't 100%, it might just increase the danger: it lures the driver into distraction, because after enough kilometres on straight roads, who is going to keep paying attention while the car drives itself? And then, boom, exception: you have one second to focus and ingest the situation perfectly, or you die.
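That one-second window is brutal in distance terms. A back-of-envelope sketch, assuming the 97 km/h speed reported elsewhere in the thread (the reaction-time values are illustrative, not from the crash report):

```python
# Distance covered while a distracted driver re-engages after a
# takeover alert, at the reported speed of 97 km/h.
speed_ms = 97 / 3.6  # ~26.9 m/s

for reaction_s in (1.0, 1.5, 2.0):
    print(f"{reaction_s:.1f} s of reaction time -> "
          f"{speed_ms * reaction_s:.0f} m traveled before any input")
```

Even a fast one-second reaction means roughly 27 m of travel before the driver does anything at all, never mind doing the right thing.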

CBLT a day ago

It's been shown that people are extraordinarily poor drivers for the first few seconds after they take over driving from a computer.

piva00 a day ago

nielsbot a day ago

not to defend shitty self-driving implementations, BUT if on average they crash less than humans, even if they’re not 100%, society might accept that.

foobarian a day ago

porridgeraisin a day ago

ijidak a day ago

slaw a day ago

There was no taking over; the car was always in the driver's control. The driver was using cruise control, not anything self-driving.

llm_nerd 4 hours ago

potato3732842 a day ago

Not that physical reality has ever gotten in the way of low-effort internet comments, but the driver isn't wrong: you can avoid an obstacle in much less forward distance by turning than by braking.

There were two seconds between the alert and the crash. I'm not sure why the driver didn't at least get the car out of the barrier lane; perhaps there was other traffic (though the brake pedal percentages don't seem to indicate they committed to braking, which is odd, but I wasn't there, so I don't know). Obviously, paying attention beforehand would have been the better move from the get-go.
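The turning-beats-braking claim checks out on a napkin. A sketch with assumed numbers; only the 97 km/h speed comes from the thread, while the friction coefficient and the 2 m lateral shift are illustrative guesses:

```python
import math

# Forward distance needed to stop vs. to swerve around an obstacle.
v = 97 / 3.6        # speed in m/s (~26.9), from the crash report
mu, g = 0.8, 9.81   # assumed dry-road friction and gravity

# Braking to a full stop: d = v^2 / (2 * mu * g)
d_brake = v**2 / (2 * mu * g)

# Swerving: time to shift w = 2 m sideways at max lateral accel
# a = mu * g (from w = 0.5 * a * t^2), then forward distance = v * t.
w = 2.0
t = math.sqrt(2 * w / (mu * g))
d_swerve = v * t

print(f"braking to a stop: {d_brake:.0f} m forward")   # ~46 m
print(f"swerving 2 m sideways: {d_swerve:.0f} m forward")  # ~19 m
```

Under these assumptions the swerve clears the obstacle in well under half the forward distance a full stop needs, which is why evasive steering is taught at all; the trade-off is that it only works if the adjacent lane is actually clear.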

undefined a day ago

[deleted]

lenerdenator a day ago

Now replace "Xiaomi" with "Apple" in that headline and you'll see why they dropped the car project.

PeterStuer a day ago

Apple never clearly decided whether it was building a full car (like Tesla), a self-driving system (like Waymo), or a tech stack to license (like Mobileye). This lack of focus created internal turmoil and killed momentum. The car team reportedly had over half a dozen leadership changes.

Apple lacked a tangible car ecosystem: no charging network, no manufacturing experience, and no clear place in the market for another premium EV.

The project was burning $1B+/year without ROI. Ultimately, Apple chose to cut its losses. It failed because it was indecisive, risk-averse, and out of its industrial depth, while Tesla, Waymo, and Xiaomi had clarity, speed, and alignment between ambition and execution.

toast0 a day ago

Not knowing what they want to build is a big problem, much bigger than these.

> Apple lacked a tangible car ecosystem: no charging network, no manufacturing experience, and no clear place in the market for another premium EV.

It wouldn't make sense to develop a charging network until they at least figure out what their product is. Most car makers take about three years between showing a final-ish prototype and retail sales. That's enough time to build a charging network, if you even need to. Tesla needed to build a charging network, but now that's opening up. VW needed to build a charging network as a condition of their release, and that's open. I'm not an EV driver and I don't charge my PHEV unless it's free, but I see a lot of chargers around, and I don't know if there's a need for another major charging network. If Apple only wanted to be part of a car, not the whole car, there would be no reason for them to be involved in charging.

Apple does most (all?) of its manufacturing through contractors. Foxconn is building cars [1], and there are plenty of dedicated contract auto manufacturers. Again, not a big defect until we know what the product is.

Market positioning also needs to wait for the product. Maybe there's an opening now that Tesla is losing sales.

[1] https://www.carscoops.com/2025/03/foxconn-gearing-up-to-buil...

rtkwe a day ago

Also, manufacturing a car is vastly different from manufacturing small electronic devices (not that that competence lives entirely within Apple, but they're experienced in designing them for manufacture). If Apple wanted to do car manufacturing, that's a completely different area of expertise they'd need to spend a decade or more building up.

Look at Tesla's journey: it took them years to get things like panel gaps even close to right on their cars, and they're having massive problems with the Cybertruck's manufacturing (car washes can completely destroy the electronics, whole pieces of "trim" fall off because they used the wrong glue, etc.).

spwa4 a day ago

Well, now we really know they take inspiration from Tesla. Anyone remember the Tesla beheading, with the car doing a victory lap afterwards?

zozbot234 a day ago

You can complain about Xiaomi all you want, but at least they're not BYD - that even call their driver's assistance system "God's Eyes", just so that your insurance can get away with classing every crash it gets into as an "Act of God".

gruez a day ago

>that even call their driver's assistance system "God's Eyes", just so that your insurance can get away with classing every crash it gets into as an "Act of God".

Not sure if serious.