by Outcast_Searcher » Thu 22 Mar 2018, 20:16:39
BahamasEd wrote:So I guess the car had no forward looking radar?
I saw on TV the other day an ad that showed a car with a human driver that automatically stops if someone/thing is in front of the car.
If it can't 'see' a person-sized object moving into its path in any lighting conditions, then it's not ready for prime time.
Uber's cars have radar and lidar. Clearly something went wrong. Whether it was hardware, software, or some combination of both will need to be analyzed.
I suspect for these things to be very reliable there will need to be multiple types of sensors doing a variety of safety screens, and that the software will need to "intelligently" make a lot of decisions, resolve conflicting readings safely, etc.
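To make the "resolve conflicting readings safely" idea concrete, here's a toy sketch in Python. This is NOT how Uber, Tesla, or anyone else actually does it -- real systems are vastly more complex -- just an illustration of the conservative principle that one credible detection should outweigh several "all clear" reports. All the names and thresholds here are made up for the example.

```python
# Toy sketch of conservative sensor fusion: combine obstacle-distance
# readings from several sensor types and resolve disagreement in favor
# of safety. Hypothetical names and thresholds throughout.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Reading:
    sensor: str                   # e.g. "radar", "lidar", "camera"
    distance_m: Optional[float]   # None means the sensor saw nothing
    confidence: float             # 0.0 .. 1.0 self-reported confidence

def should_brake(readings: List[Reading],
                 brake_distance_m: float = 30.0,
                 min_confidence: float = 0.3) -> bool:
    """Brake if ANY sufficiently confident sensor reports an obstacle
    inside the braking envelope. Conflicts resolve conservatively:
    a single credible detection beats any number of all-clears."""
    for r in readings:
        if r.distance_m is not None and r.confidence >= min_confidence:
            if r.distance_m <= brake_distance_m:
                return True
    return False

# Radar sees nothing, but lidar reports something at 25 m ahead:
readings = [
    Reading("radar", None, 0.9),
    Reading("lidar", 25.0, 0.8),
    Reading("camera", None, 0.4),
]
print(should_brake(readings))  # True: the lone lidar detection wins
```

The design choice being illustrated: when sensors disagree about whether something is in the car's path, the safe failure mode is to believe the sensor that says "obstacle," not the ones that say "clear."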
So, IMO, some serious decisions need to be made about safety during testing: standards that clearly define what IS "good enough" for these things to truly become mainstream. Certification to operate on the roadway, for example. Or certification to be owned by a taxi company, or owned and operated by an ordinary private citizen.
And such decisions should be thought through and implemented ahead of time for something as dangerous as driving -- but our government agencies are very lacking in the efficiency department.
So this is where I believe the better and more standardized (say, federal) regulations should come into play. Perhaps an event like this will wake enough people up to give that some priority.
One thing they should demand, it seems to me, is a suite of tests covering MANY such conditions in a real, 3D model-city type environment, using test dummies on tracks, etc., where normal humans aren't around. Each software update should pass that suite BEFORE the cars are unleashed on public roads where fragile humans do stupid things. Of course, that would cost time and money, so industry will object -- and politicians don't CARE what I think.
But things ARE improving. Watching Tesla's Autopilot last summer via Youtube, after the fatal Tesla accident, it was an absolute joke. In many cases it would be trying to have an accident about every minute, with the driver having to intervene.
Yesterday I read an article with a few Youtube videos showing how much Autopilot 2.0 has recently improved. It was handling hills, curves, snow and ice, and varying daylight conditions flawlessly (all conditions it had, until recently, been failing at in lots of cases). It wasn't perfect, but it looked roughly 10X better than 8-ish months ago, under normal two-lane highway conditions. That, IMO, is rapid progress indeed. Much faster than I expected 8-ish months ago, frankly.
Given the track record of the perma-doomer blogs, I wouldn't bet a fast crash doomer's money on their predictions.