Hacker News

Almondsetat · today at 6:36 PM

To me, self driving seems like the opposite of Moore's law.

In recent decades, one of the principles of SWE was to take into account how much computing power will have improved when planning a multi-year project. Meaning, you could write something that was too heavy for today's machines, but would be bleeding edge in 5 years.

IMHO, self-driving is actually the inverse situation.

Piloting a car in a human-centered environment is difficult and requires the machine to behave humanly. This of course requires an absurd amount of data and training to pull off. But what happens when self-driving adoption increases? At a certain point, so many driverless cars will be roaming the streets that most daily interactions will be automated by letting the cars negotiate in a nice, deterministic, algorithmic way. Thus, reliance on predictive and opaque systems like neural networks will be needed less and less, actually reducing the complexity of self driving.
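To make the "negotiate deterministically" idea concrete, here is a minimal sketch of what such coordination could look like: cars request crossing slots at an intersection and a simple arrival-order scheduler hands out non-overlapping windows. All names here (`Intersection`, `request_slot`, the 2-second slot) are hypothetical illustrations, not any real V2V protocol — the point is just that no behavior prediction is needed once every participant follows the same rule.

```python
# Hypothetical sketch: slot-based intersection negotiation between driverless cars.
# Deterministic first-come-first-served scheduling replaces neural-net prediction
# of what other drivers might do. Names and parameters are illustrative only.

from dataclasses import dataclass, field
import heapq

@dataclass(order=True)
class Request:
    arrival_time: float
    car_id: str = field(compare=False)  # ordering is by arrival time only

class Intersection:
    """Grants crossing windows in strict arrival order -- no world model needed."""

    SLOT = 2.0  # assumed seconds a car occupies the intersection

    def __init__(self):
        self.requests = []      # min-heap of pending Requests
        self.next_free = 0.0    # time the intersection next becomes free

    def request_slot(self, car_id: str, arrival_time: float) -> None:
        heapq.heappush(self.requests, Request(arrival_time, car_id))

    def schedule(self) -> dict:
        """Deterministically assign non-overlapping crossing windows."""
        grants = {}
        while self.requests:
            req = heapq.heappop(self.requests)
            start = max(req.arrival_time, self.next_free)
            grants[req.car_id] = (start, start + self.SLOT)
            self.next_free = start + self.SLOT
        return grants

ix = Intersection()
ix.request_slot("car_b", arrival_time=1.5)
ix.request_slot("car_a", arrival_time=1.0)
print(ix.schedule())
# car_a crosses first (it arrives first); car_b waits until the slot frees up
```

The same outcome is produced on every run for the same inputs, which is the contrast the comment draws with opaque predictive systems.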

My main point is: what should win is the tech that makes cars drive as well as trained professional drivers. Once that's done, adoption will drive down human driving and thus unpredictable behavior on the road, reducing the computational load needed to correctly perform tasks. Next, cars will start to behave more programmatically and deterministically and will need fewer sensors and less tech. Car companies will have accurate maps of everything, and cars will mostly become shuttles which can rely more on predetermined routines and less on world models, especially as smart cities gain a foothold too.


Replies

DauntingPear7 · today at 7:04 PM

Isn’t there always the fact that non-car things happen near and on a road, thus forever requiring high amounts of compute?
