> A decade ago, Intel tried that with Galileo / Edison, and tellingly, they came up with the same "ideas": IoT / AI.
Intel's execution, as usual, was poor.
Both the Galileo and Edison were much more expensive than their Arduino counterparts, and their x86 CPUs were of little value within that space (especially at the time). Neither made it past five years before being killed, which is exactly what people feared: a stunning lack of long-term commitment from Intel to develop and grow a community, leaving anyone who actually built products on these devices holding a useless bag.
Intel Edison/Galileo didn't work because everything they could do was done by purpose-built ASICs, which are much cheaper at scale and more energy efficient, both important metrics for IoT. They were, at best, PoC material in the lab.
> and their x86 CPUs were of little value within that space
Intel could've attracted the entire retrocomputing community if they had realised that the peripherals around x86 and the PC ecosystem were what got them to where they were in the first place, and had made Galileo/Edison actually PC-compatible. Instead, they made an SoC with a 486DX+ core and mostly incompatible peripherals (one would think they'd have learned that lesson with the 80186/88...), and somehow convinced Microsoft to make a special version of Windows(!) for it despite a complete lack of any video output capabilities.
"WTF were they thinking!?" is the most concise summary of that fiasco.