Microsoft's fumble here is pretty spectacular.
Back in early 2023, the state of google search was abysmal (despite their leaders insisting otherwise, it had become nearly unusable for me, and I don't think that was an unfounded opinion). Microsoft rolled out a new version of bing, which became bing chat, and search worked for me again for a very brief window of time.
They could have pounced on this opportunity to take a big chunk out of google's search, because google didn't really catch up until AI Overviews rolled out, and even that is notorious for having issues. Eventually ChatGPT seems to have carved out some of this search space, with web search now native to the tool.
But microsoft was way ahead of everyone here for a brief period! Instead, they just rolled everything into bloatware vaguely branded "Copilot" and called it a day.
Over and over Microsoft kills products with mis-marketing.
One scenario is that the product is good (OneNote), but they put three icons for it on the taskbar and spam the rest of Windows with ads for it that just make people scream "take it away!"
Another scenario is that the product is bad (OneDrive), and they push you into a traumatic experience (Microsoft Office uses it as the default save location, and when it's down you can't save your work!) that ensures you'll never use it again -- even though OneDrive now seems basically reliable.
Today this is the dominant playbook for marketing AI experiences. People are mostly sick and tired of hearing about it; the master Unique Selling Point of 2026 is products that don't interrupt you when you're trying to get work done.
>it had become nearly unusable for me, and I don't think that was an unfounded opinion
Ironic, if that's the right word: the (google) search product itself still is, if not even worse.
The 'new' AI mode routinely creates silly categories that are not what I was looking for, and my screen fills up with repetitive AI summaries of articles. It will ingest a source as fact, then use that "fact" to create confirmation bias across other articles. It will even use words like "confirm" when it finds a source saying something, even if the source is junk or SEO spam. It becomes nearly impossible to escape the assumptions the model has made, and I have to resort to traditional web search to get any diversity in my results.
And while deep research works, it's so overly verbose, with no easy way to tone down the wordiness.