> Even though my software is packaged and notarised as per their requirements, they still show my users a dialog box confirming they want to run my app, something they do not do for apps installed through their walled garden. This is just friction to punish developers outside their store. I am very tired of it.
Indeed. I'm honestly impressed that he lasted this long. My first "I'm very displeased" moment was when Java became a second-class citizen on macOS. I was a Java dev at the time and had written some non-trivial apps. They weren't native-perfect, but they were close enough that my die-hard Apple-fan relatives didn't realize they weren't "native" until I told them. The write-once-run-anywhere dream of desktop UI software (without getting into Qt) was there in a very real way for me. I ran the same apps on my Windows machine at work and on my Mac laptop and Linux desktop at home. The hoops at that point were nothing compared to what they are now, but they began souring me.
For me the final straw was when I got the latest MacBook Pro with the latest Mac monitor (all from Apple, mind you), and yet there was a horrific bug where about half the time you plugged into the monitor, the laptop screen shut off and would never come back on until you did a hard reboot (holding down the power button). That was never supposed to be possible, since it was Apple hardware and software controlled top to bottom; that was the original promise of the vertical integration and one of the reasons we accepted the heavy lack of cross-platform compatibility.
A little before that, I used to put my MacBook on the nightstand and listen to podcasts at night to fall asleep. I would dim the screen to off and keep the volume low. Apple rolled out a software update that suddenly caused the screen to kick on at FULL BRIGHTNESS after about 5 to 10 minutes (when the screensaver would normally have kicked in), while I was sleeping in a completely dark room. It was so bright that it would wake me up. That bug was there for years, and maybe still is (I replaced the machine with a Linux laptop).
My user experience on Macs was never close to bug-free, and was frankly worse than almost everything else out there. It took me a while to figure that out, though.
The last straw for me was around 2009. I was in college, minoring and interning in media production. I had invested pretty heavily in Final Cut, which was long in the tooth, and was hoping for better I/O in MacBooks to support faster ingest. That was when Apple announced the following:
- Final Cut Pro X. Its first incarnation was a huge slap in the face in terms of features and workflow. They completely cut a large swathe of the rest of Final Cut Studio, and the new pricing made it clear they knew they were shipping shit.
- The first unibody MacBook came out. Very little could be upgraded, the keyboard was a leap backward, and all it had for I/O was a half-baked USB 3. Its usefulness for pro video workflows was severely hobbled compared to the last generation.
- Mac OS X Lion came out, which was when the OS started showing signs of user hostility. Power-user features were getting locked down or removed, the App Store was being pushed harder, and the system was consuming more base resources for the privilege. The trend was clear: advanced users were no longer welcome in Apple land.
These things made me change majors back to computing and return fully to Linux. I've never regretted that.