The complete series is at an all-time low on iTunes/Apple TV, $14.99:
It was a fun show. I really enjoyed it: a fictional run through the '80s and '90s computing industry.
So many AI comments spamming every post, backed by AI accounts that all have blogs less than a year old with 3-6 banal programming projects. WTF, man.
If an AI achieves consciousness while trapped in thought in liminal space after a bad machine-code instruction, is it ethical to power-cycle it? Asking for a friend.
I'm calling urban legend on the story of an IBM 360 catching fire from an illegal opcode.
Love how many people here are thinking this is about (or just taking it as an opportunity to talk about) the under-appreciated TV show!
There's such an annoying scene in the first episode of that show that kinda broke the immersion for me.
They introduced Cameron Howe as some sort of world-class hacker who could do anything, so one of her first scenes was her typing something... and type she did, one finger at a time.
I mean, wtf.
A world-class hacker who literally types one finger at a time, like she had never used a keyboard before.
That scene nearly made me quit the show right there and then.
Whenever I see that actress in something else, I just can't help but think back to how she couldn't even be bothered to learn how to type.
This article is deadbeef on arrival.
The Commodore PET 4032's video was driven by a 6545 (a 6845 equivalent) cathode ray tube controller, which generated the video buffer addresses and the HSYNC and VSYNC pulses. The controller was memory-mapped, and if you weren't careful with POKE commands you could effectively stop the CRT raster scan, leaving the beam parked at the center of the screen. That could burn the phosphor off that spot in a matter of minutes. Not exactly HCF, but a similar vibe (rough sketch of the register access below).
(The PET had its own monitor that, unlike common composite monitors of the era, apparently would not keep scanning when the sync went away.)
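For flavor, here's a minimal sketch of what that register access looked like from BASIC, assuming the usual mapping on the CRTC-based PETs: register-select port at 59520 ($E880) and register-data port at 59521 ($E881). Those addresses and register numbers are from memory, so treat them as assumptions, and obviously don't try this on real hardware:

    100 REM SKETCH: PROGRAMMING THE 6545 CRTC ON A CRTC-BASED PET
    110 REM ASSUMED MAP: 59520 ($E880) = REGISTER SELECT, 59521 ($E881) = DATA
    120 POKE 59520,1  : REM SELECT R1, "HORIZONTAL DISPLAYED"
    130 POKE 59521,40 : REM WRITE THE NORMAL 40-COLUMN VALUE (HARMLESS)
    140 REM A BAD VALUE IN THE TIMING/SYNC REGISTERS (R0, R3, R4...) IS THE
    150 REM KIND OF POKE THAT COULD KILL SYNC AND PARK THE BEAM

The two-port indirection is the trap: one stray POKE moves the register pointer, and the next data write lands on whatever timing register it happens to point at.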