Hacker News

Boogie_Man yesterday at 4:11 PM

I'm reminded of the time GPT-4 refused to help me assess the viability of parking a helium zeppelin an inch off the ground to bypass health department regulations because, as an aircraft in transit, I wasn't under their jurisdiction.


Replies

Aurornis yesterday at 5:28 PM

The other side of this problem is the never-ending media firestorm that erupts any time a crime or tragedy occurs and a journalist tries to link it to the perpetrator's ChatGPT history.

You can see why the LLM companies are overly cautious around any topic that's destined to be weaponized against them.

pants2 yesterday at 4:51 PM

lol I remember asking GPT-4 how much aspartame it would take to sweeten the ocean, and it refused because that would harm the ecosystem.
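
For what it's worth, the refused arithmetic is a one-liner. A rough back-of-envelope sketch, assuming a ~0.5% w/v sucrose detection threshold and aspartame at roughly 200x the sweetness of sucrose (both round-number assumptions, not measured values):

    # Back-of-envelope: aspartame needed to make the ocean detectably sweet.
    OCEAN_VOLUME_L = 1.335e21            # ~1.335e9 km^3 of seawater, in liters
    SUCROSE_THRESHOLD_G_PER_L = 5.0      # ~0.5% w/v detection threshold (assumed)
    ASPARTAME_SWEETNESS_FACTOR = 200     # aspartame ~200x sweeter than sucrose (assumed)

    aspartame_g_per_l = SUCROSE_THRESHOLD_G_PER_L / ASPARTAME_SWEETNESS_FACTOR  # ~25 mg/L
    total_kg = aspartame_g_per_l * OCEAN_VOLUME_L / 1000  # grams -> kilograms

    print(f"~{total_kg:.1e} kg of aspartame")  # ~3.3e16 kg

That's on the order of 10^16 kg, vastly more aspartame than has ever been manufactured, so the ecosystem was probably never in danger.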

reactordev yesterday at 4:30 PM

Technically you'd be in their airspace though, so you might be in bigger trouble than a parking violation.

If you tether it to an asphalt ground hook you can claim it's a tarmac and that it's "parked" for the sake of the FAA. You'll need a "lighter-than-air" certification.

michaelbuckbee yesterday at 5:08 PM

There's that maniac building a quad-copter skateboard contraption who got in trouble with the FAA: he successfully argued that he was flying, but got fined for landing at a stoplight.

cyanydeez yesterday at 4:29 PM

If the spirit of a law is beneficial, it can still be hacked to evil ends.

This isn't the failure of the law, it's the failure of humans to understand the abstraction.

Programmers should absolutely understand when they're using a high-level abstraction over a complex problem.

It's bemusing when you see them actively ignore that and claim the abstraction is broken, rather than recognizing that the underlying problem is simply more complex and the abstraction covers 95% of use cases.

"Aha," the confused programmer exclaims, "the abstraction is wrong, I can still shoot my foot off when i disable the gun safety"