Hacker News

Legal Contracts Built for AI Agents

69 points by arnon | yesterday at 12:55 PM | 42 comments

Comments

Neywiny · yesterday at 3:47 PM

I'm not sure I understand why this is about agents. This feels more like contracting than SaaS. If I contract a company to build a house and it's upside down, I don't care if it was a robot that made the call, it's that company's fault not mine. I often write electronic hardware test automation code and my goodness if my code sets the power supply to 5000V instead of 5.000V (made up example), that's my fault. It's not the code's fault or the power supply's fault.

So, why would you use a SaaS contract for an agent in the first place? It should be like a subcontractor. I pay you to send 10k emails a day to all my clients. If you use an agent and it messes up, that's on you. If you use an agent and it saves you time, you get the reward.

Animats · yesterday at 6:54 PM

Legal contracts built for sellers of AI agents.

The contract establishes that your agent functions as a sophisticated tool, not an autonomous employee. When a customer's agent books 500 meetings with the wrong prospect list, the answer to "who approved that?" cannot be "the AI decided."

It has to be "the customer deployed the agent with these parameters and maintained oversight responsibility."

The MSA includes explicit language in Section 1.2 that protects you from liability for autonomous decisions while clarifying customer responsibility.

The alternative is that the service has financial responsibility for its mistakes. This is the norm in the gambling industry. Back when GTech was publicly held, their financial statements listed how much they paid out for their errors. It was about 3%-5% of revenue.

Since this kind of product is sold via large-scale B2B deals, buyers can negotiate. Perhaps service responsibility for errors, backed by reinsurance above some limit.

nadis · yesterday at 7:48 PM

> "The template uses CommonPaper's Software Licensing Agreement and AI Addendum as a foundation, adapted for the unique characteristics of AI agents. Nick and the GitLaw team built this based on patterns from reviewing hundreds of agent contracts. We contributed our research from working with dozens of agent companies on monetization challenges."

Unless I'm misunderstanding and GitLaw and CommonPaper are related or collaborating, I feel like this callout deserves to be mentioned earlier on, and the changes and distinctions ought to be called out more explicitly. Otherwise, why not just use CommonPaper's version?

ataha322 · yesterday at 1:19 PM

The question isn't just who's liable - it's whether traditional contract structures can even keep up with systems that learn and change behavior over time. Wonder if this becomes a bigger moat than the AI.

jrm4 · yesterday at 5:14 PM

Sigh -- another not-even-thinly-veiled ducking of "A computer can never be held accountable, therefore a computer must never make a management decision."

This is not the way we want to be going.

n8m8 · yesterday at 2:38 PM

Can't scroll; the cookies disclaimer doesn't work in Firefox with uBlock Origin :(
