>Here’s why this changes everything: most AI accountability frameworks assume a discrete, auditable dataset. The EU’s GDPR gives you the right to erasure — the right to delete your data. But GDPR was written for databases. The Ontology is a graph. You can delete a node. You can’t easily delete the edges, i.e., the inferred relationships between you and everything else the system has connected you to.
Edges are personal data under the GDPR, so this is completely wrong. Almost everything the GDPR applies to is an edge.
'impossiblefork likes stories' is an edge.
Ontologies are also old; they've been a big research area since the 1990s.
Facebook has shadow profiles and collects phone numbers from users' uploaded contacts.
You could certainly treat phone numbers and residential addresses as edges that should be deleted for compliance.
Fair correction - I should have been more precise.
The point I was reaching for is a practical enforcement one: there is no standardized technical mechanism for verifying that edges have actually been deleted from an opaque, continuously updated knowledge graph. Regulators have audit powers, but graph deletion verification — confirming that relational inferences are gone, not just that a node was removed — has no established standard. Controllers can assert compliance in ways that are genuinely difficult to challenge in practice.
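The gap is easy to see in a toy example. Erasing a subject's node can mechanically remove the node and its incident edges, but inferences the system already derived from that node (and stored separately) have nothing linking them back to it. A minimal sketch, assuming a hypothetical adjacency-list graph with a separate store of derived edges — not any real ontology's API:

```python
# Toy knowledge graph: adjacency list plus a separate store of derived edges.
graph = {
    "alice": {"acme_gym", "bob"},
    "bob": {"acme_gym"},
    "acme_gym": set(),
}
# Hypothetical inference step already ran: co-membership produced a derived
# edge that no longer references alice's node directly.
derived = {("bob", "likely_knows", "alice_household")}

def delete_node(g, node):
    """GDPR-style erasure: drop the node and every incident edge."""
    g.pop(node, None)
    for neighbours in g.values():
        neighbours.discard(node)

delete_node(graph, "alice")

assert "alice" not in graph                                   # node is gone
assert all("alice" not in nbrs for nbrs in graph.values())    # incident edges gone
# ...but the derived edge survives, and nothing ties it back to the node:
print(derived)
```

An auditor inspecting `graph` would see a clean deletion; the inference in `derived` is exactly the kind of relational residue that has no standard verification mechanism.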