I agree that communities should try to protect themselves from malicious actors.
But the part about FOSS being used in projects not aligned with the creator's values seems hypocritical:
IMO FOSS is a gift to humanity and as such:
"A gift should be given freely, without obligation or expectation, as a true expression of love and kindness"
This talk is scheduled for January 31st, or am I missing something? Why is it being posted here? There is no video yet.
No n-gate summary. Sad.
The guy giving this talk apparently works on this:
> NGI Zero, a family of research programmes including NGI0 Entrust, NGI0 Core and NGI0 Commons Fund, part of the Next Generation Internet initiative.
with the Next Generation Internet initiative at the end receiving financing from the political supra-state entity called the EU [1]. So I guess the speaker is unhappy that political entities the EU sees as adversarial are also using open-source code? Not sure how war plays into this, as I'm sure he must be aware of the hundreds of billions of euros the EU has allocated for exactly that.
[1] https://ngi.eu/
>AI in its current form has no actual sense of truth or ethics.
This is untrue. It does have a sense of truth and ethics. It gets a few things wrong from time to time, but you can't reliably get it to say something blatantly incorrect (at least with thinking enabled). I would say it is more truthful than the average human. And ethically, I don't think you can get it to do or say something unethical.
Reading this felt like the official obituary for the 90s techno-optimism many of us grew up on.
The "end of history" hangover is real. We went about building the modern stack assuming bad actors were outliers, not state-sponsored standard procedure. But trying to legislate good use into licenses? I don't know how you would realistically implement it and to what extent? That solution implies we have to move toward zero-trust architectures even within open communities.
As an example: formal proofs and compartmentalization are unsexy, but they're a solid way to survive the next decade of adversarial noise.
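To make "compartmentalization" a little more concrete, here's a minimal sketch in Python of the capability-passing flavor of it (all names here are hypothetical, purely for illustration): untrusted plugin code gets handed only the narrow interface it needs, never ambient authority like the raw data store, the filesystem, or the network.

    # Hypothetical sketch: a least-privilege boundary around an untrusted plugin.
    class ReadOnlyStore:
        """Capability exposing read access to one namespace and nothing else."""
        def __init__(self, data: dict, namespace: str):
            self._data = data
            self._ns = namespace

        def get(self, key: str):
            # Keys are confined to this store's namespace.
            return self._data.get(f"{self._ns}/{key}")

    def run_plugin(plugin, store: ReadOnlyStore):
        # The plugin sees only the capability object, so even a compromised
        # plugin can read its own namespace but cannot write or escape it.
        return plugin(store)

    data = {"weather/today": "rain"}
    print(run_plugin(lambda s: s.get("today"), ReadOnlyStore(data, "weather")))

The point isn't the toy code; it's the shape: the blast radius of any one component is bounded by construction rather than by trust.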
I remember reading a quote somewhere that stuck with me. Paraphrasing, "If the architecture of my code doesn't enforce privacy and resistance to censorship by default, we have to assume it will be weaponized".
I am out of practical ideas; plenty sound good on paper and in theory. It's a bit sad, tbh. Always curious to hear more on this issue from smarter people.