From what I hear, this will not happen. AI keeps absolutely making up laws and cases that don’t exist no matter what you feed it. Basically anything legal written or partially written by AI is a liability. IANAL but have been reading a tiny bit about it.
The need for lawyers is shrinking and will keep shrinking. My company used to call lawyers for lots of small things. Now it is easy to ask an LLM and have a second LLM verify the answer. For truly critical matters we may still call lawyers, and in courtrooms you will still see lawyers. But everywhere else the need for lawyers will keep going down.
Ehhh, just calling a raw LLM is not going to replace anyone and will be prone to hallucination, sure. But lawyers are increasingly using LLM systems, and there are law-specific products that are heavily grounded (i.e. they can only respond from source material).
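For anyone curious what "grounded" means in practice, here is a minimal sketch of the retrieval-plus-citation pattern these products are built around. This is just an illustration, not any vendor's actual implementation; `search_statutes` and `llm_complete` are hypothetical stand-ins for a retrieval index over primary sources and an LLM API call.

```python
# Hypothetical sketch of a "grounded" legal assistant: the model is only
# allowed to answer from retrieved source passages and must cite them.
from dataclasses import dataclass


@dataclass
class Passage:
    citation: str  # e.g. a statute or case citation
    text: str      # verbatim excerpt from the primary source


def search_statutes(question: str) -> list[Passage]:
    """Stand-in for retrieval over a vetted corpus of primary sources."""
    raise NotImplementedError


def llm_complete(prompt: str) -> str:
    """Stand-in for a call to whatever LLM API the product uses."""
    raise NotImplementedError


def grounded_answer(question: str) -> str:
    # Retrieve candidate sources first; refuse to answer without any.
    passages = search_statutes(question)
    if not passages:
        return "No supporting source found; escalate to a human lawyer."

    sources = "\n\n".join(f"[{p.citation}]\n{p.text}" for p in passages)
    # Constrain the model to the retrieved material and require citations.
    prompt = (
        "Answer the question using ONLY the sources below. "
        "Cite the bracketed citation for every claim. "
        "If the sources do not answer the question, say so.\n\n"
        f"SOURCES:\n{sources}\n\nQUESTION: {question}"
    )
    return llm_complete(prompt)
```

The point is that hallucination risk is reduced by construction: the model never sees anything but the retrieved excerpts, and an empty retrieval result becomes a refusal rather than a made-up citation.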
Worth noting that being a lawyer is not just reading a text and saying "true or false". It requires interpretation and an understanding of how a society changes and evolves, and, depending on the country, the system leans more on jurisprudence (case law) or is more analytical (written laws).
I have a hard time seeing why a portion of the HN audience has such a narrow view of justice systems and politics.