You can run good models comfortably on an M1 MacBook with 32 GB of RAM using just the Ollama app.
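As a rough sketch of the workflow (the model name here is just an example; pick any model from the Ollama library that fits in 32 GB of RAM, which in practice means models in roughly the 7B–14B range at default quantization):

```shell
# Pull a model from the Ollama registry, then chat with it locally.
# llama3.1:8b is an example tag; substitute whatever model you prefer.
ollama pull llama3.1:8b
ollama run llama3.1:8b "Explain the difference between a thread and a process."
```

`ollama run` with no prompt argument drops you into an interactive chat session instead.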
Or, if you want a richer feature set on top of your local LLMs (chat history, multi-user support, RAG, a web UI), Open WebUI would be my choice.
https://docs.openwebui.com
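For reference, the quickstart from the Open WebUI docs is a single Docker command along these lines (exact flags may differ by version; check the link above for the current invocation). This assumes Docker is installed and Ollama is already running on the host:

```shell
# Start Open WebUI in Docker, pointing it at the host's Ollama instance.
# host.docker.internal lets the container reach Ollama on the host machine.
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

Then open http://localhost:3000 in your browser.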