Oh, does llama.cpp use MLX or something like it? I had the same question — do you know? A search suggests it doesn't, but I don't really understand why.
>Oh does llama.cpp use MLX or whatever?
No. It runs on macOS but uses Metal instead of MLX: llama.cpp is built on GGML, which targets Metal directly as its GPU backend on Apple hardware.
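You can see this at build time: on Apple Silicon the Metal backend is compiled in by default, and the flag below makes it explicit. A minimal sketch (repository URL and flag names as documented in the llama.cpp build instructions; paths are illustrative):

```shell
# Build llama.cpp with GGML's Metal backend (no MLX involved).
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
# GGML_METAL is enabled by default on Apple Silicon; shown explicitly here.
cmake -B build -DGGML_METAL=ON
cmake --build build --config Release
```

At runtime the startup log then reports a Metal device being initialized, which confirms inference is going through GGML's Metal kernels rather than MLX.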