Is it possible to use Llama 3 via Ollama rather than the Hugging Face version?