Note
This post is a thought: a short note I make about someone else's content online. Learn more about the process on my thoughts page.
Here's my thought on Ollama
Ollama is the easiest local LLM tool to get going that I have tried, and it seems crazy fast. It feels faster than ChatGPT, which has not been my experience previously when running LLMs on my own hardware.
```shell
curl https://i.jpillora.com/jmorganca/ollama | bash  # install ollama
ollama serve                  # start the local server
ollama run mistral            # run the mistral model
ollama run codellama:7b-code  # run codellama tuned for code
ollama list                   # list downloaded models
```
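Once `ollama serve` is running, the same models are also reachable over Ollama's local HTTP API, which by default listens on port 11434. This is a minimal sketch of a one-shot completion request; the prompt text is just a placeholder:

```shell
# Ask the local Ollama server for a single non-streaming completion.
# Assumes `ollama serve` is running and the mistral model has been pulled.
curl http://localhost:11434/api/generate \
  -d '{"model": "mistral", "prompt": "Why is the sky blue?", "stream": false}'
```

Setting `"stream": false` returns one JSON object instead of a stream of partial responses, which is easier to pipe into other tools.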
This post was a thought by Waylon Walker. See all my thoughts at https://waylonwalker.com/thoughts