Ollama lets you download and run local versions of LLMs of your choice. In this article, I downloaded Phi-3 (the default phi3 model in Ollama is Phi-3 Mini, which has 3.8B parameters).
To begin, download and install the Ollama software from their website. Then, in a terminal (the Command Prompt on Windows), type ollama run phi3
. The model download will begin, and once it finishes, Ollama starts Phi-3 right away so you can chat with it immediately.
Next, install the required Python library:
pip install langchain_community
Next, I prepared this Python program to interact with Phi3 locally, i.e., offline:
from langchain_community.llms import Ollama

llm = Ollama(
    model="phi3"
)  # assuming you have Ollama installed and have the phi3 model pulled with `ollama pull phi3`

x = llm.invoke("Tell me a joke")
print(x)
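If you would rather skip LangChain, you can also talk to Ollama's local REST API directly; by default, the Ollama server listens on http://localhost:11434. The following is a minimal sketch using only the Python standard library. The build_payload and ask helpers (and the OLLAMA_URL constant) are names I made up for this example, not part of Ollama itself:

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot text generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> bytes:
    """Encode a non-streaming generate request for Ollama's REST API."""
    return json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode("utf-8")

def ask(model: str, prompt: str) -> str:
    """Send the prompt to the local Ollama server and return the reply text."""
    request = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        # The non-streaming reply is a single JSON object with a "response" field.
        return json.loads(response.read())["response"]
```

With the Ollama server running and the phi3 model pulled, calling print(ask("phi3", "Tell me a joke")) should print the model's reply, much like the LangChain version above.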