2278 | 2278 | }, |
2279 | 2279 | { |
2280 | 2280 | "cell_type": "markdown", |
2281 | | - "id": "747a2fc7-282d-47ec-a987-ed0a23ed6822", |
2282 | | - "metadata": { |
2283 | | - "id": "747a2fc7-282d-47ec-a987-ed0a23ed6822" |
2284 | | - }, |
| 2281 | + "id": "267cd444-3156-46ad-8243-f9e7a55e66e7", |
| 2282 | + "metadata": {}, |
2285 | 2283 | "source": [ |
2286 | 2284 | "- For macOS and Windows users, click on the ollama application you downloaded; if it prompts you to install the command line usage, say \"yes\"\n", |
2287 | 2285 | "- Linux users can use the installation command provided on the ollama website\n", |
2288 | 2286 | "\n", |
2289 | 2287 | "- In general, before we can use ollama from the command line, we have to either start the ollama application or run `ollama serve` in a separate terminal\n", |
2290 | 2288 | "\n", |
2291 | | - "<img src=\"https://sebastianraschka.com/images/LLMs-from-scratch-images/ch07_compressed/ollama-run.webp?1\" width=700px>\n", |
| 2289 | + "<img src=\"https://sebastianraschka.com/images/LLMs-from-scratch-images/ch07_compressed/ollama-run.webp?1\" width=700px>" |
| 2290 | + ] |
| 2291 | + }, |
| 2292 | + { |
| 2293 | + "cell_type": "markdown", |
| 2294 | + "id": "30266e32-63c4-4f6c-8be3-c99e05ed05b7", |
| 2295 | + "metadata": {}, |
| 2296 | + "source": [ |
| 2297 | + "---\n", |
2292 | 2298 | "\n", |
| 2299 | + "**Note**:\n", |
2293 | 2300 | "\n", |
| 2301 | + "- When running `ollama serve` in the terminal, as described above, you may encounter an error message saying `Error: listen tcp 127.0.0.1:11434: bind: address already in use`\n", |
| 2302 | + "- If that's the case, try using the command `OLLAMA_HOST=127.0.0.1:11435 ollama serve` (and if this address is also in use, try incrementing the numbers by one until you find an address that is not in use)\n",
| 2303 | + "\n", |
| 2304 | + "---" |
| 2305 | + ] |
| 2306 | + }, |
| 2307 | + { |
| 2308 | + "cell_type": "markdown", |
| 2309 | + "id": "747a2fc7-282d-47ec-a987-ed0a23ed6822", |
| 2310 | + "metadata": { |
| 2311 | + "id": "747a2fc7-282d-47ec-a987-ed0a23ed6822" |
| 2312 | + }, |
| 2313 | + "source": [ |
2294 | 2314 | "- With the ollama application or `ollama serve` running in a different terminal, on the command line, execute the following command to try out the 8-billion-parameter Llama 3 model (the model, which takes up 4.7 GB of storage space, will be automatically downloaded the first time you execute this command)\n", |
2295 | 2315 | "\n", |
2296 | 2316 | "```bash\n", |
|
2475 | 2495 | "def query_model(\n", |
2476 | 2496 | " prompt,\n", |
2477 | 2497 | " model=\"llama3\",\n", |
| 2498 | + " # If you used OLLAMA_HOST=127.0.0.1:11435 ollama serve\n", |
| 2499 | + " # update the address from 11434 to 11435\n", |
2478 | 2500 | " url=\"http://localhost:11434/api/chat\"\n", |
2479 | 2501 | "):\n", |
2480 | 2502 | " # Create the data payload as a dictionary\n", |