Question for those running Ollama locally #397
Unanswered
wakingdaydreams asked this question in Q&A
Does anyone know if local agent capability will be added? I have an Ollama config running via Docker and am wondering if that will be available down the road.
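For context, a setup like this just exposes Ollama's HTTP API on localhost (port 11434 is the container's default), so anything that can POST JSON can talk to it. A minimal sketch of what that looks like from a client; the model name is only a placeholder for whatever the local instance has pulled:

```python
import json
import urllib.request

# Minimal sketch: send one prompt to a locally running Ollama instance
# (e.g. the ollama/ollama Docker image publishing port 11434).
# "llama3" is only an example model name.
payload = {
    "model": "llama3",
    "prompt": "Say hello in one short sentence.",
    "stream": False,  # single JSON response instead of a stream
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)

print(body.get("response", ""))
```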
Replies: 1 comment 1 reply

Scratch that -- I swear I wrote that before Opencode was an option!

1 reply