docker: configure config.toml using environment variable #751
base: master
Conversation
Signed-off-by: Navratan Lal Gupta <[email protected]>
docker-compose.yaml
Outdated
- MODELS_CUSTOM_OPENAI_API_KEY=""
- MODELS_CUSTOM_OPENAI_API_URL=""
- MODELS_CUSTOM_OPENAI_MODEL_NAME=""
- MODELS_OLLAMA_API_KEY="" # Ollama API URL - http://host.docker.internal:11434
This should be MODELS_OLLAMA_API_URL
docker-compose.yaml
Outdated
- MODELS_CUSTOM_OPENAI_MODEL_NAME=""
- MODELS_OLLAMA_API_KEY="" # Ollama API URL - http://host.docker.internal:11434
- MODELS_DEEPSEEK_API_KEY=""
- MODELS_LM_STUDIO_API_KEY="" # LM Studio API URL - http://host.docker.internal:1234
Should be MODELS_LM_STUDIO_API_URL
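Taken together, the two review comments above suggest the corrected `environment` block in docker-compose.yaml would look like this (a sketch; values are empty placeholders, and the URLs in comments are the defaults suggested in the original snippet):

```yaml
# Env var names corrected per the review comments:
# MODELS_OLLAMA_API_KEY -> MODELS_OLLAMA_API_URL
# MODELS_LM_STUDIO_API_KEY -> MODELS_LM_STUDIO_API_URL
environment:
  - MODELS_CUSTOM_OPENAI_API_KEY=""
  - MODELS_CUSTOM_OPENAI_API_URL=""
  - MODELS_CUSTOM_OPENAI_MODEL_NAME=""
  - MODELS_OLLAMA_API_URL=""      # e.g. http://host.docker.internal:11434
  - MODELS_DEEPSEEK_API_KEY=""
  - MODELS_LM_STUDIO_API_URL=""   # e.g. http://host.docker.internal:1234
```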
Signed-off-by: Navratan Lal Gupta <[email protected]>
…ce-config-toml Signed-off-by: Navratan Lal Gupta <[email protected]>
Thanks for reviewing the PR, @djmaze. I have corrected the code as per your comments and fixed the merge conflict.
Any chance to push this forward?
cubic found 2 issues across 4 files. Review them in cubic.dev
Co-authored-by: cubic-dev-ai[bot] <191113872+cubic-dev-ai[bot]@users.noreply.github.com>
Instead of updating the config.toml file to configure Perplexica, we can use environment variables to set those values. For example, to configure the custom OpenAI model settings, we can set the environment variables MODELS_CUSTOM_OPENAI_API_KEY="sk-123456", MODELS_CUSTOM_OPENAI_API_URL="http://localopenai:11134" and MODELS_CUSTOM_OPENAI_MODEL_NAME="meta-llama/llama-4". In general, any config value can be overridden by setting a variable of the form SECTION_WITH_DOTS_REPLACED_WITH_UNDERSCORE_KEYNAME=VALUE. This will fix issues #673 and #750.
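The config.toml section being overridden was not included in this excerpt; based on the naming rule and the variable names above, it presumably looks like the following (section and key names are inferred, not confirmed by the source):

```toml
# Inferred example: MODELS_CUSTOM_OPENAI_API_KEY maps to section
# [MODELS.CUSTOM_OPENAI], key API_KEY (dots in the section become
# underscores in the env var name).
[MODELS.CUSTOM_OPENAI]
API_KEY = "sk-123456"
API_URL = "http://localopenai:11134"
MODEL_NAME = "meta-llama/llama-4"
```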