local-llm/.env

# If set, HTTP_PROXY interferes with inter-container communication in this deployment.
# Ollama downloads models via HTTPS anyway, so it should be safe to unset it.
HTTP_PROXY=
http_proxy=
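
# To confirm the variables are actually empty inside a running container
# (a sketch assuming a docker-compose deployment with a service named
# "ollama" -- adjust the service name to match the actual deployment):
#   docker compose exec ollama env | grep -i proxy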