docker pull mintplexlabs/anythingllm
export STORAGE_LOCATION=/var/lib/anythingllm && \
mkdir -p $STORAGE_LOCATION && \
touch "$STORAGE_LOCATION/.env"
$env:STORAGE_LOCATION="$HOME\Documents\anythingllm"; `
If(!(Test-Path $env:STORAGE_LOCATION)) {New-Item $env:STORAGE_LOCATION -ItemType Directory}; `
If(!(Test-Path "$env:STORAGE_LOCATION\.env")) {New-Item "$env:STORAGE_LOCATION\.env" -ItemType File};
docker run -d \
--name anythingllm \
--add-host=host.docker.internal:host-gateway \
--env STORAGE_DIR=/app/server/storage \
--health-cmd "/bin/bash /usr/local/bin/docker-healthcheck.sh || exit 1" \
--health-interval 60s \
--health-start-period 60s \
--health-timeout 10s \
-p 3001:3001/tcp \
--restart=always \
--user anythingllm \
-v ${STORAGE_LOCATION}:/app/server/storage \
-v ${STORAGE_LOCATION}/.env:/app/server/.env \
-w /app \
mintplexlabs/anythingllm
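If you prefer Compose, the docker run flags above translate into a compose file. This is a sketch of my own (the file and service names are arbitrary; it assumes STORAGE_LOCATION is exported in the shell that runs `docker compose up`):

```yaml
# docker-compose.yml — mirrors the docker run command above
services:
  anythingllm:
    image: mintplexlabs/anythingllm
    container_name: anythingllm
    user: anythingllm
    working_dir: /app
    restart: always
    ports:
      - "3001:3001"
    environment:
      - STORAGE_DIR=/app/server/storage
    extra_hosts:
      - "host.docker.internal:host-gateway"
    volumes:
      - ${STORAGE_LOCATION}:/app/server/storage
      - ${STORAGE_LOCATION}/.env:/app/server/.env
    healthcheck:
      test: ["CMD-SHELL", "/bin/bash /usr/local/bin/docker-healthcheck.sh || exit 1"]
      interval: 60s
      start_period: 60s
      timeout: 10s
```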
You can now visit http://localhost:3001 to do the initial setup. It is best to configure a team so you can control permissions; you can also configure the large language model (LLM), the embedding model, the vector database, and so on.
When you are done, check the .env file. It will look something like the following (yours may differ):
SERVER_PORT=3001
JWT_SECRET="my-random-string-for-seeding" # Please generate random string at least 12 chars long.
STORAGE_DIR="/var/lib/anything"
OPEN_AI_KEY=""
LLM_PROVIDER='ollama'
OLLAMA_BASE_PATH='http://localhost:11434'
OLLAMA_MODEL_PREF='llama3-64k:latest'
OLLAMA_MODEL_TOKEN_LIMIT='4096'
EMBEDDING_ENGINE='native'
VECTOR_DB='lancedb'
Open http://localhost:3001/api/docs/ in your browser to see the available API endpoints. Generate an API key in Settings; client code then uses this key to access the API.
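Clients send the key as a Bearer token. A minimal sketch (the key value below is a placeholder, and the `/api/v1/auth` key-check endpoint is my reading of the Swagger docs — confirm the exact paths at /api/docs/):

```shell
# Placeholder key — replace with the API key generated in Settings.
API_KEY="replace-with-your-api-key"
BASE_URL="http://localhost:3001/api/v1"

# Ask the server to validate the key; prints the HTTP status code
# (200 means the key is accepted). The fallback keeps the snippet
# harmless when the server is not running.
curl -s -o /dev/null -w "%{http_code}\n" --max-time 5 \
  -H "Authorization: Bearer $API_KEY" \
  "$BASE_URL/auth" || echo "server not reachable"
```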
Building your own image allows a much higher degree of customization and more room for extension.
git clone https://github.com/Mintplex-Labs/anything-llm.git
Enter the code directory anything-llm and run:
docker build -f ./docker/Dockerfile -t anythingllm:my_1.0 .
This gives you your own image file, and from here you can modify the code as you like.
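The custom image starts with the same flags as the official one. A sketch (the container name `anythingllm-custom` is my choice, and it reuses the STORAGE_LOCATION prepared earlier):

```shell
export STORAGE_LOCATION=/var/lib/anythingllm

# Guard so the snippet is safe to paste on hosts without docker.
if command -v docker >/dev/null 2>&1; then
  docker run -d \
    --name anythingllm-custom \
    -p 3001:3001 \
    --env STORAGE_DIR="/app/server/storage" \
    -v ${STORAGE_LOCATION}:/app/server/storage \
    -v ${STORAGE_LOCATION}/.env:/app/server/.env \
    anythingllm:my_1.0
else
  echo "docker not available on this host"
fi
```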
powered by kaifamiao