Environment preparation
- Ubuntu Server 24.04
- 1Panel
- Git
- Conda
- pipx
- Poetry
- pnpm
- nvm
Install 1Panel
We use 1Panel because it automatically selects and installs a well-matched Docker + Docker Compose.
# One-click install
curl -sSL https://resource.fit2cloud.com/1panel/package/quick_start.sh -o quick_start.sh && sudo bash quick_start.sh
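Once the installer finishes, it is worth confirming that Docker and the Compose plugin really are in place (an optional check, not part of the official script):

```shell
# Verify the Docker engine and Compose plugin installed by 1Panel.
# Prints versions if present; warns instead of failing if not.
if command -v docker >/dev/null 2>&1; then
  docker --version
  docker compose version 2>/dev/null || echo "compose plugin not found"
else
  echo "docker not found - check the 1Panel installer output"
fi
```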
Install Git (skip if already installed)
sudo apt install git
Install Conda
Run the commands below; once they finish, restart your bash terminal.
mkdir -p ~/miniconda3
wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh -O ~/miniconda3/miniconda.sh
bash ~/miniconda3/miniconda.sh -b -u -p ~/miniconda3
rm -rf ~/miniconda3/miniconda.sh
# Recommended: gives your shell prompt an environment prefix
~/miniconda3/bin/conda init bash
Be sure to restart the terminal. If the prompt gains a (base) prefix as below, the installation is complete:
(base) dify_beta@difybeta:~$
Install pipx
Just install it directly; see the official guide: https://pipx.pypa.io/stable/installation/
sudo apt update
sudo apt install pipx
pipx ensurepath
Print the version; if it outputs one, the installation succeeded:
pipx --version
Install Poetry
pipx install poetry
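As with pipx, a quick version check confirms Poetry is usable. Note that `pipx ensurepath` only takes effect in a new shell; this snippet (my addition) degrades gracefully if the PATH change has not landed yet:

```shell
# Confirm Poetry is on PATH after installation via pipx.
if command -v poetry >/dev/null 2>&1; then
  poetry --version
else
  echo "poetry not on PATH yet - restart your shell or re-run: pipx ensurepath"
fi
```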
Hardware requirements
CPU: 8 cores
RAM: 32 GB
Disk: 100 GB
Download the source code
Clone the main branch directly:
git clone https://github.com/langgenius/dify.git
Start the middleware
Before starting the application services we need PostgreSQL / Redis / Weaviate running (skip this if you already have them locally). Start them with:
cd docker # enter the docker directory inside the source tree
cp middleware.env.example middleware.env # make a working copy of the middleware config
Use the Dockerized PostgreSQL database
If you only want to try Dify locally and have no need for a standalone database, just run
sudo docker compose -f docker-compose.middleware.yaml up -d
and skip the next section, "Use a remote PostgreSQL database".
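To verify that the containers started, an optional check (my addition; run it from the same docker/ directory, it degrades gracefully elsewhere):

```shell
# Optional: confirm the middleware containers are up (run inside docker/).
if command -v docker >/dev/null 2>&1 && [ -f docker-compose.middleware.yaml ]; then
  sudo docker compose -f docker-compose.middleware.yaml ps
else
  echo "skipping check: docker or docker-compose.middleware.yaml not found here"
fi
```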
Use a remote PostgreSQL database
Step 1: edit docker/docker-compose.middleware.yaml
Remove the db service block so Compose will not start its own database:
services:
  # delete this db service block
  db:
    image: postgres:15-alpine
    restart: always
    env_file:
      - ./middleware.env
    environment:
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD:-difyai123456}
      POSTGRES_DB: ${POSTGRES_DB:-dify}
      PGDATA: ${PGDATA:-/var/lib/postgresql/data/pgdata}
    command: >
      postgres -c 'max_connections=${POSTGRES_MAX_CONNECTIONS:-100}'
               -c 'shared_buffers=${POSTGRES_SHARED_BUFFERS:-128MB}'
               -c 'work_mem=${POSTGRES_WORK_MEM:-4MB}'
               -c 'maintenance_work_mem=${POSTGRES_MAINTENANCE_WORK_MEM:-64MB}'
               -c 'effective_cache_size=${POSTGRES_EFFECTIVE_CACHE_SIZE:-4096MB}'
    volumes:
      - ${PGDATA_HOST_VOLUME:-./volumes/db/data}:/var/lib/postgresql/data
    ports:
      - "${EXPOSE_POSTGRES_PORT:-5432}:5432"
    healthcheck:
      test: [ "CMD", "pg_isready" ]
      interval: 1s
      timeout: 3s
      retries: 30
Step 2: edit api/.env
Update the configuration:
# PostgreSQL database configuration
DB_USERNAME=dify
DB_PASSWORD=password
DB_HOST=192.168.9.130
DB_PORT=5432
DB_DATABASE=dify
Point these values at your remote database; the database name must be dify.
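As a quick sanity check (my addition), you can verify that every DB_* key in the file has a non-empty value; the sample file written below merely stands in for api/.env:

```shell
# Sanity-check that all DB_* keys in a dotenv file are non-empty.
# ENV_FILE defaults to a sample written here for illustration;
# point it at api/.env for the real check.
ENV_FILE="${ENV_FILE:-/tmp/dify_sample.env}"
if [ ! -f "$ENV_FILE" ]; then
  cat > "$ENV_FILE" <<'EOF'
DB_USERNAME=dify
DB_PASSWORD=password
DB_HOST=192.168.9.130
DB_PORT=5432
DB_DATABASE=dify
EOF
fi
missing=0
for key in DB_USERNAME DB_PASSWORD DB_HOST DB_PORT DB_DATABASE; do
  grep -q "^${key}=..*" "$ENV_FILE" || { echo "missing or empty: $key"; missing=1; }
done
[ "$missing" -eq 0 ] && echo "DB configuration looks complete"
```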
Once everything is updated, run
sudo docker compose -f docker-compose.middleware.yaml up -d
At this point the base environment is ready.
Copy the environment configuration file
Path: dify/api
cp .env.example .env
Generate a random key and substitute it for the SECRET_KEY value in .env:
awk -v key="$(openssl rand -base64 42)" '/^SECRET_KEY=/ {sub(/=.*/, "=" key)} 1' .env > temp_env && mv temp_env .env
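If you want to see the effect before touching the real file, the same awk pipeline can be exercised on a throwaway file (a demo I added; openssl and awk are assumed available):

```shell
# Demo of the SECRET_KEY substitution on a throwaway dotenv file.
# Mirrors the command above, but runs in /tmp instead of api/.env.
cd /tmp
printf 'SECRET_KEY=\nDB_DATABASE=dify\n' > demo.env
awk -v key="$(openssl rand -base64 42)" \
  '/^SECRET_KEY=/ {sub(/=.*/, "=" key)} 1' demo.env > temp_env && mv temp_env demo.env
grep '^SECRET_KEY=' demo.env
```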
Install the backend API dependencies
Switch to dify/api, then:
poetry env use 3.12
poetry install
Run the database migration
Migrate the database schema to the latest version:
poetry run flask db upgrade
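Optionally, Flask-Migrate can report which revision the schema is at. This check is my addition to the original steps and assumes you are in dify/api with the Poetry environment set up:

```shell
# Optional: show the current schema revision (Flask-Migrate).
# Falls back to a hint if run outside the project environment.
poetry run flask db current 2>/dev/null \
  || echo "run this from dify/api with the poetry environment installed"
```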
Start the API service
poetry run flask run --host 0.0.0.0 --port=5001 --debug
Expected output:
* Debug mode: on
INFO:werkzeug:WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead.
* Running on all addresses (0.0.0.0)
* Running on http://127.0.0.1:5001
INFO:werkzeug:Press CTRL+C to quit
INFO:werkzeug: * Restarting with stat
WARNING:werkzeug: * Debugger is active!
INFO:werkzeug: * Debugger PIN: 695-801-919
Start the Worker service
The worker consumes asynchronous queue tasks such as knowledge-base file imports and document updates. On Linux / macOS, start it with:
poetry run celery -A app.celery worker -P gevent -c 1 -Q dataset,generation,mail,ops_trace --loglevel INFO
Expected output:
-------------- celery@TAKATOST.lan v5.2.7 (dawn-chorus)
--- ***** -----
-- ******* ---- macOS-10.16-x86_64-i386-64bit 2023-07-31 12:58:08
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app: app:0x7fb568572a10
- ** ---------- .> transport: redis://:**@localhost:6379/1
- ** ---------- .> results: postgresql://postgres:**@localhost:5432/dify
- *** --- * --- .> concurrency: 1 (gevent)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
-------------- [queues]
.> dataset exchange=dataset(direct) key=dataset
.> generation exchange=generation(direct) key=generation
.> mail exchange=mail(direct) key=mail
[tasks]
. tasks.add_document_to_index_task.add_document_to_index_task
. tasks.clean_dataset_task.clean_dataset_task
. tasks.clean_document_task.clean_document_task
. tasks.clean_notion_document_task.clean_notion_document_task
. tasks.create_segment_to_index_task.create_segment_to_index_task
. tasks.deal_dataset_vector_index_task.deal_dataset_vector_index_task
. tasks.document_indexing_sync_task.document_indexing_sync_task
. tasks.document_indexing_task.document_indexing_task
. tasks.document_indexing_update_task.document_indexing_update_task
. tasks.enable_segment_to_index_task.enable_segment_to_index_task
. tasks.generate_conversation_summary_task.generate_conversation_summary_task
. tasks.mail_invite_member_task.send_invite_member_mail_task
. tasks.remove_document_from_index_task.remove_document_from_index_task
. tasks.remove_segment_from_index_task.remove_segment_from_index_task
. tasks.update_segment_index_task.update_segment_index_task
. tasks.update_segment_keyword_index_task.update_segment_keyword_index_task
[2023-07-31 12:58:08,831: INFO/MainProcess] Connected to redis://:**@localhost:6379/1
[2023-07-31 12:58:08,840: INFO/MainProcess] mingle: searching for neighbors
[2023-07-31 12:58:09,873: INFO/MainProcess] mingle: all alone
[2023-07-31 12:58:09,886: INFO/MainProcess] pidbox: Connected to redis://:**@localhost:6379/1.
[2023-07-31 12:58:09,890: INFO/MainProcess] celery@TAKATOST.lan ready.
Once everything is deployed, later startups only need the commands below.
Start the worker:
poetry run celery -A app.celery worker -P gevent -c 1 -Q dataset,generation,mail,ops_trace --loglevel INFO
Start the backend in debug mode:
poetry run flask run --host 0.0.0.0 --port=5001 --debug
Start the frontend (from the dify/web directory; install its dependencies first if you have not):
npm run dev
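For convenience, the backend startup commands can be bundled into a hypothetical launcher script (my sketch, not part of the official guide). It assumes it is run from the dify repository root and leaves the frontend to its own terminal; here it is only written out and syntax-checked, not executed:

```shell
# Write a small launcher that brings up middleware, worker, and API.
cat > start.sh <<'EOF'
#!/usr/bin/env bash
set -e
# bring up PostgreSQL / Redis / Weaviate
(cd docker && sudo docker compose -f docker-compose.middleware.yaml up -d)
cd api
# async worker in the background
poetry run celery -A app.celery worker -P gevent -c 1 \
  -Q dataset,generation,mail,ops_trace --loglevel INFO &
# API in debug mode, foreground
poetry run flask run --host 0.0.0.0 --port=5001 --debug
EOF
chmod +x start.sh
bash -n start.sh && echo "start.sh syntax OK"
```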
Other issues
dify-sandbox:0.2.10 Restarting (2) 23 seconds ago
If the sandbox container keeps restarting like this, check whether the file below is missing from your volumes directory; if so, download it from GitHub and put it in place manually, then restart the container:
https://github.com/langgenius/dify/blob/main/docker/volumes/sandbox/conf/config.yaml
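A sketch of that manual fix, assuming the repository root as working directory. The raw URL below is the raw.githubusercontent.com form of the GitHub link above; verify the path against your Dify version:

```shell
# Fetch the missing sandbox config into the expected volumes path.
mkdir -p docker/volumes/sandbox/conf
curl -fsSL -o docker/volumes/sandbox/conf/config.yaml \
  https://raw.githubusercontent.com/langgenius/dify/main/docker/volumes/sandbox/conf/config.yaml \
  || echo "download failed - fetch the file from GitHub manually"
```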