# Dify Backend API

## Usage
- Start the docker-compose stack

  The backend requires some middleware, including PostgreSQL, Redis, and Weaviate, which can be started together using `docker-compose`.

  ```bash
  cd ../docker
  docker-compose -f docker-compose.middleware.yaml -p dify up -d
  cd ../api
  ```
- Copy `.env.example` to `.env` (see the command sketch after this list).
- Generate a `SECRET_KEY` in the `.env` file.

  On Linux (GNU sed):

  ```bash
  sed -i "/^SECRET_KEY=/c\SECRET_KEY=$(openssl rand -base64 42)" .env
  ```

  On macOS (BSD sed):

  ```bash
  secret_key=$(openssl rand -base64 42)
  sed -i '' "/^SECRET_KEY=/c\\
  SECRET_KEY=${secret_key}" .env
  ```
- Create the environment.

  The Dify API service uses Poetry to manage dependencies. You can execute `poetry shell` to activate the environment (a Poetry setup sketch follows this list).

  > Instructions for using pip instead can be found below.
- Install dependencies

  ```bash
  poetry env use 3.10
  poetry install
  ```

  In case a contributor forgot to update the dependencies in `pyproject.toml`, you can run the following instead.

  ```bash
  poetry shell                                         # activate the current environment
  poetry add $(cat requirements.txt)                   # install production dependencies and update pyproject.toml
  poetry add $(cat requirements-dev.txt) --group dev   # install development dependencies and update pyproject.toml
  ```
- Run migrations

  Before the first launch, migrate the database to the latest version.

  ```bash
  poetry run python -m flask db upgrade
  ```
- Start backend

  ```bash
  poetry run python -m flask run --host 0.0.0.0 --port=5001 --debug
  ```
- Start the Dify web service (see the sketch after this list).
- Set up your application by visiting `http://localhost:3000`...
- If you need to debug local async processing, start the worker service.

  ```bash
  poetry run python -m celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail
  ```

  The started celery app handles the async tasks, e.g. dataset importing and document indexing.
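For the "Copy `.env.example` to `.env`" step above, a one-line way to do it from the `api` directory:

```bash
cp .env.example .env
```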
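For the "Create the environment" step, a minimal sketch of getting Poetry set up, assuming `pipx` is available (the `pipx` route is an assumption, not part of the original instructions; Poetry's official installer works too):

```bash
# Assumption: pipx is installed; Poetry can also be installed via its official installer
pipx install poetry

# From the api directory, activate the project's virtual environment
poetry shell
```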
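For the "Start the Dify web service" step, a hedged sketch assuming the frontend lives in a sibling `web/` directory and is a Node.js app with standard npm scripts (the path and script names are assumptions, not taken from this README):

```bash
# Assumption: the web frontend is a Node.js app in ../web with install/dev scripts
cd ../web
npm install      # install frontend dependencies
npm run dev      # start the dev server, expected at http://localhost:3000
```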
## Testing
- Install dependencies for both the backend and the test environment

  ```bash
  poetry install --with dev
  ```
- Run the tests locally with mocked system environment variables in the `tool.pytest_env` section of `pyproject.toml`

  ```bash
  cd ../
  poetry run -C api bash dev/pytest/pytest_all_tests.sh
  ```
## Usage with pip

> [!NOTE]
> In the next version, we will deprecate pip as the primary package management tool for the Dify API service; currently, Poetry and pip coexist.
- Start the docker-compose stack

  The backend requires some middleware, including PostgreSQL, Redis, and Weaviate, which can be started together using `docker-compose`.

  ```bash
  cd ../docker
  docker-compose -f docker-compose.middleware.yaml -p dify up -d
  cd ../api
  ```
- Copy `.env.example` to `.env`
- Generate a `SECRET_KEY` in the `.env` file.

  ```bash
  sed -i "/^SECRET_KEY=/c\SECRET_KEY=$(openssl rand -base64 42)" .env
  ```
- Create the environment.

  If you use Anaconda, create a new environment and activate it:

  ```bash
  conda create --name dify python=3.10
  conda activate dify
  ```
- Install dependencies

  ```bash
  pip install -r requirements.txt
  ```
- Run migrations

  Before the first launch, migrate the database to the latest version.

  ```bash
  flask db upgrade
  ```
- Start backend:

  ```bash
  flask run --host 0.0.0.0 --port=5001 --debug
  ```
- Set up your application by visiting `http://localhost:5001/console/api/setup` or other APIs... (a quick reachability check is sketched after this list)
- If you need to debug local async processing, start the worker service.

  ```bash
  celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail
  ```

  The started celery app handles the async tasks, e.g. dataset importing and document indexing.
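As a quick sanity check that the backend is up, you can hit the setup endpoint mentioned above with curl (this check is an illustration, not part of the original instructions):

```bash
# Simple reachability check against the setup endpoint mentioned above
curl -i http://localhost:5001/console/api/setup
```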