# Dify Backend API

## Usage

> [!IMPORTANT]
> In the v1.3.0 release, `poetry` has been replaced with `uv` as the package manager for the Dify API backend service.
1. Start the docker-compose stack

   The backend requires some middleware, including PostgreSQL, Redis, and Weaviate, which can be started together using `docker-compose`. (Optional sanity checks for the running stack are sketched after this list.)

   ```bash
   cd ../docker
   cp middleware.env.example middleware.env
   # change the profile to another vector database if you are not using Weaviate
   docker compose -f docker-compose.middleware.yaml --profile weaviate -p dify up -d
   cd ../api
   ```
2. Copy `.env.example` to `.env`

   ```bash
   cp .env.example .env
   ```
3. Generate a `SECRET_KEY` in the `.env` file. (A portable alternative is sketched after this list.)

   bash for Linux:

   ```bash
   sed -i "/^SECRET_KEY=/c\SECRET_KEY=$(openssl rand -base64 42)" .env
   ```

   bash for Mac:

   ```bash
   secret_key=$(openssl rand -base64 42)
   sed -i '' "/^SECRET_KEY=/c\\
   SECRET_KEY=${secret_key}" .env
   ```
4. Create the environment.

   The Dify API service uses UV to manage dependencies. First, install the uv package manager if you don't have it already.

   ```bash
   pip install uv
   # Or on macOS
   brew install uv
   ```
5. Install dependencies

   ```bash
   uv sync --dev
   ```
6. Run migrations

   Before the first launch, migrate the database to the latest version.

   ```bash
   uv run flask db upgrade
   ```
7. Start the backend

   ```bash
   uv run flask run --host 0.0.0.0 --port=5001 --debug
   ```
8. Start the Dify web service.

9. Set up your application by visiting `http://localhost:3000`.
10. If you need to handle and debug async tasks (e.g. dataset importing and document indexing), start the worker service.

    ```bash
    uv run celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail,ops_trace,app_deletion
    ```
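The `sed` commands in step 3 are platform-specific. As a portable alternative (an optional sketch, assuming Python 3 is available), you can generate an equivalent value and paste it into the `SECRET_KEY=` line of `.env` yourself:

```bash
# Print a base64-encoded 42-byte random secret, equivalent to `openssl rand -base64 42`
python3 -c "import base64, secrets; print(base64.b64encode(secrets.token_bytes(42)).decode())"
```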
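Once the steps above are done, a few optional sanity checks can confirm that each piece is running. This is a sketch rather than part of the official setup; it assumes the compose project name `dify` from step 1 and the default port 5001 from step 7.

```bash
# Middleware containers started in step 1 should be listed as running
docker compose -p dify ps

# Show the database migration revision currently applied (Flask-Migrate)
uv run flask db current

# Any HTTP response (even a 404) means the API server from step 7 is reachable
curl -i http://localhost:5001/

# Ask the Celery worker from step 10 to respond to a ping
uv run celery -A app.celery inspect ping
```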
## Testing

1. Install dependencies for both the backend and the test environment

   ```bash
   uv sync --dev
   ```
2. Run the tests locally with mocked system environment variables in the `tool.pytest_env` section of `pyproject.toml`

   ```bash
   uv run -P api bash dev/pytest/pytest_all_tests.sh
   ```
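To iterate on one area instead of the whole suite, plain `pytest` selectors also work. The paths below are assumptions about the local layout under `tests/`; adjust them to the suite you are targeting.

```bash
# Run only the unit tests (path assumed; adjust to your checkout)
uv run pytest tests/unit_tests

# Or select tests by keyword expression
uv run pytest -k "workflow"
```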