Improves the robustness of the workflow engine's streaming output by fixing two core issues that caused streaming to fail in complex topologies where multiple conditional branches merge.
**1. Corrected Runtime State Management ("Pruning"):**
The primary bug was located in the `_remove_unreachable_nodes` method. Its aggressive recursive "pruning" algorithm incorrectly removed shared downstream nodes (including LLM and Answer) when handling conditional branches that led to a join point. This prematurely emptied the `rest_node_ids` list, causing the stream processor to fail its initial state check.
The fix replaces the recursive logic with a more conservative, non-recursive approach that only prunes the immediate first node of an unreachable branch. This ensures the integrity of the `rest_node_ids` list throughout the workflow execution.
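The conservative approach can be sketched as follows. This is a hypothetical illustration, not Dify's actual code: the function name, signature, and node ids are made up to show the principle that only the immediate head of the untaken branch is removed, while shared downstream nodes stay scheduled.

```python
# Hypothetical sketch (illustrative names, not the actual Dify internals):
# a conservative, non-recursive prune that removes only the immediate
# first node of the untaken branch, leaving shared downstream nodes
# (e.g. the LLM and Answer join nodes) in rest_node_ids.

def prune_unreachable(rest_node_ids: set[str], untaken_branch_head: str) -> set[str]:
    """Drop only the head node of the branch that was not taken.

    Downstream nodes may be join points shared with the taken branch,
    so they must remain in rest_node_ids until they actually run.
    """
    pruned = set(rest_node_ids)
    pruned.discard(untaken_branch_head)
    return pruned

# Diamond topology: If/Else -> (a | b) -> llm -> answer.
# Branch "b" is not taken; only "b" is pruned, the shared join survives.
remaining = prune_unreachable({"a", "b", "llm", "answer"}, "b")
print(remaining)  # "llm" and "answer" are still scheduled
```

With the old recursive pruning, discarding "b" would have cascaded into "llm" and "answer" as well, emptying the list the stream processor checks.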
**2. Improved Static Dependency Analysis:**
A secondary, underlying issue was found in the static dependency analysis (`_recursive_fetch_answer_dependencies`). It incorrectly identified all upstream, mutually exclusive `If/Else` nodes as parallel dependencies of the Answer node.
The fix enhances this analysis by adding "join point awareness". The upward trace now stops when it encounters a node with more than one incoming edge, correctly identifying the join point itself as the dependency rather than its upstream branches.
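The join-point-aware trace can be sketched like this. Again a hypothetical illustration (the function name and graph representation are assumptions, not Dify's actual `_recursive_fetch_answer_dependencies`): walk upstream from the Answer node, but never trace past a node with more than one incoming edge.

```python
# Hypothetical sketch of "join point awareness": trace upstream from the
# Answer node; a node with more than one incoming edge is a join point
# where mutually exclusive branches merge, so record the join itself as
# the dependency and stop tracing through it.

def fetch_answer_dependencies(answer_id: str, in_edges: dict[str, list[str]]) -> set[str]:
    deps: set[str] = set()
    stack = list(in_edges.get(answer_id, []))
    while stack:
        node_id = stack.pop()
        if node_id in deps:
            continue  # already visited
        deps.add(node_id)
        if len(in_edges.get(node_id, [])) > 1:
            # Join point: exclusive branches merge here; do not walk
            # up into the If/Else nodes above it.
            continue
        stack.extend(in_edges.get(node_id, []))
    return deps

# if_1 and if_2 both feed "join", which feeds "answer": only the join
# point is reported, never the exclusive branches upstream of it.
in_edges = {"answer": ["join"], "join": ["if_1", "if_2"]}
print(fetch_answer_dependencies("answer", in_edges))  # {'join'}
```

Without the stopping condition, the trace would continue through "join" and report `if_1` and `if_2` as parallel dependencies, even though only one of them can ever run.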
Together, these changes ensure that streaming output remains reliable and predictable, even in complex workflows with reusable, multi-input nodes.
# Dify Backend API

## Usage
> [!IMPORTANT]
> In the v1.3.0 release, `poetry` has been replaced with `uv` as the package manager for Dify API backend service.
1. Start the docker-compose stack

   The backend requires some middleware, including PostgreSQL, Redis, and Weaviate, which can be started together using `docker-compose`.

   ```bash
   cd ../docker
   cp middleware.env.example middleware.env
   # change the profile to another vector database if you are not using weaviate
   docker compose -f docker-compose.middleware.yaml --profile weaviate -p dify up -d
   cd ../api
   ```
2. Copy `.env.example` to `.env`

   ```bash
   cp .env.example .env
   ```
3. Generate a `SECRET_KEY` in the `.env` file.

   bash for Linux

   ```bash
   sed -i "/^SECRET_KEY=/c\SECRET_KEY=$(openssl rand -base64 42)" .env
   ```

   bash for Mac

   ```bash
   secret_key=$(openssl rand -base64 42)
   sed -i '' "/^SECRET_KEY=/c\\
   SECRET_KEY=${secret_key}" .env
   ```
4. Create environment.

   Dify API service uses UV to manage dependencies. First, you need to add the uv package manager, if you don't have it already.

   ```bash
   pip install uv
   # Or on macOS
   brew install uv
   ```
5. Install dependencies

   ```bash
   uv sync --dev
   ```
6. Run migrate

   Before the first launch, migrate the database to the latest version.

   ```bash
   uv run flask db upgrade
   ```
7. Start backend

   ```bash
   uv run flask run --host 0.0.0.0 --port=5001 --debug
   ```
8. Start Dify web service.

9. Setup your application by visiting `http://localhost:3000`.

10. If you need to handle and debug the async tasks (e.g. dataset importing and documents indexing), please start the worker service.

    ```bash
    uv run celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail,ops_trace,app_deletion
    ```
## Testing

1. Install dependencies for both the backend and the test environment

   ```bash
   uv sync --dev
   ```

2. Run the tests locally with mocked system environment variables in `tool.pytest_env` section in `pyproject.toml`

   ```bash
   uv run -P api bash dev/pytest/pytest_all_tests.sh
   ```