xuzijie1995 db91643915 refactor(workflow): Rearchitect stream dependency logic for complex graphs
This commit addresses a critical issue where streaming output would fail in workflows with complex topologies, particularly those involving multiple conditional branches (if/else) that converge on a common node before the LLM and Answer nodes.

The root cause was twofold:
1. A bug in the branch pruning logic that would incorrectly remove shared downstream nodes, leading to the tracked dependency list being emptied prematurely.
2. A flawed static dependency analysis that could not correctly resolve dependencies for nodes belonging to multiple, mutually exclusive execution paths.
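As a hedged illustration of why a purely static analysis breaks here (all node names below are hypothetical, not taken from the Dify codebase), consider a minimal diamond-shaped graph:

```python
# An if/else splits into two mutually exclusive branches that converge
# on a shared node before the LLM and Answer nodes.
edges = {
    "start": ["if_else"],
    "if_else": ["branch_a", "branch_b"],  # only ONE branch runs at runtime
    "branch_a": ["join"],
    "branch_b": ["join"],
    "join": ["llm"],
    "llm": ["answer"],
}

def ancestors(edges, target):
    """Return every node from which `target` is reachable."""
    reverse = {}
    for src, dsts in edges.items():
        for dst in dsts:
            reverse.setdefault(dst, []).append(src)
    seen, stack = set(), [target]
    while stack:
        for parent in reverse.get(stack.pop(), []):
            if parent not in seen:
                seen.add(parent)
                stack.append(parent)
    return seen

# "join" sits downstream of BOTH exclusive branches, so a static analysis
# that treats every ancestor as a hard dependency can never be satisfied:
# at runtime only one of branch_a / branch_b ever completes.
assert ancestors(edges, "join") == {"start", "if_else", "branch_a", "branch_b"}
```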

This refactor introduces a new, robust architecture for streaming dependency management based on the principle of "Static Pre-pruning + Dynamic Adjudication":

- **Fix**: The branch pruning logic is now non-recursive and conservative: it prunes only the immediate first node of an unreachable branch, preserving the integrity of shared downstream paths and join points.
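The difference between the old recursive pruning and the new conservative pruning can be sketched as follows; the function and node names are illustrative, not the actual implementation:

```python
def prune_recursive(edges, branch_head):
    """Old, buggy behavior: delete the whole subtree, including shared joins."""
    removed, stack = set(), [branch_head]
    while stack:
        node = stack.pop()
        if node in removed:
            continue
        removed.add(node)
        stack.extend(edges.get(node, []))
    return removed

def prune_conservative(edges, branch_head):
    """New behavior: prune only the unreachable branch's first node."""
    return {branch_head}

edges = {
    "branch_a": ["join"],
    "branch_b": ["join"],
    "join": ["llm"],
    "llm": ["answer"],
}

# Recursive pruning of the untaken branch also removes "join", "llm", and
# "answer" -- nodes shared with the branch that DID run -- which is what
# broke streaming. Conservative pruning leaves the join point intact.
assert prune_recursive(edges, "branch_b") == {"branch_b", "join", "llm", "answer"}
assert prune_conservative(edges, "branch_b") == {"branch_b"}
```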

- **Refactor**: The old static dependency analysis has been removed entirely, including the corresponding entity attribute and the recursive dependency-fetching methods associated with it.

- **Feat**: A new dynamic-adjudication method performs a real-time backward traversal of the graph from the streaming node, querying the runtime execution state to verify that the *actual* dependency path has completed successfully. This ensures that streaming decisions are based on the ground truth of the current execution, not a flawed static prediction.
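The dynamic-adjudication idea can be sketched as a backward breadth-first search over completed nodes; `is_stream_ready`, the node names, and the `completed` set are hypothetical stand-ins for the real runtime state:

```python
from collections import deque

def is_stream_ready(node, reverse_edges, completed, source="start"):
    """True if some chain of COMPLETED predecessors links `node` to `source`."""
    queue, seen = deque([node]), {node}
    while queue:
        current = queue.popleft()
        if current == source:
            return True
        for parent in reverse_edges.get(current, []):
            # Only follow edges whose upstream node actually finished running.
            if parent in completed and parent not in seen:
                seen.add(parent)
                queue.append(parent)
    return False

reverse_edges = {
    "answer": ["llm"], "llm": ["join"],
    "join": ["branch_a", "branch_b"],
    "branch_a": ["if_else"], "branch_b": ["if_else"],
    "if_else": ["start"],
}

# Only branch_a ran; branch_b was skipped. The Answer node can still
# stream, because one real, fully completed path exists.
completed = {"start", "if_else", "branch_a", "join", "llm"}
assert is_stream_ready("answer", reverse_edges, completed)
# If neither branch completed, streaming must wait.
assert not is_stream_ready("answer", reverse_edges, {"llm", "join"})
```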

- **Doc**: Added comprehensive docstrings and comments to the modified components to explain the new architecture and the rationale behind the changes.
10 months ago
.idea
.vscode
configs enabling vector index prefix name via configuration files (#22661) 10 months ago
constants Support OAuth Integration for Plugin Tools (#22550) 10 months ago
contexts fix: Copy request context and current user in app generators. (#20240) 12 months ago
controllers chore: code improvement for mcp_client and mcp_tools_manage_service (#22645) 10 months ago
core refactor(workflow): Rearchitect stream dependency logic for complex graphs 10 months ago
docker add MAX_TASK_PRE_CHILD for celery (#18985) 1 year ago
events Fix/replace datetime patterns with naive utc now (#22654) 10 months ago
extensions Fix/replace datetime patterns with naive utc now (#22654) 10 months ago
factories refactor(api): Separate SegmentType for Integer/Float to Enable Pydantic Serialization (#22025) 10 months ago
fields feat(app): support custom max_active_requests per app (#22073) 10 months ago
libs Fix/replace datetime patterns with naive utc now (#22654) 10 months ago
migrations Increased the character limitation (#22679) 10 months ago
models Increased the character limitation (#22679) 10 months ago
repositories fix: create api workflow run repository error (#22422) 10 months ago
schedule Feat/queue monitor (#20647) 11 months ago
services chore: code improvement for mcp_client and mcp_tools_manage_service (#22645) 10 months ago
tasks Fix/replace datetime patterns with naive utc now (#22654) 10 months ago
templates minor typo fix: remove debug code and fix typo (#22539) 10 months ago
tests fix: resolve Redis mock import error in test configuration (#22663) 10 months ago
.dockerignore Enhance Code Consistency Across Repository with `.editorconfig` (#19023) 1 year ago
.env.example enabling vector index prefix name via configuration files (#22661) 10 months ago
.ruff.toml feat: Persist Variables for Enhanced Debugging Workflow (#20699) 11 months ago
Dockerfile chore: bump uv to 0.7.x (#20692) 11 months ago
README.md chore: required pip and performance improvment in mypy checks (#19225) 1 year ago
app.py chore: avoid repeated type ignore noqa by adding flask_restful and flask_login in mypy import exclusions (#19224) 1 year ago
app_factory.py feat: add debug log for request and response (#19781) (#19783) 12 months ago
commands.py Support OAuth Integration for Plugin Tools (#22550) 10 months ago
dify_app.py refactor: assembling the app features in modular way (#9129) 1 year ago
mypy.ini Feat/support sendgrid (#21011) 11 months ago
pyproject.toml chore: bump ruff to 0.12.x (#22259) 10 months ago
pytest.ini Refactor/remove db from cycle manager (#20455) 11 months ago
uv.lock chore: bump ruff to 0.12.x (#22259) 10 months ago

README.md

Dify Backend API

Usage

> [!IMPORTANT]
>
> In the v1.3.0 release, poetry has been replaced with uv as the package manager for the Dify API backend service.

  1. Start the docker-compose stack

    The backend requires some middleware, including PostgreSQL, Redis, and Weaviate, which can be started together using docker-compose.

    cd ../docker
    cp middleware.env.example middleware.env
    # change the profile to another vector database if you are not using weaviate
    docker compose -f docker-compose.middleware.yaml --profile weaviate -p dify up -d
    cd ../api
    
  2. Copy .env.example to .env

    cp .env.example .env 
    
  3. Generate a SECRET_KEY in the .env file.

    For Linux (bash):

    sed -i "/^SECRET_KEY=/c\SECRET_KEY=$(openssl rand -base64 42)" .env
    

    For macOS (bash):

    secret_key=$(openssl rand -base64 42)
    sed -i '' "/^SECRET_KEY=/c\\
    SECRET_KEY=${secret_key}" .env
    
  4. Create the environment.

    The Dify API service uses uv to manage dependencies. First, install the uv package manager if you don't have it already.

    pip install uv
    # Or on macOS
    brew install uv
    
  5. Install dependencies

    uv sync --dev
    
  6. Run database migrations

    Before the first launch, migrate the database to the latest version.

    uv run flask db upgrade
    
  7. Start backend

    uv run flask run --host 0.0.0.0 --port=5001 --debug
    
  8. Start the Dify web service.

  9. Set up your application by visiting http://localhost:3000.

  10. If you need to handle and debug async tasks (e.g., dataset importing and document indexing), please start the worker service.

    uv run celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail,ops_trace,app_deletion

Testing

  1. Install dependencies for both the backend and the test environment

    uv sync --dev
    
  2. Run the tests locally with the mocked system environment variables defined in the tool.pytest_env section of pyproject.toml

    uv run -P api bash dev/pytest/pytest_all_tests.sh