Remove app_id parameter from three endpoints and update trace manager to use
tenant_id as storage identifier when app_id is unavailable. This allows
standalone prompt generation utilities to emit telemetry.
Changes:
- controllers/console/app/generator.py: Remove app_id=None from 3 endpoints
  (RuleGenerateApi, RuleCodeGenerateApi, RuleStructuredOutputGenerateApi)
- core/ops/ops_trace_manager.py: Use tenant_id fallback in send_to_celery
  - Extract tenant_id from task.kwargs when app_id is None
  - Use 'tenant-{tenant_id}' format as storage identifier
  - Skip traces only if neither app_id nor tenant_id available
The trace metadata still contains the actual tenant_id, so enterprise
telemetry correctly emits metrics and logs grouped by tenant.
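For reference, here is a minimal sketch of the fallback described above, written as a standalone helper. The function name and the exact shape of the task kwargs are assumptions for illustration only; the real logic lives inside send_to_celery in core/ops/ops_trace_manager.py.

```python
from typing import Any


def resolve_storage_id(app_id: str | None, task_kwargs: dict[str, Any]) -> str | None:
    """Pick the identifier a trace is stored under (illustrative helper only).

    Returns None when the trace should be skipped entirely.
    """
    if app_id:
        return app_id
    tenant_id = task_kwargs.get("tenant_id")
    if tenant_id:
        # App-less callers (e.g. standalone prompt generation) are grouped per tenant.
        return f"tenant-{tenant_id}"
    # Neither app_id nor tenant_id is available: skip the trace.
    return None
```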
# Dify Backend API

## Usage

> [!IMPORTANT]
> In the v1.3.0 release, `poetry` has been replaced with `uv` as the package manager for the Dify API backend service.
- Start the docker-compose stack

  The backend requires some middleware, including PostgreSQL, Redis, and Weaviate, which can be started together using `docker-compose`.

  ```bash
  cd ../docker
  cp middleware.env.example middleware.env
  # change the profile to mysql if you are not using postgres; change the profile to another vector database if you are not using weaviate
  docker compose -f docker-compose.middleware.yaml --profile postgresql --profile weaviate -p dify up -d
  cd ../api
  ```
- Copy `.env.example` to `.env`

  ```bash
  cp .env.example .env
  ```

  > [!IMPORTANT]
  > When the frontend and backend run on different subdomains, set `COOKIE_DOMAIN` to the site's top-level domain (e.g., `example.com`). The frontend and backend must be under the same top-level domain in order to share authentication cookies.
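  For illustration only (the hostnames are assumptions), a deployment serving the frontend from app.example.com and the API from api.example.com would set:

  ```bash
  # .env — example value; use your own top-level domain
  COOKIE_DOMAIN=example.com
  ```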
- Generate a `SECRET_KEY` in the `.env` file.

  bash for Linux

  ```bash
  sed -i "/^SECRET_KEY=/c\SECRET_KEY=$(openssl rand -base64 42)" .env
  ```

  bash for Mac

  ```bash
  secret_key=$(openssl rand -base64 42)
  sed -i '' "/^SECRET_KEY=/c\\
  SECRET_KEY=${secret_key}" .env
  ```
- Create environment.

  Dify API service uses UV to manage dependencies. First, install the uv package manager if you don't already have it.

  ```bash
  pip install uv
  # Or on macOS
  brew install uv
  ```
- Install dependencies

  ```bash
  uv sync --dev
  ```
- Run migrations

  Before the first launch, migrate the database to the latest version.

  ```bash
  uv run flask db upgrade
  ```
- Start backend

  ```bash
  uv run flask run --host 0.0.0.0 --port=5001 --debug
  ```
- Start the Dify web service.
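  The web service has its own README with the authoritative steps; as a rough sketch only (assuming a pnpm-based setup, which may differ in your version):

  ```bash
  cd ../web
  pnpm install
  pnpm dev   # serves the frontend on http://localhost:3000
  ```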
- Set up your application by visiting http://localhost:3000.
- If you need to handle and debug the async tasks (e.g. dataset importing and document indexing), please start the worker service.

  ```bash
  uv run celery -A app.celery worker -P threads -c 2 --loglevel INFO -Q dataset,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor,retention
  ```
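  While debugging a single feature you usually do not need every queue; for example (queue names taken from the full command above), to work only on dataset import and indexing you could listen on a subset:

  ```bash
  uv run celery -A app.celery worker -P threads -c 2 --loglevel INFO -Q dataset,priority_dataset
  ```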
  Additionally, if you want to debug the celery scheduled tasks, you can run the following command in another terminal to start the beat service:

  ```bash
  uv run celery -A app.celery beat
  ```
## Testing
- Install dependencies for both the backend and the test environment

  ```bash
  uv sync --dev
  ```
- Run the tests locally with mocked system environment variables set in the `tool.pytest_env` section of `pyproject.toml`; see `Claude.md` for more details.

  ```bash
  uv run pytest                           # Run all tests
  uv run pytest tests/unit_tests/         # Unit tests only
  uv run pytest tests/integration_tests/  # Integration tests

  # Code quality
  ../dev/reformat                   # Run all formatters and linters
  uv run ruff check --fix ./        # Fix linting issues
  uv run ruff format ./             # Format code
  uv run basedpyright .             # Type checking
  ```
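  To iterate on one area, the standard pytest selection flags also work (the file path below is a placeholder):

  ```bash
  uv run pytest tests/unit_tests/some_module/test_example.py   # a single file
  uv run pytest -k "keyword"                                    # tests matching a keyword
  ```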