Explicit re-raise list at L230-238 only covered UnprocessableEntity +
NotChatAppError/NotWorkflowAppError. Other HTTPException subclasses
raised inside handlers (NotFound, BadRequest, ConversationCompletedError,
ProviderNotInitializeError, ProviderQuotaExceededError, ...) hit
`except Exception` and got squashed to 500. Replace with
`except HTTPException: raise`.
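The fixed ladder, sketched here with stdlib stand-ins for the werkzeug exception classes (names illustrative):

```python
class HTTPException(Exception):
    """Stand-in for werkzeug.exceptions.HTTPException."""
    code = 500

class NotFound(HTTPException):
    code = 404

class InternalServerError(HTTPException):
    code = 500

def run_handler(handler):
    # Re-raise any HTTPException subclass before the catch-all, so
    # NotFound / UnprocessableEntity / domain errors keep their own
    # status codes and only truly unexpected failures become 500.
    try:
        return handler()
    except HTTPException:
        raise
    except Exception as exc:
        raise InternalServerError() from exc
```

Ordering matters: the `except HTTPException: raise` arm must precede the catch-all, since HTTPException is itself an Exception subclass.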
Refactor bundled: collapse 3x try/except ladder into
_translate_service_errors() ctxmgr, inline single-call constraint
enforcers, drop wasted dict() copy in _unpack_blocking, trim module
docstring and stale spec doc reference. -60 net lines.
Restores two assertions lost when the legacy per-mode unit tests
were deleted in api-3 Task 4:
- invoke_from == InvokeFrom.OPENAPI on the unified runner
- body-side user field is stripped before reaching the generator
(Model 2: bearer is identity, body cannot spoof user)
Both run as part of test_run_chat_dispatches_to_chat_handler;
no new tests added.
Removes /openapi/v1/apps/<id>/{chat-messages,completion-messages,
workflows/run}. Bearer surface for runs is now the unified /run route
(api-3). Service-API /v1/* per-mode routes (app-key auth) untouched.
Also deletes the corresponding unit test files
(test_chat_messages.py, test_completion_messages.py, test_workflow_run.py)
which targeted the removed handlers; coverage of the unified route lives
in tests/unit_tests/controllers/openapi/test_app_run_dispatch.py and
tests/integration_tests/controllers/openapi/test_app_run.py.
- Drop `except ValueError: raise`. Inherited from per-mode controllers
  without examining purpose; today it converts helper-internal
  ValueErrors into uncaptured 500s with no body or log. Falling
  through to `except Exception` gives them a logged trace and a
  structured InternalServerError.

- Drop redundant AppMode.value_of(app_model.mode). App.mode is
Mapped[AppMode] with an EnumText adapter that returns the enum
directly; value_of was a no-op iteration.
- Comment the explicit re-raise block to spell out why ordering
matters before the catch-all.
Single bearer-accepting run route on the openapi namespace. Server
reads apps.mode after AppResolver and dispatches via the _DISPATCH
table to the per-mode helper. Per-mode constraints enforced inside
helpers (422). Service-API /v1/* per-mode routes untouched.
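The dispatch shape might look like the following sketch (mode set trimmed; the real route raises a 422 for unsupported modes):

```python
from enum import Enum

class AppMode(Enum):
    """Trimmed stand-in for models.AppMode."""
    CHAT = "chat"
    COMPLETION = "completion"
    WORKFLOW = "workflow"

def _run_chat(app, body):
    return {"mode": "chat"}

def _run_completion(app, body):
    return {"mode": "completion"}

def _run_workflow(app, body):
    return {"mode": "workflow"}

_DISPATCH = {
    AppMode.CHAT: _run_chat,
    AppMode.COMPLETION: _run_completion,
    AppMode.WORKFLOW: _run_workflow,
}

def run(app, body):
    handler = _DISPATCH.get(app.mode)
    if handler is None:
        raise ValueError("unsupported app mode")  # 422 in the real route
    return handler(app, body)
```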
Also fixes a pre-existing latent bug in the openapi integration
fixtures: App() rows were constructed without enable_site, which
DB INSERT rejected (column is NOT NULL with no default). Now set
enable_site=True alongside enable_api=True in the three fixtures
that construct App() rows.
Lays the foundation for the unified /openapi/v1/apps/<id>/run route
without yet registering it. Helpers preserve the per-mode exception
fans + response shapes byte-for-byte from the existing chat-messages /
completion-messages / workflows-run controllers.
- WorkflowRunResponse.mode: Literal["workflow"] (was str) — only one
valid value, so Literal makes the contract explicit.
- WorkflowRunData.outputs: Field(default_factory=dict) — matches the
sibling metadata field's idiom; avoids the mutable-literal-default
smell flagged in code review.
- Extends test_response_models_dump_per_mode with an explicit assertion
on the WorkflowRunResponse.mode echo + exercises CompletionMessageResponse
(was imported-but-unused).
Prepares for the api-3 unified /run route which imports all three
response shapes from one location. The per-mode files (chat_messages,
completion_messages, workflow_run) still define their own copies inline
until Task 4 deletes them — no collision since neither location is
imported anywhere else.
services/enterprise/app_permitted_service.list_permitted_apps calls
EnterpriseService.WebAppAuth.list_externally_accessible_apps and decodes
the camelCase wrapper response into a PermittedAppsPage carrying just
app_ids. The controller hydrates name/mode/tenant/etc. from local
App/Tenant rows.
The EE response shape is {data: [appId], total, hasMore} per ee-2. EE owns
access control only; dify/api owns app data, so the older inner-API
metadata fanout (/inner/api/enterprise/apps/batch-metadata) is removed.
- delete controllers/inner_api/app/metadata.py + its test
- service: ServiceUnavailable on EnterpriseAPIError; 5s timeout via wrapper
- controller: drop fail-fast subject check + unused g/InternalServerError
imports
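The decode step can be sketched like this (a dataclass stand-in; the real PermittedAppsPage is presumably a Pydantic model):

```python
from dataclasses import dataclass

@dataclass
class PermittedAppsPage:
    app_ids: list[str]
    total: int
    has_more: bool

def decode_permitted_apps(payload: dict) -> PermittedAppsPage:
    # EE inner-API returns {"data": [appId, ...], "total": n, "hasMore": b};
    # only the ids cross the boundary, app metadata is hydrated locally.
    return PermittedAppsPage(
        app_ids=list(payload.get("data", [])),
        total=int(payload["total"]),
        has_more=bool(payload["hasMore"]),
    )
```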
Collapse the openapi-namespace per-app reads into one canonical endpoint
GET /openapi/v1/apps/<id>/describe[?fields=info,parameters,input_schema]
returning a single AppDescribeResponse with all blocks Optional and a new
JSON-Schema input_schema block derived server-side from user_input_form +
app mode.
- AppDescribeQuery (Pydantic, extra=forbid) parses the ?fields allow-list;
unknown member -> 422.
- _input_schema.build_input_schema(app) derives Draft 2020-12 JSON Schema:
chat-family modes carry top-level query (string, minLength=1, required);
workflow / completion only carry inputs. AppUnavailableError -> empty
sentinel (EMPTY_INPUT_SCHEMA).
- Drop AppByIdApi (/apps/<id>) and AppParametersApi (/apps/<id>/parameters)
route classes; delete app_info.py module + app_info_payload helper.
- AppDescribeResponse.{info,parameters,input_schema} are now Optional,
  defaulting to None.
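A sketch of the derivation described above (mode names and the form-to-type mapping are simplified assumptions, not the exact production logic):

```python
def build_input_schema(mode: str, user_input_form: list[dict]) -> dict:
    # Draft 2020-12 skeleton: chat-family modes require a top-level
    # query string; workflow / completion expose only form-derived inputs.
    properties: dict = {}
    required: list[str] = []
    if mode in ("chat", "advanced-chat", "agent-chat"):
        properties["query"] = {"type": "string", "minLength": 1}
        required.append("query")
    inputs_props = {
        item["variable"]: {"type": "string"}  # form->type mapping simplified
        for item in user_input_form
    }
    properties["inputs"] = {"type": "object", "properties": inputs_props}
    return {
        "$schema": "https://json-schema.org/draft/2020-12/schema",
        "type": "object",
        "properties": properties,
        "required": required,
    }
```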
Lock-step deploy with difyctl Phase B (/describe consumer migration).
- fail-fast on missing subject_email/subject_issuer for dfoe_
bearers (was silently coercing None -> empty string and sending
a malformed query upstream).
- document the has_more contract: total comes from EE inner-API
unfiltered count; locally-dropped archived rows leave
len(items) < limit even when has_more=True.
- gate tenant-name lookup in /apps on non-empty rows so empty
filter results skip the wasted scalar query.
- rename AppListPermittedApi -> AppPermittedListApi for word-order
consistency with AppPermittedListQuery.
- tests: positive mode acceptance and explicit dfoa_ non-carrier
assertion.
Split route for dfoe_ external-SSO discovery, separate from /apps
(dfoa_-only workspace catalog). Cross-tenant allow-list query: server
calls Enterprise inner-API POST /inner/api/webapp/permitted-apps and
hydrates app/tenant rows locally. New scope apps:read:permitted (no
dual-meaning with apps:read). Route gated by @enterprise_only — 404
on CE — and validate_bearer(accept=ACCEPT_USER_EXT_SSO) — 403 on dfoa_.
Query validator rejects workspace_id and tag (cross-tenant
unresolvable); mode/name supported.
EE inner-API wire-up depends on ee-2; the service-layer stub raises
ServiceUnavailable until that endpoint ships. CLI dispatches between
/apps and /apps/permitted client-side based on the bearer prefix in
hosts.yml — see docs/specs/v1.0/apps.md §Subject dispatch.
Verified via unit tests on AppPermittedListQuery and Scope wiring;
HTTP integration tests deferred to ee-2 once the inner-API ships.
app_info_payload, parameters_payload, _EMPTY_PARAMETERS are CLI
contracts. Direct unit tests pin their shape independent of DB +
HTTP plumbing — drift in the helpers fails fast at unit-test time
instead of leaking through into integration runs.
Drops the magic 200 in favor of the shared constant introduced in
Task 1's _models.py rewrite. Behavior unchanged (still caps at 200).
Sibling endpoint /apps already wired the constant through AppListQuery
in Task 3; this closes the loop on the single-source-of-truth goal.
ValidationError -> UnprocessableEntity(exc.json()) so CLI consumers
can parse the error body. The previous str(errors()) produced a
Python repr (single-quoted dicts), not JSON. Also align with
sibling openapi controllers: request.args.to_dict(flat=True)
and 'as exc' naming.
Test cleanup: hoist module-scope imports; add a happy-path
positive case covering every field.
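A stdlib demonstration of why the old body was unusable (the error list is hand-built here to mimic what pydantic's exc.errors() returns):

```python
import json

# repr of a Python list of dicts uses single quotes and tuples: not JSON.
errors = [{"loc": ("query", "page"), "msg": "value is not a valid integer"}]

repr_body = str(errors)
try:
    json.loads(repr_body)
    parseable = True
except json.JSONDecodeError:
    parseable = False
assert not parseable  # CLI consumers could not parse the old body

# exc.json() emits real JSON, equivalent to dumping normalized errors:
json_body = json.dumps([{"loc": list(e["loc"]), "msg": e["msg"]} for e in errors])
assert json.loads(json_body)[0]["msg"] == "value is not a valid integer"
```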
Replaces ad-hoc int(request.args.get(...)) parsing in AppListApi.get
with a typed Pydantic query model. Bad inputs (page=abc, limit=-1,
limit=500, mode=invalid, missing workspace_id) raise ValidationError
which the handler converts to 422 with field-level error detail
instead of 500 / silent empty page. Closes the mode whitelist via
AppMode enum.
Verified via direct unit tests on AppListQuery (no HTTP integration
tests required since the model carries the validation contract).
- Correct docstring: Flask-RESTX iterates method_decorators forward;
the last entry becomes outermost via composition, not via framework
reversal.
- Extract shared _APPS_READ_DECORATORS constant; was duplicated
verbatim between AppReadResource and AppListApi.
- Rename _AppReadResource -> AppReadResource (no longer module-private
since app_info.py imports it). Drops the pyright ignore.
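The forward-iteration claim can be modeled in pure Python (this is a toy compose, not flask_restx itself):

```python
def compose(decorators, fn):
    # Flask-RESTX applies method_decorators by iterating the list
    # forward, wrapping each time, so the LAST entry ends up outermost.
    for deco in decorators:
        fn = deco(fn)
    return fn

calls = []

def make(name):
    def deco(fn):
        def wrapper():
            calls.append(name)
            return fn()
        return wrapper
    return deco

view = compose([make("require_scope"), make("validate_bearer")], lambda: "ok")
view()
# validate_bearer (last in the list) is the outermost wrapper, so it runs first
```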
Four per-app GETs (/apps/<id>, /info, /parameters, /describe) repeated
the same SSO-guard / app-load / membership-check pattern. Hoist into
_AppReadResource with method_decorators=[require_scope, validate_bearer]
plus _load(app_id) -> (App, AuthContext). Subclasses now 3-line bodies.
Eliminates the per-method # type: ignore[reportUntypedFunctionDecorator]
suppression by relocating the decorator chain to the class attribute.
Endpoints now build typed AppInfoResponse / AppDescribeResponse and
.model_dump() at the boundary.
The previous test asserted only that model_fields exposed the
expected names — the legacy Generic[T] form would have passed
identically. Switch to __type_params__, which is non-empty only
under PEP 695 native syntax.
- Shared conftest at tests/integration_tests/controllers/openapi/:
workspace_account, app_in_workspace, mint_token (factory + tracked
cleanup of OAuthAccessToken rows), account_token convenience fixture,
autouse disable_enterprise (default ENTERPRISE_ENABLED=False; tests
needing the EE branch override in-test), autouse _flush_auth_redis.
- test_auth.py covers Layer 0 + per-token rate limit + scope on /info.
other_workspace_app fixture is a generator that cleans up the second
tenant + app on teardown.
- test_apps.py covers the read surface: /apps list with pagination
envelope, /apps/<id>, /info, /parameters, /describe, /account/sessions
envelope migration, plus dfoe_ scope rejection on apps:read routes.
Read-side surface for difyctl describe / get / list:
- GET /openapi/v1/apps paginated list (workspace_id required)
- GET /openapi/v1/apps/<id> single app summary
- GET /openapi/v1/apps/<id>/parameters port of service_api parameters
- GET /openapi/v1/apps/<id>/describe merged { info, parameters }
All gated by validate_bearer(ACCEPT_USER_ANY) + require_scope(APPS_READ) +
require_workspace_member(ctx, tenant_id). SSO subjects 404 (account-only
helper account_or_404 deduplicates the guard across the four endpoints).
PaginationEnvelope[T] (page, limit, total, has_more, data) is the canonical
shape for every /openapi/v1/* list endpoint. has_more is computed by the
server from page * limit < total. /account/sessions migrates from the
legacy { sessions: [...] } shape to the envelope; integration tests assert
the legacy key is gone.
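A dataclass sketch pinning the has_more arithmetic (the real envelope is presumably a Pydantic generic):

```python
from dataclasses import dataclass, field
from typing import Generic, TypeVar

T = TypeVar("T")

@dataclass
class PaginationEnvelope(Generic[T]):
    page: int
    limit: int
    total: int
    data: list[T] = field(default_factory=list)

    @property
    def has_more(self) -> bool:
        # server-computed: more rows exist beyond the current page
        return self.page * self.limit < self.total
```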
Bearer auth surface for /openapi/v1/* run-routes:
- OAUTH_BEARER_PIPELINE (renamed from APP_PIPELINE for clarity outside this
module) composes BearerCheck → ScopeCheck → AppResolver →
WorkspaceMembershipCheck → AppAuthzCheck → CallerMount.
- BearerAuthenticator.authenticate() is the single source of identity +
rate-limit. Both pipeline (BearerCheck) and decorator (validate_bearer)
delegate to it, so per-token rate limit fires exactly once per request.
- Layer 0 (workspace membership) is CE-only; on EE the gateway owns
tenant isolation. Verdicts are cached on the AuthContext entry as
verified_tenants: dict[str, bool] (legacy "ok"/"denied" strings tolerated
by from_cache for one TTL cycle, then removed).
- check_workspace_membership(...) is the shared core; the pipeline step
and the inline require_workspace_member helper both delegate to it.
- Per-token rate limit: 60/min sliding window, RFC-7231-compliant 429
with Retry-After header + JSON body { error, retry_after_ms }. Bucket
key is sha256(token) so all replicas share state via Redis.
API hygiene:
- Scope StrEnum (FULL, APPS_READ, APPS_RUN) replaces bare string literals.
- /openapi/v1/apps/<id>/info: scope flipped from apps:run to apps:read.
- /info migrates off the pipeline to validate_bearer + require_scope +
require_workspace_member (no AppAuthzCheck/CallerMount needed for reads).
- ResolvedRow gains to_cache() / from_cache() classmethods.
- AuthContext gains token_hash + verified_tenants, dropping the per-route
re-hash and per-request Redis read on the cache hit path.
OPENAPI_RATE_LIMIT_PER_TOKEN config (default 60).
Type and lint pass over the openapi controllers, auth pipeline, and
oauth bearer/device-flow plumbing. Down from 36 pyright errors and 16
ruff errors to 0/0; 93 openapi unit tests pass.
Logic fixes:
- libs/oauth_bearer.py: drop private-naming on the friend-API methods
consumed by _VariantResolver (cache_get / cache_set_positive /
cache_set_negative / hard_expire / session_factory). They were always
cross-class accessors — leading underscore was misleading. Add public
registry property on BearerAuthenticator. _hard_expire row_id widened
to UUID | str (matches the StringUUID column type).
- libs/oauth_bearer.py: type validate_bearer / bearer_feature_required
with ParamSpec / PEP-695 so wrapped routes preserve their signature.
- libs/rate_limit.py: same — typed rate_limit decorator.
- services/oauth_device_flow.py: mint_oauth_token / _upsert accept
Session | scoped_session (Flask-SQLAlchemy proxy). Guard row-is-None
after upsert.
- controllers/openapi/{chat,completion,workflow}_messages.py: tuple-vs-
Mapping shape narrowing on AppGenerateService.generate return —
production returns Mapping, tests mock as (body, status). Validate
through Pydantic Response model in both shapes.
- controllers/openapi/oauth_device.py: replace flask_restx.reqparse (banned)
with Pydantic Request/Query models — DeviceCodeRequest, DevicePollRequest,
DeviceLookupQuery, DeviceMutateRequest. Two PEP-695 generic helpers
(_validate_json / _validate_query) translate ValidationError to BadRequest.
- controllers/openapi/auth/strategies.py: Protocol param-name match
(subject_type), Optional narrowing on app/tenant/account_id/subject_email.
- controllers/openapi/auth/steps.py: subject_type-is-None guard before
mounter dispatch.
- core/app/apps/workflow/generate_task_pipeline.py + models/workflow.py:
add WorkflowAppLogCreatedFrom.OPENAPI + matching match-case branch.
Fixes match-exhaustiveness and possibly-unbound created_from.
- libs/device_flow_security.py: pyright ignore on flask after_request
hook (registered by the framework, pyright sees as unused).
- services/oauth_device_flow.py: rename Exceptions to *Error suffix
(StateNotFoundError / InvalidTransitionError / UserCodeExhaustedError);
same for libs/oauth_bearer.py (InvalidBearerError / TokenExpiredError).
Update all callers across openapi controllers.
- controllers/openapi/{oauth_device,oauth_device_sso}.py +
services/oauth_device_flow.py: switch logger.error in except blocks
to logger.exception (TRY400) — keeps the traceback for ops.
- configs/feature/__init__.py: OPENAPI_KNOWN_CLIENT_IDS computed_field
needs an @property alongside for pyright to see it as a value, not a
method. Matches the existing line-451 pattern.
Plus ruff format + import-sort across the openapi tree (pure formatting).
When an unauthenticated user submits a user_code, the chooser view
holds the typed code and redirects to /signin. After login, the page
re-mounts on /device with no URL params (already scrubbed on the
first render) and account loaded — but the existing useEffect path
only advanced when ssoVerified or urlUserCode was present.
Add an early branch: if view is chooser and account just loaded,
advance to authorize_account using the userCode stashed in view
state. Also widen the effect deps to view (not view.kind) so the
nested userCode reads stay current.
Adds the openapi blueprint branch in load_user_from_request so that
account-branch device-flow approval routes (approve / deny /
approval-context) can authenticate via the console session cookie
under @login_required.
Splits extract_access_token into two helpers:
- extract_console_cookie_token (cookie-only) — used by openapi
approval routes that must not fall through to the Authorization
header, where dfoa_/dfoe_ bearers live (those aren't JWTs and
PassportService.verify would crash on them).
- extract_access_token retains both code paths for legacy callers.
Ports service_api/app/{completion,workflow}.py to bearer-authed
/openapi/v1/apps/<app_id>/{info,chat-messages,completion-messages,workflows/run}.
Architecture:
- New controllers/openapi/auth/ package: Pipeline + Step protocol over
one mutable Context. Endpoints attach via @APP_PIPELINE.guard(scope=...)
— single attachment point; forgetting auth is structurally impossible.
- Pipeline order: BearerCheck -> ScopeCheck -> AppResolver -> AppAuthzCheck
-> CallerMount.
- Strategies vary along independent axes: AclStrategy (EE webapp-auth inner
API) vs MembershipStrategy (CE TenantAccountJoin); AccountMounter vs
EndUserMounter dispatched by SubjectType.
- App is in URL path (not header). Each non-GET has typed Pydantic Request;
each non-SSE response has typed Pydantic Response. Bearer-as-identity:
body 'user' field stripped, ignored if present.
Adds InvokeFrom.OPENAPI enum variant. Emits app.run.openapi audit log
on successful invocation via standard logger extra={"audit": True, ...}
convention.
Phase F retired the legacy /v1/oauth/device/* mounts but the cookie path
still pointed at the dead prefix. Browsers therefore dropped the cookie
on the canonical /openapi/v1/oauth/device/* requests, so SSO-branch
approval-context and approve-external returned 401 no_session even
right after sso-complete had set the cookie.
Phase F removed legacy /v1/oauth/device/* and /console/api/oauth/device/*
mounts in favour of /openapi/v1. Without this rule /openapi falls through
to location / and proxies to web:3000, returning 404 for every API call.
Web and CLI consumers now hit /openapi/v1/* directly, so the dual-mount
shims can go:
- controllers/oauth_device_sso.py (legacy /v1/oauth/device/sso-* + /v1/device/sso-complete)
- controllers/service_api/oauth.py (legacy /v1/oauth/device/*, /v1/me, /v1/oauth/authorizations/self)
- controllers/console/auth/oauth_device.py (placeholder for legacy /console/api/oauth/device/{approve,deny})
- the deferred _register_legacy_console_mount() inside openapi/oauth_device.py
Imports in controllers/console/__init__.py, controllers/service_api/__init__.py,
and extensions/ext_blueprints.py pruned. Tests rewritten to openapi-only.
Approve/deny + lookup + SSO endpoints now live under /openapi/v1/oauth/device/*.
Approve/deny use direct fetch with console session cookie + CSRF instead of
the /console/api-prefixed post() helper.
GET /openapi/v1/workspaces lists tenants the bearer's account is a
member of. GET /openapi/v1/workspaces/<id> returns one workspace
detail, member-gated (404 on non-member, never 403, so workspace IDs
don't leak across tenants).
Bearer-authed via @validate_bearer(accept=ACCEPT_USER_ANY). External
SSO bearers (no account_id) get an empty list / 404 — same posture as
GET /openapi/v1/account.
Cookie-authed /console/api/workspaces stays in console for the
dashboard SPA — different consumer, different auth model. No legacy
/v1/ remount this phase.
Plan: docs/superpowers/plans/2026-04-26-openapi-migration.md (in difyctl repo).
Match the existing api-group convention: one module per resource family
with multiple Resource classes per file (cf service_api/dataset/dataset.py
with 7 routes, console/auth/oauth_device.py with 2 before this branch).
The Phase B-D fragmentation (one file per route under
controllers/openapi/oauth_device/) was inconsistent with the codebase.
Collapse into:
controllers/openapi/oauth_device.py (5 routes: code, token, lookup,
  approve, deny — account branch)
controllers/openapi/oauth_device_sso.py (4 routes: sso-initiate,
  sso-complete, approval-context, approve-external — EE-only SSO branch)
The split mirrors the original pre-migration layout: account branch in
console/auth/oauth_device.py, SSO branch in controllers/oauth_device_sso.py
(root). Both legacy mount files updated to import from the new modules.
No behavior change; 59 tests still green. Test files updated to import
from the consolidated module paths.
Plan: docs/superpowers/plans/2026-04-26-openapi-migration.md (in difyctl repo).
The four EE-only SSO handlers (sso_initiate, sso_complete,
approval_context, approve_external) move from controllers/oauth_device_sso.py
to controllers/openapi/oauth_device/. Each is registered on openapi_bp
via @bp.route at the canonical path:
/openapi/v1/oauth/device/sso-initiate
/openapi/v1/oauth/device/sso-complete
/openapi/v1/oauth/device/approval-context
/openapi/v1/oauth/device/approve-external
sso-complete moves under /oauth/device/ from its previous orphan path
/v1/device/sso-complete; the IdP-side ACS callback URL hardcoded in
sso_initiate now points to the canonical path. Operators must
re-register the ACS callback with each IdP before Phase F deletes the
legacy alias.
oauth_device_sso.py shrinks to a thin re-mount file: same legacy bp
with attach_anti_framing applied, four bp.add_url_rule() calls binding
the legacy paths to the imported view functions. Same handler runs
for both mounts — no duplicated logic.
attach_anti_framing(openapi_bp) added in controllers/openapi/__init__.py
so X-Frame-Options + frame-ancestors CSP cover the canonical paths too.
Plan: docs/superpowers/plans/2026-04-26-openapi-migration.md (in difyctl repo).
DeviceApproveApi + DeviceDenyApi (cookie-authed) move to
controllers/openapi/oauth_device/{approve,deny}.py. Decorator stack
preserved verbatim: setup_required → login_required →
account_initialization_required → bearer_feature_required →
rate_limit. Audit event names ('oauth.device_flow_approved' /
'oauth.device_flow_denied') unchanged so the OTel exporter
registration keeps routing them.
The legacy /console/api/oauth/device/{approve,deny} mounts are
re-registered on console_ns from the bottom of the new files via a
local-import _register_legacy_console_mount() helper. The local
import breaks an import cycle that would otherwise form: openapi
imports console.wraps for setup_required, console.__init__.py imports
console.auth.oauth_device, which would re-import the openapi class
mid-load. Deferring console_ns past the class definition resolves it.
console/auth/oauth_device.py becomes a 9-line placeholder (the
existing console.__init__.py `from .auth import (..., oauth_device,
...)` keeps loading until Phase F prunes the import).
Plan: docs/superpowers/plans/2026-04-26-openapi-migration.md (in difyctl repo).
GET /openapi/v1/account/sessions lists the bearer's active OAuth
tokens (filtered to revoked_at IS NULL, expires_at > NOW(), token_hash
IS NOT NULL — no phantom devices). DELETE
/openapi/v1/account/sessions/<id> revokes a specific session with a
subject-match guard that returns 404 (not 403) on cross-subject so
token IDs don't leak across subjects.
Subject scoping abstracted into _subject_match(ctx): account subjects
filter by account_id; external_sso subjects filter by (email, issuer)
AND account_id IS NULL — preventing an SSO bearer from touching a
same-email account row from a federated tenant.
_revoke_token_by_id helper extracted so /sessions/self and
/sessions/<id> share the same UPDATE-where-revoked_at-IS-NULL
idempotent revoke + Redis cache invalidation.
No /v1/ equivalents — these are new endpoints (spec §Sessions list shape).
Plan: docs/superpowers/plans/2026-04-26-openapi-migration.md (in difyctl repo).
GET /v1/me moves to GET /openapi/v1/account. DELETE
/v1/oauth/authorizations/self moves to DELETE
/openapi/v1/account/sessions/self. Both classes (AccountApi,
AccountSessionsSelfApi) are now in controllers/openapi/account.py and
re-registered on service_api_ns at the legacy paths.
service_api/oauth.py is now nothing but legacy re-mount declarations
(20 lines). All in-place handler logic has moved to openapi/. Phase F
will delete the file and the legacy mounts together.
Helper functions (_load_memberships, _pick_default_workspace,
_workspace_payload, _account_payload) move with the AccountApi class.
Plan: docs/superpowers/plans/2026-04-26-openapi-migration.md (in difyctl repo).
Same pattern as B.6 / B.7: OAuthDeviceLookupApi moves to
controllers/openapi/oauth_device/lookup.py and is re-registered on
service_api_ns to keep /v1/oauth/device/lookup serving until Phase F.
service_api/oauth.py is now down to /me + /oauth/authorizations/self
plus three legacy mounts; remaining handlers move in Phase C.
Now-unused imports (LIMIT_LOOKUP_PUBLIC, rate_limit, reqparse, request,
DEVICE_FLOW_TTL_SECONDS, DeviceFlowRedis, DeviceFlowStatus) removed.
Plan: docs/superpowers/plans/2026-04-26-openapi-migration.md (in difyctl repo).
Same pattern as B.6: OAuthDeviceTokenApi moves to
controllers/openapi/oauth_device/token.py and is re-registered on
service_api_ns to keep /v1/oauth/device/token serving until Phase F.
_audit_cross_ip_if_needed helper moves with the handler. Now-unused
imports removed from service_api/oauth.py.
Plan: docs/superpowers/plans/2026-04-26-openapi-migration.md (in difyctl repo).
Canonical class OAuthDeviceCodeApi now lives in
controllers/openapi/oauth_device/code.py and is registered on
openapi_ns at /openapi/v1/oauth/device/code. service_api/oauth.py
re-registers the same class object on service_api_ns at
/v1/oauth/device/code so existing callers keep working until Phase F.
KNOWN_CLIENT_IDS literal moves to dify_config.OPENAPI_KNOWN_CLIENT_IDS
(CSV-parsed, default "difyctl") so new CLIs / SDKs can be admitted
without code changes (CLAUDE.md rule 8 — no magic strings).
_verification_uri helper moves with the handler. Single source of
truth — no duplicated logic between the two mounts.
Plan: docs/superpowers/plans/2026-04-26-openapi-migration.md (in difyctl repo).
OPENAPI_CORS_ALLOW_ORIGINS env var defaults to empty (same-origin only).
Operators expand for third-party integrations via comma-separated list.
Allowed headers: Authorization, Content-Type, X-CSRF-Token. Methods:
GET POST PATCH DELETE OPTIONS. Max-Age 600s. supports_credentials=True
so cookie-authed approve/deny work once Phase D moves them in.
Disallowed origins receive a normal 200 OPTIONS response without the
Access-Control-Allow-Origin header — flask-cors's standard behavior;
the browser then blocks the cross-origin request.
Plan: docs/superpowers/plans/2026-04-26-openapi-migration.md (in difyctl repo).
Route-level scope gate; pairs with validate_bearer. Bearer holding the
catch-all SCOPE_FULL ('full', carried by dfoa_) passes any check;
narrower bearers (dfoe_, future PATs) need the exact scope listed in
the route decorator.
No v1.0 route applies it yet — apps/datasets controllers will be the
first consumers when those plans land. Programming-error guard: if
@require_scope runs without @validate_bearer above it, raises
RuntimeError instead of silently allowing.
Plan: docs/superpowers/plans/2026-04-26-openapi-migration.md (in difyctl repo).
The decorator was defined inline in console/auth/oauth_device.py. Phase
D will move approve/deny to controllers/openapi/oauth_device/ and the
new SSO branch under the same group needs the same gate. Hoist it to
libs/oauth_bearer.py now so the move stays a pure file rename later.
Behavior unchanged: 503 'bearer_auth_disabled' when ENABLE_OAUTH_BEARER
is off. console/auth/oauth_device.py imports it from libs and drops
the now-unused dify_config / wraps / ServiceUnavailable imports.
Plan: docs/superpowers/plans/2026-04-26-openapi-migration.md (in difyctl repo).
Accepts.APP and the matching app- short-circuit existed to let routes
declare "I accept either OAuth or app- tokens", but no production
caller ever did, and the short-circuit returned without doing the
tenant/app/end-user setup that app- tokens actually need (that lives
in service_api/wraps.py:validate_app_token).
After this change, validate_bearer is OAuth-only. app- bearers fall
through the prefix dispatch and surface as InvalidBearer -> 401, which
is what we already promised on /openapi/* (no app- accepted) and what
the docstring claimed all along.
Pre-check rg "Accepts\\.APP" returned zero hits outside the function
being edited; no callers to update.
Plan: docs/superpowers/plans/2026-04-26-openapi-migration.md (in difyctl repo).
New Flask blueprint at /openapi/v1/ that will host user-scoped
programmatic endpoints (device flow, identity, sessions, workspaces).
Ships only a smoke route GET /openapi/v1/_health for now; subsequent
phases lift handlers in from service_api, console, and the orphan
oauth_device_sso.py.
CORS is intentionally omitted here and configured in step A.5 once
the allowlist envvar lands.
Plan: docs/superpowers/plans/2026-04-26-openapi-migration.md (in difyctl repo).
- api: account-flow stores subject_issuer="dify:account" sentinel
instead of NULL so the rotate-in-place unique index collides as
intended (Postgres treats NULLs as distinct in unique indices).
mint_oauth_token validates prefix-specific issuer rules.
- api: enterprise_only inverts to an allowlist (ACTIVE / EXPIRING) so
any future LicenseStatus value defaults to denial.
- api: consume_on_poll moved to a single Lua script (GET + status-check
+ DEL) so concurrent pollers can't both observe APPROVED.
- web: typed DeviceFlowError + central error-copy mapping; page
surfaces rate_limited / lookup_failed view states; URL params
scrubbed after consumption (RFC 8628 §5.4).
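The atomic consume-on-poll can be sketched as follows; the Lua source and key layout are illustrative, and the Python function emulates the script's semantics against a dict standing in for Redis:

```python
# Registered with redis-py so GET + status-check + DEL run atomically
# server-side; two concurrent pollers can never both observe APPROVED.
CONSUME_ON_POLL_LUA = """
local v = redis.call('GET', KEYS[1])
if v == ARGV[1] then
    redis.call('DEL', KEYS[1])
    return v
end
return false
"""

def consume_on_poll(store: dict, key: str, expected: str = "APPROVED"):
    """Pure-Python emulation of the script's semantics (single-threaded)."""
    value = store.get(key)
    if value == expected:
        del store[key]  # first observer consumes the state
        return value
    return None
```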
Adds a CLI-friendly authorization flow so difyctl (and future
non-browser clients) can obtain user-scoped tokens without copy-
pasting cookies or raw API keys. Two grant paths share one device
flow surface:
1. Account branch — user signs in via the existing /signin
methods, /device page calls console-authed approve, mints a
dfoa_ token tied to (account_id, tenant).
2. External-SSO branch (EE) — /v1/oauth/device/sso-initiate signs
an SSOState envelope, hands off to Enterprise's external ACS,
receives a signed external-subject assertion, mints a dfoe_
token tied to (subject_email, subject_issuer).
API surface (all under /v1, EE-only endpoints 404 on CE):
POST /v1/oauth/device/code — RFC 8628 start
POST /v1/oauth/device/token — RFC 8628 poll
GET /v1/oauth/device/lookup — pre-validate user_code
GET /v1/oauth/device/sso-initiate — SSO branch entry
GET /v1/device/sso-complete — SSO callback sink
GET /v1/oauth/device/approval-context — /device cookie probe
POST /v1/oauth/device/approve-external — SSO approve
GET /v1/me — bearer subject lookup
DELETE /v1/oauth/authorizations/self — self-revoke
POST /console/api/oauth/device/approve — account approve
POST /console/api/oauth/device/deny — account deny
Core primitives:
- libs/oauth_bearer.py: prefix-keyed TokenKindRegistry +
BearerAuthenticator + validate_bearer decorator. Two-tier scope
(full vs apps:run) stamped from the registry, never from the DB.
- libs/jws.py: HS256 compact JWS keyed on the shared Dify
SECRET_KEY — same key-set verifies the SSOState envelope, the
external-subject assertion (minted by Enterprise), and the
approval-grant cookie.
- libs/device_flow_security.py: enterprise_only gate, approval-
grant cookie mint/verify/consume (Path=/v1/oauth/device,
HttpOnly, SameSite=Lax, Secure follows is_secure()), anti-
framing headers.
- libs/rate_limit.py: typed RateLimit / RateLimitScope dispatch
with composite-key buckets; both decorator + imperative form.
- services/oauth_device_flow.py: Redis state machine (PENDING ->
APPROVED|DENIED with atomic consume-on-poll), token mint via
partial unique index uq_oauth_active_per_device (rotates in
place), env-driven TTL policy.
Storage: oauth_access_tokens table with partial unique index on
(subject_email, subject_issuer, client_id, device_label) WHERE
revoked_at IS NULL. account_id NULL distinguishes external-SSO
rows. Hard-expire is CAS UPDATE (revoked_at + nullify token_hash)
so audit events keep their token_id. Retention pruner DELETEs
revoked + zombie-expired rows past OAUTH_ACCESS_TOKEN_RETENTION_DAYS.
Frontend: /device page with code-entry, chooser (account vs SSO),
authorize-account, authorize-sso views. SSO branch detaches from
the URL user_code and reads everything from the cookie via
/approval-context. Anti-framing headers on all responses.
Wiring: ENABLE_OAUTH_BEARER feature flag; ext_oauth_bearer binds
the authenticator at startup; clean_oauth_access_tokens_task
scheduled in ext_celery.
Spec: docs/specs/v1.0/server/{device-flow,tokens,middleware,security}.md
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
When a GitHub user's profile email is null (hidden/private), the OAuth callback fails with HTTP 400 because `GitHubRawUserInfo` validates `email` as a required non-null string. Even after the type was relaxed to `NotRequired[str | None]` in #33882, the flow still raises a `ValueError` when no email can be resolved, blocking sign-in entirely.
This PR improves the email resolution strategy so that users with private GitHub emails can still sign in.
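A fallback chain along these lines could look like the following (a hypothetical sketch, not the PR's actual code; field names mirror the GitHub API, and the last step follows GitHub's `ID+login@users.noreply.github.com` convention):

```python
def resolve_email(user_info: dict, emails: list[dict]) -> str:
    """Resolve a usable email when the profile email may be hidden.

    Fallback order: profile email, then primary verified email from
    the /user/emails endpoint, then a synthesized noreply address.
    """
    if user_info.get("email"):
        return user_info["email"]
    for entry in emails:
        if entry.get("primary") and entry.get("verified"):
            return entry["email"]
    # Last resort: GitHub's stable noreply address format.
    return f"{user_info['id']}+{user_info['login']}@users.noreply.github.com"
```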
The current frontend implementation closes the connection once the `workflow_paused` SSE event is received and establishes a new connection to subscribe to new events. `StreamsBroadcastChannel` sets the initial `_last_id` to `0-0`, so it consumes the stream from the start and sends `workflow_paused` events created before the pause to the frontend, causing excessive connections to be established.
This PR fixes the issue by setting the initial id to `$`, which means only new messages are received by the subscription.
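The `_last_id` behavior can be illustrated with a small sketch (hypothetical class and names; in Redis Streams, `$` means "only entries appended after this read begins", while `0-0` replays the stream from the beginning):

```python
class StreamSubscriber:
    """Tracks the last-seen stream id for an XREAD-style subscription."""

    def __init__(self, start_from_new: bool = True) -> None:
        # "$" delivers only new entries; "0-0" replays history.
        self._last_id = "$" if start_from_new else "0-0"

    def next_read_args(self) -> dict:
        # Shape of the arguments a Redis XREAD call would receive.
        return {"streams": {"events": self._last_id}}

    def advance(self, entry_id: str) -> None:
        # After the first delivery, track concrete ids so reads resume
        # from the last seen entry rather than re-reading everything.
        self._last_id = entry_id
```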
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Copilot Autofix powered by AI <175728472+Copilot@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
This PR:
1. Fixes a bug where the email body of the `HumanInput` node was sent as-is, without markdown rendering or sanitization
2. Applies HTML sanitization to the email subject and body
3. Removes `\r` and `\n` from the email subject to prevent SMTP header injection
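Point 3 can be sketched as follows (an illustrative helper, not the actual implementation — CR/LF terminate an SMTP header line, so a crafted subject could otherwise smuggle extra headers into the message):

```python
def sanitize_subject(subject: str) -> str:
    # Strip CR/LF so a subject like "Hi\r\nBcc: attacker@example.com"
    # cannot inject an extra SMTP header.
    return subject.replace("\r", "").replace("\n", "")
```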
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: QuantumGhost <obelisk.reg+git@gmail.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
description: Review backend code for quality, security, maintainability, and best practices based on established checklist rules. Use when the user requests a review, analysis, or improvement of backend files (e.g., `.py`) under the `api/` directory. Do NOT use for frontend files (e.g., `.tsx`, `.ts`, `.js`). Supports pending-change review, code snippets review, and file-focused review.
---
# Backend Code Review
## When to use this skill
Use this skill whenever the user asks to **review, analyze, or improve** backend code (e.g., `.py`) under the `api/` directory. Supports the following review modes:
- **Pending-change review**: when the user asks to review current changes (inspect staged/working-tree files slated for commit to get the changes).
- **Code snippets review**: when the user pastes code snippets (e.g., a function/class/module excerpt) into the chat and asks for a review.
- **File-focused review**: when the user points to specific files and asks for a review of those files (one file or a small, explicit set of files, e.g., `api/...`, `api/app.py`).
Do NOT use this skill when:
- The request is about frontend code or UI (e.g., `.tsx`, `.ts`, `.js`, `web/`).
- The user is not asking for a review/analysis/improvement of backend code.
- The scope is not under `api/` (unless the user explicitly asks to review backend-related changes outside `api/`).
## How to use this skill
Follow these steps when using this skill:
1. **Identify the review mode** (pending-change vs snippet vs file-focused) based on the user's input. Keep the scope tight: review only what the user provided or explicitly referenced.
2. Follow the rules defined in **Checklist** to perform the review. If no Checklist rule matches, apply **General Review Rules** as a fallback to perform the best-effort review.
3. Compose the final output strictly following the **Required Output Format**.
Notes when using this skill:
- Always include actionable fixes or suggestions (including possible code snippets).
- Use best-effort `File:Line` references when a file path and line numbers are available; otherwise, use the most specific identifier you can.
## Checklist
- db schema design: if the review scope includes code/files under `api/models/` or `api/migrations/`, follow [references/db-schema-rule.md](references/db-schema-rule.md) to perform the review
- architecture: if the review scope involves controller/service/core-domain/libs/model layering, dependency direction, or moving responsibilities across modules, follow [references/architecture-rule.md](references/architecture-rule.md) to perform the review
- repositories abstraction: if the review scope contains table/model operations (e.g., `select(...)`, `session.execute(...)`, joins, CRUD) and is not under `api/repositories`, `api/core/repositories`, or `api/extensions/*/repositories/`, follow [references/repositories-rule.md](references/repositories-rule.md) to perform the review
- sqlalchemy patterns: if the review scope involves SQLAlchemy session/query usage, db transaction/crud usage, or raw SQL usage, follow [references/sqlalchemy-rule.md](references/sqlalchemy-rule.md) to perform the review
## General Review Rules
### 1. Security Review
Check for:
- SQL injection vulnerabilities
- Server-Side Request Forgery (SSRF)
- Command injection
- Insecure deserialization
- Hardcoded secrets/credentials
- Improper authentication/authorization
- Insecure direct object references
### 2. Performance Review
Check for:
- N+1 queries
- Missing database indexes
- Memory leaks
- Blocking operations in async code
- Missing caching opportunities
### 3. Code Quality Review
Check for:
- Code forward compatibility
- Code duplication (DRY violations)
- Functions doing too much (SRP violations)
- Deep nesting / complex conditionals
- Magic numbers/strings
- Poor naming
- Missing error handling
- Incomplete type coverage
### 4. Testing Review
Check for:
- Missing test coverage for new code
- Tests that don't test behavior
- Flaky test patterns
- Missing edge cases
## Required Output Format
When this skill is invoked, the response must exactly follow one of the two templates:
### Template A (any findings)
```markdown
# Code Review Summary
Found <X> critical issues that need to be fixed:
## 🔴 Critical (Must Fix)
### 1. <brief description of the issue>
FilePath: <path> line <line>
<relevant code snippet or pointer>
#### Explanation
<detailed explanation and references of the issue>
#### Suggested Fix
1. <brief description of suggested fix>
2. <code example> (optional, omit if not applicable)
---
... (repeat for each critical issue) ...
Found <Y> suggestions for improvement:
## 🟡 Suggestions (Should Consider)
### 1. <brief description of the suggestion>
FilePath: <path> line <line>
<relevant code snippet or pointer>
#### Explanation
<detailed explanation and references of the suggestion>
#### Suggested Fix
1. <brief description of suggested fix>
2. <code example> (optional, omit if not applicable)
---
... (repeat for each suggestion) ...
Found <Z> optional nits:
## 🟢 Nits (Optional)
### 1. <brief description of the nit>
FilePath: <path> line <line>
<relevant code snippet or pointer>
#### Explanation
<explanation and references of the optional nit>
#### Suggested Fix
- <minor suggestions>
---
... (repeat for each nit) ...
## ✅ What's Good
- <Positive feedback on good patterns>
```
- If there are no critical issues, suggestions, optional nits, or good points, just omit that section.
- If the issue number is more than 10, summarize as "Found 10+ critical issues/suggestions/optional nits" and only output the first 10 items.
- Don't compress the blank lines between sections; keep them as-is for readability.
- If any issue requires code changes, append a brief follow-up question after the structured output asking whether the user wants to apply the fix(es). For example: "Would you like me to use the Suggested fix(es) to address these issues?"
- Description: Controllers should parse input, call services, and return serialized responses. Business decisions inside controllers make behavior hard to reuse and test.
- Suggested fix: Move domain/business logic into the service or core/domain layer. Keep controller handlers thin and orchestration-focused.
- Example:
- Bad:
```python
@bp.post("/apps/<app_id>/publish")
def publish_app(app_id: str):
    payload = request.get_json() or {}
    if payload.get("force") and current_user.role != "admin":
        raise ValueError("only admin can force publish")
    app = App.query.get(app_id)
    app.status = "published"
    db.session.commit()
    return {"result": "ok"}
```
- Good:
```python
@bp.post("/apps/<app_id>/publish")
def publish_app(app_id: str):
    payload = PublishRequest.model_validate(request.get_json() or {})
    # Business rules live in the service layer (illustrative service name);
    # the controller only parses, delegates, and serializes.
    app_service.publish(app_id=app_id, payload=payload, actor=current_user)
    return {"result": "ok"}
```
- Description: Controllers may depend on services, and services may depend on core/domain abstractions. Reversing this direction (for example, core importing controller/web modules) creates cycles and leaks transport concerns into domain code.
- Suggested fix: Extract shared contracts into core/domain or service-level modules and make upper layers depend on lower, not the reverse.
- Example:
- Bad:
```python
# core/policy/publish_policy.py
from controllers.console.app import request_context
def can_publish() -> bool:
    return request_context.current_user.is_admin
```
- Good:
```python
# core/policy/publish_policy.py
def can_publish(role: str) -> bool:
    return role == "admin"

# service layer adapts web/user context to domain input
allowed = can_publish(role=current_user.role)
### Keep libs business-agnostic
- Category: maintainability
- Severity: critical
- Description: Modules under `api/libs/` should remain reusable, business-agnostic building blocks. They must not encode product/domain-specific rules, workflow orchestration, or business decisions.
- Suggested fix:
- If business logic appears in `api/libs/`, extract it into the appropriate `services/` or `core/` module and keep `libs` focused on generic, cross-cutting helpers.
- Covers: model/base inheritance, schema boundaries in model properties, tenant-aware schema design, index redundancy checks, dialect portability in models, and cross-database compatibility in migrations.
- Does NOT cover: session lifecycle, transaction boundaries, and query execution patterns (handled by `sqlalchemy-rule.md`).
## Rules
### Do not query other tables inside `@property`
- Category: [maintainability, performance]
- Severity: critical
- Description: A model `@property` must not open sessions or query other tables. This hides dependencies across models, tightly couples schema objects to data access, and can cause N+1 query explosions when iterating collections.
- Suggested fix:
- Keep model properties pure and local to already-loaded fields.
- Move cross-table data fetching to service/repository methods.
- For list/batch reads, fetch required related data explicitly (join/preload/bulk query) before rendering derived values.
- Example:
- Bad:
```python
class Conversation(TypeBase):
    __tablename__ = "conversations"

    @property
    def app_name(self) -> str:
        # Hidden cross-table query: N+1 explosion when iterating conversations.
        with Session(db.engine, expire_on_commit=False) as session:
            return session.get(App, self.app_id).name
```
- Good: the service/repository layer performs an explicit batch fetch for related App rows before rendering derived values.
### Prefer including `tenant_id` in model definitions
- Category: maintainability
- Severity: suggestion
- Description: In multi-tenant domains, include `tenant_id` in schema definitions whenever the entity belongs to tenant-owned data. This improves data isolation safety and keeps future partitioning/sharding strategies practical as data volume grows.
- Suggested fix:
- Add a `tenant_id` column and ensure related unique/index constraints include tenant dimension when applicable.
- Propagate `tenant_id` through service/repository contracts to keep access paths tenant-aware.
- Exception: if a table is explicitly designed as non-tenant-scoped global metadata, document that design decision clearly.
- Description: Review index definitions for leftmost-prefix redundancy. For example, index `(a, b, c)` can safely cover most lookups for `(a, b)`. Keeping both may increase write overhead and can mislead the optimizer into suboptimal execution plans.
- Suggested fix:
- Before adding an index, compare against existing composite indexes by leftmost-prefix rules.
- Drop or avoid creating redundant prefixes unless there is a proven query-pattern need.
- Apply the same review standard in both model `__table_args__` and migration index DDL.
### Avoid PostgreSQL-only dialect usage in models; wrap in `models.types`
- Category: maintainability
- Severity: critical
- Description: Model/schema definitions should avoid PostgreSQL-only constructs directly in business models. When database-specific behavior is required, encapsulate it in `api/models/types.py` using both PostgreSQL and MySQL dialect implementations, then consume that abstraction from model code.
- Suggested fix:
- Do not directly place dialect-only types/operators in model columns when a portable wrapper can be used.
- Add or extend wrappers in `models.types` (for example, `AdjustedJSON`, `LongText`, `BinaryData`) to normalize behavior across PostgreSQL and MySQL.
### Guard migration incompatibilities with dialect checks and shared types
- Category: maintainability
- Severity: critical
- Description: Migration scripts under `api/migrations/versions/` must account for PostgreSQL/MySQL incompatibilities explicitly. For dialect-sensitive DDL or defaults, branch on the active dialect (for example, `conn.dialect.name == "postgresql"`), and prefer reusable compatibility abstractions from `models.types` where applicable.
- Suggested fix:
- In migration upgrades/downgrades, bind connection and branch by dialect for incompatible SQL fragments.
- Reuse `models.types` wrappers in column definitions when that keeps behavior aligned with runtime models.
- Avoid one-dialect-only migration logic unless there is a documented, deliberate compatibility exception.
- Example:
- Bad:
```python
with op.batch_alter_table("dataset_keyword_tables") as batch_op:
    # PostgreSQL-only cast syntax in the default; fails on MySQL (illustrative).
    batch_op.alter_column(
        "data_source_type",
        server_default=sa.text("'database'::character varying"),
    )
```
- Covers: when to reuse existing repository abstractions, when to introduce new repositories, and how to preserve dependency direction between service/core and infrastructure implementations.
- Does NOT cover: SQLAlchemy session lifecycle and query-shape specifics (handled by `sqlalchemy-rule.md`), and table schema/migration design (handled by `db-schema-rule.md`).
## Rules
### Introduce repositories abstraction
- Category: maintainability
- Severity: suggestion
- Description: If a table/model already has a repository abstraction, all reads/writes/queries for that table should use the existing repository. If no repository exists, introduce one only when complexity justifies it, such as large/high-volume tables, repeated complex query logic, or likely storage-strategy variation.
- Suggested fix:
- First check `api/repositories`, `api/core/repositories`, and `api/extensions/*/repositories/` to verify whether the table/model already has a repository abstraction. If it exists, route all operations through it and add missing repository methods instead of bypassing it with ad-hoc SQLAlchemy access.
- If no repository exists, add one only when complexity warrants it (for example, repeated complex queries, large data domains, or multiple storage strategies), while preserving dependency direction (service/core depends on abstraction; infra provides implementation).
- Example:
- Bad:
```python
# Existing repository is ignored and the service uses ad-hoc table queries
# (illustrative query shape).
run = session.execute(
    select(WorkflowRun).where(WorkflowRun.id == run_id)
).scalar_one_or_none()
```
- Covers: SQLAlchemy session and transaction lifecycle, query construction, tenant scoping, raw SQL boundaries, and write-path concurrency safeguards.
- Does NOT cover: table/model schema and migration design details (handled by `db-schema-rule.md`).
## Rules
### Use Session context manager with explicit transaction control behavior
- Category: best practices
- Severity: critical
- Description: Session and transaction lifecycle must be explicit and bounded on write paths. Missing commits can silently drop intended updates, while ad-hoc or long-lived transactions increase contention, lock duration, and deadlock risk.
- Suggested fix:
- Use **explicit `session.commit()`** after completing a related write unit.
- Or use **`session.begin()` context manager** for automatic commit/rollback on a scoped block.
- Keep transaction windows short: avoid network I/O, heavy computation, or unrelated work inside the transaction.
- Example:
- Bad:
```python
# Missing commit: write may never be persisted.
with Session(db.engine, expire_on_commit=False) as session:
    run = session.get(WorkflowRun, run_id)
    run.status = "cancelled"

# Long transaction: external I/O inside a DB transaction.
with Session(db.engine, expire_on_commit=False) as session, session.begin():
    run = session.get(WorkflowRun, run_id)
    run.status = "cancelled"
    call_external_api()
```
- Good:
```python
# Option 1: explicit commit.
with Session(db.engine, expire_on_commit=False) as session:
    run = session.get(WorkflowRun, run_id)
    run.status = "cancelled"
    session.commit()

# Option 2: scoped transaction with automatic commit/rollback.
with Session(db.engine, expire_on_commit=False) as session, session.begin():
    run = session.get(WorkflowRun, run_id)
    run.status = "cancelled"

# Keep non-DB work outside transaction scope.
call_external_api()
```
### Enforce tenant_id scoping on shared-resource queries
- Category: security
- Severity: critical
- Description: Reads and writes against shared tables must be scoped by `tenant_id` to prevent cross-tenant data leakage or corruption.
- Suggested fix: Add `tenant_id` predicate to all tenant-owned entity queries and propagate tenant context through service/repository interfaces.
### Prefer SQLAlchemy expressions over raw SQL by default
- Category: maintainability
- Severity: suggestion
- Description: Raw SQL should be exceptional. ORM/Core expressions are easier to evolve, safer to compose, and more consistent with the codebase.
- Suggested fix: Rewrite straightforward raw SQL into SQLAlchemy `select/update/delete` expressions; keep raw SQL only when required by clear technical constraints.
- Example:
- Bad:
```python
row = session.execute(
    text("SELECT * FROM workflows WHERE id = :id AND tenant_id = :tenant_id"),
    {"id": workflow_id, "tenant_id": tenant_id},
).first()
```
- Good:
```python
stmt = select(Workflow).where(
    Workflow.id == workflow_id,
    Workflow.tenant_id == tenant_id,
)
row = session.execute(stmt).scalar_one_or_none()
```
### Protect write paths with concurrency safeguards
- Category: quality
- Severity: critical
- Description: Multi-writer paths without explicit concurrency control can silently overwrite data. Choose the safeguard based on contention level, lock scope, and throughput cost instead of defaulting to one strategy.
- Suggested fix:
- **Optimistic locking**: Use when contention is usually low and retries are acceptable. Add a version (or updated_at) guard in `WHERE` and treat `rowcount == 0` as a conflict.
- **Redis distributed lock**: Use when the critical section spans multiple steps/processes (or includes non-DB side effects) and you need cross-worker mutual exclusion.
- **SELECT ... FOR UPDATE**: Use when contention is high on the same rows and strict in-transaction serialization is required. Keep transactions short to reduce lock wait/deadlock risk.
- In all cases, scope by `tenant_id` and verify affected row counts for conditional writes.
- Example:
- Bad:
```python
# No tenant scope, no conflict detection, and no lock on a contested write path.
workflow = session.get(Workflow, workflow_id)
workflow.graph = new_graph
session.commit()
```
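The optimistic-locking option above can be sketched in memory (a hypothetical function; the list stands in for a table and the return value for the SQL `rowcount` of a guarded `UPDATE ... WHERE id=:id AND tenant_id=:t AND version=:v`):

```python
def cancel_run(rows: list[dict], run_id: str, tenant_id: str, expected_version: int) -> int:
    """Optimistic-lock sketch: the guard includes the version read earlier;
    a result of 0 means another writer won the race and the caller should
    re-read and retry."""
    matched = [
        r for r in rows
        if r["id"] == run_id
        and r["tenant_id"] == tenant_id
        and r["version"] == expected_version
    ]
    for r in matched:
        r["status"] = "cancelled"
        r["version"] += 1
    return len(matched)  # rowcount analogue; 0 == conflict
```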
- This skill is for component decomposition, not query/mutation design.
- When refactoring data fetching, follow `web/AGENTS.md`.
- Use `frontend-query-mutation` for contracts, query shape, data-fetching wrappers, query/mutation call-site patterns, conditional queries, invalidation, and mutation error handling.
- Do not introduce deprecated `useInvalid` / `useReset`.
- Do not add thin passthrough `useQuery` wrappers during refactoring; only extract a custom hook when it truly orchestrates multiple queries/mutations or shared derived state.
    invalidAppConfig() // Invalidates cache and triggers refetch
  }
  return <div>...</div>
}
```
- Follow `web/AGENTS.md` first.
- Use `frontend-query-mutation` for contracts, query shape, data-fetching wrappers, query/mutation call-site patterns, conditional queries, invalidation, and mutation error handling.
- Do not introduce deprecated `useInvalid` / `useReset`.
- Do not extract thin passthrough `useQuery` hooks; only extract orchestration hooks.
description: Write, update, or review Dify end-to-end tests under `e2e/` that use Cucumber, Gherkin, and Playwright. Use when the task involves `.feature` files, `features/step-definitions/`, `features/support/`, `DifyWorld`, scenario tags, locator/assertion choices, or E2E testing best practices for this repository.
---
# Dify E2E Cucumber + Playwright
Use this skill for Dify's repository-level E2E suite in `e2e/`. Use [`e2e/AGENTS.md`](../../../e2e/AGENTS.md) as the canonical guide for local architecture and conventions, then apply Playwright/Cucumber best practices only where they fit the current suite.
## Scope
- Use this skill for `.feature` files, Cucumber step definitions, `DifyWorld`, hooks, tags, and E2E review work under `e2e/`.
- Do not use this skill for Vitest or React Testing Library work under `web/`; use `frontend-testing` instead.
- Do not use this skill for backend test or API review tasks under `api/`.
2. Read only the files directly involved in the task:
- target `.feature` files under `e2e/features/`
- related step files under `e2e/features/step-definitions/`
- `e2e/features/support/hooks.ts` and `e2e/features/support/world.ts` when session lifecycle or shared state matters
- `e2e/scripts/run-cucumber.ts` and `e2e/cucumber.config.ts` when tags or execution flow matter
3. Read [`references/playwright-best-practices.md`](references/playwright-best-practices.md) only when locator, assertion, isolation, or waiting choices are involved.
4. Read [`references/cucumber-best-practices.md`](references/cucumber-best-practices.md) only when scenario wording, step granularity, tags, or expression design are involved.
5. Re-check official docs with Context7 before introducing a new Playwright or Cucumber pattern.
## Local Rules
- `e2e/` uses Cucumber for scenarios and Playwright as the browser layer.
- `DifyWorld` is the per-scenario context object. Type `this` as `DifyWorld` and use `async function`, not arrow functions.
- Keep glue organized by capability under `e2e/features/step-definitions/`; use `common/` only for broadly reusable steps.
- Browser session behavior comes from `features/support/hooks.ts`:
- default: authenticated session with shared storage state
- `@unauthenticated`: clean browser context
- `@authenticated`: readability/selective-run tag only unless implementation changes
- `@fresh`: only for `e2e:full*` flows
- Do not import Playwright Test runner patterns that bypass the current Cucumber + `DifyWorld` architecture unless the task is explicitly about changing that architecture.
## Workflow
1. Rebuild local context.
- Inspect the target feature area.
- Reuse an existing step when wording and behavior already match.
- Add a new step only for a genuinely new user action or assertion.
- Keep edits close to the current capability folder unless the step is broadly reusable.
2. Write behavior-first scenarios.
- Describe user-observable behavior, not DOM mechanics.
- Keep each scenario focused on one workflow or outcome.
- Keep scenarios independent and re-runnable.
3. Write step definitions in the local style.
- Keep one step to one user-visible action or one assertion.
- Prefer Cucumber Expressions such as `{string}` and `{int}`.
- Scope locators to stable containers when the page has repeated elements.
- Avoid page-object layers or extra helper abstractions unless repeated complexity clearly justifies them.
4. Use Playwright in the local style.
- Prefer user-facing locators: `getByRole`, `getByLabel`, `getByPlaceholder`, `getByText`, then `getByTestId` for explicit contracts.
- Use web-first `expect(...)` assertions.
- Do not use `waitForTimeout`, manual polling, or raw visibility checks when a locator action or retrying assertion already expresses the behavior.
5. Validate narrowly.
- Run the narrowest tagged scenario or flow that exercises the change.
- Run `pnpm -C e2e check`.
- Broaden verification only when the change affects hooks, tags, setup, or shared step semantics.
## Review Checklist
- Does the scenario describe behavior rather than implementation?
- Does it fit the current session model, tags, and `DifyWorld` usage?
- Should an existing step be reused instead of adding a new one?
- Are locators user-facing and assertions web-first?
- Does the change introduce hidden coupling across scenarios, tags, or instance state?
- Does it document or implement behavior that differs from the real hooks or configuration?
Lead findings with correctness, flake risk, and architecture drift.
Use this reference when writing or reviewing locator, assertion, isolation, or synchronization logic for Dify's Cucumber-based E2E suite.
Official sources:
- https://playwright.dev/docs/best-practices
- https://playwright.dev/docs/locators
- https://playwright.dev/docs/test-assertions
- https://playwright.dev/docs/browser-contexts
## What Matters Most
### 1. Keep scenarios isolated
Playwright's model is built around clean browser contexts so one test does not leak into another. In Dify's suite, that principle maps to per-scenario session setup in `features/support/hooks.ts` and `DifyWorld`.
Apply it like this:
- do not depend on another scenario having run first
- do not persist ad hoc scenario state outside `DifyWorld`
- do not couple ordinary scenarios to `@fresh` behavior
- when a flow needs special auth/session semantics, express that through the existing tag model or explicit hook changes
### 2. Prefer user-facing locators
Playwright recommends built-in locators that reflect what users perceive on the page.
Preferred order in this repository:
1. `getByRole`
2. `getByLabel`
3. `getByPlaceholder`
4. `getByText`
5. `getByTestId` when an explicit test contract is the most stable option
Avoid raw CSS/XPath selectors unless no stable user-facing contract exists and adding one is not practical.
Also remember:
- repeated content usually needs scoping to a stable container
- exact text matching is often too brittle when role/name or label already exists
- `getByTestId` is acceptable when semantics are weak but the contract is intentional
### 3. Use web-first assertions
Playwright assertions auto-wait and retry. Prefer them over manual state inspection.
Prefer:
- `await expect(page).toHaveURL(...)`
- `await expect(locator).toBeVisible()`
- `await expect(locator).toBeHidden()`
- `await expect(locator).toBeEnabled()`
- `await expect(locator).toHaveText(...)`
Avoid:
- `expect(await locator.isVisible()).toBe(true)`
- custom polling loops for DOM state
- `waitForTimeout` as synchronization
If a condition genuinely needs custom retry logic, use Playwright's polling/assertion tools deliberately and keep that choice local and explicit.
### 4. Let actions wait for actionability
Locator actions already wait for the element to be actionable. Do not preface every click/fill with extra timing logic unless the action needs a specific visible/ready assertion for clarity.
Good pattern:
- assert a meaningful visible state when that is part of the behavior
- then click/fill/select via locator APIs
Bad pattern:
- stack arbitrary waits before every action
- wait on unstable implementation details instead of the visible state the user cares about
### 5. Match debugging to the current suite
Playwright's wider ecosystem supports traces and rich debugging tools. Dify's current suite already captures:
- full-page screenshots
- page HTML
- console errors
- page errors
Use the existing artifact flow by default. If a task is specifically about improving diagnostics, confirm the change fits the current Cucumber architecture before importing broader Playwright tooling.
## Review Questions
- Would this locator survive DOM refactors that do not change user-visible behavior?
- Is this assertion using Playwright's retrying semantics?
- Is any explicit wait masking a real readiness problem?
- Does this code preserve per-scenario isolation?
- Is a new abstraction really needed, or does it bypass the existing `DifyWorld` + step-definition model?
description: Guide for implementing Dify frontend query and mutation patterns with TanStack Query and oRPC. Trigger when creating or updating contracts in web/contract, wiring router composition, consuming consoleQuery or marketplaceQuery in components or services, deciding whether to call queryOptions() directly or extract a helper or use-* hook, handling conditional queries, cache invalidation, mutation error handling, or migrating legacy service calls to contract-first query and mutation helpers.
---
# Frontend Query & Mutation
## Intent
- Keep contract as the single source of truth in `web/contract/*`.
- Prefer contract-shaped `queryOptions()` and `mutationOptions()`.
- Keep invalidation and mutation flow knowledge in the service layer.
- Keep abstractions minimal to preserve TypeScript inference.
## Workflow
1. Identify the change surface.
- Read `references/contract-patterns.md` for contract files, router composition, client helpers, and query or mutation call-site shape.
- Read `references/runtime-rules.md` for conditional queries, invalidation, error handling, and legacy migrations.
- Read both references when a task spans contract shape and runtime behavior.
2. Implement the smallest abstraction that fits the task.
- Default to direct `useQuery(...)` or `useMutation(...)` calls with oRPC helpers at the call site.
- Extract a small shared query helper only when multiple call sites share the same extra options.
- Create `web/service/use-{domain}.ts` only for orchestration or shared domain behavior.
- Bind invalidation in the service-layer mutation definition.
- Prefer `mutate(...)`; use `mutateAsync(...)` only when Promise semantics are required.
## Files Commonly Touched
- `web/contract/console/*.ts`
- `web/contract/marketplace.ts`
- `web/contract/router.ts`
- `web/service/client.ts`
- `web/service/use-*.ts`
- component and hook call sites using `consoleQuery` or `marketplaceQuery`
## References
- Use `references/contract-patterns.md` for contract shape, router registration, query and mutation helpers, and anti-patterns that degrade inference.
- Use `references/runtime-rules.md` for conditional queries, invalidation, `mutate` versus `mutateAsync`, and legacy migration rules.
Treat this skill as the single query and mutation entry point for Dify frontend work. Keep detailed rules in the reference files instead of duplicating them in project docs.
short_description: "Dify TanStack Query and oRPC patterns"
default_prompt: "Use this skill when implementing or reviewing Dify frontend contracts, query and mutation call sites, conditional queries, invalidation, or legacy query/mutation migrations."
1. Default to mutation helpers from `consoleQuery` or `marketplaceQuery`, for example `useMutation(consoleQuery.billing.bindPartnerStack.mutationOptions(...))`.
2. If the mutation flow is heavily custom, use oRPC clients as `mutationFn`, for example `consoleClient.xxx` or `marketplaceClient.xxx`, instead of handwritten non-oRPC mutation logic.
## Anti-Patterns
- Do not wrap `useQuery` with `options?: Partial<UseQueryOptions>`.
- Do not split local `queryKey` and `queryFn` when oRPC `queryOptions` already exists and fits the use case.
- Do not create thin `use-*` passthrough hooks for a single endpoint.
- These patterns can degrade inference, especially around `throwOnError` and `select`, and add unnecessary indirection.
## Contract Rules
- Input structure: always use `{ params, query?, body? }`.
- No-input `GET`: omit `.input(...)`; do not use `.input(type<unknown>())`.
- Path params: use `{paramName}` in the path and match it in the `params` object.
- Router nesting: group by API prefix, for example `/billing/*` becomes `billing: {}`.
- No barrel files: import directly from specific files.
- Types: import from `@/types/` and use the `type<T>()` helper.
- Mutations: prefer `mutationOptions`; use explicit `mutationKey` mainly for defaults, filtering, and devtools.
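The path-param rule can be illustrated with a small sketch (`resolvePath` is a hypothetical helper for demonstration, not part of oRPC): each `{paramName}` in the path must have a matching key in the `params` object.

```typescript
// Hypothetical helper: fill {paramName} placeholders from a params object,
// mirroring the contract rule that path and params must match.
function resolvePath(path: string, params: Record<string, string> = {}): string {
  return path.replace(/\{(\w+)\}/g, (_, name: string) => {
    const value = params[name]
    if (value === undefined)
      throw new Error(`Missing path param: ${name}`)
    return encodeURIComponent(value)
  })
}

console.log(resolvePath('/billing/invoices/{invoiceId}', { invoiceId: 'inv_42' }))
// → /billing/invoices/inv_42
```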
- Test files: `ComponentName.spec.tsx` inside a same-level `__tests__/` directory
- Placement rule: Component, hook, and utility tests must live in a sibling `__tests__/` folder at the same level as the source under test. For example, `foo/index.tsx` maps to `foo/__tests__/index.spec.tsx`, and `foo/bar.ts` maps to `foo/__tests__/bar.spec.ts`.
- Integration tests: `web/__tests__/` directory
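The placement rule is mechanical enough to express as a sketch (`specPathFor` is a hypothetical helper, shown only to make the mapping explicit):

```typescript
// Hypothetical helper: map a source file to its sibling __tests__ spec path,
// per the placement rule above.
function specPathFor(sourcePath: string): string {
  const idx = sourcePath.lastIndexOf('/')
  const dir = sourcePath.slice(0, idx)
  const file = sourcePath.slice(idx + 1).replace(/\.(tsx|ts)$/, '')
  const ext = sourcePath.endsWith('.tsx') ? '.spec.tsx' : '.spec.ts'
  return `${dir}/__tests__/${file}${ext}`
}

console.log(specPathFor('foo/index.tsx')) // → foo/__tests__/index.spec.tsx
console.log(specPathFor('foo/bar.ts'))    // → foo/__tests__/bar.spec.ts
```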
## Test Structure Template
When assigned to test a directory/path, test **ALL content** within that path:
> See [Test Structure Template](#test-structure-template) for correct import/mock patterns.
### `nuqs` Query State Testing (Required for URL State Hooks)
When a component or hook uses `useQueryState` / `useQueryStates`:
- ✅ Use `NuqsTestingAdapter` (prefer shared helpers in `web/test/nuqs-testing.tsx`)
- ✅ Assert URL synchronization via `onUrlUpdate` (`searchParams`, `options.history`)
- ✅ For custom parsers (`createParser`), keep `parse` and `serialize` bijective and add round-trip edge cases (`%2F`, `%25`, spaces, legacy encoded values)
- ✅ Verify default-clearing behavior (default values should be removed from URL when applicable)
- ⚠️ Only mock `nuqs` directly when URL behavior is explicitly out of scope for the test
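The bijectivity requirement for custom parsers can be checked with a round-trip sketch (the parser here is a minimal assumed example, not a real Dify parser):

```typescript
// Minimal custom parser sketch: parse and serialize must be inverses so URL
// state round-trips, including encoded edge cases like %2F and %25.
const parser = {
  parse: (v: string) => decodeURIComponent(v),
  serialize: (v: string) => encodeURIComponent(v),
}

for (const value of ['a/b', '100%', 'hello world']) {
  const roundTripped = parser.parse(parser.serialize(value))
  if (roundTripped !== value)
    throw new Error(`Parser not bijective for: ${value}`)
}
console.log('round-trip ok')
```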
description: Guide for implementing oRPC contract-first API patterns in Dify frontend. Triggers when creating new API contracts, adding service endpoints, integrating TanStack Query with typed contracts, or migrating legacy service calls to oRPC. Use for all API layer work in web/contract and web/service directories.
---
# oRPC Contract-First Development
## Project Structure
```
web/contract/
├── base.ts # Base contract (inputStructure: 'detailed')
├── router.ts # Router composition & type exports
├── marketplace.ts # Marketplace contracts
└── console/ # Console contracts by domain
├── system.ts
└── billing.ts
```
## Workflow
1. **Create contract** in `web/contract/console/{domain}.ts`
- Import `base` from `../base` and `type` from `@orpc/contract`
- Define route with `path`, `method`, `input`, `output`
2. **Register in router** at `web/contract/router.ts`
- Import directly from domain file (no barrel files)
- Nest by API prefix: `billing: { invoices, bindPartnerStack }`
3. **Create hooks** in `web/service/use-{domain}.ts`
- Use `consoleQuery.{group}.{contract}.queryKey()` for query keys
- Use `consoleClient.{group}.{contract}()` for API calls
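The registration step can be sketched as plain data (the contract stubs and names below are assumptions, not real Dify contracts): router nesting simply mirrors the API prefix.

```typescript
// Hypothetical contract stubs standing in for real oRPC contract objects.
const invoices = { path: '/billing/invoices', method: 'GET' } as const
const bindPartnerStack = { path: '/billing/partner-stack', method: 'POST' } as const

// Router nesting mirrors the /billing/* API prefix, so consumers address
// endpoints as router.billing.<contract>.
const router = {
  billing: { invoices, bindPartnerStack },
} as const

console.log(router.billing.invoices.path) // → /billing/invoices
```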
## Key Rules
- **Input structure**: Always use `{ params, query?, body? }` format
- **Path params**: Use `{paramName}` in path, match in `params` object
- **Router nesting**: Group by API prefix (e.g., `/billing/*` → `billing: {}`)
- **No barrel files**: Import directly from specific files
- **Types**: Import from `@/types/`, use `type<T>()` helper
<!-- Please include a summary of the change and which issue is fixed. Please also include relevant motivation and context. List any dependencies that are required for this change. -->
<!-- If this PR was created by an automated agent, add `From <Tool Name>` as the final line of the description. Example: `From Codex`. -->
## Screenshots
## Checklist
- [ ] This change requires a documentation update, included: [Dify Document](https://github.com/langgenius/dify-docs)
- [ ] I understand that this PR may be closed in case there was no previous discussion or issues. (This doesn't apply to typos!)
- [ ] I've added a test for each change that was introduced, and I tried as much as possible to make a single atomic change.
- [ ] I've updated the documentation accordingly.
- [ ] I ran `make lint && make type-check` (backend) and `cd web && pnpm exec vp staged` (frontend) to appease the lint gods
# Push events are bridged by trigger-i18n-sync.yml via repository_dispatch.
on:
  repository_dispatch:
    types: [i18n-sync]
  workflow_dispatch:
    inputs:
      files:
        description: 'Specific files to translate (space-separated, e.g., "app common"). Required for full mode; leave empty in incremental mode to use en-US files changed since HEAD~1.'
        required: false
        type: string
      languages:
        description: 'Specific languages to translate (space-separated, e.g., "zh-Hans ja-JP"). Leave empty for all supported target languages except en-US.'
        required: false
        type: string
      mode:
        description: 'Sync mode: incremental (compare with previous en-US revision) or full (sync all keys in scope)'
🤖 Generated with Claude Code GitHub Action" --base main
```
Tool rules:
- Use Read for repository files.
- Use Edit for JSON updates.
- Use Bash only for `vp`.
- Do not use Bash for `git`, `gh`, or branch management.
Required execution plan:
1. Resolve target languages.
- Use the provided `Target languages` value as the source of truth.
- If it is unexpectedly empty, read `${{ github.workspace }}/web/i18n-config/languages.ts` and use every language with `supported: true` except `en-US`.
2. Stay strictly in scope.
- Only process the files listed in `Files in scope`.
- Only process the resolved target languages, never `en-US`.
- Do not touch unrelated i18n files.
- Do not modify `${{ github.workspace }}/web/i18n/en-US/`.
3. Resolve source changes.
- If `Structured change set available` is `true`, read `/tmp/i18n-changes.json` and use it as the source of truth for file-level and key-level changes.
- For each file entry:
- `added` contains new English keys that need translations.
- `updated` contains stale keys whose English source changed; re-translate using the `after` value.
- `deleted` contains keys that should be removed from locale files.
- `fileDeleted: true` means the English file no longer exists; remove the matching locale file if present.
- Read the current English JSON file for any file that still exists so wording, placeholders, and surrounding terminology stay accurate.
- If `Structured change set available` is `false`, treat this as a scoped full sync and use the current English files plus scoped checks as the source of truth.
4. Run a scoped pre-check before editing:
- `vp run dify-web#i18n:check ${{ steps.context.outputs.FILE_ARGS }} ${{ steps.context.outputs.LANG_ARGS }}`
- Use this command as the source of truth for missing and extra keys inside the current scope.
5. Apply translations.
- For every target language and scoped file:
- If `fileDeleted` is `true`, remove the locale file if it exists and skip the rest of that file.
- If the locale file does not exist yet, create it with `Write` and then continue with `Edit` as needed.
- ADD missing keys.
- UPDATE stale translations when the English value changed.
- DELETE removed keys. Prefer `vp run dify-web#i18n:check ${{ steps.context.outputs.FILE_ARGS }} ${{ steps.context.outputs.LANG_ARGS }} --auto-remove` for extra keys so deletions stay in scope.
- Preserve placeholders exactly: `{{variable}}`, `${variable}`, HTML tags, component tags, and variable names.
- Match the existing terminology and register used by each locale.
- Prefer one Edit per file when stable, but prioritize correctness over batching.
6. Verify only the edited files.
- Run `vp run dify-web#lint:fix --quiet -- <relative edited i18n file paths under web/>`
- Run `vp run dify-web#i18n:check ${{ steps.context.outputs.FILE_ARGS }} ${{ steps.context.outputs.LANG_ARGS }}`
- If verification fails, fix the remaining problems before continuing.
7. Stop after the scoped locale files are updated and verification passes.
- Do not create branches, commits, or pull requests.
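The placeholder-preservation rule in step 5 can be sanity-checked with a small sketch (`placeholders` is a hypothetical helper, shown only to illustrate the invariant for `{{variable}}`-style placeholders):

```typescript
// Hypothetical helper (not part of the workflow): extract {{variable}}
// placeholders so source and translation can be compared.
function placeholders(s: string): string[] {
  return [...s.matchAll(/\{\{(\w+)\}\}/g)].map(m => m[1]).sort()
}

const source = 'Hello {{name}}, you have {{count}} messages'
const translation = 'Bonjour {{name}}, vous avez {{count}} messages'

// A translation is acceptable only if its placeholder set matches the source.
const ok = placeholders(source).join(',') === placeholders(translation).join(',')
console.log(ok) // → true
```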
console.log(`Structured change set too large to embed safely (${changesBase64.length} chars). Downstream workflow will regenerate it from git history.`)
- **Python**: Keep type hints on functions and attributes, and implement relevant special methods (e.g., `__repr__`, `__str__`). Prefer `TypedDict` over `dict` or `Mapping` for type safety and better code documentation.
- **TypeScript**: Use the strict config, rely on ESLint (`pnpm lint:fix` preferred) plus `pnpm type-check:tsgo`, and avoid `any` types.
<a href="./docs/bn-BD/README.md"><img alt="README in বাংলা" src="https://img.shields.io/badge/বাংলা-d9d9d9"></a>
<a href="./docs/hi-IN/README.md"><img alt="README in हिन्दी" src="https://img.shields.io/badge/Hindi-d9d9d9"></a>
</p>
Dify is an open-source platform for developing LLM applications. Its intuitive interface combines agentic AI workflows, RAG pipelines, agent capabilities, model management, observability features, and more—allowing you to quickly move from prototype to production.
## Quick start
Star Dify on GitHub and be instantly notified of new releases.
### Custom configurations
If you need to customize the configuration, please refer to the comments in our [.env.example](docker/.env.example) file and update the corresponding values in your `.env` file. Additionally, you might need to make adjustments to the `docker-compose.yaml` file itself, such as changing image versions, port mappings, or volume mounts, based on your specific deployment environment and requirements. After making any changes, please re-run `docker-compose up -d`. You can find the full list of available environment variables [here](https://docs.dify.ai/getting-started/install-self-hosted/environments).
This is the default standard for backend code in this repo. Follow it for new code.
- Code should usually include type annotations that match the repo’s current Python version (avoid untyped public APIs and “mystery” values).
- Prefer modern typing forms (e.g. `list[str]`, `dict[str, int]`) and avoid `Any` unless there’s a strong reason.
- For dictionary-like data with known keys and value types, prefer `TypedDict` over `dict[...]` or `Mapping[...]`.
- For optional keys in typed payloads, use `NotRequired[...]` (or `total=False` when most fields are optional).
- Keep `dict[...]` / `Mapping[...]` for truly dynamic key spaces where the key set is unknown.
```python
from datetime import datetime
from typing import NotRequired, TypedDict

class UserProfile(TypedDict):
    user_id: str
    email: str
    created_at: datetime
    nickname: NotRequired[str]
```
- For classes, declare all member variables explicitly with types at the top of the class body (before `__init__`), even when the class is not a dataclass or Pydantic model, so the class shape is obvious at a glance:
The scripts resolve paths relative to their location, so you can run them from any directory.
./dev/start-web
```
`./dev/setup` and `./dev/start-web` install JavaScript dependencies through the repository root workspace, so you do not need a separate `cd web && pnpm install` step.
1. Set up your application by visiting `http://localhost:3000`.
1. Start the worker service (async and scheduler tasks, runs from `api`).
```bash
./dev/start-worker
./dev/start-beat
```
### Manual commands
<details>
<summary>Show manual setup and run steps</summary>
These commands assume you start from the repository root.
1. Start the docker-compose stack.
The backend requires middleware, including PostgreSQL, Redis, and Weaviate, which can be started together using `docker-compose`.
1. Start backend (runs migrations first, in a new terminal).
```bash
cd api
uv run flask db upgrade
uv run flask run --host 0.0.0.0 --port=5001 --debug
```
1. Start Dify [web](../web) service (in a new terminal).
```bash
cd web
pnpm dev:inspect
```
1. Set up your application by visiting `http://localhost:3000`.
1. Optional: start the worker service (async tasks, in a new terminal).
```bash
cd api
uv run celery -A app.celery worker -P threads -c 2 --loglevel INFO -Q dataset,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor,retention
```
1. Optional: start Celery Beat (scheduled tasks, in a new terminal).
    This should be called by framework-specific modules (e.g., Flask)
    during application initialization.

    Args:
        capturer: Function that captures current context and returns IExecutionContext
    """
    global _capturer
    _capturer = capturer


def capture_current_context() -> IExecutionContext:
    """
    Capture current execution context.

    This function uses the registered context capturer. If no capturer
    is registered, it returns a minimal context with only contextvars
    (suitable for non-framework environments like tests or standalone scripts).

    Returns:
        IExecutionContext with captured context
    """
    if _capturer is None:
        # No framework registered - return minimal context
        return ExecutionContext(
            app_context=NullAppContext(),
            context_vars=contextvars.copy_context(),
        )
    return _capturer()


def reset_context_provider() -> None:
    """
    Reset the context capturer.

    This is primarily useful for testing to ensure a clean state.
    """
    global _capturer
    _capturer = None


from context.models import SandboxContext

__all__ = [
    "AppContext",
    "ContextProviderNotFoundError",
    "ExecutionContext",
    "ExecutionContextBuilder",
    "IExecutionContext",
    "NullAppContext",
    "SandboxContext",
    "capture_current_context",
    "read_context",
    "register_context",
    "register_context_capturer",
    "reset_context_provider",
]