Compare commits

..

183 Commits

Author SHA1 Message Date
ef9a741781 feat(trigger): enhance trigger management with new error handling and response structure
- Added `TriggerInvokeError` and `TriggerIgnoreEventError` for better error categorization during trigger invocation.
- Updated `TriggerInvokeResponse` to include a `cancelled` field, indicating if a trigger was ignored.
- Enhanced `TriggerManager` to handle specific errors and return appropriate responses.
- Refactored `dispatch_triggered_workflows` to improve workflow execution logic and error handling.

These changes improve the robustness and clarity of the trigger management system.
2025-09-23 16:01:59 +08:00
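Below is a minimal Python sketch of how the new error types and the `cancelled` field could fit together. The class and field names come from the commit message; the surrounding function and the plugin call are assumptions, not the actual Dify implementation.

```python
from pydantic import BaseModel


class TriggerInvokeError(Exception):
    """Raised when a trigger invocation fails outright."""


class TriggerIgnoreEventError(Exception):
    """Raised when a trigger decides the incoming event should be ignored."""


class TriggerInvokeResponse(BaseModel):
    cancelled: bool = False  # True when the trigger ignored the event
    outputs: dict = {}


def invoke_trigger(call_plugin) -> TriggerInvokeResponse:
    # call_plugin is a stand-in for the real plugin invocation (assumed).
    try:
        return TriggerInvokeResponse(outputs=call_plugin())
    except TriggerIgnoreEventError:
        return TriggerInvokeResponse(cancelled=True)
    except TriggerInvokeError:
        raise  # genuine invocation failures still surface to the caller
```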
c5de91ba94 refactor(trigger): update cache expiration constants and log key format
- Renamed validation-related constants to builder-related ones for clarity.
- Updated cache expiration from milliseconds to seconds for consistency.
- Adjusted log key format to reflect the builder context instead of validation.

These changes enhance the readability and maintainability of the TriggerSubscriptionBuilderService.
2025-09-22 13:37:46 +08:00
bc1e6e011b fix(trigger): update cache key format in TriggerSubscriptionBuilderService
- Changed the cache key format in the `encode_cache_key` method from `trigger:subscription:validation:{subscription_id}` to `trigger:subscription:builder:{subscription_id}` to better reflect its purpose.

This update improves clarity in cache key usage for trigger subscriptions.
2025-09-22 13:37:46 +08:00
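A hypothetical reconstruction of the renamed builder cache helpers from the two commits above; only the millisecond-to-second switch and the key format are stated in the messages, the concrete expiry value is made up.

```python
# Expiry is now expressed in seconds; the number here is an assumption.
SUBSCRIPTION_BUILDER_CACHE_EXPIRE_SECONDS = 600


def encode_cache_key(subscription_id: str) -> str:
    # previously: f"trigger:subscription:validation:{subscription_id}"
    return f"trigger:subscription:builder:{subscription_id}"
```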
906028b1fb fix: start node validation 2025-09-22 12:58:20 +08:00
034602969f feat(schedule-trigger): enhance cron parser with a mature library and comprehensive testing (#26002) 2025-09-22 10:01:48 +08:00
4ca14bfdad chore: improve webhook (#25998) 2025-09-21 12:16:31 +08:00
59f56d8c94 feat: schedule trigger default daily midnight (#25937) 2025-09-19 08:05:00 +08:00
63d26f0478 fix: api key params 2025-09-18 17:35:34 +08:00
eae65e55ce feat: oauth config opt & add dynamic options 2025-09-18 17:12:48 +08:00
0edf06329f fix: apply suggestions 2025-09-18 17:04:02 +08:00
6943a379c9 feat: show placeholder '--' for invalid cron expressions in node display
- Return '--' placeholder when cron mode has empty or invalid expressions
- Prevents displaying fallback dates that confuse users
- Maintains consistent UX for invalid schedule configurations
2025-09-18 17:04:02 +08:00
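The display rule itself lives in a TypeScript component, but the logic is simple enough to sketch; here it is in Python, with `croniter` standing in for whichever cron parser the node display actually uses.

```python
from datetime import datetime

from croniter import croniter


def format_next_run(expression: str) -> str:
    # Empty or unparseable expressions get the placeholder instead of a misleading fallback date.
    if not expression or not croniter.is_valid(expression):
        return "--"
    next_run = croniter(expression, datetime.now()).get_next(datetime)
    return next_run.isoformat()
```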
e49534b70c fix: make frequency optional 2025-09-18 17:04:02 +08:00
344616ca2f fix: clear opposite mode data only when editing, preserve data during mode switching 2025-09-18 17:04:02 +08:00
0e287a9c93 chore: add missing translations 2025-09-18 13:25:57 +08:00
8141f53af5 fix: add preventDefaultSubmit prop to BaseForm to prevent unwanted page refresh on Enter key 2025-09-18 12:48:26 +08:00
5a6cb0d887 feat: enhance API key modal step indicator with active dots and improved styling 2025-09-18 12:44:11 +08:00
26e7677595 fix: align width and use rounded xl 2025-09-18 12:08:21 +08:00
814b0e1fe8 feat: oauth config init 2025-09-18 00:00:50 +08:00
a173dc5c9d feat(provider): add multiple option support in ProviderConfig
- Introduced a new field `multiple` in the `ProviderConfig` class to allow for multiple selections, enhancing the configuration capabilities for providers.
- This addition improves flexibility in provider settings and aligns with the evolving requirements for provider configurations.
2025-09-17 22:12:01 +08:00
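A guess at the shape of the change; only the `multiple` field is named in the commit, the other fields and defaults are placeholders.

```python
from pydantic import BaseModel


class ProviderConfig(BaseModel):
    name: str
    type: str = "select"    # placeholder; the real class models many config types
    multiple: bool = False  # new: allow selecting more than one value for this option
```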
a567facf2b refactor(trigger): streamline encrypter creation in TriggerProviderService
- Replaced calls to `create_trigger_provider_encrypter` and `create_trigger_provider_oauth_encrypter` with a unified `create_provider_encrypter` method, simplifying the encrypter creation process.
- Updated the parameters passed to the new method to enhance configuration management and cache handling.

These changes improve code clarity and maintainability in the trigger provider service.
2025-09-17 21:47:11 +08:00
e76d80defe fix(trigger): update client parameter handling in TriggerProviderService
- Modified the `create_provider_encrypter` call to include a cache assignment, ensuring proper management of encryption resources.
- Added a cache deletion step after updating client parameters, enhancing the integrity of the parameter handling process.

These changes improve the reliability of client parameter updates within the trigger provider service.
2025-09-17 20:57:52 +08:00
4a17025467 fix(trigger): update session management in TriggerProviderService
- Changed session management in `TriggerProviderService` from `autoflush=True` to `expire_on_commit=False` for improved control over session state.
- This change enhances the reliability of database interactions by preventing automatic expiration of objects after commit, ensuring data consistency during trigger operations.

These updates contribute to better session handling and stability in trigger-related functionalities.
2025-09-16 18:01:44 +08:00
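A simplified sketch of the session change; the real service builds its session from the application's engine, which is replaced with an in-memory SQLite engine here so the snippet stands alone.

```python
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

engine = create_engine("sqlite:///:memory:")  # placeholder engine for the sketch

# Before: the service relied on autoflush=True.
# After: expire_on_commit=False keeps loaded attributes readable after commit,
# so trigger provider objects are not silently reloaded mid-operation.
SessionFactory = sessionmaker(bind=engine, expire_on_commit=False)

with SessionFactory() as session:
    ...  # provider reads/writes; attributes stay populated after session.commit()
```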
bd1fcd3525 feat(trigger): add TriggerProviderInfoApi and enhance trigger provider service
- Introduced `TriggerProviderInfoApi` to retrieve information for a specific trigger provider, improving API capabilities.
- Added `get_trigger_provider` method in `TriggerProviderService` to fetch trigger provider details, enhancing data retrieval.
- Updated route configurations to include the new API endpoint for trigger provider information.

These changes enhance the functionality and usability of trigger provider interactions within the application.
2025-09-16 17:03:52 +08:00
0cb0cea167 feat(trigger): enhance trigger plugin data structure and error handling
- Added `plugin_unique_identifier` to `PluginTriggerData` and `TriggerProviderApiEntity` to improve identification of trigger plugins.
- Introduced `PluginTriggerDispatchData` for structured dispatch data in Celery tasks, enhancing the clarity of trigger dispatching.
- Updated `dispatch_triggered_workflows_async` to utilize the new dispatch data structure, improving error handling and logging for trigger invocations.
- Enhanced metadata handling in `TriggerPluginNode` to include trigger information, aiding in debugging and tracking.

These changes improve the robustness and maintainability of trigger plugin interactions within the workflow system.
2025-09-16 15:39:40 +08:00
ee68a685a7 fix(workflow): enforce non-nullable arguments in DraftWorkflowTriggerRunApi
- Updated the argument definitions in the DraftWorkflowTriggerRunApi to include `nullable=False` for `node_id`, `trigger_name`, and `subscription_id`. This change ensures that these fields are always provided in the request, improving the robustness of the API.

This fix enhances input validation and prevents potential errors related to missing arguments.
2025-09-16 11:25:16 +08:00
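Dify's console endpoints use flask-restful's `reqparse`, so the change presumably looks roughly like this; whether `required=True` is also set is an assumption.

```python
from flask_restful import reqparse

parser = reqparse.RequestParser()
parser.add_argument("node_id", type=str, required=True, nullable=False, location="json")
parser.add_argument("trigger_name", type=str, required=True, nullable=False, location="json")
parser.add_argument("subscription_id", type=str, required=True, nullable=False, location="json")
# nullable=False rejects an explicit JSON null instead of letting it through as None.
```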
c78bd492af feat(trigger): add supported creation methods to TriggerProviderApiEntity
- Introduced a new field `supported_creation_methods` in `TriggerProviderApiEntity` to specify the available methods for creating triggers, including OAUTH, APIKEY, and MANUAL.
- Updated the `PluginTriggerProviderController` to populate this field based on the entity's schemas, enhancing the API's clarity and usability.

These changes improve the flexibility and configurability of trigger providers within the application.
2025-09-15 17:01:29 +08:00
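A sketch of the new field; the member names come from the commit message, while the string values and surrounding entity fields are assumptions.

```python
from enum import StrEnum

from pydantic import BaseModel


class CreationMethod(StrEnum):
    OAUTH = "oauth"
    APIKEY = "api_key"
    MANUAL = "manual"


class TriggerProviderApiEntity(BaseModel):
    name: str
    supported_creation_methods: list[CreationMethod] = []
```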
6857bb4406 feat(trigger): implement plugin trigger synchronization and subscription management in workflow
- Added a new event handler for syncing plugin trigger relationships when a draft workflow is synced, ensuring that the database reflects the current state of plugin triggers.
- Introduced subscription management features in the frontend, allowing users to select, add, and remove subscriptions for trigger plugins.
- Updated various components to support subscription handling, including the addition of new UI elements for subscription selection and removal.
- Enhanced internationalization support by adding new translation keys related to subscription management.

These changes improve the overall functionality and user experience of trigger plugins within workflows.
2025-09-15 15:49:07 +08:00
dcf3ee6982 fix(trigger): update trigger label assignment for improved clarity
- Changed the label assignment in the convertToTriggerWithProvider function from trigger.description.human to trigger.identity.label, ensuring the label reflects the correct identity format.

This update enhances the accuracy of trigger data representation in the application.
2025-09-15 14:50:56 +08:00
76850749e4 feat(trigger): enhance trigger debugging with polling API and new subscription retrieval
- Refactored DraftWorkflowTriggerNodeApi and DraftWorkflowTriggerRunApi to implement polling for trigger events instead of listening, improving responsiveness and reliability.
- Introduced TriggerSubscriptionBuilderGetApi to retrieve subscription instances for trigger providers, enhancing the API's capabilities.
- Removed deprecated trigger event classes and streamlined event handling in TriggerDebugService, ensuring a cleaner architecture.
- Updated Queue and Stream entities to reflect the changes in trigger event handling, improving overall clarity and maintainability.

These enhancements significantly improve the trigger debugging experience and API usability.
2025-09-14 19:12:31 +08:00
91e5e33440 feat: add modal style opt 2025-09-12 20:22:33 +08:00
11e55088c9 fix: restore id prop passing to node children in BaseNode (#25520) 2025-09-11 17:54:31 +08:00
57c0bc9fb6 feat(trigger): refactor trigger debug event handling and improve response structures
- Renamed and refactored trigger debug event classes to enhance clarity and consistency, including changes from `TriggerDebugEventData` to `TriggerEventData` and related response classes.
- Updated `DraftWorkflowTriggerNodeApi` and `DraftWorkflowTriggerRunApi` to utilize the new event structures, improving the handling of trigger events.
- Removed the `TriggerDebugEventGenerator` class, consolidating event generation directly within the API logic for streamlined processing.
- Enhanced error handling and response formatting for trigger events, ensuring structured outputs for better integration and debugging.

This refactor improves the overall architecture of trigger debugging, making it more intuitive and maintainable.
2025-09-11 16:55:58 +08:00
c3ebb22a4b feat(trigger): add workflows_in_use field to TriggerProviderSubscriptionApiEntity
- Introduced a new field `workflows_in_use` to the TriggerProviderSubscriptionApiEntity to track the number of workflows utilizing each subscription.
- Enhanced the TriggerProviderService to populate this field by querying the WorkflowPluginTrigger model for usage counts associated with each subscription.

This addition improves the visibility of subscription usage within the trigger provider context.
2025-09-11 16:55:58 +08:00
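The usage count could be produced with a query along these lines; `WorkflowPluginTrigger` is the model named in this log, but the column name and session handling are assumptions.

```python
from sqlalchemy import func, select


def count_workflows_in_use(session, subscription_id: str) -> int:
    # WorkflowPluginTrigger is the ORM model referenced elsewhere in this log;
    # its subscription_id column is assumed here.
    stmt = (
        select(func.count())
        .select_from(WorkflowPluginTrigger)
        .where(WorkflowPluginTrigger.subscription_id == subscription_id)
    )
    return session.execute(stmt).scalar_one()
```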
1562d00037 feat(trigger): implement trigger debugging functionality
- Added DraftWorkflowTriggerNodeApi and DraftWorkflowTriggerRunApi for debugging trigger nodes and workflows.
- Enhanced TriggerDebugService to manage trigger debugging sessions and event listening.
- Introduced structured event responses for trigger debugging, including listening started, received, node finished, and workflow started events.
- Updated Queue and Stream entities to support new trigger debug events.
- Refactored trigger input handling to streamline the process of creating inputs from trigger data.

This implementation improves the debugging capabilities for trigger nodes and workflows, providing clearer event handling and structured responses.
2025-09-11 16:55:58 +08:00
e9e843b27d fix(tool): standardize tool naming across components
- Updated references from `trigger_name` to `tool_name` in multiple components for consistency.
- Adjusted type definitions to reflect the change in naming convention, enhancing clarity in the codebase.
2025-09-11 16:55:57 +08:00
ec33b9908e fix(trigger): improve formatting of OAuth client response in TriggerOAuthClientManageApi
- Refactored the return statement in the TriggerOAuthClientManageApi to enhance readability and maintainability.
- Ensured consistent formatting of the response structure for better clarity in API responses.
2025-09-11 16:55:57 +08:00
67004368d9 feat: sub card style 2025-09-11 16:22:59 +08:00
50bff270b6 feat: add subscription 2025-09-10 23:21:33 +08:00
bd5cf1c272 fix(trigger): enhance OAuth client response in TriggerOAuthClientManageApi
- Integrated TriggerManager to retrieve the trigger provider's OAuth client schema.
- Updated the return structure to include the redirect URI and OAuth client schema for improved API response clarity.
2025-09-10 17:35:30 +08:00
d22404994a chore: add comments on generate_webhook_id 2025-09-10 17:23:29 +08:00
9898730cc5 feat: add webhook node limit validation (max 5 per workflow)
- Add MAX_WEBHOOK_NODES_PER_WORKFLOW constant set to 5
- Validate webhook node count in sync_webhook_relationships method
- Raise ValueError when workflow exceeds webhook node limit
- Block workflow save when limit is exceeded to ensure data integrity
- Provide clear error message indicating current count and maximum allowed

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-09-10 17:22:09 +08:00
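The validation itself is simple enough to sketch directly; how the webhook nodes are collected from the workflow graph is assumed.

```python
MAX_WEBHOOK_NODES_PER_WORKFLOW = 5


def validate_webhook_node_count(webhook_node_ids: list[str]) -> None:
    count = len(webhook_node_ids)
    if count > MAX_WEBHOOK_NODES_PER_WORKFLOW:
        # Blocks the workflow save with a message stating the current count and the limit.
        raise ValueError(
            f"Workflow contains {count} webhook nodes; "
            f"at most {MAX_WEBHOOK_NODES_PER_WORKFLOW} are allowed."
        )
```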
b0f1e55a87 refactor: remove triggered_by field from webhook triggers and use automatic sync
- Remove triggered_by field from WorkflowWebhookTrigger model
- Replace manual webhook creation/deletion APIs with automatic sync via WebhookService
- Keep only GET API for retrieving webhook information
- Use same webhook ID for both debug and production environments (differentiated by endpoint)
- Add sync_webhook_relationships to automatically manage webhook lifecycle
- Update tests to remove triggered_by references
- Clean up unused imports and fix type checking issues

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-09-10 17:17:19 +08:00
6566824807 fix(trigger): update return type in TriggerSubscriptionBuilderService
- Changed the return type of the method in `TriggerSubscriptionBuilderService` from `SubscriptionBuilder` to `SubscriptionBuilderApiEntity` for improved clarity and alignment with API entity structures.
- Updated the return statement to utilize the new method for converting the builder to the API entity.
2025-09-10 15:48:32 +08:00
9249a2af0d fix(trigger): update event data publishing in TriggerDebugService
- Changed the event data publishing method in `TriggerDebugService` to use `model_dump()` for improved data structure handling when publishing to Redis Pub/Sub.
2025-09-10 15:48:32 +08:00
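Publishing a Pydantic model over Redis Pub/Sub might look like this; the channel name and event fields are invented for the sketch, only the use of `model_dump()` is from the commit.

```python
import json

import redis
from pydantic import BaseModel


class TriggerDebugEvent(BaseModel):
    event: str
    data: dict = {}


def publish_debug_event(client: redis.Redis, session_id: str, event: TriggerDebugEvent) -> None:
    # model_dump() gives a plain dict that serializes cleanly for Pub/Sub consumers.
    client.publish(f"trigger:debug:{session_id}", json.dumps(event.model_dump()))
```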
112fc3b1d1 fix: clear schedule config when exporting data 2025-09-10 13:50:37 +08:00
37299b3bd7 fix: rename migration 2025-09-10 13:41:50 +08:00
8f65ce995a fix: migrations 2025-09-10 13:38:34 +08:00
4a743e6dc1 feat: add workflow schedule trigger support (#24428)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-10 13:24:23 +08:00
07dda61929 fix/tooltip and onboarding ui (#25451) 2025-09-10 10:40:14 +08:00
0d8438ef40 fix(trigger): add 'trigger' category key to plugin constants to avoid errors 2025-09-10 10:34:33 +08:00
96bb638969 fix: limits 2025-09-09 23:32:51 +08:00
e74962272e fix: only workflow use trigger api (#25443) 2025-09-09 23:14:10 +08:00
5a15419baf feat(trigger): implement debug session capabilities for trigger nodes
- Added `DraftWorkflowTriggerNodeApi` to handle debugging of trigger nodes, allowing for real-time event listening and session management.
- Introduced `TriggerDebugService` for managing debug sessions and event dispatching using Redis Pub/Sub.
- Updated `TriggerService` to support dispatching events to debug sessions and refactored related methods for improved clarity and functionality.
- Enhanced data structures in `request.py` and `entities.py` to accommodate new debug event data requirements.

These changes significantly improve the debugging capabilities for trigger nodes in draft workflows, facilitating better development and troubleshooting processes.
2025-09-09 21:27:31 +08:00
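The listening side of such a debug session could be as small as the following; the channel name mirrors the publish sketch above and is an assumption, as is the single-message polling style.

```python
import redis


def wait_for_debug_event(client: redis.Redis, session_id: str, timeout: float = 30.0):
    pubsub = client.pubsub()
    pubsub.subscribe(f"trigger:debug:{session_id}")
    try:
        # Returns the next message on the session channel, or None if the timeout elapses.
        return pubsub.get_message(ignore_subscribe_messages=True, timeout=timeout)
    finally:
        pubsub.close()
```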
e8403977b9 feat(plugin): add triggers field to PluginDeclaration for enhanced functionality
- Introduced a new `triggers` field in the `PluginDeclaration` class to support trigger functionalities within plugins.
- This addition improves the integration of triggers in the plugin architecture, aligning with recent updates to the trigger entity structures.

These changes enhance the overall capabilities of the plugin system.
2025-09-09 17:22:11 +08:00
add2ca85f2 refactor(trigger): update plugin and trigger entity structures
- Removed unnecessary newline in `TriggerPluginNode` class for consistency.
- Made `provider` in `TriggerIdentity` optional to enhance flexibility.
- Added `trigger` field to `PluginDeclaration` and updated `PluginCategory` to include `Trigger`, improving the integration of trigger functionalities within the plugin architecture.

These changes streamline the entity definitions and enhance the overall structure of the trigger and plugin components.
2025-09-09 17:16:44 +08:00
fbb7b02e90 fix(webhook): prevent SimpleSelect from resetting user selections (#25423)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-09-09 17:11:11 +08:00
249b62c9de fix: workflow header (#25411) 2025-09-09 15:34:15 +08:00
b433322e8d feat/trigger plugin apikey (#25388) 2025-09-09 15:01:06 +08:00
1c8850fc95 feat: adjust scroll to selected node position to top-left area (#25403) 2025-09-09 14:58:42 +08:00
dc16f1b65a refactor(trigger): simplify provider path handling in workflow components
- Updated various components to directly use `provider.name` instead of constructing a path with `provider.plugin_id` and `provider.name`.
- Adjusted related calls to `invalidateSubscriptions` and other functions to reflect this change.

These modifications enhance code clarity and streamline the handling of provider information in the trigger plugin components.
2025-09-09 00:17:20 +08:00
ff30395dc1 fix(OAuthClientConfigModal): simplify provider path handling in OAuth configuration
- Updated the provider path handling in `OAuthClientConfigModal` to directly use `provider.name` instead of constructing a path with `provider.plugin_id` and `provider.name`.
- Adjusted the corresponding calls to `invalidateOAuthConfig` and `configureTriggerOAuth` to reflect this change.

These modifications enhance code clarity and streamline the OAuth configuration process in the trigger plugin component.
2025-09-09 00:10:04 +08:00
8e600f3302 feat(trigger): optimize trigger parameter schema handling in useConfig
- Refactored the trigger parameter schema construction in `useConfig` to utilize a Map for improved efficiency and clarity.
- Updated the return value to ensure unique schema entries, enhancing the integrity of the trigger configuration.

These changes streamline the management of trigger parameters, improving performance and maintainability in the workflow component.
2025-09-08 23:39:44 +08:00
5a1e0a8379 feat(FormInputItem): enhance UI components for improved user experience
- Added loading indicator using `RiLoader4Line` to `FormInputItem` for better feedback during option fetching.
- Refactored button and option styles for improved accessibility and visual consistency.
- Updated text color classes to enhance readability based on loading state and selection.

These changes improve the overall user experience and visual clarity of the form input components.
2025-09-08 23:19:33 +08:00
2a3ce6baa9 feat(trigger): enhance plugin and trigger integration with updated naming conventions
- Refactored `PluginFetchDynamicSelectOptionsApi` to replace the `extra` argument with `credential_id`, improving clarity in dynamic option fetching.
- Updated `ProviderConfigEncrypter` to rename `mask_tool_credentials` to `mask_credentials` for consistency, and added a new method to maintain backward compatibility.
- Enhanced `PluginParameterService` to utilize `credential_id` for fetching subscriptions, improving the handling of trigger credentials.
- Adjusted various components and types in the frontend to replace `tool_name` with `trigger_name`, ensuring consistency across the application.
- Introduced `multiple` property in `TriggerParameter` to support multi-select functionality.

These changes improve the integration of triggers and plugins, enhance code clarity, and align naming conventions across the codebase.
2025-09-08 23:14:50 +08:00
01b2f9cff6 feat: add providerType prop to form components for dynamic behavior
- Introduced `providerType` prop in `FormInputItem`, `ToolForm`, and `ToolFormItem` components to support both 'tool' and 'trigger' types, enhancing flexibility in handling different provider scenarios.
- Updated the `useFetchDynamicOptions` function to accept `provider_type` as 'tool' | 'trigger', allowing for more dynamic option fetching based on the provider type.

These changes improve the adaptability of the form components and streamline the integration of different provider types in the workflow.
2025-09-08 18:29:48 +08:00
ac38614171 refactor(trigger): streamline trigger provider verification and update imports
- Updated `TriggerSubscriptionBuilderVerifyApi` to directly return the result of `verify_trigger_subscription_builder`, improving clarity.
- Refactored import statement in `trigger_plugin/__init__.py` to point to the correct module, enhancing code organization.
- Removed the obsolete `node.py` file, cleaning up the codebase by eliminating unused components.

These changes enhance the maintainability and clarity of the trigger provider functionality.
2025-09-08 18:25:04 +08:00
eb95c5cd07 feat(trigger): enhance subscription builder management and update API
- Introduced `SubscriptionBuilderUpdater` class to streamline updates to subscription builders, encapsulating properties like name, parameters, and credentials.
- Refactored API endpoints to utilize the new updater class, improving code clarity and maintainability.
- Adjusted OAuth handling to create and update subscription builders more effectively, ensuring proper credential management.

This change enhances the overall functionality and organization of the trigger subscription builder API.
2025-09-08 15:09:47 +08:00
a799b54b9e feat: initialize trigger status at application level to prevent canvas refresh state issues (#25329) 2025-09-08 09:34:28 +08:00
98ba0236e6 feat: implement trigger plugin authentication UI (#25310) 2025-09-07 21:53:22 +08:00
b6c552df07 fix: add stable sorting for trigger list to prevent position changes (#25328) 2025-09-07 21:52:41 +08:00
e2827e475d feat: implement trigger-plugin support with real-time status sync (#25326) 2025-09-07 21:29:53 +08:00
58cbd337b5 fix: improve test run menu and checklist ui (#25300) 2025-09-06 22:54:36 +08:00
a91e59d544 feat: implement trigger plugin frontend integration (#25283) 2025-09-06 16:18:46 +08:00
814787677a feat(trigger): update plugin trigger API and model to use trigger_name
- Modified `PluginTriggerApi` to accept `trigger_name` as a JSON argument and return encoded plugin triggers.
- Updated `WorkflowPluginTrigger` model to replace `trigger_id` with `trigger_name` for better clarity.
- Adjusted `WorkflowPluginTriggerService` to handle the new `trigger_name` field and ensure proper error handling for subscriptions.
- Enhanced `workflow_trigger_fields` to include `trigger_name` in the plugin trigger schema.

This change improves the API's clarity and aligns the model with the updated naming conventions.
2025-09-05 15:56:13 +08:00
85caa5bd0c fix(trigger): clean up whitespace in encryption utility and trigger provider service
- Removed unnecessary blank lines in `encryption.py` and `trigger_provider_service.py` for improved code readability.
- This minor adjustment enhances the overall code quality without altering functionality.

🤖 Generated with [Claude Code](https://claude.ai/code)
2025-09-05 15:56:13 +08:00
e04083fc0e feat: add icon support for trigger plugin workflow nodes (#25241) 2025-09-05 15:50:54 +08:00
cf532e5e0d feat(trigger): add context caching for trigger providers
- Add plugin_trigger_providers and plugin_trigger_providers_lock to contexts module
- Implement caching mechanism in TriggerManager.get_trigger_provider() method
- Cache fetched trigger providers to reduce repeated daemon calls
- Use double-check locking pattern for thread-safe cache access

This follows the same pattern as ToolManager.get_plugin_provider() to improve performance
by avoiding redundant requests to the daemon when accessing trigger providers.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-09-05 14:30:10 +08:00
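Double-check locking is a standard pattern; the sketch below shows its shape, with the context-variable plumbing from the contexts module simplified to a module-level dict and a `threading.Lock`.

```python
import threading

_plugin_trigger_providers: dict[str, object] = {}
_plugin_trigger_providers_lock = threading.Lock()


def get_trigger_provider(provider_id: str, fetch_from_daemon):
    # First check without the lock keeps the common, already-cached path cheap.
    cached = _plugin_trigger_providers.get(provider_id)
    if cached is not None:
        return cached
    with _plugin_trigger_providers_lock:
        # Second check inside the lock: another thread may have filled the cache meanwhile.
        cached = _plugin_trigger_providers.get(provider_id)
        if cached is None:
            cached = fetch_from_daemon(provider_id)  # remote call to the plugin daemon (assumed)
            _plugin_trigger_providers[provider_id] = cached
        return cached
```

The second lookup under the lock is what prevents two threads from both calling the daemon when the cache is cold.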
c097fc2c48 refactor(trigger): add uuid import to trigger provider service
- Imported `uuid` in `trigger_provider_service.py` to support unique identifier generation.
- This change prepares the service for future enhancements that may require UUID functionality.
2025-09-05 14:30:10 +08:00
0371d71409 feat(trigger): enhance trigger subscription management and cache handling
- Added `name` parameter to `TriggerSubscriptionBuilderCreateApi` for better subscription identification.
- Implemented `delete_cache_for_subscription` function to clear cache associated with trigger subscriptions.
- Updated `WorkflowPluginTriggerService` to check for existing subscriptions before creating new plugin triggers, improving error handling.
- Refactored `TriggerProviderService` to utilize the new cache deletion method during provider deletion.

This improves the overall management of trigger subscriptions and enhances cache efficiency.
2025-09-05 14:30:10 +08:00
81ef7343d4 chore: (trigger) refactor webhook service (#25229) 2025-09-05 14:00:20 +08:00
8e4b59c90c feat: improve trigger plugin UI layout and responsiveness (#25232) 2025-09-05 14:00:14 +08:00
68f73410fc chore: (trigger) add WEBHOOK_REQUEST_BODY_MAX_SIZE (#25217) 2025-09-05 12:23:11 +08:00
88af8ed374 fix: block selector ui (#25228) 2025-09-05 12:22:13 +08:00
015f82878e feat(trigger): integrate plugin icon retrieval into trigger provider
- Added `get_plugin_icon_url` method in `PluginService` to fetch plugin icons.
- Updated `PluginTriggerProviderController` to use the new method for icon handling.
- Refactored `ToolTransformService` to utilize `PluginService` for consistent icon URL generation.

This enhances the trigger provider's ability to manage plugin icons effectively.
2025-09-05 12:01:41 +08:00
3874e58dc2 refactor(trigger): enhance trigger provider deletion process and session management 2025-09-05 11:31:57 +08:00
9f8c159583 feat(trigger): implement trigger plugin block selector following tools pattern (#25204) 2025-09-05 10:20:47 +08:00
d8f6f9ce19 chore: (trigger)change content type from form to application/octet-stream (#25167) 2025-09-05 09:54:07 +08:00
eab03e63d4 refactor(trigger): rename request logs API and enhance logging functionality
- Renamed `TriggerSubscriptionBuilderRequestLogsApi` to `TriggerSubscriptionBuilderLogsApi` for clarity.
- Updated the API endpoint to retrieve logs for subscription builders.
- Enhanced logging functionality in `TriggerSubscriptionBuilderService` to append and list logs more effectively.
- Refactored trigger processing tasks to improve naming consistency and clarity in logging.

🤖 Generated with [Claude Code](https://claude.ai/code)
2025-09-04 21:11:25 +08:00
461829274a feat: (trigger) support file upload in webhook (#25159) 2025-09-04 18:33:42 +08:00
e751c0c535 refactor(trigger): update trigger provider API and clean up unused classes
- Renamed the API endpoint for trigger providers from `/workspaces/current/trigger-providers` to `/workspaces/current/triggers` for consistency.
- Removed unused `TriggerProviderCredentialsCache` and `TriggerProviderOAuthClientParamsCache` classes to streamline the codebase.
- Enhanced the `TriggerProviderApiEntity` to include additional properties and improved the conversion logic in `PluginTriggerProviderController`.

🤖 Generated with [Claude Code](https://claude.ai/code)
2025-09-04 17:45:15 +08:00
1fffc79c32 fix: prevent empty workflow draft sync during page navigation (#25140) 2025-09-04 17:13:49 +08:00
83fab4bc19 chore: (webhook) clear the body variables when the content type changes (#25136) 2025-09-04 15:09:54 +08:00
f60e28d2f5 feat(trigger): enhance user role validation and add request logs API for trigger providers
- Updated user role validation in PluginTriggerApi and WebhookTriggerApi to assert current_user as an Account and check tenant ID.
- Introduced TriggerSubscriptionBuilderRequestLogsApi to retrieve request logs for subscription instances, ensuring proper user authentication and error handling.
- Added new API endpoint for accessing request logs related to trigger providers.

🤖 Generated with [Claude Code](https://claude.ai/code)
2025-09-04 14:44:02 +08:00
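The role check probably reduces to something like the following; `current_user` comes from flask-login and `Account` is Dify's console user model, but the exact attributes and exception type are assumptions.

```python
from flask_login import current_user
from werkzeug.exceptions import Forbidden


def ensure_account_in_tenant(tenant_id: str) -> None:
    # Account is Dify's console user model (import omitted in this sketch).
    if not isinstance(current_user, Account) or current_user.current_tenant_id != tenant_id:
        raise Forbidden()
```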
a62d7aa3ee feat(trigger): add plugin trigger workflow support and refactor trigger system
- Add new workflow plugin trigger service for managing plugin-based triggers
- Implement trigger provider encryption utilities for secure credential storage
- Add custom trigger errors module for better error handling
- Refactor trigger provider and manager classes for improved plugin integration
- Update API endpoints to support plugin trigger workflows
- Add database migration for plugin trigger workflow support

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-09-04 13:20:43 +08:00
cc84a45244 chore: (webhook) use variable instead of InputVar (#25119) 2025-09-04 11:10:42 +08:00
5cf3d24018 fix(webhook): selected type ui style (#25106) 2025-09-04 10:59:08 +08:00
4bdbe617fe fix: uuidv7 (#25097) 2025-09-04 08:44:14 +08:00
33c867fd8c feat(workflow): enhance webhook status code input with increment/decrement controls (#25099) 2025-09-03 22:26:00 +08:00
2013ceb9d2 chore: validate param type of application/json when calling a webhook (#25074) 2025-09-03 15:49:07 +08:00
7120c6414c fix: content type of webhook (#25032) 2025-09-03 15:13:01 +08:00
5ce7b2d98d refactor(migrations): remove obsolete plugin_trigger migration file
- Deleted the plugin_trigger migration file as it is no longer needed in the codebase.
- Updated model imports in `__init__.py` to include new trigger-related classes for better organization.
2025-09-03 15:02:17 +08:00
cb82198271 refactor(trigger): update trigger provider classes and API endpoints
- Renamed classes for trigger subscription management to improve clarity, including TriggerProviderSubscriptionListApi to TriggerSubscriptionListApi and TriggerSubscriptionsDeleteApi to TriggerSubscriptionDeleteApi.
- Updated API endpoint paths to reflect the new naming conventions for trigger subscriptions.
- Removed deprecated TriggerOAuthRefreshTokenApi class to streamline the codebase.
- Added trigger_providers import to the console controller for better organization.
2025-09-03 14:53:27 +08:00
5e5ffaa416 feat(tool-form): add extraParams prop to ToolForm and ToolFormItem components
- Introduced extraParams prop to both ToolForm and ToolFormItem components for enhanced flexibility in passing additional parameters.
- Updated component usage to accommodate the new prop, improving the overall functionality of the tool forms.
2025-09-03 14:53:27 +08:00
4b253e1f73 feat(trigger): plugin trigger workflow 2025-09-03 14:53:27 +08:00
dd929dbf0e fix(dynamic_select): implement function 2025-09-03 14:53:27 +08:00
97a9d34e96 feat(trigger): introduce plugin trigger management and enhance trigger processing
- Remove the debug endpoint for cleaner API structure
- Add support for TRIGGER_PLUGIN in NodeType enumeration
- Implement WorkflowPluginTrigger model to map plugin triggers to workflow nodes
- Enhance TriggerService to process plugin triggers and store trigger data in Redis
- Update node mapping to include TriggerPluginNode for workflow execution

Co-authored-by: Claude <noreply@anthropic.com>
2025-09-03 14:53:27 +08:00
602070ec9c refactor(trigger): improve method signature formatting in TriggerService
- Adjust the formatting of the `process_triggered_workflows` method signature for better readability
- Ensure consistent style across method definitions in the TriggerService class

Co-authored-by: Claude <noreply@anthropic.com>
2025-09-03 14:53:27 +08:00
afd8989150 feat(trigger): introduce subscription builder and enhance trigger management
- Refactor trigger provider classes to improve naming consistency, including renaming classes for subscription management
- Implement new TriggerSubscriptionBuilderService for creating and verifying subscription builders
- Update API endpoints to support subscription builder creation and verification
- Enhance data models to include new attributes for subscription builders
- Remove the deprecated TriggerSubscriptionValidationService to streamline the codebase

Co-authored-by: Claude <noreply@anthropic.com>
2025-09-03 14:53:27 +08:00
694197a701 refactor(trigger): clean up imports and optimize trigger-related code
- Remove unused imports in trigger-related files for better clarity and maintainability
- Streamline import statements across various modules to enhance code quality

Co-authored-by: Claude <noreply@anthropic.com>
2025-09-03 14:53:27 +08:00
2f08306695 feat(trigger): enhance trigger subscription management and processing
- Refactor trigger provider classes to improve naming consistency and clarity
- Introduce new methods for managing trigger subscriptions, including validation and dispatching
- Update API endpoints to reflect changes in subscription handling
- Implement logging and request management for endpoint interactions
- Enhance data models to support subscription attributes and lifecycle management

Co-authored-by: Claude <noreply@anthropic.com>
2025-09-03 14:53:27 +08:00
6acc77d86d feat(trigger): refactor trigger provider to subscription model
- Rename classes and methods to reflect the transition from credentials to subscriptions
- Update API endpoints for managing trigger subscriptions
- Modify data models and entities to support subscription attributes
- Enhance service methods for listing, adding, updating, and deleting subscriptions
- Adjust encryption utilities to handle subscription data

Co-authored-by: Claude <noreply@anthropic.com>
2025-09-03 14:53:27 +08:00
5ddd5e49ee feat(trigger): enhance subscription schema and provider configuration
- Update ProviderConfig to allow a list as a default value
- Introduce SubscriptionSchema for better organization of subscription-related configurations
- Modify TriggerProviderApiEntity to use Optional for subscription_schema
- Add custom_model_schema to TriggerProviderEntity for additional configuration options

Co-authored-by: Claude <noreply@anthropic.com>
2025-09-03 14:53:27 +08:00
72f9e77368 refactor(trigger): clean up and optimize trigger-related code
- Remove unused classes and imports in encryption utilities
- Simplify method signatures for better readability
- Enhance code quality by adding newlines for clarity
- Update tests to reflect changes in import paths

Co-authored-by: Claude <noreply@anthropic.com>
2025-09-03 14:53:26 +08:00
a46c9238fa feat(trigger): implement complete OAuth authorization flow for trigger providers
- Add OAuth authorization URL generation API endpoint
- Implement OAuth callback handler for credential storage
- Support both system-level and tenant-level OAuth clients
- Add trigger provider credential encryption utilities
- Refactor trigger entities into separate modules
- Update trigger provider service with OAuth client management
- Add credential cache for trigger providers

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-09-03 14:53:26 +08:00
87120ad4ac feat(trigger): add trigger provider management and webhook handling functionality 2025-09-03 14:53:26 +08:00
7544b5ec9a fix: delete var of webhook (#25038) 2025-09-03 14:49:56 +08:00
ff4a62d1e7 chore: limit webhook status code 200~399 (#25045) 2025-09-03 14:48:18 +08:00
41daa51988 fix: missing key for translation path (#25059) 2025-09-03 14:43:40 +08:00
d522350c99 fix(webhook-trigger): request array type adjustment (#25005) 2025-09-02 23:20:12 +08:00
1d1bb9451e fix: prevent workflow canvas clearing due to race condition and viewport errors (#25003) 2025-09-02 20:53:44 +08:00
1fce1a61d4 feat(workflow-log): enhance workflow logs UI with sorting and status filters (#24978) 2025-09-02 16:43:11 +08:00
883a6caf96 feat: add 'triggered by' to app log (#24973) 2025-09-02 16:04:08 +08:00
a239c39f09 fix: webhook http method should case insensitive (#24957) 2025-09-02 14:47:24 +08:00
e925a8ab99 fix(app-cards): restrict toggle enable to Start nodes only (#24918) 2025-09-01 22:52:23 +08:00
bccaf939e6 fix: migrations 2025-09-01 18:07:21 +08:00
676648e0b3 Merge branch 'main' into feat/trigger 2025-09-01 18:05:31 +08:00
4ae19e6dde fix(webhook-trigger): remove error handling (#24902) 2025-09-01 17:11:49 +08:00
4d0ff5c281 feat: implement variable synchronization for webhook node (#24874)
Co-authored-by: Claude <noreply@anthropic.com>
2025-09-01 16:58:06 +08:00
327b354cc2 refactor: unify trigger node architecture and clean up technical debt (#24886)
Co-authored-by: hjlarry <hjlarry@163.com>
Co-authored-by: Claude <noreply@anthropic.com>
2025-09-01 15:47:44 +08:00
6d307cc9fc Fix test run shortcut consistency and improve dropdown styling (#24849) 2025-09-01 14:47:21 +08:00
adc7134af5 fix: improve TimePicker footer layout and button styling (#24831) 2025-09-01 13:34:53 +08:00
10f19cd0c2 fix(webhook): add content-type aware parameter type handling (#24865) 2025-09-01 10:06:26 +08:00
9ed45594c6 fix: improve schedule trigger and quick settings app-operation btns ui (#24843) 2025-08-31 16:59:49 +08:00
c138f4c3a6 fix: check AppTrigger status before webhook execution (#24829)
Co-authored-by: Claude <noreply@anthropic.com>
2025-08-30 16:40:21 +08:00
a35be05790 Fix workflow card toggle logic and implement minimal state UI (#24822) 2025-08-30 16:35:34 +08:00
60b5ed8e5d fix: enhance webhook trigger panel UI consistency and user experience (#24780) 2025-08-29 17:41:42 +08:00
d8ddbc4d87 feat: enhance webhook trigger panel UI consistency and interactivity (#24759)
Co-authored-by: hjlarry <hjlarry@163.com>
2025-08-29 14:24:23 +08:00
19c0fc85e2 feat: call the API when adding/deleting a webhook trigger (#24755) 2025-08-29 14:23:50 +08:00
a58df35ead fix: improve trigger card layout spacing and remove dividers (#24756) 2025-08-29 13:37:44 +08:00
9789bd02d8 feat: implement trigger card component with auto-refresh (#24743) 2025-08-29 11:57:08 +08:00
d94e54923f Improve tooltip design for trigger blocks (#24724) 2025-08-28 23:18:00 +08:00
64c7be59b7 Improve workflow block selector search functionality (#24707) 2025-08-28 17:21:34 +08:00
89ad6ad902 feat: add app trigger list api (#24693) 2025-08-28 15:23:08 +08:00
4f73bc9693 fix(schedule): add time logic to weekly frequency mode for consistent behavior with daily mode (#24673) 2025-08-28 14:40:11 +08:00
add6b79231 UI enhancements for workflow checklist component (#24647) 2025-08-28 10:10:10 +08:00
c90dad566f feat: enhance workflow error handling and internationalization (#24648) 2025-08-28 09:41:22 +08:00
5cbe6bf8f8 fix(schedule): correct weekly frequency weekday calculation algorithm (#24641) 2025-08-27 18:20:09 +08:00
4ef6ff217e fix: improve code quality in webhook services and controllers (#24634)
Co-authored-by: Claude <noreply@anthropic.com>
2025-08-27 17:50:51 +08:00
87abfbf515 Allow empty workflows and improve workflow validation (#24627) 2025-08-27 17:49:09 +08:00
73e65fd838 feat: align trigger webhook style with schedule node and fix selection border truncation (#24635) 2025-08-27 17:47:14 +08:00
e53edb0fc2 refactor: optimize TenantDailyRateLimiter to use UTC internally with timezone-aware error messages (#24632)
Co-authored-by: Claude <noreply@anthropic.com>
2025-08-27 17:35:04 +08:00
17908fbf6b fix: only workflow should display start modal (#24623) 2025-08-27 16:20:31 +08:00
3dae108f84 refactor(sidebar): Restructure app operations with toggle functionality (#24625) 2025-08-27 16:20:17 +08:00
5bbf685035 feat: fix i18n missing keys and merge upstream/main (#24615)
Signed-off-by: -LAN- <laipz8200@outlook.com>
Signed-off-by: kenwoodjw <blackxin55+@gmail.com>
Signed-off-by: Yongtao Huang <yongtaoh2022@gmail.com>
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
Signed-off-by: zhanluxianshen <zhanluxianshen@163.com>
Co-authored-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: GuanMu <ballmanjq@gmail.com>
Co-authored-by: Davide Delbianco <davide.delbianco@outlook.com>
Co-authored-by: NeatGuyCoding <15627489+NeatGuyCoding@users.noreply.github.com>
Co-authored-by: kenwoodjw <blackxin55+@gmail.com>
Co-authored-by: Yongtao Huang <yongtaoh2022@gmail.com>
Co-authored-by: Yongtao Huang <99629139+hyongtao-db@users.noreply.github.com>
Co-authored-by: Qiang Lee <18018968632@163.com>
Co-authored-by: 李强04 <liqiang04@gaotu.cn>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Asuka Minato <i@asukaminato.eu.org>
Co-authored-by: Matri Qi <matrixdom@126.com>
Co-authored-by: huayaoyue6 <huayaoyue@163.com>
Co-authored-by: Bowen Liang <liangbowen@gf.com.cn>
Co-authored-by: znn <jubinkumarsoni@gmail.com>
Co-authored-by: crazywoola <427733928@qq.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: yihong <zouzou0208@gmail.com>
Co-authored-by: Muke Wang <shaodwaaron@gmail.com>
Co-authored-by: wangmuke <wangmuke@kingsware.cn>
Co-authored-by: Wu Tianwei <30284043+WTW0313@users.noreply.github.com>
Co-authored-by: quicksand <quicksandzn@gmail.com>
Co-authored-by: 非法操作 <hjlarry@163.com>
Co-authored-by: zxhlyh <jasonapring2015@outlook.com>
Co-authored-by: Eric Guo <eric.guocz@gmail.com>
Co-authored-by: Zhedong Cen <cenzhedong2@126.com>
Co-authored-by: jiangbo721 <jiangbo721@163.com>
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: hjlarry <25834719+hjlarry@users.noreply.github.com>
Co-authored-by: lxsummer <35754229+lxjustdoit@users.noreply.github.com>
Co-authored-by: 湛露先生 <zhanluxianshen@163.com>
Co-authored-by: Guangdong Liu <liugddx@gmail.com>
Co-authored-by: QuantumGhost <obelisk.reg+git@gmail.com>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Yessenia-d <yessenia.contact@gmail.com>
Co-authored-by: huangzhuo1949 <167434202+huangzhuo1949@users.noreply.github.com>
Co-authored-by: huangzhuo <huangzhuo1@xiaomi.com>
Co-authored-by: 17hz <0x149527@gmail.com>
Co-authored-by: Amy <1530140574@qq.com>
Co-authored-by: Joel <iamjoel007@gmail.com>
Co-authored-by: Nite Knite <nkCoding@gmail.com>
Co-authored-by: Yeuoly <45712896+Yeuoly@users.noreply.github.com>
Co-authored-by: Petrus Han <petrus.hanks@gmail.com>
Co-authored-by: iamjoel <2120155+iamjoel@users.noreply.github.com>
Co-authored-by: Kalo Chin <frog.beepers.0n@icloud.com>
Co-authored-by: Ujjwal Maurya <ujjwalsbx@gmail.com>
Co-authored-by: Maries <xh001x@hotmail.com>
2025-08-27 15:07:28 +08:00
a63d1e87b1 feat: webhook trigger backend api (#24387)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-08-27 14:42:45 +08:00
7129de98cd feat: implement workflow onboarding modal system (#24551) 2025-08-27 13:31:22 +08:00
2984dbc0df fix: can't open service API when workflow has no start node (#24564) 2025-08-26 18:06:11 +08:00
392db7f611 fix: can't save when workflow only has a trigger node (#24546) 2025-08-26 16:41:47 +08:00
5a427b8daa refactor: rename RunAllTriggers icon to TriggerAll for semantic clarity (#24478)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-08-25 17:51:04 +08:00
18f2e6f166 refactor: Use specific error types for workflow execution (#24475)
Co-authored-by: Claude <noreply@anthropic.com>
2025-08-25 16:19:12 +08:00
e78903302f feat(trigger-schedule): simplify timezone handling with user-centric approach (#24401) 2025-08-24 21:03:59 +08:00
4084ade86c refactor(trigger-webhook): remove redundant WebhookParam type and sim… (#24390) 2025-08-24 00:21:47 +08:00
6b0d919dbd feat: webhook trigger frontend (#24311) 2025-08-23 23:54:41 +08:00
a7b558b38b feat/trigger: support specifying root node (#24388)
Co-authored-by: Claude <noreply@anthropic.com>
2025-08-23 20:44:03 +08:00
6aed7e3ff4 feat/trigger universal entry (#24358)
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-08-23 20:18:08 +08:00
8e93a8a2e2 refactor: comprehensive schedule trigger component redesign (#24359)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: zhangxuhe1 <xuhezhang6@gmail.com>
2025-08-23 11:03:18 +08:00
e38a86e37b Merge branch 'main' into feat/trigger 2025-08-22 20:11:49 +08:00
392e3530bf feat: replace mock data with dynamic workflow options in test run dropdown (#24320) 2025-08-22 16:36:09 +08:00
833c902b2b feat(workflow): Plugin Trigger Node with Unified Entry Node System (#24205) 2025-08-20 23:49:10 +08:00
6eaea64b3f feat: implement multi-select monthly trigger schedule (#24247) 2025-08-20 06:23:30 -07:00
5303b50737 fix: initialize recur fields when switching to hourly frequency (#24181) 2025-08-20 09:32:05 +08:00
6acbcfe679 UI improvements: fix translation and custom icons for schedule trigger (#24167) 2025-08-19 18:27:07 +08:00
16ef5ebb97 fix: remove duplicate weekdays keys in i18n workflow files (#24157) 2025-08-19 14:55:16 +08:00
acfb95f9c2 Refactor Start node UI to User Input and optimize EntryNodeContainer (#24156) 2025-08-19 14:40:24 +08:00
aacea166d7 fix: resolve merge conflict between Features removal and validation enhancement (#24150) 2025-08-19 13:47:38 +08:00
f7bb3b852a feat: implement Schedule Trigger validation with multi-start node topology support (#24134) 2025-08-19 11:55:15 +08:00
d4ff1e031a Remove workflow features button (#24085) 2025-08-19 09:32:07 +08:00
6a3d135d49 fix: simplify trigger-schedule hourly mode calculation and improve UI consistency (#24082)
Co-authored-by: zhangxuhe1 <xuhezhang6@gmail.com>
2025-08-18 23:37:57 +08:00
5c4bf7aabd feat: Test Run dropdown with dynamic trigger selection (#24113) 2025-08-18 17:46:36 +08:00
e9c7dc7464 feat: update workflow run button to Test Run with keyboard shortcut (#24071) 2025-08-18 10:44:17 +08:00
74ad21b145 feat: comprehensive trigger node system with Schedule Trigger implementation (#24039)
Co-authored-by: zhangxuhe1 <xuhezhang6@gmail.com>
2025-08-18 09:23:16 +08:00
f214eeb7b1 feat: add scroll to selected node button in workflow header (#24030)
Co-authored-by: zhangxuhe1 <xuhezhang6@gmail.com>
2025-08-16 19:26:44 +08:00
ae25f90f34 Replace export button with more actions button in workflow control panel (#24033) 2025-08-16 19:25:18 +08:00
1509 changed files with 40793 additions and 29647 deletions

View File

@@ -1,5 +1,6 @@
#!/bin/bash
npm add -g pnpm@10.15.0
corepack enable
cd web && pnpm install
pipx install uv

View File

@@ -42,7 +42,11 @@ jobs:
- name: Run Unit tests
run: |
uv run --project api bash dev/pytest/pytest_unit_tests.sh
- name: Run ty check
run: |
cd api
uv add --dev ty
uv run ty check || true
- name: Run pyrefly check
run: |
cd api
@@ -62,6 +66,15 @@ jobs:
- name: Run dify config tests
run: uv run --project api dev/pytest/pytest_config_tests.py
- name: MyPy Cache
uses: actions/cache@v4
with:
path: api/.mypy_cache
key: mypy-${{ matrix.python-version }}-${{ runner.os }}-${{ hashFiles('api/uv.lock') }}
- name: Run MyPy Checks
run: dev/mypy-check
- name: Set up dotenvs
run: |
cp docker/.env.example docker/.env

View File

@ -2,6 +2,8 @@ name: autofix.ci
on:
pull_request:
branches: ["main"]
push:
branches: ["main"]
permissions:
contents: read
@@ -20,7 +22,7 @@ jobs:
cd api
uv sync --dev
# Fix lint errors
uv run ruff check --fix .
uv run ruff check --fix-only .
# Format code
uv run ruff format .
- name: ast-grep

View File

@@ -19,23 +19,11 @@ jobs:
github.event.workflow_run.head_branch == 'deploy/enterprise'
steps:
- name: trigger deployments
env:
DEV_ENV_ADDRS: ${{ vars.DEV_ENV_ADDRS }}
DEPLOY_SECRET: ${{ secrets.DEPLOY_SECRET }}
run: |
IFS=',' read -ra ENDPOINTS <<< "${DEV_ENV_ADDRS:-}"
BODY='{"project":"dify-api","tag":"deploy-enterprise"}'
for ENDPOINT in "${ENDPOINTS[@]}"; do
ENDPOINT="$(echo "$ENDPOINT" | xargs)"
[ -z "$ENDPOINT" ] && continue
API_SIGNATURE=$(printf '%s' "$BODY" | openssl dgst -sha256 -hmac "$DEPLOY_SECRET" | awk '{print "sha256="$2}')
curl -sSf -X POST \
-H "Content-Type: application/json" \
-H "X-Hub-Signature-256: $API_SIGNATURE" \
-d "$BODY" \
"$ENDPOINT"
done
- name: Deploy to server
uses: appleboy/ssh-action@v0.1.8
with:
host: ${{ secrets.ENTERPRISE_SSH_HOST }}
username: ${{ secrets.ENTERPRISE_SSH_USER }}
password: ${{ secrets.ENTERPRISE_SSH_PASSWORD }}
script: |
${{ vars.ENTERPRISE_SSH_SCRIPT || secrets.ENTERPRISE_SSH_SCRIPT }}

View File

@@ -44,14 +44,6 @@ jobs:
if: steps.changed-files.outputs.any_changed == 'true'
run: uv sync --project api --dev
- name: Run Basedpyright Checks
if: steps.changed-files.outputs.any_changed == 'true'
run: dev/basedpyright-check
- name: Run Mypy Type Checks
if: steps.changed-files.outputs.any_changed == 'true'
run: uv --directory api run mypy --exclude-gitignore --exclude 'tests/' --exclude 'migrations/' --check-untyped-defs --disable-error-code=import-untyped .
- name: Dotenv check
if: steps.changed-files.outputs.any_changed == 'true'
run: uv run --project api dotenv-linter ./api/.env.example ./web/.env.example

View File

@@ -67,22 +67,12 @@ jobs:
working-directory: ./web
run: pnpm run auto-gen-i18n ${{ env.FILE_ARGS }}
- name: Generate i18n type definitions
if: env.FILES_CHANGED == 'true'
working-directory: ./web
run: pnpm run gen:i18n-types
- name: Create Pull Request
if: env.FILES_CHANGED == 'true'
uses: peter-evans/create-pull-request@v6
with:
token: ${{ secrets.GITHUB_TOKEN }}
commit-message: Update i18n files and type definitions based on en-US changes
title: 'chore: translate i18n files and update type definitions'
body: |
This PR was automatically created to update i18n files and TypeScript type definitions based on changes in en-US locale.
**Changes included:**
- Updated translation files for all locales
- Regenerated TypeScript type definitions for type safety
commit-message: Update i18n files based on en-US changes
title: 'chore: translate i18n files'
body: This PR was automatically created to update i18n files based on changes in en-US locale.
branch: chore/automated-i18n-updates

View File

@@ -47,11 +47,6 @@ jobs:
working-directory: ./web
run: pnpm install --frozen-lockfile
- name: Check i18n types synchronization
if: steps.changed-files.outputs.any_changed == 'true'
working-directory: ./web
run: pnpm run check:i18n-types
- name: Run tests
if: steps.changed-files.outputs.any_changed == 'true'
working-directory: ./web

.gitignore (vendored) - 16 changed lines
View File

@@ -123,12 +123,10 @@ venv.bak/
# mkdocs documentation
/site
# type checking
# mypy
.mypy_cache/
.dmypy.json
dmypy.json
pyrightconfig.json
!api/pyrightconfig.json
# Pyre type checker
.pyre/
@@ -197,8 +195,8 @@ sdks/python-client/dify_client.egg-info
.vscode/*
!.vscode/launch.json.template
!.vscode/README.md
pyrightconfig.json
api/.vscode
web/.vscode
# vscode Code History Extension
.history
@@ -216,14 +214,10 @@ mise.toml
# Next.js build output
.next/
# PWA generated files
web/public/sw.js
web/public/sw.js.map
web/public/workbox-*.js
web/public/workbox-*.js.map
web/public/fallback-*.js
# AI Assistant
.roo/
api/.env.backup
/clickzetta
# mcp
.serena

View File

@@ -32,7 +32,7 @@ uv run --project api pytest tests/integration_tests/ # Integration tests
./dev/reformat # Run all formatters and linters
uv run --project api ruff check --fix ./ # Fix linting issues
uv run --project api ruff format ./ # Format code
uv run --directory api basedpyright # Type checking
uv run --project api mypy . # Type checking
```
### Frontend (Web)
@@ -59,6 +59,7 @@ pnpm test # Run Jest tests
- Use type hints for all functions and class attributes
- No `Any` types unless absolutely necessary
- Implement special methods (`__repr__`, `__str__`) appropriately
- **Logging**: Never use `str(e)` in `logger.exception()` calls. Use `logger.exception("message", exc_info=e)` instead
### TypeScript/JavaScript

View File

@@ -4,48 +4,6 @@ WEB_IMAGE=$(DOCKER_REGISTRY)/dify-web
API_IMAGE=$(DOCKER_REGISTRY)/dify-api
VERSION=latest
# Backend Development Environment Setup
.PHONY: dev-setup prepare-docker prepare-web prepare-api
# Default dev setup target
dev-setup: prepare-docker prepare-web prepare-api
@echo "✅ Backend development environment setup complete!"
# Step 1: Prepare Docker middleware
prepare-docker:
@echo "🐳 Setting up Docker middleware..."
@cp -n docker/middleware.env.example docker/middleware.env 2>/dev/null || echo "Docker middleware.env already exists"
@cd docker && docker compose -f docker-compose.middleware.yaml --env-file middleware.env -p dify-middlewares-dev up -d
@echo "✅ Docker middleware started"
# Step 2: Prepare web environment
prepare-web:
@echo "🌐 Setting up web environment..."
@cp -n web/.env.example web/.env 2>/dev/null || echo "Web .env already exists"
@cd web && pnpm install
@cd web && pnpm build
@echo "✅ Web environment prepared (not started)"
# Step 3: Prepare API environment
prepare-api:
@echo "🔧 Setting up API environment..."
@cp -n api/.env.example api/.env 2>/dev/null || echo "API .env already exists"
@cd api && uv sync --dev
@cd api && uv run flask db upgrade
@echo "✅ API environment prepared (not started)"
# Clean dev environment
dev-clean:
@echo "⚠️ Stopping Docker containers..."
@cd docker && docker compose -f docker-compose.middleware.yaml --env-file middleware.env -p dify-middlewares-dev down
@echo "🗑️ Removing volumes..."
@rm -rf docker/volumes/db
@rm -rf docker/volumes/redis
@rm -rf docker/volumes/plugin_daemon
@rm -rf docker/volumes/weaviate
@rm -rf api/storage
@echo "✅ Cleanup complete"
# Build Docker images
build-web:
@echo "Building web Docker image: $(WEB_IMAGE):$(VERSION)..."
@@ -81,21 +39,5 @@ build-push-web: build-web push-web
build-push-all: build-all push-all
@echo "All Docker images have been built and pushed."
# Help target
help:
@echo "Development Setup Targets:"
@echo " make dev-setup - Run all setup steps for backend dev environment"
@echo " make prepare-docker - Set up Docker middleware"
@echo " make prepare-web - Set up web environment"
@echo " make prepare-api - Set up API environment"
@echo " make dev-clean - Stop Docker middleware containers"
@echo ""
@echo "Docker Build Targets:"
@echo " make build-web - Build web Docker image"
@echo " make build-api - Build API Docker image"
@echo " make build-all - Build all Docker images"
@echo " make push-all - Push all Docker images"
@echo " make build-push-all - Build and push all Docker images"
# Phony targets
.PHONY: build-web build-api push-web push-api build-all push-all build-push-all dev-setup prepare-docker prepare-web prepare-api dev-clean help
.PHONY: build-web build-api push-web push-api build-all push-all build-push-all

View File

@@ -75,7 +75,6 @@ DB_PASSWORD=difyai123456
DB_HOST=localhost
DB_PORT=5432
DB_DATABASE=dify
SQLALCHEMY_POOL_PRE_PING=true
# Storage configuration
# use for store upload files, private keys...
@ -435,6 +434,9 @@ HTTP_REQUEST_NODE_MAX_BINARY_SIZE=10485760
HTTP_REQUEST_NODE_MAX_TEXT_SIZE=1048576
HTTP_REQUEST_NODE_SSL_VERIFY=True
# Webhook request configuration
WEBHOOK_REQUEST_BODY_MAX_SIZE=10485760
# Respect X-* headers to redirect clients
RESPECT_XFORWARD_HEADERS_ENABLED=false
@@ -503,6 +505,12 @@ ENABLE_CLEAN_MESSAGES=false
ENABLE_MAIL_CLEAN_DOCUMENT_NOTIFY_TASK=false
ENABLE_DATASETS_QUEUE_MONITOR=false
ENABLE_CHECK_UPGRADABLE_PLUGIN_TASK=true
ENABLE_WORKFLOW_SCHEDULE_POLLER_TASK=true
# Interval time in minutes for polling scheduled workflows(default: 1 min)
WORKFLOW_SCHEDULE_POLLER_INTERVAL=1
WORKFLOW_SCHEDULE_POLLER_BATCH_SIZE=100
# Maximum number of scheduled workflows to dispatch per tick (0 for unlimited)
WORKFLOW_SCHEDULE_MAX_DISPATCH_PER_TICK=0
# Position configuration
POSITION_TOOL_PINS=
@@ -569,7 +577,3 @@ QUEUE_MONITOR_INTERVAL=30
# Swagger UI configuration
SWAGGER_UI_ENABLED=true
SWAGGER_UI_PATH=/swagger-ui.html
# Whether to encrypt dataset IDs when exporting DSL files (default: true)
# Set to false to export dataset IDs as plain text for easier cross-environment import
DSL_EXPORT_ENCRYPT_DATASET_ID=true

View File

@@ -45,7 +45,6 @@ select = [
"G001", # don't use str format to logging messages
"G003", # don't use + in logging messages
"G004", # don't use f-strings to format logging messages
"UP042", # use StrEnum
]
ignore = [

View File

@@ -54,7 +54,7 @@
"--loglevel",
"DEBUG",
"-Q",
"dataset,generation,mail,ops_trace,app_deletion"
"dataset,generation,mail,ops_trace,app_deletion,workflow"
]
}
]

View File

@ -108,5 +108,5 @@ uv run celery -A app.celery beat
../dev/reformat # Run all formatters and linters
uv run ruff check --fix ./ # Fix linting issues
uv run ruff format ./ # Format code
uv run basedpyright . # Type checking
uv run mypy . # Type checking
```

View File

@ -25,9 +25,6 @@ def create_flask_app_with_configs() -> DifyApp:
# add a unique identifier to each request
RecyclableContextVar.increment_thread_recycles()
# Capture the decorator's return value to avoid pyright reportUnusedFunction
_ = before_request
return dify_app

api/child_class.py Normal file
View File

@ -0,0 +1,11 @@
from tests.integration_tests.utils.parent_class import ParentClass
class ChildClass(ParentClass):
"""Test child class for module import helper tests"""
def __init__(self, name):
super().__init__(name)
def get_name(self):
return f"Child: {self.name}"

View File

@ -212,9 +212,7 @@ def migrate_annotation_vector_database():
if not dataset_collection_binding:
click.echo(f"App annotation collection binding not found: {app.id}")
continue
annotations = db.session.scalars(
select(MessageAnnotation).where(MessageAnnotation.app_id == app.id)
).all()
annotations = db.session.query(MessageAnnotation).where(MessageAnnotation.app_id == app.id).all()
dataset = Dataset(
id=app.id,
tenant_id=app.tenant_id,
@ -369,25 +367,29 @@ def migrate_knowledge_vector_database():
)
raise e
dataset_documents = db.session.scalars(
select(DatasetDocument).where(
dataset_documents = (
db.session.query(DatasetDocument)
.where(
DatasetDocument.dataset_id == dataset.id,
DatasetDocument.indexing_status == "completed",
DatasetDocument.enabled == True,
DatasetDocument.archived == False,
)
).all()
.all()
)
documents = []
segments_count = 0
for dataset_document in dataset_documents:
segments = db.session.scalars(
select(DocumentSegment).where(
segments = (
db.session.query(DocumentSegment)
.where(
DocumentSegment.document_id == dataset_document.id,
DocumentSegment.status == "completed",
DocumentSegment.enabled == True,
)
).all()
.all()
)
for segment in segments:
document = Document(
@ -509,7 +511,7 @@ def add_qdrant_index(field: str):
from qdrant_client.http.exceptions import UnexpectedResponse
from qdrant_client.http.models import PayloadSchemaType
from core.rag.datasource.vdb.qdrant.qdrant_vector import PathQdrantParams, QdrantConfig
from core.rag.datasource.vdb.qdrant.qdrant_vector import QdrantConfig
for binding in bindings:
if dify_config.QDRANT_URL is None:
@ -523,21 +525,7 @@ def add_qdrant_index(field: str):
prefer_grpc=dify_config.QDRANT_GRPC_ENABLED,
)
try:
params = qdrant_config.to_qdrant_params()
# Check the type before using
if isinstance(params, PathQdrantParams):
# PathQdrantParams case
client = qdrant_client.QdrantClient(path=params.path)
else:
# UrlQdrantParams case - params is UrlQdrantParams
client = qdrant_client.QdrantClient(
url=params.url,
api_key=params.api_key,
timeout=int(params.timeout),
verify=params.verify,
grpc_port=params.grpc_port,
prefer_grpc=params.prefer_grpc,
)
client = qdrant_client.QdrantClient(**qdrant_config.to_qdrant_params())
# create payload index
client.create_payload_index(binding.collection_name, field, field_schema=PayloadSchemaType.KEYWORD)
create_count += 1
@ -583,7 +571,7 @@ def old_metadata_migration():
for document in documents:
if document.doc_metadata:
doc_metadata = document.doc_metadata
for key in doc_metadata:
for key, value in doc_metadata.items():
for field in BuiltInField:
if field.value == key:
break
@ -1219,6 +1207,55 @@ def setup_system_tool_oauth_client(provider, client_params):
click.echo(click.style(f"OAuth client params setup successfully. id: {oauth_client.id}", fg="green"))
@click.command("setup-system-trigger-oauth-client", help="Setup system trigger oauth client.")
@click.option("--provider", prompt=True, help="Provider name")
@click.option("--client-params", prompt=True, help="Client Params")
def setup_system_trigger_oauth_client(provider, client_params):
"""
Setup system trigger oauth client
"""
from core.plugin.entities.plugin import TriggerProviderID
from models.trigger import TriggerOAuthSystemClient
provider_id = TriggerProviderID(provider)
provider_name = provider_id.provider_name
plugin_id = provider_id.plugin_id
try:
# json validate
click.echo(click.style(f"Validating client params: {client_params}", fg="yellow"))
client_params_dict = TypeAdapter(dict[str, Any]).validate_json(client_params)
click.echo(click.style("Client params validated successfully.", fg="green"))
click.echo(click.style(f"Encrypting client params: {client_params}", fg="yellow"))
click.echo(click.style(f"Using SECRET_KEY: `{dify_config.SECRET_KEY}`", fg="yellow"))
oauth_client_params = encrypt_system_oauth_params(client_params_dict)
click.echo(click.style("Client params encrypted successfully.", fg="green"))
except Exception as e:
click.echo(click.style(f"Error parsing client params: {str(e)}", fg="red"))
return
deleted_count = (
db.session.query(TriggerOAuthSystemClient)
.filter_by(
provider=provider_name,
plugin_id=plugin_id,
)
.delete()
)
if deleted_count > 0:
click.echo(click.style(f"Deleted {deleted_count} existing oauth client params.", fg="yellow"))
oauth_client = TriggerOAuthSystemClient(
provider=provider_name,
plugin_id=plugin_id,
encrypted_oauth_params=oauth_client_params,
)
db.session.add(oauth_client)
db.session.commit()
click.echo(click.style(f"OAuth client params setup successfully. id: {oauth_client.id}", fg="green"))
def _find_orphaned_draft_variables(batch_size: int = 1000) -> list[str]:
"""
Find draft variables that reference non-existent apps.

View File

@ -147,6 +147,17 @@ class CodeExecutionSandboxConfig(BaseSettings):
)
class TriggerConfig(BaseSettings):
"""
Configuration for trigger
"""
WEBHOOK_REQUEST_BODY_MAX_SIZE: PositiveInt = Field(
description="Maximum allowed size for webhook request bodies in bytes",
default=10485760,
)
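WEBHOOK_REQUEST_BODY_MAX_SIZE caps webhook payloads at 10 MiB by default. The sketch below shows one way an endpoint might enforce such a limit; it is illustrative only and uses plain Flask rather than the actual webhook controller, which is not part of this hunk.
```
# Illustrative enforcement of a webhook body-size limit (not the real controller).
from flask import Flask, abort, request

app = Flask(__name__)
WEBHOOK_REQUEST_BODY_MAX_SIZE = 10 * 1024 * 1024  # mirrors the 10485760 default above

@app.post("/webhook-demo")
def webhook_demo():
    # Reject early when the declared length already exceeds the limit.
    if request.content_length and request.content_length > WEBHOOK_REQUEST_BODY_MAX_SIZE:
        abort(413)  # Payload Too Large
    body = request.get_data(cache=False)
    if len(body) > WEBHOOK_REQUEST_BODY_MAX_SIZE:  # guard against a missing Content-Length
        abort(413)
    return {"received_bytes": len(body)}
```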
class PluginConfig(BaseSettings):
"""
Plugin configs
@ -796,11 +807,6 @@ class DataSetConfig(BaseSettings):
default=30,
)
DSL_EXPORT_ENCRYPT_DATASET_ID: bool = Field(
description="Enable or disable dataset ID encryption when exporting DSL files",
default=True,
)
class WorkspaceConfig(BaseSettings):
"""
@ -876,6 +882,22 @@ class CeleryScheduleTasksConfig(BaseSettings):
description="Enable check upgradable plugin task",
default=True,
)
ENABLE_WORKFLOW_SCHEDULE_POLLER_TASK: bool = Field(
description="Enable workflow schedule poller task",
default=True,
)
WORKFLOW_SCHEDULE_POLLER_INTERVAL: int = Field(
description="Workflow schedule poller interval in minutes",
default=1,
)
WORKFLOW_SCHEDULE_POLLER_BATCH_SIZE: int = Field(
description="Maximum number of schedules to process in each poll batch",
default=100,
)
WORKFLOW_SCHEDULE_MAX_DISPATCH_PER_TICK: int = Field(
description="Maximum schedules to dispatch per tick (0=unlimited, circuit breaker)",
default=0,
)
class PositionConfig(BaseSettings):
@ -999,6 +1021,7 @@ class FeatureConfig(
AuthConfig, # Changed from OAuthConfig to AuthConfig
BillingConfig,
CodeExecutionSandboxConfig,
TriggerConfig,
PluginConfig,
MarketplaceConfig,
DataSetConfig,

View File

@ -300,7 +300,8 @@ class DatasetQueueMonitorConfig(BaseSettings):
class MiddlewareConfig(
# place the configs in alphabet order
CeleryConfig, # Note: CeleryConfig already inherits from DatabaseConfig
CeleryConfig,
DatabaseConfig,
KeywordStoreConfig,
RedisConfig,
# configs of storage and storage providers

View File

@ -1,10 +1,9 @@
from typing import Optional
from pydantic import Field
from pydantic_settings import BaseSettings
from pydantic import BaseModel, Field
class ClickzettaConfig(BaseSettings):
class ClickzettaConfig(BaseModel):
"""
Clickzetta Lakehouse vector database configuration
"""

View File

@ -1,8 +1,7 @@
from pydantic import Field
from pydantic_settings import BaseSettings
from pydantic import BaseModel, Field
class MatrixoneConfig(BaseSettings):
class MatrixoneConfig(BaseModel):
"""Matrixone vector database configuration."""
MATRIXONE_HOST: str = Field(default="localhost", description="Host address of the Matrixone server")

View File

@ -1,6 +1,6 @@
from pydantic import Field
from configs.packaging.pyproject import PyProjectTomlConfig
from configs.packaging.pyproject import PyProjectConfig, PyProjectTomlConfig
class PackagingInfo(PyProjectTomlConfig):

View File

@ -4,9 +4,8 @@ import logging
import os
import threading
import time
from collections.abc import Callable, Mapping
from collections.abc import Mapping
from pathlib import Path
from typing import Any
from .python_3x import http_request, makedirs_wrapper
from .utils import (
@ -26,13 +25,13 @@ logger = logging.getLogger(__name__)
class ApolloClient:
def __init__(
self,
config_url: str,
app_id: str,
cluster: str = "default",
secret: str = "",
start_hot_update: bool = True,
change_listener: Callable[[str, str, str, Any], None] | None = None,
_notification_map: dict[str, int] | None = None,
config_url,
app_id,
cluster="default",
secret="",
start_hot_update=True,
change_listener=None,
_notification_map=None,
):
# Core routing parameters
self.config_url = config_url
@ -48,17 +47,17 @@ class ApolloClient:
# Private control variables
self._cycle_time = 5
self._stopping = False
self._cache: dict[str, dict[str, Any]] = {}
self._no_key: dict[str, str] = {}
self._hash: dict[str, str] = {}
self._cache = {}
self._no_key = {}
self._hash = {}
self._pull_timeout = 75
self._cache_file_path = os.path.expanduser("~") + "/.dify/config/remote-settings/apollo/cache/"
self._long_poll_thread: threading.Thread | None = None
self._long_poll_thread = None
self._change_listener = change_listener # "add" "delete" "update"
if _notification_map is None:
_notification_map = {"application": -1}
self._notification_map = _notification_map
self.last_release_key: str | None = None
self.last_release_key = None
# Private startup method
self._path_checker()
if start_hot_update:
@ -69,7 +68,7 @@ class ApolloClient:
heartbeat.daemon = True
heartbeat.start()
def get_json_from_net(self, namespace: str = "application") -> dict[str, Any] | None:
def get_json_from_net(self, namespace="application"):
url = "{}/configs/{}/{}/{}?releaseKey={}&ip={}".format(
self.config_url, self.app_id, self.cluster, namespace, "", self.ip
)
@ -89,7 +88,7 @@ class ApolloClient:
logger.exception("an error occurred in get_json_from_net")
return None
def get_value(self, key: str, default_val: Any = None, namespace: str = "application") -> Any:
def get_value(self, key, default_val=None, namespace="application"):
try:
# read memory configuration
namespace_cache = self._cache.get(namespace)
@ -105,8 +104,7 @@ class ApolloClient:
namespace_data = self.get_json_from_net(namespace)
val = get_value_from_dict(namespace_data, key)
if val is not None:
if namespace_data is not None:
self._update_cache_and_file(namespace_data, namespace)
self._update_cache_and_file(namespace_data, namespace)
return val
# read the file configuration
@ -128,23 +126,23 @@ class ApolloClient:
# to ensure the real-time correctness of the function call.
# If the caller does not pass the same default value for a key on every call,
# the default value cached here may be wrong.
def _set_local_cache_none(self, namespace: str, key: str) -> None:
def _set_local_cache_none(self, namespace, key):
no_key = no_key_cache_key(namespace, key)
self._no_key[no_key] = key
def _start_hot_update(self) -> None:
def _start_hot_update(self):
self._long_poll_thread = threading.Thread(target=self._listener)
# Run the long-poll thread as a daemon so it exits automatically
# when the main thread terminates.
self._long_poll_thread.daemon = True
self._long_poll_thread.start()
def stop(self) -> None:
def stop(self):
self._stopping = True
logger.info("Stopping listener...")
# Invoke the registered change listener; any exception it raises is logged as a warning.
def _call_listener(self, namespace: str, old_kv: dict[str, Any] | None, new_kv: dict[str, Any] | None) -> None:
def _call_listener(self, namespace, old_kv, new_kv):
if self._change_listener is None:
return
if old_kv is None:
@ -170,12 +168,12 @@ class ApolloClient:
except BaseException as e:
logger.warning(str(e))
def _path_checker(self) -> None:
def _path_checker(self):
if not os.path.isdir(self._cache_file_path):
makedirs_wrapper(self._cache_file_path)
# update the local cache and file cache
def _update_cache_and_file(self, namespace_data: dict[str, Any], namespace: str = "application") -> None:
def _update_cache_and_file(self, namespace_data, namespace="application"):
# update the local cache
self._cache[namespace] = namespace_data
# update the file cache
@ -189,7 +187,7 @@ class ApolloClient:
self._hash[namespace] = new_hash
# get the configuration from the local file
def _get_local_cache(self, namespace: str = "application") -> dict[str, Any]:
def _get_local_cache(self, namespace="application"):
cache_file_path = os.path.join(self._cache_file_path, f"{self.app_id}_configuration_{namespace}.txt")
if os.path.isfile(cache_file_path):
with open(cache_file_path) as f:
@ -197,8 +195,8 @@ class ApolloClient:
return result
return {}
def _long_poll(self) -> None:
notifications: list[dict[str, Any]] = []
def _long_poll(self):
notifications = []
for key in self._cache:
namespace_data = self._cache[key]
notification_id = -1
@ -238,7 +236,7 @@ class ApolloClient:
except Exception as e:
logger.warning(str(e))
def _get_net_and_set_local(self, namespace: str, n_id: int, call_change: bool = False) -> None:
def _get_net_and_set_local(self, namespace, n_id, call_change=False):
namespace_data = self.get_json_from_net(namespace)
if not namespace_data:
return
@ -250,7 +248,7 @@ class ApolloClient:
new_kv = namespace_data.get(CONFIGURATIONS)
self._call_listener(namespace, old_kv, new_kv)
def _listener(self) -> None:
def _listener(self):
logger.info("start long_poll")
while not self._stopping:
self._long_poll()
@ -268,13 +266,13 @@ class ApolloClient:
headers["Timestamp"] = time_unix_now
return headers
def _heart_beat(self) -> None:
def _heart_beat(self):
while not self._stopping:
for namespace in self._notification_map:
self._do_heart_beat(namespace)
time.sleep(60 * 10) # 10 minutes
def _do_heart_beat(self, namespace: str) -> None:
def _do_heart_beat(self, namespace):
url = f"{self.config_url}/configs/{self.app_id}/{self.cluster}/{namespace}?ip={self.ip}"
try:
code, body = http_request(url, timeout=3, headers=self._sign_headers(url))
@ -294,7 +292,7 @@ class ApolloClient:
logger.exception("an error occurred in _do_heart_beat")
return None
def get_all_dicts(self, namespace: str) -> dict[str, Any] | None:
def get_all_dicts(self, namespace):
namespace_data = self._cache.get(namespace)
if namespace_data is None:
net_namespace_data = self.get_json_from_net(namespace)

View File
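Based on the constructor and get_value signatures visible in the ApolloClient diff above, a one-off read might look like the sketch below. The import path, server URL, app id, and key are assumptions/placeholders.
```
# Usage sketch for ApolloClient; import path and all values are assumptions.
from configs.remote_settings_sources.apollo.client import ApolloClient  # assumed module path

client = ApolloClient(
    config_url="http://apollo.example.com:8080",
    app_id="dify-api",
    cluster="default",
    start_hot_update=False,  # skip the long-poll thread for a one-off read
)
db_host = client.get_value("DB_HOST", default_val="localhost", namespace="application")
print(db_host)
client.stop()
```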

@ -2,8 +2,6 @@ import logging
import os
import ssl
import urllib.request
from collections.abc import Mapping
from typing import Any
from urllib import parse
from urllib.error import HTTPError
@ -21,9 +19,9 @@ urllib.request.install_opener(opener)
logger = logging.getLogger(__name__)
def http_request(url: str, timeout: int | float, headers: Mapping[str, str] = {}) -> tuple[int, str | None]:
def http_request(url, timeout, headers={}):
try:
request = urllib.request.Request(url, headers=dict(headers))
request = urllib.request.Request(url, headers=headers)
res = urllib.request.urlopen(request, timeout=timeout)
body = res.read().decode("utf-8")
return res.code, body
@ -35,9 +33,9 @@ def http_request(url: str, timeout: int | float, headers: Mapping[str, str] = {}
raise e
def url_encode(params: dict[str, Any]) -> str:
def url_encode(params):
return parse.urlencode(params)
def makedirs_wrapper(path: str) -> None:
def makedirs_wrapper(path):
os.makedirs(path, exist_ok=True)

View File

@ -1,6 +1,5 @@
import hashlib
import socket
from typing import Any
from .python_3x import url_encode
@ -11,7 +10,7 @@ NAMESPACE_NAME = "namespaceName"
# add timestamps uris and keys
def signature(timestamp: str, uri: str, secret: str) -> str:
def signature(timestamp, uri, secret):
import base64
import hmac
@ -20,16 +19,16 @@ def signature(timestamp: str, uri: str, secret: str) -> str:
return base64.b64encode(hmac_code).decode()
def url_encode_wrapper(params: dict[str, Any]) -> str:
def url_encode_wrapper(params):
return url_encode(params)
def no_key_cache_key(namespace: str, key: str) -> str:
def no_key_cache_key(namespace, key):
return f"{namespace}{len(namespace)}{key}"
# Returns the value for the given key if it exists, or None if it does not
def get_value_from_dict(namespace_cache: dict[str, Any] | None, key: str) -> Any | None:
def get_value_from_dict(namespace_cache, key):
if namespace_cache:
kv_data = namespace_cache.get(CONFIGURATIONS)
if kv_data is None:
@ -39,7 +38,7 @@ def get_value_from_dict(namespace_cache: dict[str, Any] | None, key: str) -> Any
return None
def init_ip() -> str:
def init_ip():
ip = ""
s = None
try:

View File

@ -11,5 +11,5 @@ class RemoteSettingsSource:
def get_field_value(self, field: FieldInfo, field_name: str) -> tuple[Any, str, bool]:
raise NotImplementedError
def prepare_field_value(self, field_name: str, field: FieldInfo, value: Any, value_is_complex: bool):
def prepare_field_value(self, field_name: str, field: FieldInfo, value: Any, value_is_complex: bool) -> Any:
return value

View File

@ -11,16 +11,16 @@ logger = logging.getLogger(__name__)
from configs.remote_settings_sources.base import RemoteSettingsSource
from .utils import parse_config
from .utils import _parse_config
class NacosSettingsSource(RemoteSettingsSource):
def __init__(self, configs: Mapping[str, Any]):
self.configs = configs
self.remote_configs: dict[str, str] = {}
self.remote_configs: dict[str, Any] = {}
self.async_init()
def async_init(self) -> None:
def async_init(self):
data_id = os.getenv("DIFY_ENV_NACOS_DATA_ID", "dify-api-env.properties")
group = os.getenv("DIFY_ENV_NACOS_GROUP", "nacos-dify")
tenant = os.getenv("DIFY_ENV_NACOS_NAMESPACE", "")
@ -29,19 +29,22 @@ class NacosSettingsSource(RemoteSettingsSource):
try:
content = NacosHttpClient().http_request("/nacos/v1/cs/configs", method="GET", headers={}, params=params)
self.remote_configs = self._parse_config(content)
except Exception:
except Exception as e:
logger.exception("[get-access-token] exception occurred")
raise
def _parse_config(self, content: str) -> dict[str, str]:
def _parse_config(self, content: str) -> dict:
if not content:
return {}
try:
return parse_config(content)
return _parse_config(self, content)
except Exception as e:
raise RuntimeError(f"Failed to parse config: {e}")
def get_field_value(self, field: FieldInfo, field_name: str) -> tuple[Any, str, bool]:
if not isinstance(self.remote_configs, dict):
raise ValueError(f"remote configs is not dict, but {type(self.remote_configs)}")
field_value = self.remote_configs.get(field_name)
if field_value is None:
return None, field_name, False

View File

@ -17,26 +17,20 @@ class NacosHttpClient:
self.ak = os.getenv("DIFY_ENV_NACOS_ACCESS_KEY")
self.sk = os.getenv("DIFY_ENV_NACOS_SECRET_KEY")
self.server = os.getenv("DIFY_ENV_NACOS_SERVER_ADDR", "localhost:8848")
self.token: str | None = None
self.token = None
self.token_ttl = 18000
self.token_expire_time: float = 0
def http_request(
self, url: str, method: str = "GET", headers: dict[str, str] | None = None, params: dict[str, str] | None = None
) -> str:
if headers is None:
headers = {}
if params is None:
params = {}
def http_request(self, url, method="GET", headers=None, params=None):
try:
self._inject_auth_info(headers, params)
response = requests.request(method, url="http://" + self.server + url, headers=headers, params=params)
response.raise_for_status()
return response.text
except requests.RequestException as e:
except requests.exceptions.RequestException as e:
return f"Request to Nacos failed: {e}"
def _inject_auth_info(self, headers: dict[str, str], params: dict[str, str], module: str = "config") -> None:
def _inject_auth_info(self, headers, params, module="config"):
headers.update({"User-Agent": "Nacos-Http-Client-In-Dify:v0.0.1"})
if module == "login":
@ -51,17 +45,16 @@ class NacosHttpClient:
headers["timeStamp"] = ts
if self.username and self.password:
self.get_access_token(force_refresh=False)
if self.token is not None:
params["accessToken"] = self.token
params["accessToken"] = self.token
def __do_sign(self, sign_str: str, sk: str) -> str:
def __do_sign(self, sign_str, sk):
return (
base64.encodebytes(hmac.new(sk.encode(), sign_str.encode(), digestmod=hashlib.sha1).digest())
.decode()
.strip()
)
def get_sign_str(self, group: str, tenant: str, ts: str) -> str:
def get_sign_str(self, group, tenant, ts):
sign_str = ""
if tenant:
sign_str = tenant + "+"
@ -70,7 +63,7 @@ class NacosHttpClient:
sign_str += ts  # Concatenate ts unconditionally, because the Nacos auth header requires it.
return sign_str
def get_access_token(self, force_refresh: bool = False) -> str | None:
def get_access_token(self, force_refresh=False):
current_time = time.time()
if self.token and not force_refresh and self.token_expire_time > current_time:
return self.token
@ -84,7 +77,6 @@ class NacosHttpClient:
self.token = response_data.get("accessToken")
self.token_ttl = response_data.get("tokenTtl", 18000)
self.token_expire_time = current_time + self.token_ttl - 10
return self.token
except Exception:
except Exception as e:
logger.exception("[get-access-token] exception occur")
raise

View File
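NacosHttpClient signs requests by HMAC-SHA1-ing a sign string and base64-encoding the digest, per __do_sign above (get_sign_str assembles the tenant/group/timestamp string). The snippet below recreates the digest step with placeholder inputs.
```
# Recreates the __do_sign computation with placeholder tenant/group/secret values.
import base64
import hashlib
import hmac

def do_sign(sign_str: str, sk: str) -> str:
    digest = hmac.new(sk.encode(), sign_str.encode(), digestmod=hashlib.sha1).digest()
    return base64.encodebytes(digest).decode().strip()

print(do_sign("my-namespace+nacos-dify+1700000000000", "secret-key"))
```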

@ -1,4 +1,4 @@
def parse_config(content: str) -> dict[str, str]:
def _parse_config(self, content: str) -> dict[str, str]:
config: dict[str, str] = {}
if not content:
return config

View File
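Only the empty-content guard of parse_config is visible in this hunk. Assuming the Nacos payload is a Java-style .properties document (the data id defaults to dify-api-env.properties earlier in this diff), a minimal parser could look like the sketch below; the real implementation may differ.
```
# Hypothetical .properties-style parser; the actual parse_config body is not shown here.
def parse_config_sketch(content: str) -> dict[str, str]:
    config: dict[str, str] = {}
    if not content:
        return config
    for line in content.splitlines():
        line = line.strip()
        if not line or line.startswith(("#", "!")):  # .properties comment markers
            continue
        key, sep, value = line.partition("=")
        if sep:
            config[key.strip()] = value.strip()
    return config

print(parse_config_sketch("DB_HOST=localhost\n# comment\nDB_PORT=5432"))
```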

@ -16,14 +16,14 @@ AUDIO_EXTENSIONS = ["mp3", "m4a", "wav", "amr", "mpga"]
AUDIO_EXTENSIONS.extend([ext.upper() for ext in AUDIO_EXTENSIONS])
_doc_extensions: list[str]
if dify_config.ETL_TYPE == "Unstructured":
_doc_extensions = ["txt", "markdown", "md", "mdx", "pdf", "html", "htm", "xlsx", "xls", "vtt", "properties"]
_doc_extensions.extend(("doc", "docx", "csv", "eml", "msg", "pptx", "xml", "epub"))
DOCUMENT_EXTENSIONS = ["txt", "markdown", "md", "mdx", "pdf", "html", "htm", "xlsx", "xls", "vtt", "properties"]
DOCUMENT_EXTENSIONS.extend(("doc", "docx", "csv", "eml", "msg", "pptx", "xml", "epub"))
if dify_config.UNSTRUCTURED_API_URL:
_doc_extensions.append("ppt")
DOCUMENT_EXTENSIONS.append("ppt")
DOCUMENT_EXTENSIONS.extend([ext.upper() for ext in DOCUMENT_EXTENSIONS])
else:
_doc_extensions = [
DOCUMENT_EXTENSIONS = [
"txt",
"markdown",
"md",
@ -38,4 +38,4 @@ else:
"vtt",
"properties",
]
DOCUMENT_EXTENSIONS = _doc_extensions + [ext.upper() for ext in _doc_extensions]
DOCUMENT_EXTENSIONS.extend([ext.upper() for ext in DOCUMENT_EXTENSIONS])
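The rewritten branch extends DOCUMENT_EXTENSIONS with upper-cased copies of itself. That is safe because the list comprehension is fully evaluated before extend mutates the list, as the small demo below shows.
```
# Demo: the comprehension is built first, so extending a list with a transformed
# copy of itself terminates and simply appends the upper-cased items.
exts = ["txt", "pdf", "md"]
exts.extend([ext.upper() for ext in exts])
print(exts)  # ['txt', 'pdf', 'md', 'TXT', 'PDF', 'MD']
```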

View File

@ -19,7 +19,6 @@ language_timezone_mapping = {
"fa-IR": "Asia/Tehran",
"sl-SI": "Europe/Ljubljana",
"th-TH": "Asia/Bangkok",
"id-ID": "Asia/Jakarta",
}
languages = list(language_timezone_mapping.keys())

View File

@ -8,6 +8,8 @@ if TYPE_CHECKING:
from core.model_runtime.entities.model_entities import AIModelEntity
from core.plugin.entities.plugin_daemon import PluginModelProviderEntity
from core.tools.plugin_tool.provider import PluginToolProviderController
from core.trigger.provider import PluginTriggerProviderController
from core.workflow.entities.variable_pool import VariablePool
"""
@ -32,3 +34,11 @@ plugin_model_schema_lock: RecyclableContextVar[Lock] = RecyclableContextVar(Cont
plugin_model_schemas: RecyclableContextVar[dict[str, "AIModelEntity"]] = RecyclableContextVar(
ContextVar("plugin_model_schemas")
)
plugin_trigger_providers: RecyclableContextVar[dict[str, "PluginTriggerProviderController"]] = RecyclableContextVar(
ContextVar("plugin_trigger_providers")
)
plugin_trigger_providers_lock: RecyclableContextVar[Lock] = RecyclableContextVar(
ContextVar("plugin_trigger_providers_lock")
)
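The new plugin_trigger_providers entries follow the existing pattern: a RecyclableContextVar wrapping a standard ContextVar so each execution context sees its own provider map. As a rough illustration of that isolation with plain contextvars (RecyclableContextVar's own API is not shown in this hunk):
```
# Plain-ContextVar illustration of per-context isolation; RecyclableContextVar
# wraps a ContextVar as declared above, but its exact API is not shown here.
from contextvars import ContextVar, copy_context

providers: ContextVar[dict[str, str]] = ContextVar("plugin_trigger_providers")

def handle_request(name: str) -> str:
    providers.set({"provider": name})  # visible only within the current context
    return providers.get()["provider"]

print(handle_request("github"))
print(copy_context().run(handle_request, "slack"))  # separate context, separate value
```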

View File

@ -1,5 +1,4 @@
from flask import Blueprint
from flask_restx import Namespace
from libs.external_api import ExternalApi
@ -27,16 +26,7 @@ from .files import FileApi, FilePreviewApi, FileSupportTypeApi
from .remote_files import RemoteFileInfoApi, RemoteFileUploadApi
bp = Blueprint("console", __name__, url_prefix="/console/api")
api = ExternalApi(
bp,
version="1.0",
title="Console API",
description="Console management APIs for app configuration, monitoring, and administration",
)
# Create namespace
console_ns = Namespace("console", description="Console management API operations", path="/")
api = ExternalApi(bp)
# File
api.add_resource(FileApi, "/files/upload")
@ -53,90 +43,57 @@ api.add_resource(AppImportConfirmApi, "/apps/imports/<string:import_id>/confirm"
api.add_resource(AppImportCheckDependenciesApi, "/apps/imports/<string:app_id>/check-dependencies")
# Import other controllers
from . import (
admin, # pyright: ignore[reportUnusedImport]
apikey, # pyright: ignore[reportUnusedImport]
extension, # pyright: ignore[reportUnusedImport]
feature, # pyright: ignore[reportUnusedImport]
init_validate, # pyright: ignore[reportUnusedImport]
ping, # pyright: ignore[reportUnusedImport]
setup, # pyright: ignore[reportUnusedImport]
version, # pyright: ignore[reportUnusedImport]
)
from . import admin, apikey, extension, feature, ping, setup, version
# Import app controllers
from .app import (
advanced_prompt_template, # pyright: ignore[reportUnusedImport]
agent, # pyright: ignore[reportUnusedImport]
annotation, # pyright: ignore[reportUnusedImport]
app, # pyright: ignore[reportUnusedImport]
audio, # pyright: ignore[reportUnusedImport]
completion, # pyright: ignore[reportUnusedImport]
conversation, # pyright: ignore[reportUnusedImport]
conversation_variables, # pyright: ignore[reportUnusedImport]
generator, # pyright: ignore[reportUnusedImport]
mcp_server, # pyright: ignore[reportUnusedImport]
message, # pyright: ignore[reportUnusedImport]
model_config, # pyright: ignore[reportUnusedImport]
ops_trace, # pyright: ignore[reportUnusedImport]
site, # pyright: ignore[reportUnusedImport]
statistic, # pyright: ignore[reportUnusedImport]
workflow, # pyright: ignore[reportUnusedImport]
workflow_app_log, # pyright: ignore[reportUnusedImport]
workflow_draft_variable, # pyright: ignore[reportUnusedImport]
workflow_run, # pyright: ignore[reportUnusedImport]
workflow_statistic, # pyright: ignore[reportUnusedImport]
advanced_prompt_template,
agent,
annotation,
app,
audio,
completion,
conversation,
conversation_variables,
generator,
mcp_server,
message,
model_config,
ops_trace,
site,
statistic,
workflow,
workflow_app_log,
workflow_draft_variable,
workflow_run,
workflow_statistic,
workflow_trigger,
)
# Import auth controllers
from .auth import (
activate, # pyright: ignore[reportUnusedImport]
data_source_bearer_auth, # pyright: ignore[reportUnusedImport]
data_source_oauth, # pyright: ignore[reportUnusedImport]
forgot_password, # pyright: ignore[reportUnusedImport]
login, # pyright: ignore[reportUnusedImport]
oauth, # pyright: ignore[reportUnusedImport]
oauth_server, # pyright: ignore[reportUnusedImport]
)
from .auth import activate, data_source_bearer_auth, data_source_oauth, forgot_password, login, oauth, oauth_server
# Import billing controllers
from .billing import billing, compliance # pyright: ignore[reportUnusedImport]
from .billing import billing, compliance
# Import datasets controllers
from .datasets import (
data_source, # pyright: ignore[reportUnusedImport]
datasets, # pyright: ignore[reportUnusedImport]
datasets_document, # pyright: ignore[reportUnusedImport]
datasets_segments, # pyright: ignore[reportUnusedImport]
external, # pyright: ignore[reportUnusedImport]
hit_testing, # pyright: ignore[reportUnusedImport]
metadata, # pyright: ignore[reportUnusedImport]
website, # pyright: ignore[reportUnusedImport]
data_source,
datasets,
datasets_document,
datasets_segments,
external,
hit_testing,
metadata,
website,
)
# Import explore controllers
from .explore import (
installed_app, # pyright: ignore[reportUnusedImport]
parameter, # pyright: ignore[reportUnusedImport]
recommended_app, # pyright: ignore[reportUnusedImport]
saved_message, # pyright: ignore[reportUnusedImport]
)
# Import tag controllers
from .tag import tags # pyright: ignore[reportUnusedImport]
# Import workspace controllers
from .workspace import (
account, # pyright: ignore[reportUnusedImport]
agent_providers, # pyright: ignore[reportUnusedImport]
endpoint, # pyright: ignore[reportUnusedImport]
load_balancing_config, # pyright: ignore[reportUnusedImport]
members, # pyright: ignore[reportUnusedImport]
model_providers, # pyright: ignore[reportUnusedImport]
models, # pyright: ignore[reportUnusedImport]
plugin, # pyright: ignore[reportUnusedImport]
tool_providers, # pyright: ignore[reportUnusedImport]
workspace, # pyright: ignore[reportUnusedImport]
installed_app,
parameter,
recommended_app,
saved_message,
)
# Explore Audio
@ -210,4 +167,20 @@ api.add_resource(
InstalledAppWorkflowTaskStopApi, "/installed-apps/<uuid:installed_app_id>/workflows/tasks/<string:task_id>/stop"
)
api.add_namespace(console_ns)
# Import tag controllers
from .tag import tags
# Import workspace controllers
from .workspace import (
account,
agent_providers,
endpoint,
load_balancing_config,
members,
model_providers,
models,
plugin,
tool_providers,
trigger_providers,
workspace,
)
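This file now builds a console_ns Namespace, attaches resources either via @console_ns.route or api.add_resource, and registers the namespace with api.add_namespace(console_ns) (visible above in this file's diff). A minimal, generic flask_restx sketch of that pattern, using the stock Api class rather than Dify's ExternalApi subclass:
```
# Generic flask_restx sketch of the Blueprint + Namespace wiring used above;
# Dify's ExternalApi subclass adds behaviour not reproduced here.
from flask import Flask, Blueprint
from flask_restx import Api, Namespace, Resource

bp = Blueprint("console", __name__, url_prefix="/console/api")
api = Api(bp, version="1.0", title="Console API")
console_ns = Namespace("console", description="Console management API operations", path="/")

@console_ns.route("/ping")
class Ping(Resource):
    def get(self):
        return {"result": "pong"}

api.add_namespace(console_ns)

app = Flask(__name__)
app.register_blueprint(bp)  # serves GET /console/api/ping
```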

View File

@ -1,26 +1,22 @@
from collections.abc import Callable
from functools import wraps
from typing import ParamSpec, TypeVar
from flask import request
from flask_restx import Resource, fields, reqparse
from flask_restx import Resource, reqparse
from sqlalchemy import select
from sqlalchemy.orm import Session
from werkzeug.exceptions import NotFound, Unauthorized
P = ParamSpec("P")
R = TypeVar("R")
from configs import dify_config
from constants.languages import supported_language
from controllers.console import api, console_ns
from controllers.console import api
from controllers.console.wraps import only_edition_cloud
from extensions.ext_database import db
from models.model import App, InstalledApp, RecommendedApp
def admin_required(view: Callable[P, R]):
def admin_required(view):
@wraps(view)
def decorated(*args: P.args, **kwargs: P.kwargs):
def decorated(*args, **kwargs):
if not dify_config.ADMIN_API_KEY:
raise Unauthorized("API key is invalid.")
@ -45,28 +41,7 @@ def admin_required(view: Callable[P, R]):
return decorated
@console_ns.route("/admin/insert-explore-apps")
class InsertExploreAppListApi(Resource):
@api.doc("insert_explore_app")
@api.doc(description="Insert or update an app in the explore list")
@api.expect(
api.model(
"InsertExploreAppRequest",
{
"app_id": fields.String(required=True, description="Application ID"),
"desc": fields.String(description="App description"),
"copyright": fields.String(description="Copyright information"),
"privacy_policy": fields.String(description="Privacy policy"),
"custom_disclaimer": fields.String(description="Custom disclaimer"),
"language": fields.String(required=True, description="Language code"),
"category": fields.String(required=True, description="App category"),
"position": fields.Integer(required=True, description="Display position"),
},
)
)
@api.response(200, "App updated successfully")
@api.response(201, "App inserted successfully")
@api.response(404, "App not found")
@only_edition_cloud
@admin_required
def post(self):
@ -136,12 +111,7 @@ class InsertExploreAppListApi(Resource):
return {"result": "success"}, 200
@console_ns.route("/admin/insert-explore-apps/<uuid:app_id>")
class InsertExploreAppApi(Resource):
@api.doc("delete_explore_app")
@api.doc(description="Remove an app from the explore list")
@api.doc(params={"app_id": "Application ID to remove"})
@api.response(204, "App removed successfully")
@only_edition_cloud
@admin_required
def delete(self, app_id):
@ -160,21 +130,21 @@ class InsertExploreAppApi(Resource):
app.is_public = False
with Session(db.engine) as session:
installed_apps = (
session.execute(
select(InstalledApp).where(
InstalledApp.app_id == recommended_app.app_id,
InstalledApp.tenant_id != InstalledApp.app_owner_tenant_id,
)
installed_apps = session.execute(
select(InstalledApp).where(
InstalledApp.app_id == recommended_app.app_id,
InstalledApp.tenant_id != InstalledApp.app_owner_tenant_id,
)
.scalars()
.all()
)
).all()
for installed_app in installed_apps:
session.delete(installed_app)
for installed_app in installed_apps:
db.session.delete(installed_app)
db.session.delete(recommended_app)
db.session.commit()
return {"result": "success"}, 204
api.add_resource(InsertExploreAppListApi, "/admin/insert-explore-apps")
api.add_resource(InsertExploreAppApi, "/admin/insert-explore-apps/<uuid:app_id>")

View File

@ -1,9 +1,8 @@
from typing import Optional
from typing import Any, Optional
import flask_restx
from flask_login import current_user
from flask_restx import Resource, fields, marshal_with
from flask_restx._http import HTTPStatus
from sqlalchemy import select
from sqlalchemy.orm import Session
from werkzeug.exceptions import Forbidden
@ -14,7 +13,7 @@ from libs.login import login_required
from models.dataset import Dataset
from models.model import ApiToken, App
from . import api, console_ns
from . import api
from .wraps import account_initialization_required, setup_required
api_key_fields = {
@ -41,7 +40,7 @@ def _get_resource(resource_id, tenant_id, resource_model):
).scalar_one_or_none()
if resource is None:
flask_restx.abort(HTTPStatus.NOT_FOUND, message=f"{resource_model.__name__} not found.")
flask_restx.abort(404, message=f"{resource_model.__name__} not found.")
return resource
@ -50,7 +49,7 @@ class BaseApiKeyListResource(Resource):
method_decorators = [account_initialization_required, login_required, setup_required]
resource_type: str | None = None
resource_model: Optional[type] = None
resource_model: Optional[Any] = None
resource_id_field: str | None = None
token_prefix: str | None = None
max_keys = 10
@ -60,11 +59,11 @@ class BaseApiKeyListResource(Resource):
assert self.resource_id_field is not None, "resource_id_field must be set"
resource_id = str(resource_id)
_get_resource(resource_id, current_user.current_tenant_id, self.resource_model)
keys = db.session.scalars(
select(ApiToken).where(
ApiToken.type == self.resource_type, getattr(ApiToken, self.resource_id_field) == resource_id
)
).all()
keys = (
db.session.query(ApiToken)
.where(ApiToken.type == self.resource_type, getattr(ApiToken, self.resource_id_field) == resource_id)
.all()
)
return {"items": keys}
@marshal_with(api_key_fields)
@ -83,12 +82,12 @@ class BaseApiKeyListResource(Resource):
if current_key_count >= self.max_keys:
flask_restx.abort(
HTTPStatus.BAD_REQUEST,
400,
message=f"Cannot create more than {self.max_keys} API keys for this resource type.",
custom="max_keys_exceeded",
code="max_keys_exceeded",
)
key = ApiToken.generate_api_key(self.token_prefix or "", 24)
key = ApiToken.generate_api_key(self.token_prefix, 24)
api_token = ApiToken()
setattr(api_token, self.resource_id_field, resource_id)
api_token.tenant_id = current_user.current_tenant_id
@ -103,7 +102,7 @@ class BaseApiKeyResource(Resource):
method_decorators = [account_initialization_required, login_required, setup_required]
resource_type: str | None = None
resource_model: Optional[type] = None
resource_model: Optional[Any] = None
resource_id_field: str | None = None
def delete(self, resource_id, api_key_id):
@ -127,7 +126,7 @@ class BaseApiKeyResource(Resource):
)
if key is None:
flask_restx.abort(HTTPStatus.NOT_FOUND, message="API key not found")
flask_restx.abort(404, message="API key not found")
db.session.query(ApiToken).where(ApiToken.id == api_key_id).delete()
db.session.commit()
@ -135,25 +134,7 @@ class BaseApiKeyResource(Resource):
return {"result": "success"}, 204
@console_ns.route("/apps/<uuid:resource_id>/api-keys")
class AppApiKeyListResource(BaseApiKeyListResource):
@api.doc("get_app_api_keys")
@api.doc(description="Get all API keys for an app")
@api.doc(params={"resource_id": "App ID"})
@api.response(200, "Success", api_key_list)
def get(self, resource_id):
"""Get all API keys for an app"""
return super().get(resource_id)
@api.doc("create_app_api_key")
@api.doc(description="Create a new API key for an app")
@api.doc(params={"resource_id": "App ID"})
@api.response(201, "API key created successfully", api_key_fields)
@api.response(400, "Maximum keys exceeded")
def post(self, resource_id):
"""Create a new API key for an app"""
return super().post(resource_id)
def after_request(self, resp):
resp.headers["Access-Control-Allow-Origin"] = "*"
resp.headers["Access-Control-Allow-Credentials"] = "true"
@ -165,16 +146,7 @@ class AppApiKeyListResource(BaseApiKeyListResource):
token_prefix = "app-"
@console_ns.route("/apps/<uuid:resource_id>/api-keys/<uuid:api_key_id>")
class AppApiKeyResource(BaseApiKeyResource):
@api.doc("delete_app_api_key")
@api.doc(description="Delete an API key for an app")
@api.doc(params={"resource_id": "App ID", "api_key_id": "API key ID"})
@api.response(204, "API key deleted successfully")
def delete(self, resource_id, api_key_id):
"""Delete an API key for an app"""
return super().delete(resource_id, api_key_id)
def after_request(self, resp):
resp.headers["Access-Control-Allow-Origin"] = "*"
resp.headers["Access-Control-Allow-Credentials"] = "true"
@ -185,25 +157,7 @@ class AppApiKeyResource(BaseApiKeyResource):
resource_id_field = "app_id"
@console_ns.route("/datasets/<uuid:resource_id>/api-keys")
class DatasetApiKeyListResource(BaseApiKeyListResource):
@api.doc("get_dataset_api_keys")
@api.doc(description="Get all API keys for a dataset")
@api.doc(params={"resource_id": "Dataset ID"})
@api.response(200, "Success", api_key_list)
def get(self, resource_id):
"""Get all API keys for a dataset"""
return super().get(resource_id)
@api.doc("create_dataset_api_key")
@api.doc(description="Create a new API key for a dataset")
@api.doc(params={"resource_id": "Dataset ID"})
@api.response(201, "API key created successfully", api_key_fields)
@api.response(400, "Maximum keys exceeded")
def post(self, resource_id):
"""Create a new API key for a dataset"""
return super().post(resource_id)
def after_request(self, resp):
resp.headers["Access-Control-Allow-Origin"] = "*"
resp.headers["Access-Control-Allow-Credentials"] = "true"
@ -215,16 +169,7 @@ class DatasetApiKeyListResource(BaseApiKeyListResource):
token_prefix = "ds-"
@console_ns.route("/datasets/<uuid:resource_id>/api-keys/<uuid:api_key_id>")
class DatasetApiKeyResource(BaseApiKeyResource):
@api.doc("delete_dataset_api_key")
@api.doc(description="Delete an API key for a dataset")
@api.doc(params={"resource_id": "Dataset ID", "api_key_id": "API key ID"})
@api.response(204, "API key deleted successfully")
def delete(self, resource_id, api_key_id):
"""Delete an API key for a dataset"""
return super().delete(resource_id, api_key_id)
def after_request(self, resp):
resp.headers["Access-Control-Allow-Origin"] = "*"
resp.headers["Access-Control-Allow-Credentials"] = "true"
@ -233,3 +178,9 @@ class DatasetApiKeyResource(BaseApiKeyResource):
resource_type = "dataset"
resource_model = Dataset
resource_id_field = "dataset_id"
api.add_resource(AppApiKeyListResource, "/apps/<uuid:resource_id>/api-keys")
api.add_resource(AppApiKeyResource, "/apps/<uuid:resource_id>/api-keys/<uuid:api_key_id>")
api.add_resource(DatasetApiKeyListResource, "/datasets/<uuid:resource_id>/api-keys")
api.add_resource(DatasetApiKeyResource, "/datasets/<uuid:resource_id>/api-keys/<uuid:api_key_id>")

View File

@ -115,10 +115,6 @@ class AppListApi(Resource):
raise BadRequest("mode is required")
app_service = AppService()
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
if current_user.current_tenant_id is None:
raise ValueError("current_user.current_tenant_id cannot be None")
app = app_service.create_app(current_user.current_tenant_id, args, current_user)
return app, 201
@ -165,26 +161,14 @@ class AppApi(Resource):
args = parser.parse_args()
app_service = AppService()
# Construct ArgsDict from parsed arguments
from services.app_service import AppService as AppServiceType
args_dict: AppServiceType.ArgsDict = {
"name": args["name"],
"description": args.get("description", ""),
"icon_type": args.get("icon_type", ""),
"icon": args.get("icon", ""),
"icon_background": args.get("icon_background", ""),
"use_icon_as_answer_icon": args.get("use_icon_as_answer_icon", False),
"max_active_requests": args.get("max_active_requests", 0),
}
app_model = app_service.update_app(app_model, args_dict)
app_model = app_service.update_app(app_model, args)
return app_model
@get_app_model
@setup_required
@login_required
@account_initialization_required
@get_app_model
def delete(self, app_model):
"""Delete app"""
# The role of the current user in the ta table must be admin, owner, or editor
@ -240,10 +224,10 @@ class AppCopyApi(Resource):
class AppExportApi(Resource):
@get_app_model
@setup_required
@login_required
@account_initialization_required
@get_app_model
def get(self, app_model):
"""Export app"""
# The role of the current user in the ta table must be admin, owner, or editor
@ -253,14 +237,9 @@ class AppExportApi(Resource):
# Add include_secret params
parser = reqparse.RequestParser()
parser.add_argument("include_secret", type=inputs.boolean, default=False, location="args")
parser.add_argument("workflow_id", type=str, location="args")
args = parser.parse_args()
return {
"data": AppDslService.export_dsl(
app_model=app_model, include_secret=args["include_secret"], workflow_id=args.get("workflow_id")
)
}
return {"data": AppDslService.export_dsl(app_model=app_model, include_secret=args["include_secret"])}
class AppNameApi(Resource):
@ -279,7 +258,7 @@ class AppNameApi(Resource):
args = parser.parse_args()
app_service = AppService()
app_model = app_service.update_app_name(app_model, args["name"])
app_model = app_service.update_app_name(app_model, args.get("name"))
return app_model
@ -301,7 +280,7 @@ class AppIconApi(Resource):
args = parser.parse_args()
app_service = AppService()
app_model = app_service.update_app_icon(app_model, args.get("icon") or "", args.get("icon_background") or "")
app_model = app_service.update_app_icon(app_model, args.get("icon"), args.get("icon_background"))
return app_model
@ -322,7 +301,7 @@ class AppSiteStatus(Resource):
args = parser.parse_args()
app_service = AppService()
app_model = app_service.update_app_site_status(app_model, args["enable_site"])
app_model = app_service.update_app_site_status(app_model, args.get("enable_site"))
return app_model
@ -343,7 +322,7 @@ class AppApiStatus(Resource):
args = parser.parse_args()
app_service = AppService()
app_model = app_service.update_app_api_status(app_model, args["enable_api"])
app_model = app_service.update_app_api_status(app_model, args.get("enable_api"))
return app_model

View File

@ -77,10 +77,10 @@ class ChatMessageAudioApi(Resource):
class ChatMessageTextApi(Resource):
@get_app_model
@setup_required
@login_required
@account_initialization_required
@get_app_model
def post(self, app_model: App):
try:
parser = reqparse.RequestParser()
@ -125,10 +125,10 @@ class ChatMessageTextApi(Resource):
class TextModesApi(Resource):
@get_app_model
@setup_required
@login_required
@account_initialization_required
@get_app_model
def get(self, app_model):
try:
parser = reqparse.RequestParser()

View File

@ -1,5 +1,6 @@
import logging
import flask_login
from flask import request
from flask_restx import Resource, reqparse
from werkzeug.exceptions import InternalServerError, NotFound
@ -28,8 +29,7 @@ from core.helper.trace_id_helper import get_external_trace_id
from core.model_runtime.errors.invoke import InvokeError
from libs import helper
from libs.helper import uuid_value
from libs.login import current_user, login_required
from models import Account
from libs.login import login_required
from models.model import AppMode
from services.app_generate_service import AppGenerateService
from services.errors.llm import InvokeRateLimitError
@ -56,11 +56,11 @@ class CompletionMessageApi(Resource):
streaming = args["response_mode"] != "blocking"
args["auto_generate_name"] = False
account = flask_login.current_user
try:
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account or EndUser instance")
response = AppGenerateService.generate(
app_model=app_model, user=current_user, args=args, invoke_from=InvokeFrom.DEBUGGER, streaming=streaming
app_model=app_model, user=account, args=args, invoke_from=InvokeFrom.DEBUGGER, streaming=streaming
)
return helper.compact_generate_response(response)
@ -92,9 +92,9 @@ class CompletionMessageStopApi(Resource):
@account_initialization_required
@get_app_model(mode=AppMode.COMPLETION)
def post(self, app_model, task_id):
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
AppQueueManager.set_stop_flag(task_id, InvokeFrom.DEBUGGER, current_user.id)
account = flask_login.current_user
AppQueueManager.set_stop_flag(task_id, InvokeFrom.DEBUGGER, account.id)
return {"result": "success"}, 200
@ -123,11 +123,11 @@ class ChatMessageApi(Resource):
if external_trace_id:
args["external_trace_id"] = external_trace_id
account = flask_login.current_user
try:
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account or EndUser instance")
response = AppGenerateService.generate(
app_model=app_model, user=current_user, args=args, invoke_from=InvokeFrom.DEBUGGER, streaming=streaming
app_model=app_model, user=account, args=args, invoke_from=InvokeFrom.DEBUGGER, streaming=streaming
)
return helper.compact_generate_response(response)
@ -161,9 +161,9 @@ class ChatMessageStopApi(Resource):
@account_initialization_required
@get_app_model(mode=[AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT])
def post(self, app_model, task_id):
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
AppQueueManager.set_stop_flag(task_id, InvokeFrom.DEBUGGER, current_user.id)
account = flask_login.current_user
AppQueueManager.set_stop_flag(task_id, InvokeFrom.DEBUGGER, account.id)
return {"result": "success"}, 200

View File

@ -22,7 +22,7 @@ from fields.conversation_fields import (
from libs.datetime_utils import naive_utc_now
from libs.helper import DatetimeString
from libs.login import login_required
from models import Account, Conversation, EndUser, Message, MessageAnnotation
from models import Conversation, EndUser, Message, MessageAnnotation
from models.model import AppMode
from services.conversation_service import ConversationService
from services.errors.conversation import ConversationNotExistsError
@ -117,15 +117,13 @@ class CompletionConversationDetailApi(Resource):
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=AppMode.COMPLETION)
@get_app_model(mode=[AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT])
def delete(self, app_model, conversation_id):
if not current_user.is_editor:
raise Forbidden()
conversation_id = str(conversation_id)
try:
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
ConversationService.delete(app_model, conversation_id, current_user)
except ConversationNotExistsError:
raise NotFound("Conversation Not Exists.")
@ -284,8 +282,6 @@ class ChatConversationDetailApi(Resource):
conversation_id = str(conversation_id)
try:
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
ConversationService.delete(app_model, conversation_id, current_user)
except ConversationNotExistsError:
raise NotFound("Conversation Not Exists.")

View File

@ -12,6 +12,7 @@ from controllers.console.app.error import (
)
from controllers.console.wraps import account_initialization_required, setup_required
from core.errors.error import ModelCurrentlyNotSupportError, ProviderTokenNotInitError, QuotaExceededError
from core.helper.code_executor.code_node_provider import CodeNodeProvider
from core.helper.code_executor.javascript.javascript_code_provider import JavascriptCodeProvider
from core.helper.code_executor.python3.python3_code_provider import Python3CodeProvider
from core.llm_generator.llm_generator import LLMGenerator
@ -125,13 +126,11 @@ class InstructionGenerateApi(Resource):
parser.add_argument("model_config", type=dict, required=True, nullable=False, location="json")
parser.add_argument("ideal_output", type=str, required=False, default="", location="json")
args = parser.parse_args()
code_template = (
Python3CodeProvider.get_default_code()
if args["language"] == "python"
else (JavascriptCodeProvider.get_default_code())
if args["language"] == "javascript"
else ""
providers: list[type[CodeNodeProvider]] = [Python3CodeProvider, JavascriptCodeProvider]
code_provider: type[CodeNodeProvider] | None = next(
(p for p in providers if p.is_accept_language(args["language"])), None
)
code_template = code_provider.get_default_code() if code_provider else ""
try:
# Generate from nothing for a workflow node
if (args["current"] == code_template or args["current"] == "") and args["node_id"] != "":
@ -207,7 +206,7 @@ class InstructionGenerationTemplateApi(Resource):
@setup_required
@login_required
@account_initialization_required
def post(self):
def post(self) -> dict:
parser = reqparse.RequestParser()
parser.add_argument("type", type=str, required=True, default=False, location="json")
args = parser.parse_args()

View File

@ -1,5 +1,6 @@
import logging
from flask_login import current_user
from flask_restx import Resource, fields, marshal_with, reqparse
from flask_restx.inputs import int_range
from sqlalchemy import exists, select
@ -26,8 +27,7 @@ from extensions.ext_database import db
from fields.conversation_fields import annotation_fields, message_detail_fields
from libs.helper import uuid_value
from libs.infinite_scroll_pagination import InfiniteScrollPagination
from libs.login import current_user, login_required
from models.account import Account
from libs.login import login_required
from models.model import AppMode, Conversation, Message, MessageAnnotation, MessageFeedback
from services.annotation_service import AppAnnotationService
from services.errors.conversation import ConversationNotExistsError
@ -118,14 +118,11 @@ class ChatMessageListApi(Resource):
class MessageFeedbackApi(Resource):
@get_app_model
@setup_required
@login_required
@account_initialization_required
@get_app_model
def post(self, app_model):
if current_user is None:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("message_id", required=True, type=uuid_value, location="json")
parser.add_argument("rating", type=str, choices=["like", "dislike", None], location="json")
@ -170,8 +167,6 @@ class MessageAnnotationApi(Resource):
@get_app_model
@marshal_with(annotation_fields)
def post(self, app_model):
if not isinstance(current_user, Account):
raise Forbidden()
if not current_user.is_editor:
raise Forbidden()
@ -187,10 +182,10 @@ class MessageAnnotationApi(Resource):
class MessageAnnotationCountApi(Resource):
@get_app_model
@setup_required
@login_required
@account_initialization_required
@get_app_model
def get(self, app_model):
count = db.session.query(MessageAnnotation).where(MessageAnnotation.app_id == app_model.id).count()

View File

@ -10,7 +10,7 @@ from extensions.ext_database import db
from fields.app_fields import app_site_fields
from libs.datetime_utils import naive_utc_now
from libs.login import login_required
from models import Account, Site
from models import Site
def parse_app_site_args():
@ -75,8 +75,6 @@ class AppSite(Resource):
if value is not None:
setattr(site, attr_name, value)
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
site.updated_by = current_user.id
site.updated_at = naive_utc_now()
db.session.commit()
@ -101,8 +99,6 @@ class AppSiteAccessTokenReset(Resource):
raise NotFound
site.code = Site.generate_code(16)
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
site.updated_by = current_user.id
site.updated_at = naive_utc_now()
db.session.commit()

View File

@ -18,10 +18,10 @@ from models import AppMode, Message
class DailyMessageStatistic(Resource):
@get_app_model
@setup_required
@login_required
@account_initialization_required
@get_app_model
def get(self, app_model):
account = current_user
@ -75,10 +75,10 @@ WHERE
class DailyConversationStatistic(Resource):
@get_app_model
@setup_required
@login_required
@account_initialization_required
@get_app_model
def get(self, app_model):
account = current_user
@ -127,10 +127,10 @@ class DailyConversationStatistic(Resource):
class DailyTerminalsStatistic(Resource):
@get_app_model
@setup_required
@login_required
@account_initialization_required
@get_app_model
def get(self, app_model):
account = current_user
@ -184,10 +184,10 @@ WHERE
class DailyTokenCostStatistic(Resource):
@get_app_model
@setup_required
@login_required
@account_initialization_required
@get_app_model
def get(self, app_model):
account = current_user
@ -320,10 +320,10 @@ ORDER BY
class UserSatisfactionRateStatistic(Resource):
@get_app_model
@setup_required
@login_required
@account_initialization_required
@get_app_model
def get(self, app_model):
account = current_user
@ -443,10 +443,10 @@ WHERE
class TokensPerSecondStatistic(Resource):
@get_app_model
@setup_required
@login_required
@account_initialization_required
@get_app_model
def get(self, app_model):
account = current_user

View File
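The statistics endpoints above reorder @get_app_model relative to the auth decorators. Stacked decorators are applied bottom-up and run outermost-first, so the ordering decides which wrapper executes first on a request; a generic demo (no Dify decorators involved):
```
# Generic demo: decorators apply bottom-up, so the last-listed one wraps the
# function first and the first-listed one runs outermost on each call.
from functools import wraps

def tag(label):
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            print(f"enter {label}")
            return fn(*args, **kwargs)
        return wrapper
    return decorator

@tag("outer")  # runs first on a call
@tag("inner")  # applied first, runs second
def handler():
    return "ok"

handler()  # prints "enter outer" then "enter inner"
```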

@ -11,7 +11,11 @@ from werkzeug.exceptions import Forbidden, InternalServerError, NotFound
import services
from configs import dify_config
from controllers.console import api
from controllers.console.app.error import ConversationCompletedError, DraftWorkflowNotExist, DraftWorkflowNotSync
from controllers.console.app.error import (
ConversationCompletedError,
DraftWorkflowNotExist,
DraftWorkflowNotSync,
)
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.web.error import InvokeRateLimitError as InvokeRateLimitHttpError
@ -20,6 +24,7 @@ from core.app.apps.base_app_queue_manager import AppQueueManager
from core.app.entities.app_invoke_entities import InvokeFrom
from core.file.models import File
from core.helper.trace_id_helper import get_external_trace_id
from core.model_runtime.utils.encoders import jsonable_encoder
from extensions.ext_database import db
from factories import file_factory, variable_factory
from fields.workflow_fields import workflow_fields, workflow_pagination_fields
@ -34,6 +39,7 @@ from models.workflow import Workflow
from services.app_generate_service import AppGenerateService
from services.errors.app import WorkflowHashNotEqualError
from services.errors.llm import InvokeRateLimitError
from services.trigger_debug_service import TriggerDebugService
from services.workflow_service import DraftWorkflowDeletionError, WorkflowInUseError, WorkflowService
logger = logging.getLogger(__name__)
@ -522,7 +528,7 @@ class PublishedWorkflowApi(Resource):
)
app_model.workflow_id = workflow.id
db.session.commit()  # NOTE: this is necessary to update app_model.workflow_id
db.session.commit()
workflow_created_at = TimestampField().format(workflow.created_at)
@ -802,6 +808,132 @@ class DraftWorkflowNodeLastRunApi(Resource):
return node_exec
class DraftWorkflowTriggerNodeApi(Resource):
"""
Single node debug - Polling API for trigger events
Path: /apps/<uuid:app_id>/workflows/draft/nodes/<string:node_id>/trigger
"""
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=[AppMode.WORKFLOW])
def post(self, app_model: App, node_id: str):
"""
Poll for trigger events and execute single node when event arrives
"""
if not isinstance(current_user, Account) or not current_user.is_editor:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("trigger_name", type=str, required=True, location="json")
parser.add_argument("subscription_id", type=str, required=True, location="json")
args = parser.parse_args()
trigger_name = args["trigger_name"]
subscription_id = args["subscription_id"]
event = TriggerDebugService.poll_event(
tenant_id=app_model.tenant_id,
user_id=current_user.id,
app_id=app_model.id,
subscription_id=subscription_id,
node_id=node_id,
trigger_name=trigger_name,
)
if not event:
return jsonable_encoder({"status": "waiting"})
try:
workflow_service = WorkflowService()
draft_workflow = workflow_service.get_draft_workflow(app_model)
if not draft_workflow:
raise ValueError("Workflow not found")
user_inputs = event.model_dump()
node_execution = workflow_service.run_draft_workflow_node(
app_model=app_model,
draft_workflow=draft_workflow,
node_id=node_id,
user_inputs=user_inputs,
account=current_user,
query="",
files=[],
)
return jsonable_encoder(node_execution)
except Exception:
logger.exception("Error running draft workflow trigger node")
return jsonable_encoder(
{
"status": "error",
}
), 500
class DraftWorkflowTriggerRunApi(Resource):
"""
Full workflow debug - Polling API for trigger events
Path: /apps/<uuid:app_id>/workflows/draft/trigger/run
"""
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=[AppMode.WORKFLOW])
def post(self, app_model: App):
"""
Poll for trigger events and execute full workflow when event arrives
"""
if not isinstance(current_user, Account) or not current_user.is_editor:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("node_id", type=str, required=True, location="json", nullable=False)
parser.add_argument("trigger_name", type=str, required=True, location="json", nullable=False)
parser.add_argument("subscription_id", type=str, required=True, location="json", nullable=False)
args = parser.parse_args()
node_id = args["node_id"]
trigger_name = args["trigger_name"]
subscription_id = args["subscription_id"]
event = TriggerDebugService.poll_event(
tenant_id=app_model.tenant_id,
user_id=current_user.id,
app_id=app_model.id,
subscription_id=subscription_id,
node_id=node_id,
trigger_name=trigger_name,
)
if not event:
return jsonable_encoder({"status": "waiting"})
workflow_args = {
"inputs": event.model_dump(),
"query": "",
"files": [],
}
external_trace_id = get_external_trace_id(request)
if external_trace_id:
workflow_args["external_trace_id"] = external_trace_id
try:
response = AppGenerateService.generate(
app_model=app_model,
user=current_user,
args=workflow_args,
invoke_from=InvokeFrom.DEBUGGER,
streaming=True,
)
return helper.compact_generate_response(response)
except InvokeRateLimitError as ex:
raise InvokeRateLimitHttpError(ex.description)
except Exception:
logger.exception("Error running draft workflow trigger run")
return jsonable_encoder(
{
"status": "error",
}
), 500
api.add_resource(
DraftWorkflowApi,
"/apps/<uuid:app_id>/workflows/draft",
@ -826,6 +958,14 @@ api.add_resource(
DraftWorkflowNodeRunApi,
"/apps/<uuid:app_id>/workflows/draft/nodes/<string:node_id>/run",
)
api.add_resource(
DraftWorkflowTriggerNodeApi,
"/apps/<uuid:app_id>/workflows/draft/nodes/<string:node_id>/trigger",
)
api.add_resource(
DraftWorkflowTriggerRunApi,
"/apps/<uuid:app_id>/workflows/draft/trigger/run",
)
api.add_resource(
AdvancedChatDraftRunIterationNodeApi,
"/apps/<uuid:app_id>/advanced-chat/workflows/draft/iteration/nodes/<string:node_id>/run",

View File

@ -1,5 +1,5 @@
import logging
from typing import NoReturn
from typing import Any, NoReturn
from flask import Response
from flask_restx import Resource, fields, inputs, marshal, marshal_with, reqparse
@ -29,7 +29,7 @@ from services.workflow_service import WorkflowService
logger = logging.getLogger(__name__)
def _convert_values_to_json_serializable_object(value: Segment):
def _convert_values_to_json_serializable_object(value: Segment) -> Any:
if isinstance(value, FileSegment):
return value.value.model_dump()
elif isinstance(value, ArrayFileSegment):
@ -40,7 +40,7 @@ def _convert_values_to_json_serializable_object(value: Segment):
return value.value
def _serialize_var_value(variable: WorkflowDraftVariable):
def _serialize_var_value(variable: WorkflowDraftVariable) -> Any:
value = variable.get_value()
# create a copy of the value to avoid affecting the model cache.
value = value.model_copy(deep=True)

View File

@ -18,10 +18,10 @@ from models.model import AppMode
class WorkflowDailyRunsStatistic(Resource):
@get_app_model
@setup_required
@login_required
@account_initialization_required
@get_app_model
def get(self, app_model):
account = current_user
@ -80,10 +80,10 @@ WHERE
class WorkflowDailyTerminalsStatistic(Resource):
@get_app_model
@setup_required
@login_required
@account_initialization_required
@get_app_model
def get(self, app_model):
account = current_user
@ -142,10 +142,10 @@ WHERE
class WorkflowDailyTokenCostStatistic(Resource):
@get_app_model
@setup_required
@login_required
@account_initialization_required
@get_app_model
def get(self, app_model):
account = current_user
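
The only change in these three statistics resources is where @get_app_model sits in the decorator stack. Stacked decorators apply bottom-up: the one closest to the function wraps it directly, and the one listed first ends up outermost and runs first on the way in. A toy illustration with made-up names:

def a(f):
    def wrapper(*args, **kwargs):
        print("a: outermost, runs first on the way in")
        return f(*args, **kwargs)
    return wrapper

def b(f):
    def wrapper(*args, **kwargs):
        print("b: innermost, wraps the function directly")
        return f(*args, **kwargs)
    return wrapper

@a
@b
def handler():
    return "ok"

print(handler())   # equivalent to a(b(handler)): prints a's line, then b's, then "ok"

So moving @get_app_model between the top and the bottom of the stack changes whether the app model is resolved before or after the setup/login/account checks run.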

View File

@ -0,0 +1,249 @@
import logging
from flask_restx import Resource, marshal_with, reqparse
from sqlalchemy import select
from sqlalchemy.orm import Session
from werkzeug.exceptions import Forbidden, NotFound
from configs import dify_config
from controllers.console import api
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from core.model_runtime.utils.encoders import jsonable_encoder
from extensions.ext_database import db
from fields.workflow_trigger_fields import trigger_fields, triggers_list_fields, webhook_trigger_fields
from libs.login import current_user, login_required
from models.model import Account, AppMode
from models.workflow import AppTrigger, AppTriggerStatus, WorkflowWebhookTrigger
logger = logging.getLogger(__name__)
from services.workflow_plugin_trigger_service import WorkflowPluginTriggerService
class PluginTriggerApi(Resource):
"""Workflow Plugin Trigger API"""
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=AppMode.WORKFLOW)
def post(self, app_model):
"""Create plugin trigger"""
parser = reqparse.RequestParser()
parser.add_argument("node_id", type=str, required=False, location="json")
parser.add_argument("provider_id", type=str, required=False, location="json")
parser.add_argument("trigger_name", type=str, required=False, location="json")
parser.add_argument("subscription_id", type=str, required=False, location="json")
args = parser.parse_args()
assert isinstance(current_user, Account)
assert current_user.current_tenant_id is not None
if not current_user.is_editor:
raise Forbidden()
plugin_trigger = WorkflowPluginTriggerService.create_plugin_trigger(
app_id=app_model.id,
tenant_id=current_user.current_tenant_id,
node_id=args["node_id"],
provider_id=args["provider_id"],
trigger_name=args["trigger_name"],
subscription_id=args["subscription_id"],
)
return jsonable_encoder(plugin_trigger)
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=AppMode.WORKFLOW)
def get(self, app_model):
"""Get plugin trigger"""
parser = reqparse.RequestParser()
parser.add_argument("node_id", type=str, required=True, help="Node ID is required")
args = parser.parse_args()
plugin_trigger = WorkflowPluginTriggerService.get_plugin_trigger(
app_id=app_model.id,
node_id=args["node_id"],
)
return jsonable_encoder(plugin_trigger)
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=AppMode.WORKFLOW)
def put(self, app_model):
"""Update plugin trigger"""
parser = reqparse.RequestParser()
parser.add_argument("node_id", type=str, required=True, help="Node ID is required")
parser.add_argument("subscription_id", type=str, required=True, location="json", help="Subscription ID")
args = parser.parse_args()
assert isinstance(current_user, Account)
assert current_user.current_tenant_id is not None
if not current_user.is_editor:
raise Forbidden()
plugin_trigger = WorkflowPluginTriggerService.update_plugin_trigger(
app_id=app_model.id,
node_id=args["node_id"],
subscription_id=args["subscription_id"],
)
return jsonable_encoder(plugin_trigger)
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=AppMode.WORKFLOW)
def delete(self, app_model):
"""Delete plugin trigger"""
parser = reqparse.RequestParser()
parser.add_argument("node_id", type=str, required=True, help="Node ID is required")
args = parser.parse_args()
assert isinstance(current_user, Account)
assert current_user.current_tenant_id is not None
if not current_user.is_editor:
raise Forbidden()
WorkflowPluginTriggerService.delete_plugin_trigger(
app_id=app_model.id,
node_id=args["node_id"],
)
return {"result": "success"}, 204
class WebhookTriggerApi(Resource):
"""Webhook Trigger API"""
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=AppMode.WORKFLOW)
@marshal_with(webhook_trigger_fields)
def get(self, app_model):
"""Get webhook trigger for a node"""
parser = reqparse.RequestParser()
parser.add_argument("node_id", type=str, required=True, help="Node ID is required")
args = parser.parse_args()
node_id = args["node_id"]
with Session(db.engine) as session:
# Get webhook trigger for this app and node
webhook_trigger = (
session.query(WorkflowWebhookTrigger)
.filter(
WorkflowWebhookTrigger.app_id == app_model.id,
WorkflowWebhookTrigger.node_id == node_id,
)
.first()
)
if not webhook_trigger:
raise NotFound("Webhook trigger not found for this node")
# Add computed fields for marshal_with
base_url = dify_config.SERVICE_API_URL
webhook_trigger.webhook_url = f"{base_url}/triggers/webhook/{webhook_trigger.webhook_id}" # type: ignore
webhook_trigger.webhook_debug_url = f"{base_url}/triggers/webhook-debug/{webhook_trigger.webhook_id}" # type: ignore
return webhook_trigger
class AppTriggersApi(Resource):
"""App Triggers list API"""
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=AppMode.WORKFLOW)
@marshal_with(triggers_list_fields)
def get(self, app_model):
"""Get app triggers list"""
assert isinstance(current_user, Account)
assert current_user.current_tenant_id is not None
with Session(db.engine) as session:
# Get all triggers for this app using select API
triggers = (
session.execute(
select(AppTrigger)
.where(
AppTrigger.tenant_id == current_user.current_tenant_id,
AppTrigger.app_id == app_model.id,
)
.order_by(AppTrigger.created_at.desc(), AppTrigger.id.desc())
)
.scalars()
.all()
)
# Add computed icon field for each trigger
url_prefix = dify_config.CONSOLE_API_URL + "/console/api/workspaces/current/tool-provider/builtin/"
for trigger in triggers:
if trigger.trigger_type == "trigger-plugin":
trigger.icon = url_prefix + trigger.provider_name + "/icon" # type: ignore
else:
trigger.icon = "" # type: ignore
return {"data": triggers}
class AppTriggerEnableApi(Resource):
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=AppMode.WORKFLOW)
@marshal_with(trigger_fields)
def post(self, app_model):
"""Update app trigger (enable/disable)"""
parser = reqparse.RequestParser()
parser.add_argument("trigger_id", type=str, required=True, nullable=False, location="json")
parser.add_argument("enable_trigger", type=bool, required=True, nullable=False, location="json")
args = parser.parse_args()
assert isinstance(current_user, Account)
assert current_user.current_tenant_id is not None
if not current_user.is_editor:
raise Forbidden()
trigger_id = args["trigger_id"]
with Session(db.engine) as session:
# Find the trigger using select
trigger = session.execute(
select(AppTrigger).where(
AppTrigger.id == trigger_id,
AppTrigger.tenant_id == current_user.current_tenant_id,
AppTrigger.app_id == app_model.id,
)
).scalar_one_or_none()
if not trigger:
raise NotFound("Trigger not found")
# Update status based on enable_trigger boolean
trigger.status = AppTriggerStatus.ENABLED if args["enable_trigger"] else AppTriggerStatus.DISABLED
session.commit()
session.refresh(trigger)
# Add computed icon field
url_prefix = dify_config.CONSOLE_API_URL + "/console/api/workspaces/current/tool-provider/builtin/"
if trigger.trigger_type == "trigger-plugin":
trigger.icon = url_prefix + trigger.provider_name + "/icon" # type: ignore
else:
trigger.icon = "" # type: ignore
return trigger
api.add_resource(WebhookTriggerApi, "/apps/<uuid:app_id>/workflows/triggers/webhook")
api.add_resource(PluginTriggerApi, "/apps/<uuid:app_id>/workflows/triggers/plugin")
api.add_resource(AppTriggersApi, "/apps/<uuid:app_id>/triggers")
api.add_resource(AppTriggerEnableApi, "/apps/<uuid:app_id>/trigger-enable")
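
AppTriggerEnableApi takes a JSON body with trigger_id and enable_trigger, and WebhookTriggerApi is queried per node via a node_id query parameter. A hedged client-side sketch of both calls, assuming the marshalled response exposes the computed webhook_url / webhook_debug_url fields set in the handler above; the base URL, IDs, and cookie auth are placeholders.

import requests

CONSOLE_API = "http://localhost:5001/console/api"    # assumed local deployment
APP_ID = "00000000-0000-0000-0000-000000000000"      # placeholder app id
COOKIES = {"session": "placeholder-session-cookie"}  # placeholder console auth

def set_trigger_enabled(trigger_id: str, enabled: bool) -> dict:
    """Flip a trigger between ENABLED and DISABLED via the trigger-enable endpoint."""
    resp = requests.post(
        f"{CONSOLE_API}/apps/{APP_ID}/trigger-enable",
        json={"trigger_id": trigger_id, "enable_trigger": enabled},
        cookies=COOKIES,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

def get_webhook_urls(node_id: str) -> tuple[str, str]:
    """Fetch the computed webhook URLs for a webhook trigger node."""
    resp = requests.get(
        f"{CONSOLE_API}/apps/{APP_ID}/workflows/triggers/webhook",
        params={"node_id": node_id},
        cookies=COOKIES,
        timeout=30,
    )
    resp.raise_for_status()
    data = resp.json()
    return data["webhook_url"], data["webhook_debug_url"]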

View File

@ -1,6 +1,6 @@
from collections.abc import Callable
from functools import wraps
from typing import Optional, ParamSpec, TypeVar, Union
from typing import Optional, Union
from controllers.console.app.error import AppNotFoundError
from extensions.ext_database import db
@ -8,9 +8,6 @@ from libs.login import current_user
from models import App, AppMode
from models.account import Account
P = ParamSpec("P")
R = TypeVar("R")
def _load_app_model(app_id: str) -> Optional[App]:
assert isinstance(current_user, Account)
@ -22,10 +19,10 @@ def _load_app_model(app_id: str) -> Optional[App]:
return app_model
def get_app_model(view: Optional[Callable[P, R]] = None, *, mode: Union[AppMode, list[AppMode], None] = None):
def decorator(view_func: Callable[P, R]):
def get_app_model(view: Optional[Callable] = None, *, mode: Union[AppMode, list[AppMode], None] = None):
def decorator(view_func):
@wraps(view_func)
def decorated_view(*args: P.args, **kwargs: P.kwargs):
def decorated_view(*args, **kwargs):
if not kwargs.get("app_id"):
raise ValueError("missing app_id in path parameters")

View File

@ -1,8 +1,8 @@
from flask import request
from flask_restx import Resource, fields, reqparse
from flask_restx import Resource, reqparse
from constants.languages import supported_language
from controllers.console import api, console_ns
from controllers.console import api
from controllers.console.error import AlreadyActivateError
from extensions.ext_database import db
from libs.datetime_utils import naive_utc_now
@ -10,36 +10,14 @@ from libs.helper import StrLen, email, extract_remote_ip, timezone
from models.account import AccountStatus
from services.account_service import AccountService, RegisterService
active_check_parser = reqparse.RequestParser()
active_check_parser.add_argument(
"workspace_id", type=str, required=False, nullable=True, location="args", help="Workspace ID"
)
active_check_parser.add_argument(
"email", type=email, required=False, nullable=True, location="args", help="Email address"
)
active_check_parser.add_argument(
"token", type=str, required=True, nullable=False, location="args", help="Activation token"
)
@console_ns.route("/activate/check")
class ActivateCheckApi(Resource):
@api.doc("check_activation_token")
@api.doc(description="Check if activation token is valid")
@api.expect(active_check_parser)
@api.response(
200,
"Success",
api.model(
"ActivationCheckResponse",
{
"is_valid": fields.Boolean(description="Whether token is valid"),
"data": fields.Raw(description="Activation data if valid"),
},
),
)
def get(self):
args = active_check_parser.parse_args()
parser = reqparse.RequestParser()
parser.add_argument("workspace_id", type=str, required=False, nullable=True, location="args")
parser.add_argument("email", type=email, required=False, nullable=True, location="args")
parser.add_argument("token", type=str, required=True, nullable=False, location="args")
args = parser.parse_args()
workspaceId = args["workspace_id"]
reg_email = args["email"]
@ -60,36 +38,18 @@ class ActivateCheckApi(Resource):
return {"is_valid": False}
active_parser = reqparse.RequestParser()
active_parser.add_argument("workspace_id", type=str, required=False, nullable=True, location="json")
active_parser.add_argument("email", type=email, required=False, nullable=True, location="json")
active_parser.add_argument("token", type=str, required=True, nullable=False, location="json")
active_parser.add_argument("name", type=StrLen(30), required=True, nullable=False, location="json")
active_parser.add_argument(
"interface_language", type=supported_language, required=True, nullable=False, location="json"
)
active_parser.add_argument("timezone", type=timezone, required=True, nullable=False, location="json")
@console_ns.route("/activate")
class ActivateApi(Resource):
@api.doc("activate_account")
@api.doc(description="Activate account with invitation token")
@api.expect(active_parser)
@api.response(
200,
"Account activated successfully",
api.model(
"ActivationResponse",
{
"result": fields.String(description="Operation result"),
"data": fields.Raw(description="Login token data"),
},
),
)
@api.response(400, "Already activated or invalid token")
def post(self):
args = active_parser.parse_args()
parser = reqparse.RequestParser()
parser.add_argument("workspace_id", type=str, required=False, nullable=True, location="json")
parser.add_argument("email", type=email, required=False, nullable=True, location="json")
parser.add_argument("token", type=str, required=True, nullable=False, location="json")
parser.add_argument("name", type=StrLen(30), required=True, nullable=False, location="json")
parser.add_argument(
"interface_language", type=supported_language, required=True, nullable=False, location="json"
)
parser.add_argument("timezone", type=timezone, required=True, nullable=False, location="json")
args = parser.parse_args()
invitation = RegisterService.get_invitation_if_token_valid(args["workspace_id"], args["email"], args["token"])
if invitation is None:
@ -110,3 +70,7 @@ class ActivateApi(Resource):
token_pair = AccountService.login(account, ip_address=extract_remote_ip(request))
return {"result": "success", "data": token_pair.model_dump()}
api.add_resource(ActivateCheckApi, "/activate/check")
api.add_resource(ActivateApi, "/activate")
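
The two versions of this controller differ mainly in how the resources are registered (namespace route decorators with api.doc/api.expect metadata versus bare api.add_resource calls), while the parsing itself stays reqparse-based in both. A minimal standalone flask_restx app using the api.add_resource style, with illustrative names and a toy validity rule:

from flask import Flask
from flask_restx import Api, Resource, reqparse

app = Flask(__name__)
api = Api(app)

check_parser = reqparse.RequestParser()
check_parser.add_argument("token", type=str, required=True, nullable=False, location="args")

class ActivateCheck(Resource):
    def get(self):
        args = check_parser.parse_args()
        # toy validity rule; the real check goes through RegisterService
        return {"is_valid": bool(args["token"])}

api.add_resource(ActivateCheck, "/activate/check")

if __name__ == "__main__":
    app.run(debug=True)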

View File

@ -3,11 +3,11 @@ import logging
import requests
from flask import current_app, redirect, request
from flask_login import current_user
from flask_restx import Resource, fields
from flask_restx import Resource
from werkzeug.exceptions import Forbidden
from configs import dify_config
from controllers.console import api, console_ns
from controllers.console import api
from libs.login import login_required
from libs.oauth_data_source import NotionOAuth
@ -28,21 +28,7 @@ def get_oauth_providers():
return OAUTH_PROVIDERS
@console_ns.route("/oauth/data-source/<string:provider>")
class OAuthDataSource(Resource):
@api.doc("oauth_data_source")
@api.doc(description="Get OAuth authorization URL for data source provider")
@api.doc(params={"provider": "Data source provider name (notion)"})
@api.response(
200,
"Authorization URL or internal setup success",
api.model(
"OAuthDataSourceResponse",
{"data": fields.Raw(description="Authorization URL or 'internal' for internal setup")},
),
)
@api.response(400, "Invalid provider")
@api.response(403, "Admin privileges required")
def get(self, provider: str):
# The role of the current user in the table must be admin or owner
if not current_user.is_admin_or_owner:
@ -63,19 +49,7 @@ class OAuthDataSource(Resource):
return {"data": auth_url}, 200
@console_ns.route("/oauth/data-source/callback/<string:provider>")
class OAuthDataSourceCallback(Resource):
@api.doc("oauth_data_source_callback")
@api.doc(description="Handle OAuth callback from data source provider")
@api.doc(
params={
"provider": "Data source provider name (notion)",
"code": "Authorization code from OAuth provider",
"error": "Error message from OAuth provider",
}
)
@api.response(302, "Redirect to console with result")
@api.response(400, "Invalid provider")
def get(self, provider: str):
OAUTH_DATASOURCE_PROVIDERS = get_oauth_providers()
with current_app.app_context():
@ -94,19 +68,7 @@ class OAuthDataSourceCallback(Resource):
return redirect(f"{dify_config.CONSOLE_WEB_URL}?type=notion&error=Access denied")
@console_ns.route("/oauth/data-source/binding/<string:provider>")
class OAuthDataSourceBinding(Resource):
@api.doc("oauth_data_source_binding")
@api.doc(description="Bind OAuth data source with authorization code")
@api.doc(
params={"provider": "Data source provider name (notion)", "code": "Authorization code from OAuth provider"}
)
@api.response(
200,
"Data source binding success",
api.model("OAuthDataSourceBindingResponse", {"result": fields.String(description="Operation result")}),
)
@api.response(400, "Invalid provider or code")
def get(self, provider: str):
OAUTH_DATASOURCE_PROVIDERS = get_oauth_providers()
with current_app.app_context():
@ -119,7 +81,7 @@ class OAuthDataSourceBinding(Resource):
return {"error": "Invalid code"}, 400
try:
oauth_provider.get_access_token(code)
except requests.HTTPError as e:
except requests.exceptions.HTTPError as e:
logger.exception(
"An error occurred during the OAuthCallback process with %s: %s", provider, e.response.text
)
@ -128,17 +90,7 @@ class OAuthDataSourceBinding(Resource):
return {"result": "success"}, 200
@console_ns.route("/oauth/data-source/<string:provider>/<uuid:binding_id>/sync")
class OAuthDataSourceSync(Resource):
@api.doc("oauth_data_source_sync")
@api.doc(description="Sync data from OAuth data source")
@api.doc(params={"provider": "Data source provider name (notion)", "binding_id": "Data source binding ID"})
@api.response(
200,
"Data source sync success",
api.model("OAuthDataSourceSyncResponse", {"result": fields.String(description="Operation result")}),
)
@api.response(400, "Invalid provider or sync failed")
@setup_required
@login_required
@account_initialization_required
@ -152,10 +104,16 @@ class OAuthDataSourceSync(Resource):
return {"error": "Invalid provider"}, 400
try:
oauth_provider.sync_data_source(binding_id)
except requests.HTTPError as e:
except requests.exceptions.HTTPError as e:
logger.exception(
"An error occurred during the OAuthCallback process with %s: %s", provider, e.response.text
)
return {"error": "OAuth data source process failed"}, 400
return {"result": "success"}, 200
api.add_resource(OAuthDataSource, "/oauth/data-source/<string:provider>")
api.add_resource(OAuthDataSourceCallback, "/oauth/data-source/callback/<string:provider>")
api.add_resource(OAuthDataSourceBinding, "/oauth/data-source/binding/<string:provider>")
api.add_resource(OAuthDataSourceSync, "/oauth/data-source/<string:provider>/<uuid:binding_id>/sync")
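
The only code change in the two except clauses above is the spelling of the exception: requests re-exports HTTPError at the package top level, so both names refer to the same class. A quick check:

import requests

# Both spellings resolve to the same exception class.
assert requests.HTTPError is requests.exceptions.HTTPError
print(requests.HTTPError.__module__)   # "requests.exceptions"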

View File

@ -2,12 +2,12 @@ import base64
import secrets
from flask import request
from flask_restx import Resource, fields, reqparse
from flask_restx import Resource, reqparse
from sqlalchemy import select
from sqlalchemy.orm import Session
from constants.languages import languages
from controllers.console import api, console_ns
from controllers.console import api
from controllers.console.auth.error import (
EmailCodeError,
EmailPasswordResetLimitError,
@ -28,32 +28,7 @@ from services.errors.workspace import WorkSpaceNotAllowedCreateError, Workspaces
from services.feature_service import FeatureService
@console_ns.route("/forgot-password")
class ForgotPasswordSendEmailApi(Resource):
@api.doc("send_forgot_password_email")
@api.doc(description="Send password reset email")
@api.expect(
api.model(
"ForgotPasswordEmailRequest",
{
"email": fields.String(required=True, description="Email address"),
"language": fields.String(description="Language for email (zh-Hans/en-US)"),
},
)
)
@api.response(
200,
"Email sent successfully",
api.model(
"ForgotPasswordEmailResponse",
{
"result": fields.String(description="Operation result"),
"data": fields.String(description="Reset token"),
"code": fields.String(description="Error code if account not found"),
},
),
)
@api.response(400, "Invalid email or rate limit exceeded")
@setup_required
@email_password_login_enabled
def post(self):
@ -86,33 +61,7 @@ class ForgotPasswordSendEmailApi(Resource):
return {"result": "success", "data": token}
@console_ns.route("/forgot-password/validity")
class ForgotPasswordCheckApi(Resource):
@api.doc("check_forgot_password_code")
@api.doc(description="Verify password reset code")
@api.expect(
api.model(
"ForgotPasswordCheckRequest",
{
"email": fields.String(required=True, description="Email address"),
"code": fields.String(required=True, description="Verification code"),
"token": fields.String(required=True, description="Reset token"),
},
)
)
@api.response(
200,
"Code verified successfully",
api.model(
"ForgotPasswordCheckResponse",
{
"is_valid": fields.Boolean(description="Whether code is valid"),
"email": fields.String(description="Email address"),
"token": fields.String(description="New reset token"),
},
),
)
@api.response(400, "Invalid code or token")
@setup_required
@email_password_login_enabled
def post(self):
@ -151,26 +100,7 @@ class ForgotPasswordCheckApi(Resource):
return {"is_valid": True, "email": token_data.get("email"), "token": new_token}
@console_ns.route("/forgot-password/resets")
class ForgotPasswordResetApi(Resource):
@api.doc("reset_password")
@api.doc(description="Reset password with verification token")
@api.expect(
api.model(
"ForgotPasswordResetRequest",
{
"token": fields.String(required=True, description="Verification token"),
"new_password": fields.String(required=True, description="New password"),
"password_confirm": fields.String(required=True, description="Password confirmation"),
},
)
)
@api.response(
200,
"Password reset successfully",
api.model("ForgotPasswordResetResponse", {"result": fields.String(description="Operation result")}),
)
@api.response(400, "Invalid token or password mismatch")
@setup_required
@email_password_login_enabled
def post(self):
@ -242,3 +172,8 @@ class ForgotPasswordResetApi(Resource):
pass
except AccountRegisterError:
raise AccountInFreezeError()
api.add_resource(ForgotPasswordSendEmailApi, "/forgot-password")
api.add_resource(ForgotPasswordCheckApi, "/forgot-password/validity")
api.add_resource(ForgotPasswordResetApi, "/forgot-password/resets")

View File

@ -130,7 +130,7 @@ class ResetPasswordSendEmailApi(Resource):
language = "en-US"
try:
account = AccountService.get_user_through_email(args["email"])
except AccountRegisterError:
except AccountRegisterError as are:
raise AccountInFreezeError()
if account is None:
@ -162,7 +162,7 @@ class EmailCodeLoginSendEmailApi(Resource):
language = "en-US"
try:
account = AccountService.get_user_through_email(args["email"])
except AccountRegisterError:
except AccountRegisterError as are:
raise AccountInFreezeError()
if account is None:
@ -200,7 +200,7 @@ class EmailCodeLoginApi(Resource):
AccountService.revoke_email_code_login_token(args["token"])
try:
account = AccountService.get_user_through_email(user_email)
except AccountRegisterError:
except AccountRegisterError as are:
raise AccountInFreezeError()
if account:
tenants = TenantService.get_join_tenants(account)
@ -223,7 +223,7 @@ class EmailCodeLoginApi(Resource):
)
except WorkSpaceNotAllowedCreateError:
raise NotAllowedCreateWorkspace()
except AccountRegisterError:
except AccountRegisterError as are:
raise AccountInFreezeError()
except WorkspacesLimitExceededError:
raise WorkspacesLimitExceeded()

View File

@ -22,7 +22,7 @@ from services.errors.account import AccountNotFoundError, AccountRegisterError
from services.errors.workspace import WorkSpaceNotAllowedCreateError, WorkSpaceNotFoundError
from services.feature_service import FeatureService
from .. import api, console_ns
from .. import api
logger = logging.getLogger(__name__)
@ -50,13 +50,7 @@ def get_oauth_providers():
return OAUTH_PROVIDERS
@console_ns.route("/oauth/login/<provider>")
class OAuthLogin(Resource):
@api.doc("oauth_login")
@api.doc(description="Initiate OAuth login process")
@api.doc(params={"provider": "OAuth provider name (github/google)", "invite_token": "Optional invitation token"})
@api.response(302, "Redirect to OAuth authorization URL")
@api.response(400, "Invalid provider")
def get(self, provider: str):
invite_token = request.args.get("invite_token") or None
OAUTH_PROVIDERS = get_oauth_providers()
@ -69,19 +63,7 @@ class OAuthLogin(Resource):
return redirect(auth_url)
@console_ns.route("/oauth/authorize/<provider>")
class OAuthCallback(Resource):
@api.doc("oauth_callback")
@api.doc(description="Handle OAuth callback and complete login process")
@api.doc(
params={
"provider": "OAuth provider name (github/google)",
"code": "Authorization code from OAuth provider",
"state": "Optional state parameter (used for invite token)",
}
)
@api.response(302, "Redirect to console with access token")
@api.response(400, "OAuth process failed")
def get(self, provider: str):
OAUTH_PROVIDERS = get_oauth_providers()
with current_app.app_context():
@ -95,19 +77,16 @@ class OAuthCallback(Resource):
if state:
invite_token = state
if not code:
return {"error": "Authorization code is required"}, 400
try:
token = oauth_provider.get_access_token(code)
user_info = oauth_provider.get_user_info(token)
except requests.RequestException as e:
except requests.exceptions.RequestException as e:
error_text = e.response.text if e.response else str(e)
logger.exception("An error occurred during the OAuth process with %s: %s", provider, error_text)
return {"error": "OAuth process failed"}, 400
if invite_token and RegisterService.is_valid_invite_token(invite_token):
invitation = RegisterService.get_invitation_by_token(token=invite_token)
invitation = RegisterService._get_invitation_by_token(token=invite_token)
if invitation:
invitation_email = invitation.get("email", None)
if invitation_email != user_info.email:
@ -202,3 +181,7 @@ def _generate_account(provider: str, user_info: OAuthUserInfo):
AccountService.link_account_integrate(provider, user_info.id, account)
return account
api.add_resource(OAuthLogin, "/oauth/login/<provider>")
api.add_resource(OAuthCallback, "/oauth/authorize/<provider>")

View File

@ -1,9 +1,8 @@
from collections.abc import Callable
from functools import wraps
from typing import Concatenate, ParamSpec, TypeVar, cast
from typing import cast
import flask_login
from flask import jsonify, request
from flask import request
from flask_restx import Resource, reqparse
from werkzeug.exceptions import BadRequest, NotFound
@ -16,14 +15,10 @@ from services.oauth_server import OAUTH_ACCESS_TOKEN_EXPIRES_IN, OAuthGrantType,
from .. import api
P = ParamSpec("P")
R = TypeVar("R")
T = TypeVar("T")
def oauth_server_client_id_required(view: Callable[Concatenate[T, OAuthProviderApp, P], R]):
def oauth_server_client_id_required(view):
@wraps(view)
def decorated(self: T, *args: P.args, **kwargs: P.kwargs):
def decorated(*args, **kwargs):
parser = reqparse.RequestParser()
parser.add_argument("client_id", type=str, required=True, location="json")
parsed_args = parser.parse_args()
@ -35,53 +30,43 @@ def oauth_server_client_id_required(view: Callable[Concatenate[T, OAuthProviderA
if not oauth_provider_app:
raise NotFound("client_id is invalid")
return view(self, oauth_provider_app, *args, **kwargs)
kwargs["oauth_provider_app"] = oauth_provider_app
return view(*args, **kwargs)
return decorated
def oauth_server_access_token_required(view: Callable[Concatenate[T, OAuthProviderApp, Account, P], R]):
def oauth_server_access_token_required(view):
@wraps(view)
def decorated(self: T, oauth_provider_app: OAuthProviderApp, *args: P.args, **kwargs: P.kwargs):
if not isinstance(oauth_provider_app, OAuthProviderApp):
def decorated(*args, **kwargs):
oauth_provider_app = kwargs.get("oauth_provider_app")
if not oauth_provider_app or not isinstance(oauth_provider_app, OAuthProviderApp):
raise BadRequest("Invalid oauth_provider_app")
authorization_header = request.headers.get("Authorization")
if not authorization_header:
response = jsonify({"error": "Authorization header is required"})
response.status_code = 401
response.headers["WWW-Authenticate"] = "Bearer"
return response
raise BadRequest("Authorization header is required")
parts = authorization_header.strip().split(None, 1)
parts = authorization_header.strip().split(" ")
if len(parts) != 2:
response = jsonify({"error": "Invalid Authorization header format"})
response.status_code = 401
response.headers["WWW-Authenticate"] = "Bearer"
return response
raise BadRequest("Invalid Authorization header format")
token_type = parts[0].strip()
if token_type.lower() != "bearer":
response = jsonify({"error": "token_type is invalid"})
response.status_code = 401
response.headers["WWW-Authenticate"] = "Bearer"
return response
raise BadRequest("token_type is invalid")
access_token = parts[1].strip()
if not access_token:
response = jsonify({"error": "access_token is required"})
response.status_code = 401
response.headers["WWW-Authenticate"] = "Bearer"
return response
raise BadRequest("access_token is required")
account = OAuthServerService.validate_oauth_access_token(oauth_provider_app.client_id, access_token)
if not account:
response = jsonify({"error": "access_token or client_id is invalid"})
response.status_code = 401
response.headers["WWW-Authenticate"] = "Bearer"
return response
raise BadRequest("access_token or client_id is invalid")
return view(self, oauth_provider_app, account, *args, **kwargs)
kwargs["account"] = account
return view(*args, **kwargs)
return decorated
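
The two versions of oauth_server_access_token_required differ in how they split the Authorization header (split(None, 1) versus split(" ")) and in whether failures come back as 401 responses carrying a WWW-Authenticate header or as 400 BadRequest errors. A small standalone sketch of the more tolerant parsing; this is illustrative, not the project's canonical helper.

def parse_bearer_token(authorization_header: str) -> str:
    """Extract the bearer token, tolerating repeated whitespace between scheme and token."""
    parts = authorization_header.strip().split(None, 1)   # split on any whitespace, at most once
    if len(parts) != 2:
        raise ValueError("Invalid Authorization header format")
    scheme, token = parts[0].strip(), parts[1].strip()
    if scheme.lower() != "bearer":
        raise ValueError("token_type is invalid")
    if not token:
        raise ValueError("access_token is required")
    return token

print(parse_bearer_token("Bearer   abc123"))   # "abc123" even with extra spaces
# split(" ") would instead yield ['Bearer', '', '', 'abc123'] here and reject the header.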

View File

@ -1,9 +1,9 @@
from flask_login import current_user
from flask_restx import Resource, reqparse
from controllers.console import api
from controllers.console.wraps import account_initialization_required, only_edition_cloud, setup_required
from libs.login import current_user, login_required
from models.model import Account
from libs.login import login_required
from services.billing_service import BillingService
@ -17,10 +17,9 @@ class Subscription(Resource):
parser.add_argument("plan", type=str, required=True, location="args", choices=["professional", "team"])
parser.add_argument("interval", type=str, required=True, location="args", choices=["month", "year"])
args = parser.parse_args()
assert isinstance(current_user, Account)
BillingService.is_tenant_owner_or_admin(current_user)
assert current_user.current_tenant_id is not None
return BillingService.get_subscription(
args["plan"], args["interval"], current_user.email, current_user.current_tenant_id
)
@ -32,9 +31,7 @@ class Invoices(Resource):
@account_initialization_required
@only_edition_cloud
def get(self):
assert isinstance(current_user, Account)
BillingService.is_tenant_owner_or_admin(current_user)
assert current_user.current_tenant_id is not None
return BillingService.get_invoices(current_user.email, current_user.current_tenant_id)

View File

@ -10,7 +10,6 @@ from werkzeug.exceptions import NotFound
from controllers.console import api
from controllers.console.wraps import account_initialization_required, setup_required
from core.indexing_runner import IndexingRunner
from core.rag.extractor.entity.datasource_type import DatasourceType
from core.rag.extractor.entity.extract_setting import ExtractSetting
from core.rag.extractor.notion_extractor import NotionExtractor
from extensions.ext_database import db
@ -29,12 +28,14 @@ class DataSourceApi(Resource):
@marshal_with(integrate_list_fields)
def get(self):
# get workspace data source integrates
data_source_integrates = db.session.scalars(
select(DataSourceOauthBinding).where(
data_source_integrates = (
db.session.query(DataSourceOauthBinding)
.where(
DataSourceOauthBinding.tenant_id == current_user.current_tenant_id,
DataSourceOauthBinding.disabled == False,
)
).all()
.all()
)
base_url = request.url_root.rstrip("/")
data_source_oauth_base_path = "/console/api/oauth/data-source"
@ -213,7 +214,7 @@ class DataSourceNotionApi(Resource):
workspace_id = notion_info["workspace_id"]
for page in notion_info["pages"]:
extract_setting = ExtractSetting(
datasource_type=DatasourceType.NOTION.value,
datasource_type="notion_import",
notion_info={
"notion_workspace_id": workspace_id,
"notion_obj_id": page["page_id"],
@ -247,7 +248,7 @@ class DataSourceNotionDatasetSyncApi(Resource):
documents = DocumentService.get_document_by_dataset_id(dataset_id_str)
for document in documents:
document_indexing_sync_task.delay(dataset_id_str, document.id)
return {"result": "success"}, 200
return 200
class DataSourceNotionDocumentSyncApi(Resource):
@ -265,7 +266,7 @@ class DataSourceNotionDocumentSyncApi(Resource):
if document is None:
raise NotFound("Document not found.")
document_indexing_sync_task.delay(dataset_id_str, document_id_str)
return {"result": "success"}, 200
return 200
api.add_resource(DataSourceApi, "/data-source/integrates", "/data-source/integrates/<uuid:binding_id>/<string:action>")
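
Several hunks in this and the following files swap string literals for members of a DatasourceType enum (or back). The enum definition itself is not part of this diff; based on the member/literal pairings that do appear here (FILE with "upload_file", NOTION with "notion_import", WEBSITE with "website_crawl"), a plausible sketch of its shape:

from enum import Enum

class DatasourceType(Enum):
    # member names and values inferred from the pairings in these hunks; the real class
    # lives in core.rag.extractor.entity.datasource_type and may differ in detail
    FILE = "upload_file"
    NOTION = "notion_import"
    WEBSITE = "website_crawl"

assert DatasourceType.NOTION.value == "notion_import"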

View File

@ -2,7 +2,6 @@ import flask_restx
from flask import request
from flask_login import current_user
from flask_restx import Resource, marshal, marshal_with, reqparse
from sqlalchemy import select
from werkzeug.exceptions import Forbidden, NotFound
import services
@ -23,7 +22,6 @@ from core.model_runtime.entities.model_entities import ModelType
from core.plugin.entities.plugin import ModelProviderID
from core.provider_manager import ProviderManager
from core.rag.datasource.vdb.vector_type import VectorType
from core.rag.extractor.entity.datasource_type import DatasourceType
from core.rag.extractor.entity.extract_setting import ExtractSetting
from core.rag.retrieval.retrieval_methods import RetrievalMethod
from extensions.ext_database import db
@ -412,11 +410,11 @@ class DatasetIndexingEstimateApi(Resource):
extract_settings = []
if args["info_list"]["data_source_type"] == "upload_file":
file_ids = args["info_list"]["file_info_list"]["file_ids"]
file_details = db.session.scalars(
select(UploadFile).where(
UploadFile.tenant_id == current_user.current_tenant_id, UploadFile.id.in_(file_ids)
)
).all()
file_details = (
db.session.query(UploadFile)
.where(UploadFile.tenant_id == current_user.current_tenant_id, UploadFile.id.in_(file_ids))
.all()
)
if file_details is None:
raise NotFound("File not found.")
@ -424,9 +422,7 @@ class DatasetIndexingEstimateApi(Resource):
if file_details:
for file_detail in file_details:
extract_setting = ExtractSetting(
datasource_type=DatasourceType.FILE.value,
upload_file=file_detail,
document_model=args["doc_form"],
datasource_type="upload_file", upload_file=file_detail, document_model=args["doc_form"]
)
extract_settings.append(extract_setting)
elif args["info_list"]["data_source_type"] == "notion_import":
@ -435,7 +431,7 @@ class DatasetIndexingEstimateApi(Resource):
workspace_id = notion_info["workspace_id"]
for page in notion_info["pages"]:
extract_setting = ExtractSetting(
datasource_type=DatasourceType.NOTION.value,
datasource_type="notion_import",
notion_info={
"notion_workspace_id": workspace_id,
"notion_obj_id": page["page_id"],
@ -449,7 +445,7 @@ class DatasetIndexingEstimateApi(Resource):
website_info_list = args["info_list"]["website_info_list"]
for url in website_info_list["urls"]:
extract_setting = ExtractSetting(
datasource_type=DatasourceType.WEBSITE.value,
datasource_type="website_crawl",
website_info={
"provider": website_info_list["provider"],
"job_id": website_info_list["job_id"],
@ -519,11 +515,11 @@ class DatasetIndexingStatusApi(Resource):
@account_initialization_required
def get(self, dataset_id):
dataset_id = str(dataset_id)
documents = db.session.scalars(
select(Document).where(
Document.dataset_id == dataset_id, Document.tenant_id == current_user.current_tenant_id
)
).all()
documents = (
db.session.query(Document)
.where(Document.dataset_id == dataset_id, Document.tenant_id == current_user.current_tenant_id)
.all()
)
documents_status = []
for document in documents:
completed_segments = (
@ -570,11 +566,11 @@ class DatasetApiKeyApi(Resource):
@account_initialization_required
@marshal_with(api_key_list)
def get(self):
keys = db.session.scalars(
select(ApiToken).where(
ApiToken.type == self.resource_type, ApiToken.tenant_id == current_user.current_tenant_id
)
).all()
keys = (
db.session.query(ApiToken)
.where(ApiToken.type == self.resource_type, ApiToken.tenant_id == current_user.current_tenant_id)
.all()
)
return {"items": keys}
@setup_required
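
The query changes in this file are a straight swap between SQLAlchemy 2.0-style select() executed via session.scalars() and the legacy Query API; both return the same rows. A self-contained toy demonstration, where UploadFile is a stand-in model rather than the real one:

from sqlalchemy import String, create_engine, select
from sqlalchemy.orm import DeclarativeBase, Mapped, Session, mapped_column

class Base(DeclarativeBase):
    pass

class UploadFile(Base):              # toy stand-in for models.UploadFile
    __tablename__ = "upload_files"
    id: Mapped[int] = mapped_column(primary_key=True)
    tenant_id: Mapped[str] = mapped_column(String(36))

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add_all([UploadFile(id=1, tenant_id="t1"), UploadFile(id=2, tenant_id="t2")])
    session.commit()

    # 2.0-style: build a select() statement and execute it through scalars()
    new_style = session.scalars(select(UploadFile).where(UploadFile.tenant_id == "t1")).all()
    # legacy Query API, as on the other side of the hunks
    old_style = session.query(UploadFile).filter(UploadFile.tenant_id == "t1").all()

    assert [f.id for f in new_style] == [f.id for f in old_style] == [1]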

View File

@ -1,6 +1,5 @@
import logging
from argparse import ArgumentTypeError
from collections.abc import Sequence
from typing import Literal, cast
from flask import request
@ -41,7 +40,6 @@ from core.model_manager import ModelManager
from core.model_runtime.entities.model_entities import ModelType
from core.model_runtime.errors.invoke import InvokeAuthorizationError
from core.plugin.impl.exc import PluginDaemonClientSideError
from core.rag.extractor.entity.datasource_type import DatasourceType
from core.rag.extractor.entity.extract_setting import ExtractSetting
from extensions.ext_database import db
from fields.document_fields import (
@ -80,7 +78,7 @@ class DocumentResource(Resource):
return document
def get_batch_documents(self, dataset_id: str, batch: str) -> Sequence[Document]:
def get_batch_documents(self, dataset_id: str, batch: str) -> list[Document]:
dataset = DatasetService.get_dataset(dataset_id)
if not dataset:
raise NotFound("Dataset not found.")
@ -356,6 +354,9 @@ class DatasetInitApi(Resource):
parser.add_argument("embedding_model_provider", type=str, required=False, nullable=True, location="json")
args = parser.parse_args()
# The role of the current user in the ta table must be admin, owner, or editor, or dataset_operator
if not current_user.is_dataset_editor:
raise Forbidden()
knowledge_config = KnowledgeConfig(**args)
if knowledge_config.indexing_technique == "high_quality":
if knowledge_config.embedding_model is None or knowledge_config.embedding_model_provider is None:
@ -427,7 +428,7 @@ class DocumentIndexingEstimateApi(DocumentResource):
raise NotFound("File not found.")
extract_setting = ExtractSetting(
datasource_type=DatasourceType.FILE.value, upload_file=file, document_model=document.doc_form
datasource_type="upload_file", upload_file=file, document_model=document.doc_form
)
indexing_runner = IndexingRunner()
@ -476,8 +477,6 @@ class DocumentBatchIndexingEstimateApi(DocumentResource):
data_source_info = document.data_source_info_dict
if document.data_source_type == "upload_file":
if not data_source_info:
continue
file_id = data_source_info["upload_file_id"]
file_detail = (
db.session.query(UploadFile)
@ -489,15 +488,13 @@ class DocumentBatchIndexingEstimateApi(DocumentResource):
raise NotFound("File not found.")
extract_setting = ExtractSetting(
datasource_type=DatasourceType.FILE.value, upload_file=file_detail, document_model=document.doc_form
datasource_type="upload_file", upload_file=file_detail, document_model=document.doc_form
)
extract_settings.append(extract_setting)
elif document.data_source_type == "notion_import":
if not data_source_info:
continue
extract_setting = ExtractSetting(
datasource_type=DatasourceType.NOTION.value,
datasource_type="notion_import",
notion_info={
"notion_workspace_id": data_source_info["notion_workspace_id"],
"notion_obj_id": data_source_info["notion_page_id"],
@ -508,10 +505,8 @@ class DocumentBatchIndexingEstimateApi(DocumentResource):
)
extract_settings.append(extract_setting)
elif document.data_source_type == "website_crawl":
if not data_source_info:
continue
extract_setting = ExtractSetting(
datasource_type=DatasourceType.WEBSITE.value,
datasource_type="website_crawl",
website_info={
"provider": data_source_info["provider"],
"job_id": data_source_info["job_id"],

View File

@ -113,7 +113,7 @@ class DatasetMetadataBuiltInFieldActionApi(Resource):
MetadataService.enable_built_in_field(dataset)
elif action == "disable":
MetadataService.disable_built_in_field(dataset)
return {"result": "success"}, 200
return 200
class DocumentMetadataEditApi(Resource):
@ -135,7 +135,7 @@ class DocumentMetadataEditApi(Resource):
MetadataService.update_documents_metadata(dataset, metadata_args)
return {"result": "success"}, 200
return 200
api.add_resource(DatasetMetadataCreateApi, "/datasets/<uuid:dataset_id>/metadata")

View File

@ -1,5 +1,6 @@
import logging
from flask_login import current_user
from flask_restx import reqparse
from werkzeug.exceptions import InternalServerError, NotFound
@ -27,8 +28,6 @@ from extensions.ext_database import db
from libs import helper
from libs.datetime_utils import naive_utc_now
from libs.helper import uuid_value
from libs.login import current_user
from models import Account
from models.model import AppMode
from services.app_generate_service import AppGenerateService
from services.errors.llm import InvokeRateLimitError
@ -58,8 +57,6 @@ class CompletionApi(InstalledAppResource):
db.session.commit()
try:
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
response = AppGenerateService.generate(
app_model=app_model, user=current_user, args=args, invoke_from=InvokeFrom.EXPLORE, streaming=streaming
)
@ -93,8 +90,6 @@ class CompletionStopApi(InstalledAppResource):
if app_model.mode != "completion":
raise NotCompletionAppError()
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
AppQueueManager.set_stop_flag(task_id, InvokeFrom.EXPLORE, current_user.id)
return {"result": "success"}, 200
@ -122,8 +117,6 @@ class ChatApi(InstalledAppResource):
db.session.commit()
try:
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
response = AppGenerateService.generate(
app_model=app_model, user=current_user, args=args, invoke_from=InvokeFrom.EXPLORE, streaming=True
)
@ -160,8 +153,6 @@ class ChatStopApi(InstalledAppResource):
if app_mode not in {AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT}:
raise NotChatAppError()
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
AppQueueManager.set_stop_flag(task_id, InvokeFrom.EXPLORE, current_user.id)
return {"result": "success"}, 200

View File

@ -1,3 +1,4 @@
from flask_login import current_user
from flask_restx import marshal_with, reqparse
from flask_restx.inputs import int_range
from sqlalchemy.orm import Session
@ -9,8 +10,6 @@ from core.app.entities.app_invoke_entities import InvokeFrom
from extensions.ext_database import db
from fields.conversation_fields import conversation_infinite_scroll_pagination_fields, simple_conversation_fields
from libs.helper import uuid_value
from libs.login import current_user
from models import Account
from models.model import AppMode
from services.conversation_service import ConversationService
from services.errors.conversation import ConversationNotExistsError, LastConversationNotExistsError
@ -36,8 +35,6 @@ class ConversationListApi(InstalledAppResource):
pinned = args["pinned"] == "true"
try:
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
with Session(db.engine) as session:
return WebConversationService.pagination_by_last_id(
session=session,
@ -61,11 +58,10 @@ class ConversationApi(InstalledAppResource):
conversation_id = str(c_id)
try:
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
ConversationService.delete(app_model, conversation_id, current_user)
except ConversationNotExistsError:
raise NotFound("Conversation Not Exists.")
WebConversationService.unpin(app_model, conversation_id, current_user)
return {"result": "success"}, 204
@ -86,8 +82,6 @@ class ConversationRenameApi(InstalledAppResource):
args = parser.parse_args()
try:
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
return ConversationService.rename(
app_model, conversation_id, current_user, args["name"], args["auto_generate"]
)
@ -105,8 +99,6 @@ class ConversationPinApi(InstalledAppResource):
conversation_id = str(c_id)
try:
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
WebConversationService.pin(app_model, conversation_id, current_user)
except ConversationNotExistsError:
raise NotFound("Conversation Not Exists.")
@ -122,8 +114,6 @@ class ConversationUnPinApi(InstalledAppResource):
raise NotChatAppError()
conversation_id = str(c_id)
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
WebConversationService.unpin(app_model, conversation_id, current_user)
return {"result": "success"}

View File

@ -2,8 +2,9 @@ import logging
from typing import Any
from flask import request
from flask_login import current_user
from flask_restx import Resource, inputs, marshal_with, reqparse
from sqlalchemy import and_, select
from sqlalchemy import and_
from werkzeug.exceptions import BadRequest, Forbidden, NotFound
from controllers.console import api
@ -12,8 +13,8 @@ from controllers.console.wraps import account_initialization_required, cloud_edi
from extensions.ext_database import db
from fields.installed_app_fields import installed_app_list_fields
from libs.datetime_utils import naive_utc_now
from libs.login import current_user, login_required
from models import Account, App, InstalledApp, RecommendedApp
from libs.login import login_required
from models import App, InstalledApp, RecommendedApp
from services.account_service import TenantService
from services.app_service import AppService
from services.enterprise.enterprise_service import EnterpriseService
@ -28,23 +29,17 @@ class InstalledAppsListApi(Resource):
@marshal_with(installed_app_list_fields)
def get(self):
app_id = request.args.get("app_id", default=None, type=str)
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
current_tenant_id = current_user.current_tenant_id
if app_id:
installed_apps = db.session.scalars(
select(InstalledApp).where(
and_(InstalledApp.tenant_id == current_tenant_id, InstalledApp.app_id == app_id)
)
).all()
installed_apps = (
db.session.query(InstalledApp)
.where(and_(InstalledApp.tenant_id == current_tenant_id, InstalledApp.app_id == app_id))
.all()
)
else:
installed_apps = db.session.scalars(
select(InstalledApp).where(InstalledApp.tenant_id == current_tenant_id)
).all()
installed_apps = db.session.query(InstalledApp).where(InstalledApp.tenant_id == current_tenant_id).all()
if current_user.current_tenant is None:
raise ValueError("current_user.current_tenant must not be None")
current_user.role = TenantService.get_user_role(current_user, current_user.current_tenant)
installed_app_list: list[dict[str, Any]] = [
{
@ -120,8 +115,6 @@ class InstalledAppsListApi(Resource):
if recommended_app is None:
raise NotFound("App not found")
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
current_tenant_id = current_user.current_tenant_id
app = db.session.query(App).where(App.id == args["app_id"]).first()
@ -161,8 +154,6 @@ class InstalledAppApi(InstalledAppResource):
"""
def delete(self, installed_app):
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
if installed_app.app_owner_tenant_id == current_user.current_tenant_id:
raise BadRequest("You can't uninstall an app owned by the current tenant")

View File

@ -1,5 +1,6 @@
import logging
from flask_login import current_user
from flask_restx import marshal_with, reqparse
from flask_restx.inputs import int_range
from werkzeug.exceptions import InternalServerError, NotFound
@ -23,8 +24,6 @@ from core.model_runtime.errors.invoke import InvokeError
from fields.message_fields import message_infinite_scroll_pagination_fields
from libs import helper
from libs.helper import uuid_value
from libs.login import current_user
from models import Account
from models.model import AppMode
from services.app_generate_service import AppGenerateService
from services.errors.app import MoreLikeThisDisabledError
@ -55,8 +54,6 @@ class MessageListApi(InstalledAppResource):
args = parser.parse_args()
try:
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
return MessageService.pagination_by_first_id(
app_model, current_user, args["conversation_id"], args["first_id"], args["limit"]
)
@ -78,8 +75,6 @@ class MessageFeedbackApi(InstalledAppResource):
args = parser.parse_args()
try:
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
MessageService.create_feedback(
app_model=app_model,
message_id=message_id,
@ -110,8 +105,6 @@ class MessageMoreLikeThisApi(InstalledAppResource):
streaming = args["response_mode"] == "streaming"
try:
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
response = AppGenerateService.generate_more_like_this(
app_model=app_model,
user=current_user,
@ -149,8 +142,6 @@ class MessageSuggestedQuestionApi(InstalledAppResource):
message_id = str(message_id)
try:
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
questions = MessageService.get_suggested_questions_after_answer(
app_model=app_model, user=current_user, message_id=message_id, invoke_from=InvokeFrom.EXPLORE
)

View File

@ -43,8 +43,6 @@ class ExploreAppMetaApi(InstalledAppResource):
def get(self, installed_app: InstalledApp):
"""Get app meta"""
app_model = installed_app.app
if not app_model:
raise ValueError("App not found")
return AppService().get_app_meta(app_model)

View File

@ -1,10 +1,11 @@
from flask_login import current_user
from flask_restx import Resource, fields, marshal_with, reqparse
from constants.languages import languages
from controllers.console import api
from controllers.console.wraps import account_initialization_required
from libs.helper import AppIconUrlField
from libs.login import current_user, login_required
from libs.login import login_required
from services.recommended_app_service import RecommendedAppService
app_fields = {
@ -45,9 +46,8 @@ class RecommendedAppListApi(Resource):
parser.add_argument("language", type=str, location="args")
args = parser.parse_args()
language = args.get("language")
if language and language in languages:
language_prefix = language
if args.get("language") and args.get("language") in languages:
language_prefix = args.get("language")
elif current_user and current_user.interface_language:
language_prefix = current_user.interface_language
else:

View File

@ -1,3 +1,4 @@
from flask_login import current_user
from flask_restx import fields, marshal_with, reqparse
from flask_restx.inputs import int_range
from werkzeug.exceptions import NotFound
@ -7,8 +8,6 @@ from controllers.console.explore.error import NotCompletionAppError
from controllers.console.explore.wraps import InstalledAppResource
from fields.conversation_fields import message_file_fields
from libs.helper import TimestampField, uuid_value
from libs.login import current_user
from models import Account
from services.errors.message import MessageNotExistsError
from services.saved_message_service import SavedMessageService
@ -43,8 +42,6 @@ class SavedMessageListApi(InstalledAppResource):
parser.add_argument("limit", type=int_range(1, 100), required=False, default=20, location="args")
args = parser.parse_args()
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
return SavedMessageService.pagination_by_last_id(app_model, current_user, args["last_id"], args["limit"])
def post(self, installed_app):
@ -57,8 +54,6 @@ class SavedMessageListApi(InstalledAppResource):
args = parser.parse_args()
try:
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
SavedMessageService.save(app_model, current_user, args["message_id"])
except MessageNotExistsError:
raise NotFound("Message Not Exists.")
@ -75,8 +70,6 @@ class SavedMessageApi(InstalledAppResource):
if app_model.mode != "completion":
raise NotCompletionAppError()
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
SavedMessageService.delete(app_model, current_user, message_id)
return {"result": "success"}, 204

View File

@ -35,8 +35,6 @@ class InstalledAppWorkflowRunApi(InstalledAppResource):
Run workflow
"""
app_model = installed_app.app
if not app_model:
raise NotWorkflowAppError()
app_mode = AppMode.value_of(app_model.mode)
if app_mode != AppMode.WORKFLOW:
raise NotWorkflowAppError()
@ -75,8 +73,6 @@ class InstalledAppWorkflowTaskStopApi(InstalledAppResource):
Stop workflow task
"""
app_model = installed_app.app
if not app_model:
raise NotWorkflowAppError()
app_mode = AppMode.value_of(app_model.mode)
if app_mode != AppMode.WORKFLOW:
raise NotWorkflowAppError()

View File

@ -1,6 +1,4 @@
from collections.abc import Callable
from functools import wraps
from typing import Concatenate, Optional, ParamSpec, TypeVar
from flask_login import current_user
from flask_restx import Resource
@ -15,15 +13,19 @@ from services.app_service import AppService
from services.enterprise.enterprise_service import EnterpriseService
from services.feature_service import FeatureService
P = ParamSpec("P")
R = TypeVar("R")
T = TypeVar("T")
def installed_app_required(view: Optional[Callable[Concatenate[InstalledApp, P], R]] = None):
def decorator(view: Callable[Concatenate[InstalledApp, P], R]):
def installed_app_required(view=None):
def decorator(view):
@wraps(view)
def decorated(installed_app_id: str, *args: P.args, **kwargs: P.kwargs):
def decorated(*args, **kwargs):
if not kwargs.get("installed_app_id"):
raise ValueError("missing installed_app_id in path parameters")
installed_app_id = kwargs.get("installed_app_id")
installed_app_id = str(installed_app_id)
del kwargs["installed_app_id"]
installed_app = (
db.session.query(InstalledApp)
.where(
@ -50,10 +52,10 @@ def installed_app_required(view: Optional[Callable[Concatenate[InstalledApp, P],
return decorator
def user_allowed_to_access_app(view: Optional[Callable[Concatenate[InstalledApp, P], R]] = None):
def decorator(view: Callable[Concatenate[InstalledApp, P], R]):
def user_allowed_to_access_app(view=None):
def decorator(view):
@wraps(view)
def decorated(installed_app: InstalledApp, *args: P.args, **kwargs: P.kwargs):
def decorated(installed_app: InstalledApp, *args, **kwargs):
feature = FeatureService.get_system_features()
if feature.webapp_auth.enabled:
app_id = installed_app.app_id
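
The decorator hunks above move installed_app_required between a ParamSpec/Concatenate-typed signature and an untyped view=None form that pops installed_app_id out of **kwargs. A self-contained sketch of the typed pattern (Python 3.10+); app_required, the _APPS lookup table and show are made-up stand-ins for the real database query and view:

from collections.abc import Callable
from functools import wraps
from typing import Concatenate, ParamSpec, TypeVar

P = ParamSpec("P")
R = TypeVar("R")

# Stand-in for the database lookup the real decorator performs.
_APPS: dict[str, dict] = {"abc123": {"name": "demo app"}}


def app_required(view: Callable[Concatenate[dict, P], R]):
    """Resolve the app_id path parameter into an app object before calling the view."""

    @wraps(view)
    def decorated(app_id: str, *args: P.args, **kwargs: P.kwargs) -> R:
        app = _APPS.get(app_id)
        if app is None:
            raise LookupError(f"unknown app_id: {app_id}")
        return view(app, *args, **kwargs)

    return decorated


@app_required
def show(app: dict) -> str:
    return app["name"]


print(show("abc123"))  # -> demo app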

View File

@ -1,8 +1,8 @@
from flask_login import current_user
from flask_restx import Resource, fields, marshal_with, reqparse
from flask_restx import Resource, marshal_with, reqparse
from constants import HIDDEN_VALUE
from controllers.console import api, console_ns
from controllers.console import api
from controllers.console.wraps import account_initialization_required, setup_required
from fields.api_based_extension_fields import api_based_extension_fields
from libs.login import login_required
@ -11,21 +11,7 @@ from services.api_based_extension_service import APIBasedExtensionService
from services.code_based_extension_service import CodeBasedExtensionService
@console_ns.route("/code-based-extension")
class CodeBasedExtensionAPI(Resource):
@api.doc("get_code_based_extension")
@api.doc(description="Get code-based extension data by module name")
@api.expect(
api.parser().add_argument("module", type=str, required=True, location="args", help="Extension module name")
)
@api.response(
200,
"Success",
api.model(
"CodeBasedExtensionResponse",
{"module": fields.String(description="Module name"), "data": fields.Raw(description="Extension data")},
),
)
@setup_required
@login_required
@account_initialization_required
@ -37,11 +23,7 @@ class CodeBasedExtensionAPI(Resource):
return {"module": args["module"], "data": CodeBasedExtensionService.get_code_based_extension(args["module"])}
@console_ns.route("/api-based-extension")
class APIBasedExtensionAPI(Resource):
@api.doc("get_api_based_extensions")
@api.doc(description="Get all API-based extensions for current tenant")
@api.response(200, "Success", fields.List(fields.Nested(api_based_extension_fields)))
@setup_required
@login_required
@account_initialization_required
@ -50,19 +32,6 @@ class APIBasedExtensionAPI(Resource):
tenant_id = current_user.current_tenant_id
return APIBasedExtensionService.get_all_by_tenant_id(tenant_id)
@api.doc("create_api_based_extension")
@api.doc(description="Create a new API-based extension")
@api.expect(
api.model(
"CreateAPIBasedExtensionRequest",
{
"name": fields.String(required=True, description="Extension name"),
"api_endpoint": fields.String(required=True, description="API endpoint URL"),
"api_key": fields.String(required=True, description="API key for authentication"),
},
)
)
@api.response(201, "Extension created successfully", api_based_extension_fields)
@setup_required
@login_required
@account_initialization_required
@ -84,12 +53,7 @@ class APIBasedExtensionAPI(Resource):
return APIBasedExtensionService.save(extension_data)
@console_ns.route("/api-based-extension/<uuid:id>")
class APIBasedExtensionDetailAPI(Resource):
@api.doc("get_api_based_extension")
@api.doc(description="Get API-based extension by ID")
@api.doc(params={"id": "Extension ID"})
@api.response(200, "Success", api_based_extension_fields)
@setup_required
@login_required
@account_initialization_required
@ -100,20 +64,6 @@ class APIBasedExtensionDetailAPI(Resource):
return APIBasedExtensionService.get_with_tenant_id(tenant_id, api_based_extension_id)
@api.doc("update_api_based_extension")
@api.doc(description="Update API-based extension")
@api.doc(params={"id": "Extension ID"})
@api.expect(
api.model(
"UpdateAPIBasedExtensionRequest",
{
"name": fields.String(required=True, description="Extension name"),
"api_endpoint": fields.String(required=True, description="API endpoint URL"),
"api_key": fields.String(required=True, description="API key for authentication"),
},
)
)
@api.response(200, "Extension updated successfully", api_based_extension_fields)
@setup_required
@login_required
@account_initialization_required
@ -138,10 +88,6 @@ class APIBasedExtensionDetailAPI(Resource):
return APIBasedExtensionService.save(extension_data_from_db)
@api.doc("delete_api_based_extension")
@api.doc(description="Delete API-based extension")
@api.doc(params={"id": "Extension ID"})
@api.response(204, "Extension deleted successfully")
@setup_required
@login_required
@account_initialization_required
@ -154,3 +100,9 @@ class APIBasedExtensionDetailAPI(Resource):
APIBasedExtensionService.delete(extension_data_from_db)
return {"result": "success"}, 204
api.add_resource(CodeBasedExtensionAPI, "/code-based-extension")
api.add_resource(APIBasedExtensionAPI, "/api-based-extension")
api.add_resource(APIBasedExtensionDetailAPI, "/api-based-extension/<uuid:id>")
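
These extension hunks switch between flask-restx's namespace-route style, where @api.doc, @api.expect and @api.response attach OpenAPI metadata directly to the resource, and plain api.add_resource registration at module level. A minimal standalone sketch of both styles; the namespace, model and resource names here are invented for illustration:

from flask import Flask
from flask_restx import Api, Resource, fields

app = Flask(__name__)
api = Api(app)
ns = api.namespace("console", path="/console/api")

ping_model = api.model("PingResponse", {"result": fields.String(example="pong")})


@ns.route("/ping")  # namespace-route style: URL and OpenAPI metadata live on the class
class PingApi(Resource):
    @ns.doc(description="Health check endpoint for connection testing")
    @ns.response(200, "Success", ping_model)
    def get(self):
        return {"result": "pong"}


class LegacyPingApi(Resource):  # classic style: registration is a separate module-level call
    def get(self):
        return {"result": "pong"}


api.add_resource(LegacyPingApi, "/legacy-ping")

with app.test_client() as client:
    print(client.get("/console/api/ping").get_json())  # {'result': 'pong'}
    print(client.get("/legacy-ping").get_json())       # {'result': 'pong'}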

View File

@ -1,40 +1,26 @@
from flask_login import current_user
from flask_restx import Resource, fields
from flask_restx import Resource
from libs.login import login_required
from services.feature_service import FeatureService
from . import api, console_ns
from . import api
from .wraps import account_initialization_required, cloud_utm_record, setup_required
@console_ns.route("/features")
class FeatureApi(Resource):
@api.doc("get_tenant_features")
@api.doc(description="Get feature configuration for current tenant")
@api.response(
200,
"Success",
api.model("FeatureResponse", {"features": fields.Raw(description="Feature configuration object")}),
)
@setup_required
@login_required
@account_initialization_required
@cloud_utm_record
def get(self):
"""Get feature configuration for current tenant"""
return FeatureService.get_features(current_user.current_tenant_id).model_dump()
@console_ns.route("/system-features")
class SystemFeatureApi(Resource):
@api.doc("get_system_features")
@api.doc(description="Get system-wide feature configuration")
@api.response(
200,
"Success",
api.model("SystemFeatureResponse", {"features": fields.Raw(description="System feature configuration object")}),
)
def get(self):
"""Get system-wide feature configuration"""
return FeatureService.get_system_features().model_dump()
api.add_resource(FeatureApi, "/features")
api.add_resource(SystemFeatureApi, "/system-features")

View File

@ -22,7 +22,6 @@ from controllers.console.wraps import (
)
from fields.file_fields import file_fields, upload_config_fields
from libs.login import login_required
from models import Account
from services.file_service import FileService
PREVIEW_WORDS_LIMIT = 3000
@ -69,8 +68,6 @@ class FileApi(Resource):
source = None
try:
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
upload_file = FileService.upload_file(
filename=file.filename,
content=file.read(),

View File

@ -1,7 +1,7 @@
import os
from flask import session
from flask_restx import Resource, fields, reqparse
from flask_restx import Resource, reqparse
from sqlalchemy import select
from sqlalchemy.orm import Session
@ -11,47 +11,20 @@ from libs.helper import StrLen
from models.model import DifySetup
from services.account_service import TenantService
from . import api, console_ns
from . import api
from .error import AlreadySetupError, InitValidateFailedError
from .wraps import only_edition_self_hosted
@console_ns.route("/init")
class InitValidateAPI(Resource):
@api.doc("get_init_status")
@api.doc(description="Get initialization validation status")
@api.response(
200,
"Success",
model=api.model(
"InitStatusResponse",
{"status": fields.String(description="Initialization status", enum=["finished", "not_started"])},
),
)
def get(self):
"""Get initialization validation status"""
init_status = get_init_validate_status()
if init_status:
return {"status": "finished"}
return {"status": "not_started"}
@api.doc("validate_init_password")
@api.doc(description="Validate initialization password for self-hosted edition")
@api.expect(
api.model(
"InitValidateRequest",
{"password": fields.String(required=True, description="Initialization password", max_length=30)},
)
)
@api.response(
201,
"Success",
model=api.model("InitValidateResponse", {"result": fields.String(description="Operation result")}),
)
@api.response(400, "Already setup or validation failed")
@only_edition_self_hosted
def post(self):
"""Validate initialization password"""
# is tenant created
tenant_count = TenantService.get_tenant_count()
if tenant_count > 0:
@ -79,3 +52,6 @@ def get_init_validate_status():
return db_session.execute(select(DifySetup)).scalar_one_or_none()
return True
api.add_resource(InitValidateAPI, "/init")

View File

@ -1,17 +1,14 @@
from flask_restx import Resource, fields
from flask_restx import Resource
from . import api, console_ns
from controllers.console import api
@console_ns.route("/ping")
class PingApi(Resource):
@api.doc("health_check")
@api.doc(description="Health check endpoint for connection testing")
@api.response(
200,
"Success",
api.model("PingResponse", {"result": fields.String(description="Health check result", example="pong")}),
)
def get(self):
"""Health check endpoint for connection testing"""
"""
For connection health check
"""
return {"result": "pong"}
api.add_resource(PingApi, "/ping")

View File

@ -1,5 +1,5 @@
from flask import request
from flask_restx import Resource, fields, reqparse
from flask_restx import Resource, reqparse
from configs import dify_config
from libs.helper import StrLen, email, extract_remote_ip
@ -7,56 +7,23 @@ from libs.password import valid_password
from models.model import DifySetup, db
from services.account_service import RegisterService, TenantService
from . import api, console_ns
from . import api
from .error import AlreadySetupError, NotInitValidateError
from .init_validate import get_init_validate_status
from .wraps import only_edition_self_hosted
@console_ns.route("/setup")
class SetupApi(Resource):
@api.doc("get_setup_status")
@api.doc(description="Get system setup status")
@api.response(
200,
"Success",
api.model(
"SetupStatusResponse",
{
"step": fields.String(description="Setup step status", enum=["not_started", "finished"]),
"setup_at": fields.String(description="Setup completion time (ISO format)", required=False),
},
),
)
def get(self):
"""Get system setup status"""
if dify_config.EDITION == "SELF_HOSTED":
setup_status = get_setup_status()
# Check if setup_status is a DifySetup object rather than a bool
if setup_status and not isinstance(setup_status, bool):
if setup_status:
return {"step": "finished", "setup_at": setup_status.setup_at.isoformat()}
elif setup_status:
return {"step": "finished"}
return {"step": "not_started"}
return {"step": "finished"}
@api.doc("setup_system")
@api.doc(description="Initialize system setup with admin account")
@api.expect(
api.model(
"SetupRequest",
{
"email": fields.String(required=True, description="Admin email address"),
"name": fields.String(required=True, description="Admin name (max 30 characters)"),
"password": fields.String(required=True, description="Admin password"),
},
)
)
@api.response(201, "Success", api.model("SetupResponse", {"result": fields.String(description="Setup result")}))
@api.response(400, "Already setup or validation failed")
@only_edition_self_hosted
def post(self):
"""Initialize system setup with admin account"""
# is set up
if get_setup_status():
raise AlreadySetupError()
@ -88,3 +55,6 @@ def get_setup_status():
return db.session.query(DifySetup).first()
else:
return True
api.add_resource(SetupApi, "/setup")
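
The setup hunks above rework how the SELF_HOSTED branch tells a stored DifySetup row apart from a bare boolean, since get_setup_status can return either. A small self-contained sketch of that shape, with SetupRecord standing in for the DifySetup model:

from dataclasses import dataclass
from datetime import datetime


@dataclass
class SetupRecord:
    setup_at: datetime


def get_status(self_hosted: bool, record: SetupRecord | None) -> SetupRecord | bool | None:
    """Self-hosted editions return the stored row (or None); other editions are always set up."""
    return record if self_hosted else True


def setup_step(status: SetupRecord | bool | None) -> dict:
    # A truthy non-bool value is the database row, so the completion time can be reported;
    # a bare True only says that setup has finished.
    if status and not isinstance(status, bool):
        return {"step": "finished", "setup_at": status.setup_at.isoformat()}
    if status:
        return {"step": "finished"}
    return {"step": "not_started"}


print(setup_step(get_status(True, SetupRecord(datetime(2025, 9, 22, 12, 0)))))
print(setup_step(get_status(True, None)))   # {'step': 'not_started'}
print(setup_step(get_status(False, None)))  # {'step': 'finished'}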

View File

@ -111,7 +111,7 @@ class TagBindingCreateApi(Resource):
args = parser.parse_args()
TagService.save_tag_binding(args)
return {"result": "success"}, 200
return 200
class TagBindingDeleteApi(Resource):
@ -132,7 +132,7 @@ class TagBindingDeleteApi(Resource):
args = parser.parse_args()
TagService.delete_tag_binding(args)
return {"result": "success"}, 200
return 200
api.add_resource(TagListApi, "/tags")

View File

@ -2,41 +2,18 @@ import json
import logging
import requests
from flask_restx import Resource, fields, reqparse
from flask_restx import Resource, reqparse
from packaging import version
from configs import dify_config
from . import api, console_ns
from . import api
logger = logging.getLogger(__name__)
@console_ns.route("/version")
class VersionApi(Resource):
@api.doc("check_version_update")
@api.doc(description="Check for application version updates")
@api.expect(
api.parser().add_argument(
"current_version", type=str, required=True, location="args", help="Current application version"
)
)
@api.response(
200,
"Success",
api.model(
"VersionResponse",
{
"version": fields.String(description="Latest version number"),
"release_date": fields.String(description="Release date of latest version"),
"release_notes": fields.String(description="Release notes for latest version"),
"can_auto_update": fields.Boolean(description="Whether auto-update is supported"),
"features": fields.Raw(description="Feature flags and capabilities"),
},
),
)
def get(self):
"""Check for application version updates"""
parser = reqparse.RequestParser()
parser.add_argument("current_version", type=str, required=True, location="args")
args = parser.parse_args()
@ -57,14 +34,14 @@ class VersionApi(Resource):
return result
try:
response = requests.get(check_update_url, {"current_version": args["current_version"]}, timeout=(3, 10))
response = requests.get(check_update_url, {"current_version": args.get("current_version")}, timeout=(3, 10))
except Exception as error:
logger.warning("Check update version error: %s.", str(error))
result["version"] = args["current_version"]
result["version"] = args.get("current_version")
return result
content = json.loads(response.content)
if _has_new_version(latest_version=content["version"], current_version=f"{args['current_version']}"):
if _has_new_version(latest_version=content["version"], current_version=f"{args.get('current_version')}"):
result["version"] = content["version"]
result["release_date"] = content["releaseDate"]
result["release_notes"] = content["releaseNotes"]
@ -82,3 +59,6 @@ def _has_new_version(*, latest_version: str, current_version: str) -> bool:
except version.InvalidVersion:
logger.warning("Invalid version format: latest=%s, current=%s", latest_version, current_version)
return False
api.add_resource(VersionApi, "/version")
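
The version hunks above keep the packaging-based comparison in _has_new_version and only change how the query argument is read. A minimal sketch of that comparison, assuming packaging >= 22, where version.parse raises InvalidVersion for strings that are not valid PEP 440 versions:

from packaging import version


def has_new_version(latest: str, current: str) -> bool:
    """True when 'latest' is a strictly newer release than 'current'; bad input means no update."""
    try:
        return version.parse(latest) > version.parse(current)
    except version.InvalidVersion:
        return False


print(has_new_version("1.8.0", "1.7.2"))        # True
print(has_new_version("1.7.2", "1.7.2"))        # False
print(has_new_version("not-a-version", "1.7"))  # False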

View File

@ -1,6 +1,4 @@
from collections.abc import Callable
from functools import wraps
from typing import ParamSpec, TypeVar
from flask_login import current_user
from sqlalchemy.orm import Session
@ -9,17 +7,14 @@ from werkzeug.exceptions import Forbidden
from extensions.ext_database import db
from models.account import TenantPluginPermission
P = ParamSpec("P")
R = TypeVar("R")
def plugin_permission_required(
install_required: bool = False,
debug_required: bool = False,
):
def interceptor(view: Callable[P, R]):
def interceptor(view):
@wraps(view)
def decorated(*args: P.args, **kwargs: P.kwargs):
def decorated(*args, **kwargs):
user = current_user
tenant_id = user.current_tenant_id

View File

@ -49,8 +49,6 @@ class AccountInitApi(Resource):
@setup_required
@login_required
def post(self):
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
account = current_user
if account.status == "active":
@ -104,8 +102,6 @@ class AccountProfileApi(Resource):
@marshal_with(account_fields)
@enterprise_license_required
def get(self):
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
return current_user
@ -115,8 +111,6 @@ class AccountNameApi(Resource):
@account_initialization_required
@marshal_with(account_fields)
def post(self):
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
parser = reqparse.RequestParser()
parser.add_argument("name", type=str, required=True, location="json")
args = parser.parse_args()
@ -136,8 +130,6 @@ class AccountAvatarApi(Resource):
@account_initialization_required
@marshal_with(account_fields)
def post(self):
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
parser = reqparse.RequestParser()
parser.add_argument("avatar", type=str, required=True, location="json")
args = parser.parse_args()
@ -153,8 +145,6 @@ class AccountInterfaceLanguageApi(Resource):
@account_initialization_required
@marshal_with(account_fields)
def post(self):
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
parser = reqparse.RequestParser()
parser.add_argument("interface_language", type=supported_language, required=True, location="json")
args = parser.parse_args()
@ -170,8 +160,6 @@ class AccountInterfaceThemeApi(Resource):
@account_initialization_required
@marshal_with(account_fields)
def post(self):
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
parser = reqparse.RequestParser()
parser.add_argument("interface_theme", type=str, choices=["light", "dark"], required=True, location="json")
args = parser.parse_args()
@ -187,8 +175,6 @@ class AccountTimezoneApi(Resource):
@account_initialization_required
@marshal_with(account_fields)
def post(self):
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
parser = reqparse.RequestParser()
parser.add_argument("timezone", type=str, required=True, location="json")
args = parser.parse_args()
@ -208,8 +194,6 @@ class AccountPasswordApi(Resource):
@account_initialization_required
@marshal_with(account_fields)
def post(self):
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
parser = reqparse.RequestParser()
parser.add_argument("password", type=str, required=False, location="json")
parser.add_argument("new_password", type=str, required=True, location="json")
@ -244,13 +228,9 @@ class AccountIntegrateApi(Resource):
@account_initialization_required
@marshal_with(integrate_list_fields)
def get(self):
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
account = current_user
account_integrates = db.session.scalars(
select(AccountIntegrate).where(AccountIntegrate.account_id == account.id)
).all()
account_integrates = db.session.query(AccountIntegrate).where(AccountIntegrate.account_id == account.id).all()
base_url = request.url_root.rstrip("/")
oauth_base_path = "/console/api/oauth/login"
@ -288,8 +268,6 @@ class AccountDeleteVerifyApi(Resource):
@login_required
@account_initialization_required
def get(self):
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
account = current_user
token, code = AccountService.generate_account_deletion_verification_code(account)
@ -303,8 +281,6 @@ class AccountDeleteApi(Resource):
@login_required
@account_initialization_required
def post(self):
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
account = current_user
parser = reqparse.RequestParser()
@ -345,8 +321,6 @@ class EducationVerifyApi(Resource):
@cloud_edition_billing_enabled
@marshal_with(verify_fields)
def get(self):
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
account = current_user
return BillingService.EducationIdentity.verify(account.id, account.email)
@ -366,8 +340,6 @@ class EducationApi(Resource):
@only_edition_cloud
@cloud_edition_billing_enabled
def post(self):
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
account = current_user
parser = reqparse.RequestParser()
@ -385,8 +357,6 @@ class EducationApi(Resource):
@cloud_edition_billing_enabled
@marshal_with(status_fields)
def get(self):
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
account = current_user
res = BillingService.EducationIdentity.status(account.id)
@ -451,8 +421,6 @@ class ChangeEmailSendEmailApi(Resource):
raise InvalidTokenError()
user_email = reset_data.get("email", "")
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
if user_email != current_user.email:
raise InvalidEmailError()
else:
@ -533,8 +501,6 @@ class ChangeEmailResetApi(Resource):
AccountService.revoke_change_email_token(args["token"])
old_email = reset_data.get("old_email", "")
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
if current_user.email != old_email:
raise AccountNotFound()
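
One account hunk above switches the AccountIntegrate lookup between the legacy Query API and a 2.0-style select() executed through session.scalars(). A self-contained SQLAlchemy 2.0 sketch of the two equivalent forms, using an in-memory SQLite table whose name and columns are chosen only for illustration:

from sqlalchemy import String, create_engine, select
from sqlalchemy.orm import DeclarativeBase, Mapped, Session, mapped_column


class Base(DeclarativeBase):
    pass


class Integration(Base):
    __tablename__ = "integrations"
    id: Mapped[int] = mapped_column(primary_key=True)
    account_id: Mapped[str] = mapped_column(String(36))


engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add_all([Integration(account_id="a1"), Integration(account_id="a2")])
    session.commit()

    # 2.0-style: build a select() and materialize ORM objects via session.scalars()
    rows = session.scalars(select(Integration).where(Integration.account_id == "a1")).all()

    # Legacy Query API, still available and equivalent here
    legacy = session.query(Integration).filter(Integration.account_id == "a1").all()

    assert [r.id for r in rows] == [r.id for r in legacy]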

View File

@ -1,22 +1,14 @@
from flask_login import current_user
from flask_restx import Resource, fields
from flask_restx import Resource
from controllers.console import api, console_ns
from controllers.console import api
from controllers.console.wraps import account_initialization_required, setup_required
from core.model_runtime.utils.encoders import jsonable_encoder
from libs.login import login_required
from services.agent_service import AgentService
@console_ns.route("/workspaces/current/agent-providers")
class AgentProviderListApi(Resource):
@api.doc("list_agent_providers")
@api.doc(description="Get list of available agent providers")
@api.response(
200,
"Success",
fields.List(fields.Raw(description="Agent provider information")),
)
@setup_required
@login_required
@account_initialization_required
@ -29,16 +21,7 @@ class AgentProviderListApi(Resource):
return jsonable_encoder(AgentService.list_agent_providers(user_id, tenant_id))
@console_ns.route("/workspaces/current/agent-provider/<path:provider_name>")
class AgentProviderApi(Resource):
@api.doc("get_agent_provider")
@api.doc(description="Get specific agent provider details")
@api.doc(params={"provider_name": "Agent provider name"})
@api.response(
200,
"Success",
fields.Raw(description="Agent provider details"),
)
@setup_required
@login_required
@account_initialization_required
@ -47,3 +30,7 @@ class AgentProviderApi(Resource):
user_id = user.id
tenant_id = user.current_tenant_id
return jsonable_encoder(AgentService.get_agent_provider(user_id, tenant_id, provider_name))
api.add_resource(AgentProviderListApi, "/workspaces/current/agent-providers")
api.add_resource(AgentProviderApi, "/workspaces/current/agent-provider/<path:provider_name>")

View File

@ -1,8 +1,8 @@
from flask_login import current_user
from flask_restx import Resource, fields, reqparse
from flask_restx import Resource, reqparse
from werkzeug.exceptions import Forbidden
from controllers.console import api, console_ns
from controllers.console import api
from controllers.console.wraps import account_initialization_required, setup_required
from core.model_runtime.utils.encoders import jsonable_encoder
from core.plugin.impl.exc import PluginPermissionDeniedError
@ -10,26 +10,7 @@ from libs.login import login_required
from services.plugin.endpoint_service import EndpointService
@console_ns.route("/workspaces/current/endpoints/create")
class EndpointCreateApi(Resource):
@api.doc("create_endpoint")
@api.doc(description="Create a new plugin endpoint")
@api.expect(
api.model(
"EndpointCreateRequest",
{
"plugin_unique_identifier": fields.String(required=True, description="Plugin unique identifier"),
"settings": fields.Raw(required=True, description="Endpoint settings"),
"name": fields.String(required=True, description="Endpoint name"),
},
)
)
@api.response(
200,
"Endpoint created successfully",
api.model("EndpointCreateResponse", {"success": fields.Boolean(description="Operation success")}),
)
@api.response(403, "Admin privileges required")
@setup_required
@login_required
@account_initialization_required
@ -62,20 +43,7 @@ class EndpointCreateApi(Resource):
raise ValueError(e.description) from e
@console_ns.route("/workspaces/current/endpoints/list")
class EndpointListApi(Resource):
@api.doc("list_endpoints")
@api.doc(description="List plugin endpoints with pagination")
@api.expect(
api.parser()
.add_argument("page", type=int, required=True, location="args", help="Page number")
.add_argument("page_size", type=int, required=True, location="args", help="Page size")
)
@api.response(
200,
"Success",
api.model("EndpointListResponse", {"endpoints": fields.List(fields.Raw(description="Endpoint information"))}),
)
@setup_required
@login_required
@account_initialization_required
@ -102,23 +70,7 @@ class EndpointListApi(Resource):
)
@console_ns.route("/workspaces/current/endpoints/list/plugin")
class EndpointListForSinglePluginApi(Resource):
@api.doc("list_plugin_endpoints")
@api.doc(description="List endpoints for a specific plugin")
@api.expect(
api.parser()
.add_argument("page", type=int, required=True, location="args", help="Page number")
.add_argument("page_size", type=int, required=True, location="args", help="Page size")
.add_argument("plugin_id", type=str, required=True, location="args", help="Plugin ID")
)
@api.response(
200,
"Success",
api.model(
"PluginEndpointListResponse", {"endpoints": fields.List(fields.Raw(description="Endpoint information"))}
),
)
@setup_required
@login_required
@account_initialization_required
@ -148,19 +100,7 @@ class EndpointListForSinglePluginApi(Resource):
)
@console_ns.route("/workspaces/current/endpoints/delete")
class EndpointDeleteApi(Resource):
@api.doc("delete_endpoint")
@api.doc(description="Delete a plugin endpoint")
@api.expect(
api.model("EndpointDeleteRequest", {"endpoint_id": fields.String(required=True, description="Endpoint ID")})
)
@api.response(
200,
"Endpoint deleted successfully",
api.model("EndpointDeleteResponse", {"success": fields.Boolean(description="Operation success")}),
)
@api.response(403, "Admin privileges required")
@setup_required
@login_required
@account_initialization_required
@ -183,26 +123,7 @@ class EndpointDeleteApi(Resource):
}
@console_ns.route("/workspaces/current/endpoints/update")
class EndpointUpdateApi(Resource):
@api.doc("update_endpoint")
@api.doc(description="Update a plugin endpoint")
@api.expect(
api.model(
"EndpointUpdateRequest",
{
"endpoint_id": fields.String(required=True, description="Endpoint ID"),
"settings": fields.Raw(required=True, description="Updated settings"),
"name": fields.String(required=True, description="Updated name"),
},
)
)
@api.response(
200,
"Endpoint updated successfully",
api.model("EndpointUpdateResponse", {"success": fields.Boolean(description="Operation success")}),
)
@api.response(403, "Admin privileges required")
@setup_required
@login_required
@account_initialization_required
@ -233,19 +154,7 @@ class EndpointUpdateApi(Resource):
}
@console_ns.route("/workspaces/current/endpoints/enable")
class EndpointEnableApi(Resource):
@api.doc("enable_endpoint")
@api.doc(description="Enable a plugin endpoint")
@api.expect(
api.model("EndpointEnableRequest", {"endpoint_id": fields.String(required=True, description="Endpoint ID")})
)
@api.response(
200,
"Endpoint enabled successfully",
api.model("EndpointEnableResponse", {"success": fields.Boolean(description="Operation success")}),
)
@api.response(403, "Admin privileges required")
@setup_required
@login_required
@account_initialization_required
@ -268,19 +177,7 @@ class EndpointEnableApi(Resource):
}
@console_ns.route("/workspaces/current/endpoints/disable")
class EndpointDisableApi(Resource):
@api.doc("disable_endpoint")
@api.doc(description="Disable a plugin endpoint")
@api.expect(
api.model("EndpointDisableRequest", {"endpoint_id": fields.String(required=True, description="Endpoint ID")})
)
@api.response(
200,
"Endpoint disabled successfully",
api.model("EndpointDisableResponse", {"success": fields.Boolean(description="Operation success")}),
)
@api.response(403, "Admin privileges required")
@setup_required
@login_required
@account_initialization_required
@ -301,3 +198,12 @@ class EndpointDisableApi(Resource):
tenant_id=user.current_tenant_id, user_id=user.id, endpoint_id=endpoint_id
)
}
api.add_resource(EndpointCreateApi, "/workspaces/current/endpoints/create")
api.add_resource(EndpointListApi, "/workspaces/current/endpoints/list")
api.add_resource(EndpointListForSinglePluginApi, "/workspaces/current/endpoints/list/plugin")
api.add_resource(EndpointDeleteApi, "/workspaces/current/endpoints/delete")
api.add_resource(EndpointUpdateApi, "/workspaces/current/endpoints/update")
api.add_resource(EndpointEnableApi, "/workspaces/current/endpoints/enable")
api.add_resource(EndpointDisableApi, "/workspaces/current/endpoints/disable")

View File

@ -1,8 +1,8 @@
from urllib import parse
from flask import abort, request
from flask import request
from flask_login import current_user
from flask_restx import Resource, marshal_with, reqparse
from flask_restx import Resource, abort, marshal_with, reqparse
import services
from configs import dify_config
@ -41,10 +41,6 @@ class MemberListApi(Resource):
@account_initialization_required
@marshal_with(account_with_role_list_fields)
def get(self):
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
if not current_user.current_tenant:
raise ValueError("No current tenant")
members = TenantService.get_tenant_members(current_user.current_tenant)
return {"result": "success", "accounts": members}, 200
@ -69,11 +65,7 @@ class MemberInviteEmailApi(Resource):
if not TenantAccountRole.is_non_owner_role(invitee_role):
return {"code": "invalid-role", "message": "Invalid role"}, 400
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
inviter = current_user
if not inviter.current_tenant:
raise ValueError("No current tenant")
invitation_results = []
console_web_url = dify_config.CONSOLE_WEB_URL
@ -84,8 +76,6 @@ class MemberInviteEmailApi(Resource):
for invitee_email in invitee_emails:
try:
if not inviter.current_tenant:
raise ValueError("No current tenant")
token = RegisterService.invite_new_member(
inviter.current_tenant, invitee_email, interface_language, role=invitee_role, inviter=inviter
)
@ -107,7 +97,7 @@ class MemberInviteEmailApi(Resource):
return {
"result": "success",
"invitation_results": invitation_results,
"tenant_id": str(inviter.current_tenant.id) if inviter.current_tenant else "",
"tenant_id": str(current_user.current_tenant.id),
}, 201
@ -118,10 +108,6 @@ class MemberCancelInviteApi(Resource):
@login_required
@account_initialization_required
def delete(self, member_id):
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
if not current_user.current_tenant:
raise ValueError("No current tenant")
member = db.session.query(Account).where(Account.id == str(member_id)).first()
if member is None:
abort(404)
@ -137,10 +123,7 @@ class MemberCancelInviteApi(Resource):
except Exception as e:
raise ValueError(str(e))
return {
"result": "success",
"tenant_id": str(current_user.current_tenant.id) if current_user.current_tenant else "",
}, 200
return {"result": "success", "tenant_id": str(current_user.current_tenant.id)}, 200
class MemberUpdateRoleApi(Resource):
@ -158,10 +141,6 @@ class MemberUpdateRoleApi(Resource):
if not TenantAccountRole.is_valid_role(new_role):
return {"code": "invalid-role", "message": "Invalid role"}, 400
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
if not current_user.current_tenant:
raise ValueError("No current tenant")
member = db.session.get(Account, str(member_id))
if not member:
abort(404)
@ -185,10 +164,6 @@ class DatasetOperatorMemberListApi(Resource):
@account_initialization_required
@marshal_with(account_with_role_list_fields)
def get(self):
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
if not current_user.current_tenant:
raise ValueError("No current tenant")
members = TenantService.get_dataset_operator_members(current_user.current_tenant)
return {"result": "success", "accounts": members}, 200
@ -209,10 +184,6 @@ class SendOwnerTransferEmailApi(Resource):
raise EmailSendIpLimitError()
# check if the current user is the owner of the workspace
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
if not current_user.current_tenant:
raise ValueError("No current tenant")
if not TenantService.is_owner(current_user, current_user.current_tenant):
raise NotOwnerError()
@ -227,7 +198,7 @@ class SendOwnerTransferEmailApi(Resource):
account=current_user,
email=email,
language=language,
workspace_name=current_user.current_tenant.name if current_user.current_tenant else "",
workspace_name=current_user.current_tenant.name,
)
return {"result": "success", "data": token}
@ -244,10 +215,6 @@ class OwnerTransferCheckApi(Resource):
parser.add_argument("token", type=str, required=True, nullable=False, location="json")
args = parser.parse_args()
# check if the current user is the owner of the workspace
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
if not current_user.current_tenant:
raise ValueError("No current tenant")
if not TenantService.is_owner(current_user, current_user.current_tenant):
raise NotOwnerError()
@ -289,10 +256,6 @@ class OwnerTransfer(Resource):
args = parser.parse_args()
# check if the current user is the owner of the workspace
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
if not current_user.current_tenant:
raise ValueError("No current tenant")
if not TenantService.is_owner(current_user, current_user.current_tenant):
raise NotOwnerError()
@ -311,11 +274,9 @@ class OwnerTransfer(Resource):
member = db.session.get(Account, str(member_id))
if not member:
abort(404)
return # Never reached, but helps type checker
if not current_user.current_tenant:
raise ValueError("No current tenant")
if not TenantService.is_member(member, current_user.current_tenant):
else:
member_account = member
if not TenantService.is_member(member_account, current_user.current_tenant):
raise MemberNotInTenantError()
try:
@ -325,13 +286,13 @@ class OwnerTransfer(Resource):
AccountService.send_new_owner_transfer_notify_email(
account=member,
email=member.email,
workspace_name=current_user.current_tenant.name if current_user.current_tenant else "",
workspace_name=current_user.current_tenant.name,
)
AccountService.send_old_owner_transfer_notify_email(
account=current_user,
email=current_user.email,
workspace_name=current_user.current_tenant.name if current_user.current_tenant else "",
workspace_name=current_user.current_tenant.name,
new_owner_email=member.email,
)
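
The member hunks above also swap which abort is imported (flask versus flask_restx). As I understand the two helpers, both raise the matching HTTPException, but the flask-restx variant additionally accepts keyword data that its error handler folds into the JSON error body; a small sketch with an invented route and payload:

from flask import Flask
from flask_restx import Api, Resource, abort

app = Flask(__name__)
api = Api(app)


@api.route("/members/<member_id>")
class MemberApi(Resource):
    def delete(self, member_id):
        # flask_restx.abort raises the usual 404 but also carries structured data
        # that the API error handler renders into the JSON response.
        abort(404, message="member not found", member_id=member_id)


with app.test_client() as client:
    print(client.delete("/members/unknown").get_json())
    # e.g. {"message": "member not found", "member_id": "unknown"}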

View File

@ -12,7 +12,6 @@ from core.model_runtime.errors.validate import CredentialsValidateFailedError
from core.model_runtime.utils.encoders import jsonable_encoder
from libs.helper import StrLen, uuid_value
from libs.login import login_required
from models.account import Account
from services.billing_service import BillingService
from services.model_provider_service import ModelProviderService
@ -22,10 +21,6 @@ class ModelProviderListApi(Resource):
@login_required
@account_initialization_required
def get(self):
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
if not current_user.current_tenant_id:
raise ValueError("No current tenant")
tenant_id = current_user.current_tenant_id
parser = reqparse.RequestParser()
@ -50,10 +45,6 @@ class ModelProviderCredentialApi(Resource):
@login_required
@account_initialization_required
def get(self, provider: str):
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
if not current_user.current_tenant_id:
raise ValueError("No current tenant")
tenant_id = current_user.current_tenant_id
# if credential_id is not provided, return current used credential
parser = reqparse.RequestParser()
@ -71,20 +62,16 @@ class ModelProviderCredentialApi(Resource):
@login_required
@account_initialization_required
def post(self, provider: str):
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
if not current_user.is_admin_or_owner:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("credentials", type=dict, required=True, nullable=False, location="json")
parser.add_argument("name", type=StrLen(30), required=False, nullable=True, location="json")
parser.add_argument("name", type=StrLen(30), required=True, nullable=False, location="json")
args = parser.parse_args()
model_provider_service = ModelProviderService()
if not current_user.current_tenant_id:
raise ValueError("No current tenant")
try:
model_provider_service.create_provider_credential(
tenant_id=current_user.current_tenant_id,
@ -101,21 +88,17 @@ class ModelProviderCredentialApi(Resource):
@login_required
@account_initialization_required
def put(self, provider: str):
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
if not current_user.is_admin_or_owner:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("credential_id", type=uuid_value, required=True, nullable=False, location="json")
parser.add_argument("credentials", type=dict, required=True, nullable=False, location="json")
parser.add_argument("name", type=StrLen(30), required=False, nullable=True, location="json")
parser.add_argument("name", type=StrLen(30), required=True, nullable=False, location="json")
args = parser.parse_args()
model_provider_service = ModelProviderService()
if not current_user.current_tenant_id:
raise ValueError("No current tenant")
try:
model_provider_service.update_provider_credential(
tenant_id=current_user.current_tenant_id,
@ -133,16 +116,12 @@ class ModelProviderCredentialApi(Resource):
@login_required
@account_initialization_required
def delete(self, provider: str):
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
if not current_user.is_admin_or_owner:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("credential_id", type=uuid_value, required=True, nullable=False, location="json")
args = parser.parse_args()
if not current_user.current_tenant_id:
raise ValueError("No current tenant")
model_provider_service = ModelProviderService()
model_provider_service.remove_provider_credential(
tenant_id=current_user.current_tenant_id, provider=provider, credential_id=args["credential_id"]
@ -156,16 +135,12 @@ class ModelProviderCredentialSwitchApi(Resource):
@login_required
@account_initialization_required
def post(self, provider: str):
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
if not current_user.is_admin_or_owner:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("credential_id", type=str, required=True, nullable=False, location="json")
args = parser.parse_args()
if not current_user.current_tenant_id:
raise ValueError("No current tenant")
service = ModelProviderService()
service.switch_active_provider_credential(
tenant_id=current_user.current_tenant_id,
@ -180,14 +155,10 @@ class ModelProviderValidateApi(Resource):
@login_required
@account_initialization_required
def post(self, provider: str):
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
parser = reqparse.RequestParser()
parser.add_argument("credentials", type=dict, required=True, nullable=False, location="json")
args = parser.parse_args()
if not current_user.current_tenant_id:
raise ValueError("No current tenant")
tenant_id = current_user.current_tenant_id
model_provider_service = ModelProviderService()
@ -234,13 +205,9 @@ class PreferredProviderTypeUpdateApi(Resource):
@login_required
@account_initialization_required
def post(self, provider: str):
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
if not current_user.is_admin_or_owner:
raise Forbidden()
if not current_user.current_tenant_id:
raise ValueError("No current tenant")
tenant_id = current_user.current_tenant_id
parser = reqparse.RequestParser()
@ -269,11 +236,7 @@ class ModelProviderPaymentCheckoutUrlApi(Resource):
def get(self, provider: str):
if provider != "anthropic":
raise ValueError(f"provider name {provider} is invalid")
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
BillingService.is_tenant_owner_or_admin(current_user)
if not current_user.current_tenant_id:
raise ValueError("No current tenant")
data = BillingService.get_model_provider_payment_link(
provider_name=provider,
tenant_id=current_user.current_tenant_id,
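
The credential hunks above flip the name argument between optional (required=False, nullable=True) and mandatory, validated with the repo's StrLen type. A standalone sketch of that reqparse pattern, with str_len as an illustrative stand-in for libs.helper.StrLen:

from flask import Flask
from flask_restx import reqparse


def str_len(max_length: int):
    """reqparse 'type' callable that rejects over-long strings (stand-in for libs.helper.StrLen)."""

    def validate(value: str) -> str:
        if len(value) > max_length:
            raise ValueError(f"must be at most {max_length} characters")
        return value

    return validate


app = Flask(__name__)
parser = reqparse.RequestParser()
parser.add_argument("credentials", type=dict, required=True, nullable=False, location="json")
parser.add_argument("name", type=str_len(30), required=False, nullable=True, location="json")

with app.test_request_context(json={"credentials": {"api_key": "sk-..."}, "name": "primary"}):
    args = parser.parse_args()
    print(args["name"])  # -> primary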

View File

@ -219,11 +219,7 @@ class ModelProviderModelCredentialApi(Resource):
model_load_balancing_service = ModelLoadBalancingService()
is_load_balancing_enabled, load_balancing_configs = model_load_balancing_service.get_load_balancing_configs(
tenant_id=tenant_id,
provider=provider,
model=args["model"],
model_type=args["model_type"],
config_from=args.get("config_from", ""),
tenant_id=tenant_id, provider=provider, model=args["model"], model_type=args["model_type"]
)
if args.get("config_from", "") == "predefined-model":
@ -267,7 +263,7 @@ class ModelProviderModelCredentialApi(Resource):
choices=[mt.value for mt in ModelType],
location="json",
)
parser.add_argument("name", type=StrLen(30), required=False, nullable=True, location="json")
parser.add_argument("name", type=StrLen(30), required=True, nullable=False, location="json")
parser.add_argument("credentials", type=dict, required=True, nullable=False, location="json")
args = parser.parse_args()
@ -313,7 +309,7 @@ class ModelProviderModelCredentialApi(Resource):
)
parser.add_argument("credential_id", type=uuid_value, required=True, nullable=False, location="json")
parser.add_argument("credentials", type=dict, required=True, nullable=False, location="json")
parser.add_argument("name", type=StrLen(30), required=False, nullable=True, location="json")
parser.add_argument("name", type=StrLen(30), required=True, nullable=False, location="json")
args = parser.parse_args()
model_provider_service = ModelProviderService()

View File

@ -107,22 +107,6 @@ class PluginIconApi(Resource):
icon_cache_max_age = dify_config.TOOL_ICON_CACHE_MAX_AGE
return send_file(io.BytesIO(icon_bytes), mimetype=mimetype, max_age=icon_cache_max_age)
class PluginAssetApi(Resource):
@setup_required
@login_required
@account_initialization_required
def get(self):
req = reqparse.RequestParser()
req.add_argument("plugin_unique_identifier", type=str, required=True, location="args")
req.add_argument("file_name", type=str, required=True, location="args")
args = req.parse_args()
tenant_id = current_user.current_tenant_id
try:
binary = PluginService.extract_asset(tenant_id, args["plugin_unique_identifier"], args["file_name"])
return send_file(io.BytesIO(binary), mimetype="application/octet-stream")
except PluginDaemonClientSideError as e:
raise ValueError(e)
class PluginUploadFromPkgApi(Resource):
@setup_required
@ -532,18 +516,20 @@ class PluginFetchDynamicSelectOptionsApi(Resource):
parser.add_argument("provider", type=str, required=True, location="args")
parser.add_argument("action", type=str, required=True, location="args")
parser.add_argument("parameter", type=str, required=True, location="args")
parser.add_argument("credential_id", type=str, required=False, location="args")
parser.add_argument("provider_type", type=str, required=True, location="args")
args = parser.parse_args()
try:
options = PluginParameterService.get_dynamic_select_options(
tenant_id,
user_id,
args["plugin_id"],
args["provider"],
args["action"],
args["parameter"],
args["provider_type"],
tenant_id=tenant_id,
user_id=user_id,
plugin_id=args["plugin_id"],
provider=args["provider"],
action=args["action"],
parameter=args["parameter"],
credential_id=args["credential_id"],
provider_type=args["provider_type"],
)
except PluginDaemonClientSideError as e:
raise ValueError(e)
@ -659,34 +645,11 @@ class PluginAutoUpgradeExcludePluginApi(Resource):
return jsonable_encoder({"success": PluginAutoUpgradeService.exclude_plugin(tenant_id, args["plugin_id"])})
class PluginReadmeApi(Resource):
@setup_required
@login_required
@account_initialization_required
def get(self):
tenant_id = current_user.current_tenant_id
parser = reqparse.RequestParser()
parser.add_argument("plugin_unique_identifier", type=str, required=True, location="args")
parser.add_argument("language", type=str, required=False, location="args")
args = parser.parse_args()
return jsonable_encoder(
{
"readme": PluginService.fetch_plugin_readme(
tenant_id,
args["plugin_unique_identifier"],
args.get("language", "en-US")
)
}
)
api.add_resource(PluginDebuggingKeyApi, "/workspaces/current/plugin/debugging-key")
api.add_resource(PluginListApi, "/workspaces/current/plugin/list")
api.add_resource(PluginReadmeApi, "/workspaces/current/plugin/readme")
api.add_resource(PluginListLatestVersionsApi, "/workspaces/current/plugin/list/latest-versions")
api.add_resource(PluginListInstallationsFromIdsApi, "/workspaces/current/plugin/list/installations/ids")
api.add_resource(PluginIconApi, "/workspaces/current/plugin/icon")
api.add_resource(PluginAssetApi, "/workspaces/current/plugin/asset")
api.add_resource(PluginUploadFromPkgApi, "/workspaces/current/plugin/upload/pkg")
api.add_resource(PluginUploadFromGithubApi, "/workspaces/current/plugin/upload/github")
api.add_resource(PluginUploadFromBundleApi, "/workspaces/current/plugin/upload/bundle")
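
Among the plugin hunks above, PluginReadmeApi falls back to a language with args.get("language", "en-US"). With flask-restx's reqparse the fallback can also be declared on the argument itself, since parse_args() records an absent optional argument as its declared default (None when no default is given); a minimal sketch with an invented identifier:

from flask import Flask
from flask_restx import reqparse

app = Flask(__name__)

parser = reqparse.RequestParser()
parser.add_argument("plugin_unique_identifier", type=str, required=True, location="args")
# Declaring the fallback here means parse_args() already contains "en-US" when the
# query string omits ?language=..., so callers do not need a dict-style get() fallback.
parser.add_argument("language", type=str, required=False, default="en-US", location="args")

with app.test_request_context("/?plugin_unique_identifier=langgenius/github:0.0.1"):
    args = parser.parse_args()
    print(args["language"])  # -> en-US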

View File

@ -22,8 +22,8 @@ from core.mcp.error import MCPAuthError, MCPError
from core.mcp.mcp_client import MCPClient
from core.model_runtime.utils.encoders import jsonable_encoder
from core.plugin.entities.plugin import ToolProviderID
from core.plugin.entities.plugin_daemon import CredentialType
from core.plugin.impl.oauth import OAuthHandler
from core.tools.entities.tool_entities import CredentialType
from libs.helper import StrLen, alphanumeric, uuid_value
from libs.login import login_required
from services.plugin.oauth_service import OAuthProxyService
@ -865,7 +865,6 @@ class ToolProviderMCPApi(Resource):
parser.add_argument(
"sse_read_timeout", type=float, required=False, nullable=False, location="json", default=300
)
parser.add_argument("headers", type=dict, required=False, nullable=True, location="json", default={})
args = parser.parse_args()
user = current_user
if not is_valid_url(args["server_url"]):
@ -882,7 +881,6 @@ class ToolProviderMCPApi(Resource):
server_identifier=args["server_identifier"],
timeout=args["timeout"],
sse_read_timeout=args["sse_read_timeout"],
headers=args["headers"],
)
)
@ -900,7 +898,6 @@ class ToolProviderMCPApi(Resource):
parser.add_argument("server_identifier", type=str, required=True, nullable=False, location="json")
parser.add_argument("timeout", type=float, required=False, nullable=True, location="json")
parser.add_argument("sse_read_timeout", type=float, required=False, nullable=True, location="json")
parser.add_argument("headers", type=dict, required=False, nullable=True, location="json")
args = parser.parse_args()
if not is_valid_url(args["server_url"]):
if "[__HIDDEN__]" in args["server_url"]:
@ -918,7 +915,6 @@ class ToolProviderMCPApi(Resource):
server_identifier=args["server_identifier"],
timeout=args.get("timeout"),
sse_read_timeout=args.get("sse_read_timeout"),
headers=args.get("headers"),
)
return {"result": "success"}
@ -955,9 +951,6 @@ class ToolMCPAuthApi(Resource):
authed=False,
authorization_code=args["authorization_code"],
for_list=True,
headers=provider.decrypted_headers,
timeout=provider.timeout,
sse_read_timeout=provider.sse_read_timeout,
):
MCPToolManageService.update_mcp_provider_credentials(
mcp_provider=provider,
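
The MCP hunks above add or remove optional headers, timeout and sse_read_timeout fields on a provider and thread them through to the client. A generic sketch of the idea using requests rather than the project's MCPClient (whose constructor is not shown here): per-provider headers and a (connect, read) timeout tuple travel with the provider record to every outbound call.

from dataclasses import dataclass, field

import requests


@dataclass
class ProviderConfig:
    server_url: str
    headers: dict[str, str] = field(default_factory=dict)  # e.g. auth or routing headers
    timeout: float = 30.0            # connect timeout in seconds
    sse_read_timeout: float = 300.0  # how long to wait between streamed events


def probe(provider: ProviderConfig) -> int:
    """Issue one request using the provider's own headers and timeouts."""
    response = requests.get(
        provider.server_url,
        headers=provider.headers,
        timeout=(provider.timeout, provider.sse_read_timeout),
    )
    return response.status_code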

View File

@ -0,0 +1,589 @@
import logging
from flask import make_response, redirect, request
from flask_restx import Resource, reqparse
from sqlalchemy.orm import Session
from werkzeug.exceptions import BadRequest, Forbidden
from configs import dify_config
from controllers.console import api
from controllers.console.wraps import account_initialization_required, setup_required
from core.model_runtime.utils.encoders import jsonable_encoder
from core.plugin.entities.plugin import TriggerProviderID
from core.plugin.entities.plugin_daemon import CredentialType
from core.plugin.impl.oauth import OAuthHandler
from core.trigger.entities.entities import SubscriptionBuilderUpdater
from core.trigger.trigger_manager import TriggerManager
from extensions.ext_database import db
from libs.login import current_user, login_required
from models.account import Account
from services.plugin.oauth_service import OAuthProxyService
from services.trigger.trigger_provider_service import TriggerProviderService
from services.trigger.trigger_subscription_builder_service import TriggerSubscriptionBuilderService
from services.workflow_plugin_trigger_service import WorkflowPluginTriggerService
logger = logging.getLogger(__name__)
class TriggerProviderListApi(Resource):
@setup_required
@login_required
@account_initialization_required
def get(self):
"""List all trigger providers for the current tenant"""
user = current_user
assert isinstance(user, Account)
assert user.current_tenant_id is not None
return jsonable_encoder(TriggerProviderService.list_trigger_providers(user.current_tenant_id))
class TriggerProviderInfoApi(Resource):
@setup_required
@login_required
@account_initialization_required
def get(self, provider):
"""Get info for a trigger provider"""
user = current_user
assert isinstance(user, Account)
assert user.current_tenant_id is not None
return jsonable_encoder(
TriggerProviderService.get_trigger_provider(user.current_tenant_id, TriggerProviderID(provider))
)
class TriggerSubscriptionListApi(Resource):
@setup_required
@login_required
@account_initialization_required
def get(self, provider):
"""List all trigger subscriptions for the current tenant's provider"""
user = current_user
assert isinstance(user, Account)
assert user.current_tenant_id is not None
if not user.is_admin_or_owner:
raise Forbidden()
try:
return jsonable_encoder(
TriggerProviderService.list_trigger_provider_subscriptions(
tenant_id=user.current_tenant_id, provider_id=TriggerProviderID(provider)
)
)
except Exception as e:
logger.exception("Error listing trigger providers", exc_info=e)
raise
class TriggerSubscriptionBuilderCreateApi(Resource):
@setup_required
@login_required
@account_initialization_required
def post(self, provider):
"""Add a new subscription instance for a trigger provider"""
user = current_user
assert isinstance(user, Account)
assert user.current_tenant_id is not None
if not user.is_admin_or_owner:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("credential_type", type=str, required=False, nullable=True, location="json")
args = parser.parse_args()
try:
credential_type = CredentialType.of(args.get("credential_type") or CredentialType.UNAUTHORIZED.value)
subscription_builder = TriggerSubscriptionBuilderService.create_trigger_subscription_builder(
tenant_id=user.current_tenant_id,
user_id=user.id,
provider_id=TriggerProviderID(provider),
credential_type=credential_type,
)
return jsonable_encoder({"subscription_builder": subscription_builder})
except ValueError as e:
raise BadRequest(str(e))
except Exception as e:
logger.exception("Error adding provider credential", exc_info=e)
raise
class TriggerSubscriptionBuilderGetApi(Resource):
@setup_required
@login_required
@account_initialization_required
def get(self, provider, subscription_builder_id):
"""Get a subscription instance for a trigger provider"""
return jsonable_encoder(
TriggerSubscriptionBuilderService.get_subscription_builder_by_id(subscription_builder_id)
)
class TriggerSubscriptionBuilderVerifyApi(Resource):
@setup_required
@login_required
@account_initialization_required
def post(self, provider, subscription_builder_id):
"""Verify a subscription instance for a trigger provider"""
user = current_user
assert isinstance(user, Account)
assert user.current_tenant_id is not None
if not user.is_admin_or_owner:
raise Forbidden()
parser = reqparse.RequestParser()
# The credentials of the subscription builder
parser.add_argument("credentials", type=dict, required=False, nullable=True, location="json")
args = parser.parse_args()
try:
TriggerSubscriptionBuilderService.update_trigger_subscription_builder(
tenant_id=user.current_tenant_id,
provider_id=TriggerProviderID(provider),
subscription_builder_id=subscription_builder_id,
subscription_builder_updater=SubscriptionBuilderUpdater(
credentials=args.get("credentials", None),
),
)
return TriggerSubscriptionBuilderService.verify_trigger_subscription_builder(
tenant_id=user.current_tenant_id,
user_id=user.id,
provider_id=TriggerProviderID(provider),
subscription_builder_id=subscription_builder_id,
)
except Exception as e:
logger.exception("Error verifying provider credential", exc_info=e)
raise
class TriggerSubscriptionBuilderUpdateApi(Resource):
@setup_required
@login_required
@account_initialization_required
def post(self, provider, subscription_builder_id):
"""Update a subscription instance for a trigger provider"""
user = current_user
assert isinstance(user, Account)
assert user.current_tenant_id is not None
parser = reqparse.RequestParser()
# The name of the subscription builder
parser.add_argument("name", type=str, required=False, nullable=True, location="json")
# The parameters of the subscription builder
parser.add_argument("parameters", type=dict, required=False, nullable=True, location="json")
# The properties of the subscription builder
parser.add_argument("properties", type=dict, required=False, nullable=True, location="json")
# The credentials of the subscription builder
parser.add_argument("credentials", type=dict, required=False, nullable=True, location="json")
args = parser.parse_args()
try:
return jsonable_encoder(
TriggerSubscriptionBuilderService.update_trigger_subscription_builder(
tenant_id=user.current_tenant_id,
provider_id=TriggerProviderID(provider),
subscription_builder_id=subscription_builder_id,
subscription_builder_updater=SubscriptionBuilderUpdater(
name=args.get("name", None),
parameters=args.get("parameters", None),
properties=args.get("properties", None),
credentials=args.get("credentials", None),
),
)
)
except Exception as e:
logger.exception("Error updating provider credential", exc_info=e)
raise
class TriggerSubscriptionBuilderLogsApi(Resource):
@setup_required
@login_required
@account_initialization_required
def get(self, provider, subscription_builder_id):
"""Get the request logs for a subscription instance for a trigger provider"""
user = current_user
assert isinstance(user, Account)
assert user.current_tenant_id is not None
try:
logs = TriggerSubscriptionBuilderService.list_logs(subscription_builder_id)
return jsonable_encoder({"logs": [log.model_dump(mode="json") for log in logs]})
except Exception as e:
logger.exception("Error getting request logs for subscription builder", exc_info=e)
raise
class TriggerSubscriptionBuilderBuildApi(Resource):
@setup_required
@login_required
@account_initialization_required
def post(self, provider, subscription_builder_id):
"""Build a subscription instance for a trigger provider"""
user = current_user
assert isinstance(user, Account)
assert user.current_tenant_id is not None
if not user.is_admin_or_owner:
raise Forbidden()
parser = reqparse.RequestParser()
# The name of the subscription builder
parser.add_argument("name", type=str, required=False, nullable=True, location="json")
# The parameters of the subscription builder
parser.add_argument("parameters", type=dict, required=False, nullable=True, location="json")
# The properties of the subscription builder
parser.add_argument("properties", type=dict, required=False, nullable=True, location="json")
# The credentials of the subscription builder
parser.add_argument("credentials", type=dict, required=False, nullable=True, location="json")
args = parser.parse_args()
try:
TriggerSubscriptionBuilderService.update_trigger_subscription_builder(
tenant_id=user.current_tenant_id,
provider_id=TriggerProviderID(provider),
subscription_builder_id=subscription_builder_id,
subscription_builder_updater=SubscriptionBuilderUpdater(
name=args.get("name", None),
parameters=args.get("parameters", None),
properties=args.get("properties", None),
),
)
TriggerSubscriptionBuilderService.build_trigger_subscription_builder(
tenant_id=user.current_tenant_id,
user_id=user.id,
provider_id=TriggerProviderID(provider),
subscription_builder_id=subscription_builder_id,
)
return 200
except ValueError as e:
raise BadRequest(str(e))
except Exception as e:
logger.exception("Error building provider credential", exc_info=e)
raise
class TriggerSubscriptionDeleteApi(Resource):
@setup_required
@login_required
@account_initialization_required
def post(self, subscription_id):
"""Delete a subscription instance"""
user = current_user
assert isinstance(user, Account)
assert user.current_tenant_id is not None
if not user.is_admin_or_owner:
raise Forbidden()
try:
with Session(db.engine) as session:
# Delete trigger provider subscription
TriggerProviderService.delete_trigger_provider(
session=session,
tenant_id=user.current_tenant_id,
subscription_id=subscription_id,
)
# Delete plugin triggers
WorkflowPluginTriggerService.delete_plugin_trigger_by_subscription(
session=session,
tenant_id=user.current_tenant_id,
subscription_id=subscription_id,
)
session.commit()
return {"result": "success"}
except ValueError as e:
raise BadRequest(str(e))
except Exception as e:
logger.exception("Error deleting provider credential", exc_info=e)
raise
class TriggerOAuthAuthorizeApi(Resource):
@setup_required
@login_required
@account_initialization_required
def get(self, provider):
"""Initiate OAuth authorization flow for a trigger provider"""
user = current_user
assert isinstance(user, Account)
assert user.current_tenant_id is not None
try:
provider_id = TriggerProviderID(provider)
plugin_id = provider_id.plugin_id
provider_name = provider_id.provider_name
tenant_id = user.current_tenant_id
# Get OAuth client configuration
oauth_client_params = TriggerProviderService.get_oauth_client(
tenant_id=tenant_id,
provider_id=provider_id,
)
if oauth_client_params is None:
raise Forbidden("No OAuth client configuration found for this trigger provider")
# Create subscription builder
subscription_builder = TriggerSubscriptionBuilderService.create_trigger_subscription_builder(
tenant_id=tenant_id,
user_id=user.id,
provider_id=provider_id,
credential_type=CredentialType.OAUTH2,
)
# Create OAuth handler and proxy context
oauth_handler = OAuthHandler()
context_id = OAuthProxyService.create_proxy_context(
user_id=user.id,
tenant_id=tenant_id,
plugin_id=plugin_id,
provider=provider_name,
extra_data={
"subscription_builder_id": subscription_builder.id,
},
)
# Build redirect URI for callback
redirect_uri = f"{dify_config.CONSOLE_API_URL}/console/api/oauth/plugin/{provider}/trigger/callback"
# Get authorization URL
authorization_url_response = oauth_handler.get_authorization_url(
tenant_id=tenant_id,
user_id=user.id,
plugin_id=plugin_id,
provider=provider_name,
redirect_uri=redirect_uri,
system_credentials=oauth_client_params,
)
# Create response with cookie
response = make_response(
jsonable_encoder(
{
"authorization_url": authorization_url_response.authorization_url,
"subscription_builder_id": subscription_builder.id,
"subscription_builder": subscription_builder,
}
)
)
response.set_cookie(
"context_id",
context_id,
httponly=True,
samesite="Lax",
max_age=OAuthProxyService.__MAX_AGE__,
)
return response
except Exception as e:
logger.exception("Error initiating OAuth flow", exc_info=e)
raise
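A rough sketch of the authorize leg from the client's point of view, assuming the oauth/authorize route registered at the bottom of this file; the HttpOnly context_id cookie set here must reach the callback from the same browser session, so a plain script can only inspect the JSON payload. URLs, ids, and the session cookie are placeholders.
import requests

BASE = "https://cloud.example.com/console/api"      # hypothetical console API base
provider = "langgenius/github_trigger/github"       # hypothetical provider id

with requests.Session() as s:
    s.headers["Cookie"] = "session=<console-session>"   # placeholder authentication
    resp = s.get(
        f"{BASE}/workspaces/current/trigger-provider/{provider}"
        "/subscriptions/oauth/authorize",
        timeout=30,
    )
    payload = resp.json()
    # The browser is then sent to the provider's consent page; the context_id
    # cookie ties the eventual callback back to this subscription builder.
    print(payload["authorization_url"])
    print(payload["subscription_builder_id"])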
class TriggerOAuthCallbackApi(Resource):
@setup_required
def get(self, provider):
"""Handle OAuth callback for trigger provider"""
context_id = request.cookies.get("context_id")
if not context_id:
raise Forbidden("context_id not found")
# Use and validate proxy context
context = OAuthProxyService.use_proxy_context(context_id)
if context is None:
raise Forbidden("Invalid context_id")
# Parse provider ID
provider_id = TriggerProviderID(provider)
plugin_id = provider_id.plugin_id
provider_name = provider_id.provider_name
user_id = context.get("user_id")
tenant_id = context.get("tenant_id")
subscription_builder_id = context.get("subscription_builder_id")
# Get OAuth client configuration
oauth_client_params = TriggerProviderService.get_oauth_client(
tenant_id=tenant_id,
provider_id=provider_id,
)
if oauth_client_params is None:
raise Forbidden("No OAuth client configuration found for this trigger provider")
# Get OAuth credentials from callback
oauth_handler = OAuthHandler()
redirect_uri = f"{dify_config.CONSOLE_API_URL}/console/api/oauth/plugin/{provider}/trigger/callback"
credentials_response = oauth_handler.get_credentials(
tenant_id=tenant_id,
user_id=user_id,
plugin_id=plugin_id,
provider=provider_name,
redirect_uri=redirect_uri,
system_credentials=oauth_client_params,
request=request,
)
credentials = credentials_response.credentials
expires_at = credentials_response.expires_at
if not credentials:
raise Exception("Failed to get OAuth credentials")
# Update subscription builder
TriggerSubscriptionBuilderService.update_trigger_subscription_builder(
tenant_id=tenant_id,
provider_id=provider_id,
subscription_builder_id=subscription_builder_id,
subscription_builder_updater=SubscriptionBuilderUpdater(
credentials=credentials,
credential_expires_at=expires_at,
),
)
# Redirect to OAuth callback page
return redirect(f"{dify_config.CONSOLE_WEB_URL}/oauth-callback")
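The authorize and callback endpoints above hand state to each other through a short-lived, single-use proxy context whose id travels in an HttpOnly cookie. A minimal sketch of that pattern, assuming a Redis-style TTL store; the real OAuthProxyService may differ in naming, storage, and expiry.
import json
import secrets

import redis  # assumption: redis-py is available

MAX_AGE = 5 * 60          # seconds; placeholder for OAuthProxyService.__MAX_AGE__
store = redis.Redis()     # placeholder connection

def create_proxy_context(**data) -> str:
    """Persist caller-supplied context under a random id with a TTL."""
    context_id = secrets.token_urlsafe(32)
    store.setex(f"oauth:proxy:{context_id}", MAX_AGE, json.dumps(data))
    return context_id

def use_proxy_context(context_id: str) -> dict | None:
    """Return the stored context once, then discard it (single use)."""
    key = f"oauth:proxy:{context_id}"
    raw = store.get(key)
    if raw is None:
        return None
    store.delete(key)
    return json.loads(raw)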
class TriggerOAuthClientManageApi(Resource):
@setup_required
@login_required
@account_initialization_required
def get(self, provider):
"""Get OAuth client configuration for a provider"""
user = current_user
assert isinstance(user, Account)
assert user.current_tenant_id is not None
if not user.is_admin_or_owner:
raise Forbidden()
try:
provider_id = TriggerProviderID(provider)
# Get custom OAuth client params if exists
custom_params = TriggerProviderService.get_custom_oauth_client_params(
tenant_id=user.current_tenant_id,
provider_id=provider_id,
)
# Check if custom client is enabled
is_custom_enabled = TriggerProviderService.is_oauth_custom_client_enabled(
tenant_id=user.current_tenant_id,
provider_id=provider_id,
)
# Check if there's a system OAuth client
system_client = TriggerProviderService.get_oauth_client(
tenant_id=user.current_tenant_id,
provider_id=provider_id,
)
provider_controller = TriggerManager.get_trigger_provider(user.current_tenant_id, provider_id)
redirect_uri = f"{dify_config.CONSOLE_API_URL}/console/api/oauth/plugin/{provider}/trigger/callback"
return jsonable_encoder(
{
"configured": bool(custom_params or system_client),
"oauth_client_schema": provider_controller.get_oauth_client_schema(),
"custom_configured": bool(custom_params),
"custom_enabled": is_custom_enabled,
"redirect_uri": redirect_uri,
"params": custom_params if custom_params else {},
}
)
except Exception as e:
logger.exception("Error getting OAuth client", exc_info=e)
raise
@setup_required
@login_required
@account_initialization_required
def post(self, provider):
"""Configure custom OAuth client for a provider"""
user = current_user
assert isinstance(user, Account)
assert user.current_tenant_id is not None
if not user.is_admin_or_owner:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("client_params", type=dict, required=False, nullable=True, location="json")
parser.add_argument("enabled", type=bool, required=False, nullable=True, location="json")
args = parser.parse_args()
try:
provider_id = TriggerProviderID(provider)
return TriggerProviderService.save_custom_oauth_client_params(
tenant_id=user.current_tenant_id,
provider_id=provider_id,
client_params=args.get("client_params"),
enabled=args.get("enabled"),
)
except ValueError as e:
raise BadRequest(str(e))
except Exception as e:
logger.exception("Error configuring OAuth client", exc_info=e)
raise
@setup_required
@login_required
@account_initialization_required
def delete(self, provider):
"""Remove custom OAuth client configuration"""
user = current_user
assert isinstance(user, Account)
assert user.current_tenant_id is not None
if not user.is_admin_or_owner:
raise Forbidden()
try:
provider_id = TriggerProviderID(provider)
return TriggerProviderService.delete_custom_oauth_client_params(
tenant_id=user.current_tenant_id,
provider_id=provider_id,
)
except ValueError as e:
raise BadRequest(str(e))
except Exception as e:
logger.exception("Error removing OAuth client", exc_info=e)
raise
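A hedged example of exercising the three methods above from a console client; the accepted client_params keys ultimately come from the provider's oauth_client_schema returned by GET, and every URL, id, and credential below is a placeholder.
import requests

BASE = "https://cloud.example.com/console/api"      # hypothetical console API base
HEADERS = {"Cookie": "session=<console-session>"}   # placeholder authentication
provider = "langgenius/github_trigger/github"       # hypothetical provider id
url = f"{BASE}/workspaces/current/trigger-provider/{provider}/oauth/client"

# Inspect the schema, redirect_uri, and current configuration state.
config = requests.get(url, headers=HEADERS, timeout=30).json()

# Save custom OAuth client params and enable them (field names depend on the schema).
requests.post(
    url,
    headers=HEADERS,
    timeout=30,
    json={
        "client_params": {"client_id": "<id>", "client_secret": "<secret>"},
        "enabled": True,
    },
)

# Fall back to the system client by removing the custom configuration.
requests.delete(url, headers=HEADERS, timeout=30)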
# Trigger Subscription
api.add_resource(TriggerProviderListApi, "/workspaces/current/triggers")
api.add_resource(TriggerProviderInfoApi, "/workspaces/current/trigger-provider/<path:provider>/info")
api.add_resource(TriggerSubscriptionListApi, "/workspaces/current/trigger-provider/<path:provider>/subscriptions/list")
api.add_resource(
TriggerSubscriptionDeleteApi,
"/workspaces/current/trigger-provider/<path:subscription_id>/subscriptions/delete",
)
# Trigger Subscription Builder
api.add_resource(
TriggerSubscriptionBuilderCreateApi,
"/workspaces/current/trigger-provider/<path:provider>/subscriptions/builder/create",
)
api.add_resource(
TriggerSubscriptionBuilderGetApi,
"/workspaces/current/trigger-provider/<path:provider>/subscriptions/builder/<path:subscription_builder_id>",
)
api.add_resource(
TriggerSubscriptionBuilderUpdateApi,
"/workspaces/current/trigger-provider/<path:provider>/subscriptions/builder/update/<path:subscription_builder_id>",
)
api.add_resource(
TriggerSubscriptionBuilderVerifyApi,
"/workspaces/current/trigger-provider/<path:provider>/subscriptions/builder/verify/<path:subscription_builder_id>",
)
api.add_resource(
TriggerSubscriptionBuilderBuildApi,
"/workspaces/current/trigger-provider/<path:provider>/subscriptions/builder/build/<path:subscription_builder_id>",
)
api.add_resource(
TriggerSubscriptionBuilderLogsApi,
"/workspaces/current/trigger-provider/<path:provider>/subscriptions/builder/logs/<path:subscription_builder_id>",
)
# OAuth
api.add_resource(
TriggerOAuthAuthorizeApi, "/workspaces/current/trigger-provider/<path:provider>/subscriptions/oauth/authorize"
)
api.add_resource(TriggerOAuthCallbackApi, "/oauth/plugin/<path:provider>/trigger/callback")
api.add_resource(TriggerOAuthClientManageApi, "/workspaces/current/trigger-provider/<path:provider>/oauth/client")

View File

@ -25,7 +25,7 @@ from controllers.console.wraps import (
from extensions.ext_database import db
from libs.helper import TimestampField
from libs.login import login_required
from models.account import Account, Tenant, TenantStatus
from models.account import Tenant, TenantStatus
from services.account_service import TenantService
from services.feature_service import FeatureService
from services.file_service import FileService
@ -70,8 +70,6 @@ class TenantListApi(Resource):
@login_required
@account_initialization_required
def get(self):
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
tenants = TenantService.get_join_tenants(current_user)
tenant_dicts = []
@ -85,7 +83,7 @@ class TenantListApi(Resource):
"status": tenant.status,
"created_at": tenant.created_at,
"plan": features.billing.subscription.plan if features.billing.enabled else "sandbox",
"current": tenant.id == current_user.current_tenant_id if current_user.current_tenant_id else False,
"current": tenant.id == current_user.current_tenant_id,
}
tenant_dicts.append(tenant_dict)
@ -127,11 +125,7 @@ class TenantApi(Resource):
if request.path == "/info":
logger.warning("Deprecated URL /info was used.")
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
tenant = current_user.current_tenant
if not tenant:
raise ValueError("No current tenant")
if tenant.status == TenantStatus.ARCHIVE:
tenants = TenantService.get_join_tenants(current_user)
@ -143,8 +137,6 @@ class TenantApi(Resource):
else:
raise Unauthorized("workspace is archived")
if not tenant:
raise ValueError("No tenant available")
return WorkspaceService.get_tenant_info(tenant), 200
@ -153,8 +145,6 @@ class SwitchWorkspaceApi(Resource):
@login_required
@account_initialization_required
def post(self):
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
parser = reqparse.RequestParser()
parser.add_argument("tenant_id", type=str, required=True, location="json")
args = parser.parse_args()
@ -178,15 +168,11 @@ class CustomConfigWorkspaceApi(Resource):
@account_initialization_required
@cloud_edition_billing_resource_check("workspace_custom")
def post(self):
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
parser = reqparse.RequestParser()
parser.add_argument("remove_webapp_brand", type=bool, location="json")
parser.add_argument("replace_webapp_logo", type=str, location="json")
args = parser.parse_args()
if not current_user.current_tenant_id:
raise ValueError("No current tenant")
tenant = db.get_or_404(Tenant, current_user.current_tenant_id)
custom_config_dict = {
@ -208,8 +194,6 @@ class WebappLogoWorkspaceApi(Resource):
@account_initialization_required
@cloud_edition_billing_resource_check("workspace_custom")
def post(self):
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
# check file
if "file" not in request.files:
raise NoFileUploadedError()
@ -248,14 +232,10 @@ class WorkspaceInfoApi(Resource):
@account_initialization_required
# Change workspace name
def post(self):
if not isinstance(current_user, Account):
raise ValueError("Invalid user account")
parser = reqparse.RequestParser()
parser.add_argument("name", type=str, required=True, location="json")
args = parser.parse_args()
if not current_user.current_tenant_id:
raise ValueError("No current tenant")
tenant = db.get_or_404(Tenant, current_user.current_tenant_id)
tenant.name = args["name"]
db.session.commit()

View File

@ -2,9 +2,7 @@ import contextlib
import json
import os
import time
from collections.abc import Callable
from functools import wraps
from typing import ParamSpec, TypeVar
from flask import abort, request
from flask_login import current_user
@ -21,13 +19,10 @@ from services.operation_service import OperationService
from .error import NotInitValidateError, NotSetupError, UnauthorizedAndForceLogout
P = ParamSpec("P")
R = TypeVar("R")
def account_initialization_required(view: Callable[P, R]):
def account_initialization_required(view):
@wraps(view)
def decorated(*args: P.args, **kwargs: P.kwargs):
def decorated(*args, **kwargs):
# check account initialization
account = current_user
@ -39,9 +34,9 @@ def account_initialization_required(view: Callable[P, R]):
return decorated
def only_edition_cloud(view: Callable[P, R]):
def only_edition_cloud(view):
@wraps(view)
def decorated(*args: P.args, **kwargs: P.kwargs):
def decorated(*args, **kwargs):
if dify_config.EDITION != "CLOUD":
abort(404)
@ -50,9 +45,9 @@ def only_edition_cloud(view: Callable[P, R]):
return decorated
def only_edition_enterprise(view: Callable[P, R]):
def only_edition_enterprise(view):
@wraps(view)
def decorated(*args: P.args, **kwargs: P.kwargs):
def decorated(*args, **kwargs):
if not dify_config.ENTERPRISE_ENABLED:
abort(404)
@ -61,9 +56,9 @@ def only_edition_enterprise(view: Callable[P, R]):
return decorated
def only_edition_self_hosted(view: Callable[P, R]):
def only_edition_self_hosted(view):
@wraps(view)
def decorated(*args: P.args, **kwargs: P.kwargs):
def decorated(*args, **kwargs):
if dify_config.EDITION != "SELF_HOSTED":
abort(404)
@ -72,9 +67,9 @@ def only_edition_self_hosted(view: Callable[P, R]):
return decorated
def cloud_edition_billing_enabled(view: Callable[P, R]):
def cloud_edition_billing_enabled(view):
@wraps(view)
def decorated(*args: P.args, **kwargs: P.kwargs):
def decorated(*args, **kwargs):
features = FeatureService.get_features(current_user.current_tenant_id)
if not features.billing.enabled:
abort(403, "Billing feature is not enabled.")
@ -84,9 +79,9 @@ def cloud_edition_billing_enabled(view: Callable[P, R]):
def cloud_edition_billing_resource_check(resource: str):
def interceptor(view: Callable[P, R]):
def interceptor(view):
@wraps(view)
def decorated(*args: P.args, **kwargs: P.kwargs):
def decorated(*args, **kwargs):
features = FeatureService.get_features(current_user.current_tenant_id)
if features.billing.enabled:
members = features.members
@ -125,9 +120,9 @@ def cloud_edition_billing_resource_check(resource: str):
def cloud_edition_billing_knowledge_limit_check(resource: str):
def interceptor(view: Callable[P, R]):
def interceptor(view):
@wraps(view)
def decorated(*args: P.args, **kwargs: P.kwargs):
def decorated(*args, **kwargs):
features = FeatureService.get_features(current_user.current_tenant_id)
if features.billing.enabled:
if resource == "add_segment":
@ -147,9 +142,9 @@ def cloud_edition_billing_knowledge_limit_check(resource: str):
def cloud_edition_billing_rate_limit_check(resource: str):
def interceptor(view: Callable[P, R]):
def interceptor(view):
@wraps(view)
def decorated(*args: P.args, **kwargs: P.kwargs):
def decorated(*args, **kwargs):
if resource == "knowledge":
knowledge_rate_limit = FeatureService.get_knowledge_rate_limit(current_user.current_tenant_id)
if knowledge_rate_limit.enabled:
@ -181,9 +176,9 @@ def cloud_edition_billing_rate_limit_check(resource: str):
return interceptor
def cloud_utm_record(view: Callable[P, R]):
def cloud_utm_record(view):
@wraps(view)
def decorated(*args: P.args, **kwargs: P.kwargs):
def decorated(*args, **kwargs):
with contextlib.suppress(Exception):
features = FeatureService.get_features(current_user.current_tenant_id)
@ -199,9 +194,9 @@ def cloud_utm_record(view: Callable[P, R]):
return decorated
def setup_required(view: Callable[P, R]):
def setup_required(view):
@wraps(view)
def decorated(*args: P.args, **kwargs: P.kwargs):
def decorated(*args, **kwargs):
# check setup
if (
dify_config.EDITION == "SELF_HOSTED"
@ -217,9 +212,9 @@ def setup_required(view: Callable[P, R]):
return decorated
def enterprise_license_required(view: Callable[P, R]):
def enterprise_license_required(view):
@wraps(view)
def decorated(*args: P.args, **kwargs: P.kwargs):
def decorated(*args, **kwargs):
settings = FeatureService.get_system_features()
if settings.license.status in [LicenseStatus.INACTIVE, LicenseStatus.EXPIRED, LicenseStatus.LOST]:
raise UnauthorizedAndForceLogout("Your license is invalid. Please contact your administrator.")
@ -229,9 +224,9 @@ def enterprise_license_required(view: Callable[P, R]):
return decorated
def email_password_login_enabled(view: Callable[P, R]):
def email_password_login_enabled(view):
@wraps(view)
def decorated(*args: P.args, **kwargs: P.kwargs):
def decorated(*args, **kwargs):
features = FeatureService.get_system_features()
if features.enable_email_password_login:
return view(*args, **kwargs)
@ -242,9 +237,9 @@ def email_password_login_enabled(view: Callable[P, R]):
return decorated
def enable_change_email(view: Callable[P, R]):
def enable_change_email(view):
@wraps(view)
def decorated(*args: P.args, **kwargs: P.kwargs):
def decorated(*args, **kwargs):
features = FeatureService.get_system_features()
if features.enable_change_email:
return view(*args, **kwargs)
@ -255,9 +250,9 @@ def enable_change_email(view: Callable[P, R]):
return decorated
def is_allow_transfer_owner(view: Callable[P, R]):
def is_allow_transfer_owner(view):
@wraps(view)
def decorated(*args: P.args, **kwargs: P.kwargs):
def decorated(*args, **kwargs):
features = FeatureService.get_features(current_user.current_tenant_id)
if features.is_allow_transfer_workspace:
return view(*args, **kwargs)
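For reference, the typed-decorator pattern visible on one side of this diff, written out as a self-contained sketch; the decorator name is generic and the guard check is elided, but the ParamSpec/TypeVar plumbing is exactly the part the hunks above add or remove.
from collections.abc import Callable
from functools import wraps
from typing import ParamSpec, TypeVar

P = ParamSpec("P")
R = TypeVar("R")

def example_guard(view: Callable[P, R]) -> Callable[P, R]:
    """Generic stand-in for the guards above; preserves the view's signature."""
    @wraps(view)
    def decorated(*args: P.args, **kwargs: P.kwargs) -> R:
        # ... run the guard's check here, aborting or raising on failure ...
        return view(*args, **kwargs)
    return decorated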

View File

@ -10,10 +10,11 @@ api = ExternalApi(
version="1.0",
title="Files API",
description="API for file operations including upload and preview",
doc="/docs", # Enable Swagger UI at /files/docs
)
files_ns = Namespace("files", description="File operations", path="/")
from . import image_preview, tool_files, upload # pyright: ignore[reportUnusedImport]
from . import image_preview, tool_files, upload
api.add_namespace(files_ns)

View File

@ -10,13 +10,14 @@ api = ExternalApi(
version="1.0",
title="Inner API",
description="Internal APIs for enterprise features, billing, and plugin communication",
doc="/docs", # Enable Swagger UI at /inner/api/docs
)
# Create namespace
inner_api_ns = Namespace("inner_api", description="Internal API operations", path="/")
from . import mail as _mail # pyright: ignore[reportUnusedImport]
from .plugin import plugin as _plugin # pyright: ignore[reportUnusedImport]
from .workspace import workspace as _workspace # pyright: ignore[reportUnusedImport]
from . import mail
from .plugin import plugin
from .workspace import workspace
api.add_namespace(inner_api_ns)

View File

@ -37,9 +37,9 @@ from models.model import EndUser
@inner_api_ns.route("/invoke/llm")
class PluginInvokeLLMApi(Resource):
@get_user_tenant
@setup_required
@plugin_inner_api_only
@get_user_tenant
@plugin_data(payload_type=RequestInvokeLLM)
@inner_api_ns.doc("plugin_invoke_llm")
@inner_api_ns.doc(description="Invoke LLM models through plugin interface")
@ -60,9 +60,9 @@ class PluginInvokeLLMApi(Resource):
@inner_api_ns.route("/invoke/llm/structured-output")
class PluginInvokeLLMWithStructuredOutputApi(Resource):
@get_user_tenant
@setup_required
@plugin_inner_api_only
@get_user_tenant
@plugin_data(payload_type=RequestInvokeLLMWithStructuredOutput)
@inner_api_ns.doc("plugin_invoke_llm_structured")
@inner_api_ns.doc(description="Invoke LLM models with structured output through plugin interface")
@ -85,9 +85,9 @@ class PluginInvokeLLMWithStructuredOutputApi(Resource):
@inner_api_ns.route("/invoke/text-embedding")
class PluginInvokeTextEmbeddingApi(Resource):
@get_user_tenant
@setup_required
@plugin_inner_api_only
@get_user_tenant
@plugin_data(payload_type=RequestInvokeTextEmbedding)
@inner_api_ns.doc("plugin_invoke_text_embedding")
@inner_api_ns.doc(description="Invoke text embedding models through plugin interface")
@ -115,9 +115,9 @@ class PluginInvokeTextEmbeddingApi(Resource):
@inner_api_ns.route("/invoke/rerank")
class PluginInvokeRerankApi(Resource):
@get_user_tenant
@setup_required
@plugin_inner_api_only
@get_user_tenant
@plugin_data(payload_type=RequestInvokeRerank)
@inner_api_ns.doc("plugin_invoke_rerank")
@inner_api_ns.doc(description="Invoke rerank models through plugin interface")
@ -141,9 +141,9 @@ class PluginInvokeRerankApi(Resource):
@inner_api_ns.route("/invoke/tts")
class PluginInvokeTTSApi(Resource):
@get_user_tenant
@setup_required
@plugin_inner_api_only
@get_user_tenant
@plugin_data(payload_type=RequestInvokeTTS)
@inner_api_ns.doc("plugin_invoke_tts")
@inner_api_ns.doc(description="Invoke text-to-speech models through plugin interface")
@ -168,9 +168,9 @@ class PluginInvokeTTSApi(Resource):
@inner_api_ns.route("/invoke/speech2text")
class PluginInvokeSpeech2TextApi(Resource):
@get_user_tenant
@setup_required
@plugin_inner_api_only
@get_user_tenant
@plugin_data(payload_type=RequestInvokeSpeech2Text)
@inner_api_ns.doc("plugin_invoke_speech2text")
@inner_api_ns.doc(description="Invoke speech-to-text models through plugin interface")
@ -194,9 +194,9 @@ class PluginInvokeSpeech2TextApi(Resource):
@inner_api_ns.route("/invoke/moderation")
class PluginInvokeModerationApi(Resource):
@get_user_tenant
@setup_required
@plugin_inner_api_only
@get_user_tenant
@plugin_data(payload_type=RequestInvokeModeration)
@inner_api_ns.doc("plugin_invoke_moderation")
@inner_api_ns.doc(description="Invoke moderation models through plugin interface")
@ -220,9 +220,9 @@ class PluginInvokeModerationApi(Resource):
@inner_api_ns.route("/invoke/tool")
class PluginInvokeToolApi(Resource):
@get_user_tenant
@setup_required
@plugin_inner_api_only
@get_user_tenant
@plugin_data(payload_type=RequestInvokeTool)
@inner_api_ns.doc("plugin_invoke_tool")
@inner_api_ns.doc(description="Invoke tools through plugin interface")
@ -252,9 +252,9 @@ class PluginInvokeToolApi(Resource):
@inner_api_ns.route("/invoke/parameter-extractor")
class PluginInvokeParameterExtractorNodeApi(Resource):
@get_user_tenant
@setup_required
@plugin_inner_api_only
@get_user_tenant
@plugin_data(payload_type=RequestInvokeParameterExtractorNode)
@inner_api_ns.doc("plugin_invoke_parameter_extractor")
@inner_api_ns.doc(description="Invoke parameter extractor node through plugin interface")
@ -285,9 +285,9 @@ class PluginInvokeParameterExtractorNodeApi(Resource):
@inner_api_ns.route("/invoke/question-classifier")
class PluginInvokeQuestionClassifierNodeApi(Resource):
@get_user_tenant
@setup_required
@plugin_inner_api_only
@get_user_tenant
@plugin_data(payload_type=RequestInvokeQuestionClassifierNode)
@inner_api_ns.doc("plugin_invoke_question_classifier")
@inner_api_ns.doc(description="Invoke question classifier node through plugin interface")
@ -318,9 +318,9 @@ class PluginInvokeQuestionClassifierNodeApi(Resource):
@inner_api_ns.route("/invoke/app")
class PluginInvokeAppApi(Resource):
@get_user_tenant
@setup_required
@plugin_inner_api_only
@get_user_tenant
@plugin_data(payload_type=RequestInvokeApp)
@inner_api_ns.doc("plugin_invoke_app")
@inner_api_ns.doc(description="Invoke application through plugin interface")
@ -348,9 +348,9 @@ class PluginInvokeAppApi(Resource):
@inner_api_ns.route("/invoke/encrypt")
class PluginInvokeEncryptApi(Resource):
@get_user_tenant
@setup_required
@plugin_inner_api_only
@get_user_tenant
@plugin_data(payload_type=RequestInvokeEncrypt)
@inner_api_ns.doc("plugin_invoke_encrypt")
@inner_api_ns.doc(description="Encrypt or decrypt data through plugin interface")
@ -375,9 +375,9 @@ class PluginInvokeEncryptApi(Resource):
@inner_api_ns.route("/invoke/summary")
class PluginInvokeSummaryApi(Resource):
@get_user_tenant
@setup_required
@plugin_inner_api_only
@get_user_tenant
@plugin_data(payload_type=RequestInvokeSummary)
@inner_api_ns.doc("plugin_invoke_summary")
@inner_api_ns.doc(description="Invoke summary functionality through plugin interface")
@ -405,9 +405,9 @@ class PluginInvokeSummaryApi(Resource):
@inner_api_ns.route("/upload/file/request")
class PluginUploadFileRequestApi(Resource):
@get_user_tenant
@setup_required
@plugin_inner_api_only
@get_user_tenant
@plugin_data(payload_type=RequestRequestUploadFile)
@inner_api_ns.doc("plugin_upload_file_request")
@inner_api_ns.doc(description="Request signed URL for file upload through plugin interface")
@ -426,9 +426,9 @@ class PluginUploadFileRequestApi(Resource):
@inner_api_ns.route("/fetch/app/info")
class PluginFetchAppInfoApi(Resource):
@get_user_tenant
@setup_required
@plugin_inner_api_only
@get_user_tenant
@plugin_data(payload_type=RequestFetchAppInfo)
@inner_api_ns.doc("plugin_fetch_app_info")
@inner_api_ns.doc(description="Fetch application information through plugin interface")
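The only change in the hunks above is where @get_user_tenant sits in each decorator stack. Order matters because decorators apply bottom-up: the last listed wraps the view first, and the first listed ends up outermost and runs first on every request. A small self-contained illustration:
from functools import wraps

def tag(label):
    def deco(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            print("enter", label)
            return fn(*args, **kwargs)
        return wrapper
    return deco

@tag("outer")   # listed first, applied last, runs first at call time
@tag("inner")   # listed last, applied first, runs closest to the view
def view():
    return "ok"

view()  # prints "enter outer" then "enter inner"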

View File

@ -1,6 +1,6 @@
from collections.abc import Callable
from functools import wraps
from typing import Optional, ParamSpec, TypeVar, cast
from typing import Optional
from flask import current_app, request
from flask_login import user_logged_in
@ -8,72 +8,65 @@ from flask_restx import reqparse
from pydantic import BaseModel
from sqlalchemy.orm import Session
from core.file.constants import DEFAULT_SERVICE_API_USER_ID
from extensions.ext_database import db
from libs.login import current_user
from models.account import Tenant
from libs.login import _get_user
from models.account import Account, Tenant
from models.model import EndUser
P = ParamSpec("P")
R = TypeVar("R")
from services.account_service import AccountService
def get_user(tenant_id: str, user_id: str | None) -> EndUser:
"""
Get current user
NOTE: user_id is not trusted, it could be maliciously set to any value.
As a result, it could only be considered as an end user id.
"""
def get_user(tenant_id: str, user_id: str | None) -> Account | EndUser:
try:
with Session(db.engine) as session:
if not user_id:
user_id = DEFAULT_SERVICE_API_USER_ID
user_model = (
session.query(EndUser)
.where(
EndUser.session_id == user_id,
EndUser.tenant_id == tenant_id,
)
.first()
)
if not user_model:
user_model = EndUser(
tenant_id=tenant_id,
type="service_api",
is_anonymous=user_id == DEFAULT_SERVICE_API_USER_ID,
session_id=user_id,
)
session.add(user_model)
session.commit()
session.refresh(user_model)
user_id = "DEFAULT-USER"
if user_id == "DEFAULT-USER":
user_model = session.query(EndUser).where(EndUser.session_id == "DEFAULT-USER").first()
if not user_model:
user_model = EndUser(
tenant_id=tenant_id,
type="service_api",
is_anonymous=True if user_id == "DEFAULT-USER" else False,
session_id=user_id,
)
session.add(user_model)
session.commit()
session.refresh(user_model)
else:
user_model = AccountService.load_user(user_id)
if not user_model:
user_model = session.query(EndUser).where(EndUser.id == user_id).first()
if not user_model:
raise ValueError("user not found")
except Exception:
raise ValueError("user not found")
return user_model
def get_user_tenant(view: Optional[Callable[P, R]] = None):
def decorator(view_func: Callable[P, R]):
def get_user_tenant(view: Optional[Callable] = None):
def decorator(view_func):
@wraps(view_func)
def decorated_view(*args: P.args, **kwargs: P.kwargs):
def decorated_view(*args, **kwargs):
# fetch json body
parser = reqparse.RequestParser()
parser.add_argument("tenant_id", type=str, required=True, location="json")
parser.add_argument("user_id", type=str, required=True, location="json")
p = parser.parse_args()
kwargs = parser.parse_args()
user_id = cast(str, p.get("user_id"))
tenant_id = cast(str, p.get("tenant_id"))
user_id = kwargs.get("user_id")
tenant_id = kwargs.get("tenant_id")
if not tenant_id:
raise ValueError("tenant_id is required")
if not user_id:
user_id = DEFAULT_SERVICE_API_USER_ID
user_id = "DEFAULT-USER"
del kwargs["tenant_id"]
del kwargs["user_id"]
try:
tenant_model = (
@ -95,7 +88,7 @@ def get_user_tenant(view: Optional[Callable[P, R]] = None):
kwargs["user_model"] = user
current_app.login_manager._update_request_context_with_user(user) # type: ignore
user_logged_in.send(current_app._get_current_object(), user=current_user) # type: ignore
user_logged_in.send(current_app._get_current_object(), user=_get_user()) # type: ignore
return view_func(*args, **kwargs)
@ -107,9 +100,9 @@ def get_user_tenant(view: Optional[Callable[P, R]] = None):
return decorator(view)
def plugin_data(view: Optional[Callable[P, R]] = None, *, payload_type: type[BaseModel]):
def decorator(view_func: Callable[P, R]):
def decorated_view(*args: P.args, **kwargs: P.kwargs):
def plugin_data(view: Optional[Callable] = None, *, payload_type: type[BaseModel]):
def decorator(view_func):
def decorated_view(*args, **kwargs):
try:
data = request.get_json()
except Exception:

View File

@ -46,9 +46,9 @@ def enterprise_inner_api_only(view: Callable[P, R]):
return decorated
def enterprise_inner_api_user_auth(view: Callable[P, R]):
def enterprise_inner_api_user_auth(view):
@wraps(view)
def decorated(*args: P.args, **kwargs: P.kwargs):
def decorated(*args, **kwargs):
if not dify_config.INNER_API:
return view(*args, **kwargs)

View File

@ -10,10 +10,11 @@ api = ExternalApi(
version="1.0",
title="MCP API",
description="API for Model Context Protocol operations",
doc="/docs", # Enable Swagger UI at /mcp/docs
)
mcp_ns = Namespace("mcp", description="MCP operations", path="/")
from . import mcp # pyright: ignore[reportUnusedImport]
from . import mcp
api.add_namespace(mcp_ns)

View File

@ -9,10 +9,9 @@ from controllers.console.app.mcp_server import AppMCPServerStatus
from controllers.mcp import mcp_ns
from core.app.app_config.entities import VariableEntity
from core.mcp import types as mcp_types
from core.mcp.server.streamable_http import handle_mcp_request
from extensions.ext_database import db
from libs import helper
from models.model import App, AppMCPServer, AppMode, EndUser
from models.model import App, AppMCPServer, AppMode
class MCPRequestError(Exception):
@ -99,7 +98,7 @@ class MCPAppApi(Resource):
return mcp_server, app
def _validate_server_status(self, mcp_server: AppMCPServer):
def _validate_server_status(self, mcp_server: AppMCPServer) -> None:
"""Validate MCP server status"""
if mcp_server.status != AppMCPServerStatus.ACTIVE:
raise MCPRequestError(mcp_types.INVALID_REQUEST, "Server is not active")
@ -195,50 +194,6 @@ class MCPAppApi(Resource):
except ValidationError as e:
raise MCPRequestError(mcp_types.INVALID_PARAMS, f"Invalid MCP request: {str(e)}")
def _retrieve_end_user(self, tenant_id: str, mcp_server_id: str, session: Session) -> EndUser | None:
"""Get end user from existing session - optimized query"""
return (
session.query(EndUser)
.where(EndUser.tenant_id == tenant_id)
.where(EndUser.session_id == mcp_server_id)
.where(EndUser.type == "mcp")
.first()
)
def _create_end_user(
self, client_name: str, tenant_id: str, app_id: str, mcp_server_id: str, session: Session
) -> EndUser:
"""Create end user in existing session"""
end_user = EndUser(
tenant_id=tenant_id,
app_id=app_id,
type="mcp",
name=client_name,
session_id=mcp_server_id,
)
session.add(end_user)
session.flush() # Use flush instead of commit to keep transaction open
session.refresh(end_user)
return end_user
def _handle_mcp_request(
self,
app: App,
mcp_server: AppMCPServer,
mcp_request: mcp_types.ClientRequest,
user_input_form: list[VariableEntity],
session: Session,
request_id: Union[int, str],
) -> mcp_types.JSONRPCResponse | mcp_types.JSONRPCError | None:
"""Handle MCP request and return response"""
end_user = self._retrieve_end_user(mcp_server.tenant_id, mcp_server.id, session)
if not end_user and isinstance(mcp_request.root, mcp_types.InitializeRequest):
client_info = mcp_request.root.params.clientInfo
client_name = f"{client_info.name}@{client_info.version}"
# Commit the session before creating end user to avoid transaction conflicts
session.commit()
with Session(db.engine, expire_on_commit=False) as create_session, create_session.begin():
end_user = self._create_end_user(client_name, app.tenant_id, app.id, mcp_server.id, create_session)
return handle_mcp_request(app, mcp_request, user_input_form, mcp_server, end_user, request_id)
mcp_server_handler = MCPServerStreamableHTTPRequestHandler(app, request, converted_user_input_form)
response = mcp_server_handler.handle()
return helper.compact_generate_response(response)

View File

@ -10,31 +10,14 @@ api = ExternalApi(
version="1.0",
title="Service API",
description="API for application services",
doc="/docs", # Enable Swagger UI at /v1/docs
)
service_api_ns = Namespace("service_api", description="Service operations", path="/")
from . import index # pyright: ignore[reportUnusedImport]
from .app import (
annotation, # pyright: ignore[reportUnusedImport]
app, # pyright: ignore[reportUnusedImport]
audio, # pyright: ignore[reportUnusedImport]
completion, # pyright: ignore[reportUnusedImport]
conversation, # pyright: ignore[reportUnusedImport]
file, # pyright: ignore[reportUnusedImport]
file_preview, # pyright: ignore[reportUnusedImport]
message, # pyright: ignore[reportUnusedImport]
site, # pyright: ignore[reportUnusedImport]
workflow, # pyright: ignore[reportUnusedImport]
)
from .dataset import (
dataset, # pyright: ignore[reportUnusedImport]
document, # pyright: ignore[reportUnusedImport]
hit_testing, # pyright: ignore[reportUnusedImport]
metadata, # pyright: ignore[reportUnusedImport]
segment, # pyright: ignore[reportUnusedImport]
upload_file, # pyright: ignore[reportUnusedImport]
)
from .workspace import models # pyright: ignore[reportUnusedImport]
from . import index
from .app import annotation, app, audio, completion, conversation, file, file_preview, message, site, workflow
from .dataset import dataset, document, hit_testing, metadata, segment, upload_file
from .workspace import models
api.add_namespace(service_api_ns)

View File

@ -55,7 +55,7 @@ class AudioApi(Resource):
file = request.files["file"]
try:
response = AudioService.transcript_asr(app_model=app_model, file=file, end_user=end_user.id)
response = AudioService.transcript_asr(app_model=app_model, file=file, end_user=end_user)
return response
except services.errors.app_model_config.AppModelConfigBrokenError:

View File

@ -1,5 +1,4 @@
from flask_restx import Resource, reqparse
from flask_restx._http import HTTPStatus
from flask_restx.inputs import int_range
from sqlalchemy.orm import Session
from werkzeug.exceptions import BadRequest, NotFound
@ -122,7 +121,7 @@ class ConversationDetailApi(Resource):
}
)
@validate_app_token(fetch_user_arg=FetchUserArg(fetch_from=WhereisUserArg.JSON))
@service_api_ns.marshal_with(build_conversation_delete_model(service_api_ns), code=HTTPStatus.NO_CONTENT)
@service_api_ns.marshal_with(build_conversation_delete_model(service_api_ns), code=204)
def delete(self, app_model: App, end_user: EndUser, c_id):
"""Delete a specific conversation."""
app_mode = AppMode.value_of(app_model.mode)

View File

@ -59,7 +59,7 @@ class FilePreviewApi(Resource):
args = file_preview_parser.parse_args()
# Validate file ownership and get file objects
_, upload_file = self._validate_file_ownership(file_id, app_model.id)
message_file, upload_file = self._validate_file_ownership(file_id, app_model.id)
# Get file content generator
try:

View File

@ -30,7 +30,6 @@ from extensions.ext_database import db
from fields.document_fields import document_fields, document_status_fields
from libs.login import current_user
from models.dataset import Dataset, Document, DocumentSegment
from models.model import EndUser
from services.dataset_service import DatasetService, DocumentService
from services.entities.knowledge_entities.knowledge_entities import KnowledgeConfig
from services.file_service import FileService
@ -299,9 +298,6 @@ class DocumentAddByFileApi(DatasetApiResource):
if not file.filename:
raise FilenameNotExistsError
if not isinstance(current_user, EndUser):
raise ValueError("Invalid user account")
upload_file = FileService.upload_file(
filename=file.filename,
content=file.read(),
@ -391,8 +387,6 @@ class DocumentUpdateByFileApi(DatasetApiResource):
raise FilenameNotExistsError
try:
if not isinstance(current_user, EndUser):
raise ValueError("Invalid user account")
upload_file = FileService.upload_file(
filename=file.filename,
content=file.read(),
@ -416,7 +410,7 @@ class DocumentUpdateByFileApi(DatasetApiResource):
DocumentService.document_create_args_validate(knowledge_config)
try:
documents, _ = DocumentService.save_document_with_dataset_id(
documents, batch = DocumentService.save_document_with_dataset_id(
dataset=dataset,
knowledge_config=knowledge_config,
account=dataset.created_by_account,

Some files were not shown because too many files have changed in this diff.