diff --git a/api/.env.example b/api/.env.example
index 3d5d866fed..b5dbfff238 100644
--- a/api/.env.example
+++ b/api/.env.example
@@ -33,6 +33,9 @@ TRIGGER_URL=http://localhost:5001
 # The time in seconds after the signature is rejected
 FILES_ACCESS_TIMEOUT=300
 
+# Collaboration mode toggle
+ENABLE_COLLABORATION_MODE=false
+
 # Access token expiration time in minutes
 ACCESS_TOKEN_EXPIRE_MINUTES=60
 
diff --git a/api/README.md b/api/README.md
index 794b05d3af..9d89b490b0 100644
--- a/api/README.md
+++ b/api/README.md
@@ -1,6 +1,6 @@
 # Dify Backend API
 
-## Usage
+## Setup and Run
 
 > [!IMPORTANT]
 >
@@ -8,48 +8,77 @@
 > [`uv`](https://docs.astral.sh/uv/) as the package manager
 > for Dify API backend service.
 
-1. Start the docker-compose stack
+`uv` and `pnpm` are required to run the setup and development commands below.
 
-   The backend require some middleware, including PostgreSQL, Redis, and Weaviate, which can be started together using `docker-compose`.
+### Using scripts (recommended)
+
+The scripts resolve paths relative to their location, so you can run them from anywhere.
+
+1. Run setup (copies env files and installs dependencies).
 
    ```bash
-   cd ../docker
-   cp middleware.env.example middleware.env
-   # change the profile to mysql if you are not using postgres,change the profile to other vector database if you are not using weaviate
-   docker compose -f docker-compose.middleware.yaml --profile postgresql --profile weaviate -p dify up -d
-   cd ../api
+   ./dev/setup
   ```
 
-1. Copy `.env.example` to `.env`
+1. Review `api/.env`, `web/.env.local`, and `docker/middleware.env` values (see the `SECRET_KEY` note below).
 
-   ```cli
-   cp .env.example .env
+1. Start middleware (PostgreSQL/Redis/Weaviate).
+
+   ```bash
+   ./dev/start-docker-compose
   ```
 
-> [!IMPORTANT]
->
-> When the frontend and backend run on different subdomains, set COOKIE_DOMAIN to the site’s top-level domain (e.g., `example.com`). The frontend and backend must be under the same top-level domain in order to share authentication cookies.
+1. Start backend (runs migrations first).
 
-1. Generate a `SECRET_KEY` in the `.env` file.
-
-   bash for Linux
-
-   ```bash for Linux
-   sed -i "/^SECRET_KEY=/c\SECRET_KEY=$(openssl rand -base64 42)" .env
+   ```bash
+   ./dev/start-api
   ```
 
-   bash for Mac
+1. Start Dify [web](../web) service.
 
-   ```bash for Mac
-   secret_key=$(openssl rand -base64 42)
-   sed -i '' "/^SECRET_KEY=/c\\
-   SECRET_KEY=${secret_key}" .env
+   ```bash
+   ./dev/start-web
+   ```
 
-1. Create environment.
+1. Set up your application by visiting `http://localhost:3000`.
 
-   Dify API service uses [UV](https://docs.astral.sh/uv/) to manage dependencies.
-   First, you need to add the uv package manager, if you don't have it already.
+1. Optional: start the worker service (async tasks, runs from `api`).
+
+   ```bash
+   ./dev/start-worker
+   ```
+
+1. Optional: start Celery Beat (scheduled tasks).
+
+   ```bash
+   ./dev/start-beat
+   ```
+
+### Manual commands
+
+<details>
+<summary>Show manual setup and run steps</summary>
+
+These commands assume you start from the repository root.
+
+1. Start the docker-compose stack.
+
+   The backend requires middleware, including PostgreSQL, Redis, and Weaviate, which can be started together using `docker-compose`.
+
+   ```bash
+   cp docker/middleware.env.example docker/middleware.env
+   # Use mysql or another vector database profile if you are not using postgres/weaviate.
+   docker compose -f docker/docker-compose.middleware.yaml --profile postgresql --profile weaviate -p dify up -d
+   ```
+
+1. Copy env files.
+
+   ```bash
+   cp api/.env.example api/.env
+   cp web/.env.example web/.env.local
+   ```
+
+1. Install UV if needed.
 
    ```bash
    pip install uv
@@ -57,60 +86,96 @@
    brew install uv
   ```
 
-1. Install dependencies
+1. Install API dependencies.
 
    ```bash
-   uv sync --dev
+   cd api
+   uv sync --group dev
   ```
 
-1. Run migrate
-
-   Before the first launch, migrate the database to the latest version.
+1. Install web dependencies.
 
    ```bash
+   cd web
+   pnpm install
+   cd ..
+   ```
+
+1. Start backend (runs migrations first, in a new terminal).
+
+   ```bash
+   cd api
    uv run flask db upgrade
-   ```
-
-1. Start backend
-
-   ```bash
    uv run flask run --host 0.0.0.0 --port=5001 --debug
   ```
 
-1. Start Dify [web](../web) service.
+1. Start Dify [web](../web) service (in a new terminal).
 
-1. Setup your application by visiting `http://localhost:3000`.
+   ```bash
+   cd web
+   pnpm dev:inspect
+   ```
 
-1. If you need to handle and debug the async tasks (e.g. dataset importing and documents indexing), please start the worker service.
+1. Set up your application by visiting `http://localhost:3000`.
 
-```bash
-uv run celery -A app.celery worker -P threads -c 2 --loglevel INFO -Q dataset,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor,retention
-```
+1. Optional: start the worker service (async tasks, in a new terminal).
 
-Additionally, if you want to debug the celery scheduled tasks, you can run the following command in another terminal to start the beat service:
+   ```bash
+   cd api
+   uv run celery -A app.celery worker -P threads -c 2 --loglevel INFO -Q dataset,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor,retention
+   ```
 
-```bash
-uv run celery -A app.celery beat
-```
+1. Optional: start Celery Beat (scheduled tasks, in a new terminal).
+
+   ```bash
+   cd api
+   uv run celery -A app.celery beat
+   ```
+
+</details>
+
+### Environment notes
+
+> [!IMPORTANT]
+>
+> When the frontend and backend run on different subdomains, set COOKIE_DOMAIN to the site’s top-level domain (e.g., `example.com`). The frontend and backend must be under the same top-level domain in order to share authentication cookies.
+
+- Generate a `SECRET_KEY` in the `.env` file.
+
+  bash for Linux
+
+  ```bash
+  sed -i "/^SECRET_KEY=/c\\SECRET_KEY=$(openssl rand -base64 42)" .env
+  ```
+
+  bash for Mac
+
+  ```bash
+  secret_key=$(openssl rand -base64 42)
+  sed -i '' "/^SECRET_KEY=/c\\
+  SECRET_KEY=${secret_key}" .env
+  ```
 
 ## Testing
 
 1. Install dependencies for both the backend and the test environment
 
    ```bash
-   uv sync --dev
+   cd api
+   uv sync --group dev
   ```
 
 1. Run the tests locally with mocked system environment variables in `tool.pytest_env` section in `pyproject.toml`, more can check [Claude.md](../CLAUDE.md)
 
    ```bash
+   cd api
    uv run pytest                          # Run all tests
    uv run pytest tests/unit_tests/        # Unit tests only
    uv run pytest tests/integration_tests/ # Integration tests
 
    # Code quality
-   ../dev/reformat              # Run all formatters and linters
-   uv run ruff check --fix ./   # Fix linting issues
-   uv run ruff format ./        # Format code
-   uv run basedpyright .        # Type checking
+   ./dev/reformat               # Run all formatters and linters
+   uv run ruff check --fix ./   # Fix linting issues
+   uv run ruff format ./        # Format code
+   uv run basedpyright .        # Type checking
   ```
diff --git a/api/app.py b/api/app.py
index 99f70f32d5..1d611eac58 100644
--- a/api/app.py
+++ b/api/app.py
@@ -1,3 +1,4 @@
+import os
 import sys
 
@@ -8,10 +9,15 @@ def is_db_command() -> bool:
 
 # create app
+flask_app = None
+socketio_app = None
+
 if is_db_command():
     from app_factory import create_migrations_app
 
     app = create_migrations_app()
+    socketio_app = app
+    flask_app = app
 else:
     # Gunicorn and Celery handle monkey patching automatically in production by
     # specifying the `gevent` worker class. Manual monkey patching is not required here.
@@ -22,8 +28,15 @@ else:
 
     from app_factory import create_app
 
-    app = create_app()
-    celery = app.extensions["celery"]
+    socketio_app, flask_app = create_app()
+    app = flask_app
+    celery = flask_app.extensions["celery"]
 
 if __name__ == "__main__":
-    app.run(host="0.0.0.0", port=5001)
+    from gevent import pywsgi
+    from geventwebsocket.handler import WebSocketHandler  # type: ignore[reportMissingTypeStubs]
+
+    host = os.environ.get("HOST", "0.0.0.0")
+    port = int(os.environ.get("PORT", 5001))
+    server = pywsgi.WSGIServer((host, port), socketio_app, handler_class=WebSocketHandler)
+    server.serve_forever()
diff --git a/api/app_factory.py b/api/app_factory.py
index 07859a3758..ab50165a24 100644
--- a/api/app_factory.py
+++ b/api/app_factory.py
@@ -1,6 +1,7 @@
 import logging
 import time
 
+import socketio  # type: ignore[reportMissingTypeStubs]
 from opentelemetry.trace import get_current_span
 from opentelemetry.trace.span import INVALID_SPAN_ID, INVALID_TRACE_ID
 
@@ -8,6 +9,7 @@ from configs import dify_config
 from contexts.wrapper import RecyclableContextVar
 from core.logging.context import init_request_context
 from dify_app import DifyApp
+from extensions.ext_socketio import sio
 
 logger = logging.getLogger(__name__)
 
@@ -60,14 +62,18 @@ def create_flask_app_with_configs() -> DifyApp:
     return dify_app
 
 
-def create_app() -> DifyApp:
+def create_app() -> tuple[socketio.WSGIApp, DifyApp]:
     start_time = time.perf_counter()
     app = create_flask_app_with_configs()
     initialize_extensions(app)
+
+    sio.app = app
+    socketio_app = socketio.WSGIApp(sio, app)
+
     end_time = time.perf_counter()
     if dify_config.DEBUG:
         logger.info("Finished create_app (%s ms)", round((end_time - start_time) * 1000, 2))
-    return app
+    return socketio_app, app
 
 
 def initialize_extensions(app: DifyApp):
diff --git a/api/configs/feature/__init__.py b/api/configs/feature/__init__.py
index b8876c661c..1792190e34 100644
--- a/api/configs/feature/__init__.py
+++ b/api/configs/feature/__init__.py
@@ -1240,6 +1240,13 @@ class PositionConfig(BaseSettings):
         return {item.strip() for item in self.POSITION_TOOL_EXCLUDES.split(",") if item.strip() != ""}
 
 
+class CollaborationConfig(BaseSettings):
+    ENABLE_COLLABORATION_MODE: bool = Field(
+        description="Whether to enable collaboration mode features across the workspace",
+        default=False,
+    )
+
+
 class LoginConfig(BaseSettings):
     ENABLE_EMAIL_CODE_LOGIN: bool = Field(
         description="whether to enable email code login",
@@ -1359,6 +1366,7 @@ class FeatureConfig(
     WorkflowConfig,
     WorkflowNodeExecutionConfig,
     WorkspaceConfig,
+    CollaborationConfig,
     LoginConfig,
     AccountConfig,
     SwaggerUIConfig,
diff --git a/api/controllers/console/__init__.py b/api/controllers/console/__init__.py
index 0c19ee9fc5..3decf4d116 100644
--- a/api/controllers/console/__init__.py
+++ b/api/controllers/console/__init__.py
@@ -64,6 +64,7 @@ from .app import (
     statistic,
     workflow,
     workflow_app_log,
+    workflow_comment,
     workflow_draft_variable,
     workflow_run,
     workflow_statistic,
@@ -115,6 +116,7 @@ from .explore import (
     saved_message,
     trial,
 )
+from .socketio import workflow as socketio_workflow  # pyright: ignore[reportUnusedImport]
 
 # Import tag controllers
 from .tag import tags
@@ -211,6 +213,7 @@ __all__ = [
     "website",
     "workflow",
     "workflow_app_log",
+    "workflow_comment",
     "workflow_draft_variable",
     "workflow_run",
     "workflow_statistic",
diff --git a/api/controllers/console/app/error.py b/api/controllers/console/app/error.py
index 4b031efec9..da4b2b89b8 100644
--- a/api/controllers/console/app/error.py
+++ b/api/controllers/console/app/error.py
@@ -82,13 +82,13 @@ class ProviderNotSupportSpeechToTextError(BaseHTTPException):
 class DraftWorkflowNotExist(BaseHTTPException):
     error_code = "draft_workflow_not_exist"
     description = "Draft workflow need to be initialized."
-    code = 400
+    code = 404
 
 
 class DraftWorkflowNotSync(BaseHTTPException):
     error_code = "draft_workflow_not_sync"
     description = "Workflow graph might have been modified, please refresh and resubmit."
-    code = 400
+    code = 409
 
 
 class TracingConfigNotExist(BaseHTTPException):
diff --git a/api/controllers/console/app/workflow.py b/api/controllers/console/app/workflow.py
index 29200a3d22..4e9722e0dc 100644
--- a/api/controllers/console/app/workflow.py
+++ b/api/controllers/console/app/workflow.py
@@ -32,8 +32,10 @@ from core.trigger.debug.event_selectors import (
 from core.workflow.enums import NodeType
 from core.workflow.graph_engine.manager import GraphEngineManager
 from extensions.ext_database import db
+from extensions.ext_redis import redis_client
 from factories import file_factory, variable_factory
 from fields.member_fields import simple_account_fields
+from fields.online_user_fields import online_user_list_fields
 from fields.workflow_fields import workflow_fields, workflow_pagination_fields
 from fields.workflow_run_fields import workflow_run_node_execution_fields
 from libs import helper
@@ -43,6 +45,7 @@ from libs.login import current_account_with_tenant, login_required
 from models import App
 from models.model import AppMode
 from models.workflow import Workflow
+from repositories.workflow_collaboration_repository import WORKFLOW_ONLINE_USERS_PREFIX
 from services.app_generate_service import AppGenerateService
 from services.errors.app import WorkflowHashNotEqualError
 from services.errors.llm import InvokeRateLimitError
@@ -182,6 +185,14 @@ class WorkflowUpdatePayload(BaseModel):
     marked_comment: str | None = Field(default=None, max_length=100)
 
 
+class WorkflowFeaturesPayload(BaseModel):
+    features: dict[str, Any] = Field(..., description="Workflow feature configuration")
+
+
+class WorkflowOnlineUsersQuery(BaseModel):
+    workflow_ids: str = Field(..., description="Comma-separated workflow IDs")
+
+
 class DraftWorkflowTriggerRunPayload(BaseModel):
     node_id: str
 
@@ -214,6 +225,8 @@ reg(DefaultBlockConfigQuery)
 reg(ConvertToWorkflowPayload)
 reg(WorkflowListQuery)
 reg(WorkflowUpdatePayload)
+reg(WorkflowFeaturesPayload)
+reg(WorkflowOnlineUsersQuery)
 reg(DraftWorkflowTriggerRunPayload)
 reg(DraftWorkflowTriggerRunAllPayload)
 reg(NestedNodeGraphPayload)
@@ -804,6 +817,31 @@ class ConvertToWorkflowApi(Resource):
         }
 
 
+@console_ns.route("/apps/<uuid:app_id>/workflows/draft/features")
+class WorkflowFeaturesApi(Resource):
+    """Update draft workflow features."""
+
+    @console_ns.expect(console_ns.models[WorkflowFeaturesPayload.__name__])
+    @console_ns.doc("update_workflow_features")
+    @console_ns.doc(description="Update draft workflow features")
+    @console_ns.doc(params={"app_id": "Application ID"})
+    @console_ns.response(200, "Workflow features updated successfully")
+    @setup_required
+    @login_required
+    @account_initialization_required
+    @get_app_model(mode=[AppMode.ADVANCED_CHAT, AppMode.WORKFLOW])
+    def post(self, app_model: App):
+        current_user, _ = current_account_with_tenant()
+
+        args = WorkflowFeaturesPayload.model_validate(console_ns.payload or {})
+        features = args.features
+
+        workflow_service = WorkflowService()
+        workflow_service.update_draft_workflow_features(app_model=app_model, features=features, account=current_user)
+
+        return {"result": "success"}
+
+
 @console_ns.route("/apps/<uuid:app_id>/workflows")
 class PublishedAllWorkflowApi(Resource):
     @console_ns.expect(console_ns.models[WorkflowListQuery.__name__])
@@ -1230,3 +1268,32 @@ class NestedNodeGraphApi(Resource):
         response = service.generate_nested_node_graph(tenant_id=app_model.tenant_id, request=request)
 
         return response.model_dump()
+
+
+@console_ns.route("/apps/workflows/online-users")
+class WorkflowOnlineUsersApi(Resource):
+    @console_ns.expect(console_ns.models[WorkflowOnlineUsersQuery.__name__])
+    @console_ns.doc("get_workflow_online_users")
+    @console_ns.doc(description="Get workflow online users")
+    @setup_required
+    @login_required
+    @account_initialization_required
+    @marshal_with(online_user_list_fields)
+    def get(self):
+        args = WorkflowOnlineUsersQuery.model_validate(request.args.to_dict(flat=True))  # type: ignore
+
+        workflow_ids = [workflow_id.strip() for workflow_id in args.workflow_ids.split(",") if workflow_id.strip()]
+
+        results = []
+        for workflow_id in workflow_ids:
+            users_json = redis_client.hgetall(f"{WORKFLOW_ONLINE_USERS_PREFIX}{workflow_id}")
+
+            users = []
+            for _, user_info_json in users_json.items():
+                try:
+                    users.append(json.loads(user_info_json))
+                except Exception:
+                    continue
+            results.append({"workflow_id": workflow_id, "users": users})
+
+        return {"data": results}
diff --git a/api/controllers/console/app/workflow_comment.py b/api/controllers/console/app/workflow_comment.py
new file mode 100644
index 0000000000..6b06a5922a
--- /dev/null
+++ b/api/controllers/console/app/workflow_comment.py
@@ -0,0 +1,317 @@
+import logging
+
+from flask_restx import Resource, fields, marshal_with
+from pydantic import BaseModel, Field
+
+from controllers.console import console_ns
+from controllers.console.app.wraps import get_app_model
+from controllers.console.wraps import account_initialization_required, setup_required
+from fields.member_fields import account_with_role_fields
+from fields.workflow_comment_fields import (
+    workflow_comment_basic_fields,
+    workflow_comment_create_fields,
+    workflow_comment_detail_fields,
+    workflow_comment_reply_create_fields,
+    workflow_comment_reply_update_fields,
+    workflow_comment_resolve_fields,
+    workflow_comment_update_fields,
+)
+from libs.login import current_user, login_required
+from models import App
+from services.account_service import TenantService
+from services.workflow_comment_service import WorkflowCommentService
+
+logger = logging.getLogger(__name__)
+DEFAULT_REF_TEMPLATE_SWAGGER_2_0 = "#/definitions/{model}"
+
+
+class WorkflowCommentCreatePayload(BaseModel):
+    position_x: float = Field(..., description="Comment X position")
+    position_y: float = Field(..., description="Comment Y position")
+    content: str = Field(..., description="Comment content")
+    mentioned_user_ids: list[str] = Field(default_factory=list, description="Mentioned user IDs")
+
+
+class WorkflowCommentUpdatePayload(BaseModel):
+    content: str = Field(..., description="Comment content")
+    position_x: float | None = Field(default=None, description="Comment X position")
+    position_y: float | None = Field(default=None, description="Comment Y position")
+    mentioned_user_ids: list[str] = Field(default_factory=list, description="Mentioned user IDs")
+
+
+class WorkflowCommentReplyCreatePayload(BaseModel):
+    content: str = Field(..., description="Reply content")
+    mentioned_user_ids: list[str] = Field(default_factory=list, description="Mentioned user IDs")
+
+
+class WorkflowCommentReplyUpdatePayload(BaseModel):
+    content: str = Field(..., description="Reply content")
+    mentioned_user_ids: list[str] = Field(default_factory=list, description="Mentioned user IDs")
+
+
+for model in (
+    WorkflowCommentCreatePayload,
+    WorkflowCommentUpdatePayload,
+    WorkflowCommentReplyCreatePayload,
+    WorkflowCommentReplyUpdatePayload,
+):
+    console_ns.schema_model(model.__name__, model.model_json_schema(ref_template=DEFAULT_REF_TEMPLATE_SWAGGER_2_0))
+
+workflow_comment_basic_model = console_ns.model("WorkflowCommentBasic", workflow_comment_basic_fields)
+workflow_comment_detail_model = console_ns.model("WorkflowCommentDetail", workflow_comment_detail_fields)
+workflow_comment_create_model = console_ns.model("WorkflowCommentCreate", workflow_comment_create_fields)
+workflow_comment_update_model = console_ns.model("WorkflowCommentUpdate", workflow_comment_update_fields)
+workflow_comment_resolve_model = console_ns.model("WorkflowCommentResolve", workflow_comment_resolve_fields)
+workflow_comment_reply_create_model = console_ns.model(
+    "WorkflowCommentReplyCreate", workflow_comment_reply_create_fields
+)
+workflow_comment_reply_update_model = console_ns.model(
+    "WorkflowCommentReplyUpdate", workflow_comment_reply_update_fields
+)
+workflow_comment_mention_users_model = console_ns.model(
+    "WorkflowCommentMentionUsers",
+    {"users": fields.List(fields.Nested(account_with_role_fields))},
+)
+
+
+@console_ns.route("/apps/<uuid:app_id>/workflow/comments")
+class WorkflowCommentListApi(Resource):
+    """API for listing and creating workflow comments."""
+
+    @console_ns.doc("list_workflow_comments")
+    @console_ns.doc(description="Get all comments for a workflow")
+    @console_ns.doc(params={"app_id": "Application ID"})
+    @console_ns.response(200, "Comments retrieved successfully", workflow_comment_basic_model)
+    @login_required
+    @setup_required
+    @account_initialization_required
+    @get_app_model()
+    @marshal_with(workflow_comment_basic_model, envelope="data")
+    def get(self, app_model: App):
+        """Get all comments for a workflow."""
+        comments = WorkflowCommentService.get_comments(tenant_id=current_user.current_tenant_id, app_id=app_model.id)
+
+        return comments
+
+    @console_ns.doc("create_workflow_comment")
+    @console_ns.doc(description="Create a new workflow comment")
+    @console_ns.doc(params={"app_id": "Application ID"})
+    @console_ns.expect(console_ns.models[WorkflowCommentCreatePayload.__name__])
+    @console_ns.response(201, "Comment created successfully", workflow_comment_create_model)
+    @login_required
+    @setup_required
+    @account_initialization_required
+    @get_app_model()
+    @marshal_with(workflow_comment_create_model)
+    def post(self, app_model: App):
+        """Create a new workflow comment."""
+        payload = WorkflowCommentCreatePayload.model_validate(console_ns.payload or {})
+
+        result = WorkflowCommentService.create_comment(
+            tenant_id=current_user.current_tenant_id,
+            app_id=app_model.id,
+            created_by=current_user.id,
+            content=payload.content,
+            position_x=payload.position_x,
+            position_y=payload.position_y,
+            mentioned_user_ids=payload.mentioned_user_ids,
+        )
+
+        return result, 201
+
+
+@console_ns.route("/apps/<uuid:app_id>/workflow/comments/<string:comment_id>")
+class WorkflowCommentDetailApi(Resource):
+    """API for managing individual workflow comments."""
+
+    @console_ns.doc("get_workflow_comment")
+    @console_ns.doc(description="Get a specific workflow comment")
+    @console_ns.doc(params={"app_id": "Application ID", "comment_id": "Comment ID"})
+    @console_ns.response(200, "Comment retrieved successfully", workflow_comment_detail_model)
+    @login_required
+    @setup_required
+    @account_initialization_required
+    @get_app_model()
+    @marshal_with(workflow_comment_detail_model)
+    def get(self, app_model: App, comment_id: str):
+        """Get a specific workflow comment."""
+        comment = WorkflowCommentService.get_comment(
+            tenant_id=current_user.current_tenant_id, app_id=app_model.id, comment_id=comment_id
+        )
+
+        return comment
+
+    @console_ns.doc("update_workflow_comment")
+    @console_ns.doc(description="Update a workflow comment")
+    @console_ns.doc(params={"app_id": "Application ID", "comment_id": "Comment ID"})
+    @console_ns.expect(console_ns.models[WorkflowCommentUpdatePayload.__name__])
+    @console_ns.response(200, "Comment updated successfully", workflow_comment_update_model)
+    @login_required
+    @setup_required
+    @account_initialization_required
+    @get_app_model()
+    @marshal_with(workflow_comment_update_model)
+    def put(self, app_model: App, comment_id: str):
+        """Update a workflow comment."""
+        payload = WorkflowCommentUpdatePayload.model_validate(console_ns.payload or {})
+
+        result = WorkflowCommentService.update_comment(
+            tenant_id=current_user.current_tenant_id,
+            app_id=app_model.id,
+            comment_id=comment_id,
+            user_id=current_user.id,
+            content=payload.content,
+            position_x=payload.position_x,
+            position_y=payload.position_y,
+            mentioned_user_ids=payload.mentioned_user_ids,
+        )
+
+        return result
+
+    @console_ns.doc("delete_workflow_comment")
+    @console_ns.doc(description="Delete a workflow comment")
+    @console_ns.doc(params={"app_id": "Application ID", "comment_id": "Comment ID"})
+    @console_ns.response(204, "Comment deleted successfully")
+    @login_required
+    @setup_required
+    @account_initialization_required
+    @get_app_model()
+    def delete(self, app_model: App, comment_id: str):
+        """Delete a workflow comment."""
+        WorkflowCommentService.delete_comment(
+            tenant_id=current_user.current_tenant_id,
+            app_id=app_model.id,
+            comment_id=comment_id,
+            user_id=current_user.id,
+        )
+
+        return {"result": "success"}, 204
+
+
+@console_ns.route("/apps/<uuid:app_id>/workflow/comments/<string:comment_id>/resolve")
+class WorkflowCommentResolveApi(Resource):
+    """API for resolving and reopening workflow comments."""
+
+    @console_ns.doc("resolve_workflow_comment")
+    @console_ns.doc(description="Resolve a workflow comment")
+    @console_ns.doc(params={"app_id": "Application ID", "comment_id": "Comment ID"})
+    @console_ns.response(200, "Comment resolved successfully", workflow_comment_resolve_model)
+    @login_required
+    @setup_required
+    @account_initialization_required
+    @get_app_model()
+    @marshal_with(workflow_comment_resolve_model)
+    def post(self, app_model: App, comment_id: str):
+        """Resolve a workflow comment."""
+        comment = WorkflowCommentService.resolve_comment(
+            tenant_id=current_user.current_tenant_id,
+            app_id=app_model.id,
+            comment_id=comment_id,
+            user_id=current_user.id,
+        )
+
+        return comment
+
+
+@console_ns.route("/apps/<uuid:app_id>/workflow/comments/<string:comment_id>/replies")
+class WorkflowCommentReplyApi(Resource):
+    """API for managing comment replies."""
+
+    @console_ns.doc("create_workflow_comment_reply")
+    @console_ns.doc(description="Add a reply to a workflow comment")
+    @console_ns.doc(params={"app_id": "Application ID", "comment_id": "Comment ID"})
+    @console_ns.expect(console_ns.models[WorkflowCommentReplyCreatePayload.__name__])
+    @console_ns.response(201, "Reply created successfully", workflow_comment_reply_create_model)
+    @login_required
+    @setup_required
+    @account_initialization_required
+    @get_app_model()
+    @marshal_with(workflow_comment_reply_create_model)
+    def post(self, app_model: App, comment_id: str):
+        """Add a reply to a workflow comment."""
+        # Validate comment access first
+        WorkflowCommentService.validate_comment_access(
+            comment_id=comment_id, tenant_id=current_user.current_tenant_id, app_id=app_model.id
+        )
+
+        payload = WorkflowCommentReplyCreatePayload.model_validate(console_ns.payload or {})
+
+        result = WorkflowCommentService.create_reply(
+            comment_id=comment_id,
+            content=payload.content,
+            created_by=current_user.id,
+            mentioned_user_ids=payload.mentioned_user_ids,
+        )
+
+        return result, 201
+
+
+@console_ns.route("/apps/<uuid:app_id>/workflow/comments/<string:comment_id>/replies/<string:reply_id>")
+class WorkflowCommentReplyDetailApi(Resource):
+    """API for managing individual comment replies."""
+
+    @console_ns.doc("update_workflow_comment_reply")
+    @console_ns.doc(description="Update a comment reply")
+    @console_ns.doc(params={"app_id": "Application ID", "comment_id": "Comment ID", "reply_id": "Reply ID"})
+    @console_ns.expect(console_ns.models[WorkflowCommentReplyUpdatePayload.__name__])
+    @console_ns.response(200, "Reply updated successfully", workflow_comment_reply_update_model)
+    @login_required
+    @setup_required
+    @account_initialization_required
+    @get_app_model()
+    @marshal_with(workflow_comment_reply_update_model)
+    def put(self, app_model: App, comment_id: str, reply_id: str):
+        """Update a comment reply."""
+        # Validate comment access first
+        WorkflowCommentService.validate_comment_access(
+            comment_id=comment_id, tenant_id=current_user.current_tenant_id, app_id=app_model.id
+        )
+
+        payload = WorkflowCommentReplyUpdatePayload.model_validate(console_ns.payload or {})
+
+        reply = WorkflowCommentService.update_reply(
+            reply_id=reply_id,
+            user_id=current_user.id,
+            content=payload.content,
+            mentioned_user_ids=payload.mentioned_user_ids,
+        )
+
+        return reply
+
+    @console_ns.doc("delete_workflow_comment_reply")
+    @console_ns.doc(description="Delete a comment reply")
+    @console_ns.doc(params={"app_id": "Application ID", "comment_id": "Comment ID", "reply_id": "Reply ID"})
+    @console_ns.response(204, "Reply deleted successfully")
+    @login_required
+    @setup_required
+    @account_initialization_required
+    @get_app_model()
+    def delete(self, app_model: App, comment_id: str, reply_id: str):
+        """Delete a comment reply."""
+        # Validate comment access first
+        WorkflowCommentService.validate_comment_access(
+            comment_id=comment_id, tenant_id=current_user.current_tenant_id, app_id=app_model.id
+        )
+
+        WorkflowCommentService.delete_reply(reply_id=reply_id, user_id=current_user.id)
+
+        return {"result": "success"}, 204
+
+
+@console_ns.route("/apps/<uuid:app_id>/workflow/comments/mention-users")
+class WorkflowCommentMentionUsersApi(Resource):
+    """API for getting mentionable users for workflow comments."""
+
+    @console_ns.doc("workflow_comment_mention_users")
+    @console_ns.doc(description="Get all users in current tenant for mentions")
+    @console_ns.doc(params={"app_id": "Application ID"})
+    @console_ns.response(200, "Mentionable users retrieved successfully", workflow_comment_mention_users_model)
+    @login_required
+    @setup_required
+    @account_initialization_required
+    @get_app_model()
+    @marshal_with(workflow_comment_mention_users_model)
+    def get(self, app_model: App):
+        """Get all users in current tenant for mentions."""
+        members = TenantService.get_tenant_members(current_user.current_tenant)
+        return {"users": members}
diff --git a/api/controllers/console/app/workflow_draft_variable.py b/api/controllers/console/app/workflow_draft_variable.py
index 5c3211ff41..1e40731439 100644
--- a/api/controllers/console/app/workflow_draft_variable.py
+++ b/api/controllers/console/app/workflow_draft_variable.py
@@ -22,8 +22,8 @@ from core.variables.segments import ArrayFileSegment, ArrayPromptMessageSegment,
 from core.variables.types import SegmentType
 from core.workflow.constants import CONVERSATION_VARIABLE_NODE_ID, SYSTEM_VARIABLE_NODE_ID
 from extensions.ext_database import db
+from factories import variable_factory
 from factories.file_factory import build_from_mapping, build_from_mappings
-from factories.variable_factory import build_segment_with_type
 from libs.login import current_account_with_tenant, login_required
 from models import App, AppMode
 from models.workflow import WorkflowDraftVariable
@@ -44,6 +44,16 @@ class WorkflowDraftVariableUpdatePayload(BaseModel):
     value: Any | None = Field(default=None, description="Variable value")
 
 
+class ConversationVariableUpdatePayload(BaseModel):
+    conversation_variables: list[dict[str, Any]] = Field(
+        ..., description="Conversation variables for the draft workflow"
+    )
+
+
+class EnvironmentVariableUpdatePayload(BaseModel):
+    environment_variables: list[dict[str, Any]] = Field(..., description="Environment variables for the draft workflow")
+
+
 console_ns.schema_model(
     WorkflowDraftVariableListQuery.__name__,
     WorkflowDraftVariableListQuery.model_json_schema(ref_template=DEFAULT_REF_TEMPLATE_SWAGGER_2_0),
@@ -52,6 +62,14 @@ console_ns.schema_model(
     WorkflowDraftVariableUpdatePayload.__name__,
     WorkflowDraftVariableUpdatePayload.model_json_schema(ref_template=DEFAULT_REF_TEMPLATE_SWAGGER_2_0),
 )
+console_ns.schema_model(
+    ConversationVariableUpdatePayload.__name__,
+    ConversationVariableUpdatePayload.model_json_schema(ref_template=DEFAULT_REF_TEMPLATE_SWAGGER_2_0),
+)
+console_ns.schema_model(
+    EnvironmentVariableUpdatePayload.__name__,
+    EnvironmentVariableUpdatePayload.model_json_schema(ref_template=DEFAULT_REF_TEMPLATE_SWAGGER_2_0),
+)
 
 
 def _convert_values_to_json_serializable_object(value: Segment):
@@ -389,7 +407,7 @@ class VariableApi(Resource):
             if len(raw_value) > 0 and not isinstance(raw_value[0], dict):
                 raise InvalidArgumentError(description=f"expected dict for files[0], got {type(raw_value)}")
             raw_value = build_from_mappings(mappings=raw_value, tenant_id=app_model.tenant_id)
-        new_value = build_segment_with_type(variable.value_type, raw_value)
+        new_value = variable_factory.build_segment_with_type(variable.value_type, raw_value)
         draft_var_srv.update_variable(variable, name=new_name, value=new_value)
         db.session.commit()
         return variable
@@ -482,6 +500,34 @@ class ConversationVariableCollectionApi(Resource):
         db.session.commit()
         return _get_variable_list(app_model, CONVERSATION_VARIABLE_NODE_ID)
 
+    @console_ns.expect(console_ns.models[ConversationVariableUpdatePayload.__name__])
+    @console_ns.doc("update_conversation_variables")
+    @console_ns.doc(description="Update conversation variables for workflow draft")
+    @console_ns.doc(params={"app_id": "Application ID"})
+    @console_ns.response(200, "Conversation variables updated successfully")
+    @setup_required
+    @login_required
+    @account_initialization_required
+    @edit_permission_required
+    @get_app_model(mode=AppMode.ADVANCED_CHAT)
+    def post(self, app_model: App):
+        payload = ConversationVariableUpdatePayload.model_validate(console_ns.payload or {})
+
+        workflow_service = WorkflowService()
+
+        conversation_variables_list = payload.conversation_variables
+        conversation_variables = [
+            variable_factory.build_conversation_variable_from_mapping(obj) for obj in conversation_variables_list
+        ]
+
+        workflow_service.update_draft_workflow_conversation_variables(
+            app_model=app_model,
+            account=current_user,
+            conversation_variables=conversation_variables,
+        )
+
+        return {"result": "success"}
+
 
 @console_ns.route("/apps/<uuid:app_id>/workflows/draft/system-variables")
 class SystemVariableCollectionApi(Resource):
@@ -533,3 +579,31 @@ class EnvironmentVariableCollectionApi(Resource):
         )
 
         return {"items": env_vars_list}
+
+    @console_ns.expect(console_ns.models[EnvironmentVariableUpdatePayload.__name__])
+    @console_ns.doc("update_environment_variables")
+    @console_ns.doc(description="Update environment variables for workflow draft")
+    @console_ns.doc(params={"app_id": "Application ID"})
+    @console_ns.response(200, "Environment variables updated successfully")
+    @setup_required
+    @login_required
+    @account_initialization_required
+    @edit_permission_required
+    @get_app_model(mode=[AppMode.ADVANCED_CHAT, AppMode.WORKFLOW])
+    def post(self, app_model: App):
+        payload = EnvironmentVariableUpdatePayload.model_validate(console_ns.payload or {})
+
+        workflow_service = WorkflowService()
+
+        environment_variables_list = payload.environment_variables
+        environment_variables = [
+            variable_factory.build_environment_variable_from_mapping(obj) for obj in environment_variables_list
+        ]
+
+        workflow_service.update_draft_workflow_environment_variables(
+            app_model=app_model,
+            account=current_user,
+            environment_variables=environment_variables,
+        )
+
+        return {"result": "success"}
diff --git a/api/controllers/console/socketio/__init__.py b/api/controllers/console/socketio/__init__.py
new file mode 100644
index 0000000000..8b13789179
--- /dev/null
+++ b/api/controllers/console/socketio/__init__.py
@@ -0,0 +1 @@
+
diff --git a/api/controllers/console/socketio/workflow.py b/api/controllers/console/socketio/workflow.py
new file mode 100644
index 0000000000..0bcf940018
--- /dev/null
+++ b/api/controllers/console/socketio/workflow.py
@@ -0,0 +1,108 @@
+import logging
+from collections.abc import Callable
+from typing import cast
+
+from flask import Request as FlaskRequest
+
+from extensions.ext_socketio import sio
+from libs.passport import PassportService
+from libs.token import extract_access_token
+from repositories.workflow_collaboration_repository import WorkflowCollaborationRepository
+from services.account_service import AccountService
+from services.workflow_collaboration_service import WorkflowCollaborationService
+
+repository = WorkflowCollaborationRepository()
+collaboration_service = WorkflowCollaborationService(repository, sio)
+
+
+def _sio_on(event: str) -> Callable[[Callable[..., object]], Callable[..., object]]:
+    return cast(Callable[[Callable[..., object]], Callable[..., object]], sio.on(event))
+
+
+@_sio_on("connect")
+def socket_connect(sid, environ, auth):
+    """
+    WebSocket connect event, do authentication here.
+    """
+    try:
+        request_environ = FlaskRequest(environ)
+        token = extract_access_token(request_environ)
+    except Exception:
+        logging.exception("Failed to extract token")
+        token = None
+
+    if not token:
+        logging.warning("Socket connect rejected: missing token (sid=%s)", sid)
+        return False
+
+    try:
+        decoded = PassportService().verify(token)
+        user_id = decoded.get("user_id")
+        if not user_id:
+            logging.warning("Socket connect rejected: missing user_id (sid=%s)", sid)
+            return False
+
+        with sio.app.app_context():
+            user = AccountService.load_logged_in_account(account_id=user_id)
+            if not user:
+                logging.warning("Socket connect rejected: user not found (user_id=%s, sid=%s)", user_id, sid)
+                return False
+            if not user.has_edit_permission:
+                logging.warning("Socket connect rejected: no edit permission (user_id=%s, sid=%s)", user_id, sid)
+                return False
+
+        collaboration_service.save_session(sid, user)
+        return True
+
+    except Exception:
+        logging.exception("Socket authentication failed")
+        return False
+
+
+@_sio_on("user_connect")
+def handle_user_connect(sid, data):
+    """
+    Handle user connect event. Each session (tab) is treated as an independent collaborator.
+    """
+    workflow_id = data.get("workflow_id")
+    if not workflow_id:
+        return {"msg": "workflow_id is required"}, 400
+
+    result = collaboration_service.register_session(workflow_id, sid)
+    if not result:
+        return {"msg": "unauthorized"}, 401
+
+    user_id, is_leader = result
+    return {"msg": "connected", "user_id": user_id, "sid": sid, "isLeader": is_leader}
+
+
+@_sio_on("disconnect")
+def handle_disconnect(sid):
+    """
+    Handle session disconnect event. Remove the specific session from online users.
+    """
+    collaboration_service.disconnect_session(sid)
+
+
+@_sio_on("collaboration_event")
+def handle_collaboration_event(sid, data):
+    """
+    Handle general collaboration events, including:
+    1. mouse_move
+    2. vars_and_features_update
+    3. sync_request (ask leader to update graph)
+    4. app_state_update
+    5.
mcp_server_update + 6. workflow_update + 7. comments_update + 8. node_panel_presence + """ + return collaboration_service.relay_collaboration_event(sid, data) + + +@_sio_on("graph_event") +def handle_graph_event(sid, data): + """ + Handle graph events - simple broadcast relay. + """ + return collaboration_service.relay_graph_event(sid, data) diff --git a/api/controllers/console/workspace/account.py b/api/controllers/console/workspace/account.py index 527aabbc3d..857d72ee9f 100644 --- a/api/controllers/console/workspace/account.py +++ b/api/controllers/console/workspace/account.py @@ -36,6 +36,7 @@ from controllers.console.wraps import ( only_edition_cloud, setup_required, ) +from core.file import helpers as file_helpers from extensions.ext_database import db from fields.member_fields import account_fields from libs.datetime_utils import naive_utc_now @@ -73,6 +74,10 @@ class AccountAvatarPayload(BaseModel): avatar: str +class AccountAvatarQuery(BaseModel): + avatar: str = Field(..., description="Avatar file ID") + + class AccountInterfaceLanguagePayload(BaseModel): interface_language: str @@ -158,6 +163,7 @@ def reg(cls: type[BaseModel]): reg(AccountInitPayload) reg(AccountNamePayload) reg(AccountAvatarPayload) +reg(AccountAvatarQuery) reg(AccountInterfaceLanguagePayload) reg(AccountInterfaceThemePayload) reg(AccountTimezonePayload) @@ -248,6 +254,18 @@ class AccountNameApi(Resource): @console_ns.route("/account/avatar") class AccountAvatarApi(Resource): + @console_ns.expect(console_ns.models[AccountAvatarQuery.__name__]) + @console_ns.doc("get_account_avatar") + @console_ns.doc(description="Get account avatar url") + @setup_required + @login_required + @account_initialization_required + def get(self): + args = AccountAvatarQuery.model_validate(request.args.to_dict(flat=True)) # type: ignore + + avatar_url = file_helpers.get_signed_file_url(args.avatar) + return {"avatar_url": avatar_url} + @console_ns.expect(console_ns.models[AccountAvatarPayload.__name__]) 
@setup_required @login_required diff --git a/api/core/app/apps/workflow_app_runner.py b/api/core/app/apps/workflow_app_runner.py index 07160c1d3c..2ef28ffbe5 100644 --- a/api/core/app/apps/workflow_app_runner.py +++ b/api/core/app/apps/workflow_app_runner.py @@ -166,18 +166,22 @@ class WorkflowBasedAppRunner: # Determine which type of single node execution and get graph/variable_pool if single_iteration_run: - graph, variable_pool = self._get_graph_and_variable_pool_of_single_iteration( + graph, variable_pool = self._get_graph_and_variable_pool_for_single_node_run( workflow=workflow, node_id=single_iteration_run.node_id, user_inputs=dict(single_iteration_run.inputs), graph_runtime_state=graph_runtime_state, + node_type_filter_key="iteration_id", + node_type_label="iteration", ) elif single_loop_run: - graph, variable_pool = self._get_graph_and_variable_pool_of_single_loop( + graph, variable_pool = self._get_graph_and_variable_pool_for_single_node_run( workflow=workflow, node_id=single_loop_run.node_id, user_inputs=dict(single_loop_run.inputs), graph_runtime_state=graph_runtime_state, + node_type_filter_key="loop_id", + node_type_label="loop", ) else: raise ValueError("Neither single_iteration_run nor single_loop_run is specified") @@ -314,44 +318,6 @@ class WorkflowBasedAppRunner: return graph, variable_pool - def _get_graph_and_variable_pool_of_single_iteration( - self, - workflow: Workflow, - node_id: str, - user_inputs: dict[str, Any], - graph_runtime_state: GraphRuntimeState, - ) -> tuple[Graph, VariablePool]: - """ - Get variable pool of single iteration - """ - return self._get_graph_and_variable_pool_for_single_node_run( - workflow=workflow, - node_id=node_id, - user_inputs=user_inputs, - graph_runtime_state=graph_runtime_state, - node_type_filter_key="iteration_id", - node_type_label="iteration", - ) - - def _get_graph_and_variable_pool_of_single_loop( - self, - workflow: Workflow, - node_id: str, - user_inputs: dict[str, Any], - graph_runtime_state: 
GraphRuntimeState, - ) -> tuple[Graph, VariablePool]: - """ - Get variable pool of single loop - """ - return self._get_graph_and_variable_pool_for_single_node_run( - workflow=workflow, - node_id=node_id, - user_inputs=user_inputs, - graph_runtime_state=graph_runtime_state, - node_type_filter_key="loop_id", - node_type_label="loop", - ) - def _handle_event(self, workflow_entry: WorkflowEntry, event: GraphEngineEvent): """ Handle event diff --git a/api/core/rag/datasource/vdb/iris/iris_vector.py b/api/core/rag/datasource/vdb/iris/iris_vector.py index 5bdb0af0b3..50bb2429ec 100644 --- a/api/core/rag/datasource/vdb/iris/iris_vector.py +++ b/api/core/rag/datasource/vdb/iris/iris_vector.py @@ -154,7 +154,7 @@ class IrisConnectionPool: # Add to cache to skip future checks self._schemas_initialized.add(schema) - except Exception as e: + except Exception: conn.rollback() logger.exception("Failed to ensure schema %s exists", schema) raise @@ -177,6 +177,9 @@ class IrisConnectionPool: class IrisVector(BaseVector): """IRIS vector database implementation using native VECTOR type and HNSW indexing.""" + # Fallback score for full-text search when Rank function unavailable or TEXT_INDEX disabled + _FULL_TEXT_FALLBACK_SCORE = 0.5 + def __init__(self, collection_name: str, config: IrisVectorConfig) -> None: super().__init__(collection_name) self.config = config @@ -272,41 +275,131 @@ class IrisVector(BaseVector): return docs def search_by_full_text(self, query: str, **kwargs: Any) -> list[Document]: - """Search documents by full-text using iFind index or fallback to LIKE search.""" + """Search documents by full-text using iFind index with BM25 relevance scoring. + + When IRIS_TEXT_INDEX is enabled, this method uses the auto-generated Rank + function from %iFind.Index.Basic to calculate BM25 relevance scores. 
The Rank + function is automatically created with naming: {schema}.{table_name}_{index}Rank + + Args: + query: Search query string + **kwargs: Optional parameters including top_k, document_ids_filter + + Returns: + List of Document objects with relevance scores in metadata["score"] + """ top_k = kwargs.get("top_k", 5) + document_ids_filter = kwargs.get("document_ids_filter") with self._get_cursor() as cursor: if self.config.IRIS_TEXT_INDEX: - # Use iFind full-text search with index + # Use iFind full-text search with auto-generated Rank function text_index_name = f"idx_{self.table_name}_text" + # IRIS removes underscores from function names + table_no_underscore = self.table_name.replace("_", "") + index_no_underscore = text_index_name.replace("_", "") + rank_function = f"{self.schema}.{table_no_underscore}_{index_no_underscore}Rank" + + # Build WHERE clause with document ID filter if provided + where_clause = f"WHERE %ID %FIND search_index({text_index_name}, ?)" + # First param for Rank function, second for FIND + params = [query, query] + + if document_ids_filter: + # Add document ID filter + placeholders = ",".join("?" * len(document_ids_filter)) + where_clause += f" AND JSON_VALUE(meta, '$.document_id') IN ({placeholders})" + params.extend(document_ids_filter) + sql = f""" - SELECT TOP {top_k} id, text, meta + SELECT TOP {top_k} + id, + text, + meta, + {rank_function}(%ID, ?) AS score FROM {self.schema}.{self.table_name} - WHERE %ID %FIND search_index({text_index_name}, ?) 
+ {where_clause} + ORDER BY score DESC """ - cursor.execute(sql, (query,)) + + logger.debug( + "iFind search: query='%s', index='%s', rank='%s'", + query, + text_index_name, + rank_function, + ) + + try: + cursor.execute(sql, params) + except Exception: # pylint: disable=broad-exception-caught + # Fallback to query without Rank function if it fails + logger.warning( + "Rank function '%s' failed, using fixed score", + rank_function, + exc_info=True, + ) + sql_fallback = f""" + SELECT TOP {top_k} id, text, meta, {self._FULL_TEXT_FALLBACK_SCORE} AS score + FROM {self.schema}.{self.table_name} + {where_clause} + """ + # Skip first param (for Rank function) + cursor.execute(sql_fallback, params[1:]) else: - # Fallback to LIKE search (inefficient for large datasets) - # Escape special characters for LIKE clause to prevent SQL injection - from libs.helper import escape_like_pattern + # Fallback to LIKE search (IRIS_TEXT_INDEX disabled) + from libs.helper import ( # pylint: disable=import-outside-toplevel + escape_like_pattern, + ) escaped_query = escape_like_pattern(query) query_pattern = f"%{escaped_query}%" + + # Build WHERE clause with document ID filter if provided + where_clause = "WHERE text LIKE ? ESCAPE '\\\\'" + params = [query_pattern] + + if document_ids_filter: + placeholders = ",".join("?" * len(document_ids_filter)) + where_clause += f" AND JSON_VALUE(meta, '$.document_id') IN ({placeholders})" + params.extend(document_ids_filter) + sql = f""" - SELECT TOP {top_k} id, text, meta + SELECT TOP {top_k} id, text, meta, {self._FULL_TEXT_FALLBACK_SCORE} AS score FROM {self.schema}.{self.table_name} - WHERE text LIKE ? 
ESCAPE '\\' + {where_clause} + ORDER BY LENGTH(text) ASC """ - cursor.execute(sql, (query_pattern,)) + + logger.debug( + "LIKE fallback (TEXT_INDEX disabled): query='%s'", + query_pattern, + ) + cursor.execute(sql, params) docs = [] for row in cursor.fetchall(): - if len(row) >= 3: - metadata = json.loads(row[2]) if row[2] else {} - docs.append(Document(page_content=row[1], metadata=metadata)) + # Expecting 4 columns: id, text, meta, score + if len(row) >= 4: + text_content = row[1] + meta_str = row[2] + score_value = row[3] + + metadata = json.loads(meta_str) if meta_str else {} + # Add score to metadata for hybrid search compatibility + score = float(score_value) if score_value is not None else 0.0 + metadata["score"] = score + + docs.append(Document(page_content=text_content, metadata=metadata)) + + logger.info( + "Full-text search completed: query='%s', results=%d/%d", + query, + len(docs), + top_k, + ) if not docs: - logger.info("Full-text search for '%s' returned no results", query) + logger.warning("Full-text search for '%s' returned no results", query) return docs @@ -370,7 +463,11 @@ class IrisVector(BaseVector): AS %iFind.Index.Basic (LANGUAGE = '{language}', LOWER = 1, INDEXOPTION = 0) """ - logger.info("Creating text index: %s with language: %s", text_index_name, language) + logger.info( + "Creating text index: %s with language: %s", + text_index_name, + language, + ) logger.info("SQL for text index: %s", sql_text_index) cursor.execute(sql_text_index) logger.info("Text index created successfully: %s", text_index_name) diff --git a/api/core/tools/entities/tool_entities.py b/api/core/tools/entities/tool_entities.py index b5c7a6310c..96268d029e 100644 --- a/api/core/tools/entities/tool_entities.py +++ b/api/core/tools/entities/tool_entities.py @@ -130,7 +130,7 @@ class ToolInvokeMessage(BaseModel): text: str class JsonMessage(BaseModel): - json_object: dict + json_object: dict | list suppress_output: bool = Field(default=False, description="Whether to 
suppress JSON output in result string") class BlobMessage(BaseModel): @@ -144,7 +144,14 @@ class ToolInvokeMessage(BaseModel): end: bool = Field(..., description="Whether the chunk is the last chunk") class FileMessage(BaseModel): - pass + file_marker: str = Field(default="file_marker") + + @model_validator(mode="before") + @classmethod + def validate_file_message(cls, values): + if isinstance(values, dict) and "file_marker" not in values: + raise ValueError("Invalid FileMessage: missing file_marker") + return values class VariableMessage(BaseModel): variable_name: str = Field(..., description="The name of the variable") @@ -234,10 +241,22 @@ class ToolInvokeMessage(BaseModel): @field_validator("message", mode="before") @classmethod - def decode_blob_message(cls, v): + def decode_blob_message(cls, v, info: ValidationInfo): + # 处理 blob 解码 if isinstance(v, dict) and "blob" in v: with contextlib.suppress(Exception): v["blob"] = base64.b64decode(v["blob"]) + + # Force correct message type based on type field + # Only wrap dict types to avoid wrapping already parsed Pydantic model objects + if info.data and isinstance(info.data, dict) and isinstance(v, dict): + msg_type = info.data.get("type") + if msg_type == cls.MessageType.JSON: + if "json_object" not in v: + v = {"json_object": v} + elif msg_type == cls.MessageType.FILE: + v = {"file_marker": "file_marker"} + return v @field_serializer("message") diff --git a/api/core/workflow/nodes/agent/agent_node.py b/api/core/workflow/nodes/agent/agent_node.py index 52f02881e8..ab7f429d98 100644 --- a/api/core/workflow/nodes/agent/agent_node.py +++ b/api/core/workflow/nodes/agent/agent_node.py @@ -660,7 +660,7 @@ class AgentNode(Node[AgentNodeData]): text = "" files: list[File] = [] - json_list: list[dict] = [] + json_list: list[dict | list] = [] agent_logs: list[AgentLogEvent] = [] agent_execution_metadata: Mapping[WorkflowNodeExecutionMetadataKey, Any] = {} @@ -734,13 +734,18 @@ class AgentNode(Node[AgentNodeData]): elif 
message.type == ToolInvokeMessage.MessageType.JSON: assert isinstance(message.message, ToolInvokeMessage.JsonMessage) if node_type == NodeType.AGENT: - msg_metadata: dict[str, Any] = message.message.json_object.pop("execution_metadata", {}) - llm_usage = LLMUsage.from_metadata(cast(LLMUsageMetadata, msg_metadata)) - agent_execution_metadata = { - WorkflowNodeExecutionMetadataKey(key): value - for key, value in msg_metadata.items() - if key in WorkflowNodeExecutionMetadataKey.__members__.values() - } + if isinstance(message.message.json_object, dict): + msg_metadata: dict[str, Any] = message.message.json_object.pop("execution_metadata", {}) + llm_usage = LLMUsage.from_metadata(cast(LLMUsageMetadata, msg_metadata)) + agent_execution_metadata = { + WorkflowNodeExecutionMetadataKey(key): value + for key, value in msg_metadata.items() + if key in WorkflowNodeExecutionMetadataKey.__members__.values() + } + else: + msg_metadata = {} + llm_usage = LLMUsage.empty_usage() + agent_execution_metadata = {} if message.message.json_object: json_list.append(message.message.json_object) elif message.type == ToolInvokeMessage.MessageType.LINK: @@ -849,7 +854,7 @@ class AgentNode(Node[AgentNodeData]): yield agent_log # Add agent_logs to outputs['json'] to ensure frontend can access thinking process - json_output: list[dict[str, Any]] = [] + json_output: list[dict[str, Any] | list[Any]] = [] # Step 1: append each agent log as its own dict. 
if agent_logs: diff --git a/api/core/workflow/nodes/datasource/datasource_node.py b/api/core/workflow/nodes/datasource/datasource_node.py index bb2140f42e..925561cf7c 100644 --- a/api/core/workflow/nodes/datasource/datasource_node.py +++ b/api/core/workflow/nodes/datasource/datasource_node.py @@ -301,7 +301,7 @@ class DatasourceNode(Node[DatasourceNodeData]): text = "" files: list[File] = [] - json: list[dict] = [] + json: list[dict | list] = [] variables: dict[str, Any] = {} diff --git a/api/core/workflow/nodes/tool/tool_node.py b/api/core/workflow/nodes/tool/tool_node.py index d49d8330f9..9036edbb59 100644 --- a/api/core/workflow/nodes/tool/tool_node.py +++ b/api/core/workflow/nodes/tool/tool_node.py @@ -269,7 +269,7 @@ class ToolNode(Node[ToolNodeData]): text = "" files: list[File] = [] - json: list[dict] = [] + json: list[dict | list] = [] variables: dict[str, Any] = {} @@ -425,7 +425,7 @@ class ToolNode(Node[ToolNodeData]): message.message.metadata = dict_metadata # Add agent_logs to outputs['json'] to ensure frontend can access thinking process - json_output: list[dict[str, Any]] = [] + json_output: list[dict[str, Any] | list[Any]] = [] # Step 2: normalize JSON into {"data": [...]}, converting json to list[dict] if json: diff --git a/api/docker/entrypoint.sh b/api/docker/entrypoint.sh index c0279f893b..b4308f6ba4 100755 --- a/api/docker/entrypoint.sh +++ b/api/docker/entrypoint.sh @@ -119,14 +119,16 @@ elif [[ "${MODE}" == "job" ]]; then else if [[ "${DEBUG}" == "true" ]]; then - exec flask run --host=${DIFY_BIND_ADDRESS:-0.0.0.0} --port=${DIFY_PORT:-5001} --debug + export HOST=${DIFY_BIND_ADDRESS:-0.0.0.0} + export PORT=${DIFY_PORT:-5001} + exec python -m app else exec gunicorn \ --bind "${DIFY_BIND_ADDRESS:-0.0.0.0}:${DIFY_PORT:-5001}" \ --workers ${SERVER_WORKER_AMOUNT:-1} \ - --worker-class ${SERVER_WORKER_CLASS:-gevent} \ + --worker-class ${SERVER_WORKER_CLASS:-geventwebsocket.gunicorn.workers.GeventWebSocketWorker} \ --worker-connections
${SERVER_WORKER_CONNECTIONS:-10} \ --timeout ${GUNICORN_TIMEOUT:-200} \ - app:app + app:socketio_app fi fi diff --git a/api/extensions/ext_socketio.py b/api/extensions/ext_socketio.py new file mode 100644 index 0000000000..5ed82bac8d --- /dev/null +++ b/api/extensions/ext_socketio.py @@ -0,0 +1,5 @@ +import socketio # type: ignore[reportMissingTypeStubs] + +from configs import dify_config + +sio = socketio.Server(async_mode="gevent", cors_allowed_origins=dify_config.CONSOLE_CORS_ALLOW_ORIGINS) diff --git a/api/fields/online_user_fields.py b/api/fields/online_user_fields.py new file mode 100644 index 0000000000..8fe0dc6a64 --- /dev/null +++ b/api/fields/online_user_fields.py @@ -0,0 +1,17 @@ +from flask_restx import fields + +online_user_partial_fields = { + "user_id": fields.String, + "username": fields.String, + "avatar": fields.String, + "sid": fields.String, +} + +workflow_online_users_fields = { + "workflow_id": fields.String, + "users": fields.List(fields.Nested(online_user_partial_fields)), +} + +online_user_list_fields = { + "data": fields.List(fields.Nested(workflow_online_users_fields)), +} diff --git a/api/fields/workflow_comment_fields.py b/api/fields/workflow_comment_fields.py new file mode 100644 index 0000000000..c708dd3460 --- /dev/null +++ b/api/fields/workflow_comment_fields.py @@ -0,0 +1,96 @@ +from flask_restx import fields + +from libs.helper import AvatarUrlField, TimestampField + +# basic account fields for comments +account_fields = { + "id": fields.String, + "name": fields.String, + "email": fields.String, + "avatar_url": AvatarUrlField, +} + +# Comment mention fields +workflow_comment_mention_fields = { + "mentioned_user_id": fields.String, + "mentioned_user_account": fields.Nested(account_fields, allow_null=True), + "reply_id": fields.String, +} + +# Comment reply fields +workflow_comment_reply_fields = { + "id": fields.String, + "content": fields.String, + "created_by": fields.String, + "created_by_account": fields.Nested(account_fields, 
allow_null=True), + "created_at": TimestampField, +} + +# Basic comment fields (for list views) +workflow_comment_basic_fields = { + "id": fields.String, + "position_x": fields.Float, + "position_y": fields.Float, + "content": fields.String, + "created_by": fields.String, + "created_by_account": fields.Nested(account_fields, allow_null=True), + "created_at": TimestampField, + "updated_at": TimestampField, + "resolved": fields.Boolean, + "resolved_at": TimestampField, + "resolved_by": fields.String, + "resolved_by_account": fields.Nested(account_fields, allow_null=True), + "reply_count": fields.Integer, + "mention_count": fields.Integer, + "participants": fields.List(fields.Nested(account_fields)), +} + +# Detailed comment fields (for single comment view) +workflow_comment_detail_fields = { + "id": fields.String, + "position_x": fields.Float, + "position_y": fields.Float, + "content": fields.String, + "created_by": fields.String, + "created_by_account": fields.Nested(account_fields, allow_null=True), + "created_at": TimestampField, + "updated_at": TimestampField, + "resolved": fields.Boolean, + "resolved_at": TimestampField, + "resolved_by": fields.String, + "resolved_by_account": fields.Nested(account_fields, allow_null=True), + "replies": fields.List(fields.Nested(workflow_comment_reply_fields)), + "mentions": fields.List(fields.Nested(workflow_comment_mention_fields)), +} + +# Comment creation response fields (simplified) +workflow_comment_create_fields = { + "id": fields.String, + "created_at": TimestampField, +} + +# Comment update response fields (simplified) +workflow_comment_update_fields = { + "id": fields.String, + "updated_at": TimestampField, +} + +# Comment resolve response fields +workflow_comment_resolve_fields = { + "id": fields.String, + "resolved": fields.Boolean, + "resolved_at": TimestampField, + "resolved_by": fields.String, +} + +# Reply creation response fields (simplified) +workflow_comment_reply_create_fields = { + "id": fields.String, + 
"created_at": TimestampField, +} + +# Reply update response fields +workflow_comment_reply_update_fields = { + "id": fields.String, + "updated_at": TimestampField, +} diff --git a/api/migrations/versions/2026_01_17_1726-227822d22895_add_workflow_comments_table.py b/api/migrations/versions/2026_01_17_1726-227822d22895_add_workflow_comments_table.py new file mode 100644 index 0000000000..a0c92dc21f --- /dev/null +++ b/api/migrations/versions/2026_01_17_1726-227822d22895_add_workflow_comments_table.py @@ -0,0 +1,90 @@ +"""Add workflow comments table + +Revision ID: 227822d22895 +Revises: 9d77545f524e +Create Date: 2025-08-22 17:26:15.255980 + +""" +from alembic import op +import models as models +import sqlalchemy as sa + + +# revision identifiers, used by Alembic. +revision = '227822d22895' +down_revision = '9d77545f524e' +branch_labels = None +depends_on = None + + +def upgrade(): + # ### commands auto generated by Alembic - please adjust! ### + op.create_table('workflow_comments', + sa.Column('id', models.types.StringUUID(), server_default=sa.text('uuidv7()'), nullable=False), + sa.Column('tenant_id', models.types.StringUUID(), nullable=False), + sa.Column('app_id', models.types.StringUUID(), nullable=False), + sa.Column('position_x', sa.Float(), nullable=False), + sa.Column('position_y', sa.Float(), nullable=False), + sa.Column('content', sa.Text(), nullable=False), + sa.Column('created_by', models.types.StringUUID(), nullable=False), + sa.Column('created_at', sa.DateTime(), server_default=sa.text('CURRENT_TIMESTAMP'), nullable=False), + sa.Column('updated_at', sa.DateTime(), server_default=sa.text('CURRENT_TIMESTAMP'), nullable=False), + sa.Column('resolved', sa.Boolean(), server_default=sa.text('false'), nullable=False), + sa.Column('resolved_at', sa.DateTime(), nullable=True), + sa.Column('resolved_by', models.types.StringUUID(), nullable=True), + sa.PrimaryKeyConstraint('id', name='workflow_comments_pkey') + ) + with op.batch_alter_table('workflow_comments', 
schema=None) as batch_op: + batch_op.create_index('workflow_comments_app_idx', ['tenant_id', 'app_id'], unique=False) + batch_op.create_index('workflow_comments_created_at_idx', ['created_at'], unique=False) + + op.create_table('workflow_comment_replies', + sa.Column('id', models.types.StringUUID(), server_default=sa.text('uuidv7()'), nullable=False), + sa.Column('comment_id', models.types.StringUUID(), nullable=False), + sa.Column('content', sa.Text(), nullable=False), + sa.Column('created_by', models.types.StringUUID(), nullable=False), + sa.Column('created_at', sa.DateTime(), server_default=sa.text('CURRENT_TIMESTAMP'), nullable=False), + sa.Column('updated_at', sa.DateTime(), server_default=sa.text('CURRENT_TIMESTAMP'), nullable=False), + sa.ForeignKeyConstraint(['comment_id'], ['workflow_comments.id'], name=op.f('workflow_comment_replies_comment_id_fkey'), ondelete='CASCADE'), + sa.PrimaryKeyConstraint('id', name='workflow_comment_replies_pkey') + ) + with op.batch_alter_table('workflow_comment_replies', schema=None) as batch_op: + batch_op.create_index('comment_replies_comment_idx', ['comment_id'], unique=False) + batch_op.create_index('comment_replies_created_at_idx', ['created_at'], unique=False) + + op.create_table('workflow_comment_mentions', + sa.Column('id', models.types.StringUUID(), server_default=sa.text('uuidv7()'), nullable=False), + sa.Column('comment_id', models.types.StringUUID(), nullable=False), + sa.Column('reply_id', models.types.StringUUID(), nullable=True), + sa.Column('mentioned_user_id', models.types.StringUUID(), nullable=False), + sa.ForeignKeyConstraint(['comment_id'], ['workflow_comments.id'], name=op.f('workflow_comment_mentions_comment_id_fkey'), ondelete='CASCADE'), + sa.ForeignKeyConstraint(['reply_id'], ['workflow_comment_replies.id'], name=op.f('workflow_comment_mentions_reply_id_fkey'), ondelete='CASCADE'), + sa.PrimaryKeyConstraint('id', name='workflow_comment_mentions_pkey') + ) + with 
op.batch_alter_table('workflow_comment_mentions', schema=None) as batch_op: + batch_op.create_index('comment_mentions_comment_idx', ['comment_id'], unique=False) + batch_op.create_index('comment_mentions_reply_idx', ['reply_id'], unique=False) + batch_op.create_index('comment_mentions_user_idx', ['mentioned_user_id'], unique=False) + + # ### end Alembic commands ### + + +def downgrade(): + # ### commands auto generated by Alembic - please adjust! ### + with op.batch_alter_table('workflow_comment_mentions', schema=None) as batch_op: + batch_op.drop_index('comment_mentions_user_idx') + batch_op.drop_index('comment_mentions_reply_idx') + batch_op.drop_index('comment_mentions_comment_idx') + + op.drop_table('workflow_comment_mentions') + with op.batch_alter_table('workflow_comment_replies', schema=None) as batch_op: + batch_op.drop_index('comment_replies_created_at_idx') + batch_op.drop_index('comment_replies_comment_idx') + + op.drop_table('workflow_comment_replies') + with op.batch_alter_table('workflow_comments', schema=None) as batch_op: + batch_op.drop_index('workflow_comments_created_at_idx') + batch_op.drop_index('workflow_comments_app_idx') + + op.drop_table('workflow_comments') + # ### end Alembic commands ### diff --git a/api/models/__init__.py b/api/models/__init__.py index 538d378160..ff345cd656 100644 --- a/api/models/__init__.py +++ b/api/models/__init__.py @@ -10,6 +10,11 @@ from .account import ( ) from .api_based_extension import APIBasedExtension, APIBasedExtensionPoint from .app_asset import AppAssets +from .comment import ( + WorkflowComment, + WorkflowCommentMention, + WorkflowCommentReply, +) from .dataset import ( AppDatasetJoin, Dataset, @@ -213,6 +218,9 @@ __all__ = [ "WorkflowAppLog", "WorkflowAppLogCreatedFrom", "WorkflowArchiveLog", + "WorkflowComment", + "WorkflowCommentMention", + "WorkflowCommentReply", "WorkflowFeature", "WorkflowFeatures", "WorkflowNodeExecutionModel", diff --git a/api/models/comment.py b/api/models/comment.py new file 
mode 100644 index 0000000000..21ccfa13db --- /dev/null +++ b/api/models/comment.py @@ -0,0 +1,210 @@ +"""Workflow comment models.""" + +from datetime import datetime +from typing import Optional + +from sqlalchemy import Index, func +from sqlalchemy.orm import Mapped, mapped_column, relationship + +from .account import Account +from .base import Base +from .engine import db +from .types import StringUUID + + +class WorkflowComment(Base): + """Workflow comment model for canvas commenting functionality. + + Comments are associated with apps rather than specific workflow versions, + since an app has only one draft workflow at a time and comments should persist + across workflow version changes. + + Attributes: + id: Comment ID + tenant_id: Workspace ID + app_id: App ID (primary association, comments belong to apps) + position_x: X coordinate on canvas + position_y: Y coordinate on canvas + content: Comment content + created_by: Creator account ID + created_at: Creation time + updated_at: Last update time + resolved: Whether comment is resolved + resolved_at: Resolution time + resolved_by: Resolver account ID + """ + + __tablename__ = "workflow_comments" + __table_args__ = ( + db.PrimaryKeyConstraint("id", name="workflow_comments_pkey"), + Index("workflow_comments_app_idx", "tenant_id", "app_id"), + Index("workflow_comments_created_at_idx", "created_at"), + ) + + id: Mapped[str] = mapped_column(StringUUID, server_default=db.text("uuidv7()")) + tenant_id: Mapped[str] = mapped_column(StringUUID, nullable=False) + app_id: Mapped[str] = mapped_column(StringUUID, nullable=False) + position_x: Mapped[float] = mapped_column(db.Float) + position_y: Mapped[float] = mapped_column(db.Float) + content: Mapped[str] = mapped_column(db.Text, nullable=False) + created_by: Mapped[str] = mapped_column(StringUUID, nullable=False) + created_at: Mapped[datetime] = mapped_column(db.DateTime, nullable=False, server_default=func.current_timestamp()) + updated_at: Mapped[datetime] = 
mapped_column( + db.DateTime, nullable=False, server_default=func.current_timestamp(), onupdate=func.current_timestamp() + ) + resolved: Mapped[bool] = mapped_column(db.Boolean, nullable=False, server_default=db.text("false")) + resolved_at: Mapped[datetime | None] = mapped_column(db.DateTime) + resolved_by: Mapped[str | None] = mapped_column(StringUUID) + + # Relationships + replies: Mapped[list["WorkflowCommentReply"]] = relationship( + "WorkflowCommentReply", back_populates="comment", cascade="all, delete-orphan" + ) + mentions: Mapped[list["WorkflowCommentMention"]] = relationship( + "WorkflowCommentMention", back_populates="comment", cascade="all, delete-orphan" + ) + + @property + def created_by_account(self): + """Get creator account.""" + if hasattr(self, "_created_by_account_cache"): + return self._created_by_account_cache + return db.session.get(Account, self.created_by) + + def cache_created_by_account(self, account: Account | None) -> None: + """Cache creator account to avoid extra queries.""" + self._created_by_account_cache = account + + @property + def resolved_by_account(self): + """Get resolver account.""" + if hasattr(self, "_resolved_by_account_cache"): + return self._resolved_by_account_cache + if self.resolved_by: + return db.session.get(Account, self.resolved_by) + return None + + def cache_resolved_by_account(self, account: Account | None) -> None: + """Cache resolver account to avoid extra queries.""" + self._resolved_by_account_cache = account + + @property + def reply_count(self): + """Get reply count.""" + return len(self.replies) + + @property + def mention_count(self): + """Get mention count.""" + return len(self.mentions) + + @property + def participants(self): + """Get all participants (creator + repliers + mentioned users).""" + participant_ids = set() + + # Add comment creator + participant_ids.add(self.created_by) + + # Add reply creators + participant_ids.update(reply.created_by for reply in self.replies) + + # Add mentioned users 
+ participant_ids.update(mention.mentioned_user_id for mention in self.mentions) + + # Get account objects + participants = [] + for user_id in participant_ids: + account = db.session.get(Account, user_id) + if account: + participants.append(account) + + return participants + + +class WorkflowCommentReply(Base): + """Workflow comment reply model. + + Attributes: + id: Reply ID + comment_id: Parent comment ID + content: Reply content + created_by: Creator account ID + created_at: Creation time + updated_at: Last update time + """ + + __tablename__ = "workflow_comment_replies" + __table_args__ = ( + db.PrimaryKeyConstraint("id", name="workflow_comment_replies_pkey"), + Index("comment_replies_comment_idx", "comment_id"), + Index("comment_replies_created_at_idx", "created_at"), + ) + + id: Mapped[str] = mapped_column(StringUUID, server_default=db.text("uuidv7()")) + comment_id: Mapped[str] = mapped_column( + StringUUID, db.ForeignKey("workflow_comments.id", ondelete="CASCADE"), nullable=False + ) + content: Mapped[str] = mapped_column(db.Text, nullable=False) + created_by: Mapped[str] = mapped_column(StringUUID, nullable=False) + created_at: Mapped[datetime] = mapped_column(db.DateTime, nullable=False, server_default=func.current_timestamp()) + updated_at: Mapped[datetime] = mapped_column( + db.DateTime, nullable=False, server_default=func.current_timestamp(), onupdate=func.current_timestamp() + ) + # Relationships + comment: Mapped["WorkflowComment"] = relationship("WorkflowComment", back_populates="replies") + + @property + def created_by_account(self): + """Get creator account.""" + if hasattr(self, "_created_by_account_cache"): + return self._created_by_account_cache + return db.session.get(Account, self.created_by) + + def cache_created_by_account(self, account: Account | None) -> None: + """Cache creator account to avoid extra queries.""" + self._created_by_account_cache = account + + +class WorkflowCommentMention(Base): + """Workflow comment mention model.
+ + Mentions are only for internal accounts, since end users + cannot access the workflow canvas or commenting features. + + Attributes: + id: Mention ID + comment_id: Parent comment ID + reply_id: Parent reply ID when the mention occurs in a reply, otherwise None + mentioned_user_id: Mentioned account ID + """ + + __tablename__ = "workflow_comment_mentions" + __table_args__ = ( + db.PrimaryKeyConstraint("id", name="workflow_comment_mentions_pkey"), + Index("comment_mentions_comment_idx", "comment_id"), + Index("comment_mentions_reply_idx", "reply_id"), + Index("comment_mentions_user_idx", "mentioned_user_id"), + ) + + id: Mapped[str] = mapped_column(StringUUID, server_default=db.text("uuidv7()")) + comment_id: Mapped[str] = mapped_column( + StringUUID, db.ForeignKey("workflow_comments.id", ondelete="CASCADE"), nullable=False + ) + reply_id: Mapped[str | None] = mapped_column( + StringUUID, db.ForeignKey("workflow_comment_replies.id", ondelete="CASCADE"), nullable=True + ) + mentioned_user_id: Mapped[str] = mapped_column(StringUUID, nullable=False) + + # Relationships + comment: Mapped["WorkflowComment"] = relationship("WorkflowComment", back_populates="mentions") + reply: Mapped[Optional["WorkflowCommentReply"]] = relationship("WorkflowCommentReply") + + @property + def mentioned_user_account(self): + """Get mentioned account.""" + if hasattr(self, "_mentioned_user_account_cache"): + return self._mentioned_user_account_cache + return db.session.get(Account, self.mentioned_user_id) + + def cache_mentioned_user_account(self, account: Account | None) -> None: + """Cache mentioned account to avoid extra queries.""" + self._mentioned_user_account_cache = account diff --git a/api/models/model.py b/api/models/model.py index bd4792f7e1..5cef46dbc0 100644 --- a/api/models/model.py +++ b/api/models/model.py @@ -317,40 +317,48 @@ class App(Base): return None -class AppModelConfig(Base): +class AppModelConfig(TypeBase): __tablename__ = "app_model_configs" __table_args__ = (sa.PrimaryKeyConstraint("id", name="app_model_config_pkey"),
sa.Index("app_app_id_idx", "app_id")) - id = mapped_column(StringUUID, default=lambda: str(uuid4())) - app_id = mapped_column(StringUUID, nullable=False) - provider = mapped_column(String(255), nullable=True) - model_id = mapped_column(String(255), nullable=True) - configs = mapped_column(sa.JSON, nullable=True) - created_by = mapped_column(StringUUID, nullable=True) - created_at = mapped_column(sa.DateTime, nullable=False, server_default=func.current_timestamp()) - updated_by = mapped_column(StringUUID, nullable=True) - updated_at = mapped_column( - sa.DateTime, nullable=False, server_default=func.current_timestamp(), onupdate=func.current_timestamp() + id: Mapped[str] = mapped_column(StringUUID, default=lambda: str(uuid4()), init=False) + app_id: Mapped[str] = mapped_column(StringUUID, nullable=False) + provider: Mapped[str | None] = mapped_column(String(255), nullable=True, default=None) + model_id: Mapped[str | None] = mapped_column(String(255), nullable=True, default=None) + configs: Mapped[Any | None] = mapped_column(sa.JSON, nullable=True, default=None) + created_by: Mapped[str | None] = mapped_column(StringUUID, nullable=True, default=None) + created_at: Mapped[datetime] = mapped_column( + sa.DateTime, nullable=False, server_default=func.current_timestamp(), init=False ) - opening_statement = mapped_column(LongText) - suggested_questions = mapped_column(LongText) - suggested_questions_after_answer = mapped_column(LongText) - speech_to_text = mapped_column(LongText) - text_to_speech = mapped_column(LongText) - more_like_this = mapped_column(LongText) - model = mapped_column(LongText) - user_input_form = mapped_column(LongText) - dataset_query_variable = mapped_column(String(255)) - pre_prompt = mapped_column(LongText) - agent_mode = mapped_column(LongText) - sensitive_word_avoidance = mapped_column(LongText) - retriever_resource = mapped_column(LongText) - prompt_type = mapped_column(String(255), nullable=False, server_default=sa.text("'simple'")) - 
chat_prompt_config = mapped_column(LongText) - completion_prompt_config = mapped_column(LongText) - dataset_configs = mapped_column(LongText) - external_data_tools = mapped_column(LongText) - file_upload = mapped_column(LongText) + updated_by: Mapped[str | None] = mapped_column(StringUUID, nullable=True, default=None) + updated_at: Mapped[datetime] = mapped_column( + sa.DateTime, + nullable=False, + server_default=func.current_timestamp(), + onupdate=func.current_timestamp(), + init=False, + ) + opening_statement: Mapped[str | None] = mapped_column(LongText, default=None) + suggested_questions: Mapped[str | None] = mapped_column(LongText, default=None) + suggested_questions_after_answer: Mapped[str | None] = mapped_column(LongText, default=None) + speech_to_text: Mapped[str | None] = mapped_column(LongText, default=None) + text_to_speech: Mapped[str | None] = mapped_column(LongText, default=None) + more_like_this: Mapped[str | None] = mapped_column(LongText, default=None) + model: Mapped[str | None] = mapped_column(LongText, default=None) + user_input_form: Mapped[str | None] = mapped_column(LongText, default=None) + dataset_query_variable: Mapped[str | None] = mapped_column(String(255), default=None) + pre_prompt: Mapped[str | None] = mapped_column(LongText, default=None) + agent_mode: Mapped[str | None] = mapped_column(LongText, default=None) + sensitive_word_avoidance: Mapped[str | None] = mapped_column(LongText, default=None) + retriever_resource: Mapped[str | None] = mapped_column(LongText, default=None) + prompt_type: Mapped[str] = mapped_column( + String(255), nullable=False, server_default=sa.text("'simple'"), default="simple" + ) + chat_prompt_config: Mapped[str | None] = mapped_column(LongText, default=None) + completion_prompt_config: Mapped[str | None] = mapped_column(LongText, default=None) + dataset_configs: Mapped[str | None] = mapped_column(LongText, default=None) + external_data_tools: Mapped[str | None] = mapped_column(LongText, default=None) + 
file_upload: Mapped[str | None] = mapped_column(LongText, default=None) @property def app(self) -> App | None: @@ -812,8 +820,8 @@ class Conversation(Base): override_model_configs = json.loads(self.override_model_configs) if "model" in override_model_configs: - app_model_config = AppModelConfig() - app_model_config = app_model_config.from_model_config_dict(override_model_configs) + # AppModelConfig now requires app_id at construction; reuse this conversation's app_id + app_model_config = AppModelConfig(app_id=self.app_id).from_model_config_dict(override_model_configs) model_config = app_model_config.to_dict() else: model_config["configs"] = override_model_configs diff --git a/api/models/workflow.py b/api/models/workflow.py index 2cd4e44a9f..70b9968f7e 100644 --- a/api/models/workflow.py +++ b/api/models/workflow.py @@ -257,8 +257,7 @@ class Workflow(Base): # bug # # Currently, the following functions / methods would mutate the returned dict: # - # - `_get_graph_and_variable_pool_of_single_iteration`. - # - `_get_graph_and_variable_pool_of_single_loop`. + # - `_get_graph_and_variable_pool_for_single_node_run`.
return json.loads(self.graph) if self.graph else {} def get_node_config_by_id(self, node_id: str) -> Mapping[str, Any]: @@ -435,7 +434,7 @@ class Workflow(Base): # bug :return: hash """ - entity = {"graph": self.graph_dict, "features": self.features_dict} + entity = {"graph": self.graph_dict} return helper.generate_text_hash(json.dumps(entity, sort_keys=True)) diff --git a/api/pyproject.toml b/api/pyproject.toml index f55a4508a9..007fe076fc 100644 --- a/api/pyproject.toml +++ b/api/pyproject.toml @@ -22,6 +22,7 @@ dependencies = [ "flask-orjson~=2.0.0", "flask-sqlalchemy~=3.1.1", "gevent~=25.9.1", + "gevent-websocket~=0.10.1", "gmpy2~=2.2.1", "google-api-core==2.18.0", "google-api-python-client==2.90.0", @@ -33,7 +34,7 @@ dependencies = [ "httpx~=0.28.1", "python-socks>=2.4.4", "jieba==0.42.1", - "json-repair>=0.41.1", + "json-repair>=0.55.1", "jsonschema>=4.25.1", "langfuse~=2.51.3", "langsmith~=0.1.77", @@ -74,6 +75,7 @@ dependencies = [ "pypdfium2==5.2.0", "python-docx~=1.1.0", "python-dotenv==1.0.1", + "python-socketio~=5.13.0", "pyyaml~=6.0.1", "readabilipy~=0.3.0", "redis[hiredis]~=6.1.0", diff --git a/api/repositories/workflow_collaboration_repository.py b/api/repositories/workflow_collaboration_repository.py new file mode 100644 index 0000000000..000f80496d --- /dev/null +++ b/api/repositories/workflow_collaboration_repository.py @@ -0,0 +1,147 @@ +from __future__ import annotations + +import json +from typing import TypedDict + +from extensions.ext_redis import redis_client + +SESSION_STATE_TTL_SECONDS = 3600 +WORKFLOW_ONLINE_USERS_PREFIX = "workflow_online_users:" +WORKFLOW_LEADER_PREFIX = "workflow_leader:" +WS_SID_MAP_PREFIX = "ws_sid_map:" + + +class WorkflowSessionInfo(TypedDict): + user_id: str + username: str + avatar: str | None + sid: str + connected_at: int + + +class SidMapping(TypedDict): + workflow_id: str + user_id: str + + +class WorkflowCollaborationRepository: + def __init__(self) -> None: + self._redis = redis_client + + def 
__repr__(self) -> str: + return f"{self.__class__.__name__}(redis_client={self._redis})" + + @staticmethod + def workflow_key(workflow_id: str) -> str: + return f"{WORKFLOW_ONLINE_USERS_PREFIX}{workflow_id}" + + @staticmethod + def leader_key(workflow_id: str) -> str: + return f"{WORKFLOW_LEADER_PREFIX}{workflow_id}" + + @staticmethod + def sid_key(sid: str) -> str: + return f"{WS_SID_MAP_PREFIX}{sid}" + + @staticmethod + def _decode(value: str | bytes | None) -> str | None: + if value is None: + return None + if isinstance(value, bytes): + return value.decode("utf-8") + return value + + def refresh_session_state(self, workflow_id: str, sid: str) -> None: + workflow_key = self.workflow_key(workflow_id) + sid_key = self.sid_key(sid) + if self._redis.exists(workflow_key): + self._redis.expire(workflow_key, SESSION_STATE_TTL_SECONDS) + if self._redis.exists(sid_key): + self._redis.expire(sid_key, SESSION_STATE_TTL_SECONDS) + + def set_session_info(self, workflow_id: str, session_info: WorkflowSessionInfo) -> None: + workflow_key = self.workflow_key(workflow_id) + self._redis.hset(workflow_key, session_info["sid"], json.dumps(session_info)) + self._redis.set( + self.sid_key(session_info["sid"]), + json.dumps({"workflow_id": workflow_id, "user_id": session_info["user_id"]}), + ex=SESSION_STATE_TTL_SECONDS, + ) + self.refresh_session_state(workflow_id, session_info["sid"]) + + def get_sid_mapping(self, sid: str) -> SidMapping | None: + raw = self._redis.get(self.sid_key(sid)) + if not raw: + return None + value = self._decode(raw) + if not value: + return None + try: + return json.loads(value) + except (TypeError, json.JSONDecodeError): + return None + + def delete_session(self, workflow_id: str, sid: str) -> None: + self._redis.hdel(self.workflow_key(workflow_id), sid) + self._redis.delete(self.sid_key(sid)) + + def session_exists(self, workflow_id: str, sid: str) -> bool: + return bool(self._redis.hexists(self.workflow_key(workflow_id), sid)) + + def 
sid_mapping_exists(self, sid: str) -> bool: + return bool(self._redis.exists(self.sid_key(sid))) + + def get_session_sids(self, workflow_id: str) -> list[str]: + raw_sids = self._redis.hkeys(self.workflow_key(workflow_id)) + decoded_sids: list[str] = [] + for sid in raw_sids: + decoded = self._decode(sid) + if decoded: + decoded_sids.append(decoded) + return decoded_sids + + def list_sessions(self, workflow_id: str) -> list[WorkflowSessionInfo]: + sessions_json = self._redis.hgetall(self.workflow_key(workflow_id)) + users: list[WorkflowSessionInfo] = [] + + for session_info_json in sessions_json.values(): + value = self._decode(session_info_json) + if not value: + continue + try: + session_info = json.loads(value) + except (TypeError, json.JSONDecodeError): + continue + + if not isinstance(session_info, dict): + continue + if "user_id" not in session_info or "username" not in session_info or "sid" not in session_info: + continue + + users.append( + { + "user_id": str(session_info["user_id"]), + "username": str(session_info["username"]), + "avatar": session_info.get("avatar"), + "sid": str(session_info["sid"]), + "connected_at": int(session_info.get("connected_at") or 0), + } + ) + + return users + + def get_current_leader(self, workflow_id: str) -> str | None: + raw = self._redis.get(self.leader_key(workflow_id)) + return self._decode(raw) + + def set_leader_if_absent(self, workflow_id: str, sid: str) -> bool: + return bool(self._redis.set(self.leader_key(workflow_id), sid, nx=True, ex=SESSION_STATE_TTL_SECONDS)) + + def set_leader(self, workflow_id: str, sid: str) -> None: + self._redis.set(self.leader_key(workflow_id), sid, ex=SESSION_STATE_TTL_SECONDS) + + def delete_leader(self, workflow_id: str) -> None: + self._redis.delete(self.leader_key(workflow_id)) + + def expire_leader(self, workflow_id: str) -> None: + self._redis.expire(self.leader_key(workflow_id), SESSION_STATE_TTL_SECONDS) diff --git a/api/services/app_dsl_service.py 
b/api/services/app_dsl_service.py index acd2a25a86..da22464d39 100644 --- a/api/services/app_dsl_service.py +++ b/api/services/app_dsl_service.py @@ -521,12 +521,10 @@ class AppDslService: raise ValueError("Missing model_config for chat/agent-chat/completion app") # Initialize or update model config if not app.app_model_config: - app_model_config = AppModelConfig().from_model_config_dict(model_config) + app_model_config = AppModelConfig( + app_id=app.id, created_by=account.id, updated_by=account.id + ).from_model_config_dict(model_config) app_model_config.id = str(uuid4()) - app_model_config.app_id = app.id - app_model_config.created_by = account.id - app_model_config.updated_by = account.id - app.app_model_config_id = app_model_config.id self._session.add(app_model_config) diff --git a/api/services/app_service.py b/api/services/app_service.py index 02ebfbace0..af458ff618 100644 --- a/api/services/app_service.py +++ b/api/services/app_service.py @@ -150,10 +150,9 @@ class AppService: db.session.flush() if default_model_config: - app_model_config = AppModelConfig(**default_model_config) - app_model_config.app_id = app.id - app_model_config.created_by = account.id - app_model_config.updated_by = account.id + app_model_config = AppModelConfig( + **default_model_config, app_id=app.id, created_by=account.id, updated_by=account.id + ) db.session.add(app_model_config) db.session.flush() diff --git a/api/services/feature_service.py b/api/services/feature_service.py index b2fb3784e8..d81447fb1f 100644 --- a/api/services/feature_service.py +++ b/api/services/feature_service.py @@ -161,6 +161,7 @@ class SystemFeatureModel(BaseModel): enable_email_code_login: bool = False enable_email_password_login: bool = True enable_social_oauth_login: bool = False + enable_collaboration_mode: bool = False is_allow_register: bool = False is_allow_create_workspace: bool = False is_email_setup: bool = False @@ -224,6 +225,7 @@ class FeatureService: system_features.enable_email_code_login = 
dify_config.ENABLE_EMAIL_CODE_LOGIN system_features.enable_email_password_login = dify_config.ENABLE_EMAIL_PASSWORD_LOGIN system_features.enable_social_oauth_login = dify_config.ENABLE_SOCIAL_OAUTH_LOGIN + system_features.enable_collaboration_mode = dify_config.ENABLE_COLLABORATION_MODE system_features.is_allow_register = dify_config.ALLOW_REGISTER system_features.is_allow_create_workspace = dify_config.ALLOW_CREATE_WORKSPACE system_features.is_email_setup = dify_config.MAIL_TYPE is not None and dify_config.MAIL_TYPE != "" diff --git a/api/services/message_service.py b/api/services/message_service.py index e1a256e64d..a53ca8b22d 100644 --- a/api/services/message_service.py +++ b/api/services/message_service.py @@ -261,10 +261,9 @@ class MessageService: else: conversation_override_model_configs = json.loads(conversation.override_model_configs) app_model_config = AppModelConfig( - id=conversation.app_model_config_id, app_id=app_model.id, ) - + app_model_config.id = conversation.app_model_config_id app_model_config = app_model_config.from_model_config_dict(conversation_override_model_configs) if not app_model_config: raise ValueError("did not find app model config") diff --git a/api/services/workflow_collaboration_service.py b/api/services/workflow_collaboration_service.py new file mode 100644 index 0000000000..02d08fcaff --- /dev/null +++ b/api/services/workflow_collaboration_service.py @@ -0,0 +1,196 @@ +from __future__ import annotations + +import logging +import time +from collections.abc import Mapping + +from models.account import Account +from repositories.workflow_collaboration_repository import WorkflowCollaborationRepository, WorkflowSessionInfo + + +class WorkflowCollaborationService: + def __init__(self, repository: WorkflowCollaborationRepository, socketio) -> None: + self._repository = repository + self._socketio = socketio + + def __repr__(self) -> str: + return f"{self.__class__.__name__}(repository={self._repository})" + + def save_session(self, sid: 
str, user: Account) -> None: + self._socketio.save_session( + sid, + { + "user_id": user.id, + "username": user.name, + "avatar": user.avatar, + }, + ) + + def register_session(self, workflow_id: str, sid: str) -> tuple[str, bool] | None: + session = self._socketio.get_session(sid) + user_id = session.get("user_id") + if not user_id: + return None + + session_info: WorkflowSessionInfo = { + "user_id": str(user_id), + "username": str(session.get("username", "Unknown")), + "avatar": session.get("avatar"), + "sid": sid, + "connected_at": int(time.time()), + } + + self._repository.set_session_info(workflow_id, session_info) + + leader_sid = self.get_or_set_leader(workflow_id, sid) + is_leader = leader_sid == sid + + self._socketio.enter_room(sid, workflow_id) + self.broadcast_online_users(workflow_id) + + self._socketio.emit("status", {"isLeader": is_leader}, room=sid) + + return str(user_id), is_leader + + def disconnect_session(self, sid: str) -> None: + mapping = self._repository.get_sid_mapping(sid) + if not mapping: + return + + workflow_id = mapping["workflow_id"] + self._repository.delete_session(workflow_id, sid) + + self.handle_leader_disconnect(workflow_id, sid) + self.broadcast_online_users(workflow_id) + + def relay_collaboration_event(self, sid: str, data: Mapping[str, object]) -> tuple[dict[str, str], int]: + mapping = self._repository.get_sid_mapping(sid) + if not mapping: + return {"msg": "unauthorized"}, 401 + + workflow_id = mapping["workflow_id"] + user_id = mapping["user_id"] + self.refresh_session_state(workflow_id, sid) + + event_type = data.get("type") + event_data = data.get("data") + timestamp = data.get("timestamp", int(time.time())) + + if not event_type: + return {"msg": "invalid event type"}, 400 + + self._socketio.emit( + "collaboration_update", + {"type": event_type, "userId": user_id, "data": event_data, "timestamp": timestamp}, + room=workflow_id, + skip_sid=sid, + ) + + return {"msg": "event_broadcasted"}, 200 + + def 
relay_graph_event(self, sid: str, data: object) -> tuple[dict[str, str], int]: + mapping = self._repository.get_sid_mapping(sid) + if not mapping: + return {"msg": "unauthorized"}, 401 + + workflow_id = mapping["workflow_id"] + self.refresh_session_state(workflow_id, sid) + + self._socketio.emit("graph_update", data, room=workflow_id, skip_sid=sid) + + return {"msg": "graph_update_broadcasted"}, 200 + + def get_or_set_leader(self, workflow_id: str, sid: str) -> str: + current_leader = self._repository.get_current_leader(workflow_id) + + if current_leader: + if self.is_session_active(workflow_id, current_leader): + return current_leader + self._repository.delete_session(workflow_id, current_leader) + self._repository.delete_leader(workflow_id) + + was_set = self._repository.set_leader_if_absent(workflow_id, sid) + + if was_set: + if current_leader: + self.broadcast_leader_change(workflow_id, sid) + return sid + + current_leader = self._repository.get_current_leader(workflow_id) + if current_leader: + return current_leader + + return sid + + def handle_leader_disconnect(self, workflow_id: str, disconnected_sid: str) -> None: + current_leader = self._repository.get_current_leader(workflow_id) + if not current_leader: + return + + if current_leader != disconnected_sid: + return + + session_sids = self._repository.get_session_sids(workflow_id) + if session_sids: + new_leader_sid = session_sids[0] + self._repository.set_leader(workflow_id, new_leader_sid) + self.broadcast_leader_change(workflow_id, new_leader_sid) + else: + self._repository.delete_leader(workflow_id) + + def broadcast_leader_change(self, workflow_id: str, new_leader_sid: str) -> None: + for sid in self._repository.get_session_sids(workflow_id): + try: + is_leader = sid == new_leader_sid + self._socketio.emit("status", {"isLeader": is_leader}, room=sid) + except Exception: + logging.exception("Failed to emit leader status to session %s", sid) + + def get_current_leader(self, workflow_id: str) -> str | 
None: + return self._repository.get_current_leader(workflow_id) + + def broadcast_online_users(self, workflow_id: str) -> None: + users = self._repository.list_sessions(workflow_id) + users.sort(key=lambda x: x.get("connected_at") or 0) + + leader_sid = self.get_current_leader(workflow_id) + + self._socketio.emit( + "online_users", + {"workflow_id": workflow_id, "users": users, "leader": leader_sid}, + room=workflow_id, + ) + + def refresh_session_state(self, workflow_id: str, sid: str) -> None: + self._repository.refresh_session_state(workflow_id, sid) + self._ensure_leader(workflow_id, sid) + + def _ensure_leader(self, workflow_id: str, sid: str) -> None: + current_leader = self._repository.get_current_leader(workflow_id) + if current_leader and self.is_session_active(workflow_id, current_leader): + self._repository.expire_leader(workflow_id) + return + + if current_leader: + self._repository.delete_leader(workflow_id) + + self._repository.set_leader(workflow_id, sid) + self.broadcast_leader_change(workflow_id, sid) + + def is_session_active(self, workflow_id: str, sid: str) -> bool: + if not sid: + return False + + try: + if not self._socketio.manager.is_connected(sid, "/"): + return False + except AttributeError: + return False + + if not self._repository.session_exists(workflow_id, sid): + return False + + if not self._repository.sid_mapping_exists(sid): + return False + + return True diff --git a/api/services/workflow_comment_service.py b/api/services/workflow_comment_service.py new file mode 100644 index 0000000000..40e0df7e87 --- /dev/null +++ b/api/services/workflow_comment_service.py @@ -0,0 +1,345 @@ +import logging +from collections.abc import Sequence + +from sqlalchemy import desc, select +from sqlalchemy.orm import Session, selectinload +from werkzeug.exceptions import Forbidden, NotFound + +from extensions.ext_database import db +from libs.datetime_utils import naive_utc_now +from libs.helper import uuid_value +from models import WorkflowComment, 
WorkflowCommentMention, WorkflowCommentReply +from models.account import Account + +logger = logging.getLogger(__name__) + + +class WorkflowCommentService: + """Service for managing workflow comments.""" + + @staticmethod + def _validate_content(content: str) -> None: + if len(content.strip()) == 0: + raise ValueError("Comment content cannot be empty") + + if len(content) > 1000: + raise ValueError("Comment content cannot exceed 1000 characters") + + @staticmethod + def get_comments(tenant_id: str, app_id: str) -> Sequence[WorkflowComment]: + """Get all comments for a workflow.""" + with Session(db.engine) as session: + # Get all comments with eager loading + stmt = ( + select(WorkflowComment) + .options(selectinload(WorkflowComment.replies), selectinload(WorkflowComment.mentions)) + .where(WorkflowComment.tenant_id == tenant_id, WorkflowComment.app_id == app_id) + .order_by(desc(WorkflowComment.created_at)) + ) + + comments = session.scalars(stmt).all() + + # Batch preload all Account objects to avoid N+1 queries + WorkflowCommentService._preload_accounts(session, comments) + + return comments + + @staticmethod + def _preload_accounts(session: Session, comments: Sequence[WorkflowComment]) -> None: + """Batch preload Account objects for comments, replies, and mentions.""" + # Collect all user IDs + user_ids: set[str] = set() + for comment in comments: + user_ids.add(comment.created_by) + if comment.resolved_by: + user_ids.add(comment.resolved_by) + user_ids.update(reply.created_by for reply in comment.replies) + user_ids.update(mention.mentioned_user_id for mention in comment.mentions) + + if not user_ids: + return + + # Batch query all accounts + accounts = session.scalars(select(Account).where(Account.id.in_(user_ids))).all() + account_map = {str(account.id): account for account in accounts} + + # Cache accounts on objects + for comment in comments: + comment.cache_created_by_account(account_map.get(comment.created_by)) + 
+            comment.cache_resolved_by_account(account_map.get(comment.resolved_by) if comment.resolved_by else None)
+            for reply in comment.replies:
+                reply.cache_created_by_account(account_map.get(reply.created_by))
+            for mention in comment.mentions:
+                mention.cache_mentioned_user_account(account_map.get(mention.mentioned_user_id))
+
+    @staticmethod
+    def get_comment(tenant_id: str, app_id: str, comment_id: str, session: Session | None = None) -> WorkflowComment:
+        """Get a specific comment."""
+
+        def _get_comment(session: Session) -> WorkflowComment:
+            stmt = (
+                select(WorkflowComment)
+                .options(selectinload(WorkflowComment.replies), selectinload(WorkflowComment.mentions))
+                .where(
+                    WorkflowComment.id == comment_id,
+                    WorkflowComment.tenant_id == tenant_id,
+                    WorkflowComment.app_id == app_id,
+                )
+            )
+            comment = session.scalar(stmt)
+
+            if not comment:
+                raise NotFound("Comment not found")
+
+            # Preload accounts to avoid N+1 queries
+            WorkflowCommentService._preload_accounts(session, [comment])
+
+            return comment
+
+        if session is not None:
+            return _get_comment(session)
+        else:
+            with Session(db.engine, expire_on_commit=False) as session:
+                return _get_comment(session)
+
+    @staticmethod
+    def create_comment(
+        tenant_id: str,
+        app_id: str,
+        created_by: str,
+        content: str,
+        position_x: float,
+        position_y: float,
+        mentioned_user_ids: list[str] | None = None,
+    ) -> dict:
+        """Create a new workflow comment."""
+        WorkflowCommentService._validate_content(content)
+
+        with Session(db.engine) as session:
+            comment = WorkflowComment(
+                tenant_id=tenant_id,
+                app_id=app_id,
+                position_x=position_x,
+                position_y=position_y,
+                content=content,
+                created_by=created_by,
+            )
+
+            session.add(comment)
+            session.flush()  # Get the comment ID for mentions
+
+            # Create mentions if specified
+            mentioned_user_ids = mentioned_user_ids or []
+            for user_id in mentioned_user_ids:
+                if isinstance(user_id, str) and uuid_value(user_id):
+                    mention = WorkflowCommentMention(
+                        comment_id=comment.id,
+                        reply_id=None,  # This is a comment mention, not reply mention
+                        mentioned_user_id=user_id,
+                    )
+                    session.add(mention)
+
+            session.commit()
+
+            # Return only what we need - id and created_at
+            return {"id": comment.id, "created_at": comment.created_at}
+
+    @staticmethod
+    def update_comment(
+        tenant_id: str,
+        app_id: str,
+        comment_id: str,
+        user_id: str,
+        content: str,
+        position_x: float | None = None,
+        position_y: float | None = None,
+        mentioned_user_ids: list[str] | None = None,
+    ) -> dict:
+        """Update a workflow comment."""
+        WorkflowCommentService._validate_content(content)
+
+        with Session(db.engine, expire_on_commit=False) as session:
+            # Get comment with validation
+            stmt = select(WorkflowComment).where(
+                WorkflowComment.id == comment_id,
+                WorkflowComment.tenant_id == tenant_id,
+                WorkflowComment.app_id == app_id,
+            )
+            comment = session.scalar(stmt)
+
+            if not comment:
+                raise NotFound("Comment not found")
+
+            # Only the creator can update the comment
+            if comment.created_by != user_id:
+                raise Forbidden("Only the comment creator can update it")
+
+            # Update comment fields
+            comment.content = content
+            if position_x is not None:
+                comment.position_x = position_x
+            if position_y is not None:
+                comment.position_y = position_y
+
+            # Update mentions - first remove existing mentions for this comment only (not replies)
+            existing_mentions = session.scalars(
+                select(WorkflowCommentMention).where(
+                    WorkflowCommentMention.comment_id == comment.id,
+                    WorkflowCommentMention.reply_id.is_(None),  # Only comment mentions, not reply mentions
+                )
+            ).all()
+            for mention in existing_mentions:
+                session.delete(mention)
+
+            # Add new mentions
+            mentioned_user_ids = mentioned_user_ids or []
+            for user_id_str in mentioned_user_ids:
+                if isinstance(user_id_str, str) and uuid_value(user_id_str):
+                    mention = WorkflowCommentMention(
+                        comment_id=comment.id,
+                        reply_id=None,  # This is a comment mention
+                        mentioned_user_id=user_id_str,
+                    )
+                    session.add(mention)
+
+            session.commit()
+
+            return {"id": comment.id, "updated_at": comment.updated_at}
+
+    @staticmethod
+    def delete_comment(tenant_id: str, app_id: str, comment_id: str, user_id: str) -> None:
+        """Delete a workflow comment."""
+        with Session(db.engine, expire_on_commit=False) as session:
+            comment = WorkflowCommentService.get_comment(tenant_id, app_id, comment_id, session)
+
+            # Only the creator can delete the comment
+            if comment.created_by != user_id:
+                raise Forbidden("Only the comment creator can delete it")
+
+            # Delete associated mentions (both comment and reply mentions)
+            mentions = session.scalars(
+                select(WorkflowCommentMention).where(WorkflowCommentMention.comment_id == comment_id)
+            ).all()
+            for mention in mentions:
+                session.delete(mention)
+
+            # Delete associated replies
+            replies = session.scalars(
+                select(WorkflowCommentReply).where(WorkflowCommentReply.comment_id == comment_id)
+            ).all()
+            for reply in replies:
+                session.delete(reply)
+
+            session.delete(comment)
+            session.commit()
+
+    @staticmethod
+    def resolve_comment(tenant_id: str, app_id: str, comment_id: str, user_id: str) -> WorkflowComment:
+        """Resolve a workflow comment."""
+        with Session(db.engine, expire_on_commit=False) as session:
+            comment = WorkflowCommentService.get_comment(tenant_id, app_id, comment_id, session)
+            if comment.resolved:
+                return comment
+
+            comment.resolved = True
+            comment.resolved_at = naive_utc_now()
+            comment.resolved_by = user_id
+            session.commit()
+
+            return comment
+
+    @staticmethod
+    def create_reply(
+        comment_id: str, content: str, created_by: str, mentioned_user_ids: list[str] | None = None
+    ) -> dict:
+        """Add a reply to a workflow comment."""
+        WorkflowCommentService._validate_content(content)
+
+        with Session(db.engine, expire_on_commit=False) as session:
+            # Check if comment exists
+            comment = session.get(WorkflowComment, comment_id)
+            if not comment:
+                raise NotFound("Comment not found")
+
+            reply = WorkflowCommentReply(comment_id=comment_id, content=content, created_by=created_by)
+
+            session.add(reply)
+            session.flush()  # Get the reply ID for mentions
+
+            # Create mentions if specified
+            mentioned_user_ids = mentioned_user_ids or []
+            for user_id in mentioned_user_ids:
+                if isinstance(user_id, str) and uuid_value(user_id):
+                    # Create mention linking to specific reply
+                    mention = WorkflowCommentMention(
+                        comment_id=comment_id, reply_id=reply.id, mentioned_user_id=user_id
+                    )
+                    session.add(mention)
+
+            session.commit()
+
+            return {"id": reply.id, "created_at": reply.created_at}
+
+    @staticmethod
+    def update_reply(reply_id: str, user_id: str, content: str, mentioned_user_ids: list[str] | None = None) -> dict:
+        """Update a comment reply."""
+        WorkflowCommentService._validate_content(content)
+
+        with Session(db.engine, expire_on_commit=False) as session:
+            reply = session.get(WorkflowCommentReply, reply_id)
+            if not reply:
+                raise NotFound("Reply not found")
+
+            # Only the creator can update the reply
+            if reply.created_by != user_id:
+                raise Forbidden("Only the reply creator can update it")
+
+            reply.content = content
+
+            # Update mentions - first remove existing mentions for this reply
+            existing_mentions = session.scalars(
+                select(WorkflowCommentMention).where(WorkflowCommentMention.reply_id == reply.id)
+            ).all()
+            for mention in existing_mentions:
+                session.delete(mention)
+
+            # Add mentions
+            mentioned_user_ids = mentioned_user_ids or []
+            for user_id_str in mentioned_user_ids:
+                if isinstance(user_id_str, str) and uuid_value(user_id_str):
+                    mention = WorkflowCommentMention(
+                        comment_id=reply.comment_id, reply_id=reply.id, mentioned_user_id=user_id_str
+                    )
+                    session.add(mention)
+
+            session.commit()
+            session.refresh(reply)  # Refresh to get updated timestamp
+
+            return {"id": reply.id, "updated_at": reply.updated_at}
+
+    @staticmethod
+    def delete_reply(reply_id: str, user_id: str) -> None:
+        """Delete a comment reply."""
+        with Session(db.engine, expire_on_commit=False) as session:
+            reply = session.get(WorkflowCommentReply, reply_id)
+            if not reply:
+                raise NotFound("Reply not found")
+
+            # Only the creator can delete the reply
+            if reply.created_by != user_id:
+                raise Forbidden("Only the reply creator can delete it")
+
+            # Delete associated mentions first
+            mentions = session.scalars(
+                select(WorkflowCommentMention).where(WorkflowCommentMention.reply_id == reply_id)
+            ).all()
+            for mention in mentions:
+                session.delete(mention)
+
+            session.delete(reply)
+            session.commit()
+
+    @staticmethod
+    def validate_comment_access(comment_id: str, tenant_id: str, app_id: str) -> WorkflowComment:
+        """Validate that a comment belongs to the specified tenant and app."""
+        return WorkflowCommentService.get_comment(tenant_id, app_id, comment_id)
diff --git a/api/services/workflow_service.py b/api/services/workflow_service.py
index b9b155ce80..4ab91be0ce 100644
--- a/api/services/workflow_service.py
+++ b/api/services/workflow_service.py
@@ -254,6 +254,78 @@ class WorkflowService:
         # return draft workflow
         return workflow
 
+    def update_draft_workflow_environment_variables(
+        self,
+        *,
+        app_model: App,
+        environment_variables: Sequence[VariableBase],
+        account: Account,
+    ):
+        """
+        Update draft workflow environment variables
+        """
+        # fetch draft workflow by app_model
+        workflow = self.get_draft_workflow(app_model=app_model)
+
+        if not workflow:
+            raise ValueError("No draft workflow found.")
+
+        workflow.environment_variables = environment_variables
+        workflow.updated_by = account.id
+        workflow.updated_at = naive_utc_now()
+
+        # commit db session changes
+        db.session.commit()
+
+    def update_draft_workflow_conversation_variables(
+        self,
+        *,
+        app_model: App,
+        conversation_variables: Sequence[VariableBase],
+        account: Account,
+    ):
+        """
+        Update draft workflow conversation variables
+        """
+        # fetch draft workflow by app_model
+        workflow = self.get_draft_workflow(app_model=app_model)
+
+        if not workflow:
+            raise ValueError("No draft workflow found.")
+
+        workflow.conversation_variables = conversation_variables
+        workflow.updated_by = account.id
+        workflow.updated_at = naive_utc_now()
+
+        # commit db session changes
+        db.session.commit()
+
+    def update_draft_workflow_features(
+        self,
+        *,
+        app_model: App,
+        features: dict,
+        account: Account,
+    ):
+        """
+        Update draft workflow features
+        """
+        # fetch draft workflow by app_model
+        workflow = self.get_draft_workflow(app_model=app_model)
+
+        if not workflow:
+            raise ValueError("No draft workflow found.")
+
+        # validate features structure
+        self.validate_features_structure(app_model=app_model, features=features)
+
+        workflow.features = json.dumps(features)
+        workflow.updated_by = account.id
+        workflow.updated_at = naive_utc_now()
+
+        # commit db session changes
+        db.session.commit()
+
     def publish_workflow(
         self,
         *,
diff --git a/api/tests/integration_tests/conftest.py b/api/tests/integration_tests/conftest.py
index 948cf8b3a0..7ff56eee91 100644
--- a/api/tests/integration_tests/conftest.py
+++ b/api/tests/integration_tests/conftest.py
@@ -38,7 +38,7 @@ os.environ["OPENDAL_FS_ROOT"] = "/tmp/dify-storage"
 os.environ.setdefault("STORAGE_TYPE", "opendal")
 os.environ.setdefault("OPENDAL_SCHEME", "fs")
 
-_CACHED_APP = create_app()
+_SIO_APP, _CACHED_APP = create_app()
 
 
 @pytest.fixture
diff --git a/api/tests/test_containers_integration_tests/conftest.py b/api/tests/test_containers_integration_tests/conftest.py
index d6d2d30305..b67b48947c 100644
--- a/api/tests/test_containers_integration_tests/conftest.py
+++ b/api/tests/test_containers_integration_tests/conftest.py
@@ -364,7 +364,7 @@ def _create_app_with_containers() -> Flask:
     # Create and configure the Flask application
     logger.info("Initializing Flask application...")
-    app = create_app()
+    sio_app, app = create_app()
     logger.info("Flask application created successfully")
 
     # Initialize database schema
diff --git a/api/tests/test_containers_integration_tests/services/test_agent_service.py b/api/tests/test_containers_integration_tests/services/test_agent_service.py
index a22d6f8fbf..6eedbd6cfa 100644
--- a/api/tests/test_containers_integration_tests/services/test_agent_service.py
+++ b/api/tests/test_containers_integration_tests/services/test_agent_service.py
@@ -172,7 +172,6 @@ class TestAgentService:
             # Create app model config
             app_model_config = AppModelConfig(
-                id=fake.uuid4(),
                 app_id=app.id,
                 provider="openai",
                 model_id="gpt-3.5-turbo",
@@ -180,6 +179,7 @@ class TestAgentService:
                 model="gpt-3.5-turbo",
                 agent_mode=json.dumps({"enabled": True, "strategy": "react", "tools": []}),
             )
+            app_model_config.id = fake.uuid4()
             db.session.add(app_model_config)
             db.session.commit()
@@ -413,7 +413,6 @@ class TestAgentService:
             # Create app model config
             app_model_config = AppModelConfig(
-                id=fake.uuid4(),
                 app_id=app.id,
                 provider="openai",
                 model_id="gpt-3.5-turbo",
@@ -421,6 +420,7 @@ class TestAgentService:
                 model="gpt-3.5-turbo",
                 agent_mode=json.dumps({"enabled": True, "strategy": "react", "tools": []}),
             )
+            app_model_config.id = fake.uuid4()
             db.session.add(app_model_config)
             db.session.commit()
@@ -485,7 +485,6 @@ class TestAgentService:
             # Create app model config
             app_model_config = AppModelConfig(
-                id=fake.uuid4(),
                 app_id=app.id,
                 provider="openai",
                 model_id="gpt-3.5-turbo",
@@ -493,6 +492,7 @@ class TestAgentService:
                 model="gpt-3.5-turbo",
                 agent_mode=json.dumps({"enabled": True, "strategy": "react", "tools": []}),
             )
+            app_model_config.id = fake.uuid4()
             db.session.add(app_model_config)
             db.session.commit()
diff --git a/api/tests/test_containers_integration_tests/services/test_app_dsl_service.py b/api/tests/test_containers_integration_tests/services/test_app_dsl_service.py
index 119f92d772..e2a450b90c 100644
--- a/api/tests/test_containers_integration_tests/services/test_app_dsl_service.py
+++ b/api/tests/test_containers_integration_tests/services/test_app_dsl_service.py
@@ -226,26 +226,27 @@ class TestAppDslService:
         app, account = self._create_test_app_and_account(db_session_with_containers, mock_external_service_dependencies)
 
         # Create model config for the app
-        model_config = AppModelConfig()
-        model_config.id = fake.uuid4()
-        model_config.app_id = app.id
-        model_config.provider = "openai"
-        model_config.model_id = "gpt-3.5-turbo"
-        model_config.model = json.dumps(
-            {
-                "provider": "openai",
-                "name": "gpt-3.5-turbo",
-                "mode": "chat",
-                "completion_params": {
-                    "max_tokens": 1000,
-                    "temperature": 0.7,
-                },
-            }
+        model_config = AppModelConfig(
+            app_id=app.id,
+            provider="openai",
+            model_id="gpt-3.5-turbo",
+            model=json.dumps(
+                {
+                    "provider": "openai",
+                    "name": "gpt-3.5-turbo",
+                    "mode": "chat",
+                    "completion_params": {
+                        "max_tokens": 1000,
+                        "temperature": 0.7,
+                    },
+                }
+            ),
+            pre_prompt="You are a helpful assistant.",
+            prompt_type="simple",
+            created_by=account.id,
+            updated_by=account.id,
         )
-        model_config.pre_prompt = "You are a helpful assistant."
-        model_config.prompt_type = "simple"
-        model_config.created_by = account.id
-        model_config.updated_by = account.id
+        model_config.id = fake.uuid4()
 
         # Set the app_model_config_id to link the config
         app.app_model_config_id = model_config.id
diff --git a/api/tests/test_containers_integration_tests/services/test_feature_service.py b/api/tests/test_containers_integration_tests/services/test_feature_service.py
index bd2fd14ffa..bc3b60d778 100644
--- a/api/tests/test_containers_integration_tests/services/test_feature_service.py
+++ b/api/tests/test_containers_integration_tests/services/test_feature_service.py
@@ -274,6 +274,7 @@ class TestFeatureService:
         mock_config.ENABLE_EMAIL_CODE_LOGIN = True
         mock_config.ENABLE_EMAIL_PASSWORD_LOGIN = True
         mock_config.ENABLE_SOCIAL_OAUTH_LOGIN = False
+        mock_config.ENABLE_COLLABORATION_MODE = True
         mock_config.ALLOW_REGISTER = False
         mock_config.ALLOW_CREATE_WORKSPACE = False
         mock_config.MAIL_TYPE = "smtp"
@@ -298,6 +299,7 @@ class TestFeatureService:
         # Verify authentication settings
         assert result.enable_email_code_login is True
         assert result.enable_email_password_login is False
+        assert result.enable_collaboration_mode is True
         assert result.is_allow_register is False
         assert result.is_allow_create_workspace is False
@@ -402,6 +404,7 @@ class TestFeatureService:
         mock_config.ENABLE_EMAIL_CODE_LOGIN = True
         mock_config.ENABLE_EMAIL_PASSWORD_LOGIN = True
         mock_config.ENABLE_SOCIAL_OAUTH_LOGIN = False
+        mock_config.ENABLE_COLLABORATION_MODE = False
        mock_config.ALLOW_REGISTER = True
         mock_config.ALLOW_CREATE_WORKSPACE = True
         mock_config.MAIL_TYPE = "smtp"
@@ -423,6 +426,7 @@ class TestFeatureService:
         assert result.enable_email_code_login is True
         assert result.enable_email_password_login is True
         assert result.enable_social_oauth_login is False
+        assert result.enable_collaboration_mode is False
         assert result.is_allow_register is True
         assert result.is_allow_create_workspace is True
         assert result.is_email_setup is True
diff --git a/api/tests/test_containers_integration_tests/services/test_workflow_service.py b/api/tests/test_containers_integration_tests/services/test_workflow_service.py
index 88c6313f64..cb691d5c3d 100644
--- a/api/tests/test_containers_integration_tests/services/test_workflow_service.py
+++ b/api/tests/test_containers_integration_tests/services/test_workflow_service.py
@@ -925,24 +925,24 @@ class TestWorkflowService:
         # Create app model config (required for conversion)
         from models.model import AppModelConfig
 
-        app_model_config = AppModelConfig()
-        app_model_config.id = fake.uuid4()
-        app_model_config.app_id = app.id
-        app_model_config.tenant_id = app.tenant_id
-        app_model_config.provider = "openai"
-        app_model_config.model_id = "gpt-3.5-turbo"
-        # Set the model field directly - this is what model_dict property returns
-        app_model_config.model = json.dumps(
-            {
-                "provider": "openai",
-                "name": "gpt-3.5-turbo",
-                "completion_params": {"max_tokens": 1000, "temperature": 0.7},
-            }
+        app_model_config = AppModelConfig(
+            app_id=app.id,
+            provider="openai",
+            model_id="gpt-3.5-turbo",
+            # Set the model field directly - this is what model_dict property returns
+            model=json.dumps(
+                {
+                    "provider": "openai",
+                    "name": "gpt-3.5-turbo",
+                    "completion_params": {"max_tokens": 1000, "temperature": 0.7},
+                }
+            ),
+            # Set pre_prompt for PromptTemplateConfigManager
+            pre_prompt="You are a helpful assistant.",
+            created_by=account.id,
+            updated_by=account.id,
         )
-        # Set pre_prompt for PromptTemplateConfigManager
-        app_model_config.pre_prompt = "You are a helpful assistant."
-        app_model_config.created_by = account.id
-        app_model_config.updated_by = account.id
+        app_model_config.id = fake.uuid4()
 
         from extensions.ext_database import db
@@ -987,24 +987,24 @@ class TestWorkflowService:
         # Create app model config (required for conversion)
         from models.model import AppModelConfig
 
-        app_model_config = AppModelConfig()
-        app_model_config.id = fake.uuid4()
-        app_model_config.app_id = app.id
-        app_model_config.tenant_id = app.tenant_id
-        app_model_config.provider = "openai"
-        app_model_config.model_id = "gpt-3.5-turbo"
-        # Set the model field directly - this is what model_dict property returns
-        app_model_config.model = json.dumps(
-            {
-                "provider": "openai",
-                "name": "gpt-3.5-turbo",
-                "completion_params": {"max_tokens": 1000, "temperature": 0.7},
-            }
+        app_model_config = AppModelConfig(
+            app_id=app.id,
+            provider="openai",
+            model_id="gpt-3.5-turbo",
+            # Set the model field directly - this is what model_dict property returns
+            model=json.dumps(
+                {
+                    "provider": "openai",
+                    "name": "gpt-3.5-turbo",
+                    "completion_params": {"max_tokens": 1000, "temperature": 0.7},
+                }
+            ),
+            # Set pre_prompt for PromptTemplateConfigManager
+            pre_prompt="Complete the following text:",
+            created_by=account.id,
+            updated_by=account.id,
         )
-        # Set pre_prompt for PromptTemplateConfigManager
-        app_model_config.pre_prompt = "Complete the following text:"
-        app_model_config.created_by = account.id
-        app_model_config.updated_by = account.id
+        app_model_config.id = fake.uuid4()
 
         from extensions.ext_database import db
diff --git a/api/tests/unit_tests/core/workflow/nodes/http_request/test_http_request_executor.py b/api/tests/unit_tests/core/workflow/nodes/http_request/test_http_request_executor.py
index 27df938102..2a9db2d328 100644
--- a/api/tests/unit_tests/core/workflow/nodes/http_request/test_http_request_executor.py
+++ b/api/tests/unit_tests/core/workflow/nodes/http_request/test_http_request_executor.py
@@ -475,3 +475,130 @@ def test_valid_api_key_works():
     headers = executor._assembling_headers()
     assert "Authorization" in headers
     assert headers["Authorization"] == "Bearer valid-api-key-123"
+
+
+def test_executor_with_json_body_and_unquoted_uuid_variable():
+    """Test that unquoted UUID variables are correctly handled in JSON body.
+
+    This test verifies the fix for issue #31436 where json_repair would truncate
+    certain UUID patterns (like 57eeeeb1-...) when they appeared as unquoted values.
+    """
+    # UUID that triggers the json_repair truncation bug
+    test_uuid = "57eeeeb1-450b-482c-81b9-4be77e95dee2"
+
+    variable_pool = VariablePool(
+        system_variables=SystemVariable.empty(),
+        user_inputs={},
+    )
+    variable_pool.add(["pre_node_id", "uuid"], test_uuid)
+
+    node_data = HttpRequestNodeData(
+        title="Test JSON Body with Unquoted UUID Variable",
+        method="post",
+        url="https://api.example.com/data",
+        authorization=HttpRequestNodeAuthorization(type="no-auth"),
+        headers="Content-Type: application/json",
+        params="",
+        body=HttpRequestNodeBody(
+            type="json",
+            data=[
+                BodyData(
+                    key="",
+                    type="text",
+                    # UUID variable without quotes - this is the problematic case
+                    value='{"rowId": {{#pre_node_id.uuid#}}}',
+                )
+            ],
+        ),
+    )
+
+    executor = Executor(
+        node_data=node_data,
+        timeout=HttpRequestNodeTimeout(connect=10, read=30, write=30),
+        variable_pool=variable_pool,
+    )
+
+    # The UUID should be preserved in full, not truncated
+    assert executor.json == {"rowId": test_uuid}
+    assert len(executor.json["rowId"]) == len(test_uuid)
+
+
+def test_executor_with_json_body_and_unquoted_uuid_with_newlines():
+    """Test that unquoted UUID variables with newlines in JSON are handled correctly.
+
+    This is a specific case from issue #31436 where the JSON body contains newlines.
+    """
+    test_uuid = "57eeeeb1-450b-482c-81b9-4be77e95dee2"
+
+    variable_pool = VariablePool(
+        system_variables=SystemVariable.empty(),
+        user_inputs={},
+    )
+    variable_pool.add(["pre_node_id", "uuid"], test_uuid)
+
+    node_data = HttpRequestNodeData(
+        title="Test JSON Body with Unquoted UUID and Newlines",
+        method="post",
+        url="https://api.example.com/data",
+        authorization=HttpRequestNodeAuthorization(type="no-auth"),
+        headers="Content-Type: application/json",
+        params="",
+        body=HttpRequestNodeBody(
+            type="json",
+            data=[
+                BodyData(
+                    key="",
+                    type="text",
+                    # JSON with newlines and unquoted UUID variable
+                    value='{\n"rowId": {{#pre_node_id.uuid#}}\n}',
+                )
+            ],
+        ),
+    )
+
+    executor = Executor(
+        node_data=node_data,
+        timeout=HttpRequestNodeTimeout(connect=10, read=30, write=30),
+        variable_pool=variable_pool,
+    )
+
+    # The UUID should be preserved in full
+    assert executor.json == {"rowId": test_uuid}
+
+
+def test_executor_with_json_body_preserves_numbers_and_strings():
+    """Test that numbers are preserved and string values are properly quoted."""
+    variable_pool = VariablePool(
+        system_variables=SystemVariable.empty(),
+        user_inputs={},
+    )
+    variable_pool.add(["node", "count"], 42)
+    variable_pool.add(["node", "id"], "abc-123")
+
+    node_data = HttpRequestNodeData(
+        title="Test JSON Body with mixed types",
+        method="post",
+        url="https://api.example.com/data",
+        authorization=HttpRequestNodeAuthorization(type="no-auth"),
+        headers="",
+        params="",
+        body=HttpRequestNodeBody(
+            type="json",
+            data=[
+                BodyData(
+                    key="",
+                    type="text",
+                    value='{"count": {{#node.count#}}, "id": {{#node.id#}}}',
+                )
+            ],
+        ),
+    )
+
+    executor = Executor(
+        node_data=node_data,
+        timeout=HttpRequestNodeTimeout(connect=10, read=30, write=30),
+        variable_pool=variable_pool,
+    )
+
+    assert executor.json["count"] == 42
+    assert executor.json["id"] == "abc-123"
diff --git a/api/tests/unit_tests/repositories/test_workflow_collaboration_repository.py b/api/tests/unit_tests/repositories/test_workflow_collaboration_repository.py
new file mode 100644
index 0000000000..1f47e8b692
--- /dev/null
+++ b/api/tests/unit_tests/repositories/test_workflow_collaboration_repository.py
@@ -0,0 +1,121 @@
+import json
+from unittest.mock import Mock
+
+import pytest
+
+from repositories import workflow_collaboration_repository as repo_module
+from repositories.workflow_collaboration_repository import WorkflowCollaborationRepository
+
+
+class TestWorkflowCollaborationRepository:
+    @pytest.fixture
+    def mock_redis(self, monkeypatch: pytest.MonkeyPatch) -> Mock:
+        mock_redis = Mock()
+        monkeypatch.setattr(repo_module, "redis_client", mock_redis)
+        return mock_redis
+
+    def test_get_sid_mapping_returns_mapping(self, mock_redis: Mock) -> None:
+        # Arrange
+        mock_redis.get.return_value = b'{"workflow_id":"wf-1","user_id":"u-1"}'
+        repository = WorkflowCollaborationRepository()
+
+        # Act
+        result = repository.get_sid_mapping("sid-1")
+
+        # Assert
+        assert result == {"workflow_id": "wf-1", "user_id": "u-1"}
+
+    def test_list_sessions_filters_invalid_entries(self, mock_redis: Mock) -> None:
+        # Arrange
+        mock_redis.hgetall.return_value = {
+            b"sid-1": b'{"user_id":"u-1","username":"Jane","sid":"sid-1","connected_at":2}',
+            b"sid-2": b'{"username":"Missing","sid":"sid-2"}',
+            b"sid-3": b"not-json",
+        }
+        repository = WorkflowCollaborationRepository()
+
+        # Act
+        result = repository.list_sessions("wf-1")
+
+        # Assert
+        assert result == [
+            {
+                "user_id": "u-1",
+                "username": "Jane",
+                "avatar": None,
+                "sid": "sid-1",
+                "connected_at": 2,
+            }
+        ]
+
+    def test_set_session_info_persists_payload(self, mock_redis: Mock) -> None:
+        # Arrange
+        mock_redis.exists.return_value = True
+        repository = WorkflowCollaborationRepository()
+        payload = {
+            "user_id": "u-1",
+            "username": "Jane",
+            "avatar": None,
+            "sid": "sid-1",
+            "connected_at": 1,
+        }
+
+        # Act
+        repository.set_session_info("wf-1", payload)
+
+        # Assert
+        assert mock_redis.hset.called
+        workflow_key, sid, session_json = mock_redis.hset.call_args.args
+        assert workflow_key == "workflow_online_users:wf-1"
+        assert sid == "sid-1"
+        assert json.loads(session_json)["user_id"] == "u-1"
+        assert mock_redis.set.called
+
+    def test_refresh_session_state_expires_keys(self, mock_redis: Mock) -> None:
+        # Arrange
+        mock_redis.exists.return_value = True
+        repository = WorkflowCollaborationRepository()
+
+        # Act
+        repository.refresh_session_state("wf-1", "sid-1")
+
+        # Assert
+        assert mock_redis.expire.call_count == 2
+
+    def test_get_current_leader_decodes_bytes(self, mock_redis: Mock) -> None:
+        # Arrange
+        mock_redis.get.return_value = b"sid-1"
+        repository = WorkflowCollaborationRepository()
+
+        # Act
+        result = repository.get_current_leader("wf-1")
+
+        # Assert
+        assert result == "sid-1"
+
+    def test_set_leader_if_absent_uses_nx(self, mock_redis: Mock) -> None:
+        # Arrange
+        mock_redis.set.return_value = True
+        repository = WorkflowCollaborationRepository()
+
+        # Act
+        result = repository.set_leader_if_absent("wf-1", "sid-1")
+
+        # Assert
+        assert result is True
+        _key, _value = mock_redis.set.call_args.args
+        assert _key == "workflow_leader:wf-1"
+        assert _value == "sid-1"
+        assert mock_redis.set.call_args.kwargs["nx"] is True
+        assert "ex" in mock_redis.set.call_args.kwargs
+
+    def test_get_session_sids_decodes(self, mock_redis: Mock) -> None:
+        # Arrange
+        mock_redis.hkeys.return_value = [b"sid-1", "sid-2"]
+        repository = WorkflowCollaborationRepository()
+
+        # Act
+        result = repository.get_session_sids("wf-1")
+
+        # Assert
+        assert result == ["sid-1", "sid-2"]
diff --git a/api/tests/unit_tests/services/test_workflow_collaboration_service.py b/api/tests/unit_tests/services/test_workflow_collaboration_service.py
new file mode 100644
index 0000000000..d5334d5e34
--- /dev/null
+++ b/api/tests/unit_tests/services/test_workflow_collaboration_service.py
@@ -0,0 +1,271 @@
+from unittest.mock import Mock, patch
+
+import pytest
+
+from repositories.workflow_collaboration_repository import WorkflowCollaborationRepository
+from services.workflow_collaboration_service import WorkflowCollaborationService
+
+
+class TestWorkflowCollaborationService:
+    @pytest.fixture
+    def service(self) -> tuple[WorkflowCollaborationService, Mock, Mock]:
+        repository = Mock(spec=WorkflowCollaborationRepository)
+        socketio = Mock()
+        return WorkflowCollaborationService(repository, socketio), repository, socketio
+
+    def test_register_session_returns_leader_status(
+        self, service: tuple[WorkflowCollaborationService, Mock, Mock]
+    ) -> None:
+        # Arrange
+        collaboration_service, repository, socketio = service
+        socketio.get_session.return_value = {"user_id": "u-1", "username": "Jane", "avatar": None}
+
+        with (
+            patch.object(collaboration_service, "get_or_set_leader", return_value="sid-1"),
+            patch.object(collaboration_service, "broadcast_online_users"),
+        ):
+            # Act
+            result = collaboration_service.register_session("wf-1", "sid-1")
+
+            # Assert
+            assert result == ("u-1", True)
+            repository.set_session_info.assert_called_once()
+            socketio.enter_room.assert_called_once_with("sid-1", "wf-1")
+            socketio.emit.assert_called_once_with("status", {"isLeader": True}, room="sid-1")
+
+    def test_register_session_returns_none_when_missing_user(
+        self, service: tuple[WorkflowCollaborationService, Mock, Mock]
+    ) -> None:
+        # Arrange
+        collaboration_service, _repository, socketio = service
+        socketio.get_session.return_value = {}
+
+        # Act
+        result = collaboration_service.register_session("wf-1", "sid-1")
+
+        # Assert
+        assert result is None
+
+    def test_relay_collaboration_event_unauthorized(
+        self, service: tuple[WorkflowCollaborationService, Mock, Mock]
+    ) -> None:
+        # Arrange
+        collaboration_service, repository, _socketio = service
+        repository.get_sid_mapping.return_value = None
+
+        # Act
+        result = collaboration_service.relay_collaboration_event("sid-1", {})
+
+        # Assert
+        assert result == ({"msg": "unauthorized"}, 401)
+
+    def test_relay_collaboration_event_emits_update(
+        self, service: tuple[WorkflowCollaborationService, Mock, Mock]
+    ) -> None:
+        # Arrange
+        collaboration_service, repository, socketio = service
+        repository.get_sid_mapping.return_value = {"workflow_id": "wf-1", "user_id": "u-1"}
+        payload = {"type": "mouse_move", "data": {"x": 1}, "timestamp": 123}
+
+        # Act
+        result = collaboration_service.relay_collaboration_event("sid-1", payload)
+
+        # Assert
+        assert result == ({"msg": "event_broadcasted"}, 200)
+        socketio.emit.assert_called_once_with(
+            "collaboration_update",
+            {"type": "mouse_move", "userId": "u-1", "data": {"x": 1}, "timestamp": 123},
+            room="wf-1",
+            skip_sid="sid-1",
+        )
+
+    def test_relay_graph_event_unauthorized(self, service: tuple[WorkflowCollaborationService, Mock, Mock]) -> None:
+        # Arrange
+        collaboration_service, repository, _socketio = service
+        repository.get_sid_mapping.return_value = None
+
+        # Act
+        result = collaboration_service.relay_graph_event("sid-1", {"nodes": []})
+
+        # Assert
+        assert result == ({"msg": "unauthorized"}, 401)
+
+    def test_disconnect_session_no_mapping(self, service: tuple[WorkflowCollaborationService, Mock, Mock]) -> None:
+        # Arrange
+        collaboration_service, repository, _socketio = service
+        repository.get_sid_mapping.return_value = None
+
+        # Act
+        collaboration_service.disconnect_session("sid-1")
+
+        # Assert
+        repository.delete_session.assert_not_called()
+
+    def test_disconnect_session_cleans_up(self, service: tuple[WorkflowCollaborationService, Mock, Mock]) -> None:
+        # Arrange
+        collaboration_service, repository, _socketio = service
+        repository.get_sid_mapping.return_value = {"workflow_id": "wf-1", "user_id": "u-1"}
+
+        with (
+            patch.object(collaboration_service, "handle_leader_disconnect") as handle_leader_disconnect,
+            patch.object(collaboration_service, "broadcast_online_users") as broadcast_online_users,
+        ):
+            # Act
+            collaboration_service.disconnect_session("sid-1")
+
+            # Assert
+            repository.delete_session.assert_called_once_with("wf-1", "sid-1")
+            handle_leader_disconnect.assert_called_once_with("wf-1", "sid-1")
+            broadcast_online_users.assert_called_once_with("wf-1")
+
+    def test_get_or_set_leader_returns_active_leader(
+        self, service: tuple[WorkflowCollaborationService, Mock, Mock]
+    ) -> None:
+        # Arrange
+        collaboration_service, repository, _socketio = service
+        repository.get_current_leader.return_value = "sid-1"
+
+        with patch.object(collaboration_service, "is_session_active", return_value=True):
+            # Act
+            result = collaboration_service.get_or_set_leader("wf-1", "sid-2")
+
+            # Assert
+            assert result == "sid-1"
+            repository.set_leader_if_absent.assert_not_called()
+
+    def test_get_or_set_leader_replaces_dead_leader(
+        self, service: tuple[WorkflowCollaborationService, Mock, Mock]
+    ) -> None:
+        # Arrange
+        collaboration_service, repository, _socketio = service
+        repository.get_current_leader.return_value = "sid-1"
+        repository.set_leader_if_absent.return_value = True
+
+        with (
+            patch.object(collaboration_service, "is_session_active", return_value=False),
+            patch.object(collaboration_service, "broadcast_leader_change") as broadcast_leader_change,
+        ):
+            # Act
+            result = collaboration_service.get_or_set_leader("wf-1", "sid-2")
+
+            # Assert
+            assert result == "sid-2"
+            repository.delete_session.assert_called_once_with("wf-1", "sid-1")
+            repository.delete_leader.assert_called_once_with("wf-1")
+            broadcast_leader_change.assert_called_once_with("wf-1", "sid-2")
+
+    def test_get_or_set_leader_falls_back_to_existing(
+        self, service: tuple[WorkflowCollaborationService, Mock, Mock]
+    ) -> None:
+        # Arrange
+        collaboration_service, repository, _socketio = service
+        repository.get_current_leader.side_effect = [None, "sid-3"]
+        repository.set_leader_if_absent.return_value = False
+
+        # Act
+        result = collaboration_service.get_or_set_leader("wf-1", "sid-2")
+
+        # Assert
+        assert result == "sid-3"
+
+    def test_handle_leader_disconnect_elects_new(
+        self, service: tuple[WorkflowCollaborationService, Mock, Mock]
+    ) -> None:
+        # Arrange
+        collaboration_service, repository, _socketio = service
+        repository.get_current_leader.return_value = "sid-1"
+        repository.get_session_sids.return_value = ["sid-2"]
+
+        with patch.object(collaboration_service, "broadcast_leader_change") as broadcast_leader_change:
+            # Act
+            collaboration_service.handle_leader_disconnect("wf-1", "sid-1")
+
+            # Assert
+            repository.set_leader.assert_called_once_with("wf-1", "sid-2")
+            broadcast_leader_change.assert_called_once_with("wf-1", "sid-2")
+
+    def test_handle_leader_disconnect_clears_when_empty(
+        self, service: tuple[WorkflowCollaborationService, Mock, Mock]
+    ) -> None:
+        # Arrange
+        collaboration_service, repository, _socketio = service
+        repository.get_current_leader.return_value = "sid-1"
+        repository.get_session_sids.return_value = []
+
+        # Act
+        collaboration_service.handle_leader_disconnect("wf-1", "sid-1")
+
+        # Assert
+        repository.delete_leader.assert_called_once_with("wf-1")
+
+    def test_broadcast_online_users_sorts_and_emits(
+        self, service: tuple[WorkflowCollaborationService, Mock, Mock]
+    ) -> None:
+        # Arrange
+        collaboration_service, repository, socketio = service
+        repository.list_sessions.return_value = [
+            {"user_id": "u-1", "username": "A", "avatar": None, "sid": "sid-1", "connected_at": 3},
+            {"user_id": "u-2", "username": "B", "avatar": None, "sid": "sid-2", "connected_at": 1},
+        ]
+        repository.get_current_leader.return_value = "sid-1"
+
+        # Act
+        collaboration_service.broadcast_online_users("wf-1")
+
+        # Assert
+        socketio.emit.assert_called_once_with(
+            "online_users",
+            {
+                "workflow_id": "wf-1",
+                "users": [
+                    {"user_id": "u-2", "username": "B", "avatar": None, "sid": "sid-2", "connected_at": 1},
+                    {"user_id": "u-1", "username": "A", "avatar": None, "sid": "sid-1", "connected_at": 3},
+                ],
+                "leader": "sid-1",
+            },
+            room="wf-1",
+        )
+
+    def test_refresh_session_state_expires_active_leader(
+        self, service: tuple[WorkflowCollaborationService, Mock, Mock]
+    ) -> None:
+        # Arrange
+        collaboration_service, repository, _socketio = service
+        repository.get_current_leader.return_value = "sid-1"
+
+        with patch.object(collaboration_service, "is_session_active", return_value=True):
+            # Act
+            collaboration_service.refresh_session_state("wf-1", "sid-1")
+
+            # Assert
+            repository.refresh_session_state.assert_called_once_with("wf-1", "sid-1")
+            repository.expire_leader.assert_called_once_with("wf-1")
+            repository.set_leader.assert_not_called()
+
+    def test_refresh_session_state_sets_leader_when_missing(
+        self, service: tuple[WorkflowCollaborationService, Mock, Mock]
+    ) -> None:
+        # Arrange
+        collaboration_service, repository, _socketio = service
+        repository.get_current_leader.return_value = None
+
+        with patch.object(collaboration_service, "broadcast_leader_change") as broadcast_leader_change:
+            # Act
+            collaboration_service.refresh_session_state("wf-1", "sid-2")
+
+            # Assert
+            repository.set_leader.assert_called_once_with("wf-1", "sid-2")
+            broadcast_leader_change.assert_called_once_with("wf-1", "sid-2")
+
+    def test_relay_graph_event_emits_update(self, service: tuple[WorkflowCollaborationService, Mock, Mock]) -> None:
+        # Arrange
+        collaboration_service, repository, socketio = service
+        repository.get_sid_mapping.return_value = {"workflow_id": "wf-1", "user_id": "u-1"}
+
+        # Act
+        result = collaboration_service.relay_graph_event("sid-1", {"nodes": []})
+
+        # Assert
+        assert result == ({"msg": "graph_update_broadcasted"}, 200)
+        repository.refresh_session_state.assert_called_once_with("wf-1", "sid-1")
+        socketio.emit.assert_called_once_with("graph_update", {"nodes": []}, room="wf-1", skip_sid="sid-1")
diff --git a/api/tests/unit_tests/services/test_workflow_comment_service.py b/api/tests/unit_tests/services/test_workflow_comment_service.py
new file mode 100644
index 0000000000..dfb1c9452f
--- /dev/null
+++ b/api/tests/unit_tests/services/test_workflow_comment_service.py
@@ -0,0 +1,245 @@
+from unittest.mock import MagicMock, Mock, patch
+
+import pytest
+from werkzeug.exceptions import Forbidden, NotFound
+
+from services import workflow_comment_service as service_module
+from services.workflow_comment_service import WorkflowCommentService
+
+
+@pytest.fixture
+def mock_session(monkeypatch: pytest.MonkeyPatch) -> Mock:
+    session = Mock()
+    context_manager = MagicMock()
+    context_manager.__enter__.return_value = session
+    context_manager.__exit__.return_value = False
+    mock_db = MagicMock()
+    mock_db.engine = Mock()
+    monkeypatch.setattr(service_module, "Session", Mock(return_value=context_manager))
+    monkeypatch.setattr(service_module, "db", mock_db)
+    return session
+
+
+def _mock_scalars(result_list: list[object]) -> Mock:
+    scalars = Mock()
+    scalars.all.return_value = result_list
+    return scalars
+
+
+class TestWorkflowCommentService:
+    def test_validate_content_rejects_empty(self) -> None:
+        with pytest.raises(ValueError):
+            WorkflowCommentService._validate_content(" ")
+
+    def test_validate_content_rejects_too_long(self) -> None:
+        with pytest.raises(ValueError):
+            WorkflowCommentService._validate_content("a" * 1001)
+
+    def test_create_comment_creates_mentions(self, mock_session: Mock) -> None:
+        comment = Mock()
+        comment.id = "comment-1"
+        comment.created_at = "ts"
+
+        with (
+            patch.object(service_module, "WorkflowComment", return_value=comment),
+            patch.object(service_module, "WorkflowCommentMention", return_value=Mock()),
+            patch.object(service_module, "uuid_value", side_effect=[True, False]),
+        ):
+            result = WorkflowCommentService.create_comment(
+                tenant_id="tenant-1",
+                app_id="app-1",
+                created_by="user-1",
+                content="hello",
position_x=1.0, + position_y=2.0, + mentioned_user_ids=["user-2", "bad-id"], + ) + + assert result == {"id": "comment-1", "created_at": "ts"} + assert mock_session.add.call_args_list[0].args[0] is comment + assert mock_session.add.call_count == 2 + mock_session.commit.assert_called_once() + + def test_update_comment_raises_not_found(self, mock_session: Mock) -> None: + mock_session.scalar.return_value = None + + with pytest.raises(NotFound): + WorkflowCommentService.update_comment( + tenant_id="tenant-1", + app_id="app-1", + comment_id="comment-1", + user_id="user-1", + content="hello", + ) + + def test_update_comment_raises_forbidden(self, mock_session: Mock) -> None: + comment = Mock() + comment.created_by = "owner" + mock_session.scalar.return_value = comment + + with pytest.raises(Forbidden): + WorkflowCommentService.update_comment( + tenant_id="tenant-1", + app_id="app-1", + comment_id="comment-1", + user_id="intruder", + content="hello", + ) + + def test_update_comment_replaces_mentions(self, mock_session: Mock) -> None: + comment = Mock() + comment.id = "comment-1" + comment.created_by = "owner" + mock_session.scalar.return_value = comment + + existing_mentions = [Mock(), Mock()] + mock_session.scalars.return_value = _mock_scalars(existing_mentions) + + with patch.object(service_module, "uuid_value", side_effect=[True, False]): + result = WorkflowCommentService.update_comment( + tenant_id="tenant-1", + app_id="app-1", + comment_id="comment-1", + user_id="owner", + content="updated", + mentioned_user_ids=["user-2", "bad-id"], + ) + + assert result == {"id": "comment-1", "updated_at": comment.updated_at} + assert mock_session.delete.call_count == 2 + assert mock_session.add.call_count == 1 + mock_session.commit.assert_called_once() + + def test_delete_comment_raises_forbidden(self, mock_session: Mock) -> None: + comment = Mock() + comment.created_by = "owner" + + with patch.object(WorkflowCommentService, "get_comment", return_value=comment): + with 
pytest.raises(Forbidden): + WorkflowCommentService.delete_comment("tenant-1", "app-1", "comment-1", "intruder") + + def test_delete_comment_removes_related_entities(self, mock_session: Mock) -> None: + comment = Mock() + comment.created_by = "owner" + + mentions = [Mock(), Mock()] + replies = [Mock()] + mock_session.scalars.side_effect = [_mock_scalars(mentions), _mock_scalars(replies)] + + with patch.object(WorkflowCommentService, "get_comment", return_value=comment): + WorkflowCommentService.delete_comment("tenant-1", "app-1", "comment-1", "owner") + + assert mock_session.delete.call_count == 4 + mock_session.commit.assert_called_once() + + def test_resolve_comment_sets_fields(self, mock_session: Mock) -> None: + comment = Mock() + comment.resolved = False + comment.resolved_at = None + comment.resolved_by = None + + with ( + patch.object(WorkflowCommentService, "get_comment", return_value=comment), + patch.object(service_module, "naive_utc_now", return_value="now"), + ): + result = WorkflowCommentService.resolve_comment("tenant-1", "app-1", "comment-1", "user-1") + + assert result is comment + assert comment.resolved is True + assert comment.resolved_at == "now" + assert comment.resolved_by == "user-1" + mock_session.commit.assert_called_once() + + def test_resolve_comment_noop_when_already_resolved(self, mock_session: Mock) -> None: + comment = Mock() + comment.resolved = True + + with patch.object(WorkflowCommentService, "get_comment", return_value=comment): + result = WorkflowCommentService.resolve_comment("tenant-1", "app-1", "comment-1", "user-1") + + assert result is comment + mock_session.commit.assert_not_called() + + def test_create_reply_requires_comment(self, mock_session: Mock) -> None: + mock_session.get.return_value = None + + with pytest.raises(NotFound): + WorkflowCommentService.create_reply("comment-1", "hello", "user-1") + + def test_create_reply_creates_mentions(self, mock_session: Mock) -> None: + mock_session.get.return_value = Mock() + 
reply = Mock() + reply.id = "reply-1" + reply.created_at = "ts" + + with ( + patch.object(service_module, "WorkflowCommentReply", return_value=reply), + patch.object(service_module, "WorkflowCommentMention", return_value=Mock()), + patch.object(service_module, "uuid_value", side_effect=[True, False]), + ): + result = WorkflowCommentService.create_reply( + comment_id="comment-1", + content="hello", + created_by="user-1", + mentioned_user_ids=["user-2", "bad-id"], + ) + + assert result == {"id": "reply-1", "created_at": "ts"} + assert mock_session.add.call_count == 2 + mock_session.commit.assert_called_once() + + def test_update_reply_raises_not_found(self, mock_session: Mock) -> None: + mock_session.get.return_value = None + + with pytest.raises(NotFound): + WorkflowCommentService.update_reply("reply-1", "user-1", "hello") + + def test_update_reply_raises_forbidden(self, mock_session: Mock) -> None: + reply = Mock() + reply.created_by = "owner" + mock_session.get.return_value = reply + + with pytest.raises(Forbidden): + WorkflowCommentService.update_reply("reply-1", "intruder", "hello") + + def test_update_reply_replaces_mentions(self, mock_session: Mock) -> None: + reply = Mock() + reply.id = "reply-1" + reply.comment_id = "comment-1" + reply.created_by = "owner" + reply.updated_at = "updated" + mock_session.get.return_value = reply + mock_session.scalars.return_value = _mock_scalars([Mock()]) + + with patch.object(service_module, "uuid_value", side_effect=[True, False]): + result = WorkflowCommentService.update_reply( + reply_id="reply-1", + user_id="owner", + content="new", + mentioned_user_ids=["user-2", "bad-id"], + ) + + assert result == {"id": "reply-1", "updated_at": "updated"} + assert mock_session.delete.call_count == 1 + assert mock_session.add.call_count == 1 + mock_session.commit.assert_called_once() + mock_session.refresh.assert_called_once_with(reply) + + def test_delete_reply_raises_forbidden(self, mock_session: Mock) -> None: + reply = Mock() + 
reply.created_by = "owner" + mock_session.get.return_value = reply + + with pytest.raises(Forbidden): + WorkflowCommentService.delete_reply("reply-1", "intruder") + + def test_delete_reply_removes_mentions(self, mock_session: Mock) -> None: + reply = Mock() + reply.created_by = "owner" + mock_session.get.return_value = reply + mock_session.scalars.return_value = _mock_scalars([Mock(), Mock()]) + + WorkflowCommentService.delete_reply("reply-1", "owner") + + assert mock_session.delete.call_count == 3 + mock_session.commit.assert_called_once() diff --git a/api/tests/unit_tests/services/test_workflow_service.py b/api/tests/unit_tests/services/test_workflow_service.py index ae5b194afb..a8e70ce872 100644 --- a/api/tests/unit_tests/services/test_workflow_service.py +++ b/api/tests/unit_tests/services/test_workflow_service.py @@ -10,7 +10,7 @@ This test suite covers: """ import json -from unittest.mock import MagicMock, patch +from unittest.mock import MagicMock, Mock, patch import pytest @@ -630,6 +630,79 @@ class TestWorkflowService: with pytest.raises(ValueError, match="Invalid app mode"): workflow_service.validate_features_structure(app, features) + # ==================== Draft Workflow Variable Update Tests ==================== + # These tests verify updating draft workflow environment/conversation variables + + def test_update_draft_workflow_environment_variables_updates_workflow(self, workflow_service, mock_db_session): + """Test update_draft_workflow_environment_variables updates draft fields.""" + app = TestWorkflowAssociatedDataFactory.create_app_mock() + account = TestWorkflowAssociatedDataFactory.create_account_mock() + workflow = TestWorkflowAssociatedDataFactory.create_workflow_mock() + variables = [Mock()] + + with ( + patch.object(workflow_service, "get_draft_workflow", return_value=workflow), + patch("services.workflow_service.naive_utc_now", return_value="now"), + ): + workflow_service.update_draft_workflow_environment_variables( + app_model=app, + 
environment_variables=variables, + account=account, + ) + + assert workflow.environment_variables == variables + assert workflow.updated_by == account.id + assert workflow.updated_at == "now" + mock_db_session.session.commit.assert_called_once() + + def test_update_draft_workflow_environment_variables_raises_when_missing(self, workflow_service): + """Test update_draft_workflow_environment_variables raises when draft missing.""" + app = TestWorkflowAssociatedDataFactory.create_app_mock() + account = TestWorkflowAssociatedDataFactory.create_account_mock() + + with patch.object(workflow_service, "get_draft_workflow", return_value=None): + with pytest.raises(ValueError, match="No draft workflow found."): + workflow_service.update_draft_workflow_environment_variables( + app_model=app, + environment_variables=[], + account=account, + ) + + def test_update_draft_workflow_conversation_variables_updates_workflow(self, workflow_service, mock_db_session): + """Test update_draft_workflow_conversation_variables updates draft fields.""" + app = TestWorkflowAssociatedDataFactory.create_app_mock() + account = TestWorkflowAssociatedDataFactory.create_account_mock() + workflow = TestWorkflowAssociatedDataFactory.create_workflow_mock() + variables = [Mock()] + + with ( + patch.object(workflow_service, "get_draft_workflow", return_value=workflow), + patch("services.workflow_service.naive_utc_now", return_value="now"), + ): + workflow_service.update_draft_workflow_conversation_variables( + app_model=app, + conversation_variables=variables, + account=account, + ) + + assert workflow.conversation_variables == variables + assert workflow.updated_by == account.id + assert workflow.updated_at == "now" + mock_db_session.session.commit.assert_called_once() + + def test_update_draft_workflow_conversation_variables_raises_when_missing(self, workflow_service): + """Test update_draft_workflow_conversation_variables raises when draft missing.""" + app = 
TestWorkflowAssociatedDataFactory.create_app_mock() + account = TestWorkflowAssociatedDataFactory.create_account_mock() + + with patch.object(workflow_service, "get_draft_workflow", return_value=None): + with pytest.raises(ValueError, match="No draft workflow found."): + workflow_service.update_draft_workflow_conversation_variables( + app_model=app, + conversation_variables=[], + account=account, + ) + # ==================== Publish Workflow Tests ==================== # These tests verify creating published versions from draft workflows diff --git a/api/uv.lock b/api/uv.lock index 3082dea7f2..ae9b9ab66d 100644 --- a/api/uv.lock +++ b/api/uv.lock @@ -136,21 +136,21 @@ wheels = [ [[package]] name = "alembic" -version = "1.18.0" +version = "1.18.1" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "mako" }, { name = "sqlalchemy" }, { name = "typing-extensions" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/70/a5/57f989c26c078567a08f1d88c337acfcb69c8c9cac6876a34054f35b8112/alembic-1.18.0.tar.gz", hash = "sha256:0c4c03c927dc54d4c56821bdcc988652f4f63bf7b9017fd9d78d63f09fd22b48", size = 2043788, upload-time = "2026-01-09T21:22:23.683Z" } +sdist = { url = "https://files.pythonhosted.org/packages/49/cc/aca263693b2ece99fa99a09b6d092acb89973eb2bb575faef1777e04f8b4/alembic-1.18.1.tar.gz", hash = "sha256:83ac6b81359596816fb3b893099841a0862f2117b2963258e965d70dc62fb866", size = 2044319, upload-time = "2026-01-14T18:53:14.907Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/ed/fd/68773667babd452fb48f974c4c1f6e6852c6e41bcf622c745faca1b06605/alembic-1.18.0-py3-none-any.whl", hash = "sha256:3993fcfbc371aa80cdcf13f928b7da21b1c9f783c914f03c3c6375f58efd9250", size = 260967, upload-time = "2026-01-09T21:22:25.333Z" }, + { url = "https://files.pythonhosted.org/packages/83/36/cd9cb6101e81e39076b2fbe303bfa3c85ca34e55142b0324fcbf22c5c6e2/alembic-1.18.1-py3-none-any.whl", hash = 
"sha256:f1c3b0920b87134e851c25f1f7f236d8a332c34b75416802d06971df5d1b7810", size = 260973, upload-time = "2026-01-14T18:53:17.533Z" }, ] [[package]] name = "alibabacloud-credentials" -version = "1.0.4" +version = "1.0.5" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "aiofiles" }, @@ -158,9 +158,9 @@ dependencies = [ { name = "alibabacloud-tea" }, { name = "apscheduler" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/98/7b/022d86b8487bbf037685d631cb86bcee75346d514e0e5821f221102cd7ac/alibabacloud_credentials-1.0.4.tar.gz", hash = "sha256:2b71ab30745267abd524d64fbe063f7e02649da2ab6daaf1eec05733b7f9c8f1", size = 40424, upload-time = "2025-12-05T01:56:11.338Z" } +sdist = { url = "https://files.pythonhosted.org/packages/57/0e/5633e96dc9f42eac7e387b617a032c9136e129b8b5f75a9685a36bf17fcb/alibabacloud_credentials-1.0.5.tar.gz", hash = "sha256:2b79a674e51609826fc5c78595782c7997d0887fa29df840895b926df1e98624", size = 40461, upload-time = "2026-01-23T10:29:10.804Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/fb/96/71fecdfd1c91a12072bd1eb143ef289eb72ddfa5cfa45306afbc6a7bd3ca/alibabacloud_credentials-1.0.4-py3-none-any.whl", hash = "sha256:62b1ed768a391029777112a104c848534637af222bdb584c46b89ada4d4538dc", size = 48793, upload-time = "2025-12-05T01:56:10.024Z" }, + { url = "https://files.pythonhosted.org/packages/da/3d/cafabe877fe48243f037d01d76427d64cf66216101866831ecc7888294f2/alibabacloud_credentials-1.0.5-py3-none-any.whl", hash = "sha256:d13dd3a6088e45f5f43e911b4277821ceb0218ecede285ba58834016393036b7", size = 48826, upload-time = "2026-01-23T10:29:09.491Z" }, ] [[package]] @@ -205,13 +205,16 @@ wheels = [ [[package]] name = "alibabacloud-openapi-util" -version = "0.2.2" +version = "0.2.4" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "alibabacloud-tea-util" }, { name = "cryptography" }, ] -sdist = { url = 
"https://files.pythonhosted.org/packages/f6/50/5f41ab550d7874c623f6e992758429802c4b52a6804db437017e5387de33/alibabacloud_openapi_util-0.2.2.tar.gz", hash = "sha256:ebbc3906f554cb4bf8f513e43e8a33e8b6a3d4a0ef13617a0e14c3dda8ef52a8", size = 7201, upload-time = "2023-10-23T07:44:18.523Z" } +sdist = { url = "https://files.pythonhosted.org/packages/f6/51/be5802851a4ed20ac2c6db50ac8354a6e431e93db6e714ca39b50983626f/alibabacloud_openapi_util-0.2.4.tar.gz", hash = "sha256:87022b9dcb7593a601f7a40ca698227ac3ccb776b58cb7b06b8dc7f510995c34", size = 7981, upload-time = "2026-01-15T08:05:03.947Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/08/46/9b217343648b366eb93447f5d93116e09a61956005794aed5ef95a2e9e2e/alibabacloud_openapi_util-0.2.4-py3-none-any.whl", hash = "sha256:a2474f230b5965ae9a8c286e0dc86132a887928d02d20b8182656cf6b1b6c5bd", size = 7661, upload-time = "2026-01-15T08:05:01.374Z" }, +] [[package]] name = "alibabacloud-openplatform20191219" @@ -305,7 +308,7 @@ sdist = { url = "https://files.pythonhosted.org/packages/32/eb/5e82e419c3061823f [[package]] name = "aliyun-log-python-sdk" -version = "0.9.41" +version = "0.9.42" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "dateparser" }, @@ -317,7 +320,7 @@ dependencies = [ { name = "requests" }, { name = "six" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/ea/3b/c1999c5d3194430616f2e8a6db304add77c23ae0e4ff85f1b2c2abb6c738/aliyun_log_python_sdk-0.9.41.tar.gz", hash = "sha256:442dc46e997ca8a607f4129243d940d7d6cd60d1d7bd2405fcdd391dca07e2d5", size = 154294, upload-time = "2026-01-13T13:40:10.264Z" } +sdist = { url = "https://files.pythonhosted.org/packages/10/44/c77ddc6abc0770318f8c3c59db6711c04cee3507cc4f84b267d46f86ad9f/aliyun_log_python_sdk-0.9.42.tar.gz", hash = "sha256:27d2a857743fa61576947aa16e46cd3a1bab151bf3a5493b32b4e2a995362e29", size = 154460, upload-time = "2026-01-15T03:43:31.811Z" } [[package]] name = "aliyun-python-sdk-core" @@ -594,6 +597,15 @@ 
wheels = [ { url = "https://files.pythonhosted.org/packages/57/f4/a69c20ee4f660081a7dedb1ac57f29be9378e04edfcb90c526b923d4bebc/beautifulsoup4-4.12.2-py3-none-any.whl", hash = "sha256:bd2520ca0d9d7d12694a53d44ac482d181b4ec1888909b035a3dbf40d0f57d4a", size = 142979, upload-time = "2023-04-07T15:02:50.77Z" }, ] +[[package]] +name = "bidict" +version = "0.23.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/9a/6e/026678aa5a830e07cd9498a05d3e7e650a4f56a42f267a53d22bcda1bdc9/bidict-0.23.1.tar.gz", hash = "sha256:03069d763bc387bbd20e7d49914e75fc4132a41937fa3405417e1a5a2d006d71", size = 29093, upload-time = "2024-02-18T19:09:05.748Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/99/37/e8730c3587a65eb5645d4aba2d27aae48e8003614d6aaf15dda67f702f1f/bidict-0.23.1-py3-none-any.whl", hash = "sha256:5dae8d4d79b552a71cbabc7deb25dfe8ce710b17ff41711e13010ead2abfc3e5", size = 32764, upload-time = "2024-02-18T19:09:04.156Z" }, +] + [[package]] name = "billiard" version = "4.2.4" @@ -628,16 +640,16 @@ wheels = [ [[package]] name = "boto3-stubs" -version = "1.42.27" +version = "1.42.34" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "botocore-stubs" }, { name = "types-s3transfer" }, { name = "typing-extensions", marker = "python_full_version < '3.12'" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/19/68/35600de02d824ad06b91902027a3f9c94ff43d5ba487f26a4006678e9d0e/boto3_stubs-1.42.27.tar.gz", hash = "sha256:9c35521b704a0b9f7bd2ce226d07d6eb94c0c35d5663fb7a2e7521d747cef967", size = 100907, upload-time = "2026-01-13T20:41:03.521Z" } +sdist = { url = "https://files.pythonhosted.org/packages/4d/e4/959e63b009194cae2fad6ddff8ef1c0e7e2f9113bca4c7ec20fa579e4d7a/boto3_stubs-1.42.34.tar.gz", hash = "sha256:fafcc3713c331bac11bf55fe913e5a3a01820f0cde640cfc4694df5a94aa9557", size = 100898, upload-time = "2026-01-23T20:42:10.353Z" } wheels = [ - { url = 
"https://files.pythonhosted.org/packages/b3/bd/66ae1b848632eb5fce74787363e8d767143838f817378d7e04cd92f7ae9d/boto3_stubs-1.42.27-py3-none-any.whl", hash = "sha256:2ce6bc2c71d19eade43179b9fa76ff5726b59668c1e6eef0c1f5aed6406675d3", size = 69782, upload-time = "2026-01-13T20:41:00.551Z" }, + { url = "https://files.pythonhosted.org/packages/a3/c4/1aba1653afc3cf5ef985235cea05d3e9e6736033f10ebbf102a23fc0152d/boto3_stubs-1.42.34-py3-none-any.whl", hash = "sha256:eb98cf3cc0a74ed75ea4945152cf10da57c8c9628104a13db16cde10176219ab", size = 69782, upload-time = "2026-01-23T20:42:05.699Z" }, ] [package.optional-dependencies] @@ -652,8 +664,7 @@ source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "jmespath" }, { name = "python-dateutil" }, - { name = "urllib3", version = "2.3.0", source = { registry = "https://pypi.org/simple" }, marker = "(python_full_version < '3.12.4' and platform_python_implementation != 'PyPy') or platform_python_implementation == 'PyPy' or sys_platform != 'linux'" }, - { name = "urllib3", version = "2.6.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'" }, + { name = "urllib3" }, ] sdist = { url = "https://files.pythonhosted.org/packages/7c/9c/1df6deceee17c88f7170bad8325aa91452529d683486273928eecfd946d8/botocore-1.35.99.tar.gz", hash = "sha256:1eab44e969c39c5f3d9a3104a0836c24715579a455f12b3979a31d7cde51b3c3", size = 13490969, upload-time = "2025-01-14T20:20:11.419Z" } wheels = [ @@ -662,14 +673,14 @@ wheels = [ [[package]] name = "botocore-stubs" -version = "1.42.27" +version = "1.42.34" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "types-awscrt" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/19/28/16998a7a4a7d6128025a3a85c8419f6c410573314223ef0a9962cf4bcb84/botocore_stubs-1.42.27.tar.gz", hash = "sha256:1e5bc3f8879dc0c8cf98e668d108b3314d34db8f342ade2a9a53d88f27dc3292", 
size = 42396, upload-time = "2026-01-13T21:28:35.741Z" } +sdist = { url = "https://files.pythonhosted.org/packages/8f/1e/024e45fb46a21d085b541ce0ad8f1bef97ce17c5e72d1dc0e4d09d29e399/botocore_stubs-1.42.34.tar.gz", hash = "sha256:f3d1c5b45c2cbe16f63719abe639b23a1eeb3fec9c3ea0a72688585b462e8ce3", size = 42408, upload-time = "2026-01-23T20:33:38.691Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/d6/cd/fcb5810d0b7f98349128b223a2196589cb1757c6882895a8a3fb102010e0/botocore_stubs-1.42.27-py3-none-any.whl", hash = "sha256:b0075eb627800cc3bb6486595b4322e2ed3b3e36925bf1700d7b48ac14bfa37f", size = 66761, upload-time = "2026-01-13T21:28:34.131Z" }, + { url = "https://files.pythonhosted.org/packages/4c/c8/3845c17b89ff19e2c2474801a6737d1766ee8e80cf38d7d97e1fedc28537/botocore_stubs-1.42.34-py3-none-any.whl", hash = "sha256:afc08661122eff6939d88cd250084ac148e392f8a1a389d51a31a4b9dab59358", size = 66760, upload-time = "2026-01-23T20:33:37.146Z" }, ] [[package]] @@ -947,8 +958,7 @@ dependencies = [ { name = "grpcio" }, { name = "httpx" }, { name = "importlib-resources" }, - { name = "kubernetes", version = "33.1.0", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'" }, - { name = "kubernetes", version = "34.1.0", source = { registry = "https://pypi.org/simple" }, marker = "(python_full_version < '3.12.4' and platform_python_implementation != 'PyPy') or platform_python_implementation == 'PyPy' or sys_platform != 'linux'" }, + { name = "kubernetes" }, { name = "mmh3" }, { name = "numpy" }, { name = "onnxruntime" }, @@ -1053,8 +1063,7 @@ dependencies = [ { name = "certifi" }, { name = "lz4" }, { name = "pytz" }, - { name = "urllib3", version = "2.3.0", source = { registry = "https://pypi.org/simple" }, marker = "(python_full_version < '3.12.4' and platform_python_implementation != 'PyPy') or platform_python_implementation == 'PyPy' or sys_platform != 
'linux'" }, - { name = "urllib3", version = "2.6.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'" }, + { name = "urllib3" }, { name = "zstandard" }, ] sdist = { url = "https://files.pythonhosted.org/packages/7b/fd/f8bea1157d40f117248dcaa9abdbf68c729513fcf2098ab5cb4aa58768b8/clickhouse_connect-0.10.0.tar.gz", hash = "sha256:a0256328802c6e5580513e197cef7f9ba49a99fc98e9ba410922873427569564", size = 104753, upload-time = "2025-11-14T20:31:00.947Z" } @@ -1090,8 +1099,7 @@ dependencies = [ { name = "python-dateutil" }, { name = "requests" }, { name = "sqlalchemy" }, - { name = "urllib3", version = "2.3.0", source = { registry = "https://pypi.org/simple" }, marker = "(python_full_version < '3.12.4' and platform_python_implementation != 'PyPy') or platform_python_implementation == 'PyPy' or sys_platform != 'linux'" }, - { name = "urllib3", version = "2.6.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'" }, + { name = "urllib3" }, ] wheels = [ { url = "https://files.pythonhosted.org/packages/d2/3a/74e13d78518e27ed479d507d24e1bc9b36d35545b008a22d855abf9bd108/clickzetta_connector_python-0.8.109-py3-none-any.whl", hash = "sha256:204e3144bb33eb93b085a247d44fd11a8b91f9f72d4a853d8ad4e31cf11ab17f", size = 78333, upload-time = "2025-12-24T13:46:09.62Z" }, @@ -1308,16 +1316,16 @@ wheels = [ [[package]] name = "databricks-sdk" -version = "0.77.0" +version = "0.80.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "google-auth" }, { name = "protobuf" }, { name = "requests" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/1e/54/f2393af72974d8e70bfd44b52f17b917190b4934bab66508f4afec9e9285/databricks_sdk-0.77.0.tar.gz", hash = "sha256:56b6017ae17dc31ee821e49746d2ff627a029127166c702088428527813422bd", size = 
827462, upload-time = "2026-01-06T13:19:09.432Z" } +sdist = { url = "https://files.pythonhosted.org/packages/e7/b0/adeac3cdbb8fd286565b93af7779c8c21966f437bfd1dec0bde3e243fbd6/databricks_sdk-0.80.0.tar.gz", hash = "sha256:53e5228edd12caf619f4fd3c3d62fddd3ff4d5b30e1680e6b6aec68ac40e770b", size = 837547, upload-time = "2026-01-22T20:30:50.858Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/03/aa/d9186684dbcd338cf51e93621bcf53efa843fe647c0f29f549ebddbf2870/databricks_sdk-0.77.0-py3-none-any.whl", hash = "sha256:42e3211b7dbd53b81a795981149ee08d5b025442a73e464264569edc8c46ee0a", size = 779180, upload-time = "2026-01-06T13:19:07.757Z" }, + { url = "https://files.pythonhosted.org/packages/08/71/ac3a16e620e4de7cea10695d7e926a6b00d7790208a932d81dd0b3136772/databricks_sdk-0.80.0-py3-none-any.whl", hash = "sha256:e654b08b945b4fc9651cfe65389035382c0885d74435123a0e4860d007fc963b", size = 788323, upload-time = "2026-01-22T20:30:49.372Z" }, ] [[package]] @@ -1380,8 +1388,7 @@ dependencies = [ { name = "pydantic" }, { name = "python-dateutil" }, { name = "typing-extensions" }, - { name = "urllib3", version = "2.3.0", source = { registry = "https://pypi.org/simple" }, marker = "(python_full_version < '3.12.4' and platform_python_implementation != 'PyPy') or platform_python_implementation == 'PyPy' or sys_platform != 'linux'" }, - { name = "urllib3", version = "2.6.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'" }, + { name = "urllib3" }, ] sdist = { url = "https://files.pythonhosted.org/packages/e1/2f/df4138f03bdcc17fcc4737504df9176d9db8d40a91abc1e91d79c99436d3/daytona_api_client-0.128.1.tar.gz", hash = "sha256:e9db105bf5ea7ad4b55431e3bb7db1e3a8937557ffbca7dba6167bc5a6a63c96", size = 125691, upload-time = "2025-12-23T17:03:36.391Z" } wheels = [ @@ -1398,8 +1405,7 @@ dependencies = [ { name = "pydantic" }, { name = "python-dateutil" }, { name = 
"typing-extensions" }, - { name = "urllib3", version = "2.3.0", source = { registry = "https://pypi.org/simple" }, marker = "(python_full_version < '3.12.4' and platform_python_implementation != 'PyPy') or platform_python_implementation == 'PyPy' or sys_platform != 'linux'" }, - { name = "urllib3", version = "2.6.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'" }, + { name = "urllib3" }, ] sdist = { url = "https://files.pythonhosted.org/packages/94/a3/64a11a4e18bea571249615dc7cefb390abf7704569138d12a2b40d841c7d/daytona_api_client_async-0.128.1.tar.gz", hash = "sha256:2fb7507cb4122ae2011aa1f52a38556c1ce9c137173648aa96ca227ef072eadd", size = 126674, upload-time = "2025-12-23T17:03:48.261Z" } wheels = [ @@ -1414,8 +1420,7 @@ dependencies = [ { name = "pydantic" }, { name = "python-dateutil" }, { name = "typing-extensions" }, - { name = "urllib3", version = "2.3.0", source = { registry = "https://pypi.org/simple" }, marker = "(python_full_version < '3.12.4' and platform_python_implementation != 'PyPy') or platform_python_implementation == 'PyPy' or sys_platform != 'linux'" }, - { name = "urllib3", version = "2.6.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'" }, + { name = "urllib3" }, ] sdist = { url = "https://files.pythonhosted.org/packages/62/32/e3b205b79341caccaccdd09f4bf0650dbc5346b45a6459bb00e63fb6a6dd/daytona_toolbox_api_client-0.128.1.tar.gz", hash = "sha256:869ee431f485ed535868a93154e29c10e46fb2c36a0a7af79020385830e23c8f", size = 61374, upload-time = "2025-12-23T17:04:03.418Z" } wheels = [ @@ -1432,8 +1437,7 @@ dependencies = [ { name = "pydantic" }, { name = "python-dateutil" }, { name = "typing-extensions" }, - { name = "urllib3", version = "2.3.0", source = { registry = "https://pypi.org/simple" }, marker = 
"(python_full_version < '3.12.4' and platform_python_implementation != 'PyPy') or platform_python_implementation == 'PyPy' or sys_platform != 'linux'" }, - { name = "urllib3", version = "2.6.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'" }, + { name = "urllib3" }, ] sdist = { url = "https://files.pythonhosted.org/packages/3b/c0/ee0534242826c8f4c09f9a422e77d676d8ce5fa37599d421fa596d5bef35/daytona_toolbox_api_client_async-0.128.1.tar.gz", hash = "sha256:d9ef0ec4d17fcc611e5c8d17ae300afb825b32bf8346fa6a2a8576d760ef0304", size = 58335, upload-time = "2025-12-23T17:04:01.946Z" } wheels = [ @@ -1498,10 +1502,10 @@ dependencies = [ { name = "celery" }, { name = "charset-normalizer" }, { name = "croniter" }, - { name = "fastopenapi", extra = ["flask"] }, { name = "daytona" }, { name = "docker" }, { name = "e2b-code-interpreter" }, + { name = "fastopenapi", extra = ["flask"] }, { name = "flask" }, { name = "flask-compress" }, { name = "flask-cors" }, @@ -1511,6 +1515,7 @@ dependencies = [ { name = "flask-restx" }, { name = "flask-sqlalchemy" }, { name = "gevent" }, + { name = "gevent-websocket" }, { name = "gmpy2" }, { name = "google-api-core" }, { name = "google-api-python-client" }, @@ -1561,6 +1566,7 @@ dependencies = [ { name = "pypdfium2" }, { name = "python-docx" }, { name = "python-dotenv" }, + { name = "python-socketio" }, { name = "python-socks" }, { name = "pyyaml" }, { name = "readabilipy" }, @@ -1701,10 +1707,10 @@ requires-dist = [ { name = "celery", specifier = "~=5.5.2" }, { name = "charset-normalizer", specifier = ">=3.4.4" }, { name = "croniter", specifier = ">=6.0.0" }, - { name = "fastopenapi", extras = ["flask"], specifier = ">=0.7.0" }, { name = "daytona", specifier = "==0.128.1" }, { name = "docker", specifier = ">=7.1.0" }, { name = "e2b-code-interpreter", specifier = ">=2.4.1" }, + { name = "fastopenapi", extras = ["flask"], 
specifier = ">=0.7.0" }, { name = "flask", specifier = "~=3.1.2" }, { name = "flask-compress", specifier = ">=1.17,<1.18" }, { name = "flask-cors", specifier = "~=6.0.0" }, @@ -1714,6 +1720,7 @@ requires-dist = [ { name = "flask-restx", specifier = "~=1.3.0" }, { name = "flask-sqlalchemy", specifier = "~=3.1.1" }, { name = "gevent", specifier = "~=25.9.1" }, + { name = "gevent-websocket", specifier = "~=0.10.1" }, { name = "gmpy2", specifier = "~=2.2.1" }, { name = "google-api-core", specifier = "==2.18.0" }, { name = "google-api-python-client", specifier = "==2.90.0" }, @@ -1725,7 +1732,7 @@ requires-dist = [ { name = "httpx", specifier = "~=0.28.1" }, { name = "httpx-sse", specifier = "~=0.4.0" }, { name = "jieba", specifier = "==0.42.1" }, - { name = "json-repair", specifier = ">=0.41.1" }, + { name = "json-repair", specifier = ">=0.55.1" }, { name = "jsonschema", specifier = ">=4.25.1" }, { name = "langfuse", specifier = "~=2.51.3" }, { name = "langsmith", specifier = "~=0.1.77" }, @@ -1764,6 +1771,7 @@ requires-dist = [ { name = "pypdfium2", specifier = "==5.2.0" }, { name = "python-docx", specifier = "~=1.1.0" }, { name = "python-dotenv", specifier = "==1.0.1" }, + { name = "python-socketio", specifier = "~=5.13.0" }, { name = "python-socks", specifier = ">=2.4.4" }, { name = "pyyaml", specifier = "~=6.0.1" }, { name = "readabilipy", specifier = "~=0.3.0" }, @@ -1916,8 +1924,7 @@ source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "pywin32", marker = "sys_platform == 'win32'" }, { name = "requests" }, - { name = "urllib3", version = "2.3.0", source = { registry = "https://pypi.org/simple" }, marker = "(python_full_version < '3.12.4' and platform_python_implementation != 'PyPy') or platform_python_implementation == 'PyPy' or sys_platform != 'linux'" }, - { name = "urllib3", version = "2.6.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and 
sys_platform == 'linux'" }, + { name = "urllib3" }, ] sdist = { url = "https://files.pythonhosted.org/packages/91/9b/4a2ea29aeba62471211598dac5d96825bb49348fa07e906ea930394a83ce/docker-7.1.0.tar.gz", hash = "sha256:ad8c70e6e3f8926cb8a92619b832b4ea5299e2831c14284663184e200546fa6c", size = 117834, upload-time = "2024-05-23T11:13:57.216Z" } wheels = [ @@ -1969,7 +1976,7 @@ wheels = [ [[package]] name = "e2b" -version = "2.10.1" +version = "2.10.2" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "attrs" }, @@ -1983,9 +1990,9 @@ dependencies = [ { name = "typing-extensions" }, { name = "wcmatch" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/9d/62/25b508c25016fbda4ef1f39ae30041f82bff9614c1964a47a1d158054af7/e2b-2.10.1.tar.gz", hash = "sha256:26c6318834aef80615d46eee29319b91b0a4dcdac5316df2ab5fb9616e4d46b9", size = 114610, upload-time = "2026-01-12T13:23:11.335Z" } +sdist = { url = "https://files.pythonhosted.org/packages/10/16/afd0b78b12bc50570ec3a3cd6d668e3c112aa250e02a7cc10fd7fc717142/e2b-2.10.2.tar.gz", hash = "sha256:b77ecd620fd057b81a9610da18141811c003cc6f446c39c7ec7b9e9dc147d864", size = 114601, upload-time = "2026-01-15T16:44:44.88Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/aa/40/80aaa7d3197183eeedf3d154df35139586368da9f63bd3f1b03702183a38/e2b-2.10.1-py3-none-any.whl", hash = "sha256:9bc0336e3a8ff03a4db8ecae467ed21c7411e40be6fd66b0f935923e8489c789", size = 213504, upload-time = "2026-01-12T13:23:09.735Z" }, + { url = "https://files.pythonhosted.org/packages/10/ab/54d17995ef09436120464fc997b5399c0920c95bc007efc315ba5518349d/e2b-2.10.2-py3-none-any.whl", hash = "sha256:c719291fc9b3006b286809f6e820b803a1aab9a6f5ae4fe0140ead17efbce821", size = 213497, upload-time = "2026-01-15T16:44:43.067Z" }, ] [[package]] @@ -2008,8 +2015,7 @@ version = "8.17.1" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "certifi" }, - { name = "urllib3", version = "2.3.0", source = { registry = 
"https://pypi.org/simple" }, marker = "(python_full_version < '3.12.4' and platform_python_implementation != 'PyPy') or platform_python_implementation == 'PyPy' or sys_platform != 'linux'" }, - { name = "urllib3", version = "2.6.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'" }, + { name = "urllib3" }, ] sdist = { url = "https://files.pythonhosted.org/packages/6a/54/d498a766ac8fa475f931da85a154666cc81a70f8eb4a780bc8e4e934e9ac/elastic_transport-8.17.1.tar.gz", hash = "sha256:5edef32ac864dca8e2f0a613ef63491ee8d6b8cfb52881fa7313ba9290cac6d2", size = 73425, upload-time = "2025-03-13T07:28:30.776Z" } wheels = [ @@ -2386,6 +2392,18 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/d5/98/caf06d5d22a7c129c1fb2fc1477306902a2c8ddfd399cd26bbbd4caf2141/gevent-25.9.1-cp312-cp312-win_amd64.whl", hash = "sha256:4acd6bcd5feabf22c7c5174bd3b9535ee9f088d2bbce789f740ad8d6554b18f3", size = 1682837, upload-time = "2025-09-17T19:48:47.318Z" }, ] +[[package]] +name = "gevent-websocket" +version = "0.10.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "gevent" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/98/d2/6fa19239ff1ab072af40ebf339acd91fb97f34617c2ee625b8e34bf42393/gevent-websocket-0.10.1.tar.gz", hash = "sha256:7eaef32968290c9121f7c35b973e2cc302ffb076d018c9068d2f5ca8b2d85fb0", size = 18366, upload-time = "2017-03-12T22:46:05.68Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/7b/84/2dc373eb6493e00c884cc11e6c059ec97abae2678d42f06bf780570b0193/gevent_websocket-0.10.1-py3-none-any.whl", hash = "sha256:17b67d91282f8f4c973eba0551183fc84f56f1c90c8f6b6b30256f31f66f5242", size = 22987, upload-time = "2017-03-12T22:46:03.611Z" }, +] + [[package]] name = "gitdb" version = "4.0.12" @@ -2570,7 +2588,7 @@ wheels = [ [[package]] name = "google-cloud-resource-manager" -version = "1.15.0" 
+version = "1.16.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "google-api-core", extra = ["grpc"] }, @@ -2580,9 +2598,9 @@ dependencies = [ { name = "proto-plus" }, { name = "protobuf" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/fc/19/b95d0e8814ce42522e434cdd85c0cb6236d874d9adf6685fc8e6d1fda9d1/google_cloud_resource_manager-1.15.0.tar.gz", hash = "sha256:3d0b78c3daa713f956d24e525b35e9e9a76d597c438837171304d431084cedaf", size = 449227, upload-time = "2025-10-20T14:57:01.108Z" } +sdist = { url = "https://files.pythonhosted.org/packages/4e/7f/db00b2820475793a52958dc55fe9ec2eb8e863546e05fcece9b921f86ebe/google_cloud_resource_manager-1.16.0.tar.gz", hash = "sha256:cc938f87cc36c2672f062b1e541650629e0d954c405a4dac35ceedee70c267c3", size = 459840, upload-time = "2026-01-15T13:04:07.726Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/8c/93/5aef41a5f146ad4559dd7040ae5fa8e7ddcab4dfadbef6cb4b66d775e690/google_cloud_resource_manager-1.15.0-py3-none-any.whl", hash = "sha256:0ccde5db644b269ddfdf7b407a2c7b60bdbf459f8e666344a5285601d00c7f6d", size = 397151, upload-time = "2025-10-20T14:53:45.409Z" }, + { url = "https://files.pythonhosted.org/packages/94/ff/4b28bcc791d9d7e4ac8fea00fbd90ccb236afda56746a3b4564d2ae45df3/google_cloud_resource_manager-1.16.0-py3-none-any.whl", hash = "sha256:fb9a2ad2b5053c508e1c407ac31abfd1a22e91c32876c1892830724195819a28", size = 400218, upload-time = "2026-01-15T13:02:47.378Z" }, ] [[package]] @@ -2691,26 +2709,28 @@ wheels = [ [[package]] name = "greenlet" -version = "3.3.0" +version = "3.3.1" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/c7/e5/40dbda2736893e3e53d25838e0f19a2b417dfc122b9989c91918db30b5d3/greenlet-3.3.0.tar.gz", hash = "sha256:a82bb225a4e9e4d653dd2fb7b8b2d36e4fb25bc0165422a11e48b88e9e6f78fb", size = 190651, upload-time = "2025-12-04T14:49:44.05Z" } +sdist = { url = 
"https://files.pythonhosted.org/packages/8a/99/1cd3411c56a410994669062bd73dd58270c00cc074cac15f385a1fd91f8a/greenlet-3.3.1.tar.gz", hash = "sha256:41848f3230b58c08bb43dee542e74a2a2e34d3c59dc3076cec9151aeeedcae98", size = 184690, upload-time = "2026-01-23T15:31:02.076Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/1f/cb/48e964c452ca2b92175a9b2dca037a553036cb053ba69e284650ce755f13/greenlet-3.3.0-cp311-cp311-macosx_11_0_universal2.whl", hash = "sha256:e29f3018580e8412d6aaf5641bb7745d38c85228dacf51a73bd4e26ddf2a6a8e", size = 274908, upload-time = "2025-12-04T14:23:26.435Z" }, - { url = "https://files.pythonhosted.org/packages/28/da/38d7bff4d0277b594ec557f479d65272a893f1f2a716cad91efeb8680953/greenlet-3.3.0-cp311-cp311-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:a687205fb22794e838f947e2194c0566d3812966b41c78709554aa883183fb62", size = 577113, upload-time = "2025-12-04T14:50:05.493Z" }, - { url = "https://files.pythonhosted.org/packages/3c/f2/89c5eb0faddc3ff014f1c04467d67dee0d1d334ab81fadbf3744847f8a8a/greenlet-3.3.0-cp311-cp311-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:4243050a88ba61842186cb9e63c7dfa677ec146160b0efd73b855a3d9c7fcf32", size = 590338, upload-time = "2025-12-04T14:57:41.136Z" }, - { url = "https://files.pythonhosted.org/packages/80/d7/db0a5085035d05134f8c089643da2b44cc9b80647c39e93129c5ef170d8f/greenlet-3.3.0-cp311-cp311-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:670d0f94cd302d81796e37299bcd04b95d62403883b24225c6b5271466612f45", size = 601098, upload-time = "2025-12-04T15:07:11.898Z" }, - { url = "https://files.pythonhosted.org/packages/dc/a6/e959a127b630a58e23529972dbc868c107f9d583b5a9f878fb858c46bc1a/greenlet-3.3.0-cp311-cp311-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:6cb3a8ec3db4a3b0eb8a3c25436c2d49e3505821802074969db017b87bc6a948", size = 590206, upload-time = "2025-12-04T14:26:01.254Z" }, - { url = 
"https://files.pythonhosted.org/packages/48/60/29035719feb91798693023608447283b266b12efc576ed013dd9442364bb/greenlet-3.3.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:2de5a0b09eab81fc6a382791b995b1ccf2b172a9fec934747a7a23d2ff291794", size = 1550668, upload-time = "2025-12-04T15:04:22.439Z" }, - { url = "https://files.pythonhosted.org/packages/0a/5f/783a23754b691bfa86bd72c3033aa107490deac9b2ef190837b860996c9f/greenlet-3.3.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:4449a736606bd30f27f8e1ff4678ee193bc47f6ca810d705981cfffd6ce0d8c5", size = 1615483, upload-time = "2025-12-04T14:27:28.083Z" }, - { url = "https://files.pythonhosted.org/packages/1d/d5/c339b3b4bc8198b7caa4f2bd9fd685ac9f29795816d8db112da3d04175bb/greenlet-3.3.0-cp311-cp311-win_amd64.whl", hash = "sha256:7652ee180d16d447a683c04e4c5f6441bae7ba7b17ffd9f6b3aff4605e9e6f71", size = 301164, upload-time = "2025-12-04T14:42:51.577Z" }, - { url = "https://files.pythonhosted.org/packages/f8/0a/a3871375c7b9727edaeeea994bfff7c63ff7804c9829c19309ba2e058807/greenlet-3.3.0-cp312-cp312-macosx_11_0_universal2.whl", hash = "sha256:b01548f6e0b9e9784a2c99c5651e5dc89ffcbe870bc5fb2e5ef864e9cc6b5dcb", size = 276379, upload-time = "2025-12-04T14:23:30.498Z" }, - { url = "https://files.pythonhosted.org/packages/43/ab/7ebfe34dce8b87be0d11dae91acbf76f7b8246bf9d6b319c741f99fa59c6/greenlet-3.3.0-cp312-cp312-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:349345b770dc88f81506c6861d22a6ccd422207829d2c854ae2af8025af303e3", size = 597294, upload-time = "2025-12-04T14:50:06.847Z" }, - { url = "https://files.pythonhosted.org/packages/a4/39/f1c8da50024feecd0793dbd5e08f526809b8ab5609224a2da40aad3a7641/greenlet-3.3.0-cp312-cp312-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:e8e18ed6995e9e2c0b4ed264d2cf89260ab3ac7e13555b8032b25a74c6d18655", size = 607742, upload-time = "2025-12-04T14:57:42.349Z" }, - { url = 
"https://files.pythonhosted.org/packages/77/cb/43692bcd5f7a0da6ec0ec6d58ee7cddb606d055ce94a62ac9b1aa481e969/greenlet-3.3.0-cp312-cp312-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:c024b1e5696626890038e34f76140ed1daf858e37496d33f2af57f06189e70d7", size = 622297, upload-time = "2025-12-04T15:07:13.552Z" }, - { url = "https://files.pythonhosted.org/packages/75/b0/6bde0b1011a60782108c01de5913c588cf51a839174538d266de15e4bf4d/greenlet-3.3.0-cp312-cp312-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:047ab3df20ede6a57c35c14bf5200fcf04039d50f908270d3f9a7a82064f543b", size = 609885, upload-time = "2025-12-04T14:26:02.368Z" }, - { url = "https://files.pythonhosted.org/packages/49/0e/49b46ac39f931f59f987b7cd9f34bfec8ef81d2a1e6e00682f55be5de9f4/greenlet-3.3.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:2d9ad37fc657b1102ec880e637cccf20191581f75c64087a549e66c57e1ceb53", size = 1567424, upload-time = "2025-12-04T15:04:23.757Z" }, - { url = "https://files.pythonhosted.org/packages/05/f5/49a9ac2dff7f10091935def9165c90236d8f175afb27cbed38fb1d61ab6b/greenlet-3.3.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:83cd0e36932e0e7f36a64b732a6f60c2fc2df28c351bae79fbaf4f8092fe7614", size = 1636017, upload-time = "2025-12-04T14:27:29.688Z" }, - { url = "https://files.pythonhosted.org/packages/6c/79/3912a94cf27ec503e51ba493692d6db1e3cd8ac7ac52b0b47c8e33d7f4f9/greenlet-3.3.0-cp312-cp312-win_amd64.whl", hash = "sha256:a7a34b13d43a6b78abf828a6d0e87d3385680eaf830cd60d20d52f249faabf39", size = 301964, upload-time = "2025-12-04T14:36:58.316Z" }, + { url = "https://files.pythonhosted.org/packages/ec/e8/2e1462c8fdbe0f210feb5ac7ad2d9029af8be3bf45bd9fa39765f821642f/greenlet-3.3.1-cp311-cp311-macosx_11_0_universal2.whl", hash = "sha256:5fd23b9bc6d37b563211c6abbb1b3cab27db385a4449af5c32e932f93017080c", size = 274974, upload-time = "2026-01-23T15:31:02.891Z" }, + { url = 
"https://files.pythonhosted.org/packages/7e/a8/530a401419a6b302af59f67aaf0b9ba1015855ea7e56c036b5928793c5bd/greenlet-3.3.1-cp311-cp311-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:09f51496a0bfbaa9d74d36a52d2580d1ef5ed4fdfcff0a73730abfbbbe1403dd", size = 577175, upload-time = "2026-01-23T16:00:56.213Z" }, + { url = "https://files.pythonhosted.org/packages/8e/89/7e812bb9c05e1aaef9b597ac1d0962b9021d2c6269354966451e885c4e6b/greenlet-3.3.1-cp311-cp311-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:cb0feb07fe6e6a74615ee62a880007d976cf739b6669cce95daa7373d4fc69c5", size = 590401, upload-time = "2026-01-23T16:05:26.365Z" }, + { url = "https://files.pythonhosted.org/packages/70/ae/e2d5f0e59b94a2269b68a629173263fa40b63da32f5c231307c349315871/greenlet-3.3.1-cp311-cp311-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:67ea3fc73c8cd92f42467a72b75e8f05ed51a0e9b1d15398c913416f2dafd49f", size = 601161, upload-time = "2026-01-23T16:15:53.456Z" }, + { url = "https://files.pythonhosted.org/packages/5c/ae/8d472e1f5ac5efe55c563f3eabb38c98a44b832602e12910750a7c025802/greenlet-3.3.1-cp311-cp311-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:39eda9ba259cc9801da05351eaa8576e9aa83eb9411e8f0c299e05d712a210f2", size = 590272, upload-time = "2026-01-23T15:32:49.411Z" }, + { url = "https://files.pythonhosted.org/packages/a8/51/0fde34bebfcadc833550717eade64e35ec8738e6b097d5d248274a01258b/greenlet-3.3.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:e2e7e882f83149f0a71ac822ebf156d902e7a5d22c9045e3e0d1daf59cee2cc9", size = 1550729, upload-time = "2026-01-23T16:04:20.867Z" }, + { url = "https://files.pythonhosted.org/packages/16/c9/2fb47bee83b25b119d5a35d580807bb8b92480a54b68fef009a02945629f/greenlet-3.3.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:80aa4d79eb5564f2e0a6144fcc744b5a37c56c4a92d60920720e99210d88db0f", size = 1615552, upload-time = "2026-01-23T15:33:45.743Z" }, + { url = 
"https://files.pythonhosted.org/packages/1f/54/dcf9f737b96606f82f8dd05becfb8d238db0633dd7397d542a296fe9cad3/greenlet-3.3.1-cp311-cp311-win_amd64.whl", hash = "sha256:32e4ca9777c5addcbf42ff3915d99030d8e00173a56f80001fb3875998fe410b", size = 226462, upload-time = "2026-01-23T15:36:50.422Z" }, + { url = "https://files.pythonhosted.org/packages/91/37/61e1015cf944ddd2337447d8e97fb423ac9bc21f9963fb5f206b53d65649/greenlet-3.3.1-cp311-cp311-win_arm64.whl", hash = "sha256:da19609432f353fed186cc1b85e9440db93d489f198b4bdf42ae19cc9d9ac9b4", size = 225715, upload-time = "2026-01-23T15:33:17.298Z" }, + { url = "https://files.pythonhosted.org/packages/f9/c8/9d76a66421d1ae24340dfae7e79c313957f6e3195c144d2c73333b5bfe34/greenlet-3.3.1-cp312-cp312-macosx_11_0_universal2.whl", hash = "sha256:7e806ca53acf6d15a888405880766ec84721aa4181261cd11a457dfe9a7a4975", size = 276443, upload-time = "2026-01-23T15:30:10.066Z" }, + { url = "https://files.pythonhosted.org/packages/81/99/401ff34bb3c032d1f10477d199724f5e5f6fbfb59816ad1455c79c1eb8e7/greenlet-3.3.1-cp312-cp312-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d842c94b9155f1c9b3058036c24ffb8ff78b428414a19792b2380be9cecf4f36", size = 597359, upload-time = "2026-01-23T16:00:57.394Z" }, + { url = "https://files.pythonhosted.org/packages/2b/bc/4dcc0871ed557792d304f50be0f7487a14e017952ec689effe2180a6ff35/greenlet-3.3.1-cp312-cp312-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:20fedaadd422fa02695f82093f9a98bad3dab5fcda793c658b945fcde2ab27ba", size = 607805, upload-time = "2026-01-23T16:05:28.068Z" }, + { url = "https://files.pythonhosted.org/packages/3b/cd/7a7ca57588dac3389e97f7c9521cb6641fd8b6602faf1eaa4188384757df/greenlet-3.3.1-cp312-cp312-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:c620051669fd04ac6b60ebc70478210119c56e2d5d5df848baec4312e260e4ca", size = 622363, upload-time = "2026-01-23T16:15:54.754Z" }, + { url = 
"https://files.pythonhosted.org/packages/cf/05/821587cf19e2ce1f2b24945d890b164401e5085f9d09cbd969b0c193cd20/greenlet-3.3.1-cp312-cp312-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:14194f5f4305800ff329cbf02c5fcc88f01886cadd29941b807668a45f0d2336", size = 609947, upload-time = "2026-01-23T15:32:51.004Z" }, + { url = "https://files.pythonhosted.org/packages/a4/52/ee8c46ed9f8babaa93a19e577f26e3d28a519feac6350ed6f25f1afee7e9/greenlet-3.3.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:7b2fe4150a0cf59f847a67db8c155ac36aed89080a6a639e9f16df5d6c6096f1", size = 1567487, upload-time = "2026-01-23T16:04:22.125Z" }, + { url = "https://files.pythonhosted.org/packages/8f/7c/456a74f07029597626f3a6db71b273a3632aecb9afafeeca452cfa633197/greenlet-3.3.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:49f4ad195d45f4a66a0eb9c1ba4832bb380570d361912fa3554746830d332149", size = 1636087, upload-time = "2026-01-23T15:33:47.486Z" }, + { url = "https://files.pythonhosted.org/packages/34/2f/5e0e41f33c69655300a5e54aeb637cf8ff57f1786a3aba374eacc0228c1d/greenlet-3.3.1-cp312-cp312-win_amd64.whl", hash = "sha256:cc98b9c4e4870fa983436afa999d4eb16b12872fab7071423d5262fa7120d57a", size = 227156, upload-time = "2026-01-23T15:34:34.808Z" }, + { url = "https://files.pythonhosted.org/packages/c8/ab/717c58343cf02c5265b531384b248787e04d8160b8afe53d9eec053d7b44/greenlet-3.3.1-cp312-cp312-win_arm64.whl", hash = "sha256:bfb2d1763d777de5ee495c85309460f6fd8146e50ec9d0ae0183dbf6f0a829d1", size = 226403, upload-time = "2026-01-23T15:31:39.372Z" }, ] [[package]] @@ -2970,14 +2990,14 @@ wheels = [ [[package]] name = "httplib2" -version = "0.31.1" +version = "0.31.2" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "pyparsing" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/77/df/6eb1d485a513776bbdbb1c919b72e59b5acc51c5e7ef28ad1cd444e252a3/httplib2-0.31.1.tar.gz", hash = 
"sha256:21591655ac54953624c6ab8d587c71675e379e31e2cfe3147c83c11e9ef41f92", size = 250746, upload-time = "2026-01-13T12:14:14.365Z" } +sdist = { url = "https://files.pythonhosted.org/packages/c1/1f/e86365613582c027dda5ddb64e1010e57a3d53e99ab8a72093fa13d565ec/httplib2-0.31.2.tar.gz", hash = "sha256:385e0869d7397484f4eab426197a4c020b606edd43372492337c0b4010ae5d24", size = 250800, upload-time = "2026-01-23T11:04:44.165Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/f0/d8/1b05076441c2f01e4b64f59e5255edc2f0384a711b6d618845c023dc269b/httplib2-0.31.1-py3-none-any.whl", hash = "sha256:d520d22fa7e50c746a7ed856bac298c4300105d01bc2d8c2580a9b57fb9ed617", size = 91101, upload-time = "2026-01-13T12:14:12.676Z" }, + { url = "https://files.pythonhosted.org/packages/2f/90/fd509079dfcab01102c0fdd87f3a9506894bc70afcf9e9785ef6b2b3aff6/httplib2-0.31.2-py3-none-any.whl", hash = "sha256:dbf0c2fa3862acf3c55c078ea9c0bc4481d7dc5117cae71be9514912cf9f8349", size = 91099, upload-time = "2026-01-23T11:04:42.78Z" }, ] [[package]] @@ -3073,14 +3093,14 @@ wheels = [ [[package]] name = "hypothesis" -version = "6.150.2" +version = "6.150.3" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "sortedcontainers" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/d2/19/a4eee0c98e2ec678854272f79646f34943f8fbbc42689cc355b530c5bc96/hypothesis-6.150.2.tar.gz", hash = "sha256:deb043c41c53eaf0955f4a08739c2a34c3d8040ee3d9a2da0aa5470122979f75", size = 475250, upload-time = "2026-01-13T17:09:22.146Z" } +sdist = { url = "https://files.pythonhosted.org/packages/86/eb/1b0359b52d2136a7f4b8112e60148ef8b5ba480d450f409ed63e18d4c8d2/hypothesis-6.150.3.tar.gz", hash = "sha256:32c88d4b7df3a8483e69877561b520320bf7779b0709c11869e392025e9279d4", size = 475542, upload-time = "2026-01-23T07:53:09.716Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/b3/5e/21caad4acf45db7caf730cca1bc61422283e4c4e841efbc862d17ab81a21/hypothesis-6.150.2-py3-none-any.whl", hash 
= "sha256:648d6a2be435889e713ba3d335b0fb5e7a250f569b56e6867887c1e7a0d1f02f", size = 542712, upload-time = "2026-01-13T17:09:19.945Z" }, + { url = "https://files.pythonhosted.org/packages/69/6b/c94ca780814aa4e73b52a8d5e9092f0b83f4732a9ecb3d1f8333c93ac131/hypothesis-6.150.3-py3-none-any.whl", hash = "sha256:5577c0f8eff5ac54a8aff1ce32e30c5454167c29360fdabf1bfea0539b1689f9", size = 542960, upload-time = "2026-01-23T07:53:07.309Z" }, ] [[package]] @@ -3259,11 +3279,11 @@ wheels = [ [[package]] name = "json-repair" -version = "0.55.0" +version = "0.55.1" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/96/74/0f39677fa7c0127129c3f1a37c94d05c30a968ba3047200e54dea375b09a/json_repair-0.55.0.tar.gz", hash = "sha256:9fafb47d92582ef4bdd3520656bdb0fcb37b46cf6aa99c1926b7895abc0a3a4b", size = 38828, upload-time = "2026-01-01T20:29:02.684Z" } +sdist = { url = "https://files.pythonhosted.org/packages/c0/de/71d6bb078d167c0d0959776cee6b6bb8d2ad843f512a5222d7151dde4955/json_repair-0.55.1.tar.gz", hash = "sha256:b27aa0f6bf2e5bf58554037468690446ef26f32ca79c8753282adb3df25fb888", size = 39231, upload-time = "2026-01-23T09:37:20.93Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/bc/00/d7d6b6b3257f9b1f997a558b6f7087b8af7c0a2f525f4fbd864c267a88ab/json_repair-0.55.0-py3-none-any.whl", hash = "sha256:bcf4880f5e6ad21a0f70ab034e3d1d398c2ae9698dc5717d7015afbac77b8ed7", size = 29570, upload-time = "2026-01-01T20:29:01.042Z" }, + { url = "https://files.pythonhosted.org/packages/56/da/289ba9eb550ae420cfc457926f6c49b87cacf8083ee9927e96921888a665/json_repair-0.55.1-py3-none-any.whl", hash = "sha256:a1bcc151982a12bc3ef9e9528198229587b1074999cfe08921ab6333b0c8e206", size = 29743, upload-time = "2026-01-23T09:37:19.404Z" }, ] [[package]] @@ -3319,61 +3339,22 @@ wheels = [ [[package]] name = "kubernetes" -version = "33.1.0" +version = "35.0.0" source = { registry = "https://pypi.org/simple" } -resolution-markers = [ - 
"python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'", -] dependencies = [ - { name = "certifi", marker = "python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'" }, - { name = "durationpy", marker = "python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'" }, - { name = "google-auth", marker = "python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'" }, - { name = "oauthlib", marker = "python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'" }, - { name = "python-dateutil", marker = "python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'" }, - { name = "pyyaml", marker = "python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'" }, - { name = "requests", marker = "python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'" }, - { name = "requests-oauthlib", marker = "python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'" }, - { name = "six", marker = "python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'" }, - { name = "urllib3", version = "2.6.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'" }, - { name = "websocket-client", marker = "python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'" }, + { name = "certifi" }, + { name = "durationpy" }, + { name = "python-dateutil" }, + { name = "pyyaml" }, + { name = "requests" }, + { name = "requests-oauthlib" }, + { name = "six" }, + { name = "urllib3" }, + { name = 
"websocket-client" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/ae/52/19ebe8004c243fdfa78268a96727c71e08f00ff6fe69a301d0b7fcbce3c2/kubernetes-33.1.0.tar.gz", hash = "sha256:f64d829843a54c251061a8e7a14523b521f2dc5c896cf6d65ccf348648a88993", size = 1036779, upload-time = "2025-06-09T21:57:58.521Z" } +sdist = { url = "https://files.pythonhosted.org/packages/2c/8f/85bf51ad4150f64e8c665daf0d9dfe9787ae92005efb9a4d1cba592bd79d/kubernetes-35.0.0.tar.gz", hash = "sha256:3d00d344944239821458b9efd484d6df9f011da367ecb155dadf9513f05f09ee", size = 1094642, upload-time = "2026-01-16T01:05:27.76Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/89/43/d9bebfc3db7dea6ec80df5cb2aad8d274dd18ec2edd6c4f21f32c237cbbb/kubernetes-33.1.0-py2.py3-none-any.whl", hash = "sha256:544de42b24b64287f7e0aa9513c93cb503f7f40eea39b20f66810011a86eabc5", size = 1941335, upload-time = "2025-06-09T21:57:56.327Z" }, -] - -[[package]] -name = "kubernetes" -version = "34.1.0" -source = { registry = "https://pypi.org/simple" } -resolution-markers = [ - "python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform != 'linux'", - "python_full_version >= '3.12' and python_full_version < '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'", - "python_full_version >= '3.12' and python_full_version < '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform != 'linux'", - "python_full_version >= '3.12.4' and platform_python_implementation == 'PyPy' and sys_platform == 'linux'", - "python_full_version >= '3.12.4' and platform_python_implementation == 'PyPy' and sys_platform != 'linux'", - "python_full_version >= '3.12' and python_full_version < '3.12.4' and platform_python_implementation == 'PyPy' and sys_platform == 'linux'", - "python_full_version >= '3.12' and python_full_version < '3.12.4' and platform_python_implementation == 'PyPy' and sys_platform != 'linux'", - "python_full_version < '3.12' and 
platform_python_implementation != 'PyPy' and sys_platform == 'linux'", - "python_full_version < '3.12' and platform_python_implementation != 'PyPy' and sys_platform != 'linux'", - "python_full_version < '3.12' and platform_python_implementation == 'PyPy' and sys_platform == 'linux'", - "python_full_version < '3.12' and platform_python_implementation == 'PyPy' and sys_platform != 'linux'", -] -dependencies = [ - { name = "certifi", marker = "(python_full_version < '3.12.4' and platform_python_implementation != 'PyPy') or platform_python_implementation == 'PyPy' or sys_platform != 'linux'" }, - { name = "durationpy", marker = "(python_full_version < '3.12.4' and platform_python_implementation != 'PyPy') or platform_python_implementation == 'PyPy' or sys_platform != 'linux'" }, - { name = "google-auth", marker = "(python_full_version < '3.12.4' and platform_python_implementation != 'PyPy') or platform_python_implementation == 'PyPy' or sys_platform != 'linux'" }, - { name = "python-dateutil", marker = "(python_full_version < '3.12.4' and platform_python_implementation != 'PyPy') or platform_python_implementation == 'PyPy' or sys_platform != 'linux'" }, - { name = "pyyaml", marker = "(python_full_version < '3.12.4' and platform_python_implementation != 'PyPy') or platform_python_implementation == 'PyPy' or sys_platform != 'linux'" }, - { name = "requests", marker = "(python_full_version < '3.12.4' and platform_python_implementation != 'PyPy') or platform_python_implementation == 'PyPy' or sys_platform != 'linux'" }, - { name = "requests-oauthlib", marker = "(python_full_version < '3.12.4' and platform_python_implementation != 'PyPy') or platform_python_implementation == 'PyPy' or sys_platform != 'linux'" }, - { name = "six", marker = "(python_full_version < '3.12.4' and platform_python_implementation != 'PyPy') or platform_python_implementation == 'PyPy' or sys_platform != 'linux'" }, - { name = "urllib3", version = "2.3.0", source = { registry = 
"https://pypi.org/simple" }, marker = "(python_full_version < '3.12.4' and platform_python_implementation != 'PyPy') or platform_python_implementation == 'PyPy' or sys_platform != 'linux'" }, - { name = "websocket-client", marker = "(python_full_version < '3.12.4' and platform_python_implementation != 'PyPy') or platform_python_implementation == 'PyPy' or sys_platform != 'linux'" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/ef/55/3f880ef65f559cbed44a9aa20d3bdbc219a2c3a3bac4a30a513029b03ee9/kubernetes-34.1.0.tar.gz", hash = "sha256:8fe8edb0b5d290a2f3ac06596b23f87c658977d46b5f8df9d0f4ea83d0003912", size = 1083771, upload-time = "2025-09-29T20:23:49.283Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/ca/ec/65f7d563aa4a62dd58777e8f6aa882f15db53b14eb29aba0c28a20f7eb26/kubernetes-34.1.0-py2.py3-none-any.whl", hash = "sha256:bffba2272534e224e6a7a74d582deb0b545b7c9879d2cd9e4aae9481d1f2cc2a", size = 2008380, upload-time = "2025-09-29T20:23:47.684Z" }, + { url = "https://files.pythonhosted.org/packages/0c/70/05b685ea2dffcb2adbf3cdcea5d8865b7bc66f67249084cf845012a0ff13/kubernetes-35.0.0-py2.py3-none-any.whl", hash = "sha256:39e2b33b46e5834ef6c3985ebfe2047ab39135d41de51ce7641a7ca5b372a13d", size = 2017602, upload-time = "2026-01-16T01:05:25.991Z" }, ] [[package]] @@ -3838,14 +3819,14 @@ wheels = [ [[package]] name = "mypy-boto3-bedrock-runtime" -version = "1.42.3" +version = "1.42.31" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "typing-extensions", marker = "python_full_version < '3.12'" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/1b/95/cb46d84a7a1408e14cac8a8dbbb24a612e438dd10b5f284fb5e01deece3a/mypy_boto3_bedrock_runtime-1.42.3.tar.gz", hash = "sha256:15686cf925719f14bc0d6c85530808736005fb431f007e37d40e10daff4032cc", size = 29476, upload-time = "2025-12-04T20:56:45.423Z" } +sdist = { url = 
"https://files.pythonhosted.org/packages/90/c9/394b33a2797b78a82af4944dcde156b2f5242b5081996e6cabe809a83089/mypy_boto3_bedrock_runtime-1.42.31.tar.gz", hash = "sha256:a661d1aaadd49660dcf0bcf92beba3546047d06b4744fe5fc5b658ecd165b157", size = 29605, upload-time = "2026-01-20T21:18:32.159Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/99/19/f8a855cb1ff098afb6d05e2caa681ee34e1a7e4eb51b273ceb1e9c434a4e/mypy_boto3_bedrock_runtime-1.42.3-py3-none-any.whl", hash = "sha256:e28650aaa64ed4ff84ea75aac681c3a0fab0d2c4dd7dc22c00726949dadf87ea", size = 35631, upload-time = "2025-12-04T20:56:43.487Z" }, + { url = "https://files.pythonhosted.org/packages/06/00/fd3e043e2b9e44229ebf6d4087c7bc8f9fdb87687a999dd1fa06c1a22e4f/mypy_boto3_bedrock_runtime-1.42.31-py3-none-any.whl", hash = "sha256:420961c6c22a9dfdb69bbcc725bff01ae59c6cc347a144e8092aaf9bec1dcdd2", size = 35781, upload-time = "2026-01-20T21:18:29.62Z" }, ] [[package]] @@ -3859,21 +3840,13 @@ wheels = [ [[package]] name = "mysql-connector-python" -version = "9.5.0" +version = "9.6.0" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/39/33/b332b001bc8c5ee09255a0d4b09a254da674450edd6a3e5228b245ca82a0/mysql_connector_python-9.5.0.tar.gz", hash = "sha256:92fb924285a86d8c146ebd63d94f9eaefa548da7813bc46271508fdc6cc1d596", size = 12251077, upload-time = "2025-10-22T09:05:45.423Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/05/03/77347d58b0027ce93a41858477e08422e498c6ebc24348b1f725ed7a67ae/mysql_connector_python-9.5.0-cp311-cp311-macosx_14_0_arm64.whl", hash = "sha256:653e70cd10cf2d18dd828fae58dff5f0f7a5cf7e48e244f2093314dddf84a4b9", size = 17578984, upload-time = "2025-10-22T09:01:41.213Z" }, - { url = "https://files.pythonhosted.org/packages/a5/bb/0f45c7ee55ebc56d6731a593d85c0e7f25f83af90a094efebfd5be9fe010/mysql_connector_python-9.5.0-cp311-cp311-macosx_14_0_x86_64.whl", hash = 
"sha256:5add93f60b3922be71ea31b89bc8a452b876adbb49262561bd559860dae96b3f", size = 18445067, upload-time = "2025-10-22T09:01:43.215Z" }, - { url = "https://files.pythonhosted.org/packages/1c/ec/054de99d4aa50d851a37edca9039280f7194cc1bfd30aab38f5bd6977ebe/mysql_connector_python-9.5.0-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:20950a5e44896c03e3dc93ceb3a5e9b48c9acae18665ca6e13249b3fe5b96811", size = 33668029, upload-time = "2025-10-22T09:01:45.74Z" }, - { url = "https://files.pythonhosted.org/packages/90/a2/e6095dc3a7ad5c959fe4a65681db63af131f572e57cdffcc7816bc84e3ad/mysql_connector_python-9.5.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:7fdd3205b9242c284019310fa84437f3357b13f598e3f9b5d80d337d4a6406b8", size = 34101687, upload-time = "2025-10-22T09:01:48.462Z" }, - { url = "https://files.pythonhosted.org/packages/9c/88/bc13c33fca11acaf808bd1809d8602d78f5bb84f7b1e7b1a288c383a14fd/mysql_connector_python-9.5.0-cp311-cp311-win_amd64.whl", hash = "sha256:c021d8b0830958b28712c70c53b206b4cf4766948dae201ea7ca588a186605e0", size = 16511749, upload-time = "2025-10-22T09:01:51.032Z" }, - { url = "https://files.pythonhosted.org/packages/02/89/167ebee82f4b01ba7339c241c3cc2518886a2be9f871770a1efa81b940a0/mysql_connector_python-9.5.0-cp312-cp312-macosx_14_0_arm64.whl", hash = "sha256:a72c2ef9d50b84f3c567c31b3bf30901af740686baa2a4abead5f202e0b7ea61", size = 17581904, upload-time = "2025-10-22T09:01:53.21Z" }, - { url = "https://files.pythonhosted.org/packages/67/46/630ca969ce10b30fdc605d65dab4a6157556d8cc3b77c724f56c2d83cb79/mysql_connector_python-9.5.0-cp312-cp312-macosx_14_0_x86_64.whl", hash = "sha256:bd9ba5a946cfd3b3b2688a75135357e862834b0321ed936fd968049be290872b", size = 18448195, upload-time = "2025-10-22T09:01:55.378Z" }, - { url = "https://files.pythonhosted.org/packages/f6/87/4c421f41ad169d8c9065ad5c46673c7af889a523e4899c1ac1d6bfd37262/mysql_connector_python-9.5.0-cp312-cp312-manylinux_2_28_aarch64.whl", hash = 
"sha256:5ef7accbdf8b5f6ec60d2a1550654b7e27e63bf6f7b04020d5fb4191fb02bc4d", size = 33668638, upload-time = "2025-10-22T09:01:57.896Z" }, - { url = "https://files.pythonhosted.org/packages/a6/01/67cf210d50bfefbb9224b9a5c465857c1767388dade1004c903c8e22a991/mysql_connector_python-9.5.0-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:a6e0a4a0274d15e3d4c892ab93f58f46431222117dba20608178dfb2cc4d5fd8", size = 34102899, upload-time = "2025-10-22T09:02:00.291Z" }, - { url = "https://files.pythonhosted.org/packages/cd/ef/3d1a67d503fff38cc30e11d111cf28f0976987fb175f47b10d44494e1080/mysql_connector_python-9.5.0-cp312-cp312-win_amd64.whl", hash = "sha256:b6c69cb37600b7e22f476150034e2afbd53342a175e20aea887f8158fc5e3ff6", size = 16512684, upload-time = "2025-10-22T09:02:02.411Z" }, - { url = "https://files.pythonhosted.org/packages/95/e1/45373c06781340c7b74fe9b88b85278ac05321889a307eaa5be079a997d4/mysql_connector_python-9.5.0-py2.py3-none-any.whl", hash = "sha256:ace137b88eb6fdafa1e5b2e03ac76ce1b8b1844b3a4af1192a02ae7c1a45bdee", size = 479047, upload-time = "2025-10-22T09:02:27.809Z" }, + { url = "https://files.pythonhosted.org/packages/2a/08/0e9bce000736454c2b8bb4c40bded79328887483689487dad7df4cf59fb7/mysql_connector_python-9.6.0-cp311-cp311-macosx_14_0_arm64.whl", hash = "sha256:011931f7392a1087e10d305b0303f2a20cc1af2c1c8a15cd5691609aa95dfcbd", size = 17582646, upload-time = "2026-01-21T09:04:48.327Z" }, + { url = "https://files.pythonhosted.org/packages/93/aa/3dd4db039fc6a9bcbdbade83be9914ead6786c0be4918170dfaf89327b76/mysql_connector_python-9.6.0-cp311-cp311-macosx_14_0_x86_64.whl", hash = "sha256:b5212372aff6833473d2560ac87d3df9fb2498d0faacb7ebf231d947175fa36a", size = 18449358, upload-time = "2026-01-21T09:04:50.278Z" }, + { url = "https://files.pythonhosted.org/packages/53/38/ecd6d35382b6265ff5f030464d53b45e51ff2c2523ab88771c277fd84c05/mysql_connector_python-9.6.0-cp311-cp311-manylinux_2_28_aarch64.whl", hash = 
"sha256:61deca6e243fafbb3cf08ae27bd0c83d0f8188de8456e46aeba0d3db15bb7230", size = 34169309, upload-time = "2026-01-21T09:04:52.402Z" }, + { url = "https://files.pythonhosted.org/packages/18/1d/fe1133eb76089342854d8fbe88e28598f7e06bc684a763d21fc7b23f1d5e/mysql_connector_python-9.6.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:adabbc5e1475cdf5fb6f1902a25edc3bd1e0726fa45f01ab1b8f479ff43b3337", size = 34541101, upload-time = "2026-01-21T09:04:55.897Z" }, ] [[package]] @@ -4168,8 +4141,7 @@ dependencies = [ { name = "python-dateutil" }, { name = "requests" }, { name = "six" }, - { name = "urllib3", version = "2.3.0", source = { registry = "https://pypi.org/simple" }, marker = "(python_full_version < '3.12.4' and platform_python_implementation != 'PyPy') or platform_python_implementation == 'PyPy' or sys_platform != 'linux'" }, - { name = "urllib3", version = "2.6.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'" }, + { name = "urllib3" }, ] sdist = { url = "https://files.pythonhosted.org/packages/e4/dc/acb182db6bb0c71f1e6e41c49260e01d68e52a03efb64e44aed3cc7f483f/opensearch-py-2.4.0.tar.gz", hash = "sha256:7eba2b6ed2ddcf33225bfebfba2aee026877838cc39f760ec80f27827308cc4b", size = 182924, upload-time = "2023-11-15T21:41:37.329Z" } wheels = [ @@ -4776,13 +4748,14 @@ wheels = [ [[package]] name = "polyfile-weave" -version = "0.5.8" +version = "0.5.9" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "abnf" }, { name = "chardet" }, { name = "cint" }, { name = "fickling" }, + { name = "filelock" }, { name = "graphviz" }, { name = "intervaltree" }, { name = "jinja2" }, @@ -4792,11 +4765,10 @@ dependencies = [ { name = "pillow" }, { name = "pyreadline3", marker = "sys_platform == 'win32'" }, { name = "pyyaml" }, - { name = "setuptools" }, ] -sdist = { url = 
"https://files.pythonhosted.org/packages/e7/d4/76e56e4429646d9353b4287794f8324ff94201bdb0a2c35ce88cf3de90d0/polyfile_weave-0.5.8.tar.gz", hash = "sha256:cf2ca6a1351165fbbf2971ace4b8bebbb03b2c00e4f2159ff29bed88854e7b32", size = 5989602, upload-time = "2026-01-08T04:21:26.689Z" } +sdist = { url = "https://files.pythonhosted.org/packages/70/55/e5400762e3884f743d59291e71eaaa9c52dd7e144b75a11911e74ec1bac9/polyfile_weave-0.5.9.tar.gz", hash = "sha256:12341fab03e06ede1bfebbd3627dd24015fde5353ea74ece2da186321b818bdb", size = 6024974, upload-time = "2026-01-22T22:08:48.081Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/54/32/c09fd626366c00325d1981e310be5cac8661c09206098d267a592e0c5000/polyfile_weave-0.5.8-py3-none-any.whl", hash = "sha256:f68c570ef189a4219798a7c797730fc3b7feace7ff5bd7e662490f89b772964a", size = 1656208, upload-time = "2026-01-08T04:21:15.213Z" }, + { url = "https://files.pythonhosted.org/packages/52/94/215005530a48c5f7d4ec4a31acdb5828f2bfb985cc6e577b0eaa5882c0e2/polyfile_weave-0.5.9-py3-none-any.whl", hash = "sha256:6ae4b1b5eeac9f5bfc862474484d6d3e33655fab31749d93af0b0a91fddabfc7", size = 1700174, upload-time = "2026-01-22T22:08:46.346Z" }, ] [[package]] @@ -4827,7 +4799,7 @@ wheels = [ [[package]] name = "posthog" -version = "7.5.1" +version = "7.6.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "backoff" }, @@ -4837,9 +4809,9 @@ dependencies = [ { name = "six" }, { name = "typing-extensions" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/98/3b/866af11cb12e9d35feffcd480d4ebf31f87b2164926b9c670cbdafabc814/posthog-7.5.1.tar.gz", hash = "sha256:d8a8165b3d47465023ea2f919982a34890e2dda76402ec47d6c68424b2534a55", size = 145244, upload-time = "2026-01-08T21:18:39.266Z" } +sdist = { url = "https://files.pythonhosted.org/packages/9d/10/dcbe5d12ba5e62b2a9c9004a80117765468198c44ffef16d2b54f938bddf/posthog-7.6.0.tar.gz", hash = "sha256:941dfd278ee427c9b14640f09b35b5bb52a71bdf028d7dbb7307e1838fd3002e", 
size = 146194, upload-time = "2026-01-19T16:23:04.571Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/1f/03/ba011712ce9d07fe87dcfb72474c388d960e6d0c4f2262d2ae11fd27f0c5/posthog-7.5.1-py3-none-any.whl", hash = "sha256:fd3431ce32c9bbfb1e3775e3633c32ee589c052b0054fafe5ed9e4b17c1969d3", size = 167555, upload-time = "2026-01-08T21:18:37.437Z" }, + { url = "https://files.pythonhosted.org/packages/ae/f6/8d4a2d1b67368fec425f32911e2f3638d5ac9e8abfebc698ac426fcf65db/posthog-7.6.0-py3-none-any.whl", hash = "sha256:c4dd78cf77c4fecceb965f86066e5ac37886ef867d68ffe75a1db5d681d7d9ad", size = 168426, upload-time = "2026-01-19T16:23:02.71Z" }, ] [[package]] @@ -4991,33 +4963,33 @@ wheels = [ [[package]] name = "pyarrow" -version = "22.0.0" +version = "23.0.0" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/30/53/04a7fdc63e6056116c9ddc8b43bc28c12cdd181b85cbeadb79278475f3ae/pyarrow-22.0.0.tar.gz", hash = "sha256:3d600dc583260d845c7d8a6db540339dd883081925da2bd1c5cb808f720b3cd9", size = 1151151, upload-time = "2025-10-24T12:30:00.762Z" } +sdist = { url = "https://files.pythonhosted.org/packages/01/33/ffd9c3eb087fa41dd79c3cf20c4c0ae3cdb877c4f8e1107a446006344924/pyarrow-23.0.0.tar.gz", hash = "sha256:180e3150e7edfcd182d3d9afba72f7cf19839a497cc76555a8dce998a8f67615", size = 1167185, upload-time = "2026-01-18T16:19:42.218Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/2e/b7/18f611a8cdc43417f9394a3ccd3eace2f32183c08b9eddc3d17681819f37/pyarrow-22.0.0-cp311-cp311-macosx_12_0_arm64.whl", hash = "sha256:3e294c5eadfb93d78b0763e859a0c16d4051fc1c5231ae8956d61cb0b5666f5a", size = 34272022, upload-time = "2025-10-24T10:04:28.973Z" }, - { url = "https://files.pythonhosted.org/packages/26/5c/f259e2526c67eb4b9e511741b19870a02363a47a35edbebc55c3178db22d/pyarrow-22.0.0-cp311-cp311-macosx_12_0_x86_64.whl", hash = "sha256:69763ab2445f632d90b504a815a2a033f74332997052b721002298ed6de40f2e", size = 35995834, 
upload-time = "2025-10-24T10:04:35.467Z" }, - { url = "https://files.pythonhosted.org/packages/50/8d/281f0f9b9376d4b7f146913b26fac0aa2829cd1ee7e997f53a27411bbb92/pyarrow-22.0.0-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:b41f37cabfe2463232684de44bad753d6be08a7a072f6a83447eeaf0e4d2a215", size = 45030348, upload-time = "2025-10-24T10:04:43.366Z" }, - { url = "https://files.pythonhosted.org/packages/f5/e5/53c0a1c428f0976bf22f513d79c73000926cb00b9c138d8e02daf2102e18/pyarrow-22.0.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:35ad0f0378c9359b3f297299c3309778bb03b8612f987399a0333a560b43862d", size = 47699480, upload-time = "2025-10-24T10:04:51.486Z" }, - { url = "https://files.pythonhosted.org/packages/95/e1/9dbe4c465c3365959d183e6345d0a8d1dc5b02ca3f8db4760b3bc834cf25/pyarrow-22.0.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:8382ad21458075c2e66a82a29d650f963ce51c7708c7c0ff313a8c206c4fd5e8", size = 48011148, upload-time = "2025-10-24T10:04:59.585Z" }, - { url = "https://files.pythonhosted.org/packages/c5/b4/7caf5d21930061444c3cf4fa7535c82faf5263e22ce43af7c2759ceb5b8b/pyarrow-22.0.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:1a812a5b727bc09c3d7ea072c4eebf657c2f7066155506ba31ebf4792f88f016", size = 50276964, upload-time = "2025-10-24T10:05:08.175Z" }, - { url = "https://files.pythonhosted.org/packages/ae/f3/cec89bd99fa3abf826f14d4e53d3d11340ce6f6af4d14bdcd54cd83b6576/pyarrow-22.0.0-cp311-cp311-win_amd64.whl", hash = "sha256:ec5d40dd494882704fb876c16fa7261a69791e784ae34e6b5992e977bd2e238c", size = 28106517, upload-time = "2025-10-24T10:05:14.314Z" }, - { url = "https://files.pythonhosted.org/packages/af/63/ba23862d69652f85b615ca14ad14f3bcfc5bf1b99ef3f0cd04ff93fdad5a/pyarrow-22.0.0-cp312-cp312-macosx_12_0_arm64.whl", hash = "sha256:bea79263d55c24a32b0d79c00a1c58bb2ee5f0757ed95656b01c0fb310c5af3d", size = 34211578, upload-time = "2025-10-24T10:05:21.583Z" }, - { url = 
"https://files.pythonhosted.org/packages/b1/d0/f9ad86fe809efd2bcc8be32032fa72e8b0d112b01ae56a053006376c5930/pyarrow-22.0.0-cp312-cp312-macosx_12_0_x86_64.whl", hash = "sha256:12fe549c9b10ac98c91cf791d2945e878875d95508e1a5d14091a7aaa66d9cf8", size = 35989906, upload-time = "2025-10-24T10:05:29.485Z" }, - { url = "https://files.pythonhosted.org/packages/b4/a8/f910afcb14630e64d673f15904ec27dd31f1e009b77033c365c84e8c1e1d/pyarrow-22.0.0-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:334f900ff08ce0423407af97e6c26ad5d4e3b0763645559ece6fbf3747d6a8f5", size = 45021677, upload-time = "2025-10-24T10:05:38.274Z" }, - { url = "https://files.pythonhosted.org/packages/13/95/aec81f781c75cd10554dc17a25849c720d54feafb6f7847690478dcf5ef8/pyarrow-22.0.0-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:c6c791b09c57ed76a18b03f2631753a4960eefbbca80f846da8baefc6491fcfe", size = 47726315, upload-time = "2025-10-24T10:05:47.314Z" }, - { url = "https://files.pythonhosted.org/packages/bb/d4/74ac9f7a54cfde12ee42734ea25d5a3c9a45db78f9def949307a92720d37/pyarrow-22.0.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:c3200cb41cdbc65156e5f8c908d739b0dfed57e890329413da2748d1a2cd1a4e", size = 47990906, upload-time = "2025-10-24T10:05:58.254Z" }, - { url = "https://files.pythonhosted.org/packages/2e/71/fedf2499bf7a95062eafc989ace56572f3343432570e1c54e6599d5b88da/pyarrow-22.0.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ac93252226cf288753d8b46280f4edf3433bf9508b6977f8dd8526b521a1bbb9", size = 50306783, upload-time = "2025-10-24T10:06:08.08Z" }, - { url = "https://files.pythonhosted.org/packages/68/ed/b202abd5a5b78f519722f3d29063dda03c114711093c1995a33b8e2e0f4b/pyarrow-22.0.0-cp312-cp312-win_amd64.whl", hash = "sha256:44729980b6c50a5f2bfcc2668d36c569ce17f8b17bccaf470c4313dcbbf13c9d", size = 27972883, upload-time = "2025-10-24T10:06:14.204Z" }, + { url = 
"https://files.pythonhosted.org/packages/aa/c0/57fe251102ca834fee0ef69a84ad33cc0ff9d5dfc50f50b466846356ecd7/pyarrow-23.0.0-cp311-cp311-macosx_12_0_arm64.whl", hash = "sha256:5574d541923efcbfdf1294a2746ae3b8c2498a2dc6cd477882f6f4e7b1ac08d3", size = 34276762, upload-time = "2026-01-18T16:14:34.128Z" }, + { url = "https://files.pythonhosted.org/packages/f8/4e/24130286548a5bc250cbed0b6bbf289a2775378a6e0e6f086ae8c68fc098/pyarrow-23.0.0-cp311-cp311-macosx_12_0_x86_64.whl", hash = "sha256:2ef0075c2488932e9d3c2eb3482f9459c4be629aa673b725d5e3cf18f777f8e4", size = 35821420, upload-time = "2026-01-18T16:14:40.699Z" }, + { url = "https://files.pythonhosted.org/packages/ee/55/a869e8529d487aa2e842d6c8865eb1e2c9ec33ce2786eb91104d2c3e3f10/pyarrow-23.0.0-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:65666fc269669af1ef1c14478c52222a2aa5c907f28b68fb50a203c777e4f60c", size = 44457412, upload-time = "2026-01-18T16:14:49.051Z" }, + { url = "https://files.pythonhosted.org/packages/36/81/1de4f0edfa9a483bbdf0082a05790bd6a20ed2169ea12a65039753be3a01/pyarrow-23.0.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:4d85cb6177198f3812db4788e394b757223f60d9a9f5ad6634b3e32be1525803", size = 47534285, upload-time = "2026-01-18T16:14:56.748Z" }, + { url = "https://files.pythonhosted.org/packages/f2/04/464a052d673b5ece074518f27377861662449f3c1fdb39ce740d646fd098/pyarrow-23.0.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:1a9ff6fa4141c24a03a1a434c63c8fa97ce70f8f36bccabc18ebba905ddf0f17", size = 48157913, upload-time = "2026-01-18T16:15:05.114Z" }, + { url = "https://files.pythonhosted.org/packages/f4/1b/32a4de9856ee6688c670ca2def588382e573cce45241a965af04c2f61687/pyarrow-23.0.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:84839d060a54ae734eb60a756aeacb62885244aaa282f3c968f5972ecc7b1ecc", size = 50582529, upload-time = "2026-01-18T16:15:12.846Z" }, + { url = 
"https://files.pythonhosted.org/packages/db/c7/d6581f03e9b9e44ea60b52d1750ee1a7678c484c06f939f45365a45f7eef/pyarrow-23.0.0-cp311-cp311-win_amd64.whl", hash = "sha256:a149a647dbfe928ce8830a713612aa0b16e22c64feac9d1761529778e4d4eaa5", size = 27542646, upload-time = "2026-01-18T16:15:18.89Z" }, + { url = "https://files.pythonhosted.org/packages/3d/bd/c861d020831ee57609b73ea721a617985ece817684dc82415b0bc3e03ac3/pyarrow-23.0.0-cp312-cp312-macosx_12_0_arm64.whl", hash = "sha256:5961a9f646c232697c24f54d3419e69b4261ba8a8b66b0ac54a1851faffcbab8", size = 34189116, upload-time = "2026-01-18T16:15:28.054Z" }, + { url = "https://files.pythonhosted.org/packages/8c/23/7725ad6cdcbaf6346221391e7b3eecd113684c805b0a95f32014e6fa0736/pyarrow-23.0.0-cp312-cp312-macosx_12_0_x86_64.whl", hash = "sha256:632b3e7c3d232f41d64e1a4a043fb82d44f8a349f339a1188c6a0dd9d2d47d8a", size = 35803831, upload-time = "2026-01-18T16:15:33.798Z" }, + { url = "https://files.pythonhosted.org/packages/57/06/684a421543455cdc2944d6a0c2cc3425b028a4c6b90e34b35580c4899743/pyarrow-23.0.0-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:76242c846db1411f1d6c2cc3823be6b86b40567ee24493344f8226ba34a81333", size = 44436452, upload-time = "2026-01-18T16:15:41.598Z" }, + { url = "https://files.pythonhosted.org/packages/c6/6f/8f9eb40c2328d66e8b097777ddcf38494115ff9f1b5bc9754ba46991191e/pyarrow-23.0.0-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:b73519f8b52ae28127000986bf228fda781e81d3095cd2d3ece76eb5cf760e1b", size = 47557396, upload-time = "2026-01-18T16:15:51.252Z" }, + { url = "https://files.pythonhosted.org/packages/10/6e/f08075f1472e5159553501fde2cc7bc6700944bdabe49a03f8a035ee6ccd/pyarrow-23.0.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:068701f6823449b1b6469120f399a1239766b117d211c5d2519d4ed5861f75de", size = 48147129, upload-time = "2026-01-18T16:16:00.299Z" }, + { url = 
"https://files.pythonhosted.org/packages/7d/82/d5a680cd507deed62d141cc7f07f7944a6766fc51019f7f118e4d8ad0fb8/pyarrow-23.0.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:1801ba947015d10e23bca9dd6ef5d0e9064a81569a89b6e9a63b59224fd060df", size = 50596642, upload-time = "2026-01-18T16:16:08.502Z" }, + { url = "https://files.pythonhosted.org/packages/a9/26/4f29c61b3dce9fa7780303b86895ec6a0917c9af927101daaaf118fbe462/pyarrow-23.0.0-cp312-cp312-win_amd64.whl", hash = "sha256:52265266201ec25b6839bf6bd4ea918ca6d50f31d13e1cf200b4261cd11dc25c", size = 27660628, upload-time = "2026-01-18T16:16:15.28Z" }, ] [[package]] name = "pyasn1" -version = "0.6.1" +version = "0.6.2" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/ba/e9/01f1a64245b89f039897cb0130016d79f77d52669aae6ee7b159a6c4c018/pyasn1-0.6.1.tar.gz", hash = "sha256:6f580d2bdd84365380830acf45550f2511469f673cb4a5ae3857a3170128b034", size = 145322, upload-time = "2024-09-10T22:41:42.55Z" } +sdist = { url = "https://files.pythonhosted.org/packages/fe/b6/6e630dff89739fcd427e3f72b3d905ce0acb85a45d4ec3e2678718a3487f/pyasn1-0.6.2.tar.gz", hash = "sha256:9b59a2b25ba7e4f8197db7686c09fb33e658b98339fadb826e9512629017833b", size = 146586, upload-time = "2026-01-16T18:04:18.534Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/c8/f1/d6a797abb14f6283c0ddff96bbdd46937f64122b8c925cab503dd37f8214/pyasn1-0.6.1-py3-none-any.whl", hash = "sha256:0d632f46f2ba09143da3a8afe9e33fb6f92fa2320ab7e886e2d0f7672af84629", size = 83135, upload-time = "2024-09-11T16:00:36.122Z" }, + { url = "https://files.pythonhosted.org/packages/44/b5/a96872e5184f354da9c84ae119971a0a4c221fe9b27a4d94bd43f2596727/pyasn1-0.6.2-py3-none-any.whl", hash = "sha256:1eb26d860996a18e9b6ed05e7aae0e9fc21619fcee6af91cca9bad4fbea224bf", size = 83371, upload-time = "2026-01-16T18:04:17.174Z" }, ] [[package]] @@ -5034,11 +5006,11 @@ wheels = [ [[package]] name = "pycparser" -version = "2.23" +version = 
"3.0" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/fe/cf/d2d3b9f5699fb1e4615c8e32ff220203e43b248e1dfcc6736ad9057731ca/pycparser-2.23.tar.gz", hash = "sha256:78816d4f24add8f10a06d6f05b4d424ad9e96cfebf68a4ddc99c65c0720d00c2", size = 173734, upload-time = "2025-09-09T13:23:47.91Z" } +sdist = { url = "https://files.pythonhosted.org/packages/1b/7d/92392ff7815c21062bea51aa7b87d45576f649f16458d78b7cf94b9ab2e6/pycparser-3.0.tar.gz", hash = "sha256:600f49d217304a5902ac3c37e1281c9fe94e4d0489de643a9504c5cdfdfc6b29", size = 103492, upload-time = "2026-01-21T14:26:51.89Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/a0/e3/59cd50310fc9b59512193629e1984c1f95e5c8ae6e5d8c69532ccc65a7fe/pycparser-2.23-py3-none-any.whl", hash = "sha256:e5c6e8d3fbad53479cab09ac03729e0a9faf2bee3db8208a550daf5af81a5934", size = 118140, upload-time = "2025-09-09T13:23:46.651Z" }, + { url = "https://files.pythonhosted.org/packages/0c/c3/44f3fbbfa403ea2a7c779186dc20772604442dde72947e7d01069cbe98e3/pycparser-3.0-py3-none-any.whl", hash = "sha256:b727414169a36b7d524c1c3e31839a521725078d7b2ff038656844266160a992", size = 48172, upload-time = "2026-01-21T14:26:50.693Z" }, ] [[package]] @@ -5215,7 +5187,7 @@ wheels = [ [[package]] name = "pyobvector" -version = "0.2.21" +version = "0.2.22" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "aiomysql" }, @@ -5225,9 +5197,9 @@ dependencies = [ { name = "sqlalchemy" }, { name = "sqlglot" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/12/ed/e7a50fc2634d92785a2b2132597b3f045b6f8a06785b7ec304cb9b393c3f/pyobvector-0.2.21.tar.gz", hash = "sha256:e766492edd90c16af140ffce2c2fefffabf09d016c815434fdcc5d659a412e66", size = 72608, upload-time = "2026-01-13T03:02:57.268Z" } +sdist = { url = "https://files.pythonhosted.org/packages/30/b9/443d65757cdfb47d31ef4b9ed0609628ae468e52e57033051e1fad256c59/pyobvector-0.2.22.tar.gz", hash = 
"sha256:0bd4af46cfdfbc67e691d5b49f3b0662f702a7a42a7f7a240f1021af378e793c", size = 72706, upload-time = "2026-01-15T03:19:57.4Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/d2/25/3c8b4c430fe13eb1da2790c42fd486200c44aa5a81688c713e67124ba464/pyobvector-0.2.21-py3-none-any.whl", hash = "sha256:5bd0af468bf60fbfffd855bd86a4af1361bad26c6fe65bfd5a28181698e33148", size = 60565, upload-time = "2026-01-13T03:02:55.737Z" }, + { url = "https://files.pythonhosted.org/packages/e0/88/1583888a4ce85202d93fa03f2817681637465668e8b260ef1b9d5a39c3ca/pyobvector-0.2.22-py3-none-any.whl", hash = "sha256:4a0f5c094af7ca8242fdf9e5111e75544de0a9615491e9ec2f9d218dc909b509", size = 60627, upload-time = "2026-01-15T03:19:55.918Z" }, ] [[package]] @@ -5241,11 +5213,11 @@ wheels = [ [[package]] name = "pyparsing" -version = "3.3.1" +version = "3.3.2" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/33/c1/1d9de9aeaa1b89b0186e5fe23294ff6517fce1bc69149185577cd31016b2/pyparsing-3.3.1.tar.gz", hash = "sha256:47fad0f17ac1e2cad3de3b458570fbc9b03560aa029ed5e16ee5554da9a2251c", size = 1550512, upload-time = "2025-12-23T03:14:04.391Z" } +sdist = { url = "https://files.pythonhosted.org/packages/f3/91/9c6ee907786a473bf81c5f53cf703ba0957b23ab84c264080fb5a450416f/pyparsing-3.3.2.tar.gz", hash = "sha256:c777f4d763f140633dcb6d8a3eda953bf7a214dc4eff598413c070bcdc117cbc", size = 6851574, upload-time = "2026-01-21T03:57:59.36Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/8b/40/2614036cdd416452f5bf98ec037f38a1afb17f327cb8e6b652d4729e0af8/pyparsing-3.3.1-py3-none-any.whl", hash = "sha256:023b5e7e5520ad96642e2c6db4cb683d3970bd640cdf7115049a6e9c3682df82", size = 121793, upload-time = "2025-12-23T03:14:02.103Z" }, + { url = "https://files.pythonhosted.org/packages/10/bd/c038d7cc38edc1aa5bf91ab8068b63d4308c66c4c8bb3cbba7dfbc049f9c/pyparsing-3.3.2-py3-none-any.whl", hash = 
"sha256:850ba148bd908d7e2411587e247a1e4f0327839c40e2e5e6d05a007ecc69911d", size = 122781, upload-time = "2026-01-21T03:57:55.912Z" }, ] [[package]] @@ -5469,6 +5441,18 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/6a/3e/b68c118422ec867fa7ab88444e1274aa40681c606d59ac27de5a5588f082/python_dotenv-1.0.1-py3-none-any.whl", hash = "sha256:f7b63ef50f1b690dddf550d03497b66d609393b40b564ed0d674909a68ebf16a", size = 19863, upload-time = "2024-01-23T06:32:58.246Z" }, ] +[[package]] +name = "python-engineio" +version = "4.13.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "simple-websocket" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/42/5a/349caac055e03ef9e56ed29fa304846063b1771ee54ab8132bf98b29491e/python_engineio-4.13.0.tar.gz", hash = "sha256:f9c51a8754d2742ba832c24b46ed425fdd3064356914edd5a1e8ffde76ab7709", size = 92194, upload-time = "2025-12-24T22:38:05.111Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/50/74/c655a6eda0fd188d490c14142a0f0380655ac7099604e1fbf8fa1a97f0a1/python_engineio-4.13.0-py3-none-any.whl", hash = "sha256:57b94eac094fa07b050c6da59f48b12250ab1cd920765f4849963e3d89ad9de3", size = 59676, upload-time = "2025-12-24T22:38:03.56Z" }, +] + [[package]] name = "python-http-client" version = "3.3.7" @@ -5525,6 +5509,19 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/d9/4f/00be2196329ebbff56ce564aa94efb0fbc828d00de250b1980de1a34ab49/python_pptx-1.0.2-py3-none-any.whl", hash = "sha256:160838e0b8565a8b1f67947675886e9fea18aa5e795db7ae531606d68e785cba", size = 472788, upload-time = "2024-08-07T17:33:28.192Z" }, ] +[[package]] +name = "python-socketio" +version = "5.13.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "bidict" }, + { name = "python-engineio" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/21/1a/396d50ccf06ee539fa758ce5623b59a9cb27637fc4b2dc07ed08bf495e77/python_socketio-5.13.0.tar.gz", hash = 
"sha256:ac4e19a0302ae812e23b712ec8b6427ca0521f7c582d6abb096e36e24a263029", size = 121125, upload-time = "2025-04-12T15:46:59.933Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/3c/32/b4fb8585d1be0f68bde7e110dffbcf354915f77ad8c778563f0ad9655c02/python_socketio-5.13.0-py3-none-any.whl", hash = "sha256:51f68d6499f2df8524668c24bcec13ba1414117cfb3a90115c559b601ab10caf", size = 77800, upload-time = "2025-04-12T15:46:58.412Z" }, +] + [[package]] name = "python-socks" version = "2.8.0" @@ -5603,8 +5600,7 @@ dependencies = [ { name = "numpy" }, { name = "portalocker" }, { name = "pydantic" }, - { name = "urllib3", version = "2.3.0", source = { registry = "https://pypi.org/simple" }, marker = "(python_full_version < '3.12.4' and platform_python_implementation != 'PyPy') or platform_python_implementation == 'PyPy' or sys_platform != 'linux'" }, - { name = "urllib3", version = "2.6.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'" }, + { name = "urllib3" }, ] sdist = { url = "https://files.pythonhosted.org/packages/86/cf/db06a74694bf8f126ed4a869c70ef576f01ee691ef20799fba3d561d3565/qdrant_client-1.9.0.tar.gz", hash = "sha256:7b1792f616651a6f0a76312f945c13d088e9451726795b82ce0350f7df3b7981", size = 199999, upload-time = "2024-04-22T13:35:49.444Z" } wheels = [ @@ -5708,38 +5704,42 @@ wheels = [ [[package]] name = "regex" -version = "2025.11.3" +version = "2026.1.15" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/cc/a9/546676f25e573a4cf00fe8e119b78a37b6a8fe2dc95cda877b30889c9c45/regex-2025.11.3.tar.gz", hash = "sha256:1fedc720f9bb2494ce31a58a1631f9c82df6a09b49c19517ea5cc280b4541e01", size = 414669, upload-time = "2025-11-03T21:34:22.089Z" } +sdist = { url = 
"https://files.pythonhosted.org/packages/0b/86/07d5056945f9ec4590b518171c4254a5925832eb727b56d3c38a7476f316/regex-2026.1.15.tar.gz", hash = "sha256:164759aa25575cbc0651bef59a0b18353e54300d79ace8084c818ad8ac72b7d5", size = 414811, upload-time = "2026-01-14T23:18:02.775Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/f7/90/4fb5056e5f03a7048abd2b11f598d464f0c167de4f2a51aa868c376b8c70/regex-2025.11.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:eadade04221641516fa25139273505a1c19f9bf97589a05bc4cfcd8b4a618031", size = 488081, upload-time = "2025-11-03T21:31:11.946Z" }, - { url = "https://files.pythonhosted.org/packages/85/23/63e481293fac8b069d84fba0299b6666df720d875110efd0338406b5d360/regex-2025.11.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:feff9e54ec0dd3833d659257f5c3f5322a12eee58ffa360984b716f8b92983f4", size = 290554, upload-time = "2025-11-03T21:31:13.387Z" }, - { url = "https://files.pythonhosted.org/packages/2b/9d/b101d0262ea293a0066b4522dfb722eb6a8785a8c3e084396a5f2c431a46/regex-2025.11.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:3b30bc921d50365775c09a7ed446359e5c0179e9e2512beec4a60cbcef6ddd50", size = 288407, upload-time = "2025-11-03T21:31:14.809Z" }, - { url = "https://files.pythonhosted.org/packages/0c/64/79241c8209d5b7e00577ec9dca35cd493cc6be35b7d147eda367d6179f6d/regex-2025.11.3-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f99be08cfead2020c7ca6e396c13543baea32343b7a9a5780c462e323bd8872f", size = 793418, upload-time = "2025-11-03T21:31:16.556Z" }, - { url = "https://files.pythonhosted.org/packages/3d/e2/23cd5d3573901ce8f9757c92ca4db4d09600b865919b6d3e7f69f03b1afd/regex-2025.11.3-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:6dd329a1b61c0ee95ba95385fb0c07ea0d3fe1a21e1349fa2bec272636217118", size = 860448, upload-time = "2025-11-03T21:31:18.12Z" }, - { url = 
"https://files.pythonhosted.org/packages/2a/4c/aecf31beeaa416d0ae4ecb852148d38db35391aac19c687b5d56aedf3a8b/regex-2025.11.3-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:4c5238d32f3c5269d9e87be0cf096437b7622b6920f5eac4fd202468aaeb34d2", size = 907139, upload-time = "2025-11-03T21:31:20.753Z" }, - { url = "https://files.pythonhosted.org/packages/61/22/b8cb00df7d2b5e0875f60628594d44dba283e951b1ae17c12f99e332cc0a/regex-2025.11.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:10483eefbfb0adb18ee9474498c9a32fcf4e594fbca0543bb94c48bac6183e2e", size = 800439, upload-time = "2025-11-03T21:31:22.069Z" }, - { url = "https://files.pythonhosted.org/packages/02/a8/c4b20330a5cdc7a8eb265f9ce593f389a6a88a0c5f280cf4d978f33966bc/regex-2025.11.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:78c2d02bb6e1da0720eedc0bad578049cad3f71050ef8cd065ecc87691bed2b0", size = 782965, upload-time = "2025-11-03T21:31:23.598Z" }, - { url = "https://files.pythonhosted.org/packages/b4/4c/ae3e52988ae74af4b04d2af32fee4e8077f26e51b62ec2d12d246876bea2/regex-2025.11.3-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:e6b49cd2aad93a1790ce9cffb18964f6d3a4b0b3dbdbd5de094b65296fce6e58", size = 854398, upload-time = "2025-11-03T21:31:25.008Z" }, - { url = "https://files.pythonhosted.org/packages/06/d1/a8b9cf45874eda14b2e275157ce3b304c87e10fb38d9fc26a6e14eb18227/regex-2025.11.3-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:885b26aa3ee56433b630502dc3d36ba78d186a00cc535d3806e6bfd9ed3c70ab", size = 845897, upload-time = "2025-11-03T21:31:26.427Z" }, - { url = "https://files.pythonhosted.org/packages/ea/fe/1830eb0236be93d9b145e0bd8ab499f31602fe0999b1f19e99955aa8fe20/regex-2025.11.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:ddd76a9f58e6a00f8772e72cff8ebcff78e022be95edf018766707c730593e1e", size = 788906, upload-time = "2025-11-03T21:31:28.078Z" }, - { url = 
"https://files.pythonhosted.org/packages/66/47/dc2577c1f95f188c1e13e2e69d8825a5ac582ac709942f8a03af42ed6e93/regex-2025.11.3-cp311-cp311-win32.whl", hash = "sha256:3e816cc9aac1cd3cc9a4ec4d860f06d40f994b5c7b4d03b93345f44e08cc68bf", size = 265812, upload-time = "2025-11-03T21:31:29.72Z" }, - { url = "https://files.pythonhosted.org/packages/50/1e/15f08b2f82a9bbb510621ec9042547b54d11e83cb620643ebb54e4eb7d71/regex-2025.11.3-cp311-cp311-win_amd64.whl", hash = "sha256:087511f5c8b7dfbe3a03f5d5ad0c2a33861b1fc387f21f6f60825a44865a385a", size = 277737, upload-time = "2025-11-03T21:31:31.422Z" }, - { url = "https://files.pythonhosted.org/packages/f4/fc/6500eb39f5f76c5e47a398df82e6b535a5e345f839581012a418b16f9cc3/regex-2025.11.3-cp311-cp311-win_arm64.whl", hash = "sha256:1ff0d190c7f68ae7769cd0313fe45820ba07ffebfddfaa89cc1eb70827ba0ddc", size = 270290, upload-time = "2025-11-03T21:31:33.041Z" }, - { url = "https://files.pythonhosted.org/packages/e8/74/18f04cb53e58e3fb107439699bd8375cf5a835eec81084e0bddbd122e4c2/regex-2025.11.3-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:bc8ab71e2e31b16e40868a40a69007bc305e1109bd4658eb6cad007e0bf67c41", size = 489312, upload-time = "2025-11-03T21:31:34.343Z" }, - { url = "https://files.pythonhosted.org/packages/78/3f/37fcdd0d2b1e78909108a876580485ea37c91e1acf66d3bb8e736348f441/regex-2025.11.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:22b29dda7e1f7062a52359fca6e58e548e28c6686f205e780b02ad8ef710de36", size = 291256, upload-time = "2025-11-03T21:31:35.675Z" }, - { url = "https://files.pythonhosted.org/packages/bf/26/0a575f58eb23b7ebd67a45fccbc02ac030b737b896b7e7a909ffe43ffd6a/regex-2025.11.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:3a91e4a29938bc1a082cc28fdea44be420bf2bebe2665343029723892eb073e1", size = 288921, upload-time = "2025-11-03T21:31:37.07Z" }, - { url = 
"https://files.pythonhosted.org/packages/ea/98/6a8dff667d1af907150432cf5abc05a17ccd32c72a3615410d5365ac167a/regex-2025.11.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:08b884f4226602ad40c5d55f52bf91a9df30f513864e0054bad40c0e9cf1afb7", size = 798568, upload-time = "2025-11-03T21:31:38.784Z" }, - { url = "https://files.pythonhosted.org/packages/64/15/92c1db4fa4e12733dd5a526c2dd2b6edcbfe13257e135fc0f6c57f34c173/regex-2025.11.3-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:3e0b11b2b2433d1c39c7c7a30e3f3d0aeeea44c2a8d0bae28f6b95f639927a69", size = 864165, upload-time = "2025-11-03T21:31:40.559Z" }, - { url = "https://files.pythonhosted.org/packages/f9/e7/3ad7da8cdee1ce66c7cd37ab5ab05c463a86ffeb52b1a25fe7bd9293b36c/regex-2025.11.3-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:87eb52a81ef58c7ba4d45c3ca74e12aa4b4e77816f72ca25258a85b3ea96cb48", size = 912182, upload-time = "2025-11-03T21:31:42.002Z" }, - { url = "https://files.pythonhosted.org/packages/84/bd/9ce9f629fcb714ffc2c3faf62b6766ecb7a585e1e885eb699bcf130a5209/regex-2025.11.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a12ab1f5c29b4e93db518f5e3872116b7e9b1646c9f9f426f777b50d44a09e8c", size = 803501, upload-time = "2025-11-03T21:31:43.815Z" }, - { url = "https://files.pythonhosted.org/packages/7c/0f/8dc2e4349d8e877283e6edd6c12bdcebc20f03744e86f197ab6e4492bf08/regex-2025.11.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:7521684c8c7c4f6e88e35ec89680ee1aa8358d3f09d27dfbdf62c446f5d4c695", size = 787842, upload-time = "2025-11-03T21:31:45.353Z" }, - { url = "https://files.pythonhosted.org/packages/f9/73/cff02702960bc185164d5619c0c62a2f598a6abff6695d391b096237d4ab/regex-2025.11.3-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:7fe6e5440584e94cc4b3f5f4d98a25e29ca12dccf8873679a635638349831b98", 
size = 858519, upload-time = "2025-11-03T21:31:46.814Z" }, - { url = "https://files.pythonhosted.org/packages/61/83/0e8d1ae71e15bc1dc36231c90b46ee35f9d52fab2e226b0e039e7ea9c10a/regex-2025.11.3-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:8e026094aa12b43f4fd74576714e987803a315c76edb6b098b9809db5de58f74", size = 850611, upload-time = "2025-11-03T21:31:48.289Z" }, - { url = "https://files.pythonhosted.org/packages/c8/f5/70a5cdd781dcfaa12556f2955bf170cd603cb1c96a1827479f8faea2df97/regex-2025.11.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:435bbad13e57eb5606a68443af62bed3556de2f46deb9f7d4237bc2f1c9fb3a0", size = 789759, upload-time = "2025-11-03T21:31:49.759Z" }, - { url = "https://files.pythonhosted.org/packages/59/9b/7c29be7903c318488983e7d97abcf8ebd3830e4c956c4c540005fcfb0462/regex-2025.11.3-cp312-cp312-win32.whl", hash = "sha256:3839967cf4dc4b985e1570fd8d91078f0c519f30491c60f9ac42a8db039be204", size = 266194, upload-time = "2025-11-03T21:31:51.53Z" }, - { url = "https://files.pythonhosted.org/packages/1a/67/3b92df89f179d7c367be654ab5626ae311cb28f7d5c237b6bb976cd5fbbb/regex-2025.11.3-cp312-cp312-win_amd64.whl", hash = "sha256:e721d1b46e25c481dc5ded6f4b3f66c897c58d2e8cfdf77bbced84339108b0b9", size = 277069, upload-time = "2025-11-03T21:31:53.151Z" }, - { url = "https://files.pythonhosted.org/packages/d7/55/85ba4c066fe5094d35b249c3ce8df0ba623cfd35afb22d6764f23a52a1c5/regex-2025.11.3-cp312-cp312-win_arm64.whl", hash = "sha256:64350685ff08b1d3a6fff33f45a9ca183dc1d58bbfe4981604e70ec9801bbc26", size = 270330, upload-time = "2025-11-03T21:31:54.514Z" }, + { url = "https://files.pythonhosted.org/packages/d0/c9/0c80c96eab96948363d270143138d671d5731c3a692b417629bf3492a9d6/regex-2026.1.15-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:1ae6020fb311f68d753b7efa9d4b9a5d47a5d6466ea0d5e3b5a471a960ea6e4a", size = 488168, upload-time = "2026-01-14T23:14:16.129Z" }, + { url = 
"https://files.pythonhosted.org/packages/17/f0/271c92f5389a552494c429e5cc38d76d1322eb142fb5db3c8ccc47751468/regex-2026.1.15-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:eddf73f41225942c1f994914742afa53dc0d01a6e20fe14b878a1b1edc74151f", size = 290636, upload-time = "2026-01-14T23:14:17.715Z" }, + { url = "https://files.pythonhosted.org/packages/a0/f9/5f1fd077d106ca5655a0f9ff8f25a1ab55b92128b5713a91ed7134ff688e/regex-2026.1.15-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1e8cd52557603f5c66a548f69421310886b28b7066853089e1a71ee710e1cdc1", size = 288496, upload-time = "2026-01-14T23:14:19.326Z" }, + { url = "https://files.pythonhosted.org/packages/b5/e1/8f43b03a4968c748858ec77f746c286d81f896c2e437ccf050ebc5d3128c/regex-2026.1.15-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5170907244b14303edc5978f522f16c974f32d3aa92109fabc2af52411c9433b", size = 793503, upload-time = "2026-01-14T23:14:20.922Z" }, + { url = "https://files.pythonhosted.org/packages/8d/4e/a39a5e8edc5377a46a7c875c2f9a626ed3338cb3bb06931be461c3e1a34a/regex-2026.1.15-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:2748c1ec0663580b4510bd89941a31560b4b439a0b428b49472a3d9944d11cd8", size = 860535, upload-time = "2026-01-14T23:14:22.405Z" }, + { url = "https://files.pythonhosted.org/packages/dc/1c/9dce667a32a9477f7a2869c1c767dc00727284a9fa3ff5c09a5c6c03575e/regex-2026.1.15-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:2f2775843ca49360508d080eaa87f94fa248e2c946bbcd963bb3aae14f333413", size = 907225, upload-time = "2026-01-14T23:14:23.897Z" }, + { url = "https://files.pythonhosted.org/packages/a4/3c/87ca0a02736d16b6262921425e84b48984e77d8e4e572c9072ce96e66c30/regex-2026.1.15-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d9ea2604370efc9a174c1b5dcc81784fb040044232150f7f33756049edfc9026", size = 
800526, upload-time = "2026-01-14T23:14:26.039Z" }, + { url = "https://files.pythonhosted.org/packages/4b/ff/647d5715aeea7c87bdcbd2f578f47b415f55c24e361e639fe8c0cc88878f/regex-2026.1.15-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:0dcd31594264029b57bf16f37fd7248a70b3b764ed9e0839a8f271b2d22c0785", size = 773446, upload-time = "2026-01-14T23:14:28.109Z" }, + { url = "https://files.pythonhosted.org/packages/af/89/bf22cac25cb4ba0fe6bff52ebedbb65b77a179052a9d6037136ae93f42f4/regex-2026.1.15-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:c08c1f3e34338256732bd6938747daa3c0d5b251e04b6e43b5813e94d503076e", size = 783051, upload-time = "2026-01-14T23:14:29.929Z" }, + { url = "https://files.pythonhosted.org/packages/1e/f4/6ed03e71dca6348a5188363a34f5e26ffd5db1404780288ff0d79513bce4/regex-2026.1.15-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:e43a55f378df1e7a4fa3547c88d9a5a9b7113f653a66821bcea4718fe6c58763", size = 854485, upload-time = "2026-01-14T23:14:31.366Z" }, + { url = "https://files.pythonhosted.org/packages/d9/9a/8e8560bd78caded8eb137e3e47612430a05b9a772caf60876435192d670a/regex-2026.1.15-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:f82110ab962a541737bd0ce87978d4c658f06e7591ba899192e2712a517badbb", size = 762195, upload-time = "2026-01-14T23:14:32.802Z" }, + { url = "https://files.pythonhosted.org/packages/38/6b/61fc710f9aa8dfcd764fe27d37edfaa023b1a23305a0d84fccd5adb346ea/regex-2026.1.15-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:27618391db7bdaf87ac6c92b31e8f0dfb83a9de0075855152b720140bda177a2", size = 845986, upload-time = "2026-01-14T23:14:34.898Z" }, + { url = "https://files.pythonhosted.org/packages/fd/2e/fbee4cb93f9d686901a7ca8d94285b80405e8c34fe4107f63ffcbfb56379/regex-2026.1.15-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:bfb0d6be01fbae8d6655c8ca21b3b72458606c4aec9bbc932db758d47aba6db1", size = 788992, upload-time = "2026-01-14T23:14:37.116Z" }, + { url = 
"https://files.pythonhosted.org/packages/ed/14/3076348f3f586de64b1ab75a3fbabdaab7684af7f308ad43be7ef1849e55/regex-2026.1.15-cp311-cp311-win32.whl", hash = "sha256:b10e42a6de0e32559a92f2f8dc908478cc0fa02838d7dbe764c44dca3fa13569", size = 265893, upload-time = "2026-01-14T23:14:38.426Z" }, + { url = "https://files.pythonhosted.org/packages/0f/19/772cf8b5fc803f5c89ba85d8b1870a1ca580dc482aa030383a9289c82e44/regex-2026.1.15-cp311-cp311-win_amd64.whl", hash = "sha256:e9bf3f0bbdb56633c07d7116ae60a576f846efdd86a8848f8d62b749e1209ca7", size = 277840, upload-time = "2026-01-14T23:14:39.785Z" }, + { url = "https://files.pythonhosted.org/packages/78/84/d05f61142709474da3c0853222d91086d3e1372bcdab516c6fd8d80f3297/regex-2026.1.15-cp311-cp311-win_arm64.whl", hash = "sha256:41aef6f953283291c4e4e6850607bd71502be67779586a61472beacb315c97ec", size = 270374, upload-time = "2026-01-14T23:14:41.592Z" }, + { url = "https://files.pythonhosted.org/packages/92/81/10d8cf43c807d0326efe874c1b79f22bfb0fb226027b0b19ebc26d301408/regex-2026.1.15-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:4c8fcc5793dde01641a35905d6731ee1548f02b956815f8f1cab89e515a5bdf1", size = 489398, upload-time = "2026-01-14T23:14:43.741Z" }, + { url = "https://files.pythonhosted.org/packages/90/b0/7c2a74e74ef2a7c32de724658a69a862880e3e4155cba992ba04d1c70400/regex-2026.1.15-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:bfd876041a956e6a90ad7cdb3f6a630c07d491280bfeed4544053cd434901681", size = 291339, upload-time = "2026-01-14T23:14:45.183Z" }, + { url = "https://files.pythonhosted.org/packages/19/4d/16d0773d0c818417f4cc20aa0da90064b966d22cd62a8c46765b5bd2d643/regex-2026.1.15-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:9250d087bc92b7d4899ccd5539a1b2334e44eee85d848c4c1aef8e221d3f8c8f", size = 289003, upload-time = "2026-01-14T23:14:47.25Z" }, + { url = 
"https://files.pythonhosted.org/packages/c6/e4/1fc4599450c9f0863d9406e944592d968b8d6dfd0d552a7d569e43bceada/regex-2026.1.15-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c8a154cf6537ebbc110e24dabe53095e714245c272da9c1be05734bdad4a61aa", size = 798656, upload-time = "2026-01-14T23:14:48.77Z" }, + { url = "https://files.pythonhosted.org/packages/b2/e6/59650d73a73fa8a60b3a590545bfcf1172b4384a7df2e7fe7b9aab4e2da9/regex-2026.1.15-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:8050ba2e3ea1d8731a549e83c18d2f0999fbc99a5f6bd06b4c91449f55291804", size = 864252, upload-time = "2026-01-14T23:14:50.528Z" }, + { url = "https://files.pythonhosted.org/packages/6e/ab/1d0f4d50a1638849a97d731364c9a80fa304fec46325e48330c170ee8e80/regex-2026.1.15-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:0bf065240704cb8951cc04972cf107063917022511273e0969bdb34fc173456c", size = 912268, upload-time = "2026-01-14T23:14:52.952Z" }, + { url = "https://files.pythonhosted.org/packages/dd/df/0d722c030c82faa1d331d1921ee268a4e8fb55ca8b9042c9341c352f17fa/regex-2026.1.15-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c32bef3e7aeee75746748643667668ef941d28b003bfc89994ecf09a10f7a1b5", size = 803589, upload-time = "2026-01-14T23:14:55.182Z" }, + { url = "https://files.pythonhosted.org/packages/66/23/33289beba7ccb8b805c6610a8913d0131f834928afc555b241caabd422a9/regex-2026.1.15-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:d5eaa4a4c5b1906bd0d2508d68927f15b81821f85092e06f1a34a4254b0e1af3", size = 775700, upload-time = "2026-01-14T23:14:56.707Z" }, + { url = "https://files.pythonhosted.org/packages/e7/65/bf3a42fa6897a0d3afa81acb25c42f4b71c274f698ceabd75523259f6688/regex-2026.1.15-cp312-cp312-musllinux_1_2_aarch64.whl", hash = 
"sha256:86c1077a3cc60d453d4084d5b9649065f3bf1184e22992bd322e1f081d3117fb", size = 787928, upload-time = "2026-01-14T23:14:58.312Z" }, + { url = "https://files.pythonhosted.org/packages/f4/f5/13bf65864fc314f68cdd6d8ca94adcab064d4d39dbd0b10fef29a9da48fc/regex-2026.1.15-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:2b091aefc05c78d286657cd4db95f2e6313375ff65dcf085e42e4c04d9c8d410", size = 858607, upload-time = "2026-01-14T23:15:00.657Z" }, + { url = "https://files.pythonhosted.org/packages/a3/31/040e589834d7a439ee43fb0e1e902bc81bd58a5ba81acffe586bb3321d35/regex-2026.1.15-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:57e7d17f59f9ebfa9667e6e5a1c0127b96b87cb9cede8335482451ed00788ba4", size = 763729, upload-time = "2026-01-14T23:15:02.248Z" }, + { url = "https://files.pythonhosted.org/packages/9b/84/6921e8129687a427edf25a34a5594b588b6d88f491320b9de5b6339a4fcb/regex-2026.1.15-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:c6c4dcdfff2c08509faa15d36ba7e5ef5fcfab25f1e8f85a0c8f45bc3a30725d", size = 850697, upload-time = "2026-01-14T23:15:03.878Z" }, + { url = "https://files.pythonhosted.org/packages/8a/87/3d06143d4b128f4229158f2de5de6c8f2485170c7221e61bf381313314b2/regex-2026.1.15-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:cf8ff04c642716a7f2048713ddc6278c5fd41faa3b9cab12607c7abecd012c22", size = 789849, upload-time = "2026-01-14T23:15:06.102Z" }, + { url = "https://files.pythonhosted.org/packages/77/69/c50a63842b6bd48850ebc7ab22d46e7a2a32d824ad6c605b218441814639/regex-2026.1.15-cp312-cp312-win32.whl", hash = "sha256:82345326b1d8d56afbe41d881fdf62f1926d7264b2fc1537f99ae5da9aad7913", size = 266279, upload-time = "2026-01-14T23:15:07.678Z" }, + { url = "https://files.pythonhosted.org/packages/f2/36/39d0b29d087e2b11fd8191e15e81cce1b635fcc845297c67f11d0d19274d/regex-2026.1.15-cp312-cp312-win_amd64.whl", hash = "sha256:4def140aa6156bc64ee9912383d4038f3fdd18fee03a6f222abd4de6357ce42a", size = 277166, upload-time = "2026-01-14T23:15:09.257Z" }, + { 
url = "https://files.pythonhosted.org/packages/28/32/5b8e476a12262748851fa8ab1b0be540360692325975b094e594dfebbb52/regex-2026.1.15-cp312-cp312-win_arm64.whl", hash = "sha256:c6c565d9a6e1a8d783c1948937ffc377dd5771e83bd56de8317c450a954d2056", size = 270415, upload-time = "2026-01-14T23:15:10.743Z" }, ] [[package]] @@ -5750,8 +5750,7 @@ dependencies = [ { name = "certifi" }, { name = "charset-normalizer" }, { name = "idna" }, - { name = "urllib3", version = "2.3.0", source = { registry = "https://pypi.org/simple" }, marker = "(python_full_version < '3.12.4' and platform_python_implementation != 'PyPy') or platform_python_implementation == 'PyPy' or sys_platform != 'linux'" }, - { name = "urllib3", version = "2.6.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'" }, + { name = "urllib3" }, ] sdist = { url = "https://files.pythonhosted.org/packages/c9/74/b3ff8e6c8446842c3f5c837e9c3dfcfe2018ea6ecef224c710c85ef728f4/requests-2.32.5.tar.gz", hash = "sha256:dbba0bac56e100853db0ea71b82b4dfd5fe2bf6d3754a8893c3af500cec7d7cf", size = 134517, upload-time = "2025-08-18T20:46:02.573Z" } wheels = [ @@ -5811,15 +5810,15 @@ wheels = [ [[package]] name = "rich" -version = "14.2.0" +version = "14.3.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "markdown-it-py" }, { name = "pygments" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/fb/d2/8920e102050a0de7bfabeb4c4614a49248cf8d5d7a8d01885fbb24dc767a/rich-14.2.0.tar.gz", hash = "sha256:73ff50c7c0c1c77c8243079283f4edb376f0f6442433aecb8ce7e6d0b92d1fe4", size = 219990, upload-time = "2025-10-09T14:16:53.064Z" } +sdist = { url = "https://files.pythonhosted.org/packages/aa/9c/137848452e130e71f3ca9a9876751ddcac99e4b1f248ed297996c8c2d728/rich-14.3.0.tar.gz", hash = "sha256:b75e54d3abbcc49137e83e4db54dc86c5e47687eebc95aa0305363231a36e699", size = 230113, upload-time = 
"2026-01-24T12:25:46.336Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/25/7a/b0178788f8dc6cafce37a212c99565fa1fe7872c70c6c9c1e1a372d9d88f/rich-14.2.0-py3-none-any.whl", hash = "sha256:76bc51fe2e57d2b1be1f96c524b890b816e334ab4c1e45888799bfaab0021edd", size = 243393, upload-time = "2025-10-09T14:16:51.245Z" }, + { url = "https://files.pythonhosted.org/packages/fa/e0/83cbdcb81b5cbbbe355648dd402b410437806544f48ee218a2354798f012/rich-14.3.0-py3-none-any.whl", hash = "sha256:0b8c1e368c1125b9e993c2d2f1342802525f4853fc6dac2e8e9e88bac0f45bce", size = 309950, upload-time = "2026-01-24T12:25:44.679Z" }, ] [[package]] @@ -5886,28 +5885,28 @@ wheels = [ [[package]] name = "ruff" -version = "0.14.11" +version = "0.14.14" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/d4/77/9a7fe084d268f8855d493e5031ea03fa0af8cc05887f638bf1c4e3363eb8/ruff-0.14.11.tar.gz", hash = "sha256:f6dc463bfa5c07a59b1ff2c3b9767373e541346ea105503b4c0369c520a66958", size = 5993417, upload-time = "2026-01-08T19:11:58.322Z" } +sdist = { url = "https://files.pythonhosted.org/packages/2e/06/f71e3a86b2df0dfa2d2f72195941cd09b44f87711cb7fa5193732cb9a5fc/ruff-0.14.14.tar.gz", hash = "sha256:2d0f819c9a90205f3a867dbbd0be083bee9912e170fd7d9704cc8ae45824896b", size = 4515732, upload-time = "2026-01-22T22:30:17.527Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/f0/a6/a4c40a5aaa7e331f245d2dc1ac8ece306681f52b636b40ef87c88b9f7afd/ruff-0.14.11-py3-none-linux_armv6l.whl", hash = "sha256:f6ff2d95cbd335841a7217bdfd9c1d2e44eac2c584197ab1385579d55ff8830e", size = 12951208, upload-time = "2026-01-08T19:12:09.218Z" }, - { url = "https://files.pythonhosted.org/packages/5c/5c/360a35cb7204b328b685d3129c08aca24765ff92b5a7efedbdd6c150d555/ruff-0.14.11-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:6f6eb5c1c8033680f4172ea9c8d3706c156223010b8b97b05e82c59bdc774ee6", size = 13330075, upload-time = "2026-01-08T19:12:02.549Z" }, - { url = 
"https://files.pythonhosted.org/packages/1b/9e/0cc2f1be7a7d33cae541824cf3f95b4ff40d03557b575912b5b70273c9ec/ruff-0.14.11-py3-none-macosx_11_0_arm64.whl", hash = "sha256:f2fc34cc896f90080fca01259f96c566f74069a04b25b6205d55379d12a6855e", size = 12257809, upload-time = "2026-01-08T19:12:00.366Z" }, - { url = "https://files.pythonhosted.org/packages/a7/e5/5faab97c15bb75228d9f74637e775d26ac703cc2b4898564c01ab3637c02/ruff-0.14.11-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:53386375001773ae812b43205d6064dae49ff0968774e6befe16a994fc233caa", size = 12678447, upload-time = "2026-01-08T19:12:13.899Z" }, - { url = "https://files.pythonhosted.org/packages/1b/33/e9767f60a2bef779fb5855cab0af76c488e0ce90f7bb7b8a45c8a2ba4178/ruff-0.14.11-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:a697737dce1ca97a0a55b5ff0434ee7205943d4874d638fe3ae66166ff46edbe", size = 12758560, upload-time = "2026-01-08T19:11:42.55Z" }, - { url = "https://files.pythonhosted.org/packages/eb/84/4c6cf627a21462bb5102f7be2a320b084228ff26e105510cd2255ea868e5/ruff-0.14.11-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6845ca1da8ab81ab1dce755a32ad13f1db72e7fba27c486d5d90d65e04d17b8f", size = 13599296, upload-time = "2026-01-08T19:11:30.371Z" }, - { url = "https://files.pythonhosted.org/packages/88/e1/92b5ed7ea66d849f6157e695dc23d5d6d982bd6aa8d077895652c38a7cae/ruff-0.14.11-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:e36ce2fd31b54065ec6f76cb08d60159e1b32bdf08507862e32f47e6dde8bcbf", size = 15048981, upload-time = "2026-01-08T19:12:04.742Z" }, - { url = "https://files.pythonhosted.org/packages/61/df/c1bd30992615ac17c2fb64b8a7376ca22c04a70555b5d05b8f717163cf9f/ruff-0.14.11-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:590bcc0e2097ecf74e62a5c10a6b71f008ad82eb97b0a0079e85defe19fe74d9", size = 14633183, upload-time = "2026-01-08T19:11:40.069Z" }, - { url = 
"https://files.pythonhosted.org/packages/04/e9/fe552902f25013dd28a5428a42347d9ad20c4b534834a325a28305747d64/ruff-0.14.11-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:53fe71125fc158210d57fe4da26e622c9c294022988d08d9347ec1cf782adafe", size = 14050453, upload-time = "2026-01-08T19:11:37.555Z" }, - { url = "https://files.pythonhosted.org/packages/ae/93/f36d89fa021543187f98991609ce6e47e24f35f008dfe1af01379d248a41/ruff-0.14.11-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a35c9da08562f1598ded8470fcfef2afb5cf881996e6c0a502ceb61f4bc9c8a3", size = 13757889, upload-time = "2026-01-08T19:12:07.094Z" }, - { url = "https://files.pythonhosted.org/packages/b7/9f/c7fb6ecf554f28709a6a1f2a7f74750d400979e8cd47ed29feeaa1bd4db8/ruff-0.14.11-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:0f3727189a52179393ecf92ec7057c2210203e6af2676f08d92140d3e1ee72c1", size = 13955832, upload-time = "2026-01-08T19:11:55.064Z" }, - { url = "https://files.pythonhosted.org/packages/db/a0/153315310f250f76900a98278cf878c64dfb6d044e184491dd3289796734/ruff-0.14.11-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:eb09f849bd37147a789b85995ff734a6c4a095bed5fd1608c4f56afc3634cde2", size = 12586522, upload-time = "2026-01-08T19:11:35.356Z" }, - { url = "https://files.pythonhosted.org/packages/2f/2b/a73a2b6e6d2df1d74bf2b78098be1572191e54bec0e59e29382d13c3adc5/ruff-0.14.11-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:c61782543c1231bf71041461c1f28c64b961d457d0f238ac388e2ab173d7ecb7", size = 12724637, upload-time = "2026-01-08T19:11:47.796Z" }, - { url = "https://files.pythonhosted.org/packages/f0/41/09100590320394401cd3c48fc718a8ba71c7ddb1ffd07e0ad6576b3a3df2/ruff-0.14.11-py3-none-musllinux_1_2_i686.whl", hash = "sha256:82ff352ea68fb6766140381748e1f67f83c39860b6446966cff48a315c3e2491", size = 13145837, upload-time = "2026-01-08T19:11:32.87Z" }, - { url = 
"https://files.pythonhosted.org/packages/3b/d8/e035db859d1d3edf909381eb8ff3e89a672d6572e9454093538fe6f164b0/ruff-0.14.11-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:728e56879df4ca5b62a9dde2dd0eb0edda2a55160c0ea28c4025f18c03f86984", size = 13850469, upload-time = "2026-01-08T19:12:11.694Z" }, - { url = "https://files.pythonhosted.org/packages/4e/02/bb3ff8b6e6d02ce9e3740f4c17dfbbfb55f34c789c139e9cd91985f356c7/ruff-0.14.11-py3-none-win32.whl", hash = "sha256:337c5dd11f16ee52ae217757d9b82a26400be7efac883e9e852646f1557ed841", size = 12851094, upload-time = "2026-01-08T19:11:45.163Z" }, - { url = "https://files.pythonhosted.org/packages/58/f1/90ddc533918d3a2ad628bc3044cdfc094949e6d4b929220c3f0eb8a1c998/ruff-0.14.11-py3-none-win_amd64.whl", hash = "sha256:f981cea63d08456b2c070e64b79cb62f951aa1305282974d4d5216e6e0178ae6", size = 14001379, upload-time = "2026-01-08T19:11:52.591Z" }, - { url = "https://files.pythonhosted.org/packages/c4/1c/1dbe51782c0e1e9cfce1d1004752672d2d4629ea46945d19d731ad772b3b/ruff-0.14.11-py3-none-win_arm64.whl", hash = "sha256:649fb6c9edd7f751db276ef42df1f3df41c38d67d199570ae2a7bd6cbc3590f0", size = 12938644, upload-time = "2026-01-08T19:11:50.027Z" }, + { url = "https://files.pythonhosted.org/packages/d2/89/20a12e97bc6b9f9f68343952da08a8099c57237aef953a56b82711d55edd/ruff-0.14.14-py3-none-linux_armv6l.whl", hash = "sha256:7cfe36b56e8489dee8fbc777c61959f60ec0f1f11817e8f2415f429552846aed", size = 10467650, upload-time = "2026-01-22T22:30:08.578Z" }, + { url = "https://files.pythonhosted.org/packages/a3/b1/c5de3fd2d5a831fcae21beda5e3589c0ba67eec8202e992388e4b17a6040/ruff-0.14.14-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:6006a0082336e7920b9573ef8a7f52eec837add1265cc74e04ea8a4368cd704c", size = 10883245, upload-time = "2026-01-22T22:30:04.155Z" }, + { url = "https://files.pythonhosted.org/packages/b8/7c/3c1db59a10e7490f8f6f8559d1db8636cbb13dccebf18686f4e3c9d7c772/ruff-0.14.14-py3-none-macosx_11_0_arm64.whl", hash = 
"sha256:026c1d25996818f0bf498636686199d9bd0d9d6341c9c2c3b62e2a0198b758de", size = 10231273, upload-time = "2026-01-22T22:30:34.642Z" }, + { url = "https://files.pythonhosted.org/packages/a1/6e/5e0e0d9674be0f8581d1f5e0f0a04761203affce3232c1a1189d0e3b4dad/ruff-0.14.14-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f666445819d31210b71e0a6d1c01e24447a20b85458eea25a25fe8142210ae0e", size = 10585753, upload-time = "2026-01-22T22:30:31.781Z" }, + { url = "https://files.pythonhosted.org/packages/23/09/754ab09f46ff1884d422dc26d59ba18b4e5d355be147721bb2518aa2a014/ruff-0.14.14-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:3c0f18b922c6d2ff9a5e6c3ee16259adc513ca775bcf82c67ebab7cbd9da5bc8", size = 10286052, upload-time = "2026-01-22T22:30:24.827Z" }, + { url = "https://files.pythonhosted.org/packages/c8/cc/e71f88dd2a12afb5f50733851729d6b571a7c3a35bfdb16c3035132675a0/ruff-0.14.14-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1629e67489c2dea43e8658c3dba659edbfd87361624b4040d1df04c9740ae906", size = 11043637, upload-time = "2026-01-22T22:30:13.239Z" }, + { url = "https://files.pythonhosted.org/packages/67/b2/397245026352494497dac935d7f00f1468c03a23a0c5db6ad8fc49ca3fb2/ruff-0.14.14-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:27493a2131ea0f899057d49d303e4292b2cae2bb57253c1ed1f256fbcd1da480", size = 12194761, upload-time = "2026-01-22T22:30:22.542Z" }, + { url = "https://files.pythonhosted.org/packages/5b/06/06ef271459f778323112c51b7587ce85230785cd64e91772034ddb88f200/ruff-0.14.14-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:01ff589aab3f5b539e35db38425da31a57521efd1e4ad1ae08fc34dbe30bd7df", size = 12005701, upload-time = "2026-01-22T22:30:20.499Z" }, + { url = "https://files.pythonhosted.org/packages/41/d6/99364514541cf811ccc5ac44362f88df66373e9fec1b9d1c4cc830593fe7/ruff-0.14.14-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = 
"sha256:1cc12d74eef0f29f51775f5b755913eb523546b88e2d733e1d701fe65144e89b", size = 11282455, upload-time = "2026-01-22T22:29:59.679Z" }, + { url = "https://files.pythonhosted.org/packages/ca/71/37daa46f89475f8582b7762ecd2722492df26421714a33e72ccc9a84d7a5/ruff-0.14.14-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bb8481604b7a9e75eff53772496201690ce2687067e038b3cc31aaf16aa0b974", size = 11215882, upload-time = "2026-01-22T22:29:57.032Z" }, + { url = "https://files.pythonhosted.org/packages/2c/10/a31f86169ec91c0705e618443ee74ede0bdd94da0a57b28e72db68b2dbac/ruff-0.14.14-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:14649acb1cf7b5d2d283ebd2f58d56b75836ed8c6f329664fa91cdea19e76e66", size = 11180549, upload-time = "2026-01-22T22:30:27.175Z" }, + { url = "https://files.pythonhosted.org/packages/fd/1e/c723f20536b5163adf79bdd10c5f093414293cdf567eed9bdb7b83940f3f/ruff-0.14.14-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:e8058d2145566510790eab4e2fad186002e288dec5e0d343a92fe7b0bc1b3e13", size = 10543416, upload-time = "2026-01-22T22:30:01.964Z" }, + { url = "https://files.pythonhosted.org/packages/3e/34/8a84cea7e42c2d94ba5bde1d7a4fae164d6318f13f933d92da6d7c2041ff/ruff-0.14.14-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:e651e977a79e4c758eb807f0481d673a67ffe53cfa92209781dfa3a996cf8412", size = 10285491, upload-time = "2026-01-22T22:30:29.51Z" }, + { url = "https://files.pythonhosted.org/packages/55/ef/b7c5ea0be82518906c978e365e56a77f8de7678c8bb6651ccfbdc178c29f/ruff-0.14.14-py3-none-musllinux_1_2_i686.whl", hash = "sha256:cc8b22da8d9d6fdd844a68ae937e2a0adf9b16514e9a97cc60355e2d4b219fc3", size = 10733525, upload-time = "2026-01-22T22:30:06.499Z" }, + { url = "https://files.pythonhosted.org/packages/6a/5b/aaf1dfbcc53a2811f6cc0a1759de24e4b03e02ba8762daabd9b6bd8c59e3/ruff-0.14.14-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:16bc890fb4cc9781bb05beb5ab4cd51be9e7cb376bf1dd3580512b24eb3fda2b", size = 11315626, upload-time = 
"2026-01-22T22:30:36.848Z" }, + { url = "https://files.pythonhosted.org/packages/2c/aa/9f89c719c467dfaf8ad799b9bae0df494513fb21d31a6059cb5870e57e74/ruff-0.14.14-py3-none-win32.whl", hash = "sha256:b530c191970b143375b6a68e6f743800b2b786bbcf03a7965b06c4bf04568167", size = 10502442, upload-time = "2026-01-22T22:30:38.93Z" }, + { url = "https://files.pythonhosted.org/packages/87/44/90fa543014c45560cae1fffc63ea059fb3575ee6e1cb654562197e5d16fb/ruff-0.14.14-py3-none-win_amd64.whl", hash = "sha256:3dde1435e6b6fe5b66506c1dff67a421d0b7f6488d466f651c07f4cab3bf20fd", size = 11630486, upload-time = "2026-01-22T22:30:10.852Z" }, + { url = "https://files.pythonhosted.org/packages/9e/6a/40fee331a52339926a92e17ae748827270b288a35ef4a15c9c8f2ec54715/ruff-0.14.14-py3-none-win_arm64.whl", hash = "sha256:56e6981a98b13a32236a72a8da421d7839221fa308b223b9283312312e5ac76c", size = 10920448, upload-time = "2026-01-22T22:30:15.417Z" }, ] [[package]] @@ -5946,14 +5945,14 @@ wheels = [ [[package]] name = "scipy-stubs" -version = "1.17.0.0" +version = "1.17.0.2" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "optype", extra = ["numpy"] }, ] -sdist = { url = "https://files.pythonhosted.org/packages/12/4c/ae02bab2da86641edb086e1ac72281b208282a4dc0a9713fba4b823d22d9/scipy_stubs-1.17.0.0.tar.gz", hash = "sha256:01953f1c7967876be942afa21c1cc864c92714d01b1970ef8bdd468b4468d4b7", size = 367603, upload-time = "2025-12-31T18:58:25.911Z" } +sdist = { url = "https://files.pythonhosted.org/packages/40/fe/5fa7da49821ea94d60629ae71277fa8d7e16eb20602f720062b6c30a644c/scipy_stubs-1.17.0.2.tar.gz", hash = "sha256:3981bd7fa4c189a8493307afadaee1a830d9a0de8e3ae2f4603f192b6260ef2a", size = 379897, upload-time = "2026-01-22T19:17:08Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/ee/53/bc0ad50243638982c572894c9dd5205e690d34fda7ec6cadf1c1ce157e14/scipy_stubs-1.17.0.0-py3-none-any.whl", hash = "sha256:17b336fa6c56afb0cf47e426e3cb36a7eb8b73351ee0a52d1e5cef7634b5e936", size 
= 572744, upload-time = "2025-12-31T18:58:24.297Z" }, + { url = "https://files.pythonhosted.org/packages/51/e3/20233497e4a27956e7392c3f7879e6ee7f767f268079f24f4b089b70f563/scipy_stubs-1.17.0.2-py3-none-any.whl", hash = "sha256:99d1aa75b7d72a7ee36a68d18bcf1149f62ab577bbd1236c65c471b3b465d824", size = 586137, upload-time = "2026-01-22T19:17:05.802Z" }, ] [[package]] @@ -5976,8 +5975,7 @@ version = "2.28.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "certifi" }, - { name = "urllib3", version = "2.3.0", source = { registry = "https://pypi.org/simple" }, marker = "(python_full_version < '3.12.4' and platform_python_implementation != 'PyPy') or platform_python_implementation == 'PyPy' or sys_platform != 'linux'" }, - { name = "urllib3", version = "2.6.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'" }, + { name = "urllib3" }, ] sdist = { url = "https://files.pythonhosted.org/packages/5e/bb/6a41b2e0e9121bed4d2ec68d50568ab95c49f4744156a9bbb789c866c66d/sentry_sdk-2.28.0.tar.gz", hash = "sha256:14d2b73bc93afaf2a9412490329099e6217761cbab13b6ee8bc0e82927e1504e", size = 325052, upload-time = "2025-05-12T07:53:12.785Z" } wheels = [ @@ -5993,11 +5991,11 @@ flask = [ [[package]] name = "setuptools" -version = "80.9.0" +version = "80.10.1" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/18/5d/3bf57dcd21979b887f014ea83c24ae194cfcd12b9e0fda66b957c69d1fca/setuptools-80.9.0.tar.gz", hash = "sha256:f36b47402ecde768dbfafc46e8e4207b4360c654f1f3bb84475f0a28628fb19c", size = 1319958, upload-time = "2025-05-27T00:56:51.443Z" } +sdist = { url = "https://files.pythonhosted.org/packages/86/ff/f75651350db3cf2ef767371307eb163f3cc1ac03e16fdf3ac347607f7edb/setuptools-80.10.1.tar.gz", hash = "sha256:bf2e513eb8144c3298a3bd28ab1a5edb739131ec5c22e045ff93cd7f5319703a", size = 1229650, 
upload-time = "2026-01-21T09:42:03.061Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/a3/dc/17031897dae0efacfea57dfd3a82fdd2a2aeb58e0ff71b77b87e44edc772/setuptools-80.9.0-py3-none-any.whl", hash = "sha256:062d34222ad13e0cc312a4c02d73f059e86a4acbfbdea8f8f76b28c99f306922", size = 1201486, upload-time = "2025-05-27T00:56:49.664Z" }, + { url = "https://files.pythonhosted.org/packages/e0/76/f963c61683a39084aa575f98089253e1e852a4417cb8a3a8a422923a5246/setuptools-80.10.1-py3-none-any.whl", hash = "sha256:fc30c51cbcb8199a219c12cc9c281b5925a4978d212f84229c909636d9f6984e", size = 1099859, upload-time = "2026-01-21T09:42:00.688Z" }, ] [[package]] @@ -6036,6 +6034,18 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/e0/f9/0595336914c5619e5f28a1fb793285925a8cd4b432c9da0a987836c7f822/shellingham-1.5.4-py2.py3-none-any.whl", hash = "sha256:7ecfff8f2fd72616f7481040475a65b2bf8af90a56c89140852d1120324e8686", size = 9755, upload-time = "2023-10-24T04:13:38.866Z" }, ] +[[package]] +name = "simple-websocket" +version = "1.1.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "wsproto" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/b0/d4/bfa032f961103eba93de583b161f0e6a5b63cebb8f2c7d0c6e6efe1e3d2e/simple_websocket-1.1.0.tar.gz", hash = "sha256:7939234e7aa067c534abdab3a9ed933ec9ce4691b0713c78acb195560aa52ae4", size = 17300, upload-time = "2024-10-10T22:39:31.412Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/52/59/0782e51887ac6b07ffd1570e0364cf901ebc36345fea669969d2084baebb/simple_websocket-1.1.0-py3-none-any.whl", hash = "sha256:4af6069630a38ed6c561010f0e11a5bc0d4ca569b36306eb257cd9a192497c8c", size = 13842, upload-time = "2024-10-10T22:39:29.645Z" }, +] + [[package]] name = "six" version = "1.17.0" @@ -6074,37 +6084,38 @@ wheels = [ [[package]] name = "soupsieve" -version = "2.8.1" +version = "2.8.3" source = { registry = "https://pypi.org/simple" } -sdist = { url = 
"https://files.pythonhosted.org/packages/89/23/adf3796d740536d63a6fbda113d07e60c734b6ed5d3058d1e47fc0495e47/soupsieve-2.8.1.tar.gz", hash = "sha256:4cf733bc50fa805f5df4b8ef4740fc0e0fa6218cf3006269afd3f9d6d80fd350", size = 117856, upload-time = "2025-12-18T13:50:34.655Z" } +sdist = { url = "https://files.pythonhosted.org/packages/7b/ae/2d9c981590ed9999a0d91755b47fc74f74de286b0f5cee14c9269041e6c4/soupsieve-2.8.3.tar.gz", hash = "sha256:3267f1eeea4251fb42728b6dfb746edc9acaffc4a45b27e19450b676586e8349", size = 118627, upload-time = "2026-01-20T04:27:02.457Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/48/f3/b67d6ea49ca9154453b6d70b34ea22f3996b9fa55da105a79d8732227adc/soupsieve-2.8.1-py3-none-any.whl", hash = "sha256:a11fe2a6f3d76ab3cf2de04eb339c1be5b506a8a47f2ceb6d139803177f85434", size = 36710, upload-time = "2025-12-18T13:50:33.267Z" }, + { url = "https://files.pythonhosted.org/packages/46/2c/1462b1d0a634697ae9e55b3cecdcb64788e8b7d63f54d923fcd0bb140aed/soupsieve-2.8.3-py3-none-any.whl", hash = "sha256:ed64f2ba4eebeab06cc4962affce381647455978ffc1e36bb79a545b91f45a95", size = 37016, upload-time = "2026-01-20T04:27:01.012Z" }, ] [[package]] name = "sqlalchemy" -version = "2.0.45" +version = "2.0.46" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "greenlet", marker = "platform_machine == 'AMD64' or platform_machine == 'WIN32' or platform_machine == 'aarch64' or platform_machine == 'amd64' or platform_machine == 'ppc64le' or platform_machine == 'win32' or platform_machine == 'x86_64'" }, { name = "typing-extensions" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/be/f9/5e4491e5ccf42f5d9cfc663741d261b3e6e1683ae7812114e7636409fcc6/sqlalchemy-2.0.45.tar.gz", hash = "sha256:1632a4bda8d2d25703fdad6363058d882541bdaaee0e5e3ddfa0cd3229efce88", size = 9869912, upload-time = "2025-12-09T21:05:16.737Z" } +sdist = { url = 
"https://files.pythonhosted.org/packages/06/aa/9ce0f3e7a9829ead5c8ce549392f33a12c4555a6c0609bb27d882e9c7ddf/sqlalchemy-2.0.46.tar.gz", hash = "sha256:cf36851ee7219c170bb0793dbc3da3e80c582e04a5437bc601bfe8c85c9216d7", size = 9865393, upload-time = "2026-01-21T18:03:45.119Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/a2/1c/769552a9d840065137272ebe86ffbb0bc92b0f1e0a68ee5266a225f8cd7b/sqlalchemy-2.0.45-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:2e90a344c644a4fa871eb01809c32096487928bd2038bf10f3e4515cb688cc56", size = 2153860, upload-time = "2025-12-10T20:03:23.843Z" }, - { url = "https://files.pythonhosted.org/packages/f3/f8/9be54ff620e5b796ca7b44670ef58bc678095d51b0e89d6e3102ea468216/sqlalchemy-2.0.45-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b8c8b41b97fba5f62349aa285654230296829672fc9939cd7f35aab246d1c08b", size = 3309379, upload-time = "2025-12-09T22:06:07.461Z" }, - { url = "https://files.pythonhosted.org/packages/f6/2b/60ce3ee7a5ae172bfcd419ce23259bb874d2cddd44f67c5df3760a1e22f9/sqlalchemy-2.0.45-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:12c694ed6468333a090d2f60950e4250b928f457e4962389553d6ba5fe9951ac", size = 3309948, upload-time = "2025-12-09T22:09:57.643Z" }, - { url = "https://files.pythonhosted.org/packages/a3/42/bac8d393f5db550e4e466d03d16daaafd2bad1f74e48c12673fb499a7fc1/sqlalchemy-2.0.45-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:f7d27a1d977a1cfef38a0e2e1ca86f09c4212666ce34e6ae542f3ed0a33bc606", size = 3261239, upload-time = "2025-12-09T22:06:08.879Z" }, - { url = "https://files.pythonhosted.org/packages/6f/12/43dc70a0528c59842b04ea1c1ed176f072a9b383190eb015384dd102fb19/sqlalchemy-2.0.45-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:d62e47f5d8a50099b17e2bfc1b0c7d7ecd8ba6b46b1507b58cc4f05eefc3bb1c", size = 3284065, upload-time = "2025-12-09T22:09:59.454Z" }, - { url = 
"https://files.pythonhosted.org/packages/cf/9c/563049cf761d9a2ec7bc489f7879e9d94e7b590496bea5bbee9ed7b4cc32/sqlalchemy-2.0.45-cp311-cp311-win32.whl", hash = "sha256:3c5f76216e7b85770d5bb5130ddd11ee89f4d52b11783674a662c7dd57018177", size = 2113480, upload-time = "2025-12-09T21:29:57.03Z" }, - { url = "https://files.pythonhosted.org/packages/bc/fa/09d0a11fe9f15c7fa5c7f0dd26be3d235b0c0cbf2f9544f43bc42efc8a24/sqlalchemy-2.0.45-cp311-cp311-win_amd64.whl", hash = "sha256:a15b98adb7f277316f2c276c090259129ee4afca783495e212048daf846654b2", size = 2138407, upload-time = "2025-12-09T21:29:58.556Z" }, - { url = "https://files.pythonhosted.org/packages/2d/c7/1900b56ce19bff1c26f39a4ce427faec7716c81ac792bfac8b6a9f3dca93/sqlalchemy-2.0.45-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b3ee2aac15169fb0d45822983631466d60b762085bc4535cd39e66bea362df5f", size = 3333760, upload-time = "2025-12-09T22:11:02.66Z" }, - { url = "https://files.pythonhosted.org/packages/0a/93/3be94d96bb442d0d9a60e55a6bb6e0958dd3457751c6f8502e56ef95fed0/sqlalchemy-2.0.45-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ba547ac0b361ab4f1608afbc8432db669bd0819b3e12e29fb5fa9529a8bba81d", size = 3348268, upload-time = "2025-12-09T22:13:49.054Z" }, - { url = "https://files.pythonhosted.org/packages/48/4b/f88ded696e61513595e4a9778f9d3f2bf7332cce4eb0c7cedaabddd6687b/sqlalchemy-2.0.45-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:215f0528b914e5c75ef2559f69dca86878a3beeb0c1be7279d77f18e8d180ed4", size = 3278144, upload-time = "2025-12-09T22:11:04.14Z" }, - { url = "https://files.pythonhosted.org/packages/ed/6a/310ecb5657221f3e1bd5288ed83aa554923fb5da48d760a9f7622afeb065/sqlalchemy-2.0.45-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:107029bf4f43d076d4011f1afb74f7c3e2ea029ec82eb23d8527d5e909e97aa6", size = 3313907, upload-time = "2025-12-09T22:13:50.598Z" }, - { url = 
"https://files.pythonhosted.org/packages/5c/39/69c0b4051079addd57c84a5bfb34920d87456dd4c90cf7ee0df6efafc8ff/sqlalchemy-2.0.45-cp312-cp312-win32.whl", hash = "sha256:0c9f6ada57b58420a2c0277ff853abe40b9e9449f8d7d231763c6bc30f5c4953", size = 2112182, upload-time = "2025-12-09T21:39:30.824Z" }, - { url = "https://files.pythonhosted.org/packages/f7/4e/510db49dd89fc3a6e994bee51848c94c48c4a00dc905e8d0133c251f41a7/sqlalchemy-2.0.45-cp312-cp312-win_amd64.whl", hash = "sha256:8defe5737c6d2179c7997242d6473587c3beb52e557f5ef0187277009f73e5e1", size = 2139200, upload-time = "2025-12-09T21:39:32.321Z" }, - { url = "https://files.pythonhosted.org/packages/bf/e1/3ccb13c643399d22289c6a9786c1a91e3dcbb68bce4beb44926ac2c557bf/sqlalchemy-2.0.45-py3-none-any.whl", hash = "sha256:5225a288e4c8cc2308dbdd874edad6e7d0fd38eac1e9e5f23503425c8eee20d0", size = 1936672, upload-time = "2025-12-09T21:54:52.608Z" }, + { url = "https://files.pythonhosted.org/packages/69/ac/b42ad16800d0885105b59380ad69aad0cce5a65276e269ce2729a2343b6a/sqlalchemy-2.0.46-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:261c4b1f101b4a411154f1da2b76497d73abbfc42740029205d4d01fa1052684", size = 2154851, upload-time = "2026-01-21T18:27:30.54Z" }, + { url = "https://files.pythonhosted.org/packages/a0/60/d8710068cb79f64d002ebed62a7263c00c8fd95f4ebd4b5be8f7ca93f2bc/sqlalchemy-2.0.46-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:181903fe8c1b9082995325f1b2e84ac078b1189e2819380c2303a5f90e114a62", size = 3311241, upload-time = "2026-01-21T18:32:33.45Z" }, + { url = "https://files.pythonhosted.org/packages/2b/0f/20c71487c7219ab3aa7421c7c62d93824c97c1460f2e8bb72404b0192d13/sqlalchemy-2.0.46-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:590be24e20e2424a4c3c1b0835e9405fa3d0af5823a1a9fc02e5dff56471515f", size = 3310741, upload-time = "2026-01-21T18:44:57.887Z" }, + { url = 
"https://files.pythonhosted.org/packages/65/80/d26d00b3b249ae000eee4db206fcfc564bf6ca5030e4747adf451f4b5108/sqlalchemy-2.0.46-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:7568fe771f974abadce52669ef3a03150ff03186d8eb82613bc8adc435a03f01", size = 3263116, upload-time = "2026-01-21T18:32:35.044Z" }, + { url = "https://files.pythonhosted.org/packages/da/ee/74dda7506640923821340541e8e45bd3edd8df78664f1f2e0aae8077192b/sqlalchemy-2.0.46-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:ebf7e1e78af38047e08836d33502c7a278915698b7c2145d045f780201679999", size = 3285327, upload-time = "2026-01-21T18:44:59.254Z" }, + { url = "https://files.pythonhosted.org/packages/9f/25/6dcf8abafff1389a21c7185364de145107b7394ecdcb05233815b236330d/sqlalchemy-2.0.46-cp311-cp311-win32.whl", hash = "sha256:9d80ea2ac519c364a7286e8d765d6cd08648f5b21ca855a8017d9871f075542d", size = 2114564, upload-time = "2026-01-21T18:33:15.85Z" }, + { url = "https://files.pythonhosted.org/packages/93/5f/e081490f8523adc0088f777e4ebad3cac21e498ec8a3d4067074e21447a1/sqlalchemy-2.0.46-cp311-cp311-win_amd64.whl", hash = "sha256:585af6afe518732d9ccd3aea33af2edaae4a7aa881af5d8f6f4fe3a368699597", size = 2139233, upload-time = "2026-01-21T18:33:17.528Z" }, + { url = "https://files.pythonhosted.org/packages/b6/35/d16bfa235c8b7caba3730bba43e20b1e376d2224f407c178fbf59559f23e/sqlalchemy-2.0.46-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:3a9a72b0da8387f15d5810f1facca8f879de9b85af8c645138cba61ea147968c", size = 2153405, upload-time = "2026-01-21T19:05:54.143Z" }, + { url = "https://files.pythonhosted.org/packages/06/6c/3192e24486749862f495ddc6584ed730c0c994a67550ec395d872a2ad650/sqlalchemy-2.0.46-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:2347c3f0efc4de367ba00218e0ae5c4ba2306e47216ef80d6e31761ac97cb0b9", size = 3334702, upload-time = "2026-01-21T18:46:45.384Z" }, + { url = 
"https://files.pythonhosted.org/packages/ea/a2/b9f33c8d68a3747d972a0bb758c6b63691f8fb8a49014bc3379ba15d4274/sqlalchemy-2.0.46-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9094c8b3197db12aa6f05c51c05daaad0a92b8c9af5388569847b03b1007fb1b", size = 3347664, upload-time = "2026-01-21T18:40:09.979Z" }, + { url = "https://files.pythonhosted.org/packages/aa/d2/3e59e2a91eaec9db7e8dc6b37b91489b5caeb054f670f32c95bcba98940f/sqlalchemy-2.0.46-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:37fee2164cf21417478b6a906adc1a91d69ae9aba8f9533e67ce882f4bb1de53", size = 3277372, upload-time = "2026-01-21T18:46:47.168Z" }, + { url = "https://files.pythonhosted.org/packages/dd/dd/67bc2e368b524e2192c3927b423798deda72c003e73a1e94c21e74b20a85/sqlalchemy-2.0.46-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b1e14b2f6965a685c7128bd315e27387205429c2e339eeec55cb75ca4ab0ea2e", size = 3312425, upload-time = "2026-01-21T18:40:11.548Z" }, + { url = "https://files.pythonhosted.org/packages/43/82/0ecd68e172bfe62247e96cb47867c2d68752566811a4e8c9d8f6e7c38a65/sqlalchemy-2.0.46-cp312-cp312-win32.whl", hash = "sha256:412f26bb4ba942d52016edc8d12fb15d91d3cd46b0047ba46e424213ad407bcb", size = 2113155, upload-time = "2026-01-21T18:42:49.748Z" }, + { url = "https://files.pythonhosted.org/packages/bc/2a/2821a45742073fc0331dc132552b30de68ba9563230853437cac54b2b53e/sqlalchemy-2.0.46-cp312-cp312-win_amd64.whl", hash = "sha256:ea3cd46b6713a10216323cda3333514944e510aa691c945334713fca6b5279ff", size = 2140078, upload-time = "2026-01-21T18:42:51.197Z" }, + { url = "https://files.pythonhosted.org/packages/fc/a1/9c4efa03300926601c19c18582531b45aededfb961ab3c3585f1e24f120b/sqlalchemy-2.0.46-py3-none-any.whl", hash = "sha256:f9c11766e7e7c0a2767dda5acb006a118640c9fc0a4104214b96269bfb78399e", size = 1937882, upload-time = "2026-01-21T18:22:10.456Z" }, ] [[package]] @@ -6248,8 +6259,7 @@ dependencies = [ { name = "numpy" }, { name = "protobuf" }, { name = 
"six" }, - { name = "urllib3", version = "2.3.0", source = { registry = "https://pypi.org/simple" }, marker = "(python_full_version < '3.12.4' and platform_python_implementation != 'PyPy') or platform_python_implementation == 'PyPy' or sys_platform != 'linux'" }, - { name = "urllib3", version = "2.6.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'" }, + { name = "urllib3" }, ] sdist = { url = "https://files.pythonhosted.org/packages/f1/39/47a3ec8e42fe74dd05af1dfed9c3b02b8f8adfdd8656b2c5d4f95f975c9f/tablestore-6.3.7.tar.gz", hash = "sha256:990682dbf6b602f317a2d359b4281dcd054b4326081e7a67b73dbbe95407be51", size = 117440, upload-time = "2025-10-29T02:57:57.415Z" } wheels = [ @@ -6290,8 +6300,7 @@ dependencies = [ { name = "requests" }, { name = "tcvdb-text" }, { name = "ujson" }, - { name = "urllib3", version = "2.3.0", source = { registry = "https://pypi.org/simple" }, marker = "(python_full_version < '3.12.4' and platform_python_implementation != 'PyPy') or platform_python_implementation == 'PyPy' or sys_platform != 'linux'" }, - { name = "urllib3", version = "2.6.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'" }, + { name = "urllib3" }, ] sdist = { url = "https://files.pythonhosted.org/packages/19/ec/c80579aff1539257aafcf8dc3f3c13630171f299d65b33b68440e166f27c/tcvectordb-1.6.4.tar.gz", hash = "sha256:6fb18e15ccc6744d5147e9bbd781f84df3d66112de7d9cc615878b3f72d3a29a", size = 75188, upload-time = "2025-03-05T09:14:19.925Z" } wheels = [ @@ -6315,8 +6324,7 @@ dependencies = [ { name = "docker" }, { name = "python-dotenv" }, { name = "typing-extensions" }, - { name = "urllib3", version = "2.3.0", source = { registry = "https://pypi.org/simple" }, marker = "(python_full_version < '3.12.4' and platform_python_implementation != 
'PyPy') or platform_python_implementation == 'PyPy' or sys_platform != 'linux'" }, - { name = "urllib3", version = "2.6.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'" }, + { name = "urllib3" }, { name = "wrapt" }, ] sdist = { url = "https://files.pythonhosted.org/packages/fc/b3/c272537f3ea2f312555efeb86398cc382cd07b740d5f3c730918c36e64e1/testcontainers-4.13.3.tar.gz", hash = "sha256:9d82a7052c9a53c58b69e1dc31da8e7a715e8b3ec1c4df5027561b47e2efe646", size = 79064, upload-time = "2025-11-14T05:08:47.584Z" } @@ -6471,27 +6479,26 @@ wheels = [ [[package]] name = "ty" -version = "0.0.11" +version = "0.0.13" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/bc/45/5ae578480168d4b3c08cf8e5eac3caf8eb7acdb1a06a9bed7519564bd9b4/ty-0.0.11.tar.gz", hash = "sha256:ebcbc7d646847cb6610de1da4ffc849d8b800e29fd1e9ebb81ba8f3fbac88c25", size = 4920340, upload-time = "2026-01-09T21:06:01.592Z" } +sdist = { url = "https://files.pythonhosted.org/packages/5a/dc/b607f00916f5a7c52860b84a66dc17bc6988e8445e96b1d6e175a3837397/ty-0.0.13.tar.gz", hash = "sha256:7a1d135a400ca076407ea30012d1f75419634160ed3b9cad96607bf2956b23b3", size = 4999183, upload-time = "2026-01-21T13:21:16.133Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/0f/34/b1d05cdcd01589a8d2e63011e0a1e24dcefdc2a09d024fee3e27755963f6/ty-0.0.11-py3-none-linux_armv6l.whl", hash = "sha256:68f0b8d07b0a2ea7ec63a08ba2624f853e4f9fa1a06fce47fb453fa279dead5a", size = 9521748, upload-time = "2026-01-09T21:06:13.221Z" }, - { url = "https://files.pythonhosted.org/packages/43/21/f52d93f4b3784b91bfbcabd01b84dc82128f3a9de178536bcf82968f3367/ty-0.0.11-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:cbf82d7ef0618e9ae3cc3c37c33abcfa302c9b3e3b8ff11d71076f98481cb1a8", size = 9454903, upload-time = "2026-01-09T21:06:42.363Z" }, - { url = 
"https://files.pythonhosted.org/packages/ad/01/3a563dba8b1255e474c35e1c3810b7589e81ae8c41df401b6a37c8e2cde9/ty-0.0.11-py3-none-macosx_11_0_arm64.whl", hash = "sha256:121987c906e02264c3b511b95cb9f8a3cdd66f3283b8bbab678ca3525652e304", size = 8823417, upload-time = "2026-01-09T21:06:26.315Z" }, - { url = "https://files.pythonhosted.org/packages/6f/b1/99b87222c05d3a28fb7bbfb85df4efdde8cb6764a24c1b138f3a615283dd/ty-0.0.11-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:999390b6cc045fe5e1b3da1c2c9ae8e8c0def23b69455e7c9191ba9ffd747023", size = 9290785, upload-time = "2026-01-09T21:05:59.028Z" }, - { url = "https://files.pythonhosted.org/packages/3d/9f/598809a8fff2194f907ba6de07ac3d7b7788342592d8f8b98b1b50c2fb49/ty-0.0.11-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ed504d78eb613c49be3c848f236b345b6c13dc6bcfc4b202790a60a97e1d8f35", size = 9359392, upload-time = "2026-01-09T21:06:37.459Z" }, - { url = "https://files.pythonhosted.org/packages/71/3e/aeea2a97b38f3dcd9f8224bf83609848efa4bc2f484085508165567daa7b/ty-0.0.11-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7fedc8b43cc8a9991e0034dd205f957a8380dd29bfce36f2a35b5d321636dfd9", size = 9852973, upload-time = "2026-01-09T21:06:21.245Z" }, - { url = "https://files.pythonhosted.org/packages/72/40/86173116995e38f954811a86339ac4c00a2d8058cc245d3e4903bc4a132c/ty-0.0.11-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:0808bdfb7efe09881bf70249b85b0498fb8b75fbb036ce251c496c20adb10075", size = 10796113, upload-time = "2026-01-09T21:06:16.034Z" }, - { url = "https://files.pythonhosted.org/packages/69/71/97c92c401dacae9baa3696163ebe8371635ebf34ba9fda781110d0124857/ty-0.0.11-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:07185b3e38b18c562056dfbc35fb51d866f872977ea1ebcd64ca24a001b5b4f1", size = 10432137, upload-time = "2026-01-09T21:06:07.498Z" }, - { url = 
"https://files.pythonhosted.org/packages/18/10/9ab43f3cfc5f7792f6bc97620f54d0a0a81ef700be84ea7f6be330936a99/ty-0.0.11-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b5c72f1ada8eb5be984502a600f71d1a3099e12fb6f3c0607aaba2f86f0e9d80", size = 10240520, upload-time = "2026-01-09T21:06:34.823Z" }, - { url = "https://files.pythonhosted.org/packages/74/18/8dd4fe6df1fd66f3e83b4798eddb1d8482d9d9b105f25099b76703402ebb/ty-0.0.11-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:25f88e8789072830348cb59b761d5ced70642ed5600673b4bf6a849af71eca8b", size = 9973340, upload-time = "2026-01-09T21:06:39.657Z" }, - { url = "https://files.pythonhosted.org/packages/e4/0b/fb2301450cf8f2d7164944d6e1e659cac9ec7021556cc173d54947cf8ef4/ty-0.0.11-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:f370e1047a62dcedcd06e2b27e1f0b16c7f8ea2361d9070fcbf0d0d69baaa192", size = 9262101, upload-time = "2026-01-09T21:06:28.989Z" }, - { url = "https://files.pythonhosted.org/packages/f7/8c/d6374af023541072dee1c8bcfe8242669363a670b7619e6fffcc7415a995/ty-0.0.11-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:52be34047ed6177bfcef9247459a767ec03d775714855e262bca1fb015895e8a", size = 9382756, upload-time = "2026-01-09T21:06:24.097Z" }, - { url = "https://files.pythonhosted.org/packages/0d/44/edd1e63ffa8d49d720c475c2c1c779084e5efe50493afdc261938705d10a/ty-0.0.11-py3-none-musllinux_1_2_i686.whl", hash = "sha256:b9e5762ccb3778779378020b8d78f936b3f52ea83f18785319cceba3ae85d8e6", size = 9553944, upload-time = "2026-01-09T21:06:18.426Z" }, - { url = "https://files.pythonhosted.org/packages/35/cd/4afdb0d182d23d07ff287740c4954cc6dde5c3aed150ec3f2a1d72b00f71/ty-0.0.11-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:e9334646ee3095e778e3dbc45fdb2bddfc16acc7804283830ad84991ece16dd7", size = 10060365, upload-time = "2026-01-09T21:06:45.083Z" }, - { url = 
"https://files.pythonhosted.org/packages/d1/94/a009ad9d8b359933cfea8721c689c0331189be28650d74dcc6add4d5bb09/ty-0.0.11-py3-none-win32.whl", hash = "sha256:44cfb7bb2d6784bd7ffe7b5d9ea90851d9c4723729c50b5f0732d4b9a2013cfc", size = 9040448, upload-time = "2026-01-09T21:06:32.241Z" }, - { url = "https://files.pythonhosted.org/packages/df/04/5a5dfd0aec0ea99ead1e824ee6e347fb623c464da7886aa1e3660fb0f36c/ty-0.0.11-py3-none-win_amd64.whl", hash = "sha256:1bb205db92715d4a13343bfd5b0c59ce8c0ca0daa34fb220ec9120fc66ccbda7", size = 9780112, upload-time = "2026-01-09T21:06:04.69Z" }, - { url = "https://files.pythonhosted.org/packages/ad/07/47d4fccd7bcf5eea1c634d518d6cb233f535a85d0b63fcd66815759e2fa0/ty-0.0.11-py3-none-win_arm64.whl", hash = "sha256:4688bd87b2dc5c85da277bda78daba14af2e66f3dda4d98f3604e3de75519eba", size = 9194038, upload-time = "2026-01-09T21:06:10.152Z" }, + { url = "https://files.pythonhosted.org/packages/1a/df/3632f1918f4c0a33184f107efc5d436ab6da147fd3d3b94b3af6461efbf4/ty-0.0.13-py3-none-linux_armv6l.whl", hash = "sha256:1b2b8e02697c3a94c722957d712a0615bcc317c9b9497be116ef746615d892f2", size = 9993501, upload-time = "2026-01-21T13:21:26.628Z" }, + { url = "https://files.pythonhosted.org/packages/92/87/6a473ced5ac280c6ce5b1627c71a8a695c64481b99aabc798718376a441e/ty-0.0.13-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:f15cdb8e233e2b5adfce673bb21f4c5e8eaf3334842f7eea3c70ac6fda8c1de5", size = 9860986, upload-time = "2026-01-21T13:21:24.425Z" }, + { url = "https://files.pythonhosted.org/packages/5d/9b/d89ae375cf0a7cd9360e1164ce017f8c753759be63b6a11ed4c944abe8c6/ty-0.0.13-py3-none-macosx_11_0_arm64.whl", hash = "sha256:0819e89ac9f0d8af7a062837ce197f0461fee2fc14fd07e2c368780d3a397b73", size = 9350748, upload-time = "2026-01-21T13:21:28.502Z" }, + { url = "https://files.pythonhosted.org/packages/a8/a6/9ad58518056fab344b20c0bb2c1911936ebe195318e8acc3bc45ac1c6b6b/ty-0.0.13-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = 
"sha256:1de79f481084b7cc7a202ba0d7a75e10970d10ffa4f025b23f2e6b7324b74886", size = 9849884, upload-time = "2026-01-21T13:21:21.886Z" }, + { url = "https://files.pythonhosted.org/packages/b1/c3/8add69095fa179f523d9e9afcc15a00818af0a37f2b237a9b59bc0046c34/ty-0.0.13-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:4fb2154cff7c6e95d46bfaba283c60642616f20d73e5f96d0c89c269f3e1bcec", size = 9822975, upload-time = "2026-01-21T13:21:14.292Z" }, + { url = "https://files.pythonhosted.org/packages/a4/05/4c0927c68a0a6d43fb02f3f0b6c19c64e3461dc8ed6c404dde0efb8058f7/ty-0.0.13-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:00be58d89337c27968a20d58ca553458608c5b634170e2bec82824c2e4cf4d96", size = 10294045, upload-time = "2026-01-21T13:21:30.505Z" }, + { url = "https://files.pythonhosted.org/packages/b4/86/6dc190838aba967557fe0bfd494c595d00b5081315a98aaf60c0e632aaeb/ty-0.0.13-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:72435eade1fa58c6218abb4340f43a6c3ff856ae2dc5722a247d3a6dd32e9737", size = 10916460, upload-time = "2026-01-21T13:21:07.788Z" }, + { url = "https://files.pythonhosted.org/packages/04/40/9ead96b7c122e1109dfcd11671184c3506996bf6a649306ec427e81d9544/ty-0.0.13-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:77a548742ee8f621d718159e7027c3b555051d096a49bb580249a6c5fc86c271", size = 10597154, upload-time = "2026-01-21T13:21:18.064Z" }, + { url = "https://files.pythonhosted.org/packages/aa/7d/e832a2c081d2be845dc6972d0c7998914d168ccbc0b9c86794419ab7376e/ty-0.0.13-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:da067c57c289b7cf914669704b552b6207c2cc7f50da4118c3e12388642e6b3f", size = 10410710, upload-time = "2026-01-21T13:21:12.388Z" }, + { url = "https://files.pythonhosted.org/packages/31/e3/898be3a96237a32f05c4c29b43594dc3b46e0eedfe8243058e46153b324f/ty-0.0.13-py3-none-musllinux_1_2_aarch64.whl", hash = 
"sha256:d1b50a01fffa140417fca5a24b658fbe0734074a095d5b6f0552484724474343", size = 9826299, upload-time = "2026-01-21T13:21:00.845Z" }, + { url = "https://files.pythonhosted.org/packages/bb/eb/db2d852ce0ed742505ff18ee10d7d252f3acfd6fc60eca7e9c7a0288a6d8/ty-0.0.13-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:0f33c46f52e5e9378378eca0d8059f026f3c8073ace02f7f2e8d079ddfe5207e", size = 9831610, upload-time = "2026-01-21T13:21:05.842Z" }, + { url = "https://files.pythonhosted.org/packages/9e/61/149f59c8abaddcbcbb0bd13b89c7741ae1c637823c5cf92ed2c644fcadef/ty-0.0.13-py3-none-musllinux_1_2_i686.whl", hash = "sha256:168eda24d9a0b202cf3758c2962cc295878842042b7eca9ed2965259f59ce9f2", size = 9978885, upload-time = "2026-01-21T13:21:10.306Z" }, + { url = "https://files.pythonhosted.org/packages/a0/cd/026d4e4af60a80918a8d73d2c42b8262dd43ab2fa7b28d9743004cb88d57/ty-0.0.13-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:d4917678b95dc8cb399cc459fab568ba8d5f0f33b7a94bf840d9733043c43f29", size = 10506453, upload-time = "2026-01-21T13:20:56.633Z" }, + { url = "https://files.pythonhosted.org/packages/63/06/8932833a4eca2df49c997a29afb26721612de8078ae79074c8fe87e17516/ty-0.0.13-py3-none-win32.whl", hash = "sha256:c1f2ec40daa405508b053e5b8e440fbae5fdb85c69c9ab0ee078f8bc00eeec3d", size = 9433482, upload-time = "2026-01-21T13:20:58.717Z" }, + { url = "https://files.pythonhosted.org/packages/aa/fd/e8d972d1a69df25c2cecb20ea50e49ad5f27a06f55f1f5f399a563e71645/ty-0.0.13-py3-none-win_amd64.whl", hash = "sha256:8b7b1ab9f187affbceff89d51076038363b14113be29bda2ddfa17116de1d476", size = 10319156, upload-time = "2026-01-21T13:21:03.266Z" }, + { url = "https://files.pythonhosted.org/packages/2d/c2/05fdd64ac003a560d4fbd1faa7d9a31d75df8f901675e5bed1ee2ceeff87/ty-0.0.13-py3-none-win_arm64.whl", hash = "sha256:1c9630333497c77bb9bcabba42971b96ee1f36c601dd3dcac66b4134f9fa38f0", size = 9808316, upload-time = "2026-01-21T13:20:54.053Z" }, ] [[package]] @@ -6520,11 +6527,11 @@ wheels = [ [[package]] name 
= "types-awscrt" -version = "0.31.0" +version = "0.31.1" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/18/9f/9be587f2243ea7837ad83aad248ff4d8f9a880ac5a84544e9661e5840a22/types_awscrt-0.31.0.tar.gz", hash = "sha256:aa8b42148af0847be14e2b8ea3637a3518ffab038f8d3be7083950f3ce87d3ff", size = 17817, upload-time = "2026-01-12T06:42:37.711Z" } +sdist = { url = "https://files.pythonhosted.org/packages/97/be/589b7bba42b5681a72bac4d714287afef4e1bb84d07c859610ff631d449e/types_awscrt-0.31.1.tar.gz", hash = "sha256:08b13494f93f45c1a92eb264755fce50ed0d1dc75059abb5e31670feb9a09724", size = 17839, upload-time = "2026-01-16T02:01:23.394Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/4c/8d/87ac494b5165e7650b2bc92ee3325c1339a47323489beeda32dffc9a1334/types_awscrt-0.31.0-py3-none-any.whl", hash = "sha256:009cfe5b9af8c75e8304243490e20a5229e7a56203f1d41481f5522233453f51", size = 42509, upload-time = "2026-01-12T06:42:36.187Z" }, + { url = "https://files.pythonhosted.org/packages/5e/fd/ddca80617f230bd833f99b4fb959abebffd8651f520493cae2e96276b1bd/types_awscrt-0.31.1-py3-none-any.whl", hash = "sha256:7e4364ac635f72bd57f52b093883640b1448a6eded0ecbac6e900bf4b1e4777b", size = 42516, upload-time = "2026-01-16T02:01:21.637Z" }, ] [[package]] @@ -6657,11 +6664,11 @@ wheels = [ [[package]] name = "types-jmespath" -version = "1.0.2.20250809" +version = "1.1.0.20260124" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/d5/ff/6848b1603ca47fff317b44dfff78cc1fb0828262f840b3ab951b619d5a22/types_jmespath-1.0.2.20250809.tar.gz", hash = "sha256:e194efec21c0aeae789f701ae25f17c57c25908e789b1123a5c6f8d915b4adff", size = 10248, upload-time = "2025-08-09T03:14:57.996Z" } +sdist = { url = "https://files.pythonhosted.org/packages/2b/ca/c8d7fc6e450c2f8fc6f510cb194754c43b17f933f2dcabcfc6985cbb97a8/types_jmespath-1.1.0.20260124.tar.gz", hash = 
"sha256:29d86868e72c0820914577077b27d167dcab08b1fc92157a29d537ff7153fdfe", size = 10709, upload-time = "2026-01-24T03:18:46.557Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/0e/6a/65c8be6b6555beaf1a654ae1c2308c2e19a610c0b318a9730e691b79ac79/types_jmespath-1.0.2.20250809-py3-none-any.whl", hash = "sha256:4147d17cc33454f0dac7e78b4e18e532a1330c518d85f7f6d19e5818ab83da21", size = 11494, upload-time = "2025-08-09T03:14:57.292Z" }, + { url = "https://files.pythonhosted.org/packages/61/91/915c4a6e6e9bd2bca3ec0c21c1771b175c59e204b85e57f3f572370fe753/types_jmespath-1.1.0.20260124-py3-none-any.whl", hash = "sha256:ec387666d446b15624215aa9cbd2867ffd885b6c74246d357c65e830c7a138b3", size = 11509, upload-time = "2026-01-24T03:18:45.536Z" }, ] [[package]] @@ -6793,11 +6800,11 @@ wheels = [ [[package]] name = "types-python-dateutil" -version = "2.9.0.20251115" +version = "2.9.0.20260124" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/6a/36/06d01fb52c0d57e9ad0c237654990920fa41195e4b3d640830dabf9eeb2f/types_python_dateutil-2.9.0.20251115.tar.gz", hash = "sha256:8a47f2c3920f52a994056b8786309b43143faa5a64d4cbb2722d6addabdf1a58", size = 16363, upload-time = "2025-11-15T03:00:13.717Z" } +sdist = { url = "https://files.pythonhosted.org/packages/fe/41/4f8eb1ce08688a9e3e23709ed07089ccdeaf95b93745bfb768c6da71197d/types_python_dateutil-2.9.0.20260124.tar.gz", hash = "sha256:7d2db9f860820c30e5b8152bfe78dbdf795f7d1c6176057424e8b3fdd1f581af", size = 16596, upload-time = "2026-01-24T03:18:42.975Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/43/0b/56961d3ba517ed0df9b3a27bfda6514f3d01b28d499d1bce9068cfe4edd1/types_python_dateutil-2.9.0.20251115-py3-none-any.whl", hash = "sha256:9cf9c1c582019753b8639a081deefd7e044b9fa36bd8217f565c6c4e36ee0624", size = 18251, upload-time = "2025-11-15T03:00:12.317Z" }, + { url = 
"https://files.pythonhosted.org/packages/5a/c2/aa5e3f4103cc8b1dcf92432415dde75d70021d634ecfd95b2e913cf43e17/types_python_dateutil-2.9.0.20260124-py3-none-any.whl", hash = "sha256:f802977ae08bf2260142e7ca1ab9d4403772a254409f7bbdf652229997124951", size = 18266, upload-time = "2026-01-24T03:18:42.155Z" }, ] [[package]] @@ -6863,8 +6870,7 @@ name = "types-requests" version = "2.32.4.20260107" source = { registry = "https://pypi.org/simple" } dependencies = [ - { name = "urllib3", version = "2.3.0", source = { registry = "https://pypi.org/simple" }, marker = "(python_full_version < '3.12.4' and platform_python_implementation != 'PyPy') or platform_python_implementation == 'PyPy' or sys_platform != 'linux'" }, - { name = "urllib3", version = "2.6.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'" }, + { name = "urllib3" }, ] sdist = { url = "https://files.pythonhosted.org/packages/0f/f3/a0663907082280664d745929205a89d41dffb29e89a50f753af7d57d0a96/types_requests-2.32.4.20260107.tar.gz", hash = "sha256:018a11ac158f801bfa84857ddec1650750e393df8a004a8a9ae2a9bec6fcb24f", size = 23165, upload-time = "2026-01-07T03:20:54.091Z" } wheels = [ @@ -6882,11 +6888,11 @@ wheels = [ [[package]] name = "types-setuptools" -version = "80.9.0.20251223" +version = "80.10.0.20260124" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/00/07/d1b605230730990de20477150191d6dccf6aecc037da94c9960a5d563bc8/types_setuptools-80.9.0.20251223.tar.gz", hash = "sha256:d3411059ae2f5f03985217d86ac6084efea2c9e9cacd5f0869ef950f308169b2", size = 42420, upload-time = "2025-12-23T03:18:26.752Z" } +sdist = { url = "https://files.pythonhosted.org/packages/aa/7e/116539b9610585e34771611e33c88a4c706491fa3565500f5a63139f8731/types_setuptools-80.10.0.20260124.tar.gz", hash = 
"sha256:1b86d9f0368858663276a0cbe5fe5a9722caf94b5acde8aba0399a6e90680f20", size = 43299, upload-time = "2026-01-24T03:18:39.527Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/78/5c/b8877da94012dbc6643e4eeca22bca9b99b295be05d161f8a403ae9387c0/types_setuptools-80.9.0.20251223-py3-none-any.whl", hash = "sha256:1b36db79d724c2287d83dc052cf887b47c0da6a2fff044378be0b019545f56e6", size = 64318, upload-time = "2025-12-23T03:18:25.868Z" }, + { url = "https://files.pythonhosted.org/packages/2b/7f/016dc5cc718ec6ccaa84fb73ed409ef1c261793fd5e637cdfaa18beb40a9/types_setuptools-80.10.0.20260124-py3-none-any.whl", hash = "sha256:efed7e044f01adb9c2806c7a8e1b6aa3656b8e382379b53d5f26ee3db24d4c01", size = 64333, upload-time = "2026-01-24T03:18:38.344Z" }, ] [[package]] @@ -6921,16 +6927,16 @@ wheels = [ [[package]] name = "types-tensorflow" -version = "2.18.0.20260113" +version = "2.18.0.20260121" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "numpy" }, { name = "types-protobuf" }, { name = "types-requests" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/f1/ae/2e5f5e05df94838f2a8f0d53547815f87549a2f6a7fa56dd0c6e00ef8fec/types_tensorflow-2.18.0.20260113.tar.gz", hash = "sha256:d06a02c407191197618ea363a69b2fd16eabd6eaf362a67616bc19cfdc3206ec", size = 257994, upload-time = "2026-01-13T03:20:23.066Z" } +sdist = { url = "https://files.pythonhosted.org/packages/ed/81/43d17caea48c3454bf64c23cba5f7876fc0cd0f0434f350f61782cc95587/types_tensorflow-2.18.0.20260121.tar.gz", hash = "sha256:7fe9f75fd00be0f53ca97ba3d3b4cf8ab45447f6d3a959ad164cf9ac421a5f89", size = 258281, upload-time = "2026-01-21T03:24:22.488Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/58/3e/6b27094b0afa3ed72be93f67c21dfb68e2a32918a71f3e0e15c5462c8b94/types_tensorflow-2.18.0.20260113-py3-none-any.whl", hash = "sha256:f16ea939a9d5cb9664acdffdeefc229e6dc29e175644c5223b95505d469350b0", size = 329425, upload-time = "2026-01-13T03:20:21.418Z" }, + { url 
= "https://files.pythonhosted.org/packages/87/84/6510e7c7b29c6005d93fd6762f7d7d4a413ffd8ec8e04ebc53ac2d8c5372/types_tensorflow-2.18.0.20260121-py3-none-any.whl", hash = "sha256:80d9a9528fa52dc215a914d6ba47f5500f54b421efd2923adf98cff1760b2cce", size = 329562, upload-time = "2026-01-21T03:24:21.147Z" }, ] [[package]] @@ -7097,7 +7103,7 @@ pptx = [ [[package]] name = "unstructured-client" -version = "0.42.6" +version = "0.42.8" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "aiofiles" }, @@ -7108,9 +7114,9 @@ dependencies = [ { name = "pypdf" }, { name = "requests-toolbelt" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/ff/fe/c6d334d4fb9a4a006125a1a8a3918be643c268290707d48e9cd060b71f7f/unstructured_client-0.42.6.tar.gz", hash = "sha256:ea54f2c4ca3e7a1330f9e77cbc96f88f829518beeec5e1b797b5352f4d76a73a", size = 94179, upload-time = "2025-12-17T03:49:58.38Z" } +sdist = { url = "https://files.pythonhosted.org/packages/d8/67/6afb5337e97566a9dc0337606223893ce01f175bd17bf05844a816581b69/unstructured_client-0.42.8.tar.gz", hash = "sha256:663655548ed5c205efb48b7f38ca0906998b33571512f7c53c60aa811e514464", size = 94400, upload-time = "2026-01-14T21:54:03.373Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/5e/12/5aa5d051b32d0c09077a8e83920e794b9bf2315739add4ab821e71fbca58/unstructured_client-0.42.6-py3-none-any.whl", hash = "sha256:c93b1d9d1b9f63a8e961729d00224b3659ef9ef3e14996ea4e53ddc95df671a9", size = 219563, upload-time = "2025-12-17T03:49:56.993Z" }, + { url = "https://files.pythonhosted.org/packages/94/18/d792b297937459ef54e3972b08ce3b5bdd4018d053837a8cfb3c40dd1c49/unstructured_client-0.42.8-py3-none-any.whl", hash = "sha256:6dbdb62d36554a5cbe61dc1b6ef0c8b11a46cc61e2602c2dc22975ba78028214", size = 219970, upload-time = "2026-01-14T21:54:01.206Z" }, ] [[package]] @@ -7134,35 +7140,10 @@ wheels = [ { url = 
"https://files.pythonhosted.org/packages/a9/99/3ae339466c9183ea5b8ae87b34c0b897eda475d2aec2307cae60e5cd4f29/uritemplate-4.2.0-py3-none-any.whl", hash = "sha256:962201ba1c4edcab02e60f9a0d3821e82dfc5d2d6662a21abd533879bdb8a686", size = 11488, upload-time = "2025-06-02T15:12:03.405Z" }, ] -[[package]] -name = "urllib3" -version = "2.3.0" -source = { registry = "https://pypi.org/simple" } -resolution-markers = [ - "python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform != 'linux'", - "python_full_version >= '3.12' and python_full_version < '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'", - "python_full_version >= '3.12' and python_full_version < '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform != 'linux'", - "python_full_version >= '3.12.4' and platform_python_implementation == 'PyPy' and sys_platform == 'linux'", - "python_full_version >= '3.12.4' and platform_python_implementation == 'PyPy' and sys_platform != 'linux'", - "python_full_version >= '3.12' and python_full_version < '3.12.4' and platform_python_implementation == 'PyPy' and sys_platform == 'linux'", - "python_full_version >= '3.12' and python_full_version < '3.12.4' and platform_python_implementation == 'PyPy' and sys_platform != 'linux'", - "python_full_version < '3.12' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'", - "python_full_version < '3.12' and platform_python_implementation != 'PyPy' and sys_platform != 'linux'", - "python_full_version < '3.12' and platform_python_implementation == 'PyPy' and sys_platform == 'linux'", - "python_full_version < '3.12' and platform_python_implementation == 'PyPy' and sys_platform != 'linux'", -] -sdist = { url = "https://files.pythonhosted.org/packages/aa/63/e53da845320b757bf29ef6a9062f5c669fe997973f966045cb019c3f4b66/urllib3-2.3.0.tar.gz", hash = "sha256:f8c5449b3cf0861679ce7e0503c7b44b5ec981bec0d1d3795a07f1ba96f0204d", size = 307268, 
upload-time = "2024-12-22T07:47:30.032Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/c8/19/4ec628951a74043532ca2cf5d97b7b14863931476d117c471e8e2b1eb39f/urllib3-2.3.0-py3-none-any.whl", hash = "sha256:1cee9ad369867bfdbbb48b7dd50374c0967a0bb7710050facf0dd6911440e3df", size = 128369, upload-time = "2024-12-22T07:47:28.074Z" }, -] - [[package]] name = "urllib3" version = "2.6.3" source = { registry = "https://pypi.org/simple" } -resolution-markers = [ - "python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'", -] sdist = { url = "https://files.pythonhosted.org/packages/c7/24/5f1b3bdffd70275f6661c76461e25f024d5a38a46f04aaca912426a2b1d3/urllib3-2.6.3.tar.gz", hash = "sha256:1b62b6884944a57dbe321509ab94fd4d3b307075e0c2eae991ac71ee15ad38ed", size = 435556, upload-time = "2026-01-07T16:24:43.925Z" } wheels = [ { url = "https://files.pythonhosted.org/packages/39/08/aaaad47bc4e9dc8c725e68f9d04865dbcb2052843ff09c97b08904852d84/urllib3-2.6.3-py3-none-any.whl", hash = "sha256:bf272323e553dfb2e87d9bfd225ca7b0f467b919d7bbd355436d3fd37cb0acd4", size = 131584, upload-time = "2026-01-07T16:24:42.685Z" }, @@ -7341,16 +7322,16 @@ wheels = [ [[package]] name = "wcwidth" -version = "0.2.14" +version = "0.3.2" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/24/30/6b0809f4510673dc723187aeaf24c7f5459922d01e2f794277a3dfb90345/wcwidth-0.2.14.tar.gz", hash = "sha256:4d478375d31bc5395a3c55c40ccdf3354688364cd61c4f6adacaa9215d0b3605", size = 102293, upload-time = "2025-09-22T16:29:53.023Z" } +sdist = { url = "https://files.pythonhosted.org/packages/05/07/0b5bcc9812b1b2fd331cc88289ef4d47d428afdbbf0216bb7d53942d93d6/wcwidth-0.3.2.tar.gz", hash = "sha256:d469b3059dab6b1077def5923ed0a8bf5738bd4a1a87f686d5e2de455354c4ad", size = 233633, upload-time = "2026-01-23T21:08:52.451Z" } wheels = [ - { url = 
"https://files.pythonhosted.org/packages/af/b5/123f13c975e9f27ab9c0770f514345bd406d0e8d3b7a0723af9d43f710af/wcwidth-0.2.14-py2.py3-none-any.whl", hash = "sha256:a7bb560c8aee30f9957e5f9895805edd20602f2d7f720186dfd906e82b4982e1", size = 37286, upload-time = "2025-09-22T16:29:51.641Z" }, + { url = "https://files.pythonhosted.org/packages/72/c6/1452e716c5af065c018f75d42ca97517a04ac6aae4133722e0424649a07c/wcwidth-0.3.2-py3-none-any.whl", hash = "sha256:817abc6a89e47242a349b5d100cbd244301690d6d8d2ec6335f26fe6640a6315", size = 86280, upload-time = "2026-01-23T21:08:51.362Z" }, ] [[package]] name = "weave" -version = "0.52.23" +version = "0.52.25" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "click" }, @@ -7365,9 +7346,9 @@ dependencies = [ { name = "tzdata", marker = "sys_platform == 'win32'" }, { name = "wandb" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/a2/3c/119b64e092218f7c37f9ca65bfda8ef856be6c03687e5701fa67fd16f3df/weave-0.52.23.tar.gz", hash = "sha256:ad4f37cc901cb93a000faedbe1313509e97c2e7a18cbaada574fef820741ff32", size = 647645, upload-time = "2026-01-08T18:23:46.71Z" } +sdist = { url = "https://files.pythonhosted.org/packages/de/c1/3650fd0c1ebbe1bb7cfd4ae549de477def97b29c4632a0aacb8e76c5b632/weave-0.52.25.tar.gz", hash = "sha256:7e1260f5cd7eff0b97e5008ef191e68a5b7b611c07aeea8bc81626f10ee1bab8", size = 657154, upload-time = "2026-01-20T20:12:18.263Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/1e/06/489587dd6f5d06f19535ffbf6cd98e061eef3b50583d8dc40d79271c470b/weave-0.52.23-py3-none-any.whl", hash = "sha256:debe9eac5bdec857dc2507401f075b1b1f3199eccc9ef29f106747c13b23966c", size = 810639, upload-time = "2026-01-08T18:23:44.593Z" }, + { url = "https://files.pythonhosted.org/packages/af/11/02d464838a6fa66228ae5ad4d29d68a9661675a0c787e53d1cd691a5067d/weave-0.52.25-py3-none-any.whl", hash = "sha256:5d0a302059ae507df8d3fd4e39f61a5236612b18272456065056f859bd2be1ee", size = 822409, upload-time = 
"2026-01-20T20:12:16.356Z" }, ] [[package]] @@ -7487,6 +7468,18 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/ff/21/abdedb4cdf6ff41ebf01a74087740a709e2edb146490e4d9beea054b0b7a/wrapt-1.16.0-py3-none-any.whl", hash = "sha256:6906c4100a8fcbf2fa735f6059214bb13b97f75b1a61777fcf6432121ef12ef1", size = 23362, upload-time = "2023-11-09T06:33:28.271Z" }, ] +[[package]] +name = "wsproto" +version = "1.3.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "h11" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/c7/79/12135bdf8b9c9367b8701c2c19a14c913c120b882d50b014ca0d38083c2c/wsproto-1.3.2.tar.gz", hash = "sha256:b86885dcf294e15204919950f666e06ffc6c7c114ca900b060d6e16293528294", size = 50116, upload-time = "2025-11-20T18:18:01.871Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a4/f5/10b68b7b1544245097b2a1b8238f66f2fc6dcaeb24ba5d917f52bd2eed4f/wsproto-1.3.2-py3-none-any.whl", hash = "sha256:61eea322cdf56e8cc904bd3ad7573359a242ba65688716b0710a5eb12beab584", size = 24405, upload-time = "2025-11-20T18:18:00.454Z" }, +] + [[package]] name = "xinference-client" version = "1.2.2" diff --git a/dev/setup b/dev/setup new file mode 100755 index 0000000000..399c8f28a5 --- /dev/null +++ b/dev/setup @@ -0,0 +1,28 @@ +#!/usr/bin/env bash +set -euo pipefail + +SCRIPT_DIR="$(dirname "$(realpath "$0")")" +ROOT="$(dirname "$SCRIPT_DIR")" + +API_ENV_EXAMPLE="$ROOT/api/.env.example" +API_ENV="$ROOT/api/.env" +WEB_ENV_EXAMPLE="$ROOT/web/.env.example" +WEB_ENV="$ROOT/web/.env.local" +MIDDLEWARE_ENV_EXAMPLE="$ROOT/docker/middleware.env.example" +MIDDLEWARE_ENV="$ROOT/docker/middleware.env" + +# 1) Copy api/.env.example -> api/.env +cp "$API_ENV_EXAMPLE" "$API_ENV" + +# 2) Copy web/.env.example -> web/.env.local +cp "$WEB_ENV_EXAMPLE" "$WEB_ENV" + +# 3) Copy docker/middleware.env.example -> docker/middleware.env +cp "$MIDDLEWARE_ENV_EXAMPLE" "$MIDDLEWARE_ENV" + +# 4) Install deps +cd "$ROOT/api" +uv sync --group dev + 
+cd "$ROOT/web" +pnpm install diff --git a/dev/start-api b/dev/start-api index 0b50ad0d0a..bdfa58eca6 100755 --- a/dev/start-api +++ b/dev/start-api @@ -3,8 +3,9 @@ set -x SCRIPT_DIR="$(dirname "$(realpath "$0")")" -cd "$SCRIPT_DIR/.." +cd "$SCRIPT_DIR/../api" +uv run flask db upgrade -uv --directory api run \ +uv run \ flask run --host 0.0.0.0 --port=5001 --debug diff --git a/dev/start-docker-compose b/dev/start-docker-compose new file mode 100755 index 0000000000..9652be169d --- /dev/null +++ b/dev/start-docker-compose @@ -0,0 +1,8 @@ +#!/usr/bin/env bash +set -euo pipefail + +SCRIPT_DIR="$(dirname "$(realpath "$0")")" +ROOT="$(dirname "$SCRIPT_DIR")" + +cd "$ROOT/docker" +docker compose -f docker-compose.middleware.yaml --profile postgresql --profile weaviate -p dify up -d diff --git a/dev/start-worker b/dev/start-worker index 7876620188..3e48065631 100755 --- a/dev/start-worker +++ b/dev/start-worker @@ -83,7 +83,7 @@ while [[ $# -gt 0 ]]; do done SCRIPT_DIR="$(dirname "$(realpath "$0")")" -cd "$SCRIPT_DIR/.." +cd "$SCRIPT_DIR/../api" if [[ -n "${ENV_FILE}" ]]; then if [[ ! -f "${ENV_FILE}" ]]; then @@ -123,6 +123,6 @@ echo " Concurrency: ${CONCURRENCY}" echo " Pool: ${POOL}" echo " Log Level: ${LOGLEVEL}" -uv --directory api run \ +uv run \ celery -A app.celery worker \ -P ${POOL} -c ${CONCURRENCY} --loglevel ${LOGLEVEL} -Q ${QUEUES} diff --git a/dev/update-uv b/dev/update-uv index 189bd1f6b1..35d723fe20 100755 --- a/dev/update-uv +++ b/dev/update-uv @@ -4,7 +4,7 @@ set -e set -o pipefail -SCRIPT_DIR="$(dirname "$0")" +SCRIPT_DIR="$(dirname "$(realpath "$0")")" REPO_ROOT="$(dirname "${SCRIPT_DIR}")" # rely on `poetry` in path diff --git a/docker/.env.example b/docker/.env.example index 3bc52127b4..4563999839 100644 --- a/docker/.env.example +++ b/docker/.env.example @@ -129,6 +129,10 @@ MIGRATION_ENABLED=true # The default value is 300 seconds. 
FILES_ACCESS_TIMEOUT=300 +# Collaboration mode toggle +# To enable collaboration features, you also need to set SERVER_WORKER_CLASS=geventwebsocket.gunicorn.workers.GeventWebSocketWorker +ENABLE_COLLABORATION_MODE=false + # Access token expiration time in minutes ACCESS_TOKEN_EXPIRE_MINUTES=60 @@ -164,6 +168,7 @@ SERVER_WORKER_AMOUNT=1 # Modifying it may also decrease throughput. # # It is strongly discouraged to change this parameter. +# If collaboration mode is enabled, this must be set to geventwebsocket.gunicorn.workers.GeventWebSocketWorker SERVER_WORKER_CLASS=gevent # Default number of worker connections, the default is 10. @@ -401,6 +406,8 @@ CONSOLE_CORS_ALLOW_ORIGINS=* COOKIE_DOMAIN= # When the frontend and backend run on different subdomains, set NEXT_PUBLIC_COOKIE_DOMAIN=1. NEXT_PUBLIC_COOKIE_DOMAIN= +# WebSocket server URL. +NEXT_PUBLIC_SOCKET_URL=ws://localhost NEXT_PUBLIC_BATCH_CONCURRENCY=5 # ------------------------------ diff --git a/docker/docker-compose-template.yaml b/docker/docker-compose-template.yaml index 9659990383..1740161b7b 100644 --- a/docker/docker-compose-template.yaml +++ b/docker/docker-compose-template.yaml @@ -139,6 +139,7 @@ services: APP_API_URL: ${APP_API_URL:-} AMPLITUDE_API_KEY: ${AMPLITUDE_API_KEY:-} NEXT_PUBLIC_COOKIE_DOMAIN: ${NEXT_PUBLIC_COOKIE_DOMAIN:-} + NEXT_PUBLIC_SOCKET_URL: ${NEXT_PUBLIC_SOCKET_URL:-ws://localhost} SENTRY_DSN: ${WEB_SENTRY_DSN:-} NEXT_TELEMETRY_DISABLED: ${NEXT_TELEMETRY_DISABLED:-0} TEXT_GENERATION_TIMEOUT_MS: ${TEXT_GENERATION_TIMEOUT_MS:-60000} diff --git a/docker/docker-compose.yaml b/docker/docker-compose.yaml index 902ca3103c..5fcd3afedf 100644 --- a/docker/docker-compose.yaml +++ b/docker/docker-compose.yaml @@ -33,6 +33,7 @@ x-shared-env: &shared-api-worker-env OPENAI_API_BASE: ${OPENAI_API_BASE:-https://api.openai.com/v1} MIGRATION_ENABLED: ${MIGRATION_ENABLED:-true} FILES_ACCESS_TIMEOUT: ${FILES_ACCESS_TIMEOUT:-300} + ENABLE_COLLABORATION_MODE: ${ENABLE_COLLABORATION_MODE:-false}
ACCESS_TOKEN_EXPIRE_MINUTES: ${ACCESS_TOKEN_EXPIRE_MINUTES:-60} REFRESH_TOKEN_EXPIRE_DAYS: ${REFRESH_TOKEN_EXPIRE_DAYS:-30} APP_DEFAULT_ACTIVE_REQUESTS: ${APP_DEFAULT_ACTIVE_REQUESTS:-0} @@ -109,6 +110,7 @@ x-shared-env: &shared-api-worker-env CONSOLE_CORS_ALLOW_ORIGINS: ${CONSOLE_CORS_ALLOW_ORIGINS:-*} COOKIE_DOMAIN: ${COOKIE_DOMAIN:-} NEXT_PUBLIC_COOKIE_DOMAIN: ${NEXT_PUBLIC_COOKIE_DOMAIN:-} + NEXT_PUBLIC_SOCKET_URL: ${NEXT_PUBLIC_SOCKET_URL:-ws://localhost} NEXT_PUBLIC_BATCH_CONCURRENCY: ${NEXT_PUBLIC_BATCH_CONCURRENCY:-5} STORAGE_TYPE: ${STORAGE_TYPE:-opendal} OPENDAL_SCHEME: ${OPENDAL_SCHEME:-fs} @@ -824,6 +826,7 @@ services: APP_API_URL: ${APP_API_URL:-} AMPLITUDE_API_KEY: ${AMPLITUDE_API_KEY:-} NEXT_PUBLIC_COOKIE_DOMAIN: ${NEXT_PUBLIC_COOKIE_DOMAIN:-} + NEXT_PUBLIC_SOCKET_URL: ${NEXT_PUBLIC_SOCKET_URL:-ws://localhost} SENTRY_DSN: ${WEB_SENTRY_DSN:-} NEXT_TELEMETRY_DISABLED: ${NEXT_TELEMETRY_DISABLED:-0} TEXT_GENERATION_TIMEOUT_MS: ${TEXT_GENERATION_TIMEOUT_MS:-60000} diff --git a/docker/nginx/conf.d/default.conf.template b/docker/nginx/conf.d/default.conf.template index 8058deafee..6643f6348c 100644 --- a/docker/nginx/conf.d/default.conf.template +++ b/docker/nginx/conf.d/default.conf.template @@ -14,6 +14,14 @@ server { include proxy.conf; } + location /socket.io/ { + proxy_pass http://api:5001; + include proxy.conf; + proxy_set_header Upgrade $http_upgrade; + proxy_set_header Connection "upgrade"; + proxy_cache_bypass $http_upgrade; + } + location /v1 { proxy_pass http://api:5001; include proxy.conf; diff --git a/docker/nginx/proxy.conf.template b/docker/nginx/proxy.conf.template index 117f806146..3c39e507ff 100644 --- a/docker/nginx/proxy.conf.template +++ b/docker/nginx/proxy.conf.template @@ -5,7 +5,7 @@ proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; proxy_set_header X-Forwarded-Proto $scheme; proxy_set_header X-Forwarded-Port $server_port; proxy_http_version 1.1; -proxy_set_header Connection ""; +# proxy_set_header Connection ""; 
proxy_buffering off; proxy_read_timeout ${NGINX_PROXY_READ_TIMEOUT}; proxy_send_timeout ${NGINX_PROXY_SEND_TIMEOUT}; diff --git a/web/.env.example b/web/.env.example index df4e725c51..79d45bd0f9 100644 --- a/web/.env.example +++ b/web/.env.example @@ -14,6 +14,8 @@ NEXT_PUBLIC_API_PREFIX=http://localhost:5001/console/api NEXT_PUBLIC_PUBLIC_API_PREFIX=http://localhost:5001/api # When the frontend and backend run on different subdomains, set NEXT_PUBLIC_COOKIE_DOMAIN=1. NEXT_PUBLIC_COOKIE_DOMAIN= +# WebSocket server URL. +NEXT_PUBLIC_SOCKET_URL=ws://localhost:5001 # The API PREFIX for MARKETPLACE NEXT_PUBLIC_MARKETPLACE_API_PREFIX=https://marketplace.dify.ai/api/v1 diff --git a/web/README.md b/web/README.md index 9c731a081a..aa3a04f1b4 100644 --- a/web/README.md +++ b/web/README.md @@ -43,6 +43,8 @@ NEXT_PUBLIC_EDITION=SELF_HOSTED # example: http://cloud.dify.ai/console/api NEXT_PUBLIC_API_PREFIX=http://localhost:5001/console/api NEXT_PUBLIC_COOKIE_DOMAIN= +# WebSocket server URL. +NEXT_PUBLIC_SOCKET_URL=ws://localhost:5001 # The URL for Web APP, refers to the Web App base URL of WEB service if web app domain is different from # console or api domain. 
# example: http://udify.app/api diff --git a/web/app/(commonLayout)/app/(appDetailLayout)/[appId]/overview/card-view.tsx b/web/app/(commonLayout)/app/(appDetailLayout)/[appId]/overview/card-view.tsx index f07b2932c9..abdb8cd196 100644 --- a/web/app/(commonLayout)/app/(appDetailLayout)/[appId]/overview/card-view.tsx +++ b/web/app/(commonLayout)/app/(appDetailLayout)/[appId]/overview/card-view.tsx @@ -5,7 +5,8 @@ import type { BlockEnum } from '@/app/components/workflow/types' import type { UpdateAppSiteCodeResponse } from '@/models/app' import type { App } from '@/types/app' import type { I18nKeysByPrefix } from '@/types/i18n' -import { useCallback, useMemo } from 'react' +import * as React from 'react' +import { useCallback, useEffect, useMemo } from 'react' import { useTranslation } from 'react-i18next' import { useContext } from 'use-context-selector' import AppCard from '@/app/components/app/overview/app-card' @@ -14,6 +15,8 @@ import { useStore as useAppStore } from '@/app/components/app/store' import Loading from '@/app/components/base/loading' import { ToastContext } from '@/app/components/base/toast' import MCPServiceCard from '@/app/components/tools/mcp/mcp-service-card' +import { collaborationManager } from '@/app/components/workflow/collaboration/core/collaboration-manager' +import { webSocketClient } from '@/app/components/workflow/collaboration/core/websocket-manager' import { isTriggerNode } from '@/app/components/workflow/types' import { NEED_REFRESH_APP_LIST_KEY } from '@/config' import { @@ -74,28 +77,59 @@ const CardView: FC = ({ appId, isInPanel, className }) => { ? 
buildTriggerModeMessage(t('mcp.server.title', { ns: 'tools' })) : null - const updateAppDetail = async () => { + const updateAppDetail = useCallback(async () => { try { const res = await fetchAppDetail({ url: '/apps', id: appId }) setAppDetail({ ...res }) } - catch (error) { console.error(error) } - } + catch (error) { + console.error(error) + } + }, [appId, setAppDetail]) const handleCallbackResult = (err: Error | null, message?: I18nKeysByPrefix<'common', 'actionMsg.'>) => { const type = err ? 'error' : 'success' message ||= (type === 'success' ? 'modifiedSuccessfully' : 'modifiedUnsuccessfully') - if (type === 'success') + if (type === 'success') { updateAppDetail() + // Emit collaboration event to notify other clients of app state changes + const socket = webSocketClient.getSocket(appId) + if (socket) { + socket.emit('collaboration_event', { + type: 'app_state_update', + data: { timestamp: Date.now() }, + timestamp: Date.now(), + }) + } + } + notify({ type, message: t(`actionMsg.${message}`, { ns: 'common' }) as string, }) } + // Listen for collaborative app state updates from other clients + useEffect(() => { + if (!appId) + return + + const unsubscribe = collaborationManager.onAppStateUpdate(async () => { + try { + // Update app detail when other clients modify app state + await updateAppDetail() + } + catch (error) { + console.error('app state update failed:', error) + } + }) + + return unsubscribe + }, [appId, updateAppDetail]) + const onChangeSiteStatus = async (value: boolean) => { const [err] = await asyncRunSafe( updateAppSiteStatus({ diff --git a/web/app/components/app-sidebar/app-info.tsx b/web/app/components/app-sidebar/app-info.tsx index 255feaccdf..ebfc75891f 100644 --- a/web/app/components/app-sidebar/app-info.tsx +++ b/web/app/components/app-sidebar/app-info.tsx @@ -14,7 +14,7 @@ import { import dynamic from 'next/dynamic' import { useRouter } from 'next/navigation' import * as React from 'react' -import { useCallback, useState } from 'react' 
+import { useCallback, useEffect, useState } from 'react' import { useTranslation } from 'react-i18next' import { useContext } from 'use-context-selector' import CardView from '@/app/(commonLayout)/app/(appDetailLayout)/[appId]/overview/card-view' @@ -22,10 +22,12 @@ import { useStore as useAppStore } from '@/app/components/app/store' import Button from '@/app/components/base/button' import ContentDialog from '@/app/components/base/content-dialog' import { ToastContext } from '@/app/components/base/toast' +import { collaborationManager } from '@/app/components/workflow/collaboration/core/collaboration-manager' +import { webSocketClient } from '@/app/components/workflow/collaboration/core/websocket-manager' import { NEED_REFRESH_APP_LIST_KEY } from '@/config' import { useAppContext } from '@/context/app-context' import { useProviderContext } from '@/context/provider-context' -import { copyApp, deleteApp, exportAppConfig, updateAppInfo } from '@/service/apps' +import { copyApp, deleteApp, exportAppConfig, fetchAppDetail, updateAppInfo } from '@/service/apps' import { useInvalidateAppList } from '@/service/use-apps' import { fetchWorkflowDraft } from '@/service/workflow' import { AppModeEnum } from '@/types/app' @@ -77,6 +79,19 @@ const AppInfo = ({ expand, onlyShowDetail = false, openState = false, onDetailEx const [secretEnvList, setSecretEnvList] = useState([]) const [showExportWarning, setShowExportWarning] = useState(false) + const emitAppMetaUpdate = useCallback(() => { + if (!appDetail?.id) + return + const socket = webSocketClient.getSocket(appDetail.id) + if (socket) { + socket.emit('collaboration_event', { + type: 'app_meta_update', + data: { timestamp: Date.now() }, + timestamp: Date.now(), + }) + } + }, [appDetail]) + const onEdit: CreateAppModalProps['onConfirm'] = useCallback(async ({ name, icon_type, @@ -105,11 +120,12 @@ const AppInfo = ({ expand, onlyShowDetail = false, openState = false, onDetailEx message: t('editDone', { ns: 'app' }), }) 
setAppDetail(app) + emitAppMetaUpdate() } catch { notify({ type: 'error', message: t('editFailed', { ns: 'app' }) }) } - }, [appDetail, notify, setAppDetail, t]) + }, [appDetail, notify, setAppDetail, t, emitAppMetaUpdate]) const onCopy: DuplicateAppModalProps['onConfirm'] = async ({ name, icon_type, icon, icon_background }) => { if (!appDetail) @@ -207,6 +223,23 @@ const AppInfo = ({ expand, onlyShowDetail = false, openState = false, onDetailEx setShowConfirmDelete(false) }, [appDetail, invalidateAppList, notify, onPlanInfoChanged, replace, setAppDetail, t]) + useEffect(() => { + if (!appDetail?.id) + return + + const unsubscribe = collaborationManager.onAppMetaUpdate(async () => { + try { + const res = await fetchAppDetail({ url: '/apps', id: appDetail.id }) + setAppDetail({ ...res }) + } + catch (error) { + console.error('failed to refresh app detail from collaboration update:', error) + } + }) + + return unsubscribe + }, [appDetail?.id, setAppDetail]) + const { isCurrentWorkspaceEditor } = useAppContext() if (!appDetail) diff --git a/web/app/components/app/app-publisher/features-wrapper.tsx b/web/app/components/app/app-publisher/features-wrapper.tsx index 381e9a553e..d1a23b008e 100644 --- a/web/app/components/app/app-publisher/features-wrapper.tsx +++ b/web/app/components/app/app-publisher/features-wrapper.tsx @@ -1,6 +1,7 @@ import type { AppPublisherProps } from '@/app/components/app/app-publisher' import type { ModelAndParameter } from '@/app/components/app/configuration/debug/types' import type { FileUpload } from '@/app/components/base/features/types' +import type { PublishWorkflowParams } from '@/types/workflow' import { produce } from 'immer' import * as React from 'react' import { useCallback, useState } from 'react' @@ -13,7 +14,7 @@ import { SupportUploadFileTypes } from '@/app/components/workflow/types' import { Resolution } from '@/types/app' type Props = Omit & { - onPublish?: (modelAndParameter?: ModelAndParameter, features?: any) => Promise | any 
+ onPublish?: (params?: ModelAndParameter | PublishWorkflowParams, features?: any) => Promise | any publishedConfig?: any resetAppConfig?: () => void } @@ -62,8 +63,8 @@ const FeaturesWrappedAppPublisher = (props: Props) => { setRestoreConfirmOpen(false) }, [featuresStore, props]) - const handlePublish = useCallback((modelAndParameter?: ModelAndParameter) => { - return props.onPublish?.(modelAndParameter, features) + const handlePublish = useCallback((params?: ModelAndParameter | PublishWorkflowParams) => { + return props.onPublish?.(params, features) }, [features, props]) return ( diff --git a/web/app/components/app/app-publisher/index.tsx b/web/app/components/app/app-publisher/index.tsx index 0a026a680b..06bbad24a4 100644 --- a/web/app/components/app/app-publisher/index.tsx +++ b/web/app/components/app/app-publisher/index.tsx @@ -1,5 +1,7 @@ import type { ModelAndParameter } from '../configuration/debug/types' +import type { CollaborationUpdate } from '@/app/components/workflow/collaboration/types/collaboration' import type { InputVar, Variable } from '@/app/components/workflow/types' +import type { InstalledApp } from '@/models/explore' import type { I18nKeysByPrefix } from '@/types/i18n' import type { PublishWorkflowParams } from '@/types/workflow' import { @@ -18,6 +20,7 @@ import { useKeyPress } from 'ahooks' import { memo, useCallback, + useContext, useEffect, useMemo, useState, @@ -35,6 +38,9 @@ import { } from '@/app/components/base/portal-to-follow-elem' import UpgradeBtn from '@/app/components/billing/upgrade-btn' import WorkflowToolConfigureButton from '@/app/components/tools/workflow-tool/configure-button' +import { collaborationManager } from '@/app/components/workflow/collaboration/core/collaboration-manager' +import { webSocketClient } from '@/app/components/workflow/collaboration/core/websocket-manager' +import { WorkflowContext } from '@/app/components/workflow/context' import { appDefaultIconBackground } from '@/config' import { 
useGlobalPublicStore } from '@/context/global-public-context' import { useAsyncWindowOpen } from '@/hooks/use-async-window-open' @@ -43,6 +49,8 @@ import { AccessMode } from '@/models/access-control' import { useAppWhiteListSubjects, useGetUserCanAccessApp } from '@/service/access-control' import { fetchAppDetailDirect } from '@/service/apps' import { fetchInstalledAppList } from '@/service/explore' +import { useInvalidateAppWorkflow } from '@/service/use-workflow' +import { fetchPublishedWorkflow } from '@/service/workflow' import { AppModeEnum } from '@/types/app' import { basePath } from '@/utils/var' import Divider from '../../base/divider' @@ -56,6 +64,10 @@ import SuggestedAction from './suggested-action' type AccessModeLabel = I18nKeysByPrefix<'app', 'accessControlDialog.accessItems.'> +type InstalledAppsResponse = { + installed_apps?: InstalledApp[] +} + const ACCESS_MODE_MAP: Record = { [AccessMode.ORGANIZATION]: { label: 'organization', @@ -102,8 +114,8 @@ export type AppPublisherProps = { debugWithMultipleModel?: boolean multipleModelConfigs?: ModelAndParameter[] /** modelAndParameter is passed when debugWithMultipleModel is true */ - onPublish?: (params?: any) => Promise | any - onRestore?: () => Promise | any + onPublish?: (params?: ModelAndParameter | PublishWorkflowParams) => Promise | void + onRestore?: () => Promise | void onToggle?: (state: boolean) => void crossAxisOffset?: number toolPublished?: boolean @@ -146,6 +158,7 @@ const AppPublisher = ({ const [isAppAccessSet, setIsAppAccessSet] = useState(true) const [embeddingModalOpen, setEmbeddingModalOpen] = useState(false) + const workflowStore = useContext(WorkflowContext) const appDetail = useAppStore(state => state.appDetail) const setAppDetail = useAppStore(s => s.setAppDetail) const systemFeatures = useGlobalPublicStore(s => s.systemFeatures) @@ -158,6 +171,7 @@ const AppPublisher = ({ const { data: userCanAccessApp, isLoading: isGettingUserCanAccessApp, refetch } = useGetUserCanAccessApp({ 
appId: appDetail?.id, enabled: false }) const { data: appAccessSubjects, isLoading: isGettingAppWhiteListSubjects } = useAppWhiteListSubjects(appDetail?.id, open && systemFeatures.webapp_auth.enabled && appDetail?.access_mode === AccessMode.SPECIFIC_GROUPS_MEMBERS) + const invalidateAppWorkflow = useInvalidateAppWorkflow() const openAsyncWindow = useAsyncWindowOpen() const noAccessPermission = useMemo(() => systemFeatures.webapp_auth.enabled && appDetail && appDetail.access_mode !== AccessMode.EXTERNAL_MEMBERS && !userCanAccessApp?.result, [systemFeatures, appDetail, userCanAccessApp]) @@ -193,12 +207,39 @@ const AppPublisher = ({ try { await onPublish?.(params) setPublished(true) + + const appId = appDetail?.id + const socket = appId ? webSocketClient.getSocket(appId) : null + console.warn('[app-publisher] publish success', { + appId, + hasSocket: Boolean(socket), + }) + if (appId) + invalidateAppWorkflow(appId) + else + console.warn('[app-publisher] missing appId, skip workflow invalidate and socket emit') + if (socket) { + const timestamp = Date.now() + socket.emit('collaboration_event', { + type: 'app_publish_update', + data: { + action: 'published', + timestamp, + }, + timestamp, + }) + } + else if (appId) { + console.warn('[app-publisher] socket not ready, skip collaboration_event emit', { appId }) + } + trackEvent('app_published_time', { action_mode: 'app', app_id: appDetail?.id, app_name: appDetail?.name }) } - catch { + catch (error) { + console.warn('[app-publisher] publish failed', error) setPublished(false) } - }, [appDetail, onPublish]) + }, [appDetail, onPublish, invalidateAppWorkflow]) const handleRestore = useCallback(async () => { try { @@ -227,9 +268,10 @@ const AppPublisher = ({ await openAsyncWindow(async () => { if (!appDetail?.id) throw new Error('App not found') - const { installed_apps }: any = await fetchInstalledAppList(appDetail?.id) || {} - if (installed_apps?.length > 0) - return `${basePath}/explore/installed/${installed_apps[0].id}` + 
const response = (await fetchInstalledAppList(appDetail?.id)) as InstalledAppsResponse + const installedApps = response?.installed_apps + if (installedApps?.length) + return `${basePath}/explore/installed/${installedApps[0].id}` throw new Error('No app found in Explore') }, { onError: (err) => { @@ -257,6 +299,29 @@ const AppPublisher = ({ handlePublish() }, { exactMatch: true, useCapture: true }) + useEffect(() => { + const appId = appDetail?.id + if (!appId) + return + + const unsubscribe = collaborationManager.onAppPublishUpdate((update: CollaborationUpdate) => { + const action = typeof update.data.action === 'string' ? update.data.action : undefined + if (action === 'published') { + invalidateAppWorkflow(appId) + fetchPublishedWorkflow(`/apps/${appId}/workflows/publish`) + .then((publishedWorkflow) => { + if (publishedWorkflow?.created_at) + workflowStore?.getState().setPublishedAt(publishedWorkflow.created_at) + }) + .catch((error) => { + console.warn('[app-publisher] refresh published workflow failed', error) + }) + } + }) + + return unsubscribe + }, [appDetail?.id, invalidateAppWorkflow, workflowStore]) + const hasPublishedVersion = !!publishedAt const workflowToolDisabled = !hasPublishedVersion || !workflowToolAvailable const workflowToolMessage = workflowToolDisabled ? 
t('common.workflowAsToolDisabledHint', { ns: 'workflow' }) : undefined diff --git a/web/app/components/app/configuration/index.tsx b/web/app/components/app/configuration/index.tsx index 39f03855d8..46a0191be2 100644 --- a/web/app/components/app/configuration/index.tsx +++ b/web/app/components/app/configuration/index.tsx @@ -18,6 +18,7 @@ import type { TextToSpeechConfig, } from '@/models/debug' import type { ModelConfig as BackendModelConfig, UserInputFormItem, VisionSettings } from '@/types/app' +import type { PublishWorkflowParams } from '@/types/workflow' import { CodeBracketIcon } from '@heroicons/react/20/solid' import { useBoolean, useGetState } from 'ahooks' import { clone } from 'es-toolkit/object' @@ -760,7 +761,8 @@ const Configuration: FC = () => { else { return promptEmpty } })() const contextVarEmpty = mode === AppModeEnum.COMPLETION && dataSets.length > 0 && !hasSetContextVar - const onPublish = async (modelAndParameter?: ModelAndParameter, features?: FeaturesData) => { + const onPublish = async (params?: ModelAndParameter | PublishWorkflowParams, features?: FeaturesData) => { + const modelAndParameter = params && 'model' in params ? 
params : undefined const modelId = modelAndParameter?.model || modelConfig.model_id const promptTemplate = modelConfig.configs.prompt_template const promptVariables = modelConfig.configs.prompt_variables diff --git a/web/app/components/apps/app-card.tsx b/web/app/components/apps/app-card.tsx index 23b134e5a7..b096d681e2 100644 --- a/web/app/components/apps/app-card.tsx +++ b/web/app/components/apps/app-card.tsx @@ -5,6 +5,7 @@ import type { HtmlContentProps } from '@/app/components/base/popover' import type { Tag } from '@/app/components/base/tag-management/constant' import type { CreateAppModalProps } from '@/app/components/explore/create-app-modal' import type { EnvironmentVariable } from '@/app/components/workflow/types' +import type { WorkflowOnlineUser } from '@/models/app' import type { App } from '@/types/app' import { RiBuildingLine, RiGlobalLine, RiLockLine, RiMoreFill, RiVerifiedBadgeLine } from '@remixicon/react' import dynamic from 'next/dynamic' @@ -20,6 +21,7 @@ import CustomPopover from '@/app/components/base/popover' import TagSelector from '@/app/components/base/tag-management/selector' import Toast, { ToastContext } from '@/app/components/base/toast' import Tooltip from '@/app/components/base/tooltip' +import { UserAvatarList } from '@/app/components/base/user-avatar-list' import { NEED_REFRESH_APP_LIST_KEY } from '@/config' import { useAppContext } from '@/context/app-context' import { useGlobalPublicStore } from '@/context/global-public-context' @@ -58,9 +60,10 @@ const AccessControl = dynamic(() => import('@/app/components/app/app-access-cont export type AppCardProps = { app: App onRefresh?: () => void + onlineUsers?: WorkflowOnlineUser[] } -const AppCard = ({ app, onRefresh }: AppCardProps) => { +const AppCard = ({ app, onRefresh, onlineUsers = [] }: AppCardProps) => { const { t } = useTranslation() const { notify } = useContext(ToastContext) const systemFeatures = useGlobalPublicStore(s => s.systemFeatures) @@ -362,6 +365,19 @@ const AppCard 
= ({ app, onRefresh }: AppCardProps) => { return `${t('segment.editedAt', { ns: 'datasetDocuments' })} ${timeText}` }, [app.updated_at, app.created_at]) + const onlineUserAvatars = useMemo(() => { + if (!onlineUsers.length) + return [] + + return onlineUsers + .map(user => ({ + id: user.user_id || user.sid || '', + name: user.username || 'User', + avatar_url: user.avatar || undefined, + })) + .filter(user => !!user.id) + }, [onlineUsers]) + return ( <>
{ )}
+
+        {onlineUserAvatars.length > 0 && (
+          <UserAvatarList users={onlineUserAvatars} />
+        )}
+
({ }))

 // Mock config
-vi.mock('@/config', () => ({
-  NEED_REFRESH_APP_LIST_KEY: 'needRefreshAppList',
-}))
+vi.mock('@/config', async (importOriginal) => {
+  const actual = await importOriginal()
+  return {
+    ...actual,
+    NEED_REFRESH_APP_LIST_KEY: 'needRefreshAppList',
+  }
+})

 // Mock pay hook
 vi.mock('@/hooks/use-pay', () => ({
@@ -234,6 +239,21 @@ beforeAll(() => {
   } as unknown as typeof IntersectionObserver
 })

+const renderList = () => {
+  const queryClient = new QueryClient({
+    defaultOptions: {
+      queries: {
+        retry: false,
+      },
+    },
+  })
+  return render(
+    <QueryClientProvider client={queryClient}>
+      <List />
+    </QueryClientProvider>,
+  )
+}
+
 describe('List', () => {
   beforeEach(() => {
     vi.clearAllMocks()
   })
@@ -260,13 +280,13 @@ describe('List', () => {
   describe('Rendering', () => {
     it('should render without crashing', () => {
-      render(<List />)
+      renderList()

       // Tab slider renders app type tabs
       expect(screen.getByText('app.types.all')).toBeInTheDocument()
     })

     it('should render tab slider with all app types', () => {
-      render(<List />)
+      renderList()

       expect(screen.getByText('app.types.all')).toBeInTheDocument()
       expect(screen.getByText('app.types.workflow')).toBeInTheDocument()
@@ -277,48 +297,48 @@
     })

     it('should render search input', () => {
-      render(<List />)
+      renderList()

       // Input component renders a searchbox
       expect(screen.getByRole('textbox')).toBeInTheDocument()
     })

     it('should render tag filter', () => {
-      render(<List />)
+      renderList()

       // Tag filter renders with placeholder text
       expect(screen.getByText('common.tag.placeholder')).toBeInTheDocument()
     })

     it('should render created by me checkbox', () => {
-      render(<List />)
+      renderList()

       expect(screen.getByText('app.showMyCreatedAppsOnly')).toBeInTheDocument()
     })

     it('should render app cards when apps exist', () => {
-      render(<List />)
+      renderList()

       expect(screen.getByTestId('app-card-app-1')).toBeInTheDocument()
       expect(screen.getByTestId('app-card-app-2')).toBeInTheDocument()
     })

     it('should render new app card for editors', () => {
-      render(<List />)
+      renderList()
expect(screen.getByTestId('new-app-card')).toBeInTheDocument() }) it('should render footer when branding is disabled', () => { - render() + renderList() expect(screen.getByTestId('footer')).toBeInTheDocument() }) it('should render drop DSL hint for editors', () => { - render() + renderList() expect(screen.getByText('app.newApp.dropDSLToCreateApp')).toBeInTheDocument() }) }) describe('Tab Navigation', () => { it('should call setActiveTab when tab is clicked', () => { - render() + renderList() fireEvent.click(screen.getByText('app.types.workflow')) @@ -326,7 +346,7 @@ describe('List', () => { }) it('should call setActiveTab for all tab', () => { - render() + renderList() fireEvent.click(screen.getByText('app.types.all')) @@ -336,12 +356,12 @@ describe('List', () => { describe('Search Functionality', () => { it('should render search input field', () => { - render() + renderList() expect(screen.getByRole('textbox')).toBeInTheDocument() }) it('should handle search input change', () => { - render() + renderList() const input = screen.getByRole('textbox') fireEvent.change(input, { target: { value: 'test search' } }) @@ -350,7 +370,7 @@ describe('List', () => { }) it('should handle search input interaction', () => { - render() + renderList() const input = screen.getByRole('textbox') expect(input).toBeInTheDocument() @@ -360,7 +380,7 @@ describe('List', () => { // Set initial keywords to make clear button visible mockQueryState.keywords = 'existing search' - render() + renderList() // Find and click clear button (Input component uses .group class for clear icon container) const clearButton = document.querySelector('.group') @@ -375,12 +395,12 @@ describe('List', () => { describe('Tag Filter', () => { it('should render tag filter component', () => { - render() + renderList() expect(screen.getByText('common.tag.placeholder')).toBeInTheDocument() }) it('should render tag filter with placeholder', () => { - render() + renderList() // Tag filter is rendered 
expect(screen.getByText('common.tag.placeholder')).toBeInTheDocument() @@ -389,12 +409,12 @@ describe('List', () => { describe('Created By Me Filter', () => { it('should render checkbox with correct label', () => { - render() + renderList() expect(screen.getByText('app.showMyCreatedAppsOnly')).toBeInTheDocument() }) it('should handle checkbox change', () => { - render() + renderList() // Checkbox component uses data-testid="checkbox-{id}" // CheckboxWithLabel doesn't pass testId, so id is undefined @@ -409,7 +429,7 @@ describe('List', () => { it('should not render new app card for non-editors', () => { mockIsCurrentWorkspaceEditor.mockReturnValue(false) - render() + renderList() expect(screen.queryByTestId('new-app-card')).not.toBeInTheDocument() }) @@ -417,7 +437,7 @@ describe('List', () => { it('should not render drop DSL hint for non-editors', () => { mockIsCurrentWorkspaceEditor.mockReturnValue(false) - render() + renderList() expect(screen.queryByText(/drop dsl file to create app/i)).not.toBeInTheDocument() }) @@ -427,7 +447,7 @@ describe('List', () => { it('should redirect dataset operators to datasets page', () => { mockIsCurrentWorkspaceDatasetOperator.mockReturnValue(true) - render() + renderList() expect(mockReplace).toHaveBeenCalledWith('/datasets') }) @@ -437,7 +457,7 @@ describe('List', () => { it('should call refetch when refresh key is set in localStorage', () => { localStorage.setItem('needRefreshAppList', '1') - render() + renderList() expect(mockRefetch).toHaveBeenCalled() expect(localStorage.getItem('needRefreshAppList')).toBeNull() @@ -446,22 +466,23 @@ describe('List', () => { describe('Edge Cases', () => { it('should handle multiple renders without issues', () => { - const { rerender } = render() + const { unmount } = renderList() expect(screen.getByText('app.types.all')).toBeInTheDocument() - rerender() + unmount() + renderList() expect(screen.getByText('app.types.all')).toBeInTheDocument() }) it('should render app cards correctly', () => { - 
render() + renderList() expect(screen.getByText('Test App 1')).toBeInTheDocument() expect(screen.getByText('Test App 2')).toBeInTheDocument() }) it('should render with all filter options visible', () => { - render() + renderList() expect(screen.getByRole('textbox')).toBeInTheDocument() expect(screen.getByText('common.tag.placeholder')).toBeInTheDocument() @@ -471,14 +492,14 @@ describe('List', () => { describe('Dragging State', () => { it('should show drop hint when DSL feature is enabled for editors', () => { - render() + renderList() expect(screen.getByText('app.newApp.dropDSLToCreateApp')).toBeInTheDocument() }) }) describe('App Type Tabs', () => { it('should render all app type tabs', () => { - render() + renderList() expect(screen.getByText('app.types.all')).toBeInTheDocument() expect(screen.getByText('app.types.workflow')).toBeInTheDocument() @@ -489,7 +510,7 @@ describe('List', () => { }) it('should call setActiveTab for each app type', () => { - render() + renderList() const appTypeTexts = [ { mode: AppModeEnum.WORKFLOW, text: 'app.types.workflow' }, @@ -508,7 +529,7 @@ describe('List', () => { describe('Search and Filter Integration', () => { it('should display search input with correct attributes', () => { - render() + renderList() const input = screen.getByRole('textbox') expect(input).toBeInTheDocument() @@ -516,13 +537,13 @@ describe('List', () => { }) it('should have tag filter component', () => { - render() + renderList() expect(screen.getByText('common.tag.placeholder')).toBeInTheDocument() }) it('should display created by me label', () => { - render() + renderList() expect(screen.getByText('app.showMyCreatedAppsOnly')).toBeInTheDocument() }) @@ -530,14 +551,14 @@ describe('List', () => { describe('App List Display', () => { it('should display all app cards from data', () => { - render() + renderList() expect(screen.getByTestId('app-card-app-1')).toBeInTheDocument() expect(screen.getByTestId('app-card-app-2')).toBeInTheDocument() }) it('should 
display app names correctly', () => { - render() + renderList() expect(screen.getByText('Test App 1')).toBeInTheDocument() expect(screen.getByText('Test App 2')).toBeInTheDocument() @@ -546,7 +567,7 @@ describe('List', () => { describe('Footer Visibility', () => { it('should render footer when branding is disabled', () => { - render() + renderList() expect(screen.getByTestId('footer')).toBeInTheDocument() }) @@ -558,14 +579,14 @@ describe('List', () => { describe('Additional Coverage', () => { it('should render dragging state overlay when dragging', () => { mockDragging = true - const { container } = render() + const { container } = renderList() // Component should render successfully with dragging state expect(container).toBeInTheDocument() }) it('should handle app mode filter in query params', () => { - render() + renderList() const workflowTab = screen.getByText('app.types.workflow') fireEvent.click(workflowTab) @@ -574,7 +595,7 @@ describe('List', () => { }) it('should render new app card for editors', () => { - render() + renderList() expect(screen.getByTestId('new-app-card')).toBeInTheDocument() }) @@ -582,7 +603,7 @@ describe('List', () => { describe('DSL File Drop', () => { it('should handle DSL file drop and show modal', () => { - render() + renderList() // Simulate DSL file drop via the callback const mockFile = new File(['test content'], 'test.yml', { type: 'application/yaml' }) @@ -596,7 +617,7 @@ describe('List', () => { }) it('should close DSL modal when onClose is called', () => { - render() + renderList() // Open modal via DSL file drop const mockFile = new File(['test content'], 'test.yml', { type: 'application/yaml' }) @@ -614,7 +635,7 @@ describe('List', () => { }) it('should close DSL modal and refetch when onSuccess is called', () => { - render() + renderList() // Open modal via DSL file drop const mockFile = new File(['test content'], 'test.yml', { type: 'application/yaml' }) @@ -637,7 +658,7 @@ describe('List', () => { describe('Tag Filter 
Change', () => { it('should handle tag filter value change', () => { vi.useFakeTimers() - render() + renderList() // TagFilter component is rendered expect(screen.getByTestId('tag-filter')).toBeInTheDocument() @@ -661,7 +682,7 @@ describe('List', () => { it('should handle empty tag filter selection', () => { vi.useFakeTimers() - render() + renderList() // Trigger tag filter change with empty array act(() => { @@ -683,7 +704,7 @@ describe('List', () => { describe('Infinite Scroll', () => { it('should call fetchNextPage when intersection observer triggers', () => { mockServiceState.hasNextPage = true - render() + renderList() // Simulate intersection if (intersectionCallback) { @@ -700,7 +721,7 @@ describe('List', () => { it('should not call fetchNextPage when not intersecting', () => { mockServiceState.hasNextPage = true - render() + renderList() // Simulate non-intersection if (intersectionCallback) { @@ -718,7 +739,7 @@ describe('List', () => { it('should not call fetchNextPage when loading', () => { mockServiceState.hasNextPage = true mockServiceState.isLoading = true - render() + renderList() if (intersectionCallback) { act(() => { @@ -736,7 +757,7 @@ describe('List', () => { describe('Error State', () => { it('should handle error state in useEffect', () => { mockServiceState.error = new Error('Test error') - const { container } = render() + const { container } = renderList() // Component should still render expect(container).toBeInTheDocument() diff --git a/web/app/components/apps/list.tsx b/web/app/components/apps/list.tsx index 6255fa50e5..d3fc916c10 100644 --- a/web/app/components/apps/list.tsx +++ b/web/app/components/apps/list.tsx @@ -9,13 +9,14 @@ import { RiMessage3Line, RiRobot3Line, } from '@remixicon/react' +import { useQuery } from '@tanstack/react-query' import { useDebounceFn } from 'ahooks' import dynamic from 'next/dynamic' import { useRouter, } from 'next/navigation' import { parseAsString, useQueryState } from 'nuqs' -import { useCallback, 
useEffect, useRef, useState } from 'react'
+import { useCallback, useEffect, useMemo, useRef, useState } from 'react'
 import { useTranslation } from 'react-i18next'
 import { useContext } from 'use-context-selector'
 import Input from '@/app/components/base/input'
@@ -29,7 +30,7 @@
 import { useAppContext } from '@/context/app-context'
 import { useGlobalPublicStore } from '@/context/global-public-context'
 import { CheckModal } from '@/hooks/use-pay'
 import { DSLImportStatus } from '@/models/app'
-import { importAppBundle } from '@/service/apps'
+import { fetchWorkflowOnlineUsers, importAppBundle } from '@/service/apps'
 import { useInfiniteAppList } from '@/service/use-apps'
 import { AppModeEnum } from '@/types/app'
 import { getRedirection } from '@/utils/app-redirection'
@@ -156,6 +157,37 @@ const List: FC = ({
     refetch,
   } = useInfiniteAppList(appListQueryParams, { enabled: !isCurrentWorkspaceDatasetOperator })

+  const apps = useMemo(() => data?.pages?.flatMap(page => page.data) ?? [], [data])
+
+  const workflowIds = useMemo(() => {
+    const ids = new Set<string>()
+    apps.forEach((appItem) => {
+      const workflowId = appItem.id
+      if (!workflowId)
+        return
+
+      if (appItem.mode === 'workflow' || appItem.mode === 'advanced-chat')
+        ids.add(workflowId)
+    })
+    return Array.from(ids)
+  }, [apps])
+
+  const { data: onlineUsersByWorkflow = {}, refetch: refreshOnlineUsers } = useQuery({
+    queryKey: ['apps', 'workflow-online-users', workflowIds],
+    queryFn: () => fetchWorkflowOnlineUsers({ workflowIds }),
+    enabled: workflowIds.length > 0,
+  })
+
+  useEffect(() => {
+    const timer = window.setInterval(() => {
+      refetch()
+      if (workflowIds.length)
+        refreshOnlineUsers()
+    }, 10000)
+
+    return () => window.clearInterval(timer)
+  }, [workflowIds, refetch, refreshOnlineUsers])
+
   useEffect(() => {
     if (controlRefreshList > 0) {
       refetch()
@@ -294,7 +326,7 @@ const List: FC = ({
     if (hasAnyApp) {
       return pages.flatMap(({ data: apps }) => apps).map(app => (
-        <AppCard key={app.id} app={app} onRefresh={refetch} />
+        <AppCard key={app.id} app={app} onRefresh={refetch} onlineUsers={onlineUsersByWorkflow[app.id]} />
       ))
     }

diff --git
a/web/app/components/base/avatar/index.spec.tsx b/web/app/components/base/avatar/index.spec.tsx index e85690880b..a5b5f9e0fa 100644 --- a/web/app/components/base/avatar/index.spec.tsx +++ b/web/app/components/base/avatar/index.spec.tsx @@ -35,12 +35,14 @@ describe('Avatar', () => { it.each([ { size: undefined, expected: '30px', label: 'default (30px)' }, { size: 50, expected: '50px', label: 'custom (50px)' }, - ])('should apply $label size to img element', ({ size, expected }) => { + ])('should apply $label size to avatar container', ({ size, expected }) => { const props = { name: 'Test', avatar: 'https://example.com/avatar.jpg', size } render() - expect(screen.getByRole('img')).toHaveStyle({ + const img = screen.getByRole('img') + const wrapper = img.parentElement as HTMLElement + expect(wrapper).toHaveStyle({ width: expected, height: expected, fontSize: expected, @@ -60,7 +62,7 @@ describe('Avatar', () => { }) describe('className prop', () => { - it('should merge className with default avatar classes on img', () => { + it('should merge className with default avatar classes on container', () => { const props = { name: 'Test', avatar: 'https://example.com/avatar.jpg', @@ -70,8 +72,9 @@ describe('Avatar', () => { render() const img = screen.getByRole('img') - expect(img).toHaveClass('custom-class') - expect(img).toHaveClass('shrink-0', 'flex', 'items-center', 'rounded-full', 'bg-primary-600') + const wrapper = img.parentElement as HTMLElement + expect(wrapper).toHaveClass('custom-class') + expect(wrapper).toHaveClass('shrink-0', 'flex', 'items-center', 'rounded-full', 'bg-primary-600') }) it('should merge className with default avatar classes on fallback div', () => { @@ -277,10 +280,11 @@ describe('Avatar', () => { render() const img = screen.getByRole('img') + const wrapper = img.parentElement as HTMLElement expect(img).toHaveAttribute('alt', 'Test User') expect(img).toHaveAttribute('src', 'https://example.com/avatar.jpg') - expect(img).toHaveStyle({ width: 
'64px', height: '64px' }) - expect(img).toHaveClass('custom-avatar') + expect(wrapper).toHaveStyle({ width: '64px', height: '64px' }) + expect(wrapper).toHaveClass('custom-avatar') // Trigger load to verify onError callback fireEvent.load(img) diff --git a/web/app/components/base/avatar/index.tsx b/web/app/components/base/avatar/index.tsx index bf7fa060ef..c432cd9278 100644 --- a/web/app/components/base/avatar/index.tsx +++ b/web/app/components/base/avatar/index.tsx @@ -9,6 +9,7 @@ export type AvatarProps = { className?: string textClassName?: string onError?: (x: boolean) => void + backgroundColor?: string } const Avatar = ({ name, @@ -17,9 +18,18 @@ const Avatar = ({ className, textClassName, onError, + backgroundColor, }: AvatarProps) => { - const avatarClassName = 'shrink-0 flex items-center rounded-full bg-primary-600' - const style = { width: `${size}px`, height: `${size}px`, fontSize: `${size}px`, lineHeight: `${size}px` } + const avatarClassName = backgroundColor + ? 'shrink-0 flex items-center rounded-full' + : 'shrink-0 flex items-center rounded-full bg-primary-600' + const style = { + width: `${size}px`, + height: `${size}px`, + fontSize: `${size}px`, + lineHeight: `${size}px`, + ...(backgroundColor && !avatar ? { backgroundColor } : {}), + } const [imgError, setImgError] = useState(false) const handleError = () => { @@ -35,14 +45,18 @@ const Avatar = ({ if (avatar && !imgError) { return ( - {name} onError?.(false)} - /> + > + {name} onError?.(false)} + /> + ) } diff --git a/web/app/components/base/content-dialog/index.tsx b/web/app/components/base/content-dialog/index.tsx index 0bcde7ce8a..168efc91f3 100644 --- a/web/app/components/base/content-dialog/index.tsx +++ b/web/app/components/base/content-dialog/index.tsx @@ -19,7 +19,7 @@ const ContentDialog = ({
+ + + diff --git a/web/app/components/base/icons/assets/public/other/comment.svg b/web/app/components/base/icons/assets/public/other/comment.svg new file mode 100644 index 0000000000..7f48f22fbd --- /dev/null +++ b/web/app/components/base/icons/assets/public/other/comment.svg @@ -0,0 +1,3 @@ + + + \ No newline at end of file diff --git a/web/app/components/base/icons/src/public/common/EnterKey.json b/web/app/components/base/icons/src/public/common/EnterKey.json new file mode 100644 index 0000000000..1c2b0903ee --- /dev/null +++ b/web/app/components/base/icons/src/public/common/EnterKey.json @@ -0,0 +1,36 @@ +{ + "icon": { + "type": "element", + "isRootNode": true, + "name": "svg", + "attributes": { + "width": "16", + "height": "16", + "viewBox": "0 0 16 16", + "fill": "none", + "xmlns": "http://www.w3.org/2000/svg" + }, + "children": [ + { + "type": "element", + "name": "path", + "attributes": { + "d": "M0 4C0 1.79086 1.79086 0 4 0H12C14.2091 0 16 1.79086 16 4V12C16 14.2091 14.2091 16 12 16H4C1.79086 16 0 14.2091 0 12V4Z", + "fill": "white", + "fill-opacity": "0.12" + }, + "children": [] + }, + { + "type": "element", + "name": "path", + "attributes": { + "d": "M3.42756 8.7358V7.62784H10.8764C11.2003 7.62784 11.4957 7.5483 11.7628 7.3892C12.0298 7.23011 12.2415 7.01705 12.3977 6.75C12.5568 6.48295 12.6364 6.1875 12.6364 5.86364C12.6364 5.53977 12.5568 5.24574 12.3977 4.98153C12.2386 4.71449 12.0256 4.50142 11.7585 4.34233C11.4943 4.18324 11.2003 4.10369 10.8764 4.10369H10.3991V3H10.8764C11.4048 3 11.8849 3.12926 12.3168 3.38778C12.7486 3.64631 13.0938 3.99148 13.3523 4.4233C13.6108 4.85511 13.7401 5.33523 13.7401 5.86364C13.7401 6.25852 13.6648 6.62926 13.5142 6.97585C13.3665 7.32244 13.1619 7.62784 12.9006 7.89205C12.6392 8.15625 12.3352 8.36364 11.9886 8.5142C11.642 8.66193 11.2713 8.7358 10.8764 8.7358H3.42756ZM6.16761 12.0554L2.29403 8.18182L6.16761 4.30824L6.9304 5.07102L3.81534 8.18182L6.9304 11.2926L6.16761 12.0554Z", + "fill": "white" + }, + "children": [] + 
} + ] + }, + "name": "EnterKey" +} diff --git a/web/app/components/base/icons/src/public/common/EnterKey.tsx b/web/app/components/base/icons/src/public/common/EnterKey.tsx new file mode 100644 index 0000000000..2bf2d4fb19 --- /dev/null +++ b/web/app/components/base/icons/src/public/common/EnterKey.tsx @@ -0,0 +1,20 @@ +// GENERATE BY script +// DON NOT EDIT IT MANUALLY + +import type { IconData } from '@/app/components/base/icons/IconBase' +import * as React from 'react' +import IconBase from '@/app/components/base/icons/IconBase' +import data from './EnterKey.json' + +const Icon = ( + { + ref, + ...props + }: React.SVGProps & { + ref?: React.RefObject> + }, +) => + +Icon.displayName = 'EnterKey' + +export default Icon diff --git a/web/app/components/base/icons/src/public/common/index.ts b/web/app/components/base/icons/src/public/common/index.ts index c19ab569fa..5708fa4dd6 100644 --- a/web/app/components/base/icons/src/public/common/index.ts +++ b/web/app/components/base/icons/src/public/common/index.ts @@ -1,6 +1,7 @@ export { default as D } from './D' export { default as DiagonalDividingLine } from './DiagonalDividingLine' export { default as Dify } from './Dify' +export { default as EnterKey } from './EnterKey' export { default as Gdpr } from './Gdpr' export { default as Github } from './Github' export { default as Highlight } from './Highlight' diff --git a/web/app/components/base/icons/src/public/other/Comment.json b/web/app/components/base/icons/src/public/other/Comment.json new file mode 100644 index 0000000000..780ddc4cf5 --- /dev/null +++ b/web/app/components/base/icons/src/public/other/Comment.json @@ -0,0 +1,26 @@ +{ + "icon": { + "type": "element", + "isRootNode": true, + "name": "svg", + "attributes": { + "xmlns": "http://www.w3.org/2000/svg", + "width": "14", + "height": "12", + "viewBox": "0 0 14 12", + "fill": "none" + }, + "children": [ + { + "type": "element", + "name": "path", + "attributes": { + "d": "M12.3334 4C12.3334 2.52725 11.1395 1.33333 
9.66671 1.33333H4.33337C2.86062 1.33333 1.66671 2.52724 1.66671 4V10.6667H9.66671C11.1395 10.6667 12.3334 9.47274 12.3334 8V4ZM7.66671 6.66667V8H4.33337V6.66667H7.66671ZM9.66671 4V5.33333H4.33337V4H9.66671ZM13.6667 8C13.6667 10.2091 11.8758 12 9.66671 12H0.333374V4C0.333374 1.79086 2.12424 0 4.33337 0H9.66671C11.8758 0 13.6667 1.79086 13.6667 4V8Z", + "fill": "currentColor" + }, + "children": [] + } + ] + }, + "name": "Comment" +} diff --git a/web/app/components/base/icons/src/public/other/Comment.tsx b/web/app/components/base/icons/src/public/other/Comment.tsx new file mode 100644 index 0000000000..887754e48e --- /dev/null +++ b/web/app/components/base/icons/src/public/other/Comment.tsx @@ -0,0 +1,20 @@ +// GENERATE BY script +// DON NOT EDIT IT MANUALLY + +import type { IconData } from '@/app/components/base/icons/IconBase' +import * as React from 'react' +import IconBase from '@/app/components/base/icons/IconBase' +import data from './Comment.json' + +const Icon = ( + { + ref, + ...props + }: React.SVGProps & { + ref?: React.RefObject> + }, +) => + +Icon.displayName = 'Comment' + +export default Icon diff --git a/web/app/components/base/icons/src/public/other/index.ts b/web/app/components/base/icons/src/public/other/index.ts index 71887f11cc..84ecd616ed 100644 --- a/web/app/components/base/icons/src/public/other/index.ts +++ b/web/app/components/base/icons/src/public/other/index.ts @@ -1,3 +1,4 @@ +export { default as Comment } from './Comment' export { default as DefaultToolIcon } from './DefaultToolIcon' export { default as Icon3Dots } from './Icon3Dots' export { default as Message3Fill } from './Message3Fill' diff --git a/web/app/components/base/prompt-editor/index.tsx b/web/app/components/base/prompt-editor/index.tsx index e104bed7d7..02de31a39a 100644 --- a/web/app/components/base/prompt-editor/index.tsx +++ b/web/app/components/base/prompt-editor/index.tsx @@ -18,6 +18,7 @@ import type { } from './types' import { CodeNode } from '@lexical/code' import { 
LexicalComposer } from '@lexical/react/LexicalComposer' +import { useLexicalComposerContext } from '@lexical/react/LexicalComposerContext' import { ContentEditable } from '@lexical/react/LexicalContentEditable' import { LexicalErrorBoundary } from '@lexical/react/LexicalErrorBoundary' import { HistoryPlugin } from '@lexical/react/LexicalHistoryPlugin' @@ -93,6 +94,29 @@ import { } from './plugins/workflow-variable-block' import { textToEditorState } from './utils' +const ValueSyncPlugin: FC<{ value?: string }> = ({ value }) => { + const [editor] = useLexicalComposerContext() + + useEffect(() => { + if (value === undefined) + return + + const incomingValue = value ?? '' + const shouldUpdate = editor.getEditorState().read(() => { + const currentText = $getRoot().getChildren().map(node => node.getTextContent()).join('\n') + return currentText !== incomingValue + }) + + if (!shouldUpdate) + return + + const editorState = editor.parseEditorState(textToEditorState(incomingValue)) + editor.setEditorState(editorState) + }, [editor, value]) + + return null +} + export type PromptEditorProps = { instanceId?: string compact?: boolean @@ -357,6 +381,7 @@ const PromptEditor: FC = ({ ) } + diff --git a/web/app/components/base/user-avatar-list/index.tsx b/web/app/components/base/user-avatar-list/index.tsx new file mode 100644 index 0000000000..f16ef3b7ff --- /dev/null +++ b/web/app/components/base/user-avatar-list/index.tsx @@ -0,0 +1,79 @@ +import type { FC } from 'react' +import { memo } from 'react' +import Avatar from '@/app/components/base/avatar' +import { getUserColor } from '@/app/components/workflow/collaboration/utils/user-color' +import { useAppContext } from '@/context/app-context' + +type User = { + id: string + name: string + avatar_url?: string | null +} + +type UserAvatarListProps = { + users: User[] + maxVisible?: number + size?: number + className?: string + showCount?: boolean +} + +export const UserAvatarList: FC = memo(({ + users, + maxVisible = 3, + size = 
24, + className = '', + showCount = true, +}) => { + const { userProfile } = useAppContext() + if (!users.length) + return null + + const shouldShowCount = showCount && users.length > maxVisible + const actualMaxVisible = shouldShowCount ? Math.max(1, maxVisible - 1) : maxVisible + const visibleUsers = users.slice(0, actualMaxVisible) + const remainingCount = users.length - actualMaxVisible + + const currentUserId = userProfile?.id + + return ( +
+ {visibleUsers.map((user, index) => { + const isCurrentUser = user.id === currentUserId + const userColor = isCurrentUser ? undefined : getUserColor(user.id) + return ( +
+ +
+ ) + }, + + )} + {shouldShowCount && remainingCount > 0 && ( +
+ + + {remainingCount} +
+ )} +
+ ) +}) + +UserAvatarList.displayName = 'UserAvatarList' diff --git a/web/app/components/header/account-setting/model-provider-page/system-model-selector/index.tsx b/web/app/components/header/account-setting/model-provider-page/system-model-selector/index.tsx index 29c71e04fc..5cf46eb463 100644 --- a/web/app/components/header/account-setting/model-provider-page/system-model-selector/index.tsx +++ b/web/app/components/header/account-setting/model-provider-page/system-model-selector/index.tsx @@ -144,7 +144,7 @@ const SystemModel: FC = ({ {t('modelProvider.systemModelSettings', { ns: 'common' })} - +
diff --git a/web/app/components/plugins/plugin-detail-panel/datasource-action-list.tsx b/web/app/components/plugins/plugin-detail-panel/datasource-action-list.tsx index b6fec61540..c17559190d 100644 --- a/web/app/components/plugins/plugin-detail-panel/datasource-action-list.tsx +++ b/web/app/components/plugins/plugin-detail-panel/datasource-action-list.tsx @@ -76,7 +76,7 @@ const ActionList = ({ className='w-full' onClick={() => setShowSettingAuth(true)} disabled={!isCurrentWorkspaceManager} - >{t('workflow.nodes.tool.authorize')} + >{t('nodes.tool.authorize', { ns: 'workflow' })} )} */}
{/*
diff --git a/web/app/components/rag-pipeline/hooks/use-available-nodes-meta-data.ts b/web/app/components/rag-pipeline/hooks/use-available-nodes-meta-data.ts index f653d2dcd2..b3a155bcc5 100644 --- a/web/app/components/rag-pipeline/hooks/use-available-nodes-meta-data.ts +++ b/web/app/components/rag-pipeline/hooks/use-available-nodes-meta-data.ts @@ -24,7 +24,7 @@ export const useAvailableNodesMetaData = () => { }, knowledgeBaseDefault, dataSourceEmptyDefault, - ], []) + ] as AvailableNodesMetaData['nodes'], []) const helpLinkUri = useMemo(() => docLink( '/use-dify/knowledge/knowledge-pipeline/knowledge-pipeline-orchestration', diff --git a/web/app/components/rag-pipeline/hooks/use-get-run-and-trace-url.ts b/web/app/components/rag-pipeline/hooks/use-get-run-and-trace-url.ts index f9988b60f8..8ad0861689 100644 --- a/web/app/components/rag-pipeline/hooks/use-get-run-and-trace-url.ts +++ b/web/app/components/rag-pipeline/hooks/use-get-run-and-trace-url.ts @@ -3,8 +3,14 @@ import { useWorkflowStore } from '@/app/components/workflow/store' export const useGetRunAndTraceUrl = () => { const workflowStore = useWorkflowStore() - const getWorkflowRunAndTraceUrl = useCallback((runId: string) => { + const getWorkflowRunAndTraceUrl = useCallback((runId?: string) => { const { pipelineId } = workflowStore.getState() + if (!pipelineId || !runId) { + return { + runUrl: '', + traceUrl: '', + } + } return { runUrl: `/rag/pipelines/${pipelineId}/workflow-runs/${runId}`, diff --git a/web/app/components/tools/mcp/mcp-server-modal.tsx b/web/app/components/tools/mcp/mcp-server-modal.tsx index 1128183c0b..0b03b21b2d 100644 --- a/web/app/components/tools/mcp/mcp-server-modal.tsx +++ b/web/app/components/tools/mcp/mcp-server-modal.tsx @@ -10,6 +10,7 @@ import Divider from '@/app/components/base/divider' import Modal from '@/app/components/base/modal' import Textarea from '@/app/components/base/textarea' import MCPServerParamItem from '@/app/components/tools/mcp/mcp-server-param-item' +import { 
webSocketClient } from '@/app/components/workflow/collaboration/core/websocket-manager' import { useCreateMCPServer, useInvalidateMCPServerDetail, @@ -59,6 +60,22 @@ const MCPServerModal = ({ return res } + const emitMcpServerUpdate = (action: 'created' | 'updated') => { + const socket = webSocketClient.getSocket(appID) + if (!socket) + return + + const timestamp = Date.now() + socket.emit('collaboration_event', { + type: 'mcp_server_update', + data: { + action, + timestamp, + }, + timestamp, + }) + } + const submit = async () => { if (!data) { const payload: any = { @@ -71,6 +88,7 @@ const MCPServerModal = ({ await createMCPServer(payload) invalidateMCPServerDetail(appID) + emitMcpServerUpdate('created') onHide() } else { @@ -83,6 +101,7 @@ const MCPServerModal = ({ payload.description = description await updateMCPServer(payload) invalidateMCPServerDetail(appID) + emitMcpServerUpdate('updated') onHide() } } @@ -92,6 +111,7 @@ const MCPServerModal = ({ isShow={show} onClose={onHide} className={cn('relative !max-w-[520px] !p-0')} + highPriority >
diff --git a/web/app/components/tools/mcp/mcp-service-card.tsx b/web/app/components/tools/mcp/mcp-service-card.tsx index d0c0a1664a..e05e3642dd 100644 --- a/web/app/components/tools/mcp/mcp-service-card.tsx +++ b/web/app/components/tools/mcp/mcp-service-card.tsx @@ -1,6 +1,8 @@ 'use client' +import type { CollaborationUpdate } from '@/app/components/workflow/collaboration/types/collaboration' +import type { InputVar } from '@/app/components/workflow/types' import type { AppDetailResponse } from '@/models/app' -import type { AppSSO } from '@/types/app' +import type { AppSSO, ModelConfig, UserInputFormItem } from '@/types/app' import { RiEditLine, RiLoopLeftLine } from '@remixicon/react' import * as React from 'react' import { useEffect, useMemo, useState } from 'react' @@ -16,6 +18,8 @@ import Switch from '@/app/components/base/switch' import Tooltip from '@/app/components/base/tooltip' import Indicator from '@/app/components/header/indicator' import MCPServerModal from '@/app/components/tools/mcp/mcp-server-modal' +import { collaborationManager } from '@/app/components/workflow/collaboration/core/collaboration-manager' +import { webSocketClient } from '@/app/components/workflow/collaboration/core/websocket-manager' import { BlockEnum } from '@/app/components/workflow/types' import { useAppContext } from '@/context/app-context' import { useDocLink } from '@/context/i18n' @@ -36,6 +40,16 @@ export type IAppCardProps = { triggerModeMessage?: React.ReactNode // display-only message explaining the trigger restriction } +type BasicAppConfig = Partial & { + updated_at?: number +} + +type McpServerParam = { + label: string + variable: string + type: string +} + function MCPServiceCard({ appInfo, triggerModeDisabled = false, @@ -54,16 +68,16 @@ function MCPServiceCard({ const isAdvancedApp = appInfo?.mode === AppModeEnum.ADVANCED_CHAT || appInfo?.mode === AppModeEnum.WORKFLOW const isBasicApp = !isAdvancedApp const { data: currentWorkflow } = useAppWorkflow(isAdvancedApp ? 
appId : '') - const [basicAppConfig, setBasicAppConfig] = useState({}) - const basicAppInputForm = useMemo(() => { - if (!isBasicApp || !basicAppConfig?.user_input_form) + const [basicAppConfig, setBasicAppConfig] = useState<BasicAppConfig>({}) + const basicAppInputForm = useMemo<McpServerParam[]>(() => { + if (!isBasicApp || !basicAppConfig.user_input_form) return [] - return basicAppConfig.user_input_form.map((item: any) => { - const type = Object.keys(item)[0] - return { - ...item[type], - type: type || 'text-input', - } + return basicAppConfig.user_input_form.map((item: UserInputFormItem) => { + if ('text-input' in item) + return { label: item['text-input'].label, variable: item['text-input'].variable, type: 'text-input' } + if ('select' in item) + return { label: item.select.label, variable: item.select.variable, type: 'select' } + return { label: item.paragraph.label, variable: item.paragraph.variable, type: 'paragraph' } }) }, [basicAppConfig.user_input_form, isBasicApp]) useEffect(() => { @@ -90,12 +104,22 @@ function MCPServiceCard({ const [activated, setActivated] = useState(serverActivated) - const latestParams = useMemo(() => { + const latestParams = useMemo<McpServerParam[]>(() => { if (isAdvancedApp) { if (!currentWorkflow?.graph) return [] - const startNode = currentWorkflow?.graph.nodes.find(node => node.data.type === BlockEnum.Start) as any - return startNode?.data.variables as any[] || [] + const startNode = currentWorkflow?.graph.nodes.find(node => node.data.type === BlockEnum.Start) + const variables = (startNode?.data as { variables?: InputVar[] } | undefined)?.variables || [] + return variables.map((variable) => { + const label = typeof variable.label === 'string' + ?
variable.label + : (variable.label.variable || variable.label.nodeName) + return { + label, + variable: variable.variable, + type: variable.type, + } + }) } return basicAppInputForm }, [currentWorkflow, basicAppInputForm, isAdvancedApp]) @@ -103,6 +127,19 @@ function MCPServiceCard({ const onGenCode = async () => { await refreshMCPServerCode(detail?.id || '') invalidateMCPServerDetail(appId) + + // Emit collaboration event to notify other clients of MCP server changes + const socket = webSocketClient.getSocket(appId) + if (socket) { + socket.emit('collaboration_event', { + type: 'mcp_server_update', + data: { + action: 'codeRegenerated', + timestamp: Date.now(), + }, + timestamp: Date.now(), + }) + } } const onChangeStatus = async (state: boolean) => { @@ -132,6 +169,20 @@ function MCPServiceCard({ }) invalidateMCPServerDetail(appId) } + + // Emit collaboration event to notify other clients of MCP server status change + const socket = webSocketClient.getSocket(appId) + if (socket) { + socket.emit('collaboration_event', { + type: 'mcp_server_update', + data: { + action: 'statusChanged', + status: state ? 
'active' : 'inactive', + timestamp: Date.now(), + }, + timestamp: Date.now(), + }) + } } const handleServerModalHide = () => { @@ -144,6 +195,23 @@ function MCPServiceCard({ setActivated(serverActivated) }, [serverActivated]) + // Listen for collaborative MCP server updates from other clients + useEffect(() => { + if (!appId) + return + + const unsubscribe = collaborationManager.onMcpServerUpdate(async (_update: CollaborationUpdate) => { + try { + invalidateMCPServerDetail(appId) + } + catch (error) { + console.error('MCP server update failed:', error) + } + }) + + return unsubscribe + }, [appId, invalidateMCPServerDetail]) + if (!currentWorkflow && isAdvancedApp) return null diff --git a/web/app/components/workflow-app/components/workflow-header/features-trigger.spec.tsx b/web/app/components/workflow-app/components/workflow-header/features-trigger.spec.tsx index 724a39837b..8bc47ca89d 100644 --- a/web/app/components/workflow-app/components/workflow-header/features-trigger.spec.tsx +++ b/web/app/components/workflow-app/components/workflow-header/features-trigger.spec.tsx @@ -108,7 +108,7 @@ vi.mock('@/app/components/app/app-publisher', () => ({ -
diff --git a/web/app/components/workflow-app/components/workflow-header/features-trigger.tsx b/web/app/components/workflow-app/components/workflow-header/features-trigger.tsx index 466220b611..fdfb895e2c 100644 --- a/web/app/components/workflow-app/components/workflow-header/features-trigger.tsx +++ b/web/app/components/workflow-app/components/workflow-header/features-trigger.tsx @@ -1,3 +1,4 @@ +import type { ModelAndParameter } from '@/app/components/app/configuration/debug/types' import type { EndNodeType } from '@/app/components/workflow/nodes/end/types' import type { StartNodeType } from '@/app/components/workflow/nodes/start/types' import type { @@ -140,24 +141,38 @@ const FeaturesTrigger = () => { const needWarningNodes = useChecklist(nodes, edges) const updatePublishedWorkflow = useInvalidateAppWorkflow() - const onPublish = useCallback(async (params?: PublishWorkflowParams) => { + const onPublish = useCallback(async (params?: ModelAndParameter | PublishWorkflowParams) => { + const publishParams = params && 'title' in params ? params : undefined // First check if there are any items in the checklist // if (!validateBeforeRun()) // throw new Error('Checklist has unresolved items') if (needWarningNodes.length > 0) { + console.warn('[workflow-header] publish blocked by checklist', { + appId: appID, + warningCount: needWarningNodes.length, + }) notify({ type: 'error', message: t('panel.checklistTip', { ns: 'workflow' }) }) throw new Error('Checklist has unresolved items') } // Then perform the detailed validation if (await handleCheckBeforePublish()) { + console.warn('[workflow-header] publish start', { + appId: appID, + title: publishParams?.title ?? 
'', + }) const res = await publishWorkflow({ url: `/apps/${appID}/workflows/publish`, - title: params?.title || '', - releaseNotes: params?.releaseNotes || '', + title: publishParams?.title || '', + releaseNotes: publishParams?.releaseNotes || '', }) + console.warn('[workflow-header] publish response', { + appId: appID, + hasResponse: Boolean(res), + createdAt: res?.created_at, + }) if (res) { notify({ type: 'success', message: t('api.actionSuccess', { ns: 'common' }) }) updatePublishedWorkflow(appID!) diff --git a/web/app/components/workflow-app/components/workflow-main.tsx b/web/app/components/workflow-app/components/workflow-main.tsx index f890b2a4f7..828c7fab08 100644 --- a/web/app/components/workflow-app/components/workflow-main.tsx +++ b/web/app/components/workflow-app/components/workflow-main.tsx @@ -1,13 +1,26 @@ import type { ReactNode } from 'react' +import type { Features as FeaturesData } from '@/app/components/base/features/types' import type { WorkflowProps } from '@/app/components/workflow' +import type { CollaborationUpdate } from '@/app/components/workflow/collaboration/types/collaboration' +import type { Shape as HooksStoreShape } from '@/app/components/workflow/hooks-store/store' +import type { Edge, Node } from '@/app/components/workflow/types' +import type { FetchWorkflowDraftResponse } from '@/types/workflow' import { useCallback, + useEffect, useMemo, + useRef, } from 'react' +import { useReactFlow } from 'reactflow' import { useFeatures, useFeaturesStore } from '@/app/components/base/features/hooks' +import { FILE_EXTS } from '@/app/components/base/prompt-editor/constants' import { WorkflowWithInnerContext } from '@/app/components/workflow' +import { collaborationManager, useCollaboration } from '@/app/components/workflow/collaboration' +import { useWorkflowUpdate } from '@/app/components/workflow/hooks/use-workflow-interactions' import { MCPToolAvailabilityProvider } from 
'@/app/components/workflow/nodes/_base/components/mcp-tool-availability' -import { useWorkflowStore } from '@/app/components/workflow/store' +import { useStore, useWorkflowStore } from '@/app/components/workflow/store' +import { SupportUploadFileTypes } from '@/app/components/workflow/types' +import { fetchWorkflowDraft } from '@/service/workflow' import { useAvailableNodesMetaData, useConfigsMap, @@ -25,6 +38,7 @@ import WorkflowChildren from './workflow-children' type WorkflowMainProps = Pick & { headerLeftSlot?: ReactNode } +type WorkflowDataUpdatePayload = Pick const WorkflowMain = ({ nodes, edges, @@ -34,8 +48,48 @@ const WorkflowMain = ({ const sandboxEnabled = useFeatures(state => state.features.sandbox?.enabled) ?? false const featuresStore = useFeaturesStore() const workflowStore = useWorkflowStore() + const appId = useStore(s => s.appId) + const containerRef = useRef(null) + const reactFlow = useReactFlow() - const handleWorkflowDataUpdate = useCallback((payload: any) => { + const reactFlowStore = useMemo(() => ({ + getState: () => ({ + getNodes: () => reactFlow.getNodes(), + setNodes: (nodesToSet: Node[]) => reactFlow.setNodes(nodesToSet), + getEdges: () => reactFlow.getEdges(), + setEdges: (edgesToSet: Edge[]) => reactFlow.setEdges(edgesToSet), + }), + }), [reactFlow]) + const { + startCursorTracking, + stopCursorTracking, + onlineUsers, + cursors, + isConnected, + isEnabled: isCollaborationEnabled, + } = useCollaboration(appId || '', reactFlowStore) + const myUserId = useMemo( + () => (isCollaborationEnabled && isConnected ? 
'current-user' : null), + [isCollaborationEnabled, isConnected], + ) + + const filteredCursors = Object.fromEntries( + Object.entries(cursors).filter(([userId]) => userId !== myUserId), + ) + + useEffect(() => { + if (!isCollaborationEnabled) + return + + if (containerRef.current) + startCursorTracking(containerRef as React.RefObject, reactFlow) + + return () => { + stopCursorTracking() + } + }, [startCursorTracking, stopCursorTracking, reactFlow, isCollaborationEnabled]) + + const handleWorkflowDataUpdate = useCallback((payload: WorkflowDataUpdatePayload) => { const { features, conversation_variables, @@ -44,7 +98,33 @@ const WorkflowMain = ({ if (features && featuresStore) { const { setFeatures } = featuresStore.getState() - setFeatures(features) + const transformedFeatures: FeaturesData = { + file: { + image: { + enabled: !!features.file_upload?.image?.enabled, + number_limits: features.file_upload?.image?.number_limits || 3, + transfer_methods: features.file_upload?.image?.transfer_methods || ['local_file', 'remote_url'], + }, + enabled: !!(features.file_upload?.enabled || features.file_upload?.image?.enabled), + allowed_file_types: features.file_upload?.allowed_file_types || [SupportUploadFileTypes.image], + allowed_file_extensions: features.file_upload?.allowed_file_extensions || FILE_EXTS[SupportUploadFileTypes.image].map(ext => `.${ext}`), + allowed_file_upload_methods: features.file_upload?.allowed_file_upload_methods || features.file_upload?.image?.transfer_methods || ['local_file', 'remote_url'], + number_limits: features.file_upload?.number_limits || features.file_upload?.image?.number_limits || 3, + }, + opening: { + enabled: !!features.opening_statement, + opening_statement: features.opening_statement, + suggested_questions: features.suggested_questions, + }, + suggested: features.suggested_questions_after_answer || { enabled: false }, + speech2text: features.speech_to_text || { enabled: false }, + text2speech: features.text_to_speech || { enabled: 
false }, + citation: features.retriever_resource || { enabled: false }, + moderation: features.sensitive_word_avoidance || { enabled: false }, + annotationReply: features.annotation_reply || { enabled: false }, + } + + setFeatures(transformedFeatures) } if (conversation_variables) { const { setConversationVariables } = workflowStore.getState() @@ -61,6 +141,7 @@ const WorkflowMain = ({ syncWorkflowDraftWhenPageClose, } = useNodesSyncDraft() const { handleRefreshWorkflowDraft } = useWorkflowRefreshDraft() + const { handleUpdateWorkflowCanvas } = useWorkflowUpdate() const { handleBackupDraft, handleLoadBackupDraft, @@ -68,6 +149,64 @@ const WorkflowMain = ({ handleRun, handleStopRun, } = useWorkflowRun() + + useEffect(() => { + if (!appId || !isCollaborationEnabled) + return + + const unsubscribe = collaborationManager.onVarsAndFeaturesUpdate(async (_update: CollaborationUpdate) => { + try { + const response = await fetchWorkflowDraft(`/apps/${appId}/workflows/draft`) + handleWorkflowDataUpdate(response) + } + catch (error) { + console.error('workflow vars and features update failed:', error) + } + }) + + return unsubscribe + }, [appId, handleWorkflowDataUpdate, isCollaborationEnabled]) + + // Listen for workflow updates from other users + useEffect(() => { + if (!appId || !isCollaborationEnabled) + return + + const unsubscribe = collaborationManager.onWorkflowUpdate(async () => { + try { + const response = await fetchWorkflowDraft(`/apps/${appId}/workflows/draft`) + + // Handle features, variables etc. 
+ handleWorkflowDataUpdate(response) + + // Update workflow canvas (nodes, edges, viewport) + if (response.graph) { + handleUpdateWorkflowCanvas({ + nodes: response.graph.nodes || [], + edges: response.graph.edges || [], + viewport: response.graph.viewport || { x: 0, y: 0, zoom: 1 }, + }) + } + } + catch (error) { + console.error('Failed to fetch updated workflow:', error) + } + }) + + return unsubscribe + }, [appId, handleWorkflowDataUpdate, handleUpdateWorkflowCanvas, isCollaborationEnabled]) + + // Listen for sync requests from other users (only processed by leader) + useEffect(() => { + if (!appId || !isCollaborationEnabled) + return + + const unsubscribe = collaborationManager.onSyncRequest(() => { + doSyncWorkflowDraft() + }) + + return unsubscribe + }, [appId, doSyncWorkflowDraft, isCollaborationEnabled]) const { handleStartWorkflowRun, handleWorkflowStartRunInChatflow, @@ -85,6 +224,7 @@ const WorkflowMain = ({ } = useDSL() const configsMap = useConfigsMap() + const { fetchInspectVars } = useSetWorkflowVarsWithValue({ ...configsMap, }) @@ -105,7 +245,7 @@ const WorkflowMain = ({ invalidateConversationVarValues, } = useInspectVarsCrud() - const hooksStore = useMemo(() => { + const hooksStore = useMemo<Partial<HooksStoreShape>>(() => { return { syncWorkflowDraftWhenPageClose, doSyncWorkflowDraft, @@ -182,17 +322,25 @@ const WorkflowMain = ({ ]) return ( - - - - - + + + + + +
) } diff --git a/web/app/components/workflow-app/components/workflow-panel.tsx b/web/app/components/workflow-app/components/workflow-panel.tsx index a1ed289f94..20099bb81d 100644 --- a/web/app/components/workflow-app/components/workflow-panel.tsx +++ b/web/app/components/workflow-app/components/workflow-panel.tsx @@ -7,6 +7,7 @@ import { import { useShallow } from 'zustand/react/shallow' import { useStore as useAppStore } from '@/app/components/app/store' import Panel from '@/app/components/workflow/panel' +import CommentsPanel from '@/app/components/workflow/panel/comments-panel' import { useStore } from '@/app/components/workflow/store' import { useIsChatMode, @@ -67,6 +68,7 @@ const WorkflowPanelOnRight = () => { const showDebugAndPreviewPanel = useStore(s => s.showDebugAndPreviewPanel) const showChatVariablePanel = useStore(s => s.showChatVariablePanel) const showGlobalVariablePanel = useStore(s => s.showGlobalVariablePanel) + const controlMode = useStore(s => s.controlMode) return ( <> @@ -100,6 +102,7 @@ const WorkflowPanelOnRight = () => { ) } + {controlMode === 'comment' && <CommentsPanel />} ) } diff --git a/web/app/components/workflow-app/hooks/use-available-nodes-meta-data.ts b/web/app/components/workflow-app/hooks/use-available-nodes-meta-data.ts index e43a58fd14..0a278be688 100644 --- a/web/app/components/workflow-app/hooks/use-available-nodes-meta-data.ts +++ b/web/app/components/workflow-app/hooks/use-available-nodes-meta-data.ts @@ -40,7 +40,7 @@ export const useAvailableNodesMetaData = () => { TriggerPluginDefault, ] ), - ], [isChatMode, startNodeMetaData]) + ] as AvailableNodesMetaData['nodes'], [isChatMode, startNodeMetaData]) const availableNodesMetaData = useMemo(() => { const toNodeDefaultBase = ( diff --git a/web/app/components/workflow-app/hooks/use-get-run-and-trace-url.ts b/web/app/components/workflow-app/hooks/use-get-run-and-trace-url.ts index 28bcd017f8..043ea25ed7 100644 --- a/web/app/components/workflow-app/hooks/use-get-run-and-trace-url.ts +++
b/web/app/components/workflow-app/hooks/use-get-run-and-trace-url.ts @@ -3,8 +3,14 @@ import { useWorkflowStore } from '@/app/components/workflow/store' export const useGetRunAndTraceUrl = () => { const workflowStore = useWorkflowStore() - const getWorkflowRunAndTraceUrl = useCallback((runId: string) => { + const getWorkflowRunAndTraceUrl = useCallback((runId?: string) => { const { appId } = workflowStore.getState() + if (!appId || !runId) { + return { + runUrl: '', + traceUrl: '', + } + } return { runUrl: `/apps/${appId}/workflow-runs/${runId}`, diff --git a/web/app/components/workflow-app/hooks/use-nodes-sync-draft.ts b/web/app/components/workflow-app/hooks/use-nodes-sync-draft.ts index fc9c6c0ef3..bba5cb8f27 100644 --- a/web/app/components/workflow-app/hooks/use-nodes-sync-draft.ts +++ b/web/app/components/workflow-app/hooks/use-nodes-sync-draft.ts @@ -1,11 +1,15 @@ +import type { WorkflowDraftFeaturesPayload } from '@/service/workflow' import { produce } from 'immer' +import { useParams } from 'next/navigation' import { useCallback } from 'react' import { useStoreApi } from 'reactflow' import { useFeaturesStore } from '@/app/components/base/features/hooks' +import { collaborationManager } from '@/app/components/workflow/collaboration/core/collaboration-manager' import { useSerialAsyncCallback } from '@/app/components/workflow/hooks/use-serial-async-callback' import { useNodesReadOnly } from '@/app/components/workflow/hooks/use-workflow' import { useWorkflowStore } from '@/app/components/workflow/store' import { API_PREFIX } from '@/config' +import { useGlobalPublicStore } from '@/context/global-public-context' import { syncWorkflowDraft } from '@/service/workflow' import { useWorkflowRefreshDraft } from '.' 
@@ -15,6 +19,8 @@ export const useNodesSyncDraft = () => { const featuresStore = useFeaturesStore() const { getNodesReadOnly } = useNodesReadOnly() const { handleRefreshWorkflowDraft } = useWorkflowRefreshDraft() + const params = useParams() + const isCollaborationEnabled = useGlobalPublicStore(s => s.systemFeatures.enable_collaboration_mode) const getPostParams = useCallback(() => { const { @@ -52,7 +58,16 @@ export const useNodesSyncDraft = () => { }) }) }) - const viewport = { x, y, zoom } + const featuresPayload: WorkflowDraftFeaturesPayload = { + opening_statement: features.opening?.enabled ? (features.opening?.opening_statement || '') : '', + suggested_questions: features.opening?.enabled ? (features.opening?.suggested_questions || []) : [], + suggested_questions_after_answer: features.suggested, + text_to_speech: features.text2speech, + speech_to_text: features.speech2text, + retriever_resource: features.citation, + sensitive_word_avoidance: features.moderation, + file_upload: features.file, + } return { url: `/apps/${appId}/workflows/draft`, @@ -60,34 +75,41 @@ export const useNodesSyncDraft = () => { graph: { nodes: producedNodes, edges: producedEdges, - viewport, - }, - features: { - opening_statement: features.opening?.enabled ? (features.opening?.opening_statement || '') : '', - suggested_questions: features.opening?.enabled ? 
(features.opening?.suggested_questions || []) : [], - suggested_questions_after_answer: features.suggested, - text_to_speech: features.text2speech, - speech_to_text: features.speech2text, - retriever_resource: features.citation, - sensitive_word_avoidance: features.moderation, - file_upload: features.file, - sandbox: features.sandbox, + viewport: { + x, + y, + zoom, + }, }, + features: featuresPayload, environment_variables: environmentVariables, conversation_variables: conversationVariables, hash: syncWorkflowDraftHash, + _is_collaborative: isCollaborationEnabled, }, } - }, [store, featuresStore, workflowStore]) + }, [store, featuresStore, workflowStore, isCollaborationEnabled]) const syncWorkflowDraftWhenPageClose = useCallback(() => { if (getNodesReadOnly()) return + + // Check leader status at sync time + const currentIsLeader = isCollaborationEnabled ? collaborationManager.getIsLeader() : true + + // Only allow leader to sync data + if (isCollaborationEnabled && !currentIsLeader) + return + const postParams = getPostParams() - if (postParams) - navigator.sendBeacon(`${API_PREFIX}${postParams.url}`, JSON.stringify(postParams.params)) - }, [getPostParams, getNodesReadOnly]) + if (postParams) { + navigator.sendBeacon( + `${API_PREFIX}/apps/${params.appId}/workflows/draft`, + JSON.stringify(postParams.params), + ) + } + }, [getPostParams, params.appId, getNodesReadOnly, isCollaborationEnabled]) const performSync = useCallback(async ( notRefreshWhenSyncError?: boolean, @@ -99,6 +121,17 @@ export const useNodesSyncDraft = () => { ) => { if (getNodesReadOnly()) return + + // Check leader status at sync time + const currentIsLeader = isCollaborationEnabled ? 
collaborationManager.getIsLeader() : true + + // If not leader, request the leader to sync + if (isCollaborationEnabled && !currentIsLeader) { + if (isCollaborationEnabled) + collaborationManager.emitSyncRequest() + callback?.onSettled?.() + return + } const postParams = getPostParams() if (postParams) { @@ -106,6 +139,7 @@ export const useNodesSyncDraft = () => { setSyncWorkflowDraftHash, setDraftUpdatedAt, } = workflowStore.getState() + try { const res = await syncWorkflowDraft({ ...postParams, @@ -128,7 +162,7 @@ export const useNodesSyncDraft = () => { callback?.onSettled?.() } } - }, [workflowStore, getPostParams, getNodesReadOnly, handleRefreshWorkflowDraft]) + }, [workflowStore, getPostParams, getNodesReadOnly, handleRefreshWorkflowDraft, isCollaborationEnabled]) const doSyncWorkflowDraft = useSerialAsyncCallback(performSync, getNodesReadOnly) const syncWorkflowDraftImmediately = useCallback(( diff --git a/web/app/components/workflow-app/index.tsx b/web/app/components/workflow-app/index.tsx index 0b721109d0..a5a3fbcac1 100644 --- a/web/app/components/workflow-app/index.tsx +++ b/web/app/components/workflow-app/index.tsx @@ -18,6 +18,7 @@ import { FeaturesProvider } from '@/app/components/base/features' import Loading from '@/app/components/base/loading' import { FILE_EXTS } from '@/app/components/base/prompt-editor/constants' import WorkflowWithDefaultContext from '@/app/components/workflow' +import { collaborationManager } from '@/app/components/workflow/collaboration/core/collaboration-manager' import { WorkflowContextProvider, } from '@/app/components/workflow/context' @@ -185,15 +186,20 @@ const WorkflowAppWithAdditionalContext = () => { }, [workflowStore]) const nodesData = useMemo(() => { - if (data) - return initialNodes(data.graph.nodes, data.graph.edges) - + if (data) { + const processedNodes = initialNodes(data.graph.nodes, data.graph.edges) + collaborationManager.setNodes([], processedNodes) + return processedNodes + } return [] }, [data]) - const 
edgesData = useMemo(() => { - if (data) - return initialEdges(data.graph.edges, data.graph.nodes) + const edgesData = useMemo(() => { + if (data) { + const processedEdges = initialEdges(data.graph.edges, data.graph.nodes) + collaborationManager.setEdges([], processedEdges) + return processedEdges + } return [] }, [data]) diff --git a/web/app/components/workflow/block-selector/index.tsx b/web/app/components/workflow/block-selector/index.tsx index 5b9d86d6d4..0475b6bfcf 100644 --- a/web/app/components/workflow/block-selector/index.tsx +++ b/web/app/components/workflow/block-selector/index.tsx @@ -35,7 +35,7 @@ const NodeSelectorWrapper = (props: NodeSelectorProps) => { return true }) - }, [availableNodesMetaData?.nodes]) + }, [availableNodesMetaData?.nodes]) as NodeSelectorProps['blocks'] return ( = ({ candidateNode, }) => { - const store = useStoreApi() const reactflow = useReactFlow() const workflowStore = useWorkflowStore() const mousePosition = useStore(s => s.mousePosition) @@ -41,15 +40,12 @@ const CandidateNodeMain: FC = ({ const { saveStateToHistory } = useWorkflowHistory() const { handleSyncWorkflowDraft } = useNodesSyncDraft() const autoGenerateWebhookUrl = useAutoGenerateWebhookUrl() + const collaborativeWorkflow = useCollaborativeWorkflow() useEventListener('click', (e) => { e.preventDefault() - const { - getNodes, - setNodes, - } = store.getState() const { screenToFlowPosition } = reactflow - const nodes = getNodes() + const { nodes, setNodes } = collaborativeWorkflow.getState() const { x, y } = screenToFlowPosition({ x: mousePosition.pageX, y: mousePosition.pageY }) const newNodes = produce(nodes, (draft) => { draft.push({ diff --git a/web/app/components/workflow/collaboration/components/user-cursors.tsx b/web/app/components/workflow/collaboration/components/user-cursors.tsx new file mode 100644 index 0000000000..488de1530d --- /dev/null +++ b/web/app/components/workflow/collaboration/components/user-cursors.tsx @@ -0,0 +1,78 @@ +import type { FC } from 
'react' +import type { CursorPosition, OnlineUser } from '@/app/components/workflow/collaboration/types' +import { useViewport } from 'reactflow' +import { getUserColor } from '../utils/user-color' + +type UserCursorsProps = { + cursors: Record<string, CursorPosition> + myUserId: string | null + onlineUsers: OnlineUser[] +} + +const UserCursors: FC<UserCursorsProps> = ({ + cursors, + myUserId, + onlineUsers, +}) => { + const viewport = useViewport() + + const convertToScreenCoordinates = (cursor: CursorPosition) => { + // Convert world coordinates to screen coordinates using current viewport + const screenX = cursor.x * viewport.zoom + viewport.x + const screenY = cursor.y * viewport.zoom + viewport.y + + return { x: screenX, y: screenY } + } + return ( + <> + {Object.entries(cursors || {}).map(([userId, cursor]) => { + if (userId === myUserId) + return null + + const userInfo = onlineUsers.find(user => user.user_id === userId) + const userName = userInfo?.username || `User ${userId.slice(-4)}` + const userColor = getUserColor(userId) + const screenPos = convertToScreenCoordinates(cursor) + + return ( +
+ + + + +
+ {userName} +
+
+ ) + })} + + ) +} + +export default UserCursors diff --git a/web/app/components/workflow/collaboration/core/__tests__/collaboration-manager.merge-behavior.test.ts b/web/app/components/workflow/collaboration/core/__tests__/collaboration-manager.merge-behavior.test.ts new file mode 100644 index 0000000000..93d1c6b746 --- /dev/null +++ b/web/app/components/workflow/collaboration/core/__tests__/collaboration-manager.merge-behavior.test.ts @@ -0,0 +1,331 @@ +import type { LoroMap } from 'loro-crdt' +import type { Node } from '@/app/components/workflow/types' +import { LoroDoc } from 'loro-crdt' +import { BlockEnum } from '@/app/components/workflow/types' +import { CollaborationManager } from '../collaboration-manager' + +const NODE_ID = 'node-1' +const LLM_NODE_ID = 'llm-node' +const PARAM_NODE_ID = 'parameter-node' + +type WorkflowVariable = { + variable: string + label: string + type: string + required: boolean + default: string + max_length: number + placeholder: string + options: string[] + hint: string +} + +type PromptTemplateItem = { + id: string + role: string + text: string +} + +type ParameterItem = { + description: string + name: string + required: boolean + type: string +} + +type StartNodeData = { + variables: WorkflowVariable[] +} + +type LLMNodeData = { + model: { + mode: string + name: string + provider: string + completion_params: { + temperature: number + } + } + context: { + enabled: boolean + variable_selector: string[] + } + vision: { + enabled: boolean + } + prompt_template: PromptTemplateItem[] +} + +type ParameterExtractorNodeData = { + model: { + mode: string + name: string + provider: string + completion_params: { + temperature: number + } + } + parameters: ParameterItem[] + query: unknown[] + reasoning_mode: string + vision: { + enabled: boolean + } +} + +type CollaborationManagerInternals = { + doc: LoroDoc + nodesMap: LoroMap + edgesMap: LoroMap + syncNodes: (oldNodes: Node[], newNodes: Node[]) => void +} + +const createNode = (variables: 
string[]): Node<StartNodeData> => ({ + id: NODE_ID, + type: 'custom', + position: { x: 0, y: 0 }, + data: { + type: BlockEnum.Start, + title: 'Start', + desc: '', + variables: variables.map(name => ({ + variable: name, + label: name, + type: 'text-input', + required: true, + default: '', + max_length: 48, + placeholder: '', + options: [], + hint: '', + })), + }, +}) + +const createLLMNode = (templates: PromptTemplateItem[]): Node<LLMNodeData> => ({ + id: LLM_NODE_ID, + type: 'custom', + position: { x: 200, y: 200 }, + data: { + type: BlockEnum.LLM, + title: 'LLM', + desc: '', + selected: false, + model: { + mode: 'chat', + name: 'gemini-2.5-pro', + provider: 'langgenius/gemini/google', + completion_params: { + temperature: 0.7, + }, + }, + context: { + enabled: false, + variable_selector: [], + }, + vision: { + enabled: false, + }, + prompt_template: templates, + }, +}) + +const createParameterExtractorNode = (parameters: ParameterItem[]): Node<ParameterExtractorNodeData> => ({ + id: PARAM_NODE_ID, + type: 'custom', + position: { x: 400, y: 120 }, + data: { + type: BlockEnum.ParameterExtractor, + title: 'ParameterExtractor', + desc: '', + selected: true, + model: { + mode: 'chat', + name: '', + provider: '', + completion_params: { + temperature: 0.7, + }, + }, + query: [], + reasoning_mode: 'prompt', + parameters, + vision: { + enabled: false, + }, + }, +}) + +const getManagerInternals = (manager: CollaborationManager): CollaborationManagerInternals => + manager as unknown as CollaborationManagerInternals + +const getManager = (doc: LoroDoc) => { + const manager = new CollaborationManager() + const internals = getManagerInternals(manager) + internals.doc = doc + internals.nodesMap = doc.getMap('nodes') + internals.edgesMap = doc.getMap('edges') + return manager +} + +const deepClone = <T>(value: T): T => JSON.parse(JSON.stringify(value)) + +const syncNodes = (manager: CollaborationManager, previous: Node[], next: Node[]) => { + const internals = getManagerInternals(manager) + internals.syncNodes(previous, next) +} + +const
exportNodes = (manager: CollaborationManager) => manager.getNodes() + +describe('Loro merge behavior smoke test', () => { + it('inspects concurrent edits after merge', () => { + const docA = new LoroDoc() + const managerA = getManager(docA) + syncNodes(managerA, [], [createNode(['a'])]) + + const snapshot = docA.export({ mode: 'snapshot' }) + + const docB = LoroDoc.fromSnapshot(snapshot) + const managerB = getManager(docB) + + syncNodes(managerA, [createNode(['a'])], [createNode(['a', 'b'])]) + syncNodes(managerB, [createNode(['a'])], [createNode(['a', 'c'])]) + + const updateForA = docB.export({ mode: 'update', from: docA.version() }) + docA.import(updateForA) + + const updateForB = docA.export({ mode: 'update', from: docB.version() }) + docB.import(updateForB) + + const finalA = exportNodes(managerA) + const finalB = exportNodes(managerB) + expect(finalA.length).toBe(1) + expect(finalB.length).toBe(1) + }) + + it('merges prompt template insertions and edits across replicas', () => { + const baseTemplate = [ + { + id: 'system-1', + role: 'system', + text: 'base instruction', + }, + ] + + const docA = new LoroDoc() + const managerA = getManager(docA) + syncNodes(managerA, [], [createLLMNode(deepClone(baseTemplate))]) + + const snapshot = docA.export({ mode: 'snapshot' }) + const docB = LoroDoc.fromSnapshot(snapshot) + const managerB = getManager(docB) + + const additionTemplate = [ + ...baseTemplate, + { + id: 'user-1', + role: 'user', + text: 'hello from docA', + }, + ] + syncNodes(managerA, [createLLMNode(deepClone(baseTemplate))], [createLLMNode(deepClone(additionTemplate))]) + + const editedTemplate = [ + { + id: 'system-1', + role: 'system', + text: 'updated by docB', + }, + ] + syncNodes(managerB, [createLLMNode(deepClone(baseTemplate))], [createLLMNode(deepClone(editedTemplate))]) + + const updateForA = docB.export({ mode: 'update', from: docA.version() }) + docA.import(updateForA) + + const updateForB = docA.export({ mode: 'update', from: docB.version() }) 
+ docB.import(updateForB) + + const finalA = exportNodes(managerA).find(node => node.id === LLM_NODE_ID) as Node<LLMNodeData> | undefined + const finalB = exportNodes(managerB).find(node => node.id === LLM_NODE_ID) as Node<LLMNodeData> | undefined + + expect(finalA).toBeDefined() + expect(finalB).toBeDefined() + + const expectedTemplates = [ + { + id: 'system-1', + role: 'system', + text: 'updated by docB', + }, + { + id: 'user-1', + role: 'user', + text: 'hello from docA', + }, + ] + + expect(finalA!.data.prompt_template).toEqual(expectedTemplates) + expect(finalB!.data.prompt_template).toEqual(expectedTemplates) + }) + + it('converges when parameter lists are edited concurrently', () => { + const baseParameters = [ + { description: 'bb', name: 'aa', required: false, type: 'string' }, + { description: 'dd', name: 'cc', required: false, type: 'string' }, + ] + + const docA = new LoroDoc() + const managerA = getManager(docA) + syncNodes(managerA, [], [createParameterExtractorNode(deepClone(baseParameters))]) + + const snapshot = docA.export({ mode: 'snapshot' }) + const docB = LoroDoc.fromSnapshot(snapshot) + const managerB = getManager(docB) + + const docAUpdate = [ + { description: 'bb updated by A', name: 'aa', required: true, type: 'string' }, + { description: 'dd', name: 'cc', required: false, type: 'string' }, + { description: 'new from A', name: 'ee', required: false, type: 'number' }, + ] + syncNodes( + managerA, + [createParameterExtractorNode(deepClone(baseParameters))], + [createParameterExtractorNode(deepClone(docAUpdate))], + ) + + const docBUpdate = [ + { description: 'bb', name: 'aa', required: false, type: 'string' }, + { description: 'dd updated by B', name: 'cc', required: true, type: 'string' }, + ] + syncNodes( + managerB, + [createParameterExtractorNode(deepClone(baseParameters))], + [createParameterExtractorNode(deepClone(docBUpdate))], + ) + + const updateForA = docB.export({ mode: 'update', from: docA.version() }) + docA.import(updateForA) + + const updateForB =
docA.export({ mode: 'update', from: docB.version() }) + docB.import(updateForB) + + const finalA = exportNodes(managerA).find(node => node.id === PARAM_NODE_ID) as + | Node + | undefined + const finalB = exportNodes(managerB).find(node => node.id === PARAM_NODE_ID) as + | Node + | undefined + + expect(finalA).toBeDefined() + expect(finalB).toBeDefined() + + const expectedParameters = [ + { description: 'bb updated by A', name: 'aa', required: true, type: 'string' }, + { description: 'dd updated by B', name: 'cc', required: true, type: 'string' }, + { description: 'new from A', name: 'ee', required: false, type: 'number' }, + ] + + expect(finalA!.data.parameters).toEqual(expectedParameters) + expect(finalB!.data.parameters).toEqual(expectedParameters) + }) +}) diff --git a/web/app/components/workflow/collaboration/core/__tests__/collaboration-manager.test.ts b/web/app/components/workflow/collaboration/core/__tests__/collaboration-manager.test.ts new file mode 100644 index 0000000000..1728bcad55 --- /dev/null +++ b/web/app/components/workflow/collaboration/core/__tests__/collaboration-manager.test.ts @@ -0,0 +1,763 @@ +import type { LoroMap } from 'loro-crdt' +import type { + NodePanelPresenceMap, + NodePanelPresenceUser, +} from '@/app/components/workflow/collaboration/types/collaboration' +import type { CommonNodeType, Edge, Node } from '@/app/components/workflow/types' +import { LoroDoc } from 'loro-crdt' +import { Position } from 'reactflow' +import { CollaborationManager } from '@/app/components/workflow/collaboration/core/collaboration-manager' +import { BlockEnum } from '@/app/components/workflow/types' + +const NODE_ID = '1760342909316' + +type WorkflowVariable = { + default: string + hint: string + label: string + max_length: number + options: string[] + placeholder: string + required: boolean + type: string + variable: string +} + +type PromptTemplateItem = { + id: string + role: string + text: string +} + +type ParameterItem = { + description: string + 
name: string
+  required: boolean
+  type: string
+}
+
+type NodePanelPresenceEventData = {
+  nodeId: string
+  action: 'open' | 'close'
+  user: NodePanelPresenceUser
+  clientId: string
+  timestamp?: number
+}
+
+type StartNodeData = {
+  variables: WorkflowVariable[]
+}
+
+type LLMNodeData = {
+  context: {
+    enabled: boolean
+    variable_selector: string[]
+  }
+  model: {
+    mode: string
+    name: string
+    provider: string
+    completion_params: {
+      temperature: number
+    }
+  }
+  prompt_template: PromptTemplateItem[]
+  vision: {
+    enabled: boolean
+  }
+}
+
+type ParameterExtractorNodeData = {
+  model: {
+    mode: string
+    name: string
+    provider: string
+    completion_params: {
+      temperature: number
+    }
+  }
+  parameters: ParameterItem[]
+  query: unknown[]
+  reasoning_mode: string
+  vision: {
+    enabled: boolean
+  }
+}
+
+type LLMNodeDataWithUnknownTemplate = Omit<LLMNodeData, 'prompt_template'> & {
+  prompt_template: unknown
+}
+
+type ManagerDoc = LoroDoc | { commit: () => void }
+
+type CollaborationManagerInternals = {
+  doc: ManagerDoc
+  nodesMap: LoroMap
+  edgesMap: LoroMap
+  syncNodes: (oldNodes: Node[], newNodes: Node[]) => void
+  syncEdges: (oldEdges: Edge[], newEdges: Edge[]) => void
+  applyNodePanelPresenceUpdate: (update: NodePanelPresenceEventData) => void
+  forceDisconnect: () => void
+  activeConnections: Set<string>
+  isUndoRedoInProgress: boolean
+}
+
+const createVariable = (name: string, overrides: Partial<WorkflowVariable> = {}): WorkflowVariable => ({
+  default: '',
+  hint: '',
+  label: name,
+  max_length: 48,
+  options: [],
+  placeholder: '',
+  required: true,
+  type: 'text-input',
+  variable: name,
+  ...overrides,
+})
+
+const deepClone = <T>(value: T): T => JSON.parse(JSON.stringify(value))
+
+const createNodeSnapshot = (variableNames: string[]): Node<StartNodeData> => ({
+  id: NODE_ID,
+  type: 'custom',
+  position: { x: 0, y: 24 },
+  positionAbsolute: { x: 0, y: 24 },
+  height: 88,
+  width: 242,
+  selected: true,
+  selectable: true,
+  draggable: true,
+  sourcePosition: Position.Right,
+  targetPosition: Position.Left,
+  data: {
+    selected: true,
+    title: '开始',
+    desc: '',
+    type: BlockEnum.Start,
+    variables: variableNames.map(name => createVariable(name)),
+  },
+})
+
+const LLM_NODE_ID = 'llm-node'
+const PARAM_NODE_ID = 'param-extractor-node'
+
+const createLLMNodeSnapshot = (promptTemplates: PromptTemplateItem[]): Node<LLMNodeData> => ({
+  id: LLM_NODE_ID,
+  type: 'custom',
+  position: { x: 200, y: 120 },
+  positionAbsolute: { x: 200, y: 120 },
+  height: 320,
+  width: 460,
+  selected: false,
+  selectable: true,
+  draggable: true,
+  sourcePosition: Position.Right,
+  targetPosition: Position.Left,
+  data: {
+    type: BlockEnum.LLM,
+    title: 'LLM',
+    desc: '',
+    selected: false,
+    context: {
+      enabled: false,
+      variable_selector: [],
+    },
+    model: {
+      mode: 'chat',
+      name: 'gemini-2.5-pro',
+      provider: 'langgenius/gemini/google',
+      completion_params: {
+        temperature: 0.7,
+      },
+    },
+    vision: {
+      enabled: false,
+    },
+    prompt_template: promptTemplates,
+  },
+})
+
+const createParameterExtractorNodeSnapshot = (parameters: ParameterItem[]): Node<ParameterExtractorNodeData> => ({
+  id: PARAM_NODE_ID,
+  type: 'custom',
+  position: { x: 420, y: 220 },
+  positionAbsolute: { x: 420, y: 220 },
+  height: 260,
+  width: 420,
+  selected: true,
+  selectable: true,
+  draggable: true,
+  sourcePosition: Position.Right,
+  targetPosition: Position.Left,
+  data: {
+    type: BlockEnum.ParameterExtractor,
+    title: '参数提取器',
+    desc: '',
+    selected: true,
+    model: {
+      mode: 'chat',
+      name: '',
+      provider: '',
+      completion_params: {
+        temperature: 0.7,
+      },
+    },
+    reasoning_mode: 'prompt',
+    parameters,
+    query: [],
+    vision: {
+      enabled: false,
+    },
+  },
+})
+
+const getVariables = (node: Node): string[] => {
+  const data = node.data as CommonNodeType<{ variables?: WorkflowVariable[] }>
+  const variables = data.variables ??
[] + return variables.map(item => item.variable) +} + +const getVariableObject = (node: Node, name: string): WorkflowVariable | undefined => { + const data = node.data as CommonNodeType<{ variables?: WorkflowVariable[] }> + const variables = data.variables ?? [] + return variables.find(item => item.variable === name) +} + +const getPromptTemplates = (node: Node): PromptTemplateItem[] => { + const data = node.data as CommonNodeType<{ prompt_template?: PromptTemplateItem[] }> + return data.prompt_template ?? [] +} + +const getParameters = (node: Node): ParameterItem[] => { + const data = node.data as CommonNodeType<{ parameters?: ParameterItem[] }> + return data.parameters ?? [] +} + +const getManagerInternals = (manager: CollaborationManager): CollaborationManagerInternals => + manager as unknown as CollaborationManagerInternals + +const setupManager = (): { manager: CollaborationManager, internals: CollaborationManagerInternals } => { + const manager = new CollaborationManager() + const doc = new LoroDoc() + const internals = getManagerInternals(manager) + internals.doc = doc + internals.nodesMap = doc.getMap('nodes') + internals.edgesMap = doc.getMap('edges') + return { manager, internals } +} + +describe('CollaborationManager syncNodes', () => { + let manager: CollaborationManager + let internals: CollaborationManagerInternals + + beforeEach(() => { + const setup = setupManager() + manager = setup.manager + internals = setup.internals + + const initialNode = createNodeSnapshot(['a']) + internals.syncNodes([], [deepClone(initialNode)]) + }) + + it('updates collaborators map when a single client adds a variable', () => { + const base = [createNodeSnapshot(['a'])] + const next = [createNodeSnapshot(['a', 'b'])] + + internals.syncNodes(base, next) + + const stored = (manager.getNodes() as Node[]).find(node => node.id === NODE_ID) + expect(stored).toBeDefined() + expect(getVariables(stored!)).toEqual(['a', 'b']) + }) + + it('applies the latest parallel additions 
derived from the same base snapshot', () => { + const base = [createNodeSnapshot(['a'])] + const userA = [createNodeSnapshot(['a', 'b'])] + const userB = [createNodeSnapshot(['a', 'c'])] + + internals.syncNodes(base, userA) + + const afterUserA = (manager.getNodes() as Node[]).find(node => node.id === NODE_ID) + expect(getVariables(afterUserA!)).toEqual(['a', 'b']) + + internals.syncNodes(base, userB) + + const finalNode = (manager.getNodes() as Node[]).find(node => node.id === NODE_ID) + const finalVariables = getVariables(finalNode!) + + expect(finalVariables).toEqual(['a', 'c']) + }) + + it('prefers the incoming mutation when the same variable is edited concurrently', () => { + const base = [createNodeSnapshot(['a'])] + const userA = [ + { + ...createNodeSnapshot(['a']), + data: { + ...createNodeSnapshot(['a']).data, + variables: [ + createVariable('a', { label: 'A from userA', hint: 'hintA' }), + ], + }, + }, + ] + const userB = [ + { + ...createNodeSnapshot(['a']), + data: { + ...createNodeSnapshot(['a']).data, + variables: [ + createVariable('a', { label: 'A from userB', hint: 'hintB' }), + ], + }, + }, + ] + + internals.syncNodes(base, userA) + internals.syncNodes(base, userB) + + const finalNode = (manager.getNodes() as Node[]).find(node => node.id === NODE_ID) + const finalVariable = getVariableObject(finalNode!, 'a') + + expect(finalVariable?.label).toBe('A from userB') + expect(finalVariable?.hint).toBe('hintB') + }) + + it('reflects the last writer when concurrent removal and edits happen', () => { + const base = [createNodeSnapshot(['a', 'b'])] + internals.syncNodes([], [deepClone(base[0])]) + const userA = [ + { + ...createNodeSnapshot(['a']), + data: { + ...createNodeSnapshot(['a']).data, + variables: [ + createVariable('a', { label: 'A after deletion' }), + ], + }, + }, + ] + const userB = [ + { + ...createNodeSnapshot(['a', 'b']), + data: { + ...createNodeSnapshot(['a']).data, + variables: [ + createVariable('a'), + createVariable('b', { label: 'B 
edited but should vanish' }), + ], + }, + }, + ] + + internals.syncNodes(base, userA) + internals.syncNodes(base, userB) + + const finalNode = (manager.getNodes() as Node[]).find(node => node.id === NODE_ID) + const finalVariables = getVariables(finalNode!) + expect(finalVariables).toEqual(['a', 'b']) + expect(getVariableObject(finalNode!, 'b')).toBeDefined() + }) + + it('synchronizes prompt_template list updates across collaborators', () => { + const { manager: promptManager, internals: promptInternals } = setupManager() + + const baseTemplate = [ + { + id: 'abcfa5f9-3c44-4252-aeba-4b6eaf0acfc4', + role: 'system', + text: 'avc', + }, + ] + + const baseNode = createLLMNodeSnapshot(baseTemplate) + promptInternals.syncNodes([], [deepClone(baseNode)]) + + const updatedTemplates = [ + ...baseTemplate, + { + id: 'user-1', + role: 'user', + text: 'hello world', + }, + ] + + const updatedNode = createLLMNodeSnapshot(updatedTemplates) + promptInternals.syncNodes([deepClone(baseNode)], [deepClone(updatedNode)]) + + const stored = (promptManager.getNodes() as Node[]).find(node => node.id === LLM_NODE_ID) + expect(stored).toBeDefined() + + const storedTemplates = getPromptTemplates(stored!) + expect(storedTemplates).toHaveLength(2) + expect(storedTemplates[0]).toEqual(baseTemplate[0]) + expect(storedTemplates[1]).toEqual(updatedTemplates[1]) + + const editedTemplates = [ + { + id: 'abcfa5f9-3c44-4252-aeba-4b6eaf0acfc4', + role: 'system', + text: 'updated system prompt', + }, + ] + const editedNode = createLLMNodeSnapshot(editedTemplates) + + promptInternals.syncNodes([deepClone(updatedNode)], [deepClone(editedNode)]) + + const final = (promptManager.getNodes() as Node[]).find(node => node.id === LLM_NODE_ID) + const finalTemplates = getPromptTemplates(final!) 
+    expect(finalTemplates).toHaveLength(1)
+    expect(finalTemplates[0].text).toBe('updated system prompt')
+  })
+
+  it('keeps parameter list in sync when nodes add, edit, or remove parameters', () => {
+    const { manager: parameterManager, internals: parameterInternals } = setupManager()
+
+    const baseParameters: ParameterItem[] = [
+      { description: 'bb', name: 'aa', required: false, type: 'string' },
+      { description: 'dd', name: 'cc', required: false, type: 'string' },
+    ]
+
+    const baseNode = createParameterExtractorNodeSnapshot(baseParameters)
+    parameterInternals.syncNodes([], [deepClone(baseNode)])
+
+    const updatedParameters: ParameterItem[] = [
+      ...baseParameters,
+      { description: 'ff', name: 'ee', required: true, type: 'number' },
+    ]
+
+    const updatedNode = createParameterExtractorNodeSnapshot(updatedParameters)
+    parameterInternals.syncNodes([deepClone(baseNode)], [deepClone(updatedNode)])
+
+    const stored = (parameterManager.getNodes() as Node[]).find(node => node.id === PARAM_NODE_ID)
+    expect(stored).toBeDefined()
+    expect(getParameters(stored!)).toEqual(updatedParameters)
+
+    const editedParameters: ParameterItem[] = [
+      { description: 'bb edited', name: 'aa', required: true, type: 'string' },
+    ]
+    const editedNode = createParameterExtractorNodeSnapshot(editedParameters)
+
+    parameterInternals.syncNodes([deepClone(updatedNode)], [deepClone(editedNode)])
+
+    const final = (parameterManager.getNodes() as Node[]).find(node => node.id === PARAM_NODE_ID)
+    expect(getParameters(final!)).toEqual(editedParameters)
+  })
+
+  it('handles nodes without data gracefully', () => {
+    const emptyNode: Node = {
+      id: 'empty-node',
+      type: 'custom',
+      position: { x: 0, y: 0 },
+      data: undefined as unknown as CommonNodeType<Record<string, unknown>>,
+    }
+
+    internals.syncNodes([], [deepClone(emptyNode)])
+
+    const stored = (manager.getNodes() as Node[]).find(node => node.id === 'empty-node')
+    expect(stored).toBeDefined()
+    expect(stored?.data).toEqual({})
+  })
+
+  it('preserves CRDT list instances when synchronizing parsed state back into the manager', () => {
+    const { manager: promptManager, internals: promptInternals } = setupManager()
+
+    const base = createLLMNodeSnapshot([
+      { id: 'system', role: 'system', text: 'base' },
+    ])
+    promptInternals.syncNodes([], [deepClone(base)])
+
+    const storedBefore = promptManager.getNodes().find(node => node.id === LLM_NODE_ID) as Node<LLMNodeData> | undefined
+    expect(storedBefore).toBeDefined()
+    const firstTemplate = storedBefore?.data.prompt_template?.[0]
+    expect(firstTemplate?.text).toBe('base')
+
+    // simulate consumer mutating the plain JSON array and syncing back
+    const baseNode = storedBefore!
+    const mutatedNode = deepClone(baseNode)
+    mutatedNode.data.prompt_template.push({
+      id: 'user',
+      role: 'user',
+      text: 'mutated',
+    })
+
+    promptInternals.syncNodes([baseNode], [mutatedNode])
+
+    const storedAfter = promptManager.getNodes().find(node => node.id === LLM_NODE_ID) as Node<LLMNodeData> | undefined
+    const templatesAfter = storedAfter?.data.prompt_template
+    expect(Array.isArray(templatesAfter)).toBe(true)
+    expect(templatesAfter).toHaveLength(2)
+  })
+
+  it('reuses CRDT list when syncing parameters repeatedly', () => {
+    const { manager: parameterManager, internals: parameterInternals } = setupManager()
+
+    const initialParameters: ParameterItem[] = [
+      { description: 'desc', name: 'param', required: false, type: 'string' },
+    ]
+    const node = createParameterExtractorNodeSnapshot(initialParameters)
+    parameterInternals.syncNodes([], [deepClone(node)])
+
+    const stored = parameterManager.getNodes().find(n => n.id === PARAM_NODE_ID) as Node<ParameterExtractorNodeData>
+    const mutatedNode = deepClone(stored)
+    mutatedNode.data.parameters[0].description = 'updated'
+
+    parameterInternals.syncNodes([stored], [mutatedNode])
+
+    const storedAfter = parameterManager.getNodes().find(n => n.id === PARAM_NODE_ID) as
+      | Node<ParameterExtractorNodeData>
+      | undefined
+    const params = storedAfter?.data.parameters ??
[] + expect(params).toHaveLength(1) + expect(params[0].description).toBe('updated') + }) + + it('filters out transient/private data keys while keeping allowlisted ones', () => { + const nodeWithPrivate: Node<{ _foo: string, variables: WorkflowVariable[] }> = { + id: 'private-node', + type: 'custom', + position: { x: 0, y: 0 }, + data: { + type: BlockEnum.Start, + title: 'private', + desc: '', + _foo: 'should disappear', + _children: [{ nodeId: 'child-a', nodeType: BlockEnum.Start }], + selected: true, + variables: [], + }, + } + + internals.syncNodes([], [deepClone(nodeWithPrivate)]) + + const stored = (manager.getNodes() as Node[]).find(node => node.id === 'private-node')! + const storedData = stored.data as CommonNodeType<{ _foo?: string }> + expect(storedData._foo).toBeUndefined() + expect(storedData._children).toEqual([{ nodeId: 'child-a', nodeType: BlockEnum.Start }]) + expect(storedData.selected).toBeUndefined() + }) + + it('removes list fields when they are omitted in the update snapshot', () => { + const baseNode = createNodeSnapshot(['alpha']) + internals.syncNodes([], [deepClone(baseNode)]) + + const withoutVariables: Node = { + ...deepClone(baseNode), + data: { + ...deepClone(baseNode).data, + }, + } + delete (withoutVariables.data as CommonNodeType<{ variables?: WorkflowVariable[] }>).variables + + internals.syncNodes([deepClone(baseNode)], [withoutVariables]) + + const stored = (manager.getNodes() as Node[]).find(node => node.id === NODE_ID)! 
+    const storedData = stored.data as CommonNodeType<{ variables?: WorkflowVariable[] }>
+    expect(storedData.variables).toBeUndefined()
+  })
+
+  it('treats non-array list inputs as empty lists during synchronization', () => {
+    const { manager: promptManager, internals: promptInternals } = setupManager()
+
+    const nodeWithInvalidTemplate = createLLMNodeSnapshot([])
+    promptInternals.syncNodes([], [deepClone(nodeWithInvalidTemplate)])
+
+    const mutated = deepClone(nodeWithInvalidTemplate) as Node<LLMNodeDataWithUnknownTemplate>
+    mutated.data.prompt_template = 'not-an-array'
+
+    promptInternals.syncNodes([deepClone(nodeWithInvalidTemplate)], [mutated])
+
+    const stored = promptManager.getNodes().find(node => node.id === LLM_NODE_ID) as Node<LLMNodeDataWithUnknownTemplate>
+    expect(Array.isArray(stored.data.prompt_template)).toBe(true)
+    expect(stored.data.prompt_template).toHaveLength(0)
+  })
+
+  it('updates edges map when edges are added, modified, and removed', () => {
+    const { manager: edgeManager } = setupManager()
+
+    const edge: Edge = {
+      id: 'edge-1',
+      source: 'node-a',
+      target: 'node-b',
+      type: 'default',
+      data: {
+        sourceType: BlockEnum.Start,
+        targetType: BlockEnum.LLM,
+        _waitingRun: false,
+      },
+    }
+
+    edgeManager.setEdges([], [edge])
+    expect(edgeManager.getEdges()).toHaveLength(1)
+    const storedEdge = edgeManager.getEdges()[0]!
+    expect(storedEdge.data).toBeDefined()
+    expect(storedEdge.data!._waitingRun).toBe(false)
+
+    const updatedEdge: Edge = {
+      ...edge,
+      data: {
+        sourceType: BlockEnum.Start,
+        targetType: BlockEnum.LLM,
+        _waitingRun: true,
+      },
+    }
+    edgeManager.setEdges([edge], [updatedEdge])
+    expect(edgeManager.getEdges()).toHaveLength(1)
+    const updatedStoredEdge = edgeManager.getEdges()[0]!
+ expect(updatedStoredEdge.data).toBeDefined() + expect(updatedStoredEdge.data!._waitingRun).toBe(true) + + edgeManager.setEdges([updatedEdge], []) + expect(edgeManager.getEdges()).toHaveLength(0) + }) +}) + +describe('CollaborationManager public API wrappers', () => { + let manager: CollaborationManager + let internals: CollaborationManagerInternals + const baseNodes: Node[] = [] + const updatedNodes: Node[] = [ + { + id: 'new-node', + type: 'custom', + position: { x: 0, y: 0 }, + data: { + type: BlockEnum.Start, + title: 'New node', + desc: '', + }, + }, + ] + const baseEdges: Edge[] = [] + const updatedEdges: Edge[] = [ + { + id: 'edge-1', + source: 'source', + target: 'target', + type: 'default', + data: { + sourceType: BlockEnum.Start, + targetType: BlockEnum.End, + }, + }, + ] + + beforeEach(() => { + manager = new CollaborationManager() + internals = getManagerInternals(manager) + }) + + it('setNodes delegates to syncNodes and commits the CRDT document', () => { + const commit = vi.fn() + internals.doc = { commit } + const syncSpy = vi.spyOn(internals, 'syncNodes').mockImplementation(() => undefined) + + manager.setNodes(baseNodes, updatedNodes) + + expect(syncSpy).toHaveBeenCalledWith(baseNodes, updatedNodes) + expect(commit).toHaveBeenCalled() + syncSpy.mockRestore() + }) + + it('setNodes skips syncing when undo/redo replay is running', () => { + const commit = vi.fn() + internals.doc = { commit } + internals.isUndoRedoInProgress = true + const syncSpy = vi.spyOn(internals, 'syncNodes').mockImplementation(() => undefined) + + manager.setNodes(baseNodes, updatedNodes) + + expect(syncSpy).not.toHaveBeenCalled() + expect(commit).not.toHaveBeenCalled() + syncSpy.mockRestore() + }) + + it('setEdges delegates to syncEdges and commits the CRDT document', () => { + const commit = vi.fn() + internals.doc = { commit } + const syncSpy = vi.spyOn(internals, 'syncEdges').mockImplementation(() => undefined) + + manager.setEdges(baseEdges, updatedEdges) + + 
expect(syncSpy).toHaveBeenCalledWith(baseEdges, updatedEdges) + expect(commit).toHaveBeenCalled() + syncSpy.mockRestore() + }) + + it('disconnect tears down the collaboration state only when last connection closes', () => { + const forceSpy = vi.spyOn(internals, 'forceDisconnect').mockImplementation(() => undefined) + internals.activeConnections.add('conn-a') + internals.activeConnections.add('conn-b') + + manager.disconnect('conn-a') + expect(forceSpy).not.toHaveBeenCalled() + + manager.disconnect('conn-b') + expect(forceSpy).toHaveBeenCalledTimes(1) + forceSpy.mockRestore() + }) + + it('applyNodePanelPresenceUpdate keeps a client visible on a single node at a time', () => { + const updates: NodePanelPresenceMap[] = [] + manager.onNodePanelPresenceUpdate((presence) => { + updates.push(presence) + }) + + const user: NodePanelPresenceUser = { userId: 'user-1', username: 'Dana' } + + internals.applyNodePanelPresenceUpdate({ + nodeId: 'node-a', + action: 'open', + user, + clientId: 'client-1', + timestamp: 100, + }) + + internals.applyNodePanelPresenceUpdate({ + nodeId: 'node-b', + action: 'open', + user, + clientId: 'client-1', + timestamp: 200, + }) + + const finalSnapshot = updates[updates.length - 1]! 
+    expect(finalSnapshot).toEqual({
+      'node-b': {
+        'client-1': {
+          userId: 'user-1',
+          username: 'Dana',
+          clientId: 'client-1',
+          timestamp: 200,
+        },
+      },
+    })
+  })
+
+  it('applyNodePanelPresenceUpdate clears node entries when last viewer closes the panel', () => {
+    const updates: NodePanelPresenceMap[] = []
+    manager.onNodePanelPresenceUpdate((presence) => {
+      updates.push(presence)
+    })
+
+    const user: NodePanelPresenceUser = { userId: 'user-2', username: 'Kai' }
+
+    internals.applyNodePanelPresenceUpdate({
+      nodeId: 'node-a',
+      action: 'open',
+      user,
+      clientId: 'client-9',
+      timestamp: 300,
+    })
+
+    internals.applyNodePanelPresenceUpdate({
+      nodeId: 'node-a',
+      action: 'close',
+      user,
+      clientId: 'client-9',
+      timestamp: 301,
+    })
+
+    expect(updates[updates.length - 1]).toEqual({})
+  })
+})
diff --git a/web/app/components/workflow/collaboration/core/__tests__/crdt-provider.test.ts b/web/app/components/workflow/collaboration/core/__tests__/crdt-provider.test.ts
new file mode 100644
index 0000000000..6c4fe91d2a
--- /dev/null
+++ b/web/app/components/workflow/collaboration/core/__tests__/crdt-provider.test.ts
@@ -0,0 +1,138 @@
+import type { LoroDoc } from 'loro-crdt'
+import type { Socket } from 'socket.io-client'
+import { CRDTProvider } from '../crdt-provider'
+
+type FakeDocEvent = {
+  by: string
+}
+
+type FakeDoc = {
+  export: ReturnType<typeof vi.fn>
+  import: ReturnType<typeof vi.fn>
+  subscribe: ReturnType<typeof vi.fn>
+  trigger: (event: FakeDocEvent) => void
+}
+
+const createFakeDoc = (): FakeDoc => {
+  let handler: ((payload: FakeDocEvent) => void) | null = null
+
+  const exportFn = vi.fn(() => new Uint8Array([1, 2, 3]))
+  const importFn = vi.fn()
+  const subscribeFn = vi.fn((cb: (payload: FakeDocEvent) => void) => {
+    handler = cb
+  })
+
+  return {
+    export: exportFn,
+    import: importFn,
+    subscribe: subscribeFn,
+    trigger: (event: FakeDocEvent) => {
+      handler?.(event)
+    },
+  }
+}
+
+type MockSocket = {
+  trigger: (event: string, ...args: unknown[]) => void
+  emit: ReturnType<typeof vi.fn>
+  on: ReturnType<typeof vi.fn>
+  off: ReturnType<typeof vi.fn>
+}
+
+const createMockSocket = (): MockSocket => {
+  const handlers = new Map<string, (...args: unknown[]) => void>()
+
+  const socket: MockSocket = {
+    emit: vi.fn(),
+    on: vi.fn((event: string, handler: (...args: unknown[]) => void) => {
+      handlers.set(event, handler)
+    }),
+    off: vi.fn((event: string) => {
+      handlers.delete(event)
+    }),
+    trigger: (event: string, ...args: unknown[]) => {
+      const handler = handlers.get(event)
+      if (handler)
+        handler(...args)
+    },
+  }
+
+  return socket
+}
+
+describe('CRDTProvider', () => {
+  it('emits graph_event when local changes happen', () => {
+    const doc = createFakeDoc()
+    const socket = createMockSocket()
+
+    const provider = new CRDTProvider(socket as unknown as Socket, doc as unknown as LoroDoc)
+    expect(provider).toBeInstanceOf(CRDTProvider)
+
+    doc.trigger({ by: 'local' })
+
+    expect(socket.emit).toHaveBeenCalledWith(
+      'graph_event',
+      expect.any(Uint8Array),
+      expect.any(Function),
+    )
+    expect(doc.export).toHaveBeenCalledWith({ mode: 'update' })
+  })
+
+  it('ignores non-local events', () => {
+    const doc = createFakeDoc()
+    const socket = createMockSocket()
+
+    const provider = new CRDTProvider(socket as unknown as Socket, doc as unknown as LoroDoc)
+
+    doc.trigger({ by: 'remote' })
+
+    expect(socket.emit).not.toHaveBeenCalled()
+    provider.destroy()
+  })
+
+  it('imports remote updates on graph_update', () => {
+    const doc = createFakeDoc()
+    const socket = createMockSocket()
+
+    const provider = new CRDTProvider(socket as unknown as Socket, doc as unknown as LoroDoc)
+
+    const payload = new Uint8Array([9, 9, 9])
+    socket.trigger('graph_update', payload)
+
+    expect(doc.import).toHaveBeenCalledWith(expect.any(Uint8Array))
+    expect(Array.from(doc.import.mock.calls[0][0])).toEqual([9, 9, 9])
+    provider.destroy()
+  })
+
+  it('removes graph_update listener on destroy', () => {
+    const doc = createFakeDoc()
+    const socket = createMockSocket()
+
+    const provider = new CRDTProvider(socket as unknown as Socket, doc as unknown
as LoroDoc) + provider.destroy() + + expect(socket.off).toHaveBeenCalledWith('graph_update') + }) + + it('logs an error when graph_update import fails but continues operating', () => { + const doc = createFakeDoc() + const socket = createMockSocket() + doc.import.mockImplementation(() => { + throw new Error('boom') + }) + + const provider = new CRDTProvider(socket as unknown as Socket, doc as unknown as LoroDoc) + + const errorSpy = vi.spyOn(console, 'error').mockImplementation(() => undefined) + + socket.trigger('graph_update', new Uint8Array([1])) + expect(errorSpy).toHaveBeenCalledWith('Error importing graph update:', expect.any(Error)) + + doc.import.mockReset() + socket.trigger('graph_update', new Uint8Array([2, 3])) + expect(doc.import).toHaveBeenCalled() + + provider.destroy() + errorSpy.mockRestore() + }) +}) diff --git a/web/app/components/workflow/collaboration/core/__tests__/event-emitter.test.ts b/web/app/components/workflow/collaboration/core/__tests__/event-emitter.test.ts new file mode 100644 index 0000000000..19c4990856 --- /dev/null +++ b/web/app/components/workflow/collaboration/core/__tests__/event-emitter.test.ts @@ -0,0 +1,93 @@ +import { EventEmitter } from '../event-emitter' + +describe('EventEmitter', () => { + it('registers and invokes handlers via on/emit', () => { + const emitter = new EventEmitter() + const handler = vi.fn() + + emitter.on('test', handler) + emitter.emit('test', { value: 42 }) + + expect(handler).toHaveBeenCalledWith({ value: 42 }) + }) + + it('removes specific handler with off', () => { + const emitter = new EventEmitter() + const handlerA = vi.fn() + const handlerB = vi.fn() + + emitter.on('test', handlerA) + emitter.on('test', handlerB) + + emitter.off('test', handlerA) + emitter.emit('test', 'payload') + + expect(handlerA).not.toHaveBeenCalled() + expect(handlerB).toHaveBeenCalledWith('payload') + }) + + it('clears all listeners when off is called without handler', () => { + const emitter = new EventEmitter() + const 
handlerA = vi.fn() + const handlerB = vi.fn() + + emitter.on('trigger', handlerA) + emitter.on('trigger', handlerB) + + emitter.off('trigger') + emitter.emit('trigger', 'payload') + + expect(handlerA).not.toHaveBeenCalled() + expect(handlerB).not.toHaveBeenCalled() + expect(emitter.getListenerCount('trigger')).toBe(0) + }) + + it('removeAllListeners clears every registered event', () => { + const emitter = new EventEmitter() + emitter.on('one', vi.fn()) + emitter.on('two', vi.fn()) + + emitter.removeAllListeners() + + expect(emitter.getListenerCount('one')).toBe(0) + expect(emitter.getListenerCount('two')).toBe(0) + }) + + it('returns an unsubscribe function from on', () => { + const emitter = new EventEmitter() + const handler = vi.fn() + + const unsubscribe = emitter.on('detach', handler) + unsubscribe() + + emitter.emit('detach', 'value') + + expect(handler).not.toHaveBeenCalled() + }) + + it('continues emitting when a handler throws', () => { + const emitter = new EventEmitter() + const errorHandler = vi + .spyOn(console, 'error') + .mockImplementation(() => undefined) + + const failingHandler = vi.fn(() => { + throw new Error('boom') + }) + const succeedingHandler = vi.fn() + + emitter.on('safe', failingHandler) + emitter.on('safe', succeedingHandler) + + emitter.emit('safe', 7) + + expect(failingHandler).toHaveBeenCalledWith(7) + expect(succeedingHandler).toHaveBeenCalledWith(7) + expect(errorHandler).toHaveBeenCalledWith( + expect.stringContaining('Error in event handler for safe:'), + expect.any(Error), + ) + + errorHandler.mockRestore() + }) +}) diff --git a/web/app/components/workflow/collaboration/core/__tests__/websocket-manager.test.ts b/web/app/components/workflow/collaboration/core/__tests__/websocket-manager.test.ts new file mode 100644 index 0000000000..b9982164c5 --- /dev/null +++ b/web/app/components/workflow/collaboration/core/__tests__/websocket-manager.test.ts @@ -0,0 +1,161 @@ +type MockSocket = { + trigger: (event: string, ...args: 
unknown[]) => void
+  emit: ReturnType<typeof vi.fn>
+  on: ReturnType<typeof vi.fn>
+  disconnect: ReturnType<typeof vi.fn>
+  connected: boolean
+}
+
+type IoOptions = {
+  auth?: unknown
+  path?: string
+  transports?: string[]
+  withCredentials?: boolean
+}
+
+const ioMock = vi.hoisted(() => vi.fn())
+
+vi.mock('socket.io-client', () => ({
+  io: (...args: Parameters<typeof ioMock>) => ioMock(...args),
+}))
+
+const createMockSocket = (id: string): MockSocket => {
+  const handlers = new Map<string, (...args: unknown[]) => void>()
+
+  const socket: MockSocket & { id: string } = {
+    id,
+    connected: true,
+    emit: vi.fn(),
+    disconnect: vi.fn(() => {
+      socket.connected = false
+    }),
+    on: vi.fn((event: string, handler: (...args: unknown[]) => void) => {
+      handlers.set(event, handler)
+    }),
+    trigger: (event: string, ...args: unknown[]) => {
+      const handler = handlers.get(event)
+      if (handler)
+        handler(...args)
+    },
+  }
+
+  return socket
+}
+
+describe('WebSocketClient', () => {
+  beforeEach(() => {
+    vi.resetModules()
+    ioMock.mockReset()
+  })
+
+  it('connects with default url and registers base listeners', async () => {
+    const mockSocket = createMockSocket('socket-fallback')
+    ioMock.mockImplementation(() => mockSocket)
+
+    const { WebSocketClient } = await import('../websocket-manager')
+    const client = new WebSocketClient()
+    const socket = client.connect('app-1')
+
+    expect(ioMock).toHaveBeenCalledWith(
+      'ws://localhost:5001',
+      expect.objectContaining({
+        path: '/socket.io',
+        transports: ['websocket'],
+        withCredentials: true,
+      }),
+    )
+    expect(socket).toBe(mockSocket)
+    expect(mockSocket.on).toHaveBeenCalledWith('connect', expect.any(Function))
+    expect(mockSocket.on).toHaveBeenCalledWith('disconnect', expect.any(Function))
+    expect(mockSocket.on).toHaveBeenCalledWith('connect_error', expect.any(Function))
+  })
+
+  it('reuses existing connected socket and avoids duplicate connections', async () => {
+    const mockSocket = createMockSocket('socket-reuse')
+    ioMock.mockImplementation(() => mockSocket)
+
+    const { WebSocketClient } = await
import('../websocket-manager') + const client = new WebSocketClient() + + const first = client.connect('app-reuse') + const second = client.connect('app-reuse') + + expect(ioMock).toHaveBeenCalledTimes(1) + expect(second).toBe(first) + }) + + it('emits user_connect on connect without auth payload', async () => { + const mockSocket = createMockSocket('socket-auth') + ioMock.mockImplementation((url: string, options: IoOptions) => { + expect(options.auth).toBeUndefined() + return mockSocket + }) + + const { WebSocketClient } = await import('../websocket-manager') + const client = new WebSocketClient() + client.connect('app-auth') + + const connectHandler = mockSocket.on.mock.calls.find(call => call[0] === 'connect')?.[1] as () => void + expect(connectHandler).toBeDefined() + connectHandler() + + expect(mockSocket.emit).toHaveBeenCalledWith( + 'user_connect', + { workflow_id: 'app-auth' }, + expect.any(Function), + ) + }) + + it('disconnects a specific app and clears internal maps', async () => { + const mockSocket = createMockSocket('socket-disconnect-one') + ioMock.mockImplementation(() => mockSocket) + + const { WebSocketClient } = await import('../websocket-manager') + const client = new WebSocketClient() + client.connect('app-disconnect') + + expect(client.isConnected('app-disconnect')).toBe(true) + client.disconnect('app-disconnect') + + expect(mockSocket.disconnect).toHaveBeenCalled() + expect(client.getSocket('app-disconnect')).toBeNull() + expect(client.isConnected('app-disconnect')).toBe(false) + }) + + it('disconnects all apps when no id is provided', async () => { + const socketA = createMockSocket('socket-a') + const socketB = createMockSocket('socket-b') + ioMock.mockImplementationOnce(() => socketA).mockImplementationOnce(() => socketB) + + const { WebSocketClient } = await import('../websocket-manager') + const client = new WebSocketClient() + client.connect('app-a') + client.connect('app-b') + + client.disconnect() + + 
expect(socketA.disconnect).toHaveBeenCalled() + expect(socketB.disconnect).toHaveBeenCalled() + expect(client.getConnectedApps()).toEqual([]) + }) + + it('reports connected apps, sockets, and debug info correctly', async () => { + const socketA = createMockSocket('socket-debug-a') + const socketB = createMockSocket('socket-debug-b') + socketB.connected = false + ioMock.mockImplementationOnce(() => socketA).mockImplementationOnce(() => socketB) + + const { WebSocketClient } = await import('../websocket-manager') + const client = new WebSocketClient() + client.connect('app-a') + client.connect('app-b') + + expect(client.getConnectedApps()).toEqual(['app-a']) + + const debugInfo = client.getDebugInfo() + expect(debugInfo).toMatchObject({ + 'app-a': { connected: true, socketId: 'socket-debug-a' }, + 'app-b': { connected: false, socketId: 'socket-debug-b' }, + }) + }) +}) diff --git a/web/app/components/workflow/collaboration/core/collaboration-manager.ts b/web/app/components/workflow/collaboration/core/collaboration-manager.ts new file mode 100644 index 0000000000..0f0546ae8c --- /dev/null +++ b/web/app/components/workflow/collaboration/core/collaboration-manager.ts @@ -0,0 +1,1185 @@ +import type { Value } from 'loro-crdt' +import type { Socket } from 'socket.io-client' +import type { + CommonNodeType, + Edge, + Node, +} from '../../types' +import type { + CollaborationState, + CollaborationUpdate, + CursorPosition, + NodePanelPresenceMap, + NodePanelPresenceUser, + OnlineUser, + RestoreCompleteData, + RestoreIntentData, + RestoreRequestData, +} from '../types/collaboration' +import { cloneDeep } from 'es-toolkit/object' +import { isEqual } from 'es-toolkit/predicate' +import { LoroDoc, LoroList, LoroMap, UndoManager } from 'loro-crdt' +import { CRDTProvider } from './crdt-provider' +import { EventEmitter } from './event-emitter' +import { emitWithAuthGuard, webSocketClient } from './websocket-manager' + +type NodePanelPresenceEventData = { + nodeId: string + action: 
'open' | 'close'
+  user: NodePanelPresenceUser
+  clientId: string
+  timestamp: number
+}
+
+type ReactFlowStore = {
+  getState: () => {
+    getNodes: () => Node[]
+    setNodes: (nodes: Node[]) => void
+    getEdges: () => Edge[]
+    setEdges: (edges: Edge[]) => void
+  }
+}
+
+type CollaborationEventPayload = {
+  type: CollaborationUpdate['type']
+  data: Record<string, unknown>
+  timestamp: number
+  userId?: string
+}
+
+type LoroSubscribeEvent = {
+  by?: string
+}
+
+type LoroContainer = {
+  kind?: () => string
+  getAttached?: () => unknown
+}
+
+const toLoroValue = (value: unknown): Value => cloneDeep(value) as Value
+const toLoroRecord = (value: unknown): Record<string, Value> => cloneDeep(value) as Record<string, Value>
+export class CollaborationManager {
+  private doc: LoroDoc | null = null
+  private undoManager: UndoManager | null = null
+  private provider: CRDTProvider | null = null
+  private nodesMap: LoroMap<Record<string, Value>> | null = null
+  private edgesMap: LoroMap<Record<string, Value>> | null = null
+  private eventEmitter = new EventEmitter()
+  private currentAppId: string | null = null
+  private reactFlowStore: ReactFlowStore | null = null
+  private isLeader = false
+  private leaderId: string | null = null
+  private cursors: Record<string, CursorPosition> = {}
+  private nodePanelPresence: NodePanelPresenceMap = {}
+  private activeConnections = new Set<string>()
+  private isUndoRedoInProgress = false
+  private pendingInitialSync = false
+  private rejoinInProgress = false
+  private pendingGraphImportEmit = false
+
+  private getActiveSocket(): Socket | null {
+    if (!this.currentAppId)
+      return null
+    return webSocketClient.getSocket(this.currentAppId)
+  }
+
+  private handleSessionUnauthorized = (): void => {
+    if (this.rejoinInProgress)
+      return
+    if (!this.currentAppId)
+      return
+
+    const socket = this.getActiveSocket()
+    if (!socket)
+      return
+
+    this.rejoinInProgress = true
+    console.warn('Collaboration session expired, attempting to rejoin workflow.')
+    emitWithAuthGuard(
+      socket,
+      'user_connect',
+      { workflow_id: this.currentAppId },
+      {
+        onAck: () => {
this.rejoinInProgress = false
+        },
+        onUnauthorized: () => {
+          this.rejoinInProgress = false
+          console.error('Rejoin failed due to authorization error, forcing disconnect.')
+          this.forceDisconnect()
+        },
+      },
+    )
+  }
+
+  private sendCollaborationEvent(payload: CollaborationEventPayload): void {
+    const socket = this.getActiveSocket()
+    if (!socket)
+      return
+
+    emitWithAuthGuard(socket, 'collaboration_event', payload, { onUnauthorized: this.handleSessionUnauthorized })
+  }
+
+  private sendGraphEvent(payload: Uint8Array): void {
+    const socket = this.getActiveSocket()
+    if (!socket)
+      return
+
+    emitWithAuthGuard(socket, 'graph_event', payload, { onUnauthorized: this.handleSessionUnauthorized })
+  }
+
+  private getNodeContainer(nodeId: string): LoroMap<Record<string, Value>> {
+    if (!this.nodesMap)
+      throw new Error('Nodes map not initialized')
+
+    let container = this.nodesMap.get(nodeId) as unknown
+
+    const isMapContainer = (value: unknown): value is LoroMap<Record<string, Value>> & LoroContainer => {
+      return !!value && typeof (value as LoroContainer).kind === 'function' && (value as LoroContainer).kind?.() === 'Map'
+    }
+
+    if (!container || !isMapContainer(container)) {
+      const previousValue = container
+      const newContainer = this.nodesMap.setContainer(nodeId, new LoroMap())
+      const attached = (newContainer as LoroContainer).getAttached?.() ?? newContainer
+      container = attached
+      if (previousValue && typeof previousValue === 'object')
+        this.populateNodeContainer(container as LoroMap<Record<string, Value>>, previousValue as Node)
+    }
+    else {
+      const attached = (container as LoroContainer).getAttached?.() ?? container
+      container = attached
+    }
+
+    return container as LoroMap<Record<string, Value>>
+  }
+
+  private ensureDataContainer(nodeContainer: LoroMap<Record<string, Value>>): LoroMap<Record<string, Value>> {
+    let dataContainer = nodeContainer.get('data') as unknown
+
+    if (!dataContainer || typeof (dataContainer as LoroContainer).kind !== 'function' || (dataContainer as LoroContainer).kind?.() !== 'Map')
+      dataContainer = nodeContainer.setContainer('data', new LoroMap())
+
+    const attached = (dataContainer as LoroContainer).getAttached?.() ?? dataContainer
+    return attached as LoroMap<Record<string, Value>>
+  }
+
+  private ensureList(nodeContainer: LoroMap<Record<string, Value>>, key: string): LoroList<Value> {
+    const dataContainer = this.ensureDataContainer(nodeContainer)
+    let list = dataContainer.get(key) as unknown
+
+    if (!list || typeof (list as LoroContainer).kind !== 'function' || (list as LoroContainer).kind?.() !== 'List')
+      list = dataContainer.setContainer(key, new LoroList())
+
+    const attached = (list as LoroContainer).getAttached?.() ?? list
+    return attached as LoroList<Value>
+  }
+
+  private exportNode(nodeId: string): Node {
+    const container = this.getNodeContainer(nodeId)
+    const json = container.toJSON() as Node
+    return {
+      ...json,
+      data: json.data || {},
+    }
+  }
+
+  private populateNodeContainer(container: LoroMap<Record<string, Value>>, node: Node): void {
+    const listFields = new Set(['variables', 'prompt_template', 'parameters'])
+    container.set('id', node.id)
+    container.set('type', node.type)
+    container.set('position', toLoroValue(node.position))
+    container.set('sourcePosition', node.sourcePosition)
+    container.set('targetPosition', node.targetPosition)
+
+    if (node.width === undefined)
+      container.delete('width')
+    else container.set('width', node.width)
+
+    if (node.height === undefined)
+      container.delete('height')
+    else container.set('height', node.height)
+
+    if (node.selected === undefined)
+      container.delete('selected')
+    else container.set('selected', node.selected)
+
+    const optionalProps: Array<keyof Node> = [
+      'parentId',
+      'positionAbsolute',
+      'extent',
+      'zIndex',
'draggable', + 'selectable', + 'dragHandle', + 'dragging', + 'connectable', + 'expandParent', + 'focusable', + 'hidden', + 'style', + 'className', + 'ariaLabel', + 'resizing', + 'deletable', + ] + + optionalProps.forEach((prop) => { + const value = node[prop] + if (value === undefined) + container.delete(prop as string) + else + container.set(prop as string, toLoroValue(value)) + }) + + const dataContainer = this.ensureDataContainer(container) + const handledKeys = new Set() + + Object.entries(node.data || {}).forEach(([key, value]) => { + if (!this.shouldSyncDataKey(key)) + return + handledKeys.add(key) + + if (listFields.has(key)) + this.syncList(container, key, Array.isArray(value) ? value : []) + else + dataContainer.set(key, toLoroValue(value)) + }) + + const existingData = dataContainer.toJSON() || {} + Object.keys(existingData).forEach((key) => { + if (!this.shouldSyncDataKey(key)) + return + if (handledKeys.has(key)) + return + + dataContainer.delete(key) + }) + } + + private shouldSyncDataKey(key: string): boolean { + const syncDataAllowList = new Set(['_children', '_connectedSourceHandleIds', '_connectedTargetHandleIds', '_targetBranches']) + return (syncDataAllowList.has(key) || !key.startsWith('_')) && key !== 'selected' + } + + private syncList(nodeContainer: LoroMap>, key: string, desired: Array): void { + const list = this.ensureList(nodeContainer, key) + const current = list.toJSON() as Array + const target = Array.isArray(desired) ? 
desired : [] + const minLength = Math.min(current.length, target.length) + + for (let i = 0; i < minLength; i += 1) { + if (!isEqual(current[i], target[i])) { + list.delete(i, 1) + list.insert(i, cloneDeep(target[i])) + } + } + + if (current.length > target.length) { + list.delete(target.length, current.length - target.length) + } + else if (target.length > current.length) { + for (let i = current.length; i < target.length; i += 1) + list.insert(i, cloneDeep(target[i])) + } + } + + private getNodePanelPresenceSnapshot(): NodePanelPresenceMap { + const snapshot: NodePanelPresenceMap = {} + Object.entries(this.nodePanelPresence).forEach(([nodeId, viewers]) => { + snapshot[nodeId] = { ...viewers } + }) + return snapshot + } + + private applyNodePanelPresenceUpdate(update: NodePanelPresenceEventData): void { + const { nodeId, action, clientId, user, timestamp } = update + + if (action === 'open') { + // ensure a client only appears on a single node at a time + Object.entries(this.nodePanelPresence).forEach(([id, viewers]) => { + if (viewers[clientId]) { + delete viewers[clientId] + if (Object.keys(viewers).length === 0) + delete this.nodePanelPresence[id] + } + }) + + if (!this.nodePanelPresence[nodeId]) + this.nodePanelPresence[nodeId] = {} + + this.nodePanelPresence[nodeId][clientId] = { + ...user, + clientId, + timestamp: timestamp || Date.now(), + } + } + else { + const viewers = this.nodePanelPresence[nodeId] + if (viewers) { + delete viewers[clientId] + if (Object.keys(viewers).length === 0) + delete this.nodePanelPresence[nodeId] + } + } + + this.eventEmitter.emit('nodePanelPresence', this.getNodePanelPresenceSnapshot()) + } + + private cleanupNodePanelPresence(activeClientIds: Set, activeUserIds: Set): void { + let hasChanges = false + + Object.entries(this.nodePanelPresence).forEach(([nodeId, viewers]) => { + Object.keys(viewers).forEach((clientId) => { + const viewer = viewers[clientId] + const clientActive = activeClientIds.has(clientId) + const userActive = 
viewer?.userId ? activeUserIds.has(viewer.userId) : false
+
+        if (!clientActive && !userActive) {
+          delete viewers[clientId]
+          hasChanges = true
+        }
+      })
+
+      if (Object.keys(viewers).length === 0)
+        delete this.nodePanelPresence[nodeId]
+    })
+
+    if (hasChanges)
+      this.eventEmitter.emit('nodePanelPresence', this.getNodePanelPresenceSnapshot())
+  }
+
+  init = (appId: string, reactFlowStore: ReactFlowStore): void => {
+    if (!reactFlowStore) {
+      console.warn('CollaborationManager.init called without reactFlowStore, deferring to connect()')
+      return
+    }
+    this.connect(appId, reactFlowStore)
+  }
+
+  setNodes = (oldNodes: Node[], newNodes: Node[]): void => {
+    if (!this.doc)
+      return
+
+    // Don't track operations during undo/redo to prevent loops
+    if (this.isUndoRedoInProgress)
+      return
+
+    this.syncNodes(oldNodes, newNodes)
+    this.doc.commit()
+  }
+
+  setEdges = (oldEdges: Edge[], newEdges: Edge[]): void => {
+    if (!this.doc)
+      return
+
+    // Don't track operations during undo/redo to prevent loops
+    if (this.isUndoRedoInProgress)
+      return
+
+    this.syncEdges(oldEdges, newEdges)
+    this.doc.commit()
+  }
+
+  destroy = (): void => {
+    this.disconnect()
+  }
+
+  async connect(appId: string, reactFlowStore?: ReactFlowStore): Promise<string> {
+    const connectionId = Math.random().toString(36).substring(2, 11)
+
+    this.activeConnections.add(connectionId)
+
+    if (this.currentAppId === appId && this.doc) {
+      // Already connected to the same app, only update store if provided and we don't have one
+      if (reactFlowStore && !this.reactFlowStore)
+        this.reactFlowStore = reactFlowStore
+
+      return connectionId
+    }
+
+    // Only disconnect if switching to a different app
+    if (this.currentAppId && this.currentAppId !== appId)
+      this.forceDisconnect()
+
+    this.currentAppId = appId
+    // Only set store if provided
+    if (reactFlowStore)
+      this.reactFlowStore = reactFlowStore
+
+    const socket = webSocketClient.connect(appId)
+
+    // Setup event listeners BEFORE any other operations
this.setupSocketEventListeners(socket) + + this.doc = new LoroDoc() + this.nodesMap = this.doc.getMap('nodes') as LoroMap> + this.edgesMap = this.doc.getMap('edges') as LoroMap> + + // Initialize UndoManager for collaborative undo/redo + this.undoManager = new UndoManager(this.doc, { + maxUndoSteps: 100, + mergeInterval: 500, // Merge operations within 500ms + excludeOriginPrefixes: [], // Don't exclude anything - let UndoManager track all local operations + onPush: (_isUndo, _range, _event) => { + // Store current selection state when an operation is pushed + const selectedNode = this.reactFlowStore?.getState().getNodes().find((n: Node) => n.data?.selected) + + // Emit event to update UI button states when new operation is pushed + setTimeout(() => { + this.eventEmitter.emit('undoRedoStateChange', { + canUndo: this.undoManager?.canUndo() || false, + canRedo: this.undoManager?.canRedo() || false, + }) + }, 0) + + return { + value: { + selectedNodeId: selectedNode?.id || null, + timestamp: Date.now(), + }, + cursors: [], + } + }, + onPop: (_isUndo, value, _counterRange) => { + // Restore selection state when undoing/redoing + if (value?.value && typeof value.value === 'object' && 'selectedNodeId' in value.value && this.reactFlowStore) { + const selectedNodeId = (value.value as { selectedNodeId?: string | null }).selectedNodeId + if (selectedNodeId) { + const { setNodes } = this.reactFlowStore.getState() + const nodes = this.reactFlowStore.getState().getNodes() + const newNodes = nodes.map((n: Node) => ({ + ...n, + data: { + ...n.data, + selected: n.id === selectedNodeId, + }, + })) + setNodes(newNodes) + } + } + }, + }) + + this.provider = new CRDTProvider(socket, this.doc, this.handleSessionUnauthorized) + + this.setupSubscriptions() + + // Force user_connect if already connected + if (socket.connected) + emitWithAuthGuard(socket, 'user_connect', { workflow_id: appId }, { onUnauthorized: this.handleSessionUnauthorized }) + + return connectionId + } + + disconnect = 
(connectionId?: string): void => { + if (connectionId) + this.activeConnections.delete(connectionId) + + // Only disconnect when no more connections + if (this.activeConnections.size === 0) + this.forceDisconnect() + } + + private forceDisconnect = (): void => { + if (this.currentAppId) + webSocketClient.disconnect(this.currentAppId) + + this.provider?.destroy() + this.undoManager = null + this.doc = null + this.provider = null + this.nodesMap = null + this.edgesMap = null + this.currentAppId = null + this.reactFlowStore = null + this.cursors = {} + this.nodePanelPresence = {} + this.isUndoRedoInProgress = false + this.rejoinInProgress = false + + // Only reset leader status when actually disconnecting + const wasLeader = this.isLeader + this.isLeader = false + this.leaderId = null + + if (wasLeader) + this.eventEmitter.emit('leaderChange', false) + + this.activeConnections.clear() + this.eventEmitter.removeAllListeners() + } + + isConnected(): boolean { + return this.currentAppId ? webSocketClient.isConnected(this.currentAppId) : false + } + + getNodes(): Node[] { + if (!this.nodesMap) + return [] + return Array.from(this.nodesMap.keys()).map(id => this.exportNode(id as string)) + } + + getEdges(): Edge[] { + return this.edgesMap ? 
Array.from(this.edgesMap.values()) as Edge[] : [] + } + + emitCursorMove(position: CursorPosition): void { + if (!this.currentAppId || !webSocketClient.isConnected(this.currentAppId)) + return + + const socket = this.getActiveSocket() + if (!socket) + return + + this.sendCollaborationEvent({ + type: 'mouse_move', + userId: socket.id, + data: { x: position.x, y: position.y }, + timestamp: Date.now(), + }) + } + + emitSyncRequest(): void { + if (!this.currentAppId || !webSocketClient.isConnected(this.currentAppId)) + return + + this.sendCollaborationEvent({ + type: 'sync_request', + data: { timestamp: Date.now() }, + timestamp: Date.now(), + }) + } + + emitWorkflowUpdate(appId: string): void { + if (!this.currentAppId || !webSocketClient.isConnected(this.currentAppId)) + return + + this.sendCollaborationEvent({ + type: 'workflow_update', + data: { appId, timestamp: Date.now() }, + timestamp: Date.now(), + }) + } + + emitNodePanelPresence(nodeId: string, isOpen: boolean, user: NodePanelPresenceUser): void { + if (!this.currentAppId || !webSocketClient.isConnected(this.currentAppId)) + return + + const socket = this.getActiveSocket() + if (!socket || !nodeId || !user?.userId) + return + + const payload: NodePanelPresenceEventData = { + nodeId, + action: isOpen ? 
'open' : 'close', + user, + clientId: socket.id as string, + timestamp: Date.now(), + } + + this.sendCollaborationEvent({ + type: 'node_panel_presence', + data: payload, + timestamp: payload.timestamp, + }) + + this.applyNodePanelPresenceUpdate(payload) + } + + onSyncRequest(callback: () => void): () => void { + return this.eventEmitter.on('syncRequest', callback) + } + + onGraphImport(callback: (payload: { nodes: Node[], edges: Edge[] }) => void): () => void { + return this.eventEmitter.on('graphImport', callback) + } + + onStateChange(callback: (state: Partial) => void): () => void { + return this.eventEmitter.on('stateChange', callback) + } + + onCursorUpdate(callback: (cursors: Record) => void): () => void { + return this.eventEmitter.on('cursors', callback) + } + + onOnlineUsersUpdate(callback: (users: OnlineUser[]) => void): () => void { + return this.eventEmitter.on('onlineUsers', callback) + } + + onWorkflowUpdate(callback: (update: { appId: string, timestamp: number }) => void): () => void { + return this.eventEmitter.on('workflowUpdate', callback) + } + + onVarsAndFeaturesUpdate(callback: (update: CollaborationUpdate) => void): () => void { + return this.eventEmitter.on('varsAndFeaturesUpdate', callback) + } + + onAppStateUpdate(callback: (update: CollaborationUpdate) => void): () => void { + return this.eventEmitter.on('appStateUpdate', callback) + } + + onAppPublishUpdate(callback: (update: CollaborationUpdate) => void): () => void { + return this.eventEmitter.on('appPublishUpdate', callback) + } + + onAppMetaUpdate(callback: (update: CollaborationUpdate) => void): () => void { + return this.eventEmitter.on('appMetaUpdate', callback) + } + + onMcpServerUpdate(callback: (update: CollaborationUpdate) => void): () => void { + return this.eventEmitter.on('mcpServerUpdate', callback) + } + + onNodePanelPresenceUpdate(callback: (presence: NodePanelPresenceMap) => void): () => void { + const off = this.eventEmitter.on('nodePanelPresence', callback) + 
callback(this.getNodePanelPresenceSnapshot()) + return off + } + + onLeaderChange(callback: (isLeader: boolean) => void): () => void { + return this.eventEmitter.on('leaderChange', callback) + } + + onCommentsUpdate(callback: (update: { appId: string, timestamp: number }) => void): () => void { + return this.eventEmitter.on('commentsUpdate', callback) + } + + emitCommentsUpdate(appId: string): void { + if (!this.currentAppId || !webSocketClient.isConnected(this.currentAppId)) + return + + this.sendCollaborationEvent({ + type: 'comments_update', + data: { appId, timestamp: Date.now() }, + timestamp: Date.now(), + }) + } + + onUndoRedoStateChange(callback: (state: { canUndo: boolean, canRedo: boolean }) => void): () => void { + return this.eventEmitter.on('undoRedoStateChange', callback) + } + + emitRestoreRequest(data: RestoreRequestData): void { + if (!this.currentAppId || !webSocketClient.isConnected(this.currentAppId)) + return + + this.sendCollaborationEvent({ + type: 'workflow_restore_request', + data: data as unknown as Record, + timestamp: Date.now(), + }) + } + + emitRestoreIntent(data: RestoreIntentData): void { + if (!this.currentAppId || !webSocketClient.isConnected(this.currentAppId)) + return + + this.sendCollaborationEvent({ + type: 'workflow_restore_intent', + data: data as unknown as Record, + timestamp: Date.now(), + }) + } + + emitRestoreComplete(data: RestoreCompleteData): void { + if (!this.currentAppId || !webSocketClient.isConnected(this.currentAppId)) + return + + this.sendCollaborationEvent({ + type: 'workflow_restore_complete', + data: data as unknown as Record, + timestamp: Date.now(), + }) + } + + onRestoreRequest(callback: (data: RestoreRequestData) => void): () => void { + return this.eventEmitter.on('restoreRequest', callback) + } + + onRestoreIntent(callback: (data: RestoreIntentData) => void): () => void { + return this.eventEmitter.on('restoreIntent', callback) + } + + onRestoreComplete(callback: (data: RestoreCompleteData) => void): 
() => void { + return this.eventEmitter.on('restoreComplete', callback) + } + + getLeaderId(): string | null { + return this.leaderId + } + + getIsLeader(): boolean { + return this.isLeader + } + + // Collaborative undo/redo methods + undo(): boolean { + if (!this.undoManager) + return false + + const canUndo = this.undoManager.canUndo() + if (canUndo) { + this.isUndoRedoInProgress = true + const result = this.undoManager.undo() + + // After undo, manually update React state from CRDT without triggering collaboration + const reactFlowStore = this.reactFlowStore + if (result && reactFlowStore) { + requestAnimationFrame(() => { + // Get ReactFlow's native setters, not the collaborative ones + const state = reactFlowStore.getState() + const updatedNodes = Array.from(this.nodesMap?.values() || []) as Node[] + const updatedEdges = Array.from(this.edgesMap?.values() || []) as Edge[] + // Call ReactFlow's native setters directly to avoid triggering collaboration + state.setNodes(updatedNodes) + state.setEdges(updatedEdges) + + this.isUndoRedoInProgress = false + + // Emit event to update UI button states + this.eventEmitter.emit('undoRedoStateChange', { + canUndo: this.undoManager?.canUndo() || false, + canRedo: this.undoManager?.canRedo() || false, + }) + }) + } + else { + this.isUndoRedoInProgress = false + } + + return result + } + + return false + } + + redo(): boolean { + if (!this.undoManager) + return false + + const canRedo = this.undoManager.canRedo() + if (canRedo) { + this.isUndoRedoInProgress = true + const result = this.undoManager.redo() + + // After redo, manually update React state from CRDT without triggering collaboration + const reactFlowStore = this.reactFlowStore + if (result && reactFlowStore) { + requestAnimationFrame(() => { + // Get ReactFlow's native setters, not the collaborative ones + const state = reactFlowStore.getState() + const updatedNodes = Array.from(this.nodesMap?.values() || []) as Node[] + const updatedEdges = 
Array.from(this.edgesMap?.values() || []) as Edge[] + // Call ReactFlow's native setters directly to avoid triggering collaboration + state.setNodes(updatedNodes) + state.setEdges(updatedEdges) + + this.isUndoRedoInProgress = false + + // Emit event to update UI button states + this.eventEmitter.emit('undoRedoStateChange', { + canUndo: this.undoManager?.canUndo() || false, + canRedo: this.undoManager?.canRedo() || false, + }) + }) + } + else { + this.isUndoRedoInProgress = false + } + + return result + } + + return false + } + + canUndo(): boolean { + if (!this.undoManager) + return false + return this.undoManager.canUndo() + } + + canRedo(): boolean { + if (!this.undoManager) + return false + return this.undoManager.canRedo() + } + + clearUndoStack(): void { + if (!this.undoManager) + return + this.undoManager.clear() + } + + private syncNodes(oldNodes: Node[], newNodes: Node[]): void { + if (!this.nodesMap || !this.doc) + return + + const oldNodesMap = new Map(oldNodes.map(node => [node.id, node])) + const newNodesMap = new Map(newNodes.map(node => [node.id, node])) + + oldNodes.forEach((oldNode) => { + if (!newNodesMap.has(oldNode.id)) { + this.nodesMap?.delete(oldNode.id) + } + }) + + newNodes.forEach((newNode) => { + const oldNode = oldNodesMap.get(newNode.id) + if (oldNode && oldNode === newNode) + return + if (oldNode && isEqual(oldNode, newNode)) + return + + const nodeContainer = this.getNodeContainer(newNode.id) + this.populateNodeContainer(nodeContainer, newNode) + }) + } + + private syncEdges(oldEdges: Edge[], newEdges: Edge[]): void { + if (!this.edgesMap) + return + + const oldEdgesMap = new Map(oldEdges.map(edge => [edge.id, edge])) + const newEdgesMap = new Map(newEdges.map(edge => [edge.id, edge])) + + oldEdges.forEach((oldEdge) => { + if (!newEdgesMap.has(oldEdge.id)) { + this.edgesMap?.delete(oldEdge.id) + } + }) + + newEdges.forEach((newEdge) => { + const oldEdge = oldEdgesMap.get(newEdge.id) + if (!oldEdge || !isEqual(oldEdge, newEdge)) { + 
const clonedEdge = toLoroRecord(newEdge) + this.edgesMap?.set(newEdge.id, clonedEdge) + } + }) + } + + private setupSubscriptions(): void { + this.nodesMap?.subscribe((event: LoroSubscribeEvent) => { + const reactFlowStore = this.reactFlowStore + if (event.by === 'import' && reactFlowStore) { + // Don't update React nodes during undo/redo to prevent loops + if (this.isUndoRedoInProgress) + return + + requestAnimationFrame(() => { + const state = reactFlowStore.getState() + const previousNodes: Node[] = state.getNodes() + const previousNodeMap = new Map(previousNodes.map(node => [node.id, node])) + const selectedIds = new Set( + previousNodes + .filter(node => node.data?.selected) + .map(node => node.id), + ) + + this.pendingInitialSync = false + + const updatedNodes = Array + .from(this.nodesMap?.keys() || []) + .map((nodeId) => { + const node = this.exportNode(nodeId as string) + const clonedNode: Node = { + ...node, + data: { + ...(node.data || {}), + }, + } + const clonedNodeData = clonedNode.data as (CommonNodeType & Record) + // Keep the previous node's private data properties (starting with _) + const previousNode = previousNodeMap.get(clonedNode.id) + if (previousNode?.data) { + const previousData = previousNode.data as Record + Object.entries(previousData) + .filter(([key]) => key.startsWith('_')) + .forEach(([key, value]) => { + if (!(key in clonedNodeData)) + clonedNodeData[key] = value + }) + } + + if (selectedIds.has(clonedNode.id)) + clonedNode.data.selected = true + + return clonedNode + }) + + // Call ReactFlow's native setter directly to avoid triggering collaboration + state.setNodes(updatedNodes) + + this.scheduleGraphImportEmit() + }) + } + }) + + this.edgesMap?.subscribe((event: LoroSubscribeEvent) => { + const reactFlowStore = this.reactFlowStore + if (event.by === 'import' && reactFlowStore) { + // Don't update React edges during undo/redo to prevent loops + if (this.isUndoRedoInProgress) + return + + requestAnimationFrame(() => { + // Get 
ReactFlow's native setters, not the collaborative ones + const state = reactFlowStore.getState() + const updatedEdges = Array.from(this.edgesMap?.values() || []) as Edge[] + + this.pendingInitialSync = false + + // Call ReactFlow's native setter directly to avoid triggering collaboration + state.setEdges(updatedEdges) + + this.scheduleGraphImportEmit() + }) + } + }) + } + + private scheduleGraphImportEmit(): void { + if (this.pendingGraphImportEmit) + return + + this.pendingGraphImportEmit = true + requestAnimationFrame(() => { + this.pendingGraphImportEmit = false + const mergedNodes = this.mergeLocalNodeState(this.getNodes()) + this.eventEmitter.emit('graphImport', { + nodes: mergedNodes, + edges: this.getEdges(), + }) + }) + } + + refreshGraphSynchronously(): void { + const mergedNodes = this.mergeLocalNodeState(this.getNodes()) + this.eventEmitter.emit('graphImport', { + nodes: mergedNodes, + edges: this.getEdges(), + }) + } + + private mergeLocalNodeState(nodes: Node[]): Node[] { + const reactFlowStore = this.reactFlowStore + const state = reactFlowStore?.getState() + const localNodes = state?.getNodes() || [] + + if (localNodes.length === 0) + return nodes + + const localNodesMap = new Map(localNodes.map(node => [node.id, node])) + return nodes.map((node) => { + const localNode = localNodesMap.get(node.id) + if (!localNode) + return node + + const nextNode = cloneDeep(node) + const nextData = { ...(nextNode.data || {}) } as Node['data'] + const nextDataRecord = nextData as Record + const localData = localNode.data as Record | undefined + + if (localData) { + Object.entries(localData).forEach(([key, value]) => { + if (key === 'selected' || key.startsWith('_')) + nextDataRecord[key] = value + }) + } + + if (!Object.prototype.hasOwnProperty.call(nextDataRecord, 'selected') && localNode.selected !== undefined) + nextDataRecord.selected = localNode.selected + + nextNode.data = nextData + return nextNode + }) + } + + private setupSocketEventListeners(socket: 
Socket): void { + socket.on('collaboration_update', (update: CollaborationUpdate) => { + if (update.type === 'mouse_move') { + // Update cursor state for this user + const data = update.data as { x: number, y: number } + this.cursors[update.userId] = { + x: data.x, + y: data.y, + userId: update.userId, + timestamp: update.timestamp, + } + + this.eventEmitter.emit('cursors', { ...this.cursors }) + } + else if (update.type === 'vars_and_features_update') { + this.eventEmitter.emit('varsAndFeaturesUpdate', update) + } + else if (update.type === 'app_state_update') { + this.eventEmitter.emit('appStateUpdate', update) + } + else if (update.type === 'app_meta_update') { + this.eventEmitter.emit('appMetaUpdate', update) + } + else if (update.type === 'app_publish_update') { + this.eventEmitter.emit('appPublishUpdate', update) + } + else if (update.type === 'mcp_server_update') { + this.eventEmitter.emit('mcpServerUpdate', update) + } + else if (update.type === 'workflow_update') { + this.eventEmitter.emit('workflowUpdate', update.data) + } + else if (update.type === 'comments_update') { + this.eventEmitter.emit('commentsUpdate', update.data) + } + else if (update.type === 'node_panel_presence') { + this.applyNodePanelPresenceUpdate(update.data as NodePanelPresenceEventData) + } + else if (update.type === 'sync_request') { + // Only process if we are the leader + if (this.isLeader) + this.eventEmitter.emit('syncRequest', {}) + } + else if (update.type === 'graph_resync_request') { + if (this.isLeader) + this.broadcastCurrentGraph() + } + else if (update.type === 'workflow_restore_request') { + if (this.isLeader) + this.eventEmitter.emit('restoreRequest', update.data as RestoreRequestData) + } + else if (update.type === 'workflow_restore_intent') { + this.eventEmitter.emit('restoreIntent', update.data as RestoreIntentData) + } + else if (update.type === 'workflow_restore_complete') { + this.eventEmitter.emit('restoreComplete', update.data as RestoreCompleteData) + } + }) + 
+ socket.on('online_users', (data: { users: OnlineUser[], leader?: string }) => { + try { + if (!data || !Array.isArray(data.users)) { + console.warn('Invalid online_users data structure:', data) + return + } + + const onlineUserIds = new Set(data.users.map((user: OnlineUser) => user.user_id)) + const onlineClientIds = new Set( + data.users + .map((user: OnlineUser) => user.sid) + .filter((sid): sid is string => typeof sid === 'string' && sid.length > 0), + ) + + // Remove cursors for offline users + Object.keys(this.cursors).forEach((userId) => { + if (!onlineUserIds.has(userId)) + delete this.cursors[userId] + }) + + this.cleanupNodePanelPresence(onlineClientIds, onlineUserIds) + + // Update leader information + if (data.leader && typeof data.leader === 'string') + this.leaderId = data.leader + + this.eventEmitter.emit('onlineUsers', data.users) + this.eventEmitter.emit('cursors', { ...this.cursors }) + } + catch (error) { + console.error('Error processing online_users update:', error) + } + }) + + socket.on('status', (data: { isLeader: boolean }) => { + try { + if (!data || typeof data.isLeader !== 'boolean') { + console.warn('Invalid status data:', data) + return + } + + const wasLeader = this.isLeader + this.isLeader = data.isLeader + + if (this.isLeader) + this.pendingInitialSync = false + else + this.requestInitialSyncIfNeeded() + + if (wasLeader !== this.isLeader) + this.eventEmitter.emit('leaderChange', this.isLeader) + } + catch (error) { + console.error('Error processing status update:', error) + } + }) + + socket.on('connect', () => { + this.eventEmitter.emit('stateChange', { isConnected: true }) + this.pendingInitialSync = true + }) + + socket.on('disconnect', () => { + this.cursors = {} + this.isLeader = false + this.leaderId = null + this.pendingInitialSync = false + this.eventEmitter.emit('stateChange', { isConnected: false }) + this.eventEmitter.emit('cursors', {}) + }) + + socket.on('connect_error', (error: Error) => { + console.error('WebSocket 
connection error:', error) + this.eventEmitter.emit('stateChange', { isConnected: false, error: error.message }) + }) + + socket.on('error', (error: Error) => { + console.error('WebSocket error:', error) + }) + } + + // We currently only relay CRDT updates; the server doesn't persist them. + // When a follower joins mid-session, it might miss earlier broadcasts and render stale data. + // This lightweight checkpoint asks the leader to rebroadcast the latest graph snapshot once. + private requestInitialSyncIfNeeded(): void { + if (!this.pendingInitialSync) + return + if (this.isLeader) { + this.pendingInitialSync = false + return + } + + this.emitGraphResyncRequest() + this.pendingInitialSync = false + } + + private emitGraphResyncRequest(): void { + if (!this.currentAppId || !webSocketClient.isConnected(this.currentAppId)) + return + + this.sendCollaborationEvent({ + type: 'graph_resync_request', + data: { timestamp: Date.now() }, + timestamp: Date.now(), + }) + } + + private broadcastCurrentGraph(): void { + if (!this.currentAppId || !webSocketClient.isConnected(this.currentAppId)) + return + if (!this.doc) + return + + const socket = webSocketClient.getSocket(this.currentAppId) + if (!socket) + return + + try { + const snapshot = this.doc.export({ mode: 'snapshot' }) + this.sendGraphEvent(snapshot) + } + catch (error) { + console.error('Failed to broadcast graph snapshot:', error) + } + } +} + +export const collaborationManager = new CollaborationManager() diff --git a/web/app/components/workflow/collaboration/core/crdt-provider.ts b/web/app/components/workflow/collaboration/core/crdt-provider.ts new file mode 100644 index 0000000000..ce3fff4b32 --- /dev/null +++ b/web/app/components/workflow/collaboration/core/crdt-provider.ts @@ -0,0 +1,39 @@ +import type { LoroDoc } from 'loro-crdt' +import type { Socket } from 'socket.io-client' +import { emitWithAuthGuard } from './websocket-manager' + +export class CRDTProvider { + private doc: LoroDoc + private socket: 
Socket
+  private onUnauthorized?: () => void
+
+  constructor(socket: Socket, doc: LoroDoc, onUnauthorized?: () => void) {
+    this.socket = socket
+    this.doc = doc
+    this.onUnauthorized = onUnauthorized
+    this.setupEventListeners()
+  }
+
+  private setupEventListeners(): void {
+    this.doc.subscribe((event: { by?: string }) => {
+      if (event.by === 'local') {
+        const update = this.doc.export({ mode: 'update' })
+        emitWithAuthGuard(this.socket, 'graph_event', update, { onUnauthorized: this.onUnauthorized })
+      }
+    })
+
+    this.socket.on('graph_update', (updateData: Uint8Array) => {
+      try {
+        const data = new Uint8Array(updateData)
+        this.doc.import(data)
+      }
+      catch (error) {
+        console.error('Error importing graph update:', error)
+      }
+    })
+  }
+
+  destroy(): void {
+    this.socket.off('graph_update')
+  }
+}
diff --git a/web/app/components/workflow/collaboration/core/event-emitter.ts b/web/app/components/workflow/collaboration/core/event-emitter.ts
new file mode 100644
index 0000000000..b4f79b7922
--- /dev/null
+++ b/web/app/components/workflow/collaboration/core/event-emitter.ts
@@ -0,0 +1,51 @@
+export type EventHandler<T = any> = (data: T) => void
+
+export class EventEmitter {
+  private events: Map<string, Set<EventHandler<any>>> = new Map()
+
+  on<T>(event: string, handler: EventHandler<T>): () => void {
+    if (!this.events.has(event))
+      this.events.set(event, new Set())
+
+    this.events.get(event)!.add(handler as EventHandler<any>)
+
+    return () => this.off(event, handler)
+  }
+
+  off<T>(event: string, handler?: EventHandler<T>): void {
+    if (!this.events.has(event))
+      return
+
+    const handlers = this.events.get(event)!
+    if (handler)
+      handlers.delete(handler as EventHandler<any>)
+    else
+      handlers.clear()
+
+    if (handlers.size === 0)
+      this.events.delete(event)
+  }
+
+  emit<T>(event: string, data: T): void {
+    if (!this.events.has(event))
+      return
+
+    const handlers = this.events.get(event)!
+    handlers.forEach((handler) => {
+      try {
+        handler(data)
+      }
+      catch (error) {
+        console.error(`Error in event handler for ${event}:`, error)
+      }
+    })
+  }
+
+  removeAllListeners(): void {
+    this.events.clear()
+  }
+
+  getListenerCount(event: string): number {
+    return this.events.get(event)?.size || 0
+  }
+}
diff --git a/web/app/components/workflow/collaboration/core/websocket-manager.ts b/web/app/components/workflow/collaboration/core/websocket-manager.ts
new file mode 100644
index 0000000000..62ffe2cf0c
--- /dev/null
+++ b/web/app/components/workflow/collaboration/core/websocket-manager.ts
@@ -0,0 +1,157 @@
+import type { Socket } from 'socket.io-client'
+import type { DebugInfo, WebSocketConfig } from '../types/websocket'
+import { io } from 'socket.io-client'
+import { SOCKET_URL } from '@/config'
+
+type AckArgs = unknown[]
+
+const isUnauthorizedAck = (...ackArgs: AckArgs): boolean => {
+  const [first, second] = ackArgs
+
+  if (second === 401 || first === 401)
+    return true
+
+  if (first && typeof first === 'object' && 'msg' in first) {
+    const message = (first as { msg?: unknown }).msg
+    return message === 'unauthorized'
+  }
+
+  return false
+}
+
+export type EmitAckOptions = {
+  onAck?: (...ackArgs: AckArgs) => void
+  onUnauthorized?: (...ackArgs: AckArgs) => void
+}
+
+export const emitWithAuthGuard = (
+  socket: Socket | null | undefined,
+  event: string,
+  payload: unknown,
+  options?: EmitAckOptions,
+): void => {
+  if (!socket)
+    return
+
+  socket.emit(
+    event,
+    payload,
+    (...ackArgs: AckArgs) => {
+      options?.onAck?.(...ackArgs)
+      if (isUnauthorizedAck(...ackArgs))
+        options?.onUnauthorized?.(...ackArgs)
+    },
+  )
+}
+
+export class WebSocketClient {
+  private connections: Map<string, Socket> = new Map()
+  private connecting: Set<string> = new Set()
+  private readonly url: string
+  private readonly transports: WebSocketConfig['transports']
+  private readonly withCredentials?: boolean
+
+  constructor(config: WebSocketConfig = {}) {
+    this.url = SOCKET_URL
+    this.transports
= config.transports || ['websocket'] + this.withCredentials = config.withCredentials !== false + } + + connect(appId: string): Socket { + const existingSocket = this.connections.get(appId) + if (existingSocket?.connected) + return existingSocket + + if (this.connecting.has(appId)) { + const pendingSocket = this.connections.get(appId) + if (pendingSocket) + return pendingSocket + } + + if (existingSocket && !existingSocket.connected) { + existingSocket.disconnect() + this.connections.delete(appId) + } + + this.connecting.add(appId) + + const socketOptions: { + path: string + transports: WebSocketConfig['transports'] + withCredentials?: boolean + } = { + path: '/socket.io', + transports: this.transports, + withCredentials: this.withCredentials, + } + + const socket = io(this.url, socketOptions) + + this.connections.set(appId, socket) + this.setupBaseEventListeners(socket, appId) + + return socket + } + + disconnect(appId?: string): void { + if (appId) { + const socket = this.connections.get(appId) + if (socket) { + socket.disconnect() + this.connections.delete(appId) + this.connecting.delete(appId) + } + } + else { + this.connections.forEach(socket => socket.disconnect()) + this.connections.clear() + this.connecting.clear() + } + } + + getSocket(appId: string): Socket | null { + return this.connections.get(appId) || null + } + + isConnected(appId: string): boolean { + return this.connections.get(appId)?.connected || false + } + + getConnectedApps(): string[] { + const connectedApps: string[] = [] + this.connections.forEach((socket, appId) => { + if (socket.connected) + connectedApps.push(appId) + }) + return connectedApps + } + + getDebugInfo(): DebugInfo { + const info: DebugInfo = {} + this.connections.forEach((socket, appId) => { + info[appId] = { + connected: socket.connected, + connecting: this.connecting.has(appId), + socketId: socket.id, + } + }) + return info + } + + private setupBaseEventListeners(socket: Socket, appId: string): void { + socket.on('connect', 
() => {
+      this.connecting.delete(appId)
+      emitWithAuthGuard(socket, 'user_connect', { workflow_id: appId })
+    })
+
+    socket.on('disconnect', () => {
+      this.connecting.delete(appId)
+    })
+
+    socket.on('connect_error', () => {
+      this.connecting.delete(appId)
+    })
+  }
+}
+
+export const webSocketClient = new WebSocketClient()
diff --git a/web/app/components/workflow/collaboration/hooks/use-collaboration.ts b/web/app/components/workflow/collaboration/hooks/use-collaboration.ts
new file mode 100644
index 0000000000..a8715d7571
--- /dev/null
+++ b/web/app/components/workflow/collaboration/hooks/use-collaboration.ts
@@ -0,0 +1,144 @@
+import type { ReactFlowInstance } from 'reactflow'
+import type {
+  CollaborationState,
+  CursorPosition,
+  NodePanelPresenceMap,
+  OnlineUser,
+} from '../types/collaboration'
+import { useEffect, useRef, useState } from 'react'
+import Toast from '@/app/components/base/toast'
+import { useGlobalPublicStore } from '@/context/global-public-context'
+import { collaborationManager } from '../core/collaboration-manager'
+import { CursorService } from '../services/cursor-service'
+
+type CollaborationViewState = {
+  isConnected: boolean
+  onlineUsers: OnlineUser[]
+  cursors: Record<string, CursorPosition>
+  nodePanelPresence: NodePanelPresenceMap
+  isLeader: boolean
+}
+
+type ReactFlowStore = NonNullable<Parameters<typeof collaborationManager.connect>[1]>
+
+const initialState: CollaborationViewState = {
+  isConnected: false,
+  onlineUsers: [],
+  cursors: {},
+  nodePanelPresence: {},
+  isLeader: false,
+}
+
+export function useCollaboration(appId: string, reactFlowStore?: ReactFlowStore) {
+  const [state, setState] = useState(initialState)
+
+  const cursorServiceRef = useRef<CursorService | null>(null)
+  const isCollaborationEnabled = useGlobalPublicStore(s => s.systemFeatures.enable_collaboration_mode)
+
+  useEffect(() => {
+    if (!appId || !isCollaborationEnabled) {
+      Promise.resolve().then(() => {
+        setState(initialState)
+      })
+      return
+    }
+
+    let connectionId: string | null = null
+    let isUnmounted = false
+
+    if
(!cursorServiceRef.current)
+      cursorServiceRef.current = new CursorService()
+
+    const initCollaboration = async () => {
+      try {
+        const id = await collaborationManager.connect(appId, reactFlowStore)
+        if (isUnmounted) {
+          collaborationManager.disconnect(id)
+          return
+        }
+        connectionId = id
+        setState(prev => ({ ...prev, isConnected: collaborationManager.isConnected() }))
+      }
+      catch (error) {
+        console.error('Failed to initialize collaboration:', error)
+      }
+    }
+
+    initCollaboration()
+
+    const unsubscribeStateChange = collaborationManager.onStateChange((newState: Partial<CollaborationState>) => {
+      if (newState.isConnected === undefined)
+        return
+
+      setState(prev => ({ ...prev, isConnected: newState.isConnected ?? prev.isConnected }))
+    })
+
+    const unsubscribeCursors = collaborationManager.onCursorUpdate((cursors: Record<string, CursorPosition>) => {
+      setState(prev => ({ ...prev, cursors }))
+    })
+
+    const unsubscribeUsers = collaborationManager.onOnlineUsersUpdate((users: OnlineUser[]) => {
+      setState(prev => ({ ...prev, onlineUsers: users }))
+    })
+
+    const unsubscribeNodePanelPresence = collaborationManager.onNodePanelPresenceUpdate((presence: NodePanelPresenceMap) => {
+      setState(prev => ({ ...prev, nodePanelPresence: presence }))
+    })
+
+    const unsubscribeLeaderChange = collaborationManager.onLeaderChange((isLeader: boolean) => {
+      setState(prev => ({ ...prev, isLeader }))
+    })
+
+    return () => {
+      isUnmounted = true
+      unsubscribeStateChange()
+      unsubscribeCursors()
+      unsubscribeUsers()
+      unsubscribeNodePanelPresence()
+      unsubscribeLeaderChange()
+      cursorServiceRef.current?.stopTracking()
+      if (connectionId)
+        collaborationManager.disconnect(connectionId)
+    }
+  }, [appId, reactFlowStore, isCollaborationEnabled])
+
+  const prevIsConnected = useRef(false)
+  useEffect(() => {
+    if (prevIsConnected.current && !state.isConnected) {
+      Toast.notify({
+        type: 'error',
+        message: 'Network connection lost. Please check your network.',
+      })
+    }
+    prevIsConnected.current = state.isConnected || false
+  }, [state.isConnected])
+
+  const startCursorTracking = (containerRef: React.RefObject<HTMLElement>, reactFlowInstance?: ReactFlowInstance) => {
+    if (!isCollaborationEnabled || !cursorServiceRef.current)
+      return
+
+    cursorServiceRef.current.startTracking(containerRef, (position) => {
+      collaborationManager.emitCursorMove(position)
+    }, reactFlowInstance)
+  }
+
+  const stopCursorTracking = () => {
+    cursorServiceRef.current?.stopTracking()
+  }
+
+  const result = {
+    isConnected: state.isConnected || false,
+    onlineUsers: state.onlineUsers || [],
+    cursors: state.cursors || {},
+    nodePanelPresence: state.nodePanelPresence || {},
+    isLeader: state.isLeader || false,
+    leaderId: collaborationManager.getLeaderId(),
+    isEnabled: isCollaborationEnabled,
+    startCursorTracking,
+    stopCursorTracking,
+  }
+
+  return result
+}
diff --git a/web/app/components/workflow/collaboration/index.ts b/web/app/components/workflow/collaboration/index.ts
new file mode 100644
index 0000000000..dd283f9d2e
--- /dev/null
+++ b/web/app/components/workflow/collaboration/index.ts
@@ -0,0 +1,5 @@
+export { collaborationManager } from './core/collaboration-manager'
+export { webSocketClient } from './core/websocket-manager'
+export { useCollaboration } from './hooks/use-collaboration'
+export { CursorService } from './services/cursor-service'
+export * from './types'
diff --git a/web/app/components/workflow/collaboration/services/cursor-service.ts b/web/app/components/workflow/collaboration/services/cursor-service.ts
new file mode 100644
index 0000000000..7af6f2f27f
--- /dev/null
+++ b/web/app/components/workflow/collaboration/services/cursor-service.ts
@@ -0,0 +1,90 @@
+import type { RefObject } from 'react'
+import type { ReactFlowInstance } from 'reactflow'
+import type { CursorPosition } from '../types/collaboration'
+
+const CURSOR_MIN_MOVE_DISTANCE = 10
+const
CURSOR_THROTTLE_MS = 300
+
+export class CursorService {
+  private containerRef: RefObject<HTMLElement> | null = null
+  private reactFlowInstance: ReactFlowInstance | null = null
+  private isTracking = false
+  private onCursorUpdate: ((cursors: Record<string, CursorPosition>) => void) | null = null
+  private onEmitPosition: ((position: CursorPosition) => void) | null = null
+  private lastEmitTime = 0
+  private lastPosition: { x: number, y: number } | null = null
+
+  startTracking(
+    containerRef: RefObject<HTMLElement>,
+    onEmitPosition: (position: CursorPosition) => void,
+    reactFlowInstance?: ReactFlowInstance,
+  ): void {
+    if (this.isTracking)
+      this.stopTracking()
+
+    this.containerRef = containerRef
+    this.onEmitPosition = onEmitPosition
+    this.reactFlowInstance = reactFlowInstance || null
+    this.isTracking = true
+
+    if (containerRef.current)
+      containerRef.current.addEventListener('mousemove', this.handleMouseMove)
+  }
+
+  stopTracking(): void {
+    if (this.containerRef?.current)
+      this.containerRef.current.removeEventListener('mousemove', this.handleMouseMove)
+
+    this.containerRef = null
+    this.reactFlowInstance = null
+    this.onEmitPosition = null
+    this.isTracking = false
+    this.lastPosition = null
+  }
+
+  setCursorUpdateHandler(handler: (cursors: Record<string, CursorPosition>) => void): void {
+    this.onCursorUpdate = handler
+  }
+
+  updateCursors(cursors: Record<string, CursorPosition>): void {
+    if (this.onCursorUpdate)
+      this.onCursorUpdate(cursors)
+  }
+
+  private handleMouseMove = (event: MouseEvent): void => {
+    if (!this.containerRef?.current || !this.onEmitPosition)
+      return
+
+    const rect = this.containerRef.current.getBoundingClientRect()
+    let x = event.clientX - rect.left
+    let y = event.clientY - rect.top
+
+    // Transform coordinates to ReactFlow world coordinates if ReactFlow instance is available
+    if (this.reactFlowInstance) {
+      const viewport = this.reactFlowInstance.getViewport()
+      // Convert screen coordinates to world coordinates
+      // World coordinates = (screen coordinates - viewport translation) / zoom
+      x = (x - viewport.x) /
viewport.zoom
+      y = (y - viewport.y) / viewport.zoom
+    }
+
+    // No container-boundary check here: transformed world coordinates can legitimately be negative
+    const now = Date.now()
+    const timeThrottled = now - this.lastEmitTime > CURSOR_THROTTLE_MS
+    const minDistance = CURSOR_MIN_MOVE_DISTANCE / (this.reactFlowInstance?.getZoom() || 1)
+    const distanceThrottled = !this.lastPosition
+      || (Math.abs(x - this.lastPosition.x) > minDistance)
+      || (Math.abs(y - this.lastPosition.y) > minDistance)
+
+    if (timeThrottled && distanceThrottled) {
+      this.lastPosition = { x, y }
+      this.lastEmitTime = now
+      this.onEmitPosition({
+        x,
+        y,
+        userId: '',
+        timestamp: now,
+      })
+    }
+  }
+}
diff --git a/web/app/components/workflow/collaboration/types/collaboration.ts b/web/app/components/workflow/collaboration/types/collaboration.ts
new file mode 100644
index 0000000000..d94ad25759
--- /dev/null
+++ b/web/app/components/workflow/collaboration/types/collaboration.ts
@@ -0,0 +1,103 @@
+import type { Viewport } from 'reactflow'
+import type { ConversationVariable, Edge, EnvironmentVariable, Node } from '../../types'
+import type { Features } from '@/app/components/base/features/types'
+
+export type OnlineUser = {
+  user_id: string
+  username: string
+  avatar: string
+  sid: string
+}
+
+export type WorkflowOnlineUsers = {
+  workflow_id: string
+  users: OnlineUser[]
+}
+
+export type OnlineUserListResponse = {
+  data: WorkflowOnlineUsers[]
+}
+
+export type CursorPosition = {
+  x: number
+  y: number
+  userId: string
+  timestamp: number
+}
+
+export type NodePanelPresenceUser = {
+  userId: string
+  username: string
+  avatar?: string | null
+}
+
+export type NodePanelPresenceInfo = NodePanelPresenceUser & {
+  clientId: string
+  timestamp: number
+}
+
+export type NodePanelPresenceMap = Record<string, NodePanelPresenceInfo[]>
+
+export type CollaborationState = {
+  appId: string
+  isConnected: boolean
+  onlineUsers: OnlineUser[]
+  cursors: Record<string, CursorPosition>
+  nodePanelPresence: NodePanelPresenceMap
+}
+
+export type GraphSyncData = {
+  nodes: Node[]
+  edges: Edge[]
+}
+
+export type CollaborationEventType
+  = | 'mouse_move'
+  | 'vars_and_features_update'
+  | 'sync_request'
+  | 'app_state_update'
+  | 'app_meta_update'
+  | 'mcp_server_update'
+  | 'workflow_update'
+  | 'comments_update'
+  | 'node_panel_presence'
+  | 'app_publish_update'
+  | 'graph_resync_request'
+  | 'workflow_restore_request'
+  | 'workflow_restore_intent'
+  | 'workflow_restore_complete'
+
+export type CollaborationUpdate = {
+  type: CollaborationEventType
+  userId: string
+  data: Record<string, unknown>
+  timestamp: number
+}
+
+export type RestoreRequestData = {
+  versionId: string
+  versionName?: string
+  initiatorUserId: string
+  initiatorName: string
+  graphData: {
+    nodes: Node[]
+    edges: Edge[]
+    viewport?: Viewport
+  }
+  features?: Features
+  environmentVariables?: EnvironmentVariable[]
+  conversationVariables?: ConversationVariable[]
+}
+
+export type RestoreIntentData = {
+  versionId: string
+  versionName?: string
+  initiatorUserId: string
+  initiatorName: string
+}
+
+export type RestoreCompleteData = {
+  versionId: string
+  success: boolean
+  error?: string
+}
diff --git a/web/app/components/workflow/collaboration/types/events.ts b/web/app/components/workflow/collaboration/types/events.ts
new file mode 100644
index 0000000000..69d228357e
--- /dev/null
+++ b/web/app/components/workflow/collaboration/types/events.ts
@@ -0,0 +1,34 @@
+export type CollaborationEvent<TData = unknown> = {
+  type: string
+  data: TData
+  timestamp: number
+}
+
+export type GraphUpdateEvent = {
+  type: 'graph_update'
+} & CollaborationEvent
+
+export type CursorMoveEvent = {
+  type: 'cursor_move'
+} & CollaborationEvent<{
+  x: number
+  y: number
+  userId: string
+}>
+
+export type UserConnectEvent = {
+  type: 'user_connect'
+} & CollaborationEvent<{
+  workflow_id: string
+}>
+
+export type OnlineUsersEvent = {
+  type: 'online_users'
+} & CollaborationEvent<{
+  users: Array<{
+    user_id: string
+    username: string
+    avatar: string
+    sid: string
+  }>
+}>
diff --git
a/web/app/components/workflow/collaboration/types/index.ts b/web/app/components/workflow/collaboration/types/index.ts new file mode 100644 index 0000000000..bcbe7353e1 --- /dev/null +++ b/web/app/components/workflow/collaboration/types/index.ts @@ -0,0 +1,3 @@ +export * from './collaboration' +export * from './events' +export * from './websocket' diff --git a/web/app/components/workflow/collaboration/types/websocket.ts b/web/app/components/workflow/collaboration/types/websocket.ts new file mode 100644 index 0000000000..dd89df323f --- /dev/null +++ b/web/app/components/workflow/collaboration/types/websocket.ts @@ -0,0 +1,15 @@ +export type WebSocketConfig = { + token?: string + transports?: string[] + withCredentials?: boolean +} + +export type ConnectionInfo = { + connected: boolean + connecting: boolean + socketId?: string +} + +export type DebugInfo = { + [appId: string]: ConnectionInfo +} diff --git a/web/app/components/workflow/collaboration/utils/user-color.ts b/web/app/components/workflow/collaboration/utils/user-color.ts new file mode 100644 index 0000000000..51aee6a038 --- /dev/null +++ b/web/app/components/workflow/collaboration/utils/user-color.ts @@ -0,0 +1,12 @@ +/** + * Generate a consistent color for a user based on their ID + * Used for cursor colors and avatar backgrounds + */ +export const getUserColor = (id: string): string => { + const colors = ['#155AEF', '#0BA5EC', '#444CE7', '#7839EE', '#4CA30D', '#0E9384', '#DD2590', '#FF4405', '#D92D20', '#F79009', '#828DAD'] + const hash = id.split('').reduce((a, b) => { + a = ((a << 5) - a) + b.charCodeAt(0) + return a & a + }, 0) + return colors[Math.abs(hash) % colors.length] +} diff --git a/web/app/components/workflow/comment-manager.tsx b/web/app/components/workflow/comment-manager.tsx new file mode 100644 index 0000000000..7d5d5f1653 --- /dev/null +++ b/web/app/components/workflow/comment-manager.tsx @@ -0,0 +1,34 @@ +import { useEventListener } from 'ahooks' +import { useWorkflowComment } from 
'./hooks/use-workflow-comment' +import { useWorkflowStore } from './store' + +const CommentManager = () => { + const workflowStore = useWorkflowStore() + const { handleCreateComment, handleCommentCancel } = useWorkflowComment() + + useEventListener('click', (e) => { + const { controlMode, mousePosition, pendingComment } = workflowStore.getState() + + if (controlMode === 'comment') { + const target = e.target as HTMLElement + const isInDropdown = target.closest('[data-mention-dropdown]') + const isInCommentInput = target.closest('[data-comment-input]') + const isOnCanvasPane = target.closest('.react-flow__pane') + + // Only when clicking on the React Flow canvas pane (background), + // and not inside comment input or its dropdown + if (!isInDropdown && !isInCommentInput && isOnCanvasPane) { + e.preventDefault() + e.stopPropagation() + if (pendingComment) + handleCommentCancel() + else + handleCreateComment(mousePosition) + } + } + }) + + return null +} + +export default CommentManager diff --git a/web/app/components/workflow/comment/comment-icon.spec.tsx b/web/app/components/workflow/comment/comment-icon.spec.tsx new file mode 100644 index 0000000000..aee8c64fa3 --- /dev/null +++ b/web/app/components/workflow/comment/comment-icon.spec.tsx @@ -0,0 +1,148 @@ +import type { WorkflowCommentList } from '@/service/workflow-comment' +import { fireEvent, render, screen } from '@testing-library/react' +import { beforeEach, describe, expect, it, vi } from 'vitest' +import { CommentIcon } from './comment-icon' + +type Position = { x: number, y: number } + +let mockUserId = 'user-1' + +const mockFlowToScreenPosition = vi.fn((position: Position) => position) +const mockScreenToFlowPosition = vi.fn((position: Position) => position) + +vi.mock('reactflow', () => ({ + useReactFlow: () => ({ + flowToScreenPosition: mockFlowToScreenPosition, + screenToFlowPosition: mockScreenToFlowPosition, + }), + useViewport: () => ({ + x: 0, + y: 0, + zoom: 1, + }), +})) + 
+vi.mock('@/context/app-context', () => ({ + useAppContext: () => ({ + userProfile: { + id: mockUserId, + name: 'User', + avatar_url: 'avatar', + }, + }), +})) + +vi.mock('@/app/components/base/user-avatar-list', () => ({ + UserAvatarList: ({ users }: { users: Array<{ id: string }> }) => ( +
<div>{users.map(user => user.id).join(',')}</div>
+ ), +})) + +vi.mock('./comment-preview', () => ({ + default: ({ onClick }: { onClick?: () => void }) => ( + + ), +})) + +const createComment = (overrides: Partial = {}): WorkflowCommentList => ({ + id: 'comment-1', + position_x: 0, + position_y: 0, + content: 'Hello', + created_by: 'user-1', + created_by_account: { + id: 'user-1', + name: 'Alice', + email: 'alice@example.com', + }, + created_at: 1, + updated_at: 2, + resolved: false, + mention_count: 0, + reply_count: 0, + participants: [], + ...overrides, +}) + +describe('CommentIcon', () => { + beforeEach(() => { + vi.clearAllMocks() + mockUserId = 'user-1' + }) + + it('toggles preview on hover when inactive', () => { + const comment = createComment() + const { container } = render( + , + ) + const marker = container.querySelector('[data-role="comment-marker"]') as HTMLElement + const hoverTarget = marker.firstElementChild as HTMLElement + + fireEvent.mouseEnter(hoverTarget) + expect(screen.getByTestId('comment-preview')).toBeInTheDocument() + + fireEvent.mouseLeave(hoverTarget) + expect(screen.queryByTestId('comment-preview')).not.toBeInTheDocument() + }) + + it('calls onPositionUpdate after dragging by author', () => { + const comment = createComment({ position_x: 0, position_y: 0 }) + const onClick = vi.fn() + const onPositionUpdate = vi.fn() + const { container } = render( + , + ) + const marker = container.querySelector('[data-role="comment-marker"]') as HTMLElement + + fireEvent.pointerDown(marker, { + pointerId: 1, + button: 0, + clientX: 100, + clientY: 100, + }) + fireEvent.pointerMove(marker, { + pointerId: 1, + clientX: 110, + clientY: 110, + }) + fireEvent.pointerUp(marker, { + pointerId: 1, + clientX: 110, + clientY: 110, + }) + + expect(mockScreenToFlowPosition).toHaveBeenCalledWith({ x: 10, y: 10 }) + expect(onPositionUpdate).toHaveBeenCalledWith({ x: 10, y: 10 }) + expect(onClick).not.toHaveBeenCalled() + }) + + it('calls onClick for non-author clicks', () => { + mockUserId = 'user-2' + const 
comment = createComment() + const onClick = vi.fn() + const { container } = render( + , + ) + const marker = container.querySelector('[data-role="comment-marker"]') as HTMLElement + + fireEvent.pointerDown(marker, { + pointerId: 1, + button: 0, + clientX: 50, + clientY: 60, + }) + fireEvent.pointerUp(marker, { + pointerId: 1, + clientX: 50, + clientY: 60, + }) + + expect(onClick).toHaveBeenCalledTimes(1) + }) +}) diff --git a/web/app/components/workflow/comment/comment-icon.tsx b/web/app/components/workflow/comment/comment-icon.tsx new file mode 100644 index 0000000000..301d7decfa --- /dev/null +++ b/web/app/components/workflow/comment/comment-icon.tsx @@ -0,0 +1,268 @@ +'use client' + +import type { FC, PointerEvent as ReactPointerEvent } from 'react' +import type { WorkflowCommentList } from '@/service/workflow-comment' +import { memo, useCallback, useMemo, useRef, useState } from 'react' +import { useReactFlow, useViewport } from 'reactflow' +import { UserAvatarList } from '@/app/components/base/user-avatar-list' +import { useAppContext } from '@/context/app-context' +import CommentPreview from './comment-preview' + +type CommentIconProps = { + comment: WorkflowCommentList + onClick: () => void + isActive?: boolean + onPositionUpdate?: (position: { x: number, y: number }) => void +} + +export const CommentIcon: FC = memo(({ comment, onClick, isActive = false, onPositionUpdate }) => { + const { flowToScreenPosition, screenToFlowPosition } = useReactFlow() + const viewport = useViewport() + const { userProfile } = useAppContext() + const isAuthor = comment.created_by_account?.id === userProfile?.id + const [showPreview, setShowPreview] = useState(false) + const [dragPosition, setDragPosition] = useState<{ x: number, y: number } | null>(null) + const [isDragging, setIsDragging] = useState(false) + const dragStateRef = useRef<{ + offsetX: number + offsetY: number + startX: number + startY: number + hasMoved: boolean + } | null>(null) + + const workflowContainerRect 
= typeof document !== 'undefined' + ? document.getElementById('workflow-container')?.getBoundingClientRect() + : null + const containerLeft = workflowContainerRect?.left ?? 0 + const containerTop = workflowContainerRect?.top ?? 0 + + const screenPosition = useMemo(() => { + return flowToScreenPosition({ + x: comment.position_x, + y: comment.position_y, + }) + }, [comment.position_x, comment.position_y, viewport.x, viewport.y, viewport.zoom, flowToScreenPosition]) + + const effectiveScreenPosition = dragPosition ?? screenPosition + const canvasPosition = useMemo(() => ({ + x: effectiveScreenPosition.x - containerLeft, + y: effectiveScreenPosition.y - containerTop, + }), [effectiveScreenPosition.x, effectiveScreenPosition.y, containerLeft, containerTop]) + const cursorClass = useMemo(() => { + if (!isAuthor) + return 'cursor-pointer' + if (isActive) + return isDragging ? 'cursor-grabbing' : '' + return isDragging ? 'cursor-grabbing' : 'cursor-pointer' + }, [isActive, isAuthor, isDragging]) + + const handlePointerDown = useCallback((event: ReactPointerEvent) => { + if (event.button !== 0) + return + + event.stopPropagation() + event.preventDefault() + + if (!isAuthor) { + if (event.currentTarget.dataset.role !== 'comment-preview') + setShowPreview(false) + return + } + + dragStateRef.current = { + offsetX: event.clientX - screenPosition.x, + offsetY: event.clientY - screenPosition.y, + startX: event.clientX, + startY: event.clientY, + hasMoved: false, + } + + setDragPosition(screenPosition) + setIsDragging(false) + + if (event.currentTarget.dataset.role !== 'comment-preview') + setShowPreview(false) + + if (event.currentTarget.setPointerCapture) + event.currentTarget.setPointerCapture(event.pointerId) + }, [isAuthor, screenPosition]) + + const handlePointerMove = useCallback((event: ReactPointerEvent) => { + const dragState = dragStateRef.current + if (!dragState) + return + + event.stopPropagation() + event.preventDefault() + + const nextX = event.clientX - 
dragState.offsetX + const nextY = event.clientY - dragState.offsetY + + if (!dragState.hasMoved) { + const distance = Math.hypot(event.clientX - dragState.startX, event.clientY - dragState.startY) + if (distance > 4) { + dragState.hasMoved = true + setIsDragging(true) + } + } + + setDragPosition({ x: nextX, y: nextY }) + }, []) + + const finishDrag = useCallback((event: ReactPointerEvent) => { + const dragState = dragStateRef.current + if (!dragState) + return false + + if (event.currentTarget.hasPointerCapture?.(event.pointerId)) + event.currentTarget.releasePointerCapture(event.pointerId) + + dragStateRef.current = null + setDragPosition(null) + setIsDragging(false) + return dragState.hasMoved + }, []) + + const handlePointerUp = useCallback((event: ReactPointerEvent) => { + event.stopPropagation() + event.preventDefault() + + const finalScreenPosition = dragPosition ?? screenPosition + const didDrag = finishDrag(event) + + setShowPreview(false) + + if (didDrag) { + if (onPositionUpdate) { + const flowPosition = screenToFlowPosition({ + x: finalScreenPosition.x, + y: finalScreenPosition.y, + }) + onPositionUpdate(flowPosition) + } + } + else if (!isActive) { + onClick() + } + }, [dragPosition, finishDrag, isActive, onClick, onPositionUpdate, screenPosition, screenToFlowPosition]) + + const handlePointerCancel = useCallback((event: ReactPointerEvent) => { + event.stopPropagation() + event.preventDefault() + finishDrag(event) + }, [finishDrag]) + + const handleMouseEnter = useCallback(() => { + if (isActive || isDragging) + return + setShowPreview(true) + }, [isActive, isDragging]) + + const handleMouseLeave = useCallback(() => { + setShowPreview(false) + }, []) + + const participants = useMemo(() => { + const list = comment.participants ?? 
[] + const author = comment.created_by_account + if (!author) + return [...list] + const rest = list.filter(user => user.id !== author.id) + return [author, ...rest] + }, [comment.created_by_account, comment.participants]) + + // Calculate dynamic width based on number of participants + const participantCount = participants.length + const maxVisible = Math.min(3, participantCount) + const showCount = participantCount > 3 + const avatarSize = 24 + const avatarSpacing = 4 // -space-x-1 is about 4px overlap + + // Width calculation: first avatar + (additional avatars * (size - spacing)) + padding + const dynamicWidth = Math.max(40, // minimum width + 8 + avatarSize + Math.max(0, (showCount ? 2 : maxVisible - 1)) * (avatarSize - avatarSpacing) + 8) + + const pointerEventHandlers = useMemo(() => ({ + onPointerDown: handlePointerDown, + onPointerMove: handlePointerMove, + onPointerUp: handlePointerUp, + onPointerCancel: handlePointerCancel, + }), [handlePointerCancel, handlePointerDown, handlePointerMove, handlePointerUp]) + + return ( + <> +
+
+
+
+
+
+
+
+
+
+
+ + {/* Preview panel */} + {showPreview && !isActive && ( +
setShowPreview(true)} + onMouseLeave={() => setShowPreview(false)} + > + { + setShowPreview(false) + onClick() + }} + /> +
+ )} + + ) +}, (prevProps, nextProps) => { + return ( + prevProps.comment.id === nextProps.comment.id + && prevProps.comment.position_x === nextProps.comment.position_x + && prevProps.comment.position_y === nextProps.comment.position_y + && prevProps.onClick === nextProps.onClick + && prevProps.isActive === nextProps.isActive + && prevProps.onPositionUpdate === nextProps.onPositionUpdate + ) +}) + +CommentIcon.displayName = 'CommentIcon' diff --git a/web/app/components/workflow/comment/comment-input.spec.tsx b/web/app/components/workflow/comment/comment-input.spec.tsx new file mode 100644 index 0000000000..aa60b0c8bc --- /dev/null +++ b/web/app/components/workflow/comment/comment-input.spec.tsx @@ -0,0 +1,106 @@ +import type { FC } from 'react' +import { fireEvent, render, screen } from '@testing-library/react' +import { beforeEach, describe, expect, it, vi } from 'vitest' +import { CommentInput } from './comment-input' + +type MentionInputProps = { + value: string + onChange: (value: string) => void + onSubmit: (content: string, mentionedUserIds: string[]) => void + placeholder?: string + autoFocus?: boolean + className?: string +} + +const stableT = (key: string, options?: { ns?: string }) => ( + options?.ns ? `${options.ns}.${key}` : key +) + +let mentionInputProps: MentionInputProps | null = null + +vi.mock('react-i18next', () => ({ + useTranslation: () => ({ + t: stableT, + }), +})) + +vi.mock('@/context/app-context', () => ({ + useAppContext: () => ({ + userProfile: { + id: 'user-1', + name: 'Alice', + avatar_url: 'avatar', + }, + }), +})) + +vi.mock('@/app/components/base/avatar', () => ({ + default: ({ name }: { name: string }) =>
{name}
, +})) + +vi.mock('./mention-input', () => ({ + MentionInput: ((props: MentionInputProps) => { + mentionInputProps = props + return ( + + ) + }) as FC, +})) + +describe('CommentInput', () => { + beforeEach(() => { + vi.clearAllMocks() + mentionInputProps = null + }) + + it('passes translated placeholder to mention input', () => { + render( + , + ) + + expect(mentionInputProps?.placeholder).toBe('workflow.comments.placeholder.add') + expect(mentionInputProps?.autoFocus).toBe(true) + }) + + it('calls onCancel when Escape is pressed', () => { + const onCancel = vi.fn() + + render( + , + ) + + fireEvent.keyDown(document, { key: 'Escape' }) + + expect(onCancel).toHaveBeenCalledTimes(1) + }) + + it('forwards mention submit to onSubmit', () => { + const onSubmit = vi.fn() + + render( + , + ) + + fireEvent.click(screen.getByTestId('mention-input')) + + expect(onSubmit).toHaveBeenCalledWith('Hello', ['user-2']) + }) +}) diff --git a/web/app/components/workflow/comment/comment-input.tsx b/web/app/components/workflow/comment/comment-input.tsx new file mode 100644 index 0000000000..891471ea4d --- /dev/null +++ b/web/app/components/workflow/comment/comment-input.tsx @@ -0,0 +1,175 @@ +import type { FC, PointerEvent as ReactPointerEvent } from 'react' +import { memo, useCallback, useEffect, useRef, useState } from 'react' +import { useTranslation } from 'react-i18next' +import Avatar from '@/app/components/base/avatar' +import { useAppContext } from '@/context/app-context' +import { cn } from '@/utils/classnames' +import { MentionInput } from './mention-input' + +type CommentInputProps = { + position: { x: number, y: number } + onSubmit: (content: string, mentionedUserIds: string[]) => void + onCancel: () => void + onPositionChange?: (position: { + pageX: number + pageY: number + elementX: number + elementY: number + }) => void +} + +export const CommentInput: FC = memo(({ position, onSubmit, onCancel, onPositionChange }) => { + const [content, setContent] = useState('') + const 
{ t } = useTranslation() + const { userProfile } = useAppContext() + const dragStateRef = useRef<{ + pointerId: number | null + startPointerX: number + startPointerY: number + startX: number + startY: number + active: boolean + } & { + endHandler?: (event: PointerEvent) => void + }>({ + pointerId: null, + startPointerX: 0, + startPointerY: 0, + startX: 0, + startY: 0, + active: false, + endHandler: undefined, + }) + + useEffect(() => { + const handleGlobalKeyDown = (e: KeyboardEvent) => { + if (e.key === 'Escape') { + e.preventDefault() + e.stopPropagation() + onCancel() + } + } + + document.addEventListener('keydown', handleGlobalKeyDown, true) + return () => { + document.removeEventListener('keydown', handleGlobalKeyDown, true) + } + }, [onCancel]) + + const handleMentionSubmit = useCallback((content: string, mentionedUserIds: string[]) => { + onSubmit(content, mentionedUserIds) + setContent('') + }, [onSubmit]) + + const handleDragPointerMove = useCallback((event: PointerEvent) => { + const state = dragStateRef.current + if (!state.active || (state.pointerId !== null && event.pointerId !== state.pointerId)) + return + if (!onPositionChange) + return + event.preventDefault() + const deltaX = event.clientX - state.startPointerX + const deltaY = event.clientY - state.startPointerY + onPositionChange({ + pageX: event.clientX, + pageY: event.clientY, + elementX: state.startX + deltaX, + elementY: state.startY + deltaY, + }) + }, [onPositionChange]) + + const stopDragging = useCallback((event?: PointerEvent) => { + const state = dragStateRef.current + if (!state.active) + return + if (event && state.pointerId !== null && event.pointerId !== state.pointerId) + return + state.active = false + state.pointerId = null + window.removeEventListener('pointermove', handleDragPointerMove) + if (state.endHandler) { + window.removeEventListener('pointerup', state.endHandler) + window.removeEventListener('pointercancel', state.endHandler) + state.endHandler = undefined + } + }, 
[handleDragPointerMove]) + + const handleDragPointerDown = useCallback((event: ReactPointerEvent) => { + if (event.button !== 0) + return + event.stopPropagation() + event.preventDefault() + if (!onPositionChange) + return + const endHandler = (pointerEvent: PointerEvent) => { + stopDragging(pointerEvent) + } + dragStateRef.current = { + pointerId: event.pointerId, + startPointerX: event.clientX, + startPointerY: event.clientY, + startX: position.x, + startY: position.y, + active: true, + endHandler, + } + window.addEventListener('pointermove', handleDragPointerMove, { passive: false }) + window.addEventListener('pointerup', endHandler) + window.addEventListener('pointercancel', endHandler) + }, [handleDragPointerMove, onPositionChange, position.x, position.y, stopDragging]) + + useEffect(() => () => { + stopDragging() + }, [stopDragging]) + + return ( +
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ +
+
+
+
+ ) +}) + +CommentInput.displayName = 'CommentInput' diff --git a/web/app/components/workflow/comment/comment-preview.spec.tsx b/web/app/components/workflow/comment/comment-preview.spec.tsx new file mode 100644 index 0000000000..d411c67ecd --- /dev/null +++ b/web/app/components/workflow/comment/comment-preview.spec.tsx @@ -0,0 +1,86 @@ +import type { WorkflowCommentList } from '@/service/workflow-comment' +import { fireEvent, render, screen } from '@testing-library/react' +import { beforeEach, describe, expect, it, vi } from 'vitest' +import CommentPreview from './comment-preview' + +type UserProfile = WorkflowCommentList['created_by_account'] + +const mockSetHovering = vi.fn() +let capturedUsers: UserProfile[] = [] + +vi.mock('@/app/components/base/user-avatar-list', () => ({ + UserAvatarList: ({ users }: { users: UserProfile[] }) => { + capturedUsers = users + return
{users.map(user => user.id).join(',')}
+ }, +})) + +vi.mock('@/hooks/use-format-time-from-now', () => ({ + useFormatTimeFromNow: () => ({ + formatTimeFromNow: (value: number) => `time:${value}`, + }), +})) + +vi.mock('../store', () => ({ + useStore: (selector: (state: { setCommentPreviewHovering: (value: boolean) => void }) => unknown) => + selector({ setCommentPreviewHovering: mockSetHovering }), +})) + +const createComment = (overrides: Partial = {}): WorkflowCommentList => { + const author = { id: 'user-1', name: 'Alice', email: 'alice@example.com' } + const participant = { id: 'user-2', name: 'Bob', email: 'bob@example.com' } + + return { + id: 'comment-1', + position_x: 0, + position_y: 0, + content: 'Hello', + created_by: author.id, + created_by_account: author, + created_at: 1, + updated_at: 10, + resolved: false, + mention_count: 0, + reply_count: 0, + participants: [author, participant], + ...overrides, + } +} + +describe('CommentPreview', () => { + beforeEach(() => { + vi.clearAllMocks() + capturedUsers = [] + }) + + it('orders participants with author first and formats time', () => { + const comment = createComment() + + render() + + expect(capturedUsers.map(user => user.id)).toEqual(['user-1', 'user-2']) + expect(screen.getByText('Hello')).toBeInTheDocument() + expect(screen.getByText('time:10000')).toBeInTheDocument() + }) + + it('updates hover state on enter and leave', () => { + const comment = createComment() + const { container } = render() + const root = container.firstElementChild as HTMLElement + + fireEvent.mouseEnter(root) + fireEvent.mouseLeave(root) + + expect(mockSetHovering).toHaveBeenCalledWith(true) + expect(mockSetHovering).toHaveBeenCalledWith(false) + }) + + it('clears hover state on unmount', () => { + const comment = createComment() + const { unmount } = render() + + unmount() + + expect(mockSetHovering).toHaveBeenCalledWith(false) + }) +}) diff --git a/web/app/components/workflow/comment/comment-preview.tsx b/web/app/components/workflow/comment/comment-preview.tsx new 
file mode 100644 index 0000000000..30816d9cf3 --- /dev/null +++ b/web/app/components/workflow/comment/comment-preview.tsx @@ -0,0 +1,59 @@ +'use client' + +import type { FC } from 'react' +import type { WorkflowCommentList } from '@/service/workflow-comment' +import { memo, useEffect, useMemo } from 'react' +import { UserAvatarList } from '@/app/components/base/user-avatar-list' +import { useFormatTimeFromNow } from '@/hooks/use-format-time-from-now' +import { useStore } from '../store' + +type CommentPreviewProps = { + comment: WorkflowCommentList + onClick?: () => void +} + +const CommentPreview: FC = ({ comment, onClick }) => { + const { formatTimeFromNow } = useFormatTimeFromNow() + const setCommentPreviewHovering = useStore(s => s.setCommentPreviewHovering) + const participants = useMemo(() => { + const list = comment.participants ?? [] + const author = comment.created_by_account + if (!author) + return [...list] + const rest = list.filter(user => user.id !== author.id) + return [author, ...rest] + }, [comment.created_by_account, comment.participants]) + useEffect(() => () => { + setCommentPreviewHovering(false) + }, [setCommentPreviewHovering]) + + return ( +
setCommentPreviewHovering(true)} + onMouseLeave={() => setCommentPreviewHovering(false)} + > +
+ +
+ +
+
+
{comment.created_by_account?.name}
+
+ {formatTimeFromNow(comment.updated_at * 1000)} +
+
+
+ +
{comment.content}
+
+ ) +} + +export default memo(CommentPreview) diff --git a/web/app/components/workflow/comment/cursor.spec.tsx b/web/app/components/workflow/comment/cursor.spec.tsx new file mode 100644 index 0000000000..74cf222c1a --- /dev/null +++ b/web/app/components/workflow/comment/cursor.spec.tsx @@ -0,0 +1,45 @@ +import { render, screen } from '@testing-library/react' +import { beforeEach, describe, expect, it, vi } from 'vitest' +import { ControlMode } from '../types' +import { CommentCursor } from './cursor' + +const mockState = { + controlMode: ControlMode.Pointer, + mousePosition: { + elementX: 10, + elementY: 20, + }, +} + +vi.mock('@/app/components/base/icons/src/public/other', () => ({ + Comment: (props: { className?: string }) => , +})) + +vi.mock('../store', () => ({ + useStore: (selector: (state: typeof mockState) => unknown) => selector(mockState), +})) + +describe('CommentCursor', () => { + beforeEach(() => { + vi.clearAllMocks() + }) + + it('renders nothing when not in comment mode', () => { + mockState.controlMode = ControlMode.Pointer + + render() + + expect(screen.queryByTestId('comment-icon')).not.toBeInTheDocument() + }) + + it('renders at current mouse position when in comment mode', () => { + mockState.controlMode = ControlMode.Comment + + render() + + const icon = screen.getByTestId('comment-icon') + const container = icon.parentElement as HTMLElement + + expect(container).toHaveStyle({ left: '10px', top: '20px' }) + }) +}) diff --git a/web/app/components/workflow/comment/cursor.tsx b/web/app/components/workflow/comment/cursor.tsx new file mode 100644 index 0000000000..9afd4f1d8b --- /dev/null +++ b/web/app/components/workflow/comment/cursor.tsx @@ -0,0 +1,28 @@ +import type { FC } from 'react' +import { memo } from 'react' +import { Comment } from '@/app/components/base/icons/src/public/other' +import { useStore } from '../store' +import { ControlMode } from '../types' + +export const CommentCursor: FC = memo(() => { + const controlMode = useStore(s => 
s.controlMode) + const mousePosition = useStore(s => s.mousePosition) + + if (controlMode !== ControlMode.Comment) + return null + + return ( +
+
+
+ ) +}) + +CommentCursor.displayName = 'CommentCursor' diff --git a/web/app/components/workflow/comment/index.tsx b/web/app/components/workflow/comment/index.tsx new file mode 100644 index 0000000000..cb8e6ab04d --- /dev/null +++ b/web/app/components/workflow/comment/index.tsx @@ -0,0 +1,5 @@ +export { CommentIcon } from './comment-icon' +export { CommentInput } from './comment-input' +export { CommentCursor } from './cursor' +export { MentionInput } from './mention-input' +export { CommentThread } from './thread' diff --git a/web/app/components/workflow/comment/mention-input.tsx b/web/app/components/workflow/comment/mention-input.tsx new file mode 100644 index 0000000000..ed51feb90d --- /dev/null +++ b/web/app/components/workflow/comment/mention-input.tsx @@ -0,0 +1,661 @@ +'use client' + +import type { ReactNode } from 'react' +import type { UserProfile } from '@/service/workflow-comment' +import { RiArrowUpLine, RiAtLine, RiLoader2Line } from '@remixicon/react' +import { useParams } from 'next/navigation' +import { + forwardRef, + memo, + useCallback, + useEffect, + useImperativeHandle, + useLayoutEffect, + useMemo, + useRef, + useState, +} from 'react' +import { createPortal } from 'react-dom' +import { useTranslation } from 'react-i18next' +import Textarea from 'react-textarea-autosize' +import Avatar from '@/app/components/base/avatar' +import Button from '@/app/components/base/button' +import { EnterKey } from '@/app/components/base/icons/src/public/common' +import { fetchMentionableUsers } from '@/service/workflow-comment' +import { cn } from '@/utils/classnames' +import { useStore, useWorkflowStore } from '../store' + +type MentionInputProps = { + value: string + onChange: (value: string) => void + onSubmit: (content: string, mentionedUserIds: string[]) => void + onCancel?: () => void + placeholder?: string + disabled?: boolean + loading?: boolean + className?: string + isEditing?: boolean + autoFocus?: boolean +} + +const MentionInputInner = forwardRef(({ 
+ value, + onChange, + onSubmit, + onCancel, + placeholder, + disabled = false, + loading = false, + className, + isEditing = false, + autoFocus = false, +}, forwardedRef) => { + const params = useParams() + const { t } = useTranslation() + const appId = params.appId as string + const textareaRef = useRef(null) + const highlightContentRef = useRef(null) + const actionContainerRef = useRef(null) + const actionRightRef = useRef(null) + const baseTextareaHeightRef = useRef(null) + + // Expose textarea ref to parent component + useImperativeHandle(forwardedRef, () => textareaRef.current!, []) + + const workflowStore = useWorkflowStore() + const mentionUsersFromStore = useStore(state => ( + appId ? state.mentionableUsersCache[appId] : undefined + )) + const mentionUsers = mentionUsersFromStore ?? [] + + const [showMentionDropdown, setShowMentionDropdown] = useState(false) + const [mentionQuery, setMentionQuery] = useState('') + const [mentionPosition, setMentionPosition] = useState(0) + const [selectedMentionIndex, setSelectedMentionIndex] = useState(0) + const [mentionedUserIds, setMentionedUserIds] = useState([]) + const resolvedPlaceholder = placeholder ?? t('comments.placeholder.add', { ns: 'workflow' }) + const BASE_PADDING = 4 + const [shouldReserveButtonGap, setShouldReserveButtonGap] = useState(isEditing) + const [shouldReserveHorizontalSpace, setShouldReserveHorizontalSpace] = useState(() => !isEditing) + const [paddingRight, setPaddingRight] = useState(() => BASE_PADDING + (isEditing ? 0 : 48)) + const [paddingBottom, setPaddingBottom] = useState(() => BASE_PADDING + (isEditing ? 
32 : 0)) + + const mentionNameList = useMemo(() => { + const names = mentionUsers + .map(user => user.name?.trim()) + .filter((name): name is string => Boolean(name)) + + const uniqueNames = Array.from(new Set(names)) + uniqueNames.sort((a, b) => b.length - a.length) + return uniqueNames + }, [mentionUsers]) + + const highlightedValue = useMemo(() => { + if (!value) + return '' + + if (mentionNameList.length === 0) + return value + + const segments: ReactNode[] = [] + let cursor = 0 + let hasMention = false + + while (cursor < value.length) { + let nextMatchStart = -1 + let matchedName = '' + + for (const name of mentionNameList) { + const searchStart = value.indexOf(`@${name}`, cursor) + if (searchStart === -1) + continue + + const previousChar = searchStart > 0 ? value[searchStart - 1] : '' + if (searchStart > 0 && !/\s/.test(previousChar)) + continue + + if ( + nextMatchStart === -1 + || searchStart < nextMatchStart + || (searchStart === nextMatchStart && name.length > matchedName.length) + ) { + nextMatchStart = searchStart + matchedName = name + } + } + + if (nextMatchStart === -1) + break + + if (nextMatchStart > cursor) + segments.push({value.slice(cursor, nextMatchStart)}) + + const mentionEnd = nextMatchStart + matchedName.length + 1 + segments.push( + + {value.slice(nextMatchStart, mentionEnd)} + , + ) + + hasMention = true + cursor = mentionEnd + } + + if (!hasMention) + return value + + if (cursor < value.length) + segments.push({value.slice(cursor)}) + + return segments + }, [value, mentionNameList]) + + const loadMentionableUsers = useCallback(async () => { + if (!appId) + return + + const state = workflowStore.getState() + if (state.mentionableUsersCache[appId] !== undefined) + return + + if (state.mentionableUsersLoading[appId]) + return + + state.setMentionableUsersLoading(appId, true) + try { + const users = await fetchMentionableUsers(appId) + workflowStore.getState().setMentionableUsersCache(appId, users) + } + catch (error) { + 
console.error('Failed to load mentionable users:', error) + } + finally { + workflowStore.getState().setMentionableUsersLoading(appId, false) + } + }, [appId, workflowStore]) + + useEffect(() => { + loadMentionableUsers() + }, [loadMentionableUsers]) + const syncHighlightScroll = useCallback(() => { + const textarea = textareaRef.current + const highlightContent = highlightContentRef.current + if (!textarea || !highlightContent) + return + + const { scrollTop, scrollLeft } = textarea + highlightContent.style.transform = `translate(${-scrollLeft}px, ${-scrollTop}px)` + }, []) + + const evaluateContentLayout = useCallback(() => { + const textarea = textareaRef.current + if (!textarea) + return + + const extraBottom = Math.max(0, paddingBottom - BASE_PADDING) + const effectiveClientHeight = textarea.clientHeight - extraBottom + + if (baseTextareaHeightRef.current === null) + baseTextareaHeightRef.current = effectiveClientHeight + + const baseHeight = baseTextareaHeightRef.current ?? effectiveClientHeight + const hasMultiline = effectiveClientHeight > baseHeight + 1 + const shouldReserveVertical = isEditing ? true : hasMultiline + + setShouldReserveButtonGap(shouldReserveVertical) + setShouldReserveHorizontalSpace(!hasMultiline) + }, [isEditing, paddingBottom]) + + const updateLayoutPadding = useCallback(() => { + const actionEl = actionContainerRef.current + const rect = actionEl?.getBoundingClientRect() + const rightRect = actionRightRef.current?.getBoundingClientRect() + let actionWidth = 0 + if (rightRect) + actionWidth = Math.ceil(rightRect.width) + else if (rect) + actionWidth = Math.ceil(rect.width) + + const actionHeight = rect ? Math.ceil(rect.height) : 0 + const fallbackWidth = Math.max(0, paddingRight - BASE_PADDING) + const fallbackHeight = Math.max(0, paddingBottom - BASE_PADDING) + const effectiveWidth = actionWidth > 0 ? actionWidth : fallbackWidth + const effectiveHeight = actionHeight > 0 ? 
actionHeight : fallbackHeight + + const nextRight = BASE_PADDING + (shouldReserveHorizontalSpace ? effectiveWidth : 0) + const nextBottom = BASE_PADDING + (shouldReserveButtonGap ? effectiveHeight : 0) + + setPaddingRight(prev => (prev === nextRight ? prev : nextRight)) + setPaddingBottom(prev => (prev === nextBottom ? prev : nextBottom)) + }, [shouldReserveButtonGap, shouldReserveHorizontalSpace, paddingRight, paddingBottom]) + + const setActionContainerRef = useCallback((node: HTMLDivElement | null) => { + actionContainerRef.current = node + + if (!isEditing) + actionRightRef.current = node + else if (!node) + actionRightRef.current = null + + if (node && typeof window !== 'undefined') + window.requestAnimationFrame(() => updateLayoutPadding()) + }, [isEditing, updateLayoutPadding]) + + const setActionRightRef = useCallback((node: HTMLDivElement | null) => { + actionRightRef.current = node + + if (node && typeof window !== 'undefined') + window.requestAnimationFrame(() => updateLayoutPadding()) + }, [updateLayoutPadding]) + + useLayoutEffect(() => { + syncHighlightScroll() + }, [value, syncHighlightScroll]) + + useLayoutEffect(() => { + Promise.resolve().then(() => { + evaluateContentLayout() + }) + }, [value, evaluateContentLayout]) + + useLayoutEffect(() => { + Promise.resolve().then(() => { + updateLayoutPadding() + }) + }, [updateLayoutPadding, isEditing, shouldReserveButtonGap]) + + useEffect(() => { + const handleResize = () => { + evaluateContentLayout() + updateLayoutPadding() + } + + window.addEventListener('resize', handleResize) + return () => window.removeEventListener('resize', handleResize) + }, [evaluateContentLayout, updateLayoutPadding]) + + useEffect(() => { + Promise.resolve().then(() => { + baseTextareaHeightRef.current = null + evaluateContentLayout() + setShouldReserveHorizontalSpace(!isEditing) + }) + }, [isEditing, evaluateContentLayout]) + + const filteredMentionUsers = useMemo(() => { + if (!mentionQuery) + return mentionUsers + return 
mentionUsers.filter(user => + user.name.toLowerCase().includes(mentionQuery.toLowerCase()) + || user.email.toLowerCase().includes(mentionQuery.toLowerCase()), + ) + }, [mentionUsers, mentionQuery]) + + const shouldDisableMentionButton = useMemo(() => { + if (showMentionDropdown) + return true + + const textarea = textareaRef.current + if (!textarea) + return false + + const cursorPosition = textarea.selectionStart || 0 + const textBeforeCursor = value.slice(0, cursorPosition) + return /@\w*$/.test(textBeforeCursor) + }, [showMentionDropdown, value]) + + const dropdownPosition = useMemo(() => { + if (!showMentionDropdown || !textareaRef.current) + return { x: 0, y: 0, placement: 'bottom' as const } + + const textareaRect = textareaRef.current.getBoundingClientRect() + const dropdownHeight = 160 // max-h-40 = 10rem = 160px + const viewportHeight = window.innerHeight + const spaceBelow = viewportHeight - textareaRect.bottom + const spaceAbove = textareaRect.top + + const shouldPlaceAbove = spaceBelow < dropdownHeight && spaceAbove > spaceBelow + + return { + x: textareaRect.left, + y: shouldPlaceAbove ? textareaRect.top - 4 : textareaRect.bottom + 4, + placement: shouldPlaceAbove ? 
'top' as const : 'bottom' as const, + } + }, [showMentionDropdown]) + + const handleContentChange = useCallback((newValue: string) => { + onChange(newValue) + + setTimeout(() => { + const cursorPosition = textareaRef.current?.selectionStart || 0 + const textBeforeCursor = newValue.slice(0, cursorPosition) + const mentionMatch = textBeforeCursor.match(/@(\w*)$/) + + if (mentionMatch) { + setMentionQuery(mentionMatch[1]) + setMentionPosition(cursorPosition - mentionMatch[0].length) + setShowMentionDropdown(true) + setSelectedMentionIndex(0) + } + else { + setShowMentionDropdown(false) + } + + if (typeof window !== 'undefined') { + window.requestAnimationFrame(() => { + evaluateContentLayout() + syncHighlightScroll() + }) + } + }, 0) + }, [onChange, evaluateContentLayout, syncHighlightScroll]) + + const handleMentionButtonClick = useCallback((e: React.MouseEvent) => { + e.preventDefault() + e.stopPropagation() + + const textarea = textareaRef.current + if (!textarea) + return + + const cursorPosition = textarea.selectionStart || 0 + const textBeforeCursor = value.slice(0, cursorPosition) + + if (showMentionDropdown) + return + + if (/@\w*$/.test(textBeforeCursor)) + return + + const newContent = `${value.slice(0, cursorPosition)}@${value.slice(cursorPosition)}` + + onChange(newContent) + + setTimeout(() => { + const newCursorPos = cursorPosition + 1 + textarea.setSelectionRange(newCursorPos, newCursorPos) + textarea.focus() + + setMentionQuery('') + setMentionPosition(cursorPosition) + setShowMentionDropdown(true) + setSelectedMentionIndex(0) + + if (typeof window !== 'undefined') { + window.requestAnimationFrame(() => { + evaluateContentLayout() + syncHighlightScroll() + }) + } + }, 0) + }, [value, onChange, evaluateContentLayout, syncHighlightScroll, showMentionDropdown]) + + const insertMention = useCallback((user: UserProfile) => { + const textarea = textareaRef.current + if (!textarea) + return + + const beforeMention = value.slice(0, mentionPosition) + const 
afterMention = value.slice(textarea.selectionStart || 0) + + const needsSpaceBefore = mentionPosition > 0 && !/\s/.test(value[mentionPosition - 1]) + const prefix = needsSpaceBefore ? ' ' : '' + const newContent = `${beforeMention}${prefix}@${user.name} ${afterMention}` + + onChange(newContent) + setShowMentionDropdown(false) + + const newMentionedUserIds = [...mentionedUserIds, user.id] + setMentionedUserIds(newMentionedUserIds) + + setTimeout(() => { + const extraSpace = needsSpaceBefore ? 1 : 0 + const newCursorPos = mentionPosition + extraSpace + user.name.length + 2 // (space) + @ + name + space + textarea.setSelectionRange(newCursorPos, newCursorPos) + textarea.focus() + if (typeof window !== 'undefined') { + window.requestAnimationFrame(() => { + evaluateContentLayout() + syncHighlightScroll() + }) + } + }, 0) + }, [value, mentionPosition, onChange, mentionedUserIds, evaluateContentLayout, syncHighlightScroll]) + + const handleSubmit = useCallback(async (e?: React.MouseEvent) => { + if (e) { + e.preventDefault() + e.stopPropagation() + } + + if (value.trim()) { + try { + await onSubmit(value.trim(), mentionedUserIds) + setMentionedUserIds([]) + setShowMentionDropdown(false) + } + catch (error) { + console.error('Failed to submit', error) + } + } + }, [value, mentionedUserIds, onSubmit]) + + const handleKeyDown = useCallback((e: React.KeyboardEvent) => { + // Ignore key events during IME composition (e.g., Chinese, Japanese input) + if (e.nativeEvent.isComposing) + return + + if (showMentionDropdown) { + if (e.key === 'ArrowDown') { + e.preventDefault() + setSelectedMentionIndex(prev => + prev < filteredMentionUsers.length - 1 ? prev + 1 : 0, + ) + } + else if (e.key === 'ArrowUp') { + e.preventDefault() + setSelectedMentionIndex(prev => + prev > 0 ? 
prev - 1 : filteredMentionUsers.length - 1, + ) + } + else if (e.key === 'Enter') { + e.preventDefault() + if (filteredMentionUsers[selectedMentionIndex]) + insertMention(filteredMentionUsers[selectedMentionIndex]) + + return + } + else if (e.key === 'Escape') { + e.preventDefault() + setShowMentionDropdown(false) + return + } + } + + if (e.key === 'Enter' && !e.shiftKey && !showMentionDropdown) { + e.preventDefault() + handleSubmit() + } + }, [showMentionDropdown, filteredMentionUsers, selectedMentionIndex, insertMention, handleSubmit]) + + const resetMentionState = useCallback(() => { + setMentionedUserIds([]) + setShowMentionDropdown(false) + setMentionQuery('') + setMentionPosition(0) + setSelectedMentionIndex(0) + }, []) + + useEffect(() => { + if (!value) { + Promise.resolve().then(() => { + resetMentionState() + }) + } + }, [value, resetMentionState]) + + useEffect(() => { + if (autoFocus && textareaRef.current) { + const textarea = textareaRef.current + setTimeout(() => { + textarea.focus() + const length = textarea.value.length + textarea.setSelectionRange(length, length) + }, 0) + } + }, [autoFocus]) + + return ( + <> +
+
+
+ {highlightedValue} + +
+
+