Mirror of https://github.com/langgenius/dify.git (synced 2026-01-20 12:09:27 +08:00)

Comparing commits: fix/fail-b ... dev/plugin (131 commits)
| SHA1 | Author | Date | |
|---|---|---|---|
| 086aeea181 | |||
| 1d7c4a87d0 | |||
| 9042b368e9 | |||
| f1bcd26c69 | |||
| 3dcd8b6330 | |||
| 10c088029c | |||
| 73b1adf862 | |||
| ae76dbd92c | |||
| 782df0c383 | |||
| 089207240e | |||
| 53d30d537f | |||
| 53512a4650 | |||
| 1fb7dcda24 | |||
| 3c3e0a35f4 | |||
| 202a246e83 | |||
| 08b968eca5 | |||
| b1ac71db3e | |||
| 55405c1a26 | |||
| 779770dae5 | |||
| 002b16e1c6 | |||
| 7710d8e83b | |||
| cf75fcdffc | |||
| 6e8601b52c | |||
| 96cf0ed5af | |||
| ddf9eb1f9a | |||
| 46a798bea8 | |||
| bb4fecf3d1 | |||
| 9e258c495d | |||
| 4fbe52da40 | |||
| 1e3197a1ea | |||
| 5f692dfce2 | |||
| 78a7d7fa21 | |||
| a9dda1554e | |||
| c53786d229 | |||
| 17f23f4798 | |||
| 67f2c766bc | |||
| 9a417bfc5e | |||
| 90bc51ed2e | |||
| 02dc835721 | |||
| a05e8f0e37 | |||
| b10cbb9b20 | |||
| 1aaab741a0 | |||
| bafa46393c | |||
| 45d43c41bc | |||
| e944646541 | |||
| 21e1443ed5 | |||
| 93a5ffb037 | |||
| d5711589cd | |||
| 375a359c97 | |||
| 3228bac56d | |||
| c66b4e32db | |||
| 57b60dd51f | |||
| ff911d0dc5 | |||
| 7a71498a3e | |||
| 76bcdc2581 | |||
| 91a218b29d | |||
| 4a6cbda1b4 | |||
| 8c08153e33 | |||
| b44b3866a1 | |||
| c242bb372b | |||
| 8c9e34133c | |||
| 3403ac361a | |||
| 07d6cb3f4a | |||
| 545aa61cf4 | |||
| 9fb78ce827 | |||
| 490b6d092e | |||
| 42b13bd312 | |||
| 28add22f20 | |||
| ce545274a6 | |||
| aa6c951e8c | |||
| c4f4dfc3fb | |||
| 548f6ef2b6 | |||
| b15ff4eb8c | |||
| 7790214620 | |||
| 3942e45cab | |||
| 2ace9ae4e4 | |||
| 5ac0ef6253 | |||
| f552667312 | |||
| 5669a18bd8 | |||
| a97d73ab05 | |||
| 252d2c425b | |||
| 09fc4bba61 | |||
| 5f995fac32 | |||
| 79d4db8541 | |||
| 9c42626772 | |||
| bbfe83c86b | |||
| 55aa4e424a | |||
| 8015f5c0c5 | |||
| f3fe14863d | |||
| d96c368660 | |||
| 3f34b8b0d1 | |||
| 6a58ea9e56 | |||
| 23888398d1 | |||
| bfbc5eb91e | |||
| 98b0d4169e | |||
| 356cd271b2 | |||
| baf7561cf8 | |||
| b09f22961c | |||
| f3ad3a5dfd | |||
| ee49d321c5 | |||
| f88f9d6970 | |||
| 3467ad3d02 | |||
| 6741604027 | |||
| d2cc502c71 | |||
| b88194d1c6 | |||
| 2b95e54d54 | |||
| 9bff9b5c9e | |||
| 3dd2c170e7 | |||
| 88f41f164f | |||
| cd932519b3 | |||
| 2ff2b08739 | |||
| a4a45421cc | |||
| aafab1b59e | |||
| 7f49f96c3f | |||
| 5673f03db5 | |||
| 278adbc10e | |||
| 5d4e517397 | |||
| c2671c16a8 | |||
| 10991cbc03 | |||
| 3fcf7e88b0 | |||
| ffa5af1356 | |||
| 066516b54d | |||
| 49415e5e7f | |||
| a697bbdfa7 | |||
| d5c31f8728 | |||
| 508005b741 | |||
| 4f0ecdbb6e | |||
| ab2e69faef | |||
| e46a3343b8 | |||
| 47637da734 | |||
| 525bde28f6 |
.github/workflows/build-push.yml (vendored, 1 change)

@@ -5,6 +5,7 @@ on:
   branches:
     - "main"
     - "deploy/dev"
+    - "dev/plugin-deploy"
   release:
     types: [published]
@@ -73,7 +73,7 @@ Dify requires the following dependencies to build, make sure they're installed o

 * [Docker](https://www.docker.com/)
 * [Docker Compose](https://docs.docker.com/compose/install/)
 * [Node.js v18.x (LTS)](http://nodejs.org)
-* [npm](https://www.npmjs.com/) version 8.x.x or [Yarn](https://yarnpkg.com/)
+* [pnpm](https://pnpm.io/)
 * [Python](https://www.python.org/) version 3.11.x or 3.12.x

 ### 4. Installations
@@ -70,7 +70,7 @@ Dify 依赖以下工具和库:

 - [Docker](https://www.docker.com/)
 - [Docker Compose](https://docs.docker.com/compose/install/)
 - [Node.js v18.x (LTS)](http://nodejs.org)
-- [npm](https://www.npmjs.com/) version 8.x.x or [Yarn](https://yarnpkg.com/)
+- [pnpm](https://pnpm.io/)
 - [Python](https://www.python.org/) version 3.11.x or 3.12.x

 ### 4. 安装

@@ -73,7 +73,7 @@ Dify を構築するには次の依存関係が必要です。それらがシス

 - [Docker](https://www.docker.com/)
 - [Docker Compose](https://docs.docker.com/compose/install/)
 - [Node.js v18.x (LTS)](http://nodejs.org)
-- [npm](https://www.npmjs.com/) version 8.x.x or [Yarn](https://yarnpkg.com/)
+- [pnpm](https://pnpm.io/)
 - [Python](https://www.python.org/) version 3.11.x or 3.12.x

 ### 4. インストール

@@ -72,7 +72,7 @@ Dify yêu cầu các phụ thuộc sau để build, hãy đảm bảo chúng đ

 - [Docker](https://www.docker.com/)
 - [Docker Compose](https://docs.docker.com/compose/install/)
 - [Node.js v18.x (LTS)](http://nodejs.org)
-- [npm](https://www.npmjs.com/) phiên bản 8.x.x hoặc [Yarn](https://yarnpkg.com/)
+- [pnpm](https://pnpm.io/)
 - [Python](https://www.python.org/) phiên bản 3.11.x hoặc 3.12.x

 ### 4. Cài đặt
LICENSE (23 changes)

@@ -1,12 +1,12 @@
 # Open Source License

-Dify is licensed under the Apache License 2.0, with the following additional conditions:
+Dify is licensed under a modified version of the Apache License 2.0, with the following additional conditions:

 1. Dify may be utilized commercially, including as a backend service for other applications or as an application development platform for enterprises. Should the conditions below be met, a commercial license must be obtained from the producer:

-a. Multi-tenant service: Unless explicitly authorized by Dify in writing, you may not use the Dify source code to operate a multi-tenant environment.
+a. Multi-tenant service: Unless explicitly authorized by Dify in writing, you may not use the Dify source code to operate a multi-tenant environment.
    - Tenant Definition: Within the context of Dify, one tenant corresponds to one workspace. The workspace provides a separated area for each tenant's data and configurations.

 b. LOGO and copyright information: In the process of using Dify's frontend, you may not remove or modify the LOGO or copyright information in the Dify console or applications. This restriction is inapplicable to uses of Dify that do not involve its frontend.
    - Frontend Definition: For the purposes of this license, the "frontend" of Dify includes all components located in the `web/` directory when running Dify from the raw source code, or the "web" image when running Dify with Docker.

@@ -21,19 +21,4 @@ Apart from the specific conditions mentioned above, all other rights and restric

 The interactive design of this product is protected by appearance patent.

-© 2024 LangGenius, Inc.
-
-----------
-
-Licensed under the Apache License, Version 2.0 (the "License");
-you may not use this file except in compliance with the License.
-You may obtain a copy of the License at
-
-    http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
+© 2025 LangGenius, Inc.
README_FR.md (16 changes)

@@ -55,7 +55,7 @@

 Dify est une plateforme de développement d'applications LLM open source. Son interface intuitive combine un flux de travail d'IA, un pipeline RAG, des capacités d'agent, une gestion de modèles, des fonctionnalités d'observabilité, et plus encore, vous permettant de passer rapidement du prototype à la production. Voici une liste des fonctionnalités principales:
 </br> </br>

-**1. Flux de travail**:
+**1. Flux de travail** :
 Construisez et testez des flux de travail d'IA puissants sur un canevas visuel, en utilisant toutes les fonctionnalités suivantes et plus encore.

@@ -63,27 +63,25 @@ Dify est une plateforme de développement d'applications LLM open source. Son in

-**2. Prise en charge complète des modèles**:
+**2. Prise en charge complète des modèles** :
 Intégration transparente avec des centaines de LLM propriétaires / open source provenant de dizaines de fournisseurs d'inférence et de solutions auto-hébergées, couvrant GPT, Mistral, Llama3, et tous les modèles compatibles avec l'API OpenAI. Une liste complète des fournisseurs de modèles pris en charge se trouve [ici](https://docs.dify.ai/getting-started/readme/model-providers).

 

-**3. IDE de prompt**:
+**3. IDE de prompt** :
 Interface intuitive pour créer des prompts, comparer les performances des modèles et ajouter des fonctionnalités supplémentaires telles que la synthèse vocale à une application basée sur des chats.

-**4. Pipeline RAG**:
+**4. Pipeline RAG** :
 Des capacités RAG étendues qui couvrent tout, de l'ingestion de documents à la récupération, avec un support prêt à l'emploi pour l'extraction de texte à partir de PDF, PPT et autres formats de document courants.

-**5. Capacités d'agent**:
+**5. Capacités d'agent** :
 Vous pouvez définir des agents basés sur l'appel de fonction LLM ou ReAct, et ajouter des outils pré-construits ou personnalisés pour l'agent. Dify fournit plus de 50 outils intégrés pour les agents d'IA, tels que la recherche Google, DALL·E, Stable Diffusion et WolframAlpha.

-**6. LLMOps**:
+**6. LLMOps** :
 Surveillez et analysez les journaux d'application et les performances au fil du temps. Vous pouvez continuellement améliorer les prompts, les ensembles de données et les modèles en fonction des données de production et des annotations.

-**7. Backend-as-a-Service**:
+**7. Backend-as-a-Service** :
 Toutes les offres de Dify sont accompagnées d'API correspondantes, vous permettant d'intégrer facilement Dify dans votre propre logique métier.
@@ -164,7 +164,7 @@ DifyはオープンソースのLLMアプリケーション開発プラットフ

 - **企業/組織向けのDify</br>**
   企業中心の機能を提供しています。[メールを送信](mailto:business@dify.ai?subject=[GitHub]Business%20License%20Inquiry)して企業のニーズについて相談してください。 </br>
-  > AWSを使用しているスタートアップ企業や中小企業の場合は、[AWS Marketplace](https://aws.amazon.com/marketplace/pp/prodview-t22mebxzwjhu6)のDify Premiumをチェックして、ワンクリックで自分のAWS VPCにデプロイできます。さらに、手頃な価格のAMIオファリングどして、ロゴやブランディングをカスタマイズしてアプリケーションを作成するオプションがあります。
+  > AWSを使用しているスタートアップ企業や中小企業の場合は、[AWS Marketplace](https://aws.amazon.com/marketplace/pp/prodview-t23mebxzwjhu6)のDify Premiumをチェックして、ワンクリックで自分のAWS VPCにデプロイできます。さらに、手頃な価格のAMIオファリングとして、ロゴやブランディングをカスタマイズしてアプリケーションを作成するオプションがあります。

 ## 最新の情報を入手
@@ -2,6 +2,7 @@ import logging
 import time

 from configs import dify_config
+from contexts.wrapper import RecyclableContextVar
 from dify_app import DifyApp

@@ -16,6 +17,12 @@ def create_flask_app_with_configs() -> DifyApp:
     dify_app = DifyApp(__name__)
     dify_app.config.from_mapping(dify_config.model_dump())

+    # add before request hook
+    @dify_app.before_request
+    def before_request():
+        # add an unique identifier to each request
+        RecyclableContextVar.increment_thread_recycles()
+
     return dify_app
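For context, the hook registered above only bumps a process-wide `ContextVar` counter at the start of every request, so that values written by an earlier request on a recycled worker thread can later be recognized as stale. A minimal standalone sketch of the same wiring follows; the toy Flask app and the local `increment_thread_recycles` helper are illustrative stand-ins, not the code from the diff:

```python
# Minimal sketch of the before_request pattern introduced above.
from contextvars import ContextVar

from flask import Flask

_thread_recycles: ContextVar[int] = ContextVar("thread_recycles")


def increment_thread_recycles() -> None:
    # Mirrors the intent of RecyclableContextVar.increment_thread_recycles():
    # start the counter at 0, then raise it by one per handled request.
    try:
        _thread_recycles.set(_thread_recycles.get() + 1)
    except LookupError:
        _thread_recycles.set(0)


app = Flask(__name__)


@app.before_request
def _mark_new_request() -> None:
    increment_thread_recycles()
```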
@@ -2,6 +2,8 @@ from contextvars import ContextVar
 from threading import Lock
 from typing import TYPE_CHECKING

+from contexts.wrapper import RecyclableContextVar
+
 if TYPE_CHECKING:
     from core.plugin.entities.plugin_daemon import PluginModelProviderEntity
     from core.tools.plugin_tool.provider import PluginToolProviderController

@@ -12,8 +14,17 @@ tenant_id: ContextVar[str] = ContextVar("tenant_id")

 workflow_variable_pool: ContextVar["VariablePool"] = ContextVar("workflow_variable_pool")

-plugin_tool_providers: ContextVar[dict[str, "PluginToolProviderController"]] = ContextVar("plugin_tool_providers")
-plugin_tool_providers_lock: ContextVar[Lock] = ContextVar("plugin_tool_providers_lock")
+"""
+To avoid race-conditions caused by gunicorn thread recycling, using RecyclableContextVar to replace with
+"""
+plugin_tool_providers: RecyclableContextVar[dict[str, "PluginToolProviderController"]] = RecyclableContextVar(
+    ContextVar("plugin_tool_providers")
+)
+plugin_tool_providers_lock: RecyclableContextVar[Lock] = RecyclableContextVar(ContextVar("plugin_tool_providers_lock"))

-plugin_model_providers: ContextVar[list["PluginModelProviderEntity"] | None] = ContextVar("plugin_model_providers")
-plugin_model_providers_lock: ContextVar[Lock] = ContextVar("plugin_model_providers_lock")
+plugin_model_providers: RecyclableContextVar[list["PluginModelProviderEntity"] | None] = RecyclableContextVar(
+    ContextVar("plugin_model_providers")
+)
+plugin_model_providers_lock: RecyclableContextVar[Lock] = RecyclableContextVar(
+    ContextVar("plugin_model_providers_lock")
+)
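Callers of these wrapped variables now have to tolerate a `LookupError` on the first access in a request (or after a thread recycle) and repopulate both the value and its companion lock. The sketch below shows one plausible consumer pattern under that assumption; the `_ensure_plugin_tool_provider_cache` helper is hypothetical and not part of the diff:

```python
# Hypothetical consumer sketch for the wrapped context variables above;
# the actual call sites in Dify may differ.
from threading import Lock

import contexts  # assumes the api/ package layout shown in the diff


def _ensure_plugin_tool_provider_cache() -> dict:
    # After a thread recycle the wrapper treats the old value as stale and
    # raises LookupError, so (re)initialize the cache and its lock lazily.
    try:
        return contexts.plugin_tool_providers.get()
    except LookupError:
        contexts.plugin_tool_providers.set({})
        contexts.plugin_tool_providers_lock.set(Lock())
        return contexts.plugin_tool_providers.get()
```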
api/contexts/wrapper.py (new file, 65 lines)

@@ -0,0 +1,65 @@
from contextvars import ContextVar
from typing import Generic, TypeVar

T = TypeVar("T")


class HiddenValue:
    pass


_default = HiddenValue()


class RecyclableContextVar(Generic[T]):
    """
    RecyclableContextVar is a wrapper around ContextVar
    It's safe to use in gunicorn with thread recycling, but features like `reset` are not available for now

    NOTE: you need to call `increment_thread_recycles` before requests
    """

    _thread_recycles: ContextVar[int] = ContextVar("thread_recycles")

    @classmethod
    def increment_thread_recycles(cls):
        try:
            recycles = cls._thread_recycles.get()
            cls._thread_recycles.set(recycles + 1)
        except LookupError:
            cls._thread_recycles.set(0)

    def __init__(self, context_var: ContextVar[T]):
        self._context_var = context_var
        self._updates = ContextVar[int](context_var.name + "_updates", default=0)

    def get(self, default: T | HiddenValue = _default) -> T:
        thread_recycles = self._thread_recycles.get(0)
        self_updates = self._updates.get()
        if thread_recycles > self_updates:
            self._updates.set(thread_recycles)

        # check if thread is recycled and should be updated
        if thread_recycles < self_updates:
            return self._context_var.get()
        else:
            # thread_recycles >= self_updates, means current context is invalid
            if isinstance(default, HiddenValue) or default is _default:
                raise LookupError
            else:
                return default

    def set(self, value: T):
        # it leads to a situation that self.updates is less than cls.thread_recycles if `set` was never called before
        # increase it manually
        thread_recycles = self._thread_recycles.get(0)
        self_updates = self._updates.get()
        if thread_recycles > self_updates:
            self._updates.set(thread_recycles)

        if self._updates.get() == self._thread_recycles.get(0):
            # after increment,
            self._updates.set(self._updates.get() + 1)

        # set the context
        self._context_var.set(value)
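A short illustration of the semantics of the class above: a value set during one request generation is readable for that generation, but once `increment_thread_recycles` runs again without a fresh `set`, the stale value is hidden behind a `LookupError`. This is only a sketch exercising the class directly (assuming the `contexts.wrapper` import path from the diff, run from the `api/` directory), not how gunicorn drives it:

```python
from contextvars import ContextVar

from contexts.wrapper import RecyclableContextVar

var = RecyclableContextVar(ContextVar("demo"))

RecyclableContextVar.increment_thread_recycles()  # "request 1" begins, counter -> 0
var.set("request-1 value")
print(var.get())  # request-1 value

RecyclableContextVar.increment_thread_recycles()  # "request 2" on a recycled thread, counter -> 1
try:
    var.get()  # update count no longer exceeds the recycle count, so the value is stale
except LookupError:
    print("stale value hidden from request 2")
```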
@@ -617,7 +617,7 @@ class DocumentDetailApi(DocumentResource):
             raise InvalidMetadataError(f"Invalid metadata value: {metadata}")

         if metadata == "only":
-            response = {"id": document.id, "doc_type": document.doc_type, "doc_metadata": document.doc_metadata}
+            response = {"id": document.id, "doc_type": document.doc_type, "doc_metadata": document.doc_metadata_details}
         elif metadata == "without":
             dataset_process_rules = DatasetService.get_process_rules(dataset_id)
             document_process_rules = document.dataset_process_rule.to_dict()

@@ -678,7 +678,7 @@ class DocumentDetailApi(DocumentResource):
                 "disabled_by": document.disabled_by,
                 "archived": document.archived,
                 "doc_type": document.doc_type,
-                "doc_metadata": document.doc_metadata,
+                "doc_metadata": document.doc_metadata_details,
                 "segment_count": document.segment_count,
                 "average_segment_length": document.average_segment_length,
                 "hit_count": document.hit_count,
143
api/controllers/console/datasets/metadata.py
Normal file
143
api/controllers/console/datasets/metadata.py
Normal file
@ -0,0 +1,143 @@
|
||||
from flask_login import current_user # type: ignore # type: ignore
|
||||
from flask_restful import Resource, marshal_with, reqparse # type: ignore
|
||||
from werkzeug.exceptions import NotFound
|
||||
|
||||
from controllers.console import api
|
||||
from controllers.console.wraps import account_initialization_required, enterprise_license_required, setup_required
|
||||
from fields.dataset_fields import dataset_metadata_fields
|
||||
from libs.login import login_required
|
||||
from services.dataset_service import DatasetService
|
||||
from services.entities.knowledge_entities.knowledge_entities import (
|
||||
MetadataArgs,
|
||||
MetadataOperationData,
|
||||
)
|
||||
from services.metadata_service import MetadataService
|
||||
|
||||
|
||||
def _validate_name(name):
|
||||
if not name or len(name) < 1 or len(name) > 40:
|
||||
raise ValueError("Name must be between 1 to 40 characters.")
|
||||
return name
|
||||
|
||||
|
||||
def _validate_description_length(description):
|
||||
if len(description) > 400:
|
||||
raise ValueError("Description cannot exceed 400 characters.")
|
||||
return description
|
||||
|
||||
|
||||
class DatasetListApi(Resource):
|
||||
@setup_required
|
||||
@login_required
|
||||
@account_initialization_required
|
||||
@enterprise_license_required
|
||||
@marshal_with(dataset_metadata_fields)
|
||||
def post(self, dataset_id):
|
||||
parser = reqparse.RequestParser()
|
||||
parser.add_argument("type", type=str, required=True, nullable=True, location="json")
|
||||
parser.add_argument("name", type=str, required=True, nullable=True, location="json")
|
||||
args = parser.parse_args()
|
||||
metadata_args = MetadataArgs(**args)
|
||||
|
||||
dataset_id_str = str(dataset_id)
|
||||
dataset = DatasetService.get_dataset(dataset_id_str)
|
||||
if dataset is None:
|
||||
raise NotFound("Dataset not found.")
|
||||
DatasetService.check_dataset_permission(dataset, current_user)
|
||||
|
||||
metadata = MetadataService.create_metadata(dataset_id_str, metadata_args)
|
||||
return metadata, 201
|
||||
|
||||
|
||||
class DatasetMetadataApi(Resource):
|
||||
@setup_required
|
||||
@login_required
|
||||
@account_initialization_required
|
||||
@enterprise_license_required
|
||||
def patch(self, dataset_id, metadata_id):
|
||||
parser = reqparse.RequestParser()
|
||||
parser.add_argument("name", type=str, required=True, nullable=True, location="json")
|
||||
args = parser.parse_args()
|
||||
|
||||
dataset_id_str = str(dataset_id)
|
||||
metadata_id_str = str(metadata_id)
|
||||
dataset = DatasetService.get_dataset(dataset_id_str)
|
||||
if dataset is None:
|
||||
raise NotFound("Dataset not found.")
|
||||
DatasetService.check_dataset_permission(dataset, current_user)
|
||||
|
||||
metadata = MetadataService.update_metadata_name(dataset_id_str, metadata_id_str, args.get("name"))
|
||||
return metadata, 200
|
||||
|
||||
@setup_required
|
||||
@login_required
|
||||
@account_initialization_required
|
||||
@enterprise_license_required
|
||||
def delete(self, dataset_id, metadata_id):
|
||||
dataset_id_str = str(dataset_id)
|
||||
metadata_id_str = str(metadata_id)
|
||||
dataset = DatasetService.get_dataset(dataset_id_str)
|
||||
if dataset is None:
|
||||
raise NotFound("Dataset not found.")
|
||||
DatasetService.check_dataset_permission(dataset, current_user)
|
||||
|
||||
MetadataService.delete_metadata(dataset_id_str, metadata_id_str)
|
||||
return 200
|
||||
|
||||
|
||||
class DatasetMetadataBuiltInFieldApi(Resource):
|
||||
@setup_required
|
||||
@login_required
|
||||
@account_initialization_required
|
||||
@enterprise_license_required
|
||||
def get(self):
|
||||
built_in_fields = MetadataService.get_built_in_fields()
|
||||
return built_in_fields, 200
|
||||
|
||||
|
||||
class DatasetMetadataBuiltInFieldActionApi(Resource):
|
||||
@setup_required
|
||||
@login_required
|
||||
@account_initialization_required
|
||||
@enterprise_license_required
|
||||
def post(self, dataset_id, action):
|
||||
dataset_id_str = str(dataset_id)
|
||||
dataset = DatasetService.get_dataset(dataset_id_str)
|
||||
if dataset is None:
|
||||
raise NotFound("Dataset not found.")
|
||||
DatasetService.check_dataset_permission(dataset, current_user)
|
||||
|
||||
if action == "enable":
|
||||
MetadataService.enable_built_in_field(dataset)
|
||||
elif action == "disable":
|
||||
MetadataService.disable_built_in_field(dataset)
|
||||
return 200
|
||||
|
||||
|
||||
class DocumentMetadataApi(Resource):
|
||||
@setup_required
|
||||
@login_required
|
||||
@account_initialization_required
|
||||
@enterprise_license_required
|
||||
def post(self, dataset_id):
|
||||
dataset_id_str = str(dataset_id)
|
||||
dataset = DatasetService.get_dataset(dataset_id_str)
|
||||
if dataset is None:
|
||||
raise NotFound("Dataset not found.")
|
||||
DatasetService.check_dataset_permission(dataset, current_user)
|
||||
|
||||
parser = reqparse.RequestParser()
|
||||
parser.add_argument("operation_data", type=list, required=True, nullable=True, location="json")
|
||||
args = parser.parse_args()
|
||||
metadata_args = MetadataOperationData(**args)
|
||||
|
||||
MetadataService.update_documents_metadata(dataset, metadata_args)
|
||||
|
||||
return 200
|
||||
|
||||
|
||||
api.add_resource(DatasetListApi, "/datasets/<uuid:dataset_id>/metadata")
|
||||
api.add_resource(DatasetMetadataApi, "/datasets/<uuid:dataset_id>/metadata/<uuid:metadata_id>")
|
||||
api.add_resource(DatasetMetadataBuiltInFieldApi, "/datasets/metadata/built-in")
|
||||
api.add_resource(DatasetMetadataBuiltInFieldActionApi, "/datasets/metadata/built-in/<string:action>")
|
||||
api.add_resource(DocumentMetadataApi, "/datasets/<uuid:dataset_id>/documents/metadata")
|
||||
@@ -1,3 +1,5 @@
+from urllib.parse import quote
+
 from flask import Response, request
 from flask_restful import Resource, reqparse  # type: ignore
 from werkzeug.exceptions import NotFound

@@ -71,7 +73,8 @@ class FilePreviewApi(Resource):
         if upload_file.size > 0:
             response.headers["Content-Length"] = str(upload_file.size)
         if args["as_attachment"]:
-            response.headers["Content-Disposition"] = f"attachment; filename={upload_file.name}"
+            encoded_filename = quote(upload_file.name)
+            response.headers["Content-Disposition"] = f"attachment; filename*=UTF-8''{encoded_filename}"

         return response
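Switching from a bare `filename=` parameter to the RFC 5987 `filename*=UTF-8''...` form keeps non-ASCII and header-breaking characters out of the raw Content-Disposition value. A small standalone sketch of the encoding step, with made-up filenames for illustration:

```python
from urllib.parse import quote


def content_disposition(filename: str) -> str:
    # Percent-encode the filename and send it in the RFC 5987 form, which
    # browsers decode back to the original Unicode name.
    return f"attachment; filename*=UTF-8''{quote(filename)}"


print(content_disposition("report.pdf"))
# attachment; filename*=UTF-8''report.pdf
print(content_disposition("年度报告.pdf"))
# attachment; filename*=UTF-8''%E5%B9%B4%E5%BA%A6%E6%8A%A5%E5%91%8A.pdf
```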
@@ -336,6 +336,10 @@ class DocumentUpdateByFileApi(DatasetApiResource):

         if not dataset:
             raise ValueError("Dataset is not exist.")

+        # indexing_technique is already set in dataset since this is an update
+        args["indexing_technique"] = dataset.indexing_technique
+
         if "file" in request.files:
             # save file info
             file = request.files["file"]
@@ -154,7 +154,7 @@ def validate_dataset_token(view=None):
         )  # TODO: only owner information is required, so only one is returned.
         if tenant_account_join:
             tenant, ta = tenant_account_join
-            account = Account.query.filter_by(id=ta.account_id).first()
+            account = db.session.query(Account).filter(Account.id == ta.account_id).first()
             # Login admin
             if account:
                 account.current_tenant = tenant
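The change above moves from the Flask-SQLAlchemy `Model.query` shortcut to an explicit query on the shared session; the result is the same row, but all access goes through `db.session`. A hedged equivalence sketch, with import paths assumed to match the Dify API package layout:

```python
# Both forms below fetch one Account by id; the diff standardizes on the
# explicit session query. Import paths are assumptions based on the repo layout.
from extensions.ext_database import db
from models.account import Account


def find_account(account_id: str) -> Account | None:
    # Legacy shortcut:  Account.query.filter_by(id=account_id).first()
    # Explicit session form used after the change:
    return db.session.query(Account).filter(Account.id == account_id).first()
```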
@@ -1,7 +1,7 @@
 from enum import StrEnum
 from typing import Any, Optional, Union

-from pydantic import BaseModel
+from pydantic import BaseModel, Field

 from core.tools.entities.tool_entities import ToolInvokeMessage, ToolProviderType

@@ -14,7 +14,7 @@ class AgentToolEntity(BaseModel):
     provider_type: ToolProviderType
     provider_id: str
     tool_name: str
-    tool_parameters: dict[str, Any] = {}
+    tool_parameters: dict[str, Any] = Field(default_factory=dict)
     plugin_unique_identifier: str | None = None
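The repeated `= {}` to `Field(default_factory=dict)` changes in this commit set replace literal mutable defaults with per-instance factories. Pydantic generally copies mutable defaults per instance anyway (unlike plain dataclasses), so the switch is mainly about making that intent explicit and keeping linters and type checkers quiet. A stripped-down sketch (the model below is illustrative, not the actual `AgentToolEntity`):

```python
from pydantic import BaseModel, Field


class ToolEntity(BaseModel):
    # Explicit per-instance factory: every instance gets its own dict.
    tool_parameters: dict[str, object] = Field(default_factory=dict)


a = ToolEntity()
b = ToolEntity()
a.tool_parameters["k"] = "v"
print(b.tool_parameters)  # {} -- instances do not share the default dict
```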
@ -2,9 +2,9 @@ from collections.abc import Mapping
|
||||
from typing import Any
|
||||
|
||||
from core.app.app_config.entities import ModelConfigEntity
|
||||
from core.entities import DEFAULT_PLUGIN_ID
|
||||
from core.model_runtime.entities.model_entities import ModelPropertyKey, ModelType
|
||||
from core.model_runtime.model_providers.model_provider_factory import ModelProviderFactory
|
||||
from core.plugin.entities.plugin import ModelProviderID
|
||||
from core.provider_manager import ProviderManager
|
||||
|
||||
|
||||
@ -61,9 +61,7 @@ class ModelConfigManager:
|
||||
raise ValueError(f"model.provider is required and must be in {str(model_provider_names)}")
|
||||
|
||||
if "/" not in config["model"]["provider"]:
|
||||
config["model"]["provider"] = (
|
||||
f"{DEFAULT_PLUGIN_ID}/{config['model']['provider']}/{config['model']['provider']}"
|
||||
)
|
||||
config["model"]["provider"] = str(ModelProviderID(config["model"]["provider"]))
|
||||
|
||||
if config["model"]["provider"] not in model_provider_names:
|
||||
raise ValueError(f"model.provider is required and must be in {str(model_provider_names)}")
|
||||
|
||||
@ -17,8 +17,8 @@ class ModelConfigEntity(BaseModel):
|
||||
provider: str
|
||||
model: str
|
||||
mode: Optional[str] = None
|
||||
parameters: dict[str, Any] = {}
|
||||
stop: list[str] = []
|
||||
parameters: dict[str, Any] = Field(default_factory=dict)
|
||||
stop: list[str] = Field(default_factory=list)
|
||||
|
||||
|
||||
class AdvancedChatMessageEntity(BaseModel):
|
||||
@ -132,7 +132,7 @@ class ExternalDataVariableEntity(BaseModel):
|
||||
|
||||
variable: str
|
||||
type: str
|
||||
config: dict[str, Any] = {}
|
||||
config: dict[str, Any] = Field(default_factory=dict)
|
||||
|
||||
|
||||
class DatasetRetrieveConfigEntity(BaseModel):
|
||||
@ -188,7 +188,7 @@ class SensitiveWordAvoidanceEntity(BaseModel):
|
||||
"""
|
||||
|
||||
type: str
|
||||
config: dict[str, Any] = {}
|
||||
config: dict[str, Any] = Field(default_factory=dict)
|
||||
|
||||
|
||||
class TextToSpeechEntity(BaseModel):
|
||||
|
||||
@ -42,7 +42,6 @@ class MessageBasedAppGenerator(BaseAppGenerator):
|
||||
ChatAppGenerateEntity,
|
||||
CompletionAppGenerateEntity,
|
||||
AgentChatAppGenerateEntity,
|
||||
AgentChatAppGenerateEntity,
|
||||
],
|
||||
queue_manager: AppQueueManager,
|
||||
conversation: Conversation,
|
||||
|
||||
@ -63,9 +63,9 @@ class ModelConfigWithCredentialsEntity(BaseModel):
|
||||
model_schema: AIModelEntity
|
||||
mode: str
|
||||
provider_model_bundle: ProviderModelBundle
|
||||
credentials: dict[str, Any] = {}
|
||||
parameters: dict[str, Any] = {}
|
||||
stop: list[str] = []
|
||||
credentials: dict[str, Any] = Field(default_factory=dict)
|
||||
parameters: dict[str, Any] = Field(default_factory=dict)
|
||||
stop: list[str] = Field(default_factory=list)
|
||||
|
||||
# pydantic configs
|
||||
model_config = ConfigDict(protected_namespaces=())
|
||||
@ -94,7 +94,7 @@ class AppGenerateEntity(BaseModel):
|
||||
call_depth: int = 0
|
||||
|
||||
# extra parameters, like: auto_generate_conversation_name
|
||||
extras: dict[str, Any] = {}
|
||||
extras: dict[str, Any] = Field(default_factory=dict)
|
||||
|
||||
# tracing instance
|
||||
trace_manager: Optional[TraceQueueManager] = None
|
||||
|
||||
@ -6,11 +6,10 @@ from collections.abc import Iterator, Sequence
|
||||
from json import JSONDecodeError
|
||||
from typing import Optional
|
||||
|
||||
from pydantic import BaseModel, ConfigDict
|
||||
from pydantic import BaseModel, ConfigDict, Field
|
||||
from sqlalchemy import or_
|
||||
|
||||
from constants import HIDDEN_VALUE
|
||||
from core.entities import DEFAULT_PLUGIN_ID
|
||||
from core.entities.model_entities import ModelStatus, ModelWithProviderEntity, SimpleModelProviderEntity
|
||||
from core.entities.provider_entities import (
|
||||
CustomConfiguration,
|
||||
@ -1004,7 +1003,7 @@ class ProviderConfigurations(BaseModel):
|
||||
"""
|
||||
|
||||
tenant_id: str
|
||||
configurations: dict[str, ProviderConfiguration] = {}
|
||||
configurations: dict[str, ProviderConfiguration] = Field(default_factory=dict)
|
||||
|
||||
def __init__(self, tenant_id: str):
|
||||
super().__init__(tenant_id=tenant_id)
|
||||
@ -1060,7 +1059,7 @@ class ProviderConfigurations(BaseModel):
|
||||
|
||||
def __getitem__(self, key):
|
||||
if "/" not in key:
|
||||
key = f"{DEFAULT_PLUGIN_ID}/{key}/{key}"
|
||||
key = str(ModelProviderID(key))
|
||||
|
||||
return self.configurations[key]
|
||||
|
||||
@ -1075,7 +1074,7 @@ class ProviderConfigurations(BaseModel):
|
||||
|
||||
def get(self, key, default=None) -> ProviderConfiguration | None:
|
||||
if "/" not in key:
|
||||
key = f"{DEFAULT_PLUGIN_ID}/{key}/{key}"
|
||||
key = str(ModelProviderID(key))
|
||||
|
||||
return self.configurations.get(key, default) # type: ignore
|
||||
|
||||
|
||||
@ -41,9 +41,13 @@ class HostedModerationConfig(BaseModel):
|
||||
|
||||
|
||||
class HostingConfiguration:
|
||||
provider_map: dict[str, HostingProvider] = {}
|
||||
provider_map: dict[str, HostingProvider]
|
||||
moderation_config: Optional[HostedModerationConfig] = None
|
||||
|
||||
def __init__(self) -> None:
|
||||
self.provider_map = {}
|
||||
self.moderation_config = None
|
||||
|
||||
def init_app(self, app: Flask) -> None:
|
||||
if dify_config.EDITION != "CLOUD":
|
||||
return
|
||||
|
||||
@ -7,7 +7,6 @@ from typing import Optional
|
||||
from pydantic import BaseModel
|
||||
|
||||
import contexts
|
||||
from core.entities import DEFAULT_PLUGIN_ID
|
||||
from core.helper.position_helper import get_provider_position_map, sort_to_dict_by_position_map
|
||||
from core.model_runtime.entities.model_entities import AIModelEntity, ModelType
|
||||
from core.model_runtime.entities.provider_entities import ProviderConfig, ProviderEntity, SimpleProviderEntity
|
||||
@ -34,9 +33,11 @@ class ModelProviderExtension(BaseModel):
|
||||
|
||||
|
||||
class ModelProviderFactory:
|
||||
provider_position_map: dict[str, int] = {}
|
||||
provider_position_map: dict[str, int]
|
||||
|
||||
def __init__(self, tenant_id: str) -> None:
|
||||
self.provider_position_map = {}
|
||||
|
||||
self.tenant_id = tenant_id
|
||||
self.plugin_model_manager = PluginModelManager()
|
||||
|
||||
@ -360,11 +361,5 @@ class ModelProviderFactory:
|
||||
:param provider: provider name
|
||||
:return: plugin id and provider name
|
||||
"""
|
||||
plugin_id = DEFAULT_PLUGIN_ID
|
||||
provider_name = provider
|
||||
if "/" in provider:
|
||||
# get the plugin_id before provider
|
||||
plugin_id = "/".join(provider.split("/")[:-1])
|
||||
provider_name = provider.split("/")[-1]
|
||||
|
||||
return str(plugin_id), provider_name
|
||||
provider_id = ModelProviderID(provider)
|
||||
return provider_id.plugin_id, provider_id.provider_name
|
||||
|
||||
@ -101,11 +101,13 @@ class ProviderManager:
|
||||
)
|
||||
|
||||
# append providers with langgenius/openai/openai
|
||||
for provider_name in list(provider_name_to_provider_records_dict.keys()):
|
||||
provider_name_list = list(provider_name_to_provider_records_dict.keys())
|
||||
for provider_name in provider_name_list:
|
||||
provider_id = ModelProviderID(provider_name)
|
||||
provider_name_to_provider_records_dict[str(provider_id)] = provider_name_to_provider_records_dict[
|
||||
provider_name
|
||||
]
|
||||
if str(provider_id) not in provider_name_list:
|
||||
provider_name_to_provider_records_dict[str(provider_id)] = provider_name_to_provider_records_dict[
|
||||
provider_name
|
||||
]
|
||||
|
||||
# Get all provider model records of the workspace
|
||||
provider_name_to_provider_model_records_dict = self._get_all_provider_models(tenant_id)
|
||||
@ -367,7 +369,8 @@ class ProviderManager:
|
||||
|
||||
provider_name_to_provider_records_dict = defaultdict(list)
|
||||
for provider in providers:
|
||||
provider_name_to_provider_records_dict[provider.provider_name].append(provider)
|
||||
# TODO: Use provider name with prefix after the data migration
|
||||
provider_name_to_provider_records_dict[str(ModelProviderID(provider.provider_name))].append(provider)
|
||||
|
||||
return provider_name_to_provider_records_dict
|
||||
|
||||
@ -506,7 +509,8 @@ class ProviderManager:
|
||||
# FIXME ignore the type errork, onyl TrialHostingQuota has limit need to change the logic
|
||||
provider_record = Provider(
|
||||
tenant_id=tenant_id,
|
||||
provider_name=provider_name,
|
||||
# TODO: Use provider name with prefix after the data migration.
|
||||
provider_name=ModelProviderID(provider_name).provider_name,
|
||||
provider_type=ProviderType.SYSTEM.value,
|
||||
quota_type=ProviderQuotaType.TRIAL.value,
|
||||
quota_limit=quota.quota_limit, # type: ignore
|
||||
@ -521,13 +525,12 @@ class ProviderManager:
|
||||
db.session.query(Provider)
|
||||
.filter(
|
||||
Provider.tenant_id == tenant_id,
|
||||
Provider.provider_name == provider_name,
|
||||
Provider.provider_name == ModelProviderID(provider_name).provider_name,
|
||||
Provider.provider_type == ProviderType.SYSTEM.value,
|
||||
Provider.quota_type == ProviderQuotaType.TRIAL.value,
|
||||
)
|
||||
.first()
|
||||
)
|
||||
|
||||
if provider_record and not provider_record.is_valid:
|
||||
provider_record.is_valid = True
|
||||
db.session.commit()
|
||||
|
||||
@ -88,16 +88,17 @@ class Jieba(BaseKeyword):
|
||||
keyword_table = self._get_dataset_keyword_table()
|
||||
|
||||
k = kwargs.get("top_k", 4)
|
||||
|
||||
document_ids_filter = kwargs.get("document_ids_filter")
|
||||
sorted_chunk_indices = self._retrieve_ids_by_query(keyword_table or {}, query, k)
|
||||
|
||||
documents = []
|
||||
for chunk_index in sorted_chunk_indices:
|
||||
segment = (
|
||||
db.session.query(DocumentSegment)
|
||||
.filter(DocumentSegment.dataset_id == self.dataset.id, DocumentSegment.index_node_id == chunk_index)
|
||||
.first()
|
||||
segment_query = db.session.query(DocumentSegment).filter(
|
||||
DocumentSegment.dataset_id == self.dataset.id, DocumentSegment.index_node_id == chunk_index
|
||||
)
|
||||
if document_ids_filter:
|
||||
segment_query = segment_query.filter(DocumentSegment.document_id.in_(document_ids_filter))
|
||||
segment = segment_query.first()
|
||||
|
||||
if segment:
|
||||
documents.append(
|
||||
|
||||
@ -42,6 +42,7 @@ class RetrievalService:
|
||||
reranking_model: Optional[dict] = None,
|
||||
reranking_mode: str = "reranking_model",
|
||||
weights: Optional[dict] = None,
|
||||
document_ids_filter: Optional[list[str]] = None,
|
||||
):
|
||||
if not query:
|
||||
return []
|
||||
@ -65,6 +66,7 @@ class RetrievalService:
|
||||
top_k=top_k,
|
||||
all_documents=all_documents,
|
||||
exceptions=exceptions,
|
||||
document_ids_filter=document_ids_filter,
|
||||
)
|
||||
)
|
||||
if RetrievalMethod.is_support_semantic_search(retrieval_method):
|
||||
@ -80,6 +82,7 @@ class RetrievalService:
|
||||
all_documents=all_documents,
|
||||
retrieval_method=retrieval_method,
|
||||
exceptions=exceptions,
|
||||
document_ids_filter=document_ids_filter,
|
||||
)
|
||||
)
|
||||
if RetrievalMethod.is_support_fulltext_search(retrieval_method):
|
||||
@ -131,7 +134,14 @@ class RetrievalService:
|
||||
|
||||
@classmethod
|
||||
def keyword_search(
|
||||
cls, flask_app: Flask, dataset_id: str, query: str, top_k: int, all_documents: list, exceptions: list
|
||||
cls,
|
||||
flask_app: Flask,
|
||||
dataset_id: str,
|
||||
query: str,
|
||||
top_k: int,
|
||||
all_documents: list,
|
||||
exceptions: list,
|
||||
document_ids_filter: Optional[list[str]] = None,
|
||||
):
|
||||
with flask_app.app_context():
|
||||
try:
|
||||
@ -140,7 +150,10 @@ class RetrievalService:
|
||||
raise ValueError("dataset not found")
|
||||
|
||||
keyword = Keyword(dataset=dataset)
|
||||
documents = keyword.search(cls.escape_query_for_search(query), top_k=top_k)
|
||||
|
||||
documents = keyword.search(
|
||||
cls.escape_query_for_search(query), top_k=top_k, document_ids_filter=document_ids_filter
|
||||
)
|
||||
all_documents.extend(documents)
|
||||
except Exception as e:
|
||||
exceptions.append(str(e))
|
||||
@ -157,6 +170,7 @@ class RetrievalService:
|
||||
all_documents: list,
|
||||
retrieval_method: str,
|
||||
exceptions: list,
|
||||
document_ids_filter: Optional[list[str]] = None,
|
||||
):
|
||||
with flask_app.app_context():
|
||||
try:
|
||||
@ -171,6 +185,7 @@ class RetrievalService:
|
||||
top_k=top_k,
|
||||
score_threshold=score_threshold,
|
||||
filter={"group_id": [dataset.id]},
|
||||
document_ids_filter=document_ids_filter,
|
||||
)
|
||||
|
||||
if documents:
|
||||
|
||||
@ -53,7 +53,7 @@ class AnalyticdbVector(BaseVector):
|
||||
self.analyticdb_vector.delete_by_metadata_field(key, value)
|
||||
|
||||
def search_by_vector(self, query_vector: list[float], **kwargs: Any) -> list[Document]:
|
||||
return self.analyticdb_vector.search_by_vector(query_vector)
|
||||
return self.analyticdb_vector.search_by_vector(query_vector, **kwargs)
|
||||
|
||||
def search_by_full_text(self, query: str, **kwargs: Any) -> list[Document]:
|
||||
return self.analyticdb_vector.search_by_full_text(query, **kwargs)
|
||||
|
||||
@ -194,6 +194,11 @@ class AnalyticdbVectorBySql:
|
||||
|
||||
def search_by_vector(self, query_vector: list[float], **kwargs: Any) -> list[Document]:
|
||||
top_k = kwargs.get("top_k", 4)
|
||||
document_ids_filter = kwargs.get("document_ids_filter")
|
||||
where_clause = "WHERE 1=1"
|
||||
if document_ids_filter:
|
||||
document_ids = ", ".join(f"'{id}'" for id in document_ids_filter)
|
||||
where_clause += f"AND metadata_->>'document_id' IN ({document_ids})"
|
||||
score_threshold = float(kwargs.get("score_threshold") or 0.0)
|
||||
with self._get_cursor() as cur:
|
||||
query_vector_str = json.dumps(query_vector)
|
||||
@ -202,7 +207,7 @@ class AnalyticdbVectorBySql:
|
||||
f"SELECT t.id AS id, t.vector AS vector, (1.0 - t.score) AS score, "
|
||||
f"t.page_content as page_content, t.metadata_ AS metadata_ "
|
||||
f"FROM (SELECT id, vector, page_content, metadata_, vector <=> %s AS score "
|
||||
f"FROM {self.table_name} ORDER BY score LIMIT {top_k} ) t",
|
||||
f"FROM {self.table_name} {where_clause} ORDER BY score LIMIT {top_k} ) t",
|
||||
(query_vector_str,),
|
||||
)
|
||||
documents = []
|
||||
@ -220,12 +225,17 @@ class AnalyticdbVectorBySql:
|
||||
|
||||
def search_by_full_text(self, query: str, **kwargs: Any) -> list[Document]:
|
||||
top_k = kwargs.get("top_k", 4)
|
||||
document_ids_filter = kwargs.get("document_ids_filter")
|
||||
where_clause = ""
|
||||
if document_ids_filter:
|
||||
document_ids = ", ".join(f"'{id}'" for id in document_ids_filter)
|
||||
where_clause += f"AND metadata_->>'document_id' IN ({document_ids})"
|
||||
with self._get_cursor() as cur:
|
||||
cur.execute(
|
||||
f"""SELECT id, vector, page_content, metadata_,
|
||||
ts_rank(to_tsvector, to_tsquery_from_text(%s, 'zh_cn'), 32) AS score
|
||||
FROM {self.table_name}
|
||||
WHERE to_tsvector@@to_tsquery_from_text(%s, 'zh_cn')
|
||||
WHERE to_tsvector@@to_tsquery_from_text(%s, 'zh_cn') {where_clause}
|
||||
ORDER BY score DESC
|
||||
LIMIT {top_k}""",
|
||||
(f"'{query}'", f"'{query}'"),
|
||||
|
||||
@ -123,11 +123,21 @@ class BaiduVector(BaseVector):
|
||||
|
||||
def search_by_vector(self, query_vector: list[float], **kwargs: Any) -> list[Document]:
|
||||
query_vector = [float(val) if isinstance(val, np.float64) else val for val in query_vector]
|
||||
anns = AnnSearch(
|
||||
vector_field=self.field_vector,
|
||||
vector_floats=query_vector,
|
||||
params=HNSWSearchParams(ef=kwargs.get("ef", 10), limit=kwargs.get("top_k", 4)),
|
||||
)
|
||||
document_ids_filter = kwargs.get("document_ids_filter")
|
||||
if document_ids_filter:
|
||||
document_ids = ", ".join(f"'{id}'" for id in document_ids_filter)
|
||||
anns = AnnSearch(
|
||||
vector_field=self.field_vector,
|
||||
vector_floats=query_vector,
|
||||
params=HNSWSearchParams(ef=kwargs.get("ef", 10), limit=kwargs.get("top_k", 4)),
|
||||
filter=f"document_id IN ({document_ids})",
|
||||
)
|
||||
else:
|
||||
anns = AnnSearch(
|
||||
vector_field=self.field_vector,
|
||||
vector_floats=query_vector,
|
||||
params=HNSWSearchParams(ef=kwargs.get("ef", 10), limit=kwargs.get("top_k", 4)),
|
||||
)
|
||||
res = self._db.table(self._collection_name).search(
|
||||
anns=anns,
|
||||
projections=[self.field_id, self.field_text, self.field_metadata],
|
||||
|
||||
@@ -95,7 +95,15 @@ class ChromaVector(BaseVector):

     def search_by_vector(self, query_vector: list[float], **kwargs: Any) -> list[Document]:
         collection = self._client.get_or_create_collection(self._collection_name)
-        results: QueryResult = collection.query(query_embeddings=query_vector, n_results=kwargs.get("top_k", 4))
+        document_ids_filter = kwargs.get("document_ids_filter")
+        if document_ids_filter:
+            results: QueryResult = collection.query(
+                query_embeddings=query_vector,
+                n_results=kwargs.get("top_k", 4),
+                where={"document_id": {"$in": document_ids_filter}},
+            )
+        else:
+            results: QueryResult = collection.query(query_embeddings=query_vector, n_results=kwargs.get("top_k", 4))
         score_threshold = float(kwargs.get("score_threshold") or 0.0)

         # Check if results contain data

@@ -111,8 +119,9 @@ class ChromaVector(BaseVector):
         for index in range(len(ids)):
             distance = distances[index]
             metadata = dict(metadatas[index])
-            if distance >= score_threshold:
-                metadata["score"] = distance
+            score = 1 - distance
+            if score > score_threshold:
+                metadata["score"] = score
                 doc = Document(
                     page_content=documents[index],
                     metadata=metadata,
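The second hunk above converts the raw distance into a similarity-style score (`score = 1 - distance`) before comparing against `score_threshold`, assuming a cosine-style distance where 0 means identical; that brings the Chroma store in line with the other vector stores in this change set, which threshold on similarity rather than distance. A toy check of the conversion with illustrative values:

```python
def passes_threshold(distance: float, score_threshold: float = 0.0) -> bool:
    # Convert a cosine-style distance (0 = identical) into a similarity score
    # (1 = identical) and keep only hits above the threshold, as in the diff.
    score = 1 - distance
    return score > score_threshold


print(passes_threshold(0.12, 0.5))  # True  -> score 0.88
print(passes_threshold(0.72, 0.5))  # False -> score 0.28
```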
@ -117,6 +117,9 @@ class ElasticSearchVector(BaseVector):
|
||||
top_k = kwargs.get("top_k", 4)
|
||||
num_candidates = math.ceil(top_k * 1.5)
|
||||
knn = {"field": Field.VECTOR.value, "query_vector": query_vector, "k": top_k, "num_candidates": num_candidates}
|
||||
document_ids_filter = kwargs.get("document_ids_filter")
|
||||
if document_ids_filter:
|
||||
knn["filter"] = {"terms": {"metadata.document_id": document_ids_filter}}
|
||||
|
||||
results = self._client.search(index=self._collection_name, knn=knn, size=top_k)
|
||||
|
||||
@ -145,6 +148,9 @@ class ElasticSearchVector(BaseVector):
|
||||
|
||||
def search_by_full_text(self, query: str, **kwargs: Any) -> list[Document]:
|
||||
query_str = {"match": {Field.CONTENT_KEY.value: query}}
|
||||
document_ids_filter = kwargs.get("document_ids_filter")
|
||||
if document_ids_filter:
|
||||
query_str["filter"] = {"terms": {"metadata.document_id": document_ids_filter}}
|
||||
results = self._client.search(index=self._collection_name, query=query_str, size=kwargs.get("top_k", 4))
|
||||
docs = []
|
||||
for hit in results["hits"]["hits"]:
|
||||
|
||||
@ -168,7 +168,12 @@ class LindormVectorStore(BaseVector):
|
||||
raise ValueError("All elements in query_vector should be floats")
|
||||
|
||||
top_k = kwargs.get("top_k", 10)
|
||||
query = default_vector_search_query(query_vector=query_vector, k=top_k, **kwargs)
|
||||
document_ids_filter = kwargs.get("document_ids_filter")
|
||||
filters = []
|
||||
if document_ids_filter:
|
||||
filters.append({"terms": {"metadata.document_id": document_ids_filter}})
|
||||
query = default_vector_search_query(query_vector=query_vector, k=top_k, filters=filters, **kwargs)
|
||||
|
||||
try:
|
||||
params = {}
|
||||
if self._using_ugc:
|
||||
@ -206,7 +211,10 @@ class LindormVectorStore(BaseVector):
|
||||
should = kwargs.get("should")
|
||||
minimum_should_match = kwargs.get("minimum_should_match", 0)
|
||||
top_k = kwargs.get("top_k", 10)
|
||||
filters = kwargs.get("filter")
|
||||
filters = kwargs.get("filter", [])
|
||||
document_ids_filter = kwargs.get("document_ids_filter")
|
||||
if document_ids_filter:
|
||||
filters.append({"terms": {"metadata.document_id": document_ids_filter}})
|
||||
routing = self._routing
|
||||
full_text_query = default_text_search_query(
|
||||
query_text=query,
|
||||
|
||||
@ -218,12 +218,18 @@ class MilvusVector(BaseVector):
|
||||
"""
|
||||
Search for documents by vector similarity.
|
||||
"""
|
||||
document_ids_filter = kwargs.get("document_ids_filter")
|
||||
filter = ""
|
||||
if document_ids_filter:
|
||||
document_ids = ", ".join(f"'{id}'" for id in document_ids_filter)
|
||||
filter = f'metadata["document_id"] in ({document_ids})'
|
||||
results = self._client.search(
|
||||
collection_name=self._collection_name,
|
||||
data=[query_vector],
|
||||
anns_field=Field.VECTOR.value,
|
||||
limit=kwargs.get("top_k", 4),
|
||||
output_fields=[Field.CONTENT_KEY.value, Field.METADATA_KEY.value],
|
||||
filter=filter,
|
||||
)
|
||||
|
||||
return self._process_search_results(
|
||||
@ -239,6 +245,11 @@ class MilvusVector(BaseVector):
|
||||
if not self._hybrid_search_enabled or not self.field_exists(Field.SPARSE_VECTOR.value):
|
||||
logger.warning("Full-text search is not supported in current Milvus version (requires >= 2.5.0)")
|
||||
return []
|
||||
document_ids_filter = kwargs.get("document_ids_filter")
|
||||
filter = ""
|
||||
if document_ids_filter:
|
||||
document_ids = ", ".join(f"'{id}'" for id in document_ids_filter)
|
||||
filter = f'metadata["document_id"] in ({document_ids})'
|
||||
|
||||
results = self._client.search(
|
||||
collection_name=self._collection_name,
|
||||
@ -246,6 +257,7 @@ class MilvusVector(BaseVector):
|
||||
anns_field=Field.SPARSE_VECTOR.value,
|
||||
limit=kwargs.get("top_k", 4),
|
||||
output_fields=[Field.CONTENT_KEY.value, Field.METADATA_KEY.value],
|
||||
filter=filter,
|
||||
)
|
||||
|
||||
return self._process_search_results(
|
||||
|
||||
@ -131,6 +131,10 @@ class MyScaleVector(BaseVector):
|
||||
if self._metric.upper() == "COSINE" and order == SortOrder.ASC and score_threshold > 0.0
|
||||
else ""
|
||||
)
|
||||
document_ids_filter = kwargs.get("document_ids_filter")
|
||||
if document_ids_filter:
|
||||
document_ids = ", ".join(f"'{id}'" for id in document_ids_filter)
|
||||
where_str = f"{where_str} AND metadata['document_id'] in ({document_ids})"
|
||||
sql = f"""
|
||||
SELECT text, vector, metadata, {dist} as dist FROM {self._config.database}.{self._collection_name}
|
||||
{where_str} ORDER BY dist {order.value} LIMIT {top_k}
|
||||
|
||||
@ -154,6 +154,11 @@ class OceanBaseVector(BaseVector):
|
||||
return []
|
||||
|
||||
def search_by_vector(self, query_vector: list[float], **kwargs: Any) -> list[Document]:
|
||||
document_ids_filter = kwargs.get("document_ids_filter")
|
||||
where_clause = None
|
||||
if document_ids_filter:
|
||||
document_ids = ", ".join(f"'{id}'" for id in document_ids_filter)
|
||||
where_clause = f"metadata->>'$.document_id' in ({document_ids})"
|
||||
ef_search = kwargs.get("ef_search", self._hnsw_ef_search)
|
||||
if ef_search != self._hnsw_ef_search:
|
||||
self._client.set_ob_hnsw_ef_search(ef_search)
|
||||
@ -167,6 +172,7 @@ class OceanBaseVector(BaseVector):
|
||||
distance_func=func.l2_distance,
|
||||
output_column_names=["text", "metadata"],
|
||||
with_dist=True,
|
||||
where_clause=where_clause,
|
||||
)
|
||||
docs = []
|
||||
for text, metadata, distance in cur:
|
||||
|
||||
@ -154,6 +154,9 @@ class OpenSearchVector(BaseVector):
|
||||
"size": kwargs.get("top_k", 4),
|
||||
"query": {"knn": {Field.VECTOR.value: {Field.VECTOR.value: query_vector, "k": kwargs.get("top_k", 4)}}},
|
||||
}
|
||||
document_ids_filter = kwargs.get("document_ids_filter")
|
||||
if document_ids_filter:
|
||||
query["query"] = {"terms": {"metadata.document_id": document_ids_filter}}
|
||||
|
||||
try:
|
||||
response = self._client.search(index=self._collection_name.lower(), body=query)
|
||||
@ -179,6 +182,9 @@ class OpenSearchVector(BaseVector):
|
||||
|
||||
def search_by_full_text(self, query: str, **kwargs: Any) -> list[Document]:
|
||||
full_text_query = {"query": {"match": {Field.CONTENT_KEY.value: query}}}
|
||||
document_ids_filter = kwargs.get("document_ids_filter")
|
||||
if document_ids_filter:
|
||||
full_text_query["query"]["terms"] = {"metadata.document_id": document_ids_filter}
|
||||
|
||||
response = self._client.search(index=self._collection_name.lower(), body=full_text_query)
|
||||
|
||||
|
||||
@ -185,10 +185,15 @@ class OracleVector(BaseVector):
|
||||
:return: List of Documents that are nearest to the query vector.
|
||||
"""
|
||||
top_k = kwargs.get("top_k", 4)
|
||||
document_ids_filter = kwargs.get("document_ids_filter")
|
||||
where_clause = ""
|
||||
if document_ids_filter:
|
||||
document_ids = ", ".join(f"'{id}'" for id in document_ids_filter)
|
||||
where_clause = f"WHERE metadata->>'document_id' in ({document_ids})"
|
||||
with self._get_cursor() as cur:
|
||||
cur.execute(
|
||||
f"SELECT meta, text, vector_distance(embedding,:1) AS distance FROM {self.table_name}"
|
||||
f" ORDER BY distance fetch first {top_k} rows only",
|
||||
f" {where_clause} ORDER BY distance fetch first {top_k} rows only",
|
||||
[numpy.array(query_vector)],
|
||||
)
|
||||
docs = []
|
||||
@ -241,9 +246,15 @@ class OracleVector(BaseVector):
|
||||
if token not in stop_words:
|
||||
entities.append(token)
|
||||
with self._get_cursor() as cur:
|
||||
document_ids_filter = kwargs.get("document_ids_filter")
|
||||
where_clause = ""
|
||||
if document_ids_filter:
|
||||
document_ids = ", ".join(f"'{id}'" for id in document_ids_filter)
|
||||
where_clause = f" AND metadata->>'document_id' in ({document_ids}) "
|
||||
cur.execute(
|
||||
f"select meta, text, embedding FROM {self.table_name}"
|
||||
f" WHERE CONTAINS(text, :1, 1) > 0 order by score(1) desc fetch first {top_k} rows only",
|
||||
f"WHERE CONTAINS(text, :1, 1) > 0 {where_clause} "
|
||||
f"order by score(1) desc fetch first {top_k} rows only",
|
||||
[" ACCUM ".join(entities)],
|
||||
)
|
||||
docs = []
|
||||
|
||||
@ -189,6 +189,9 @@ class PGVectoRS(BaseVector):
|
||||
.limit(kwargs.get("top_k", 4))
|
||||
.order_by("distance")
|
||||
)
|
||||
document_ids_filter = kwargs.get("document_ids_filter")
|
||||
if document_ids_filter:
|
||||
stmt = stmt.where(self._table.meta["document_id"].in_(document_ids_filter))
|
||||
res = session.execute(stmt)
|
||||
results = [(row[0], row[1]) for row in res]
|
||||
|
||||
|
||||
@ -155,10 +155,16 @@ class PGVector(BaseVector):
|
||||
:return: List of Documents that are nearest to the query vector.
|
||||
"""
|
||||
top_k = kwargs.get("top_k", 4)
|
||||
document_ids_filter = kwargs.get("document_ids_filter")
|
||||
where_clause = ""
|
||||
if document_ids_filter:
|
||||
document_ids = ", ".join(f"'{id}'" for id in document_ids_filter)
|
||||
where_clause = f" WHERE metadata->>'document_id' in ({document_ids}) "
|
||||
|
||||
with self._get_cursor() as cur:
|
||||
cur.execute(
|
||||
f"SELECT meta, text, embedding <=> %s AS distance FROM {self.table_name}"
|
||||
f" {where_clause}"
|
||||
f" ORDER BY distance LIMIT {top_k}",
|
||||
(json.dumps(query_vector),),
|
||||
)
|
||||
@ -176,10 +182,16 @@ class PGVector(BaseVector):
|
||||
top_k = kwargs.get("top_k", 5)
|
||||
|
||||
with self._get_cursor() as cur:
|
||||
document_ids_filter = kwargs.get("document_ids_filter")
|
||||
where_clause = ""
|
||||
if document_ids_filter:
|
||||
document_ids = ", ".join(f"'{id}'" for id in document_ids_filter)
|
||||
where_clause = f" AND metadata->>'document_id' in ({document_ids}) "
|
||||
cur.execute(
|
||||
f"""SELECT meta, text, ts_rank(to_tsvector(coalesce(text, '')), plainto_tsquery(%s)) AS score
|
||||
FROM {self.table_name}
|
||||
WHERE to_tsvector(text) @@ plainto_tsquery(%s)
|
||||
{where_clause}
|
||||
ORDER BY score DESC
|
||||
LIMIT {top_k}""",
|
||||
# f"'{query}'" is required in order to account for whitespace in query
|
||||
|
||||
@ -286,27 +286,26 @@ class QdrantVector(BaseVector):
|
||||
from qdrant_client.http import models
|
||||
from qdrant_client.http.exceptions import UnexpectedResponse
|
||||
|
||||
for node_id in ids:
|
||||
try:
|
||||
filter = models.Filter(
|
||||
must=[
|
||||
models.FieldCondition(
|
||||
key="metadata.doc_id",
|
||||
match=models.MatchValue(value=node_id),
|
||||
),
|
||||
],
|
||||
)
|
||||
self._client.delete(
|
||||
collection_name=self._collection_name,
|
||||
points_selector=FilterSelector(filter=filter),
|
||||
)
|
||||
except UnexpectedResponse as e:
|
||||
# Collection does not exist, so return
|
||||
if e.status_code == 404:
|
||||
return
|
||||
# Some other error occurred, so re-raise the exception
|
||||
else:
|
||||
raise e
|
||||
try:
|
||||
filter = models.Filter(
|
||||
must=[
|
||||
models.FieldCondition(
|
||||
key="metadata.doc_id",
|
||||
match=models.MatchAny(any=ids),
|
||||
),
|
||||
],
|
||||
)
|
||||
self._client.delete(
|
||||
collection_name=self._collection_name,
|
||||
points_selector=FilterSelector(filter=filter),
|
||||
)
|
||||
except UnexpectedResponse as e:
|
||||
# Collection does not exist, so return
|
||||
if e.status_code == 404:
|
||||
return
|
||||
# Some other error occurred, so re-raise the exception
|
||||
else:
|
||||
raise e
|
||||
|
||||
def text_exists(self, id: str) -> bool:
|
||||
all_collection_name = []
|
||||
@ -331,6 +330,14 @@ class QdrantVector(BaseVector):
|
||||
),
|
||||
],
|
||||
)
|
||||
document_ids_filter = kwargs.get("document_ids_filter")
|
||||
if document_ids_filter:
|
||||
filter.must.append(
|
||||
models.FieldCondition(
|
||||
key="metadata.document_id",
|
||||
match=models.MatchAny(any=document_ids_filter),
|
||||
)
|
||||
)
|
||||
results = self._client.search(
|
||||
collection_name=self._collection_name,
|
||||
query_vector=query_vector,
|
||||
@ -377,6 +384,14 @@ class QdrantVector(BaseVector):
|
||||
),
|
||||
]
|
||||
)
|
||||
document_ids_filter = kwargs.get("document_ids_filter")
|
||||
if document_ids_filter:
|
||||
scroll_filter.must.append(
|
||||
models.FieldCondition(
|
||||
key="metadata.document_id",
|
||||
match=models.MatchAny(any=document_ids_filter),
|
||||
)
|
||||
)
|
||||
response = self._client.scroll(
|
||||
collection_name=self._collection_name,
|
||||
scroll_filter=scroll_filter,
|
||||
|
||||
@ -223,8 +223,12 @@ class RelytVector(BaseVector):
|
||||
return len(result) > 0
|
||||
|
||||
def search_by_vector(self, query_vector: list[float], **kwargs: Any) -> list[Document]:
|
||||
document_ids_filter = kwargs.get("document_ids_filter")
|
||||
filter = kwargs.get("filter", {})
|
||||
if document_ids_filter:
|
||||
filter["document_id"] = document_ids_filter
|
||||
results = self.similarity_search_with_score_by_vector(
|
||||
k=int(kwargs.get("top_k", 4)), embedding=query_vector, filter=kwargs.get("filter")
|
||||
k=int(kwargs.get("top_k", 4)), embedding=query_vector, filter=filter
|
||||
)
|
||||
|
||||
# Organize results.
|
||||
@ -246,9 +250,9 @@ class RelytVector(BaseVector):
|
||||
filter_condition = ""
|
||||
if filter is not None:
|
||||
conditions = [
|
||||
f"metadata->>{key!r} in ({', '.join(map(repr, value))})"
|
||||
f"metadata->>'{key!r}' in ({', '.join(map(repr, value))})"
|
||||
if len(value) > 1
|
||||
else f"metadata->>{key!r} = {value[0]!r}"
|
||||
else f"metadata->>'{key!r}' = {value[0]!r}"
|
||||
for key, value in filter.items()
|
||||
]
|
||||
filter_condition = f"WHERE {' AND '.join(conditions)}"
|
||||
|
||||
@ -145,11 +145,16 @@ class TencentVector(BaseVector):
|
||||
self._db.collection(self._collection_name).delete(document_ids=ids)
|
||||
|
||||
def delete_by_metadata_field(self, key: str, value: str) -> None:
|
||||
self._db.collection(self._collection_name).delete(filter=Filter(Filter.In(key, [value])))
|
||||
self._db.collection(self._collection_name).delete(filter=Filter(Filter.In(f"metadata.{key}", [value])))
|
||||
|
||||
def search_by_vector(self, query_vector: list[float], **kwargs: Any) -> list[Document]:
|
||||
document_ids_filter = kwargs.get("document_ids_filter")
|
||||
filter = None
|
||||
if document_ids_filter:
|
||||
filter = Filter(Filter.In("metadata.document_id", document_ids_filter))
|
||||
res = self._db.collection(self._collection_name).search(
|
||||
vectors=[query_vector],
|
||||
filter=filter,
|
||||
params=document.HNSWSearchParams(ef=kwargs.get("ef", 10)),
|
||||
retrieve_vector=False,
|
||||
limit=kwargs.get("top_k", 4),
|
||||
|
||||
@@ -326,6 +326,14 @@ class TidbOnQdrantVector(BaseVector):
                ),
            ],
        )
        document_ids_filter = kwargs.get("document_ids_filter")
        if document_ids_filter:
            filter.must.append(
                models.FieldCondition(
                    key="metadata.document_id",
                    match=models.MatchAny(any=document_ids_filter),
                )
            )
        results = self._client.search(
            collection_name=self._collection_name,
            query_vector=query_vector,

@@ -368,6 +376,14 @@ class TidbOnQdrantVector(BaseVector):
                )
            ]
        )
        document_ids_filter = kwargs.get("document_ids_filter")
        if document_ids_filter:
            scroll_filter.must.append(
                models.FieldCondition(
                    key="metadata.document_id",
                    match=models.MatchAny(any=document_ids_filter),
                )
            )
        response = self._client.scroll(
            collection_name=self._collection_name,
            scroll_filter=scroll_filter,
@@ -196,6 +196,11 @@ class TiDBVector(BaseVector):

        docs = []
        tidb_dist_func = self._get_distance_func()
        document_ids_filter = kwargs.get("document_ids_filter")
        where_clause = ""
        if document_ids_filter:
            document_ids = ", ".join(f"'{id}'" for id in document_ids_filter)
            where_clause = f" WHERE meta->>'$.document_id' in ({document_ids}) "

        with Session(self._engine) as session:
            select_statement = sql_text(f"""

@@ -206,6 +211,7 @@ class TiDBVector(BaseVector):
                    text,
                    {tidb_dist_func}(vector, :query_vector_str) AS distance
                    FROM {self._collection_name}
                    {where_clause}
                    ORDER BY distance ASC
                    LIMIT :top_k
                ) t
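The TiDB hunk splices the allow-list into the SQL text itself. A runnable sketch of that exact string construction, so the emitted clause is visible (only the helper name is ours; the real statement also binds `:query_vector_str` and `:top_k`):

```python
def build_where_clause(document_ids_filter: list[str] | None) -> str:
    # Mirrors the hunk above: quote each id and join them into an IN (...) list.
    if not document_ids_filter:
        return ""
    document_ids = ", ".join(f"'{id}'" for id in document_ids_filter)
    return f" WHERE meta->>'$.document_id' in ({document_ids}) "


print(build_where_clause(["doc-1", "doc-2"]))
# ->  WHERE meta->>'$.document_id' in ('doc-1', 'doc-2')
```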
@@ -88,7 +88,20 @@ class UpstashVector(BaseVector):

    def search_by_vector(self, query_vector: list[float], **kwargs: Any) -> list[Document]:
        top_k = kwargs.get("top_k", 4)
-       result = self.index.query(vector=query_vector, top_k=top_k, include_metadata=True, include_data=True)
+       document_ids_filter = kwargs.get("document_ids_filter")
+       if document_ids_filter:
+           document_ids = ", ".join(f"'{id}'" for id in document_ids_filter)
+           filter = f"document_id in ({document_ids})"
+       else:
+           filter = ""
+       result = self.index.query(
+           vector=query_vector,
+           top_k=top_k,
+           include_metadata=True,
+           include_data=True,
+           include_vectors=False,
+           filter=filter,
+       )
        docs = []
        score_threshold = float(kwargs.get("score_threshold") or 0.0)
        for record in result:
@@ -49,6 +49,10 @@ class BaseVector(ABC):
    def delete(self) -> None:
        raise NotImplementedError

    @abstractmethod
    def update_metadata(self, document_id: str, metadata: dict) -> None:
        raise NotImplementedError

    def _filter_duplicate_texts(self, texts: list[Document]) -> list[Document]:
        for text in texts.copy():
            if text.metadata and "doc_id" in text.metadata:
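`update_metadata` is now part of the abstract vector-store interface, so every backend has to provide it. A minimal sketch of what an override could look like, using a purely hypothetical in-memory store (nothing here is a class from this diff):

```python
from abc import ABC, abstractmethod


class BaseVectorSketch(ABC):
    """Trimmed stand-in for BaseVector, just enough to show the new hook."""

    @abstractmethod
    def update_metadata(self, document_id: str, metadata: dict) -> None:
        raise NotImplementedError


class InMemoryVectorSketch(BaseVectorSketch):
    def __init__(self) -> None:
        self._metadata_by_document: dict[str, dict] = {}

    def update_metadata(self, document_id: str, metadata: dict) -> None:
        # A real backend would rewrite the payload of every chunk that belongs
        # to the document; here we simply merge the new fields.
        self._metadata_by_document.setdefault(document_id, {}).update(metadata)


store = InMemoryVectorSketch()
store.update_metadata("doc-1", {"uploader": "alice"})
print(store._metadata_by_document)
```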
@@ -177,7 +177,11 @@ class VikingDBVector(BaseVector):
            query_vector, limit=kwargs.get("top_k", 4)
        )
        score_threshold = float(kwargs.get("score_threshold") or 0.0)
-       return self._get_search_res(results, score_threshold)
+       docs = self._get_search_res(results, score_threshold)
+       document_ids_filter = kwargs.get("document_ids_filter")
+       if document_ids_filter:
+           docs = [doc for doc in docs if doc.metadata.get("document_id") in document_ids_filter]
+       return docs

    def _get_search_res(self, results, score_threshold) -> list[Document]:
        if len(results) == 0:
@@ -168,16 +168,16 @@ class WeaviateVector(BaseVector):
        # check whether the index already exists
        schema = self._default_schema(self._collection_name)
        if self._client.schema.contains(schema):
-           for uuid in ids:
-               try:
-                   self._client.data_object.delete(
-                       class_name=self._collection_name,
-                       uuid=uuid,
-                   )
-               except weaviate.UnexpectedStatusCodeException as e:
-                   # tolerate not found error
-                   if e.status_code != 404:
-                       raise e
+           try:
+               self._client.batch.delete_objects(
+                   class_name=self._collection_name,
+                   where={"operator": "ContainsAny", "path": ["id"], "valueTextArray": ids},
+                   output="minimal",
+               )
+           except weaviate.UnexpectedStatusCodeException as e:
+               # tolerate not found error
+               if e.status_code != 404:
+                   raise e

    def search_by_vector(self, query_vector: list[float], **kwargs: Any) -> list[Document]:
        """Look up similar documents by embedding vector in Weaviate."""

@@ -187,8 +187,10 @@ class WeaviateVector(BaseVector):
        query_obj = self._client.query.get(collection_name, properties)

        vector = {"vector": query_vector}
-       if kwargs.get("where_filter"):
-           query_obj = query_obj.with_where(kwargs.get("where_filter"))
+       document_ids_filter = kwargs.get("document_ids_filter")
+       if document_ids_filter:
+           where_filter = {"operator": "ContainsAny", "path": ["document_id"], "valueTextArray": document_ids_filter}
+           query_obj = query_obj.with_where(where_filter)
        result = (
            query_obj.with_near_vector(vector)
            .with_limit(kwargs.get("top_k", 4))

@@ -233,8 +235,10 @@ class WeaviateVector(BaseVector):
        if kwargs.get("search_distance"):
            content["certainty"] = kwargs.get("search_distance")
        query_obj = self._client.query.get(collection_name, properties)
-       if kwargs.get("where_filter"):
-           query_obj = query_obj.with_where(kwargs.get("where_filter"))
+       document_ids_filter = kwargs.get("document_ids_filter")
+       if document_ids_filter:
+           where_filter = {"operator": "ContainsAny", "path": ["document_id"], "valueTextArray": document_ids_filter}
+           query_obj = query_obj.with_where(where_filter)
        query_obj = query_obj.with_additional(["vector"])
        properties = ["text"]
        result = query_obj.with_bm25(query=query, properties=properties).with_limit(kwargs.get("top_k", 4)).do()
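Both Weaviate hunks attach the same `ContainsAny` where-filter. A small hedged sketch of that dict shape on its own (the helper function is ours; the structure is the one used above):

```python
def build_document_ids_where_filter(document_ids_filter: list[str]) -> dict:
    # Same structure as the filters added in the hunks above.
    return {
        "operator": "ContainsAny",
        "path": ["document_id"],
        "valueTextArray": document_ids_filter,
    }


print(build_document_ids_where_filter(["doc-1", "doc-2"]))
```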
api/core/rag/index_processor/constant/built_in_field.py  (new file, +9)
@@ -0,0 +1,9 @@
from enum import Enum


class BuiltInField(str, Enum):
    document_name = "document_name"
    uploader = "uploader"
    upload_date = "upload_date"
    last_update_date = "last_update_date"
    source = "source"
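Because `BuiltInField` mixes in `str`, its members compare and concatenate like plain strings, which is how the rest of the diff uses them as metadata keys. A small self-contained check (the enum body is repeated locally so the snippet runs on its own):

```python
from enum import Enum


class BuiltInField(str, Enum):
    document_name = "document_name"
    uploader = "uploader"
    upload_date = "upload_date"
    last_update_date = "last_update_date"
    source = "source"


assert BuiltInField.uploader == "uploader"              # str mix-in: equal to its plain value
print("metadata." + BuiltInField.document_name.value)   # -> metadata.document_name
```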
@@ -237,6 +237,7 @@ class DatasetRetrieval:
        model_config: ModelConfigWithCredentialsEntity,
        planning_strategy: PlanningStrategy,
        message_id: Optional[str] = None,
        metadata_filter_document_ids: Optional[dict[str, list[str]]] = None,
    ):
        tools = []
        for dataset in available_datasets:

@@ -291,6 +292,11 @@ class DatasetRetrieval:
                    document.metadata["dataset_name"] = dataset.name
                    results.append(document)
        else:
            document_ids_filter = None
            if metadata_filter_document_ids:
                document_ids = metadata_filter_document_ids.get(dataset.id, [])
                if document_ids:
                    document_ids_filter = document_ids
            retrieval_model_config = dataset.retrieval_model or default_retrieval_model

            # get top k

@@ -322,6 +328,7 @@ class DatasetRetrieval:
                reranking_model=reranking_model,
                reranking_mode=retrieval_model_config.get("reranking_mode", "reranking_model"),
                weights=retrieval_model_config.get("weights", None),
                document_ids_filter=document_ids_filter,
            )
            self._on_query(query, [dataset_id], app_id, user_from, user_id)
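The new `metadata_filter_document_ids` argument is a per-dataset allow-list (dataset id mapped to document ids). A hedged sketch of the lookup each dataset goes through (function and variable names are ours):

```python
from typing import Optional


def resolve_document_ids_filter(
    dataset_id: str,
    metadata_filter_document_ids: Optional[dict[str, list[str]]],
) -> Optional[list[str]]:
    # Mirrors the branch above: only constrain retrieval when the metadata
    # filter produced document ids for this particular dataset.
    if not metadata_filter_document_ids:
        return None
    document_ids = metadata_filter_document_ids.get(dataset_id, [])
    return document_ids or None


print(resolve_document_ids_filter("ds-1", {"ds-1": ["doc-9"]}))  # ['doc-9']
print(resolve_document_ids_filter("ds-2", {"ds-1": ["doc-9"]}))  # None
```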
@@ -105,10 +105,10 @@ class ApiTool(Tool):
        needed_parameters = [parameter for parameter in (self.api_bundle.parameters or []) if parameter.required]
        for parameter in needed_parameters:
            if parameter.required and parameter.name not in parameters:
-               raise ToolParameterValidationError(f"Missing required parameter {parameter.name}")
-
-           if parameter.default is not None and parameter.name not in parameters:
-               parameters[parameter.name] = parameter.default
+               if parameter.default is not None:
+                   parameters[parameter.name] = parameter.default
+               else:
+                   raise ToolParameterValidationError(f"Missing required parameter {parameter.name}")

        return headers
@@ -246,10 +246,11 @@ class ToolEngine:
                    + "you do not need to create it, just tell the user to check it now."
                )
            elif response.type == ToolInvokeMessage.MessageType.JSON:
-               text = json.dumps(cast(ToolInvokeMessage.JsonMessage, response.message).json_object, ensure_ascii=False)
-               result += f"tool response: {text}."
+               result = json.dumps(
+                   cast(ToolInvokeMessage.JsonMessage, response.message).json_object, ensure_ascii=False
+               )
            else:
-               result += f"tool response: {response.message!r}."
+               result += str(response.message)

        return result
@@ -9,7 +9,7 @@ from typing import TYPE_CHECKING, Any, Union, cast
from yarl import URL

import contexts
-from core.plugin.entities.plugin import GenericProviderID
+from core.plugin.entities.plugin import ToolProviderID
from core.plugin.manager.tool import PluginToolManager
from core.tools.__base.tool_provider import ToolProviderController
from core.tools.__base.tool_runtime import ToolRuntime

@@ -188,7 +188,7 @@ class ToolManager:
            )

        if isinstance(provider_controller, PluginToolProviderController):
-           provider_id_entity = GenericProviderID(provider_id)
+           provider_id_entity = ToolProviderID(provider_id)
            # get credentials
            builtin_provider: BuiltinToolProvider | None = (
                db.session.query(BuiltinToolProvider)
@ -572,95 +572,96 @@ class ToolManager:
|
||||
else:
|
||||
filters.append(typ)
|
||||
|
||||
if "builtin" in filters:
|
||||
# get builtin providers
|
||||
builtin_providers = cls.list_builtin_providers(tenant_id)
|
||||
with db.session.no_autoflush:
|
||||
if "builtin" in filters:
|
||||
# get builtin providers
|
||||
builtin_providers = cls.list_builtin_providers(tenant_id)
|
||||
|
||||
# get db builtin providers
|
||||
db_builtin_providers: list[BuiltinToolProvider] = (
|
||||
db.session.query(BuiltinToolProvider).filter(BuiltinToolProvider.tenant_id == tenant_id).all()
|
||||
)
|
||||
|
||||
# rewrite db_builtin_providers
|
||||
for db_provider in db_builtin_providers:
|
||||
tool_provider_id = GenericProviderID(db_provider.provider)
|
||||
db_provider.provider = tool_provider_id.to_string()
|
||||
|
||||
def find_db_builtin_provider(provider):
|
||||
return next((x for x in db_builtin_providers if x.provider == provider), None)
|
||||
|
||||
# append builtin providers
|
||||
for provider in builtin_providers:
|
||||
# handle include, exclude
|
||||
if is_filtered(
|
||||
include_set=cast(set[str], dify_config.POSITION_TOOL_INCLUDES_SET),
|
||||
exclude_set=cast(set[str], dify_config.POSITION_TOOL_EXCLUDES_SET),
|
||||
data=provider,
|
||||
name_func=lambda x: x.identity.name,
|
||||
):
|
||||
continue
|
||||
|
||||
user_provider = ToolTransformService.builtin_provider_to_user_provider(
|
||||
provider_controller=provider,
|
||||
db_provider=find_db_builtin_provider(provider.entity.identity.name),
|
||||
decrypt_credentials=False,
|
||||
# get db builtin providers
|
||||
db_builtin_providers: list[BuiltinToolProvider] = (
|
||||
db.session.query(BuiltinToolProvider).filter(BuiltinToolProvider.tenant_id == tenant_id).all()
|
||||
)
|
||||
|
||||
if isinstance(provider, PluginToolProviderController):
|
||||
result_providers[f"plugin_provider.{user_provider.name}"] = user_provider
|
||||
else:
|
||||
result_providers[f"builtin_provider.{user_provider.name}"] = user_provider
|
||||
# rewrite db_builtin_providers
|
||||
for db_provider in db_builtin_providers:
|
||||
tool_provider_id = str(ToolProviderID(db_provider.provider))
|
||||
db_provider.provider = tool_provider_id
|
||||
|
||||
# get db api providers
|
||||
def find_db_builtin_provider(provider):
|
||||
return next((x for x in db_builtin_providers if x.provider == provider), None)
|
||||
|
||||
if "api" in filters:
|
||||
db_api_providers: list[ApiToolProvider] = (
|
||||
db.session.query(ApiToolProvider).filter(ApiToolProvider.tenant_id == tenant_id).all()
|
||||
)
|
||||
# append builtin providers
|
||||
for provider in builtin_providers:
|
||||
# handle include, exclude
|
||||
if is_filtered(
|
||||
include_set=cast(set[str], dify_config.POSITION_TOOL_INCLUDES_SET),
|
||||
exclude_set=cast(set[str], dify_config.POSITION_TOOL_EXCLUDES_SET),
|
||||
data=provider,
|
||||
name_func=lambda x: x.identity.name,
|
||||
):
|
||||
continue
|
||||
|
||||
api_provider_controllers: list[dict[str, Any]] = [
|
||||
{"provider": provider, "controller": ToolTransformService.api_provider_to_controller(provider)}
|
||||
for provider in db_api_providers
|
||||
]
|
||||
|
||||
# get labels
|
||||
labels = ToolLabelManager.get_tools_labels([x["controller"] for x in api_provider_controllers])
|
||||
|
||||
for api_provider_controller in api_provider_controllers:
|
||||
user_provider = ToolTransformService.api_provider_to_user_provider(
|
||||
provider_controller=api_provider_controller["controller"],
|
||||
db_provider=api_provider_controller["provider"],
|
||||
decrypt_credentials=False,
|
||||
labels=labels.get(api_provider_controller["controller"].provider_id, []),
|
||||
)
|
||||
result_providers[f"api_provider.{user_provider.name}"] = user_provider
|
||||
|
||||
if "workflow" in filters:
|
||||
# get workflow providers
|
||||
workflow_providers: list[WorkflowToolProvider] = (
|
||||
db.session.query(WorkflowToolProvider).filter(WorkflowToolProvider.tenant_id == tenant_id).all()
|
||||
)
|
||||
|
||||
workflow_provider_controllers: list[WorkflowToolProviderController] = []
|
||||
for provider in workflow_providers:
|
||||
try:
|
||||
workflow_provider_controllers.append(
|
||||
ToolTransformService.workflow_provider_to_controller(db_provider=provider)
|
||||
user_provider = ToolTransformService.builtin_provider_to_user_provider(
|
||||
provider_controller=provider,
|
||||
db_provider=find_db_builtin_provider(provider.entity.identity.name),
|
||||
decrypt_credentials=False,
|
||||
)
|
||||
except Exception:
|
||||
# app has been deleted
|
||||
pass
|
||||
|
||||
labels = ToolLabelManager.get_tools_labels(
|
||||
[cast(ToolProviderController, controller) for controller in workflow_provider_controllers]
|
||||
)
|
||||
if isinstance(provider, PluginToolProviderController):
|
||||
result_providers[f"plugin_provider.{user_provider.name}"] = user_provider
|
||||
else:
|
||||
result_providers[f"builtin_provider.{user_provider.name}"] = user_provider
|
||||
|
||||
for provider_controller in workflow_provider_controllers:
|
||||
user_provider = ToolTransformService.workflow_provider_to_user_provider(
|
||||
provider_controller=provider_controller,
|
||||
labels=labels.get(provider_controller.provider_id, []),
|
||||
# get db api providers
|
||||
|
||||
if "api" in filters:
|
||||
db_api_providers: list[ApiToolProvider] = (
|
||||
db.session.query(ApiToolProvider).filter(ApiToolProvider.tenant_id == tenant_id).all()
|
||||
)
|
||||
result_providers[f"workflow_provider.{user_provider.name}"] = user_provider
|
||||
|
||||
api_provider_controllers: list[dict[str, Any]] = [
|
||||
{"provider": provider, "controller": ToolTransformService.api_provider_to_controller(provider)}
|
||||
for provider in db_api_providers
|
||||
]
|
||||
|
||||
# get labels
|
||||
labels = ToolLabelManager.get_tools_labels([x["controller"] for x in api_provider_controllers])
|
||||
|
||||
for api_provider_controller in api_provider_controllers:
|
||||
user_provider = ToolTransformService.api_provider_to_user_provider(
|
||||
provider_controller=api_provider_controller["controller"],
|
||||
db_provider=api_provider_controller["provider"],
|
||||
decrypt_credentials=False,
|
||||
labels=labels.get(api_provider_controller["controller"].provider_id, []),
|
||||
)
|
||||
result_providers[f"api_provider.{user_provider.name}"] = user_provider
|
||||
|
||||
if "workflow" in filters:
|
||||
# get workflow providers
|
||||
workflow_providers: list[WorkflowToolProvider] = (
|
||||
db.session.query(WorkflowToolProvider).filter(WorkflowToolProvider.tenant_id == tenant_id).all()
|
||||
)
|
||||
|
||||
workflow_provider_controllers: list[WorkflowToolProviderController] = []
|
||||
for provider in workflow_providers:
|
||||
try:
|
||||
workflow_provider_controllers.append(
|
||||
ToolTransformService.workflow_provider_to_controller(db_provider=provider)
|
||||
)
|
||||
except Exception:
|
||||
# app has been deleted
|
||||
pass
|
||||
|
||||
labels = ToolLabelManager.get_tools_labels(
|
||||
[cast(ToolProviderController, controller) for controller in workflow_provider_controllers]
|
||||
)
|
||||
|
||||
for provider_controller in workflow_provider_controllers:
|
||||
user_provider = ToolTransformService.workflow_provider_to_user_provider(
|
||||
provider_controller=provider_controller,
|
||||
labels=labels.get(provider_controller.provider_id, []),
|
||||
)
|
||||
result_providers[f"workflow_provider.{user_provider.name}"] = user_provider
|
||||
|
||||
return BuiltinToolProviderSort.sort(list(result_providers.values()))
|
||||
|
||||
|
||||
@@ -590,6 +590,8 @@ class Graph(BaseModel):
                    start_node_id=node_id,
                    routes_node_ids=routes_node_ids,
                )
                # Exclude conditional branch nodes
                and all(edge.run_condition is None for edge in reverse_edge_mapping.get(node_id, []))
            ):
                if node_id not in merge_branch_node_ids:
                    merge_branch_node_ids[node_id] = []
@@ -1,6 +1,3 @@
-from collections.abc import Mapping, Sequence
-from typing import Any
-
from core.workflow.entities.node_entities import NodeRunResult
from core.workflow.nodes.base import BaseNode
from core.workflow.nodes.end.entities import EndNodeData

@@ -30,20 +27,3 @@ class EndNode(BaseNode[EndNodeData]):
            inputs=outputs,
            outputs=outputs,
        )
-
-   @classmethod
-   def _extract_variable_selector_to_variable_mapping(
-       cls,
-       *,
-       graph_config: Mapping[str, Any],
-       node_id: str,
-       node_data: EndNodeData,
-   ) -> Mapping[str, Sequence[str]]:
-       """
-       Extract variable selector to variable mapping
-       :param graph_config: graph config
-       :param node_id: node id
-       :param node_data: node data
-       :return:
-       """
-       return {}
@ -1,5 +1,4 @@
|
||||
from collections.abc import Mapping, Sequence
|
||||
from typing import Any, Literal
|
||||
from typing import Literal
|
||||
|
||||
from typing_extensions import deprecated
|
||||
|
||||
@ -88,23 +87,6 @@ class IfElseNode(BaseNode[IfElseNodeData]):
|
||||
|
||||
return data
|
||||
|
||||
@classmethod
|
||||
def _extract_variable_selector_to_variable_mapping(
|
||||
cls,
|
||||
*,
|
||||
graph_config: Mapping[str, Any],
|
||||
node_id: str,
|
||||
node_data: IfElseNodeData,
|
||||
) -> Mapping[str, Sequence[str]]:
|
||||
"""
|
||||
Extract variable selector to variable mapping
|
||||
:param graph_config: graph config
|
||||
:param node_id: node id
|
||||
:param node_data: node data
|
||||
:return:
|
||||
"""
|
||||
return {}
|
||||
|
||||
|
||||
@deprecated("This function is deprecated. You should use the new cases structure.")
|
||||
def _should_not_use_old_function(
|
||||
|
||||
@@ -590,6 +590,7 @@ class IterationNode(BaseNode[IterationNodeData]):
            with flask_app.app_context():
                parallel_mode_run_id = uuid.uuid4().hex
                graph_engine_copy = graph_engine.create_copy()
                graph_engine_copy.graph_runtime_state.total_tokens = 0
                variable_pool_copy = graph_engine_copy.graph_runtime_state.variable_pool
                variable_pool_copy.add([self.node_id, "index"], index)
                variable_pool_copy.add([self.node_id, "item"], item)
@ -1,10 +1,7 @@
|
||||
from collections.abc import Mapping, Sequence
|
||||
from typing import Any
|
||||
|
||||
from core.workflow.entities.node_entities import NodeRunResult
|
||||
from core.workflow.nodes.base import BaseNode
|
||||
from core.workflow.nodes.enums import NodeType
|
||||
from core.workflow.nodes.iteration.entities import IterationNodeData, IterationStartNodeData
|
||||
from core.workflow.nodes.iteration.entities import IterationStartNodeData
|
||||
from models.workflow import WorkflowNodeExecutionStatus
|
||||
|
||||
|
||||
@ -21,16 +18,3 @@ class IterationStartNode(BaseNode):
|
||||
Run the node.
|
||||
"""
|
||||
return NodeRunResult(status=WorkflowNodeExecutionStatus.SUCCEEDED)
|
||||
|
||||
@classmethod
|
||||
def _extract_variable_selector_to_variable_mapping(
|
||||
cls, graph_config: Mapping[str, Any], node_id: str, node_data: IterationNodeData
|
||||
) -> Mapping[str, Sequence[str]]:
|
||||
"""
|
||||
Extract variable selector to variable mapping
|
||||
:param graph_config: graph config
|
||||
:param node_id: node id
|
||||
:param node_data: node data
|
||||
:return:
|
||||
"""
|
||||
return {}
|
||||
|
||||
@@ -1,8 +1,10 @@
from collections.abc import Sequence
from typing import Any, Literal, Optional

-from pydantic import BaseModel
+from pydantic import BaseModel, Field

from core.workflow.nodes.base import BaseNodeData
from core.workflow.nodes.llm.entities import VisionConfig


class RerankingModelConfig(BaseModel):

@@ -73,6 +75,48 @@ class SingleRetrievalConfig(BaseModel):
    model: ModelConfig


SupportedComparisonOperator = Literal[
    # for string or array
    "contains",
    "not contains",
    "starts with",
    "ends with",
    "is",
    "is not",
    "empty",
    "is not empty",
    # for number
    "=",
    "≠",
    ">",
    "<",
    "≥",
    "≤",
    # for time
    "before",
    "after",
]


class Condition(BaseModel):
    """
    Condition detail
    """

    metadata_name: str
    comparison_operator: SupportedComparisonOperator
    value: str | Sequence[str] | None = None


class MetadataFilteringCondition(BaseModel):
    """
    Metadata Filtering Condition.
    """

    logical_operator: Optional[Literal["and", "or"]] = "and"
    conditions: Optional[list[Condition]] = Field(default=None, deprecated=True)


class KnowledgeRetrievalNodeData(BaseNodeData):
    """
    Knowledge retrieval Node Data.

@@ -84,3 +128,7 @@ class KnowledgeRetrievalNodeData(BaseNodeData):
    retrieval_mode: Literal["single", "multiple"]
    multiple_retrieval_config: Optional[MultipleRetrievalConfig] = None
    single_retrieval_config: Optional[SingleRetrievalConfig] = None
    metadata_filtering_mode: Optional[Literal["disabled", "automatic", "manual"]] = "disabled"
    metadata_model_config: Optional[ModelConfig] = None
    metadata_filtering_conditions: Optional[MetadataFilteringCondition] = None
    vision: VisionConfig = Field(default_factory=VisionConfig)
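A hedged sketch of instantiating the new filtering models (the values are invented, and it assumes `Condition` and `MetadataFilteringCondition` above are importable from this entities module; note that `conditions` is already declared deprecated on the model):

```python
condition = Condition(
    metadata_name="uploader",
    comparison_operator="is",
    value="alice",
)
metadata_filtering_conditions = MetadataFilteringCondition(
    logical_operator="and",
    conditions=[condition],  # declared deprecated=True above, kept for compatibility
)
print(metadata_filtering_conditions.model_dump())
```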
@@ -16,3 +16,7 @@ class ModelNotSupportedError(KnowledgeRetrievalNodeError):

class ModelQuotaExceededError(KnowledgeRetrievalNodeError):
    """Raised when the model provider quota is exceeded."""


class InvalidModelTypeError(KnowledgeRetrievalNodeError):
    """Raised when the model is not a Large Language Model."""
@ -1,6 +1,8 @@
|
||||
import json
|
||||
import logging
|
||||
from collections import defaultdict
|
||||
from collections.abc import Mapping, Sequence
|
||||
from typing import Any, cast
|
||||
from typing import Any, Optional, cast
|
||||
|
||||
from sqlalchemy import func
|
||||
|
||||
@ -9,21 +11,38 @@ from core.app.entities.app_invoke_entities import ModelConfigWithCredentialsEnti
|
||||
from core.entities.agent_entities import PlanningStrategy
|
||||
from core.entities.model_entities import ModelStatus
|
||||
from core.model_manager import ModelInstance, ModelManager
|
||||
from core.model_runtime.entities.model_entities import ModelFeature, ModelType
|
||||
from core.model_runtime.entities.message_entities import PromptMessageRole
|
||||
from core.model_runtime.entities.model_entities import ModelFeature, ModelPropertyKey, ModelType
|
||||
from core.model_runtime.model_providers.__base.large_language_model import LargeLanguageModel
|
||||
from core.prompt.advanced_prompt_transform import AdvancedPromptTransform
|
||||
from core.prompt.simple_prompt_transform import ModelMode
|
||||
from core.rag.datasource.retrieval_service import RetrievalService
|
||||
from core.rag.retrieval.dataset_retrieval import DatasetRetrieval
|
||||
from core.rag.retrieval.retrieval_methods import RetrievalMethod
|
||||
from core.variables import StringSegment
|
||||
from core.workflow.entities.node_entities import NodeRunResult
|
||||
from core.workflow.nodes.base import BaseNode
|
||||
from core.workflow.nodes.enums import NodeType
|
||||
from core.workflow.nodes.event.event import ModelInvokeCompletedEvent
|
||||
from core.workflow.nodes.knowledge_retrieval.template_prompts import (
|
||||
METADATA_FILTER_ASSISTANT_PROMPT_1,
|
||||
METADATA_FILTER_ASSISTANT_PROMPT_2,
|
||||
METADATA_FILTER_COMPLETION_PROMPT,
|
||||
METADATA_FILTER_SYSTEM_PROMPT,
|
||||
METADATA_FILTER_USER_PROMPT_1,
|
||||
METADATA_FILTER_USER_PROMPT_3,
|
||||
)
|
||||
from core.workflow.nodes.list_operator.exc import InvalidConditionError
|
||||
from core.workflow.nodes.llm.entities import LLMNodeChatModelMessage, LLMNodeCompletionModelPromptTemplate
|
||||
from core.workflow.nodes.llm.node import LLMNode
|
||||
from core.workflow.nodes.question_classifier.template_prompts import QUESTION_CLASSIFIER_USER_PROMPT_2
|
||||
from extensions.ext_database import db
|
||||
from models.dataset import Dataset, Document
|
||||
from libs.json_in_md_parser import parse_and_check_json_markdown
|
||||
from models.dataset import Dataset, DatasetMetadata, Document
|
||||
from models.workflow import WorkflowNodeExecutionStatus
|
||||
|
||||
from .entities import KnowledgeRetrievalNodeData
|
||||
from .exc import (
|
||||
InvalidModelTypeError,
|
||||
KnowledgeRetrievalNodeError,
|
||||
ModelCredentialsNotInitializedError,
|
||||
ModelNotExistError,
|
||||
@ -42,13 +61,14 @@ default_retrieval_model = {
|
||||
}
|
||||
|
||||
|
||||
class KnowledgeRetrievalNode(BaseNode[KnowledgeRetrievalNodeData]):
|
||||
class KnowledgeRetrievalNode(LLMNode):
|
||||
_node_data_cls = KnowledgeRetrievalNodeData
|
||||
_node_type = NodeType.KNOWLEDGE_RETRIEVAL
|
||||
|
||||
def _run(self) -> NodeRunResult:
|
||||
node_data = cast(KnowledgeRetrievalNodeData, self.node_data)
|
||||
# extract variables
|
||||
variable = self.graph_runtime_state.variable_pool.get(self.node_data.query_variable_selector)
|
||||
variable = self.graph_runtime_state.variable_pool.get(node_data.query_variable_selector)
|
||||
if not isinstance(variable, StringSegment):
|
||||
return NodeRunResult(
|
||||
status=WorkflowNodeExecutionStatus.FAILED,
|
||||
@ -63,7 +83,7 @@ class KnowledgeRetrievalNode(BaseNode[KnowledgeRetrievalNodeData]):
|
||||
)
|
||||
# retrieve knowledge
|
||||
try:
|
||||
results = self._fetch_dataset_retriever(node_data=self.node_data, query=query)
|
||||
results = self._fetch_dataset_retriever(node_data=node_data, query=query)
|
||||
outputs = {"result": results}
|
||||
return NodeRunResult(
|
||||
status=WorkflowNodeExecutionStatus.SUCCEEDED, inputs=variables, process_data=None, outputs=outputs
|
||||
@ -117,6 +137,9 @@ class KnowledgeRetrievalNode(BaseNode[KnowledgeRetrievalNodeData]):
|
||||
if not dataset:
|
||||
continue
|
||||
available_datasets.append(dataset)
|
||||
metadata_filter_document_ids = self._get_metadata_filter_condition(
|
||||
[dataset.id for dataset in available_datasets], query, node_data
|
||||
)
|
||||
all_documents = []
|
||||
dataset_retrieval = DatasetRetrieval()
|
||||
if node_data.retrieval_mode == DatasetRetrieveConfigEntity.RetrieveStrategy.SINGLE.value:
|
||||
@ -146,6 +169,7 @@ class KnowledgeRetrievalNode(BaseNode[KnowledgeRetrievalNodeData]):
|
||||
model_config=model_config,
|
||||
model_instance=model_instance,
|
||||
planning_strategy=planning_strategy,
|
||||
metadata_filter_document_ids=metadata_filter_document_ids,
|
||||
)
|
||||
elif node_data.retrieval_mode == DatasetRetrieveConfigEntity.RetrieveStrategy.MULTIPLE.value:
|
||||
if node_data.multiple_retrieval_config is None:
|
||||
@ -258,6 +282,134 @@ class KnowledgeRetrievalNode(BaseNode[KnowledgeRetrievalNodeData]):
|
||||
item["metadata"]["position"] = position
|
||||
return retrieval_resource_list
|
||||
|
||||
def _get_metadata_filter_condition(
|
||||
self, dataset_ids: list, query: str, node_data: KnowledgeRetrievalNodeData
|
||||
) -> dict[str, list[str]]:
|
||||
document_query = db.session.query(Document.id).filter(
|
||||
Document.dataset_id.in_(dataset_ids),
|
||||
Document.indexing_status == "completed",
|
||||
Document.enabled == True,
|
||||
Document.archived == False,
|
||||
)
|
||||
if node_data.metadata_filtering_mode == "disabled":
|
||||
return None
|
||||
elif node_data.metadata_filtering_mode == "automatic":
|
||||
automatic_metadata_filters = self._automatic_metadata_filter_func(dataset_ids, query, node_data)
|
||||
if automatic_metadata_filters:
|
||||
for filter in automatic_metadata_filters:
|
||||
self._process_metadata_filter_func(
|
||||
filter.get("condition"), filter.get("metadata_name"), filter.get("value"), document_query
|
||||
)
|
||||
elif node_data.metadata_filtering_mode == "manual":
|
||||
for condition in node_data.metadata_filtering_conditions.conditions:
|
||||
metadata_name = condition.metadata_name
|
||||
expected_value = condition.value
|
||||
if isinstance(expected_value, str):
|
||||
expected_value = self.graph_runtime_state.variable_pool.convert_template(expected_value).text
|
||||
self._process_metadata_filter_func(
|
||||
condition.comparison_operator, metadata_name, expected_value, document_query
|
||||
)
|
||||
else:
|
||||
raise ValueError("Invalid metadata filtering mode")
|
||||
documents = document_query.all()
|
||||
# group by dataset_id
|
||||
metadata_filter_document_ids = defaultdict(list)
|
||||
for document in documents:
|
||||
metadata_filter_document_ids[document.dataset_id].append(document.id)
|
||||
return metadata_filter_document_ids
|
||||
|
||||
def _automatic_metadata_filter_func(
|
||||
self, dataset_ids: list, query: str, node_data: KnowledgeRetrievalNodeData
|
||||
) -> list[dict[str, Any]]:
|
||||
# get all metadata field
|
||||
metadata_fields = db.session.query(DatasetMetadata).filter(DatasetMetadata.dataset_id.in_(dataset_ids)).all()
|
||||
all_metadata_fields = [metadata_field.field_name for metadata_field in metadata_fields]
|
||||
# get metadata model config
|
||||
metadata_model_config = node_data.metadata_model_config
|
||||
if metadata_model_config is None:
|
||||
raise ValueError("metadata_model_config is required")
|
||||
# get metadata model instance
|
||||
# fetch model config
|
||||
model_instance, model_config = self._fetch_model_config(node_data.metadata_model_config)
|
||||
# fetch prompt messages
|
||||
prompt_template = self._get_prompt_template(
|
||||
node_data=node_data,
|
||||
query=query or "",
|
||||
metadata_fields=all_metadata_fields,
|
||||
)
|
||||
prompt_messages, stop = self._fetch_prompt_messages(
|
||||
prompt_template=prompt_template,
|
||||
sys_query=query,
|
||||
memory=None,
|
||||
model_config=model_config,
|
||||
sys_files=[],
|
||||
vision_enabled=node_data.vision.enabled,
|
||||
vision_detail=node_data.vision.configs.detail,
|
||||
variable_pool=self.graph_runtime_state.variable_pool,
|
||||
jinja2_variables=[],
|
||||
)
|
||||
|
||||
result_text = ""
|
||||
try:
|
||||
# handle invoke result
|
||||
generator = self._invoke_llm(
|
||||
node_data_model=node_data.model,
|
||||
model_instance=model_instance,
|
||||
prompt_messages=prompt_messages,
|
||||
stop=stop,
|
||||
)
|
||||
|
||||
for event in generator:
|
||||
if isinstance(event, ModelInvokeCompletedEvent):
|
||||
result_text = event.text
|
||||
break
|
||||
|
||||
result_text_json = parse_and_check_json_markdown(result_text, [])
|
||||
automatic_metadata_filters = []
|
||||
if "metadata_map" in result_text_json:
|
||||
metadata_map = result_text_json["metadata_map"]
|
||||
for item in metadata_map:
|
||||
if item.get("metadata_field_name") in all_metadata_fields:
|
||||
automatic_metadata_filters.append(
|
||||
{
|
||||
"metadata_name": item.get("metadata_field_name"),
|
||||
"value": item.get("metadata_field_value"),
|
||||
"condition": item.get("comparison_operator"),
|
||||
}
|
||||
)
|
||||
except Exception as e:
|
||||
return None
|
||||
return automatic_metadata_filters
|
||||
|
||||
    def _process_metadata_filter_func(self, condition: str, metadata_name: str, value: str, query):
        match condition:
            case "contains":
                query = query.filter(Document.doc_metadata[metadata_name].like(f"%{value}%"))
            case "not contains":
                query = query.filter(Document.doc_metadata[metadata_name].notlike(f"%{value}%"))
            case "starts with" | "start with":
                query = query.filter(Document.doc_metadata[metadata_name].like(f"{value}%"))
            case "ends with" | "end with":
                query = query.filter(Document.doc_metadata[metadata_name].like(f"%{value}"))
            case "is" | "=":
                query = query.filter(Document.doc_metadata[metadata_name] == value)
            case "is not" | "≠":
                query = query.filter(Document.doc_metadata[metadata_name] != value)
            case "empty" | "is empty":
                query = query.filter(Document.doc_metadata[metadata_name].is_(None))
            case "is not empty":
                query = query.filter(Document.doc_metadata[metadata_name].isnot(None))
            case "before" | "<":
                query = query.filter(Document.doc_metadata[metadata_name] < value)
            case "after" | ">":
                query = query.filter(Document.doc_metadata[metadata_name] > value)
            case "≤" | "<=":
                query = query.filter(Document.doc_metadata[metadata_name] <= value)
            case "≥" | ">=":
                query = query.filter(Document.doc_metadata[metadata_name] >= value)
            case _:
                raise InvalidConditionError(f"Invalid condition: {condition}")
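Each branch narrows the SQLAlchemy query on the JSONB `doc_metadata` column. A self-contained sketch of the same kind of expression with a stand-in table (only the explicit `astext`/`->>` form differs slightly from the model attribute used above):

```python
from sqlalchemy import Column, MetaData, String, Table, select
from sqlalchemy.dialects import postgresql
from sqlalchemy.dialects.postgresql import JSONB

metadata = MetaData()
documents_sketch = Table(
    "documents_sketch",  # stand-in table, not the real Document model
    metadata,
    Column("id", String, primary_key=True),
    Column("doc_metadata", JSONB),
)

# Roughly what case "is" | "=" produces: doc_metadata ->> 'uploader' = 'alice'
stmt = select(documents_sketch.c.id).where(documents_sketch.c.doc_metadata["uploader"].astext == "alice")
print(stmt.compile(dialect=postgresql.dialect()))
```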
|
||||
@classmethod
|
||||
def _extract_variable_selector_to_variable_mapping(
|
||||
cls,
|
||||
@ -343,3 +495,94 @@ class KnowledgeRetrievalNode(BaseNode[KnowledgeRetrievalNodeData]):
|
||||
parameters=completion_params,
|
||||
stop=stop,
|
||||
)
|
||||
|
||||
def _calculate_rest_token(
|
||||
self,
|
||||
node_data: KnowledgeRetrievalNodeData,
|
||||
query: str,
|
||||
model_config: ModelConfigWithCredentialsEntity,
|
||||
context: Optional[str],
|
||||
) -> int:
|
||||
prompt_transform = AdvancedPromptTransform(with_variable_tmpl=True)
|
||||
prompt_template = self._get_prompt_template(node_data, query, None, 2000)
|
||||
prompt_messages = prompt_transform.get_prompt(
|
||||
prompt_template=prompt_template,
|
||||
inputs={},
|
||||
query="",
|
||||
files=[],
|
||||
context=context,
|
||||
memory_config=node_data.memory,
|
||||
memory=None,
|
||||
model_config=model_config,
|
||||
)
|
||||
rest_tokens = 2000
|
||||
|
||||
model_context_tokens = model_config.model_schema.model_properties.get(ModelPropertyKey.CONTEXT_SIZE)
|
||||
if model_context_tokens:
|
||||
model_instance = ModelInstance(
|
||||
provider_model_bundle=model_config.provider_model_bundle, model=model_config.model
|
||||
)
|
||||
|
||||
curr_message_tokens = model_instance.get_llm_num_tokens(prompt_messages)
|
||||
|
||||
max_tokens = 0
|
||||
for parameter_rule in model_config.model_schema.parameter_rules:
|
||||
if parameter_rule.name == "max_tokens" or (
|
||||
parameter_rule.use_template and parameter_rule.use_template == "max_tokens"
|
||||
):
|
||||
max_tokens = (
|
||||
model_config.parameters.get(parameter_rule.name)
|
||||
or model_config.parameters.get(parameter_rule.use_template or "")
|
||||
) or 0
|
||||
|
||||
rest_tokens = model_context_tokens - max_tokens - curr_message_tokens
|
||||
rest_tokens = max(rest_tokens, 0)
|
||||
|
||||
return rest_tokens
|
||||
|
||||
def _get_prompt_template(self, node_data: KnowledgeRetrievalNodeData, metadata_fields: list, query: str):
|
||||
model_mode = ModelMode.value_of(node_data.metadata_model_config.mode)
|
||||
input_text = query
|
||||
memory_str = ""
|
||||
|
||||
prompt_messages: list[LLMNodeChatModelMessage] = []
|
||||
if model_mode == ModelMode.CHAT:
|
||||
system_prompt_messages = LLMNodeChatModelMessage(
|
||||
role=PromptMessageRole.SYSTEM, text=METADATA_FILTER_SYSTEM_PROMPT
|
||||
)
|
||||
prompt_messages.append(system_prompt_messages)
|
||||
user_prompt_message_1 = LLMNodeChatModelMessage(
|
||||
role=PromptMessageRole.USER, text=METADATA_FILTER_USER_PROMPT_1
|
||||
)
|
||||
prompt_messages.append(user_prompt_message_1)
|
||||
assistant_prompt_message_1 = LLMNodeChatModelMessage(
|
||||
role=PromptMessageRole.ASSISTANT, text=METADATA_FILTER_ASSISTANT_PROMPT_1
|
||||
)
|
||||
prompt_messages.append(assistant_prompt_message_1)
|
||||
user_prompt_message_2 = LLMNodeChatModelMessage(
|
||||
role=PromptMessageRole.USER, text=QUESTION_CLASSIFIER_USER_PROMPT_2
|
||||
)
|
||||
prompt_messages.append(user_prompt_message_2)
|
||||
assistant_prompt_message_2 = LLMNodeChatModelMessage(
|
||||
role=PromptMessageRole.ASSISTANT, text=METADATA_FILTER_ASSISTANT_PROMPT_2
|
||||
)
|
||||
prompt_messages.append(assistant_prompt_message_2)
|
||||
user_prompt_message_3 = LLMNodeChatModelMessage(
|
||||
role=PromptMessageRole.USER,
|
||||
text=METADATA_FILTER_USER_PROMPT_3.format(
|
||||
input_text=input_text,
|
||||
metadata_fields=json.dumps(metadata_fields, ensure_ascii=False),
|
||||
),
|
||||
)
|
||||
prompt_messages.append(user_prompt_message_3)
|
||||
return prompt_messages
|
||||
elif model_mode == ModelMode.COMPLETION:
|
||||
return LLMNodeCompletionModelPromptTemplate(
|
||||
text=METADATA_FILTER_COMPLETION_PROMPT.format(
|
||||
input_text=input_text,
|
||||
metadata_fields=json.dumps(metadata_fields, ensure_ascii=False),
|
||||
)
|
||||
)
|
||||
|
||||
else:
|
||||
raise InvalidModelTypeError(f"Model mode {model_mode} not support.")
|
||||
|
||||
@ -1,6 +1,7 @@
|
||||
import json
|
||||
import logging
|
||||
from collections.abc import Generator, Mapping, Sequence
|
||||
from datetime import UTC, datetime
|
||||
from typing import TYPE_CHECKING, Any, Optional, cast
|
||||
|
||||
from configs import dify_config
|
||||
@ -29,6 +30,7 @@ from core.model_runtime.entities.message_entities import (
|
||||
from core.model_runtime.entities.model_entities import ModelFeature, ModelPropertyKey, ModelType
|
||||
from core.model_runtime.model_providers.__base.large_language_model import LargeLanguageModel
|
||||
from core.model_runtime.utils.encoders import jsonable_encoder
|
||||
from core.plugin.entities.plugin import ModelProviderID
|
||||
from core.prompt.entities.advanced_prompt_entities import CompletionModelPromptTemplate, MemoryConfig
|
||||
from core.prompt.utils.prompt_message_util import PromptMessageUtil
|
||||
from core.variables import (
|
||||
@ -758,11 +760,17 @@ class LLMNode(BaseNode[LLMNodeData]):
|
||||
if used_quota is not None and system_configuration.current_quota_type is not None:
|
||||
db.session.query(Provider).filter(
|
||||
Provider.tenant_id == tenant_id,
|
||||
Provider.provider_name == model_instance.provider,
|
||||
# TODO: Use provider name with prefix after the data migration.
|
||||
Provider.provider_name == ModelProviderID(model_instance.provider).provider_name,
|
||||
Provider.provider_type == ProviderType.SYSTEM.value,
|
||||
Provider.quota_type == system_configuration.current_quota_type.value,
|
||||
Provider.quota_limit > Provider.quota_used,
|
||||
).update({"quota_used": Provider.quota_used + used_quota})
|
||||
).update(
|
||||
{
|
||||
"quota_used": Provider.quota_used + used_quota,
|
||||
"last_used": datetime.now(tz=UTC).replace(tzinfo=None),
|
||||
}
|
||||
)
|
||||
db.session.commit()
|
||||
|
||||
@classmethod
|
||||
|
||||
@ -1,6 +1,3 @@
|
||||
from collections.abc import Mapping, Sequence
|
||||
from typing import Any
|
||||
|
||||
from core.workflow.constants import SYSTEM_VARIABLE_NODE_ID
|
||||
from core.workflow.entities.node_entities import NodeRunResult
|
||||
from core.workflow.nodes.base import BaseNode
|
||||
@ -23,13 +20,3 @@ class StartNode(BaseNode[StartNodeData]):
|
||||
node_inputs[SYSTEM_VARIABLE_NODE_ID + "." + var] = system_inputs[var]
|
||||
|
||||
return NodeRunResult(status=WorkflowNodeExecutionStatus.SUCCEEDED, inputs=node_inputs, outputs=node_inputs)
|
||||
|
||||
@classmethod
|
||||
def _extract_variable_selector_to_variable_mapping(
|
||||
cls,
|
||||
*,
|
||||
graph_config: Mapping[str, Any],
|
||||
node_id: str,
|
||||
node_data: StartNodeData,
|
||||
) -> Mapping[str, Sequence[str]]:
|
||||
return {}
|
||||
|
||||
@ -1,6 +1,3 @@
|
||||
from collections.abc import Mapping, Sequence
|
||||
from typing import Any
|
||||
|
||||
from core.workflow.entities.node_entities import NodeRunResult
|
||||
from core.workflow.nodes.base import BaseNode
|
||||
from core.workflow.nodes.enums import NodeType
|
||||
@ -36,16 +33,3 @@ class VariableAggregatorNode(BaseNode[VariableAssignerNodeData]):
|
||||
break
|
||||
|
||||
return NodeRunResult(status=WorkflowNodeExecutionStatus.SUCCEEDED, outputs=outputs, inputs=inputs)
|
||||
|
||||
@classmethod
|
||||
def _extract_variable_selector_to_variable_mapping(
|
||||
cls, *, graph_config: Mapping[str, Any], node_id: str, node_data: VariableAssignerNodeData
|
||||
) -> Mapping[str, Sequence[str]]:
|
||||
"""
|
||||
Extract variable selector to variable mapping
|
||||
:param graph_config: graph config
|
||||
:param node_id: node id
|
||||
:param node_data: node data
|
||||
:return:
|
||||
"""
|
||||
return {}
|
||||
|
||||
@@ -1,6 +1,9 @@
from datetime import UTC, datetime

from configs import dify_config
from core.app.entities.app_invoke_entities import AgentChatAppGenerateEntity, ChatAppGenerateEntity
from core.entities.provider_entities import QuotaUnit
from core.plugin.entities.plugin import ModelProviderID
from events.message_event import message_was_created
from extensions.ext_database import db
from models.provider import Provider, ProviderType

@@ -48,9 +51,15 @@ def handle(sender, **kwargs):
    if used_quota is not None and system_configuration.current_quota_type is not None:
        db.session.query(Provider).filter(
            Provider.tenant_id == application_generate_entity.app_config.tenant_id,
-           Provider.provider_name == model_config.provider,
+           # TODO: Use provider name with prefix after the data migration.
+           Provider.provider_name == ModelProviderID(model_config.provider).provider_name,
            Provider.provider_type == ProviderType.SYSTEM.value,
            Provider.quota_type == system_configuration.current_quota_type.value,
            Provider.quota_limit > Provider.quota_used,
-       ).update({"quota_used": Provider.quota_used + used_quota})
+       ).update(
+           {
+               "quota_used": Provider.quota_used + used_quota,
+               "last_used": datetime.now(tz=UTC).replace(tzinfo=None),
+           }
+       )
        db.session.commit()
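The update now also stamps `last_used` and keeps the guard that only rows with remaining quota are touched. A self-contained sketch of that guarded-update pattern against a stand-in table (names are ours, not the real `Provider` model):

```python
from datetime import UTC, datetime

from sqlalchemy import Column, DateTime, Integer, MetaData, String, Table, update

metadata = MetaData()
providers_sketch = Table(
    "providers_sketch",  # stand-in, not the real Provider model
    metadata,
    Column("id", String, primary_key=True),
    Column("quota_used", Integer),
    Column("quota_limit", Integer),
    Column("last_used", DateTime),
)

used_quota = 1
stmt = (
    update(providers_sketch)
    .where(providers_sketch.c.quota_limit > providers_sketch.c.quota_used)  # never exceed the limit
    .values(
        quota_used=providers_sketch.c.quota_used + used_quota,  # server-side, race-free increment
        last_used=datetime.now(tz=UTC).replace(tzinfo=None),
    )
)
print(stmt)
```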
@@ -53,6 +53,8 @@ external_knowledge_info_fields = {
    "external_knowledge_api_endpoint": fields.String,
}

doc_metadata_fields = {"id": fields.String, "name": fields.String, "type": fields.String}

dataset_detail_fields = {
    "id": fields.String,
    "name": fields.String,

@@ -76,6 +78,8 @@ dataset_detail_fields = {
    "doc_form": fields.String,
    "external_knowledge_info": fields.Nested(external_knowledge_info_fields),
    "external_retrieval_model": fields.Nested(external_retrieval_model_fields, allow_null=True),
    "doc_metadata": fields.List(fields.Nested(doc_metadata_fields)),
    "built_in_field_enabled": fields.Boolean,
}

dataset_query_detail_fields = {

@@ -87,3 +91,9 @@ dataset_query_detail_fields = {
    "created_by": fields.String,
    "created_at": TimestampField,
}


dataset_metadata_fields = {
    "id": fields.String,
    "type": fields.String,
    "name": fields.String,
}
@@ -3,6 +3,13 @@ from flask_restful import fields  # type: ignore
from fields.dataset_fields import dataset_fields
from libs.helper import TimestampField

document_metadata_fields = {
    "id": fields.String,
    "name": fields.String,
    "type": fields.String,
    "value": fields.String,
}

document_fields = {
    "id": fields.String,
    "position": fields.Integer,

@@ -25,6 +32,7 @@ document_fields = {
    "word_count": fields.Integer,
    "hit_count": fields.Integer,
    "doc_form": fields.String,
    "doc_metadata": fields.List(fields.Nested(document_metadata_fields), attribute="doc_metadata_details"),
}

document_with_segments_fields = {

@@ -51,6 +59,7 @@ document_with_segments_fields = {
    "hit_count": fields.Integer,
    "completed_segments": fields.Integer,
    "total_segments": fields.Integer,
    "doc_metadata": fields.List(fields.Nested(document_metadata_fields), attribute="doc_metadata_details"),
}

dataset_and_document_fields = {
@ -0,0 +1,90 @@
|
||||
"""add_metadata_function
|
||||
|
||||
Revision ID: d20049ed0af6
|
||||
Revises: 08ec4f75af5e
|
||||
Create Date: 2025-02-27 09:17:48.903213
|
||||
|
||||
"""
|
||||
from alembic import op
|
||||
import models as models
|
||||
import sqlalchemy as sa
|
||||
from sqlalchemy.dialects import postgresql
|
||||
|
||||
# revision identifiers, used by Alembic.
|
||||
revision = 'd20049ed0af6'
|
||||
down_revision = '08ec4f75af5e'
|
||||
branch_labels = None
|
||||
depends_on = None
|
||||
|
||||
|
||||
def upgrade():
|
||||
# ### commands auto generated by Alembic - please adjust! ###
|
||||
op.create_table('dataset_metadata_bindings',
|
||||
sa.Column('id', models.types.StringUUID(), server_default=sa.text('uuid_generate_v4()'), nullable=False),
|
||||
sa.Column('tenant_id', models.types.StringUUID(), nullable=False),
|
||||
sa.Column('dataset_id', models.types.StringUUID(), nullable=False),
|
||||
sa.Column('metadata_id', models.types.StringUUID(), nullable=False),
|
||||
sa.Column('document_id', models.types.StringUUID(), nullable=False),
|
||||
sa.Column('created_at', sa.DateTime(), server_default=sa.text('CURRENT_TIMESTAMP'), nullable=False),
|
||||
sa.Column('created_by', models.types.StringUUID(), nullable=False),
|
||||
sa.PrimaryKeyConstraint('id', name='dataset_metadata_binding_pkey')
|
||||
)
|
||||
with op.batch_alter_table('dataset_metadata_bindings', schema=None) as batch_op:
|
||||
batch_op.create_index('dataset_metadata_binding_dataset_idx', ['dataset_id'], unique=False)
|
||||
batch_op.create_index('dataset_metadata_binding_document_idx', ['document_id'], unique=False)
|
||||
batch_op.create_index('dataset_metadata_binding_metadata_idx', ['metadata_id'], unique=False)
|
||||
batch_op.create_index('dataset_metadata_binding_tenant_idx', ['tenant_id'], unique=False)
|
||||
|
||||
op.create_table('dataset_metadatas',
|
||||
sa.Column('id', models.types.StringUUID(), server_default=sa.text('uuid_generate_v4()'), nullable=False),
|
||||
sa.Column('tenant_id', models.types.StringUUID(), nullable=False),
|
||||
sa.Column('dataset_id', models.types.StringUUID(), nullable=False),
|
||||
sa.Column('type', sa.String(length=255), nullable=False),
|
||||
sa.Column('name', sa.String(length=255), nullable=False),
|
||||
sa.Column('created_at', sa.DateTime(), server_default=sa.text('CURRENT_TIMESTAMP(0)'), nullable=False),
|
||||
sa.Column('updated_at', sa.DateTime(), server_default=sa.text('CURRENT_TIMESTAMP(0)'), nullable=False),
|
||||
sa.Column('created_by', models.types.StringUUID(), nullable=False),
|
||||
sa.Column('updated_by', models.types.StringUUID(), nullable=True),
|
||||
sa.PrimaryKeyConstraint('id', name='dataset_metadata_pkey')
|
||||
)
|
||||
with op.batch_alter_table('dataset_metadatas', schema=None) as batch_op:
|
||||
batch_op.create_index('dataset_metadata_dataset_idx', ['dataset_id'], unique=False)
|
||||
batch_op.create_index('dataset_metadata_tenant_idx', ['tenant_id'], unique=False)
|
||||
|
||||
with op.batch_alter_table('datasets', schema=None) as batch_op:
|
||||
batch_op.add_column(sa.Column('built_in_field_enabled', sa.Boolean(), server_default=sa.text('false'), nullable=False))
|
||||
|
||||
with op.batch_alter_table('documents', schema=None) as batch_op:
|
||||
batch_op.alter_column('doc_metadata',
|
||||
existing_type=postgresql.JSON(astext_type=sa.Text()),
|
||||
type_=postgresql.JSONB(astext_type=sa.Text()),
|
||||
existing_nullable=True)
|
||||
batch_op.create_index('document_metadata_idx', ['doc_metadata'], unique=False, postgresql_using='gin')
|
||||
# ### end Alembic commands ###
|
||||
|
||||
|
||||
def downgrade():
|
||||
# ### commands auto generated by Alembic - please adjust! ###
|
||||
with op.batch_alter_table('documents', schema=None) as batch_op:
|
||||
batch_op.drop_index('document_metadata_idx', postgresql_using='gin')
|
||||
batch_op.alter_column('doc_metadata',
|
||||
existing_type=postgresql.JSONB(astext_type=sa.Text()),
|
||||
type_=postgresql.JSON(astext_type=sa.Text()),
|
||||
existing_nullable=True)
|
||||
|
||||
with op.batch_alter_table('datasets', schema=None) as batch_op:
|
||||
batch_op.drop_column('built_in_field_enabled')
|
||||
|
||||
with op.batch_alter_table('dataset_metadatas', schema=None) as batch_op:
|
||||
batch_op.drop_index('dataset_metadata_tenant_idx')
|
||||
batch_op.drop_index('dataset_metadata_dataset_idx')
|
||||
|
||||
op.drop_table('dataset_metadatas')
|
||||
with op.batch_alter_table('dataset_metadata_bindings', schema=None) as batch_op:
|
||||
batch_op.drop_index('dataset_metadata_binding_tenant_idx')
|
||||
batch_op.drop_index('dataset_metadata_binding_metadata_idx')
|
||||
batch_op.drop_index('dataset_metadata_binding_document_idx')
|
||||
batch_op.drop_index('dataset_metadata_binding_dataset_idx')
|
||||
|
||||
op.drop_table('dataset_metadata_bindings')
|
||||
# ### end Alembic commands ###
|
||||
@ -16,6 +16,7 @@ from sqlalchemy.dialects.postgresql import JSONB
|
||||
from sqlalchemy.orm import Mapped
|
||||
|
||||
from configs import dify_config
|
||||
from core.rag.index_processor.constant.built_in_field import BuiltInField
|
||||
from core.rag.retrieval.retrieval_methods import RetrievalMethod
|
||||
from extensions.ext_storage import storage
|
||||
from services.entities.knowledge_entities.knowledge_entities import ParentMode, Rule
|
||||
@ -60,6 +61,7 @@ class Dataset(db.Model): # type: ignore[name-defined]
|
||||
embedding_model_provider = db.Column(db.String(255), nullable=True)
|
||||
collection_binding_id = db.Column(StringUUID, nullable=True)
|
||||
retrieval_model = db.Column(JSONB, nullable=True)
|
||||
built_in_field_enabled = db.Column(db.Boolean, nullable=False, server_default=db.text("false"))
|
||||
|
||||
@property
|
||||
def dataset_keyword_table(self):
|
||||
@ -197,6 +199,19 @@ class Dataset(db.Model): # type: ignore[name-defined]
|
||||
"external_knowledge_api_endpoint": json.loads(external_knowledge_api.settings).get("endpoint", ""),
|
||||
}
|
||||
|
||||
@property
|
||||
def doc_metadata(self):
|
||||
dataset_metadatas = db.session.query(DatasetMetadata).filter(DatasetMetadata.dataset_id == self.id).all()
|
||||
|
||||
return [
|
||||
{
|
||||
"id": dataset_metadata.id,
|
||||
"name": dataset_metadata.name,
|
||||
"type": dataset_metadata.type,
|
||||
}
|
||||
for dataset_metadata in dataset_metadatas
|
||||
]
|
||||
|
||||
@staticmethod
|
||||
def gen_collection_name_by_id(dataset_id: str) -> str:
|
||||
normalized_dataset_id = dataset_id.replace("-", "_")
|
||||
@ -250,6 +265,7 @@ class Document(db.Model): # type: ignore[name-defined]
|
||||
db.Index("document_dataset_id_idx", "dataset_id"),
|
||||
db.Index("document_is_paused_idx", "is_paused"),
|
||||
db.Index("document_tenant_idx", "tenant_id"),
|
||||
db.Index("document_metadata_idx", "doc_metadata", postgresql_using="gin"),
|
||||
)
|
||||
|
||||
# initial fields
|
||||
@ -306,7 +322,7 @@ class Document(db.Model): # type: ignore[name-defined]
|
||||
archived_at = db.Column(db.DateTime, nullable=True)
|
||||
updated_at = db.Column(db.DateTime, nullable=False, server_default=func.current_timestamp())
|
||||
doc_type = db.Column(db.String(40), nullable=True)
|
||||
doc_metadata = db.Column(db.JSON, nullable=True)
|
||||
doc_metadata = db.Column(JSONB, nullable=True)
|
||||
doc_form = db.Column(db.String(255), nullable=False, server_default=db.text("'text_model'::character varying"))
|
||||
doc_language = db.Column(db.String(255), nullable=True)
|
||||
|
||||
@ -397,6 +413,78 @@ class Document(db.Model): # type: ignore[name-defined]
|
||||
)
|
||||
|
||||
@property
|
||||
def uploader(self):
|
||||
user = db.session.query(Account).filter(Account.id == self.created_by).first()
|
||||
return user.name if user else None
|
||||
|
||||
@property
|
||||
def upload_date(self):
|
||||
return self.created_at
|
||||
|
||||
@property
|
||||
def last_update_date(self):
|
||||
return self.updated_at
|
||||
|
||||
@property
|
||||
def doc_metadata_details(self):
|
||||
if self.doc_metadata:
|
||||
document_metadatas = (
|
||||
db.session.query(DatasetMetadata)
|
||||
.join(DatasetMetadataBinding, DatasetMetadataBinding.metadata_id == DatasetMetadata.id)
|
||||
.filter(
|
||||
DatasetMetadataBinding.dataset_id == self.dataset_id, DatasetMetadataBinding.document_id == self.id
|
||||
)
|
||||
.all()
|
||||
)
|
||||
metadata_list = []
|
||||
for metadata in document_metadatas:
|
||||
metadata_dict = {
|
||||
"id": metadata.id,
|
||||
"name": metadata.name,
|
||||
"type": metadata.type,
|
||||
"value": self.doc_metadata.get(metadata.type),
|
||||
}
|
||||
metadata_list.append(metadata_dict)
|
||||
# deal built-in fields
|
||||
metadata_list.extend(self.get_built_in_fields())
|
||||
|
||||
return metadata_list
|
||||
return None
|
||||
|
||||
def get_built_in_fields(self):
|
||||
built_in_fields = []
|
||||
built_in_fields.append({
|
||||
"id": "built-in",
|
||||
"name": BuiltInField.document_name,
|
||||
"type": "string",
|
||||
"value": self.name,
|
||||
})
|
||||
built_in_fields.append({
|
||||
"id": "built-in",
|
||||
"name": BuiltInField.uploader,
|
||||
"type": "string",
|
||||
"value": self.uploader,
|
||||
})
|
||||
built_in_fields.append({
|
||||
"id": "built-in",
|
||||
"name": BuiltInField.upload_date,
|
||||
"type": "date",
|
||||
"value": self.created_at,
|
||||
})
|
||||
built_in_fields.append({
|
||||
"id": "built-in",
|
||||
"name": BuiltInField.last_update_date,
|
||||
"type": "date",
|
||||
"value": self.updated_at,
|
||||
})
|
||||
built_in_fields.append({
|
||||
"id": "built-in",
|
||||
"name": BuiltInField.source,
|
||||
"type": "string",
|
||||
"value": self.data_source_info,
|
||||
})
|
||||
return built_in_fields
|
||||
|
||||
def process_rule_dict(self):
|
||||
if self.dataset_process_rule_id:
|
||||
return self.dataset_process_rule.to_dict()
|
||||
@@ -930,3 +1018,41 @@ class DatasetAutoDisableLog(db.Model): # type: ignore[name-defined]
    document_id = db.Column(StringUUID, nullable=False)
    notified = db.Column(db.Boolean, nullable=False, server_default=db.text("false"))
    created_at = db.Column(db.DateTime, nullable=False, server_default=db.text("CURRENT_TIMESTAMP(0)"))


class DatasetMetadata(db.Model): # type: ignore[name-defined]
    __tablename__ = "dataset_metadatas"
    __table_args__ = (
        db.PrimaryKeyConstraint("id", name="dataset_metadata_pkey"),
        db.Index("dataset_metadata_tenant_idx", "tenant_id"),
        db.Index("dataset_metadata_dataset_idx", "dataset_id"),
    )

    id = db.Column(StringUUID, server_default=db.text("uuid_generate_v4()"))
    tenant_id = db.Column(StringUUID, nullable=False)
    dataset_id = db.Column(StringUUID, nullable=False)
    type = db.Column(db.String(255), nullable=False)
    name = db.Column(db.String(255), nullable=False)
    created_at = db.Column(db.DateTime, nullable=False, server_default=db.text("CURRENT_TIMESTAMP(0)"))
    updated_at = db.Column(db.DateTime, nullable=False, server_default=db.text("CURRENT_TIMESTAMP(0)"))
    created_by = db.Column(StringUUID, nullable=False)
    updated_by = db.Column(StringUUID, nullable=True)


class DatasetMetadataBinding(db.Model): # type: ignore[name-defined]
    __tablename__ = "dataset_metadata_bindings"
    __table_args__ = (
        db.PrimaryKeyConstraint("id", name="dataset_metadata_binding_pkey"),
        db.Index("dataset_metadata_binding_tenant_idx", "tenant_id"),
        db.Index("dataset_metadata_binding_dataset_idx", "dataset_id"),
        db.Index("dataset_metadata_binding_metadata_idx", "metadata_id"),
        db.Index("dataset_metadata_binding_document_idx", "document_id"),
    )

    id = db.Column(StringUUID, server_default=db.text("uuid_generate_v4()"))
    tenant_id = db.Column(StringUUID, nullable=False)
    dataset_id = db.Column(StringUUID, nullable=False)
    metadata_id = db.Column(StringUUID, nullable=False)
    document_id = db.Column(StringUUID, nullable=False)
    created_at = db.Column(db.DateTime, nullable=False, server_default=func.current_timestamp())
    created_by = db.Column(StringUUID, nullable=False)
@@ -604,7 +604,7 @@ class InstalledApp(Base):
        return tenant


-class Conversation(Base):
+class Conversation(db.Model): # type: ignore[name-defined]
    __tablename__ = "conversations"
    __table_args__ = (
        db.PrimaryKeyConstraint("id", name="conversation_pkey"),
@@ -839,7 +839,7 @@ class Conversation(Base):
        return self.override_model_configs is not None


-class Message(Base):
+class Message(db.Model): # type: ignore[name-defined]
    __tablename__ = "messages"
    __table_args__ = (
        PrimaryKeyConstraint("id", name="message_pkey"),
@@ -1190,7 +1190,7 @@ class Message(Base):
        )


-class MessageFeedback(Base):
+class MessageFeedback(db.Model): # type: ignore[name-defined]
    __tablename__ = "message_feedbacks"
    __table_args__ = (
        db.PrimaryKeyConstraint("id", name="message_feedback_pkey"),
@@ -1217,7 +1217,7 @@ class MessageFeedback(Base):
        return account


-class MessageFile(Base):
+class MessageFile(db.Model): # type: ignore[name-defined]
    __tablename__ = "message_files"
    __table_args__ = (
        db.PrimaryKeyConstraint("id", name="message_file_pkey"),
@@ -1258,7 +1258,7 @@ class MessageFile(Base):
    created_at: Mapped[datetime] = db.Column(db.DateTime, nullable=False, server_default=func.current_timestamp())


-class MessageAnnotation(Base):
+class MessageAnnotation(db.Model): # type: ignore[name-defined]
    __tablename__ = "message_annotations"
    __table_args__ = (
        db.PrimaryKeyConstraint("id", name="message_annotation_pkey"),
@@ -1327,7 +1327,7 @@ class AppAnnotationHitHistory(db.Model): # type: ignore[name-defined]
        return account


-class AppAnnotationSetting(Base):
+class AppAnnotationSetting(db.Model): # type: ignore[name-defined]
    __tablename__ = "app_annotation_settings"
    __table_args__ = (
        db.PrimaryKeyConstraint("id", name="app_annotation_settings_pkey"),
api/poetry.lock (generated, 2032 lines changed): file diff suppressed because it is too large.

api/pyproject.toml
@@ -18,7 +18,7 @@ package-mode = false
authlib = "1.3.1"
azure-identity = "1.16.1"
beautifulsoup4 = "4.12.2"
-boto3 = "1.36.12"
+boto3 = "1.37.1"
bs4 = "~0.0.1"
cachetools = "~5.3.0"
celery = "~5.4.0"
|
||||
|
||||
@ -13,10 +13,11 @@ from sqlalchemy.orm import Session
|
||||
from werkzeug.exceptions import NotFound
|
||||
|
||||
from configs import dify_config
|
||||
from core.entities import DEFAULT_PLUGIN_ID
|
||||
from core.errors.error import LLMBadRequestError, ProviderTokenNotInitError
|
||||
from core.model_manager import ModelManager
|
||||
from core.model_runtime.entities.model_entities import ModelType
|
||||
from core.rag.index_processor.constant.built_in_field import BuiltInField
|
||||
from core.plugin.entities.plugin import ModelProviderID
|
||||
from core.rag.index_processor.constant.index_type import IndexType
|
||||
from core.rag.retrieval.retrieval_methods import RetrievalMethod
|
||||
from events.dataset_event import dataset_was_deleted
|
||||
@ -328,14 +329,10 @@ class DatasetService:
|
||||
else:
|
||||
# add default plugin id to both setting sets, to make sure the plugin model provider is consistent
|
||||
plugin_model_provider = dataset.embedding_model_provider
|
||||
if "/" not in plugin_model_provider:
|
||||
plugin_model_provider = f"{DEFAULT_PLUGIN_ID}/{plugin_model_provider}/{plugin_model_provider}"
|
||||
plugin_model_provider = str(ModelProviderID(plugin_model_provider))
|
||||
|
||||
new_plugin_model_provider = data["embedding_model_provider"]
|
||||
if "/" not in new_plugin_model_provider:
|
||||
new_plugin_model_provider = (
|
||||
f"{DEFAULT_PLUGIN_ID}/{new_plugin_model_provider}/{new_plugin_model_provider}"
|
||||
)
|
||||
new_plugin_model_provider = str(ModelProviderID(new_plugin_model_provider))
|
||||
|
||||
if (
|
||||
new_plugin_model_provider != plugin_model_provider
|
||||
@ -603,9 +600,45 @@ class DocumentService:
|
||||
|
||||
return document
|
||||
|
||||
@staticmethod
|
||||
def get_document_by_ids(document_ids: list[str]) -> list[Document]:
|
||||
documents = (
|
||||
db.session.query(Document)
|
||||
.filter(
|
||||
Document.id.in_(document_ids),
|
||||
Document.enabled == True,
|
||||
Document.indexing_status == "completed",
|
||||
Document.archived == False,
|
||||
)
|
||||
.all()
|
||||
)
|
||||
return documents
|
||||
|
||||
@staticmethod
|
||||
def get_document_by_dataset_id(dataset_id: str) -> list[Document]:
|
||||
documents = db.session.query(Document).filter(Document.dataset_id == dataset_id, Document.enabled == True).all()
|
||||
documents = (
|
||||
db.session.query(Document)
|
||||
.filter(
|
||||
Document.dataset_id == dataset_id,
|
||||
Document.enabled == True,
|
||||
)
|
||||
.all()
|
||||
)
|
||||
|
||||
return documents
|
||||
|
||||
@staticmethod
|
||||
def get_working_documents_by_dataset_id(dataset_id: str) -> list[Document]:
|
||||
documents = (
|
||||
db.session.query(Document)
|
||||
.filter(
|
||||
Document.dataset_id == dataset_id,
|
||||
Document.enabled == True,
|
||||
Document.indexing_status == "completed",
|
||||
Document.archived == False,
|
||||
)
|
||||
.all()
|
||||
)
|
||||
|
||||
return documents
|
||||
|
||||
@ -688,7 +721,11 @@ class DocumentService:
|
||||
if document.tenant_id != current_user.current_tenant_id:
|
||||
raise ValueError("No permission.")
|
||||
|
||||
document.name = name
|
||||
if dataset.built_in_field_enabled:
|
||||
if document.doc_metadata:
|
||||
document.doc_metadata[BuiltInField.document_name] = name
|
||||
else:
|
||||
document.name = name
|
||||
|
||||
db.session.add(document)
|
||||
db.session.commit()
|
||||
@ -979,6 +1016,8 @@ class DocumentService:
|
||||
"notion_page_icon": page.page_icon.model_dump() if page.page_icon else None,
|
||||
"type": page.type,
|
||||
}
|
||||
# Truncate page name to 255 characters to prevent DB field length errors
|
||||
truncated_page_name = page.page_name[:255] if page.page_name else "nopagename"
|
||||
document = DocumentService.build_document(
|
||||
dataset,
|
||||
dataset_process_rule.id, # type: ignore
|
||||
@ -989,7 +1028,7 @@ class DocumentService:
|
||||
created_from,
|
||||
position,
|
||||
account,
|
||||
page.page_name,
|
||||
truncated_page_name,
|
||||
batch,
|
||||
knowledge_config.metadata,
|
||||
)
|
||||
@ -1086,9 +1125,22 @@ class DocumentService:
|
||||
doc_form=document_form,
|
||||
doc_language=document_language,
|
||||
)
|
||||
doc_metadata = {}
|
||||
if dataset.built_in_field_enabled:
|
||||
doc_metadata = {
|
||||
BuiltInField.document_name: name,
|
||||
BuiltInField.uploader: account.name,
|
||||
BuiltInField.upload_date: datetime.datetime.now(datetime.timezone.utc).strftime("%Y-%m-%d %H:%M:%S"),
|
||||
BuiltInField.last_update_date: datetime.datetime.now(datetime.timezone.utc).strftime(
|
||||
"%Y-%m-%d %H:%M:%S"
|
||||
),
|
||||
BuiltInField.source: data_source_type,
|
||||
}
|
||||
if metadata is not None:
|
||||
document.doc_metadata = metadata.doc_metadata
|
||||
doc_metadata.update(metadata.doc_metadata)
|
||||
document.doc_type = metadata.doc_type
|
||||
if doc_metadata:
|
||||
document.doc_metadata = doc_metadata
|
||||
return document
|
||||
|
||||
@staticmethod
|
||||
|
||||
@@ -124,3 +124,36 @@ class SegmentUpdateArgs(BaseModel):
class ChildChunkUpdateArgs(BaseModel):
    id: Optional[str] = None
    content: str


class MetadataArgs(BaseModel):
    type: Literal["string", "number", "time"]
    name: str


class MetadataUpdateArgs(BaseModel):
    name: str
    value: str


class MetadataValueUpdateArgs(BaseModel):
    fields: list[MetadataUpdateArgs]


class MetadataDetail(BaseModel):
    id: str
    name: str
    value: str


class DocumentMetadataOperation(BaseModel):
    document_id: str
    metadata_list: list[MetadataDetail]


class MetadataOperationData(BaseModel):
    """
    Metadata operation data
    """

    operation_data: list[DocumentMetadataOperation]
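The Pydantic entities above define the request shape for per-document metadata assignment, consumed by `MetadataService.update_documents_metadata` further below. A minimal, hypothetical payload might look like this; the UUIDs are placeholders, and in practice they come from the dataset's `DatasetMetadata` rows and the target `Document`:

```python
# Hypothetical payload for MetadataService.update_documents_metadata.
from services.entities.knowledge_entities.knowledge_entities import (
    DocumentMetadataOperation,
    MetadataDetail,
    MetadataOperationData,
)

payload = MetadataOperationData(
    operation_data=[
        DocumentMetadataOperation(
            document_id="9c9f0000-0000-0000-0000-000000000000",
            metadata_list=[
                MetadataDetail(id="5f3c0000-0000-0000-0000-000000000000", name="department", value="marketing"),
            ],
        )
    ]
)
```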
api/services/metadata_service.py (new file, 182 lines)
@@ -0,0 +1,182 @@
import datetime
from typing import Optional

from flask_login import current_user # type: ignore

from core.rag.index_processor.constant.built_in_field import BuiltInField
from extensions.ext_database import db
from extensions.ext_redis import redis_client
from models.dataset import Dataset, DatasetMetadata, DatasetMetadataBinding
from services.dataset_service import DocumentService
from services.entities.knowledge_entities.knowledge_entities import (
    MetadataArgs,
    MetadataOperationData,
)
from tasks.update_documents_metadata_task import update_documents_metadata_task


class MetadataService:
    @staticmethod
    def create_metadata(dataset_id: str, metadata_args: MetadataArgs) -> DatasetMetadata:
        metadata = DatasetMetadata(
            dataset_id=dataset_id,
            type=metadata_args.type,
            name=metadata_args.name,
            created_by=current_user.id,
        )
        db.session.add(metadata)
        db.session.commit()
        return metadata

    @staticmethod
    def update_metadata_name(dataset_id: str, metadata_id: str, name: str) -> DatasetMetadata:
        lock_key = f"dataset_metadata_lock_{dataset_id}"
        MetadataService.knowledge_base_metadata_lock_check(dataset_id, None)
        metadata = DatasetMetadata.query.filter_by(id=metadata_id).first()
        if metadata is None:
            raise ValueError("Metadata not found.")
        old_name = metadata.name
        metadata.name = name
        metadata.updated_by = current_user.id
        metadata.updated_at = datetime.datetime.now(datetime.UTC).replace(tzinfo=None)

        # update related documents
        documents = []
        dataset_metadata_bindings = DatasetMetadataBinding.query.filter_by(metadata_id=metadata_id).all()
        if dataset_metadata_bindings:
            document_ids = [binding.document_id for binding in dataset_metadata_bindings]
            documents = DocumentService.get_document_by_ids(document_ids)
            for document in documents:
                document.doc_metadata[name] = document.doc_metadata.pop(old_name)
                db.session.add(document)
            db.session.commit()
            if document_ids:
                update_documents_metadata_task.delay(dataset_id, document_ids, lock_key)
        return metadata

    @staticmethod
    def delete_metadata(dataset_id: str, metadata_id: str):
        lock_key = f"dataset_metadata_lock_{dataset_id}"
        MetadataService.knowledge_base_metadata_lock_check(dataset_id, None)
        metadata = DatasetMetadata.query.filter_by(id=metadata_id).first()
        if metadata is None:
            raise ValueError("Metadata not found.")
        db.session.delete(metadata)

        # delete related documents
        dataset_metadata_bindings = DatasetMetadataBinding.query.filter_by(metadata_id=metadata_id).all()
        if dataset_metadata_bindings:
            document_ids = [binding.document_id for binding in dataset_metadata_bindings]
            documents = DocumentService.get_document_by_ids(document_ids)
            for document in documents:
                document.doc_metadata.pop(metadata.name)
                db.session.add(document)
            db.session.commit()
            if document_ids:
                update_documents_metadata_task.delay(dataset_id, document_ids, lock_key)

    @staticmethod
    def get_built_in_fields():
        return [
            {"name": BuiltInField.document_name, "type": "string"},
            {"name": BuiltInField.uploader, "type": "string"},
            {"name": BuiltInField.upload_date, "type": "date"},
            {"name": BuiltInField.last_update_date, "type": "date"},
            {"name": BuiltInField.source, "type": "string"},
        ]

    @staticmethod
    def enable_built_in_field(dataset: Dataset):
        if dataset.built_in_fields:
            return
        lock_key = f"dataset_metadata_lock_{dataset.id}"
        MetadataService.knowledge_base_metadata_lock_check(dataset.id, None)
        dataset.built_in_fields = True
        db.session.add(dataset)
        documents = DocumentService.get_working_documents_by_dataset_id(dataset.id)
        document_ids = []
        if documents:
            for document in documents:
                document.doc_metadata[BuiltInField.document_name] = document.name
                document.doc_metadata[BuiltInField.uploader] = document.uploader
                document.doc_metadata[BuiltInField.upload_date] = document.upload_date.strftime("%Y-%m-%d %H:%M:%S")
                document.doc_metadata[BuiltInField.last_update_date] = document.last_update_date.strftime(
                    "%Y-%m-%d %H:%M:%S"
                )
                document.doc_metadata[BuiltInField.source] = document.data_source_type
                db.session.add(document)
                document_ids.append(document.id)
        db.session.commit()
        if document_ids:
            update_documents_metadata_task.delay(dataset.id, document_ids, lock_key)

    @staticmethod
    def disable_built_in_field(dataset: Dataset):
        if not dataset.built_in_fields:
            return
        lock_key = f"dataset_metadata_lock_{dataset.id}"
        MetadataService.knowledge_base_metadata_lock_check(dataset.id, None)
        dataset.built_in_fields = False
        db.session.add(dataset)
        documents = DocumentService.get_working_documents_by_dataset_id(dataset.id)
        document_ids = []
        if documents:
            for document in documents:
                document.doc_metadata.pop(BuiltInField.document_name)
                document.doc_metadata.pop(BuiltInField.uploader)
                document.doc_metadata.pop(BuiltInField.upload_date)
                document.doc_metadata.pop(BuiltInField.last_update_date)
                document.doc_metadata.pop(BuiltInField.source)
                db.session.add(document)
                document_ids.append(document.id)
        db.session.commit()
        if document_ids:
            update_documents_metadata_task.delay(dataset.id, document_ids, lock_key)

    @staticmethod
    def update_documents_metadata(dataset: Dataset, metadata_args: MetadataOperationData):
        for operation in metadata_args.operation_data:
            lock_key = f"document_metadata_lock_{operation.document_id}"
            MetadataService.knowledge_base_metadata_lock_check(None, operation.document_id)
            document = DocumentService.get_document(operation.document_id)
            if document is None:
                raise ValueError("Document not found.")
            document.doc_metadata = {}
            for metadata_value in metadata_args.fields:
                document.doc_metadata[metadata_value.name] = metadata_value.value
            if dataset.built_in_fields:
                document.doc_metadata[BuiltInField.document_name] = document.name
                document.doc_metadata[BuiltInField.uploader] = document.uploader
                document.doc_metadata[BuiltInField.upload_date] = document.upload_date.strftime("%Y-%m-%d %H:%M:%S")
                document.doc_metadata[BuiltInField.last_update_date] = document.last_update_date.strftime(
                    "%Y-%m-%d %H:%M:%S"
                )
                document.doc_metadata[BuiltInField.source] = document.data_source_type
            # handle metadata binding
            DatasetMetadataBinding.query.filter_by(document_id=operation.document_id).delete()
            for metadata_value in operation.metadata_list:
                dataset_metadata_binding = DatasetMetadataBinding(
                    tenant_id=current_user.tenant_id,
                    dataset_id=dataset.id,
                    document_id=operation.document_id,
                    metadata_id=metadata_value.id,
                    created_by=current_user.id,
                )
                db.session.add(dataset_metadata_binding)
            db.session.add(document)
            db.session.commit()

            update_documents_metadata_task.delay(dataset.id, [document.id], lock_key)

    @staticmethod
    def knowledge_base_metadata_lock_check(dataset_id: Optional[str], document_id: Optional[str]):
        if dataset_id:
            lock_key = f"dataset_metadata_lock_{dataset_id}"
            if redis_client.get(lock_key):
                raise ValueError("Another knowledge base metadata operation is running, please wait a moment.")
            redis_client.set(lock_key, 1, ex=3600)
        if document_id:
            lock_key = f"document_metadata_lock_{document_id}"
            if redis_client.get(lock_key):
                raise ValueError("Another document metadata operation is running, please wait a moment.")
            redis_client.set(lock_key, 1, ex=3600)
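Taken together, MetadataService keeps three things in step: the DatasetMetadata/DatasetMetadataBinding rows, each document's doc_metadata JSON, and the vector index. Every mutating call first takes a Redis lock via knowledge_base_metadata_lock_check, applies the database changes, and passes the lock key to update_documents_metadata_task, which rebuilds the affected segments and releases the lock. A rough, hypothetical usage sketch follows; it is not an endpoint from the repository, the IDs and field name are illustrative, and a Flask request context with current_user is assumed:

```python
# Hedged sketch of driving MetadataService from application code.
from services.dataset_service import DatasetService
from services.entities.knowledge_entities.knowledge_entities import MetadataArgs
from services.metadata_service import MetadataService

dataset = DatasetService.get_dataset("11110000-0000-0000-0000-000000000000")  # placeholder ID

# 1. Declare a reusable metadata field on the dataset.
field = MetadataService.create_metadata(dataset.id, MetadataArgs(type="string", name="department"))

# 2. Optionally expose the built-in fields (document_name, uploader, upload_date, ...).
MetadataService.enable_built_in_field(dataset)

# 3. Assign values per document; the documents are re-indexed asynchronously by the Celery task.
MetadataService.update_documents_metadata(dataset, payload)  # payload as sketched after the entity classes above
```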
@ -1,5 +1,5 @@
|
||||
from core.helper import marketplace
|
||||
from core.plugin.entities.plugin import GenericProviderID, PluginDependency, PluginInstallationSource
|
||||
from core.plugin.entities.plugin import ModelProviderID, PluginDependency, PluginInstallationSource, ToolProviderID
|
||||
from core.plugin.manager.plugin import PluginInstallationManager
|
||||
|
||||
|
||||
@ -12,10 +12,7 @@ class DependenciesAnalysisService:
|
||||
Convert the tool id to the plugin_id
|
||||
"""
|
||||
try:
|
||||
tool_provider_id = GenericProviderID(tool_id)
|
||||
if tool_id in ["jina", "siliconflow"]:
|
||||
tool_provider_id.plugin_name = tool_provider_id.plugin_name + "_tool"
|
||||
return tool_provider_id.plugin_id
|
||||
return ToolProviderID(tool_id).plugin_id
|
||||
except Exception as e:
|
||||
raise e
|
||||
|
||||
@ -27,11 +24,7 @@ class DependenciesAnalysisService:
|
||||
Convert the model provider id to the plugin_id
|
||||
"""
|
||||
try:
|
||||
generic_provider_id = GenericProviderID(model_provider_id)
|
||||
if model_provider_id == "google":
|
||||
generic_provider_id.plugin_name = "gemini"
|
||||
|
||||
return generic_provider_id.plugin_id
|
||||
return ModelProviderID(model_provider_id).plugin_id
|
||||
except Exception as e:
|
||||
raise e
|
||||
|
||||
|
||||
@ -14,9 +14,8 @@ from flask import Flask, current_app
|
||||
from sqlalchemy.orm import Session
|
||||
|
||||
from core.agent.entities import AgentToolEntity
|
||||
from core.entities import DEFAULT_PLUGIN_ID
|
||||
from core.helper import marketplace
|
||||
from core.plugin.entities.plugin import PluginInstallationSource
|
||||
from core.plugin.entities.plugin import ModelProviderID, PluginInstallationSource, ToolProviderID
|
||||
from core.plugin.entities.plugin_daemon import PluginInstallTaskStatus
|
||||
from core.plugin.manager.plugin import PluginInstallationManager
|
||||
from core.tools.entities.tool_entities import ToolProviderType
|
||||
@ -203,13 +202,7 @@ class PluginMigration:
|
||||
result = []
|
||||
for row in rs:
|
||||
provider_name = str(row[0])
|
||||
if provider_name and "/" not in provider_name:
|
||||
if provider_name == "google":
|
||||
provider_name = "gemini"
|
||||
|
||||
result.append(DEFAULT_PLUGIN_ID + "/" + provider_name)
|
||||
elif provider_name:
|
||||
result.append(provider_name)
|
||||
result.append(ModelProviderID(provider_name).plugin_id)
|
||||
|
||||
return result
|
||||
|
||||
@ -222,30 +215,10 @@ class PluginMigration:
|
||||
rs = session.query(BuiltinToolProvider).filter(BuiltinToolProvider.tenant_id == tenant_id).all()
|
||||
result = []
|
||||
for row in rs:
|
||||
if "/" not in row.provider:
|
||||
result.append(DEFAULT_PLUGIN_ID + "/" + row.provider)
|
||||
else:
|
||||
result.append(row.provider)
|
||||
result.append(ToolProviderID(row.provider).plugin_id)
|
||||
|
||||
return result
|
||||
|
||||
@classmethod
|
||||
def _handle_builtin_tool_provider(cls, provider_name: str) -> str:
|
||||
"""
|
||||
Handle builtin tool provider.
|
||||
"""
|
||||
if provider_name == "jina":
|
||||
provider_name = "jina_tool"
|
||||
elif provider_name == "siliconflow":
|
||||
provider_name = "siliconflow_tool"
|
||||
elif provider_name == "stepfun":
|
||||
provider_name = "stepfun_tool"
|
||||
|
||||
if "/" not in provider_name:
|
||||
return DEFAULT_PLUGIN_ID + "/" + provider_name
|
||||
else:
|
||||
return provider_name
|
||||
|
||||
@classmethod
|
||||
def extract_workflow_tables(cls, tenant_id: str) -> Sequence[str]:
|
||||
"""
|
||||
@ -266,8 +239,7 @@ class PluginMigration:
|
||||
provider_name = data.get("provider_name")
|
||||
provider_type = data.get("provider_type")
|
||||
if provider_name not in excluded_providers and provider_type == ToolProviderType.BUILT_IN.value:
|
||||
provider_name = cls._handle_builtin_tool_provider(provider_name)
|
||||
result.append(provider_name)
|
||||
result.append(ToolProviderID(provider_name).plugin_id)
|
||||
|
||||
return result
|
||||
|
||||
@ -298,7 +270,7 @@ class PluginMigration:
|
||||
tool_entity.provider_type == ToolProviderType.BUILT_IN.value
|
||||
and tool_entity.provider_id not in excluded_providers
|
||||
):
|
||||
result.append(cls._handle_builtin_tool_provider(tool_entity.provider_id))
|
||||
result.append(ToolProviderID(tool_entity.provider_id).plugin_id)
|
||||
|
||||
except Exception:
|
||||
logger.exception(f"Failed to process tool {tool}")
|
||||
@ -386,7 +358,7 @@ class PluginMigration:
|
||||
batch_plugin_identifiers = [
|
||||
plugins["plugins"][plugin_id]
|
||||
for plugin_id in batch_plugin_ids
|
||||
if plugin_id not in installed_plugins_ids
|
||||
if plugin_id not in installed_plugins_ids and plugin_id in plugins["plugins"]
|
||||
]
|
||||
manager.install_from_identifiers(
|
||||
tenant_id,
|
||||
|
||||
@ -233,56 +233,57 @@ class BuiltinToolManageService:
|
||||
# get all builtin providers
|
||||
provider_controllers = ToolManager.list_builtin_providers(tenant_id)
|
||||
|
||||
# get all user added providers
|
||||
db_providers: list[BuiltinToolProvider] = (
|
||||
db.session.query(BuiltinToolProvider).filter(BuiltinToolProvider.tenant_id == tenant_id).all() or []
|
||||
)
|
||||
with db.session.no_autoflush:
|
||||
# get all user added providers
|
||||
db_providers: list[BuiltinToolProvider] = (
|
||||
db.session.query(BuiltinToolProvider).filter(BuiltinToolProvider.tenant_id == tenant_id).all() or []
|
||||
)
|
||||
|
||||
# rewrite db_providers
|
||||
for db_provider in db_providers:
|
||||
db_provider.provider = str(ToolProviderID(db_provider.provider))
|
||||
# rewrite db_providers
|
||||
for db_provider in db_providers:
|
||||
db_provider.provider = str(ToolProviderID(db_provider.provider))
|
||||
|
||||
# find provider
|
||||
def find_provider(provider):
|
||||
return next(filter(lambda db_provider: db_provider.provider == provider, db_providers), None)
|
||||
# find provider
|
||||
def find_provider(provider):
|
||||
return next(filter(lambda db_provider: db_provider.provider == provider, db_providers), None)
|
||||
|
||||
result: list[ToolProviderApiEntity] = []
|
||||
result: list[ToolProviderApiEntity] = []
|
||||
|
||||
for provider_controller in provider_controllers:
|
||||
try:
|
||||
# handle include, exclude
|
||||
if is_filtered(
|
||||
include_set=dify_config.POSITION_TOOL_INCLUDES_SET, # type: ignore
|
||||
exclude_set=dify_config.POSITION_TOOL_EXCLUDES_SET, # type: ignore
|
||||
data=provider_controller,
|
||||
name_func=lambda x: x.identity.name,
|
||||
):
|
||||
continue
|
||||
for provider_controller in provider_controllers:
|
||||
try:
|
||||
# handle include, exclude
|
||||
if is_filtered(
|
||||
include_set=dify_config.POSITION_TOOL_INCLUDES_SET, # type: ignore
|
||||
exclude_set=dify_config.POSITION_TOOL_EXCLUDES_SET, # type: ignore
|
||||
data=provider_controller,
|
||||
name_func=lambda x: x.identity.name,
|
||||
):
|
||||
continue
|
||||
|
||||
# convert provider controller to user provider
|
||||
user_builtin_provider = ToolTransformService.builtin_provider_to_user_provider(
|
||||
provider_controller=provider_controller,
|
||||
db_provider=find_provider(provider_controller.entity.identity.name),
|
||||
decrypt_credentials=True,
|
||||
)
|
||||
|
||||
# add icon
|
||||
ToolTransformService.repack_provider(tenant_id=tenant_id, provider=user_builtin_provider)
|
||||
|
||||
tools = provider_controller.get_tools()
|
||||
for tool in tools or []:
|
||||
user_builtin_provider.tools.append(
|
||||
ToolTransformService.convert_tool_entity_to_api_entity(
|
||||
tenant_id=tenant_id,
|
||||
tool=tool,
|
||||
credentials=user_builtin_provider.original_credentials,
|
||||
labels=ToolLabelManager.get_tool_labels(provider_controller),
|
||||
)
|
||||
# convert provider controller to user provider
|
||||
user_builtin_provider = ToolTransformService.builtin_provider_to_user_provider(
|
||||
provider_controller=provider_controller,
|
||||
db_provider=find_provider(provider_controller.entity.identity.name),
|
||||
decrypt_credentials=True,
|
||||
)
|
||||
|
||||
result.append(user_builtin_provider)
|
||||
except Exception as e:
|
||||
raise e
|
||||
# add icon
|
||||
ToolTransformService.repack_provider(tenant_id=tenant_id, provider=user_builtin_provider)
|
||||
|
||||
tools = provider_controller.get_tools()
|
||||
for tool in tools or []:
|
||||
user_builtin_provider.tools.append(
|
||||
ToolTransformService.convert_tool_entity_to_api_entity(
|
||||
tenant_id=tenant_id,
|
||||
tool=tool,
|
||||
credentials=user_builtin_provider.original_credentials,
|
||||
labels=ToolLabelManager.get_tool_labels(provider_controller),
|
||||
)
|
||||
)
|
||||
|
||||
result.append(user_builtin_provider)
|
||||
except Exception as e:
|
||||
raise e
|
||||
|
||||
return BuiltinToolProviderSort.sort(result)
|
||||
|
||||
|
||||
api/tasks/update_documents_metadata_task.py (new file, 121 lines)
@ -0,0 +1,121 @@
|
||||
import logging
|
||||
import time
|
||||
from typing import Optional
|
||||
|
||||
import click
|
||||
from celery import shared_task # type: ignore
|
||||
|
||||
from core.rag.index_processor.constant.built_in_field import BuiltInField
|
||||
from core.rag.index_processor.constant.index_type import IndexType
|
||||
from core.rag.index_processor.index_processor_factory import IndexProcessorFactory
|
||||
from core.rag.models.document import ChildDocument, Document
|
||||
from extensions.ext_database import db
|
||||
from extensions.ext_redis import redis_client
|
||||
from models.dataset import (
|
||||
Document as DatasetDocument,
|
||||
)
|
||||
from models.dataset import (
|
||||
DocumentSegment,
|
||||
)
|
||||
from services.dataset_service import DatasetService
|
||||
|
||||
|
||||
@shared_task(queue="dataset")
|
||||
def update_documents_metadata_task(
|
||||
dataset_id: str,
|
||||
document_ids: list[str],
|
||||
lock_key: Optional[str] = None,
|
||||
):
|
||||
"""
|
||||
Update documents metadata.
|
||||
:param dataset_id: dataset id
|
||||
:param document_ids: document ids
|
||||
|
||||
Usage: update_documents_metadata_task.delay(dataset_id, document_ids)
|
||||
"""
|
||||
logging.info(click.style("Start update documents metadata: {}".format(dataset_id), fg="green"))
|
||||
start_at = time.perf_counter()
|
||||
|
||||
try:
|
||||
dataset = DatasetService.get_dataset(dataset_id)
|
||||
if dataset is None:
|
||||
raise ValueError("Dataset not found.")
|
||||
documents = (
|
||||
db.session.query(DatasetDocument)
|
||||
.filter(
|
||||
DatasetDocument.dataset_id == dataset_id,
|
||||
DatasetDocument.id.in_(document_ids),
|
||||
DatasetDocument.enabled == True,
|
||||
DatasetDocument.indexing_status == "completed",
|
||||
DatasetDocument.archived == False,
|
||||
)
|
||||
.all()
|
||||
)
|
||||
if not documents:
|
||||
raise ValueError("Documents not found.")
|
||||
for dataset_document in documents:
|
||||
index_processor = IndexProcessorFactory(dataset_document.doc_form).init_index_processor()
|
||||
|
||||
segments = (
|
||||
db.session.query(DocumentSegment)
|
||||
.filter(
|
||||
DocumentSegment.dataset_id == dataset_id,
|
||||
DocumentSegment.document_id == dataset_document.id,
|
||||
DocumentSegment.enabled == True,
|
||||
)
|
||||
.all()
|
||||
)
|
||||
if not segments:
|
||||
continue
|
||||
# delete all documents in vector index
|
||||
index_node_ids = [segment.index_node_id for segment in segments]
|
||||
index_processor.clean(dataset, index_node_ids, with_keywords=False, delete_child_chunks=True)
|
||||
# update documents metadata
|
||||
documents = []
|
||||
for segment in segments:
|
||||
document = Document(
|
||||
page_content=segment.content,
|
||||
metadata={
|
||||
"doc_id": segment.index_node_id,
|
||||
"doc_hash": segment.index_node_hash,
|
||||
"document_id": dataset_document.id,
|
||||
"dataset_id": dataset_id,
|
||||
},
|
||||
)
|
||||
|
||||
if dataset_document.doc_form == IndexType.PARENT_CHILD_INDEX:
|
||||
child_chunks = segment.child_chunks
|
||||
if child_chunks:
|
||||
child_documents = []
|
||||
for child_chunk in child_chunks:
|
||||
child_document = ChildDocument(
|
||||
page_content=child_chunk.content,
|
||||
metadata={
|
||||
"doc_id": child_chunk.index_node_id,
|
||||
"doc_hash": child_chunk.index_node_hash,
|
||||
"document_id": dataset_document.id,
|
||||
"dataset_id": dataset_id,
|
||||
},
|
||||
)
|
||||
if dataset.built_in_field_enabled:
|
||||
child_document.metadata[BuiltInField.uploader] = dataset_document.created_by
|
||||
child_document.metadata[BuiltInField.upload_date] = dataset_document.created_at
|
||||
child_document.metadata[BuiltInField.last_update_date] = dataset_document.updated_at
|
||||
child_document.metadata[BuiltInField.source] = dataset_document.data_source_type
|
||||
child_document.metadata[BuiltInField.original_filename] = dataset_document.name
|
||||
if dataset_document.doc_metadata:
|
||||
child_document.metadata.update(dataset_document.doc_metadata)
|
||||
child_documents.append(child_document)
|
||||
document.children = child_documents
|
||||
documents.append(document) # noqa: B909
|
||||
# save vector index
|
||||
index_processor.load(dataset, documents)
|
||||
end_at = time.perf_counter()
|
||||
logging.info(
|
||||
click.style("Updated documents metadata: {} latency: {}".format(dataset_id, end_at - start_at), fg="green")
|
||||
)
|
||||
except Exception:
|
||||
logging.exception("Updated documents metadata failed")
|
||||
finally:
|
||||
if lock_key:
|
||||
redis_client.delete(lock_key)
|
||||
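The lock_key parameter above is the other half of the locking scheme in MetadataService: the service acquires the Redis lock before touching the database, hands the key to this task, and the task's finally block releases it once the vector index has been rebuilt. A compressed, illustrative view of that hand-off follows; the key and the elided work are placeholders, not code from the repository:

```python
# Hedged illustration of the lock lifecycle shared by MetadataService and
# update_documents_metadata_task (the key below is a placeholder).
from extensions.ext_redis import redis_client

lock_key = "dataset_metadata_lock_<dataset_id>"
if redis_client.get(lock_key):
    raise ValueError("Another knowledge base metadata operation is running, please wait a moment.")
redis_client.set(lock_key, 1, ex=3600)  # acquired by the service, with a one-hour safety TTL
try:
    ...  # service commits the DB changes, then the Celery task re-indexes the documents
finally:
    redis_client.delete(lock_key)  # released by the task once re-indexing finishes (or fails)
```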
@ -2,7 +2,7 @@ x-shared-env: &shared-api-worker-env
|
||||
services:
|
||||
# API service
|
||||
api:
|
||||
image: langgenius/dify-api:1.0.0
|
||||
image: langgenius/dify-api:0.15.3
|
||||
restart: always
|
||||
environment:
|
||||
# Use the shared environment variables.
|
||||
@ -27,7 +27,7 @@ services:
|
||||
# worker service
|
||||
# The Celery worker for processing the queue.
|
||||
worker:
|
||||
image: langgenius/dify-api:1.0.0
|
||||
image: langgenius/dify-api:0.15.3
|
||||
restart: always
|
||||
environment:
|
||||
# Use the shared environment variables.
|
||||
@ -51,7 +51,7 @@ services:
|
||||
|
||||
# Frontend web application.
|
||||
web:
|
||||
image: langgenius/dify-web:1.0.0
|
||||
image: langgenius/dify-web:0.15.3
|
||||
restart: always
|
||||
environment:
|
||||
CONSOLE_API_URL: ${CONSOLE_API_URL:-}
|
||||
@ -64,6 +64,7 @@ services:
|
||||
MARKETPLACE_URL: ${MARKETPLACE_URL:-https://marketplace.dify.ai}
|
||||
TOP_K_MAX_VALUE: ${TOP_K_MAX_VALUE:-}
|
||||
INDEXING_MAX_SEGMENTATION_TOKENS_LENGTH: ${INDEXING_MAX_SEGMENTATION_TOKENS_LENGTH:-}
|
||||
PM2_INSTANCES: ${PM2_INSTANCES:-2}
|
||||
|
||||
# The postgres database.
|
||||
db:
|
||||
@ -121,6 +122,7 @@ services:
|
||||
SANDBOX_PORT: ${SANDBOX_PORT:-8194}
|
||||
volumes:
|
||||
- ./volumes/sandbox/dependencies:/dependencies
|
||||
- ./volumes/sandbox/conf:/conf
|
||||
healthcheck:
|
||||
test: [ 'CMD', 'curl', '-f', 'http://localhost:8194/health' ]
|
||||
networks:
|
||||
@ -128,7 +130,7 @@ services:
|
||||
|
||||
# plugin daemon
|
||||
plugin_daemon:
|
||||
image: langgenius/dify-plugin-daemon:0.0.1-local
|
||||
image: langgenius/dify-plugin-daemon:0.0.2-local
|
||||
restart: always
|
||||
environment:
|
||||
# Use the shared environment variables.
|
||||
|
||||
@ -66,7 +66,7 @@ services:
|
||||
|
||||
# plugin daemon
|
||||
plugin_daemon:
|
||||
image: langgenius/dify-plugin-daemon:0.0.1-local
|
||||
image: langgenius/dify-plugin-daemon:0.0.2-local
|
||||
restart: always
|
||||
environment:
|
||||
# Use the shared environment variables.
|
||||
|
||||
@ -414,7 +414,7 @@ x-shared-env: &shared-api-worker-env
|
||||
services:
|
||||
# API service
|
||||
api:
|
||||
image: langgenius/dify-api:1.0.0
|
||||
image: langgenius/dify-api:0.15.3
|
||||
restart: always
|
||||
environment:
|
||||
# Use the shared environment variables.
|
||||
@ -439,7 +439,7 @@ services:
|
||||
# worker service
|
||||
# The Celery worker for processing the queue.
|
||||
worker:
|
||||
image: langgenius/dify-api:1.0.0
|
||||
image: langgenius/dify-api:0.15.3
|
||||
restart: always
|
||||
environment:
|
||||
# Use the shared environment variables.
|
||||
@ -463,7 +463,7 @@ services:
|
||||
|
||||
# Frontend web application.
|
||||
web:
|
||||
image: langgenius/dify-web:1.0.0
|
||||
image: langgenius/dify-web:0.15.3
|
||||
restart: always
|
||||
environment:
|
||||
CONSOLE_API_URL: ${CONSOLE_API_URL:-}
|
||||
@ -476,6 +476,7 @@ services:
|
||||
MARKETPLACE_URL: ${MARKETPLACE_URL:-https://marketplace.dify.ai}
|
||||
TOP_K_MAX_VALUE: ${TOP_K_MAX_VALUE:-}
|
||||
INDEXING_MAX_SEGMENTATION_TOKENS_LENGTH: ${INDEXING_MAX_SEGMENTATION_TOKENS_LENGTH:-}
|
||||
PM2_INSTANCES: ${PM2_INSTANCES:-2}
|
||||
|
||||
# The postgres database.
|
||||
db:
|
||||
@ -533,6 +534,7 @@ services:
|
||||
SANDBOX_PORT: ${SANDBOX_PORT:-8194}
|
||||
volumes:
|
||||
- ./volumes/sandbox/dependencies:/dependencies
|
||||
- ./volumes/sandbox/conf:/conf
|
||||
healthcheck:
|
||||
test: [ 'CMD', 'curl', '-f', 'http://localhost:8194/health' ]
|
||||
networks:
|
||||
@ -540,7 +542,7 @@ services:
|
||||
|
||||
# plugin daemon
|
||||
plugin_daemon:
|
||||
image: langgenius/dify-plugin-daemon:0.0.1-local
|
||||
image: langgenius/dify-plugin-daemon:0.0.2-local
|
||||
restart: always
|
||||
environment:
|
||||
# Use the shared environment variables.
|
||||
|
||||
@ -22,7 +22,7 @@ COPY pnpm-lock.yaml .
|
||||
# if you located in China, you can use taobao registry to speed up
|
||||
# RUN pnpm install --frozen-lockfile --registry https://registry.npmmirror.com/
|
||||
|
||||
RUN pnpm install --frozen-lockfile -P
|
||||
RUN pnpm install --frozen-lockfile
|
||||
|
||||
# build resources
|
||||
FROM base AS builder
|
||||
@ -46,6 +46,7 @@ ENV MARKETPLACE_API_URL=http://127.0.0.1:5001
|
||||
ENV MARKETPLACE_URL=http://127.0.0.1:5001
|
||||
ENV PORT=3000
|
||||
ENV NEXT_TELEMETRY_DISABLED=1
|
||||
ENV PM2_INSTANCES=2
|
||||
|
||||
# set timezone
|
||||
ENV TZ=UTC
|
||||
@ -58,7 +59,6 @@ COPY --from=builder /app/web/public ./public
|
||||
COPY --from=builder /app/web/.next/standalone ./
|
||||
COPY --from=builder /app/web/.next/static ./.next/static
|
||||
|
||||
COPY docker/pm2.json ./pm2.json
|
||||
COPY docker/entrypoint.sh ./entrypoint.sh
|
||||
|
||||
|
||||
|
||||
@ -70,6 +70,8 @@ If you want to customize the host and port:
|
||||
pnpm run start --port=3001 --host=0.0.0.0
|
||||
```
|
||||
|
||||
If you want to customize the number of instances launched by PM2, you can configure `PM2_INSTANCES` in `docker-compose.yaml` or `Dockerfile`.
|
||||
|
||||
## Storybook
|
||||
|
||||
This project uses [Storybook](https://storybook.js.org/) for UI component development.
|
||||
|
||||
@ -53,7 +53,7 @@ export default function ChartView({ appId }: IChartViewProps) {
|
||||
className='mt-0 !w-40'
|
||||
onSelect={(item) => {
|
||||
const id = item.value
|
||||
const value = TIME_PERIOD_MAPPING[id]?.value || '-1'
|
||||
const value = TIME_PERIOD_MAPPING[id]?.value ?? '-1'
|
||||
const name = item.name || t('appLog.filter.period.allTime')
|
||||
onSelect({ value, name })
|
||||
}}
|
||||
|
||||
@ -59,8 +59,8 @@ const Apps = () => {
|
||||
const [activeTab, setActiveTab] = useTabSearchParams({
|
||||
defaultTab: 'all',
|
||||
})
|
||||
const { query: { tagIDs = [], keywords = '' }, setQuery } = useAppsQueryState()
|
||||
const [isCreatedByMe, setIsCreatedByMe] = useState(false)
|
||||
const { query: { tagIDs = [], keywords = '', isCreatedByMe: queryIsCreatedByMe = false }, setQuery } = useAppsQueryState()
|
||||
const [isCreatedByMe, setIsCreatedByMe] = useState(queryIsCreatedByMe)
|
||||
const [tagFilterValue, setTagFilterValue] = useState<string[]>(tagIDs)
|
||||
const [searchKeywords, setSearchKeywords] = useState(keywords)
|
||||
const setKeywords = useCallback((keywords: string) => {
|
||||
@ -126,6 +126,12 @@ const Apps = () => {
|
||||
handleTagsUpdate()
|
||||
}
|
||||
|
||||
const handleCreatedByMeChange = useCallback(() => {
|
||||
const newValue = !isCreatedByMe
|
||||
setIsCreatedByMe(newValue)
|
||||
setQuery(prev => ({ ...prev, isCreatedByMe: newValue }))
|
||||
}, [isCreatedByMe, setQuery])
|
||||
|
||||
return (
|
||||
<>
|
||||
<div className='sticky top-0 flex justify-between items-center pt-4 px-12 pb-2 leading-[56px] bg-background-body z-10 flex-wrap gap-y-2'>
|
||||
@ -139,7 +145,7 @@ const Apps = () => {
|
||||
className='mr-2'
|
||||
label={t('app.showMyCreatedAppsOnly')}
|
||||
isChecked={isCreatedByMe}
|
||||
onChange={() => setIsCreatedByMe(!isCreatedByMe)}
|
||||
onChange={handleCreatedByMeChange}
|
||||
/>
|
||||
<TagFilter type='app' value={tagFilterValue} onChange={handleTagsChange} />
|
||||
<Input
|
||||
|
||||
@ -4,18 +4,20 @@ import { useCallback, useEffect, useMemo, useState } from 'react'
|
||||
type AppsQuery = {
|
||||
tagIDs?: string[]
|
||||
keywords?: string
|
||||
isCreatedByMe?: boolean
|
||||
}
|
||||
|
||||
// Parse the query parameters from the URL search string.
|
||||
function parseParams(params: ReadonlyURLSearchParams): AppsQuery {
|
||||
const tagIDs = params.get('tagIDs')?.split(';')
|
||||
const keywords = params.get('keywords') || undefined
|
||||
return { tagIDs, keywords }
|
||||
const isCreatedByMe = params.get('isCreatedByMe') === 'true'
|
||||
return { tagIDs, keywords, isCreatedByMe }
|
||||
}
|
||||
|
||||
// Update the URL search string with the given query parameters.
|
||||
function updateSearchParams(query: AppsQuery, current: URLSearchParams) {
|
||||
const { tagIDs, keywords } = query || {}
|
||||
const { tagIDs, keywords, isCreatedByMe } = query || {}
|
||||
|
||||
if (tagIDs && tagIDs.length > 0)
|
||||
current.set('tagIDs', tagIDs.join(';'))
|
||||
@ -26,6 +28,11 @@ function updateSearchParams(query: AppsQuery, current: URLSearchParams) {
|
||||
current.set('keywords', keywords)
|
||||
else
|
||||
current.delete('keywords')
|
||||
|
||||
if (isCreatedByMe)
|
||||
current.set('isCreatedByMe', 'true')
|
||||
else
|
||||
current.delete('isCreatedByMe')
|
||||
}
|
||||
|
||||
function useAppsQueryState() {
|
||||
|
||||
@ -26,12 +26,12 @@ import { MAX_TOOLS_NUM } from '@/config'
|
||||
import { AlertTriangle } from '@/app/components/base/icons/src/vender/solid/alertsAndFeedback'
|
||||
import Tooltip from '@/app/components/base/tooltip'
|
||||
import { DefaultToolIcon } from '@/app/components/base/icons/src/public/other'
|
||||
// import AddToolModal from '@/app/components/tools/add-tool-modal'
|
||||
import ConfigCredential from '@/app/components/tools/setting/build-in/config-credentials'
|
||||
import { updateBuiltInToolCredential } from '@/service/tools'
|
||||
import cn from '@/utils/classnames'
|
||||
import ToolPicker from '@/app/components/workflow/block-selector/tool-picker'
|
||||
import type { ToolDefaultValue } from '@/app/components/workflow/block-selector/types'
|
||||
import { canFindTool } from '@/utils'
|
||||
|
||||
type AgentToolWithMoreInfo = AgentTool & { icon: any; collection?: Collection } | null
|
||||
const AgentTools: FC = () => {
|
||||
@ -43,7 +43,7 @@ const AgentTools: FC = () => {
|
||||
const [currentTool, setCurrentTool] = useState<AgentToolWithMoreInfo>(null)
|
||||
const currentCollection = useMemo(() => {
|
||||
if (!currentTool) return null
|
||||
const collection = collectionList.find(collection => collection.id.split('/').pop() === currentTool?.provider_id.split('/').pop() && collection.type === currentTool?.provider_type)
|
||||
const collection = collectionList.find(collection => canFindTool(collection.id, currentTool?.provider_id) && collection.type === currentTool?.provider_type)
|
||||
return collection
|
||||
}, [currentTool, collectionList])
|
||||
const [isShowSettingTool, setIsShowSettingTool] = useState(false)
|
||||
@ -51,7 +51,7 @@ const AgentTools: FC = () => {
|
||||
const tools = (modelConfig?.agentConfig?.tools as AgentTool[] || []).map((item) => {
|
||||
const collection = collectionList.find(
|
||||
collection =>
|
||||
collection.id.split('/').pop() === item.provider_id.split('/').pop()
|
||||
canFindTool(collection.id, item.provider_id)
|
||||
&& collection.type === item.provider_type,
|
||||
)
|
||||
const icon = collection?.icon
|
||||
|
||||
@ -27,6 +27,7 @@ import { BubbleTextMod, ChatBot, ListSparkle, Logic } from '@/app/components/bas
|
||||
import { NEED_REFRESH_APP_LIST_KEY } from '@/config'
|
||||
import { getRedirection } from '@/utils/app-redirection'
|
||||
import FullScreenModal from '@/app/components/base/fullscreen-modal'
|
||||
import useTheme from '@/hooks/use-theme'
|
||||
|
||||
type CreateAppProps = {
|
||||
onSuccess: () => void
|
||||
@ -346,7 +347,7 @@ function AppPreview({ mode }: { mode: AppMode }) {
|
||||
}
|
||||
|
||||
function AppScreenShot({ mode, show }: { mode: AppMode; show: boolean }) {
|
||||
const theme = useContextSelector(AppsContext, state => state.theme)
|
||||
const { theme } = useTheme()
|
||||
const modeToImageMap = {
|
||||
'chat': 'Chatbot',
|
||||
'advanced-chat': 'Chatflow',
|
||||
|
||||
@ -97,7 +97,7 @@ const Uploader: FC<Props> = ({
|
||||
style={{ display: 'none' }}
|
||||
type="file"
|
||||
id="fileUploader"
|
||||
accept='.yml'
|
||||
accept='.yaml,.yml'
|
||||
onChange={fileChangeHandle}
|
||||
/>
|
||||
<div ref={dropRef}>
|
||||
|
||||
@ -635,9 +635,10 @@ const ConversationList: FC<IConversationList> = ({ logs, appDetail, onRefresh })
|
||||
const [currentConversation, setCurrentConversation] = useState<ChatConversationGeneralDetail | CompletionConversationGeneralDetail | undefined>() // Currently selected conversation
|
||||
const isChatMode = appDetail.mode !== 'completion' // Whether the app is a chat app
|
||||
const isChatflow = appDetail.mode === 'advanced-chat' // Whether the app is a chatflow app
|
||||
const { setShowPromptLogModal, setShowAgentLogModal } = useAppStore(useShallow(state => ({
|
||||
const { setShowPromptLogModal, setShowAgentLogModal, setShowMessageLogModal } = useAppStore(useShallow(state => ({
|
||||
setShowPromptLogModal: state.setShowPromptLogModal,
|
||||
setShowAgentLogModal: state.setShowAgentLogModal,
|
||||
setShowMessageLogModal: state.setShowMessageLogModal,
|
||||
})))
|
||||
|
||||
// Annotated data needs to be highlighted
|
||||
@ -664,6 +665,7 @@ const ConversationList: FC<IConversationList> = ({ logs, appDetail, onRefresh })
|
||||
setCurrentConversation(undefined)
|
||||
setShowPromptLogModal(false)
|
||||
setShowAgentLogModal(false)
|
||||
setShowMessageLogModal(false)
|
||||
}
|
||||
|
||||
if (!logs)
|
||||
|
||||
@ -100,7 +100,7 @@ function getThreadMessages(tree: ChatItemInTree[], targetMessageId?: string): Ch
|
||||
let targetNode: ChatItemInTree | undefined
|
||||
|
||||
// find path to the target message
|
||||
const stack = tree.toReversed().map(rootNode => ({
|
||||
const stack = tree.slice().reverse().map(rootNode => ({
|
||||
node: rootNode,
|
||||
path: [rootNode],
|
||||
}))
|
||||
|
||||
@ -0,0 +1,58 @@
|
||||
<svg width="20" height="20" viewBox="0 0 20 20" fill="none" xmlns="http://www.w3.org/2000/svg">
|
||||
<g id="Partner">
|
||||
<mask id="mask0_6296_109592" style="mask-type:alpha" maskUnits="userSpaceOnUse" x="1" y="0" width="18" height="20">
|
||||
<g id="Mask">
|
||||
<path d="M7.33333 1.5396C8.30481 0.978718 8.79055 0.698276 9.30696 0.58851C9.76388 0.491388 10.2361 0.491388 10.693 0.58851C11.2094 0.698276 11.6952 0.978718 12.6667 1.5396L15.9936 3.4604C16.9651 4.02128 17.4508 4.30172 17.8041 4.69407C18.1166 5.04121 18.3528 5.45018 18.4971 5.89444C18.6603 6.39655 18.6603 6.95744 18.6603 8.0792V11.9208C18.6603 13.0426 18.6603 13.6034 18.4971 14.1056C18.3528 14.5498 18.1166 14.9588 17.8041 15.3059C17.4508 15.6983 16.9651 15.9787 15.9936 16.5396L12.6667 18.4604C11.6952 19.0213 11.2094 19.3017 10.693 19.4115C10.2361 19.5086 9.76388 19.5086 9.30696 19.4115C8.79055 19.3017 8.30481 19.0213 7.33333 18.4604L4.00641 16.5396C3.03493 15.9787 2.5492 15.6983 2.19593 15.3059C1.88336 14.9588 1.64724 14.5498 1.50289 14.1056C1.33975 13.6034 1.33975 13.0426 1.33975 11.9208V8.0792C1.33975 6.95744 1.33975 6.39655 1.50289 5.89444C1.64724 5.45018 1.88336 5.04121 2.19593 4.69407C2.5492 4.30172 3.03493 4.02128 4.00641 3.4604L7.33333 1.5396Z" fill="#932F19"/>
|
||||
<path d="M7.33333 1.5396C8.30481 0.978718 8.79055 0.698276 9.30696 0.58851C9.76388 0.491388 10.2361 0.491388 10.693 0.58851C11.2094 0.698276 11.6952 0.978718 12.6667 1.5396L15.9936 3.4604C16.9651 4.02128 17.4508 4.30172 17.8041 4.69407C18.1166 5.04121 18.3528 5.45018 18.4971 5.89444C18.6603 6.39655 18.6603 6.95744 18.6603 8.0792V11.9208C18.6603 13.0426 18.6603 13.6034 18.4971 14.1056C18.3528 14.5498 18.1166 14.9588 17.8041 15.3059C17.4508 15.6983 16.9651 15.9787 15.9936 16.5396L12.6667 18.4604C11.6952 19.0213 11.2094 19.3017 10.693 19.4115C10.2361 19.5086 9.76388 19.5086 9.30696 19.4115C8.79055 19.3017 8.30481 19.0213 7.33333 18.4604L4.00641 16.5396C3.03493 15.9787 2.5492 15.6983 2.19593 15.3059C1.88336 14.9588 1.64724 14.5498 1.50289 14.1056C1.33975 13.6034 1.33975 13.0426 1.33975 11.9208V8.0792C1.33975 6.95744 1.33975 6.39655 1.50289 5.89444C1.64724 5.45018 1.88336 5.04121 2.19593 4.69407C2.5492 4.30172 3.03493 4.02128 4.00641 3.4604L7.33333 1.5396Z" fill="url(#paint0_linear_6296_109592)" fill-opacity="0.9"/>
|
||||
<path d="M7.47222 1.78016C8.45993 1.20991 8.90155 0.958665 9.36471 0.860217C9.78356 0.771189 10.2164 0.771189 10.6353 0.860217C11.0984 0.958665 11.5401 1.20991 12.5278 1.78016L15.8547 3.70096C16.8424 4.27121 17.2808 4.52805 17.5976 4.87994C17.8842 5.19815 18.1006 5.57304 18.2329 5.98028C18.3792 6.43061 18.3825 6.9387 18.3825 8.0792V11.9208C18.3825 13.0613 18.3792 13.5694 18.2329 14.0197C18.1006 14.427 17.8842 14.8018 17.5976 15.1201C17.2808 15.4719 16.8424 15.7288 15.8547 16.299L12.5278 18.2198C11.5401 18.7901 11.0984 19.0413 10.6353 19.1398C10.2164 19.2288 9.78356 19.2288 9.36471 19.1398C8.90155 19.0413 8.45993 18.7901 7.47222 18.2198L4.1453 16.299C3.1576 15.7288 2.7192 15.4719 2.40236 15.1201C2.11584 14.8018 1.89939 14.427 1.76707 14.0197C1.62075 13.5694 1.61752 13.0613 1.61752 11.9208V8.0792C1.61752 6.9387 1.62075 6.43061 1.76707 5.98028C1.89939 5.57304 2.11584 5.19815 2.40236 4.87994C2.7192 4.52805 3.1576 4.27121 4.1453 3.70096L7.47222 1.78016Z" stroke="url(#paint1_linear_6296_109592)" stroke-opacity="0.8" stroke-width="0.555556"/>
|
||||
</g>
|
||||
</mask>
|
||||
<g mask="url(#mask0_6296_109592)">
|
||||
<g id="badge-bg">
|
||||
<path d="M7.33333 1.5396C8.30481 0.978718 8.79055 0.698276 9.30696 0.58851C9.76388 0.491388 10.2361 0.491388 10.693 0.58851C11.2094 0.698276 11.6952 0.978718 12.6667 1.5396L15.9936 3.4604C16.9651 4.02128 17.4508 4.30172 17.8041 4.69407C18.1166 5.04121 18.3528 5.45018 18.4971 5.89444C18.6603 6.39655 18.6603 6.95744 18.6603 8.0792V11.9208C18.6603 13.0426 18.6603 13.6034 18.4971 14.1056C18.3528 14.5498 18.1166 14.9588 17.8041 15.3059C17.4508 15.6983 16.9651 15.9787 15.9936 16.5396L12.6667 18.4604C11.6952 19.0213 11.2094 19.3017 10.693 19.4115C10.2361 19.5086 9.76388 19.5086 9.30696 19.4115C8.79055 19.3017 8.30481 19.0213 7.33333 18.4604L4.00641 16.5396C3.03493 15.9787 2.5492 15.6983 2.19593 15.3059C1.88336 14.9588 1.64724 14.5498 1.50289 14.1056C1.33975 13.6034 1.33975 13.0426 1.33975 11.9208V8.0792C1.33975 6.95744 1.33975 6.39655 1.50289 5.89444C1.64724 5.45018 1.88336 5.04121 2.19593 4.69407C2.5492 4.30172 3.03493 4.02128 4.00641 3.4604L7.33333 1.5396Z" fill="#932F19"/>
|
||||
<path d="M7.33333 1.5396C8.30481 0.978718 8.79055 0.698276 9.30696 0.58851C9.76388 0.491388 10.2361 0.491388 10.693 0.58851C11.2094 0.698276 11.6952 0.978718 12.6667 1.5396L15.9936 3.4604C16.9651 4.02128 17.4508 4.30172 17.8041 4.69407C18.1166 5.04121 18.3528 5.45018 18.4971 5.89444C18.6603 6.39655 18.6603 6.95744 18.6603 8.0792V11.9208C18.6603 13.0426 18.6603 13.6034 18.4971 14.1056C18.3528 14.5498 18.1166 14.9588 17.8041 15.3059C17.4508 15.6983 16.9651 15.9787 15.9936 16.5396L12.6667 18.4604C11.6952 19.0213 11.2094 19.3017 10.693 19.4115C10.2361 19.5086 9.76388 19.5086 9.30696 19.4115C8.79055 19.3017 8.30481 19.0213 7.33333 18.4604L4.00641 16.5396C3.03493 15.9787 2.5492 15.6983 2.19593 15.3059C1.88336 14.9588 1.64724 14.5498 1.50289 14.1056C1.33975 13.6034 1.33975 13.0426 1.33975 11.9208V8.0792C1.33975 6.95744 1.33975 6.39655 1.50289 5.89444C1.64724 5.45018 1.88336 5.04121 2.19593 4.69407C2.5492 4.30172 3.03493 4.02128 4.00641 3.4604L7.33333 1.5396Z" fill="url(#paint2_linear_6296_109592)" fill-opacity="0.9"/>
|
||||
<path d="M7.58333 1.97261C8.58402 1.39487 8.99036 1.16698 9.41092 1.07758C9.7993 0.99503 10.2007 0.99503 10.5891 1.07758C11.0096 1.16698 11.416 1.39487 12.4167 1.97261L15.7436 3.89341C16.7443 4.47116 17.1448 4.70911 17.4325 5.02863C17.6982 5.3237 17.8989 5.67133 18.0216 6.04895C18.1544 6.45786 18.1603 6.92371 18.1603 8.0792V11.9208C18.1603 13.0763 18.1544 13.5421 18.0216 13.951C17.8989 14.3287 17.6982 14.6763 17.4325 14.9714C17.1448 15.2909 16.7443 15.5288 15.7436 16.1066L12.4167 18.0274C11.416 18.6051 11.0096 18.833 10.5891 18.9224C10.2007 19.005 9.7993 19.005 9.41092 18.9224C8.99036 18.833 8.58402 18.6051 7.58333 18.0274L4.25641 16.1066C3.25572 15.5288 2.8552 15.2909 2.5675 14.9714C2.30182 14.6763 2.10112 14.3287 1.97842 13.951C1.84556 13.5421 1.83975 13.0763 1.83975 11.9208V8.0792C1.83975 6.92371 1.84556 6.45786 1.97842 6.04895C2.10112 5.67133 2.30182 5.3237 2.5675 5.02863C2.8552 4.70911 3.25572 4.47116 4.25641 3.89341L7.58333 1.97261Z" stroke="url(#paint3_linear_6296_109592)" stroke-opacity="0.8"/>
|
||||
</g>
|
||||
<g id="handshake" filter="url(#filter0_d_6296_109592)">
|
||||
<path d="M11.0969 9.64841C10.895 9.44642 10.5675 9.44642 10.3656 9.64841L9.99991 10.0141C9.59596 10.418 8.94109 10.418 8.53717 10.0141C8.13325 9.61015 8.13325 8.95527 8.53717 8.55135L11.4491 5.63868C12.5371 5.39255 13.7238 5.69302 14.5709 6.54011C15.8221 7.79128 15.8807 9.78339 14.7469 11.104L13.6567 12.2081L11.0969 9.64841ZM5.42889 6.54011C6.55286 5.41614 8.27475 5.25452 9.57067 6.05524L7.80581 7.81999C6.99797 8.62783 6.99797 9.9376 7.80581 10.7454C8.58917 11.5288 9.8445 11.5525 10.6564 10.8167L10.7313 10.7454L12.9253 12.9395L10.7313 15.1336C10.3273 15.5375 9.67245 15.5375 9.26855 15.1336L5.42889 11.2939C4.11615 9.9812 4.11615 7.85284 5.42889 6.54011Z" fill="url(#paint4_linear_6296_109592)" shape-rendering="crispEdges"/>
|
||||
</g>
|
||||
<path id="highlight" opacity="0.5" d="M0 0H15.5556L5.26663 20H0V0Z" fill="url(#paint5_linear_6296_109592)"/>
|
||||
</g>
|
||||
</g>
|
||||
<defs>
|
||||
<filter id="filter0_d_6296_109592" x="3.94434" y="5.30556" width="12.1111" height="10.881" filterUnits="userSpaceOnUse" color-interpolation-filters="sRGB">
|
||||
<feFlood flood-opacity="0" result="BackgroundImageFix"/>
|
||||
<feColorMatrix in="SourceAlpha" type="matrix" values="0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 127 0" result="hardAlpha"/>
|
||||
<feOffset dy="0.25"/>
|
||||
<feGaussianBlur stdDeviation="0.25"/>
|
||||
<feComposite in2="hardAlpha" operator="out"/>
|
||||
<feColorMatrix type="matrix" values="0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0.2 0"/>
|
||||
<feBlend mode="normal" in2="BackgroundImageFix" result="effect1_dropShadow_6296_109592"/>
|
||||
<feBlend mode="normal" in="SourceGraphic" in2="effect1_dropShadow_6296_109592" result="shape"/>
|
||||
</filter>
|
||||
<linearGradient id="paint0_linear_6296_109592" x1="0" y1="0" x2="22.6412" y2="1.78551" gradientUnits="userSpaceOnUse">
|
||||
<stop stop-color="#FF692E"/>
|
||||
<stop offset="1" stop-color="#E04F16"/>
|
||||
</linearGradient>
|
||||
<linearGradient id="paint1_linear_6296_109592" x1="8.55422" y1="-1.28187e-07" x2="19.7802" y2="12.7346" gradientUnits="userSpaceOnUse">
|
||||
<stop stop-color="white" stop-opacity="0.2"/>
|
||||
<stop offset="1" stop-color="#FF4405"/>
|
||||
</linearGradient>
|
||||
<linearGradient id="paint2_linear_6296_109592" x1="0" y1="0" x2="22.6412" y2="1.78551" gradientUnits="userSpaceOnUse">
|
||||
<stop stop-color="#FF692E"/>
|
||||
<stop offset="1" stop-color="#E04F16"/>
|
||||
</linearGradient>
|
||||
<linearGradient id="paint3_linear_6296_109592" x1="8.55422" y1="-1.28187e-07" x2="19.7802" y2="12.7346" gradientUnits="userSpaceOnUse">
|
||||
<stop stop-color="white" stop-opacity="0.2"/>
|
||||
<stop offset="1" stop-color="#FF4405"/>
|
||||
</linearGradient>
|
||||
<linearGradient id="paint4_linear_6296_109592" x1="9.99989" y1="5.55556" x2="9.99989" y2="15.4365" gradientUnits="userSpaceOnUse">
|
||||
<stop stop-color="white" stop-opacity="0.95"/>
|
||||
<stop offset="1" stop-color="white" stop-opacity="0.8"/>
|
||||
</linearGradient>
|
||||
<linearGradient id="paint5_linear_6296_109592" x1="-4.78632" y1="4.375" x2="16.2164" y2="10.4" gradientUnits="userSpaceOnUse">
|
||||
<stop stop-color="white" stop-opacity="0.12"/>
|
||||
<stop offset="1" stop-color="white" stop-opacity="0.2"/>
|
||||
</linearGradient>
|
||||
</defs>
|
||||
</svg>
|
||||
|
After Width: | Height: | Size: 9.2 KiB |
@ -0,0 +1,58 @@
|
||||
<svg width="20" height="20" viewBox="0 0 20 20" fill="none" xmlns="http://www.w3.org/2000/svg">
|
||||
<g id="Partner">
|
||||
<mask id="mask0_6291_109635" style="mask-type:alpha" maskUnits="userSpaceOnUse" x="1" y="0" width="18" height="20">
|
||||
<g id="Mask">
|
||||
<path d="M7.33333 1.5396C8.30481 0.978718 8.79055 0.698276 9.30696 0.58851C9.76388 0.491388 10.2361 0.491388 10.693 0.58851C11.2094 0.698276 11.6952 0.978718 12.6667 1.5396L15.9936 3.4604C16.9651 4.02128 17.4508 4.30172 17.8041 4.69407C18.1166 5.04121 18.3528 5.45018 18.4971 5.89444C18.6603 6.39655 18.6603 6.95744 18.6603 8.0792V11.9208C18.6603 13.0426 18.6603 13.6034 18.4971 14.1056C18.3528 14.5498 18.1166 14.9588 17.8041 15.3059C17.4508 15.6983 16.9651 15.9787 15.9936 16.5396L12.6667 18.4604C11.6952 19.0213 11.2094 19.3017 10.693 19.4115C10.2361 19.5086 9.76388 19.5086 9.30696 19.4115C8.79055 19.3017 8.30481 19.0213 7.33333 18.4604L4.00641 16.5396C3.03493 15.9787 2.5492 15.6983 2.19593 15.3059C1.88336 14.9588 1.64724 14.5498 1.50289 14.1056C1.33975 13.6034 1.33975 13.0426 1.33975 11.9208V8.0792C1.33975 6.95744 1.33975 6.39655 1.50289 5.89444C1.64724 5.45018 1.88336 5.04121 2.19593 4.69407C2.5492 4.30172 3.03493 4.02128 4.00641 3.4604L7.33333 1.5396Z" fill="#F9DBAF"/>
|
||||
<path d="M7.33333 1.5396C8.30481 0.978718 8.79055 0.698276 9.30696 0.58851C9.76388 0.491388 10.2361 0.491388 10.693 0.58851C11.2094 0.698276 11.6952 0.978718 12.6667 1.5396L15.9936 3.4604C16.9651 4.02128 17.4508 4.30172 17.8041 4.69407C18.1166 5.04121 18.3528 5.45018 18.4971 5.89444C18.6603 6.39655 18.6603 6.95744 18.6603 8.0792V11.9208C18.6603 13.0426 18.6603 13.6034 18.4971 14.1056C18.3528 14.5498 18.1166 14.9588 17.8041 15.3059C17.4508 15.6983 16.9651 15.9787 15.9936 16.5396L12.6667 18.4604C11.6952 19.0213 11.2094 19.3017 10.693 19.4115C10.2361 19.5086 9.76388 19.5086 9.30696 19.4115C8.79055 19.3017 8.30481 19.0213 7.33333 18.4604L4.00641 16.5396C3.03493 15.9787 2.5492 15.6983 2.19593 15.3059C1.88336 14.9588 1.64724 14.5498 1.50289 14.1056C1.33975 13.6034 1.33975 13.0426 1.33975 11.9208V8.0792C1.33975 6.95744 1.33975 6.39655 1.50289 5.89444C1.64724 5.45018 1.88336 5.04121 2.19593 4.69407C2.5492 4.30172 3.03493 4.02128 4.00641 3.4604L7.33333 1.5396Z" fill="url(#paint0_linear_6291_109635)" fill-opacity="0.9"/>
|
||||
<path d="M7.47222 1.78016C8.45993 1.20991 8.90155 0.958665 9.36471 0.860217C9.78356 0.771189 10.2164 0.771189 10.6353 0.860217C11.0984 0.958665 11.5401 1.20991 12.5278 1.78016L15.8547 3.70096C16.8424 4.27121 17.2808 4.52805 17.5976 4.87994C17.8842 5.19815 18.1006 5.57304 18.2329 5.98028C18.3792 6.43061 18.3825 6.9387 18.3825 8.0792V11.9208C18.3825 13.0613 18.3792 13.5694 18.2329 14.0197C18.1006 14.427 17.8842 14.8018 17.5976 15.1201C17.2808 15.4719 16.8424 15.7288 15.8547 16.299L12.5278 18.2198C11.5401 18.7901 11.0984 19.0413 10.6353 19.1398C10.2164 19.2288 9.78356 19.2288 9.36471 19.1398C8.90155 19.0413 8.45993 18.7901 7.47222 18.2198L4.1453 16.299C3.1576 15.7288 2.7192 15.4719 2.40236 15.1201C2.11584 14.8018 1.89939 14.427 1.76707 14.0197C1.62075 13.5694 1.61752 13.0613 1.61752 11.9208V8.0792C1.61752 6.9387 1.62075 6.43061 1.76707 5.98028C1.89939 5.57304 2.11584 5.19815 2.40236 4.87994C2.7192 4.52805 3.1576 4.27121 4.1453 3.70096L7.47222 1.78016Z" stroke="url(#paint1_linear_6291_109635)" stroke-opacity="0.8" stroke-width="0.555556"/>
|
||||
</g>
|
||||
</mask>
|
||||
<g mask="url(#mask0_6291_109635)">
|
||||
<g id="badge-bg">
|
||||
<path d="M7.33333 1.5396C8.30481 0.978718 8.79055 0.698276 9.30696 0.58851C9.76388 0.491388 10.2361 0.491388 10.693 0.58851C11.2094 0.698276 11.6952 0.978718 12.6667 1.5396L15.9936 3.4604C16.9651 4.02128 17.4508 4.30172 17.8041 4.69407C18.1166 5.04121 18.3528 5.45018 18.4971 5.89444C18.6603 6.39655 18.6603 6.95744 18.6603 8.0792V11.9208C18.6603 13.0426 18.6603 13.6034 18.4971 14.1056C18.3528 14.5498 18.1166 14.9588 17.8041 15.3059C17.4508 15.6983 16.9651 15.9787 15.9936 16.5396L12.6667 18.4604C11.6952 19.0213 11.2094 19.3017 10.693 19.4115C10.2361 19.5086 9.76388 19.5086 9.30696 19.4115C8.79055 19.3017 8.30481 19.0213 7.33333 18.4604L4.00641 16.5396C3.03493 15.9787 2.5492 15.6983 2.19593 15.3059C1.88336 14.9588 1.64724 14.5498 1.50289 14.1056C1.33975 13.6034 1.33975 13.0426 1.33975 11.9208V8.0792C1.33975 6.95744 1.33975 6.39655 1.50289 5.89444C1.64724 5.45018 1.88336 5.04121 2.19593 4.69407C2.5492 4.30172 3.03493 4.02128 4.00641 3.4604L7.33333 1.5396Z" fill="#F9DBAF"/>
|
||||
<path d="M7.33333 1.5396C8.30481 0.978718 8.79055 0.698276 9.30696 0.58851C9.76388 0.491388 10.2361 0.491388 10.693 0.58851C11.2094 0.698276 11.6952 0.978718 12.6667 1.5396L15.9936 3.4604C16.9651 4.02128 17.4508 4.30172 17.8041 4.69407C18.1166 5.04121 18.3528 5.45018 18.4971 5.89444C18.6603 6.39655 18.6603 6.95744 18.6603 8.0792V11.9208C18.6603 13.0426 18.6603 13.6034 18.4971 14.1056C18.3528 14.5498 18.1166 14.9588 17.8041 15.3059C17.4508 15.6983 16.9651 15.9787 15.9936 16.5396L12.6667 18.4604C11.6952 19.0213 11.2094 19.3017 10.693 19.4115C10.2361 19.5086 9.76388 19.5086 9.30696 19.4115C8.79055 19.3017 8.30481 19.0213 7.33333 18.4604L4.00641 16.5396C3.03493 15.9787 2.5492 15.6983 2.19593 15.3059C1.88336 14.9588 1.64724 14.5498 1.50289 14.1056C1.33975 13.6034 1.33975 13.0426 1.33975 11.9208V8.0792C1.33975 6.95744 1.33975 6.39655 1.50289 5.89444C1.64724 5.45018 1.88336 5.04121 2.19593 4.69407C2.5492 4.30172 3.03493 4.02128 4.00641 3.4604L7.33333 1.5396Z" fill="url(#paint2_linear_6291_109635)" fill-opacity="0.9"/>
|
||||
<path d="M7.58333 1.97261C8.58402 1.39487 8.99036 1.16698 9.41092 1.07758C9.7993 0.99503 10.2007 0.99503 10.5891 1.07758C11.0096 1.16698 11.416 1.39487 12.4167 1.97261L15.7436 3.89341C16.7443 4.47116 17.1448 4.70911 17.4325 5.02863C17.6982 5.3237 17.8989 5.67133 18.0216 6.04895C18.1544 6.45786 18.1603 6.92371 18.1603 8.0792V11.9208C18.1603 13.0763 18.1544 13.5421 18.0216 13.951C17.8989 14.3287 17.6982 14.6763 17.4325 14.9714C17.1448 15.2909 16.7443 15.5288 15.7436 16.1066L12.4167 18.0274C11.416 18.6051 11.0096 18.833 10.5891 18.9224C10.2007 19.005 9.7993 19.005 9.41092 18.9224C8.99036 18.833 8.58402 18.6051 7.58333 18.0274L4.25641 16.1066C3.25572 15.5288 2.8552 15.2909 2.5675 14.9714C2.30182 14.6763 2.10112 14.3287 1.97842 13.951C1.84556 13.5421 1.83975 13.0763 1.83975 11.9208V8.0792C1.83975 6.92371 1.84556 6.45786 1.97842 6.04895C2.10112 5.67133 2.30182 5.3237 2.5675 5.02863C2.8552 4.70911 3.25572 4.47116 4.25641 3.89341L7.58333 1.97261Z" stroke="url(#paint3_linear_6291_109635)" stroke-opacity="0.8"/>
|
||||
</g>
|
||||
<g id="handshake" filter="url(#filter0_d_6291_109635)">
|
||||
<path d="M11.0969 9.64852C10.895 9.44652 10.5675 9.44652 10.3656 9.64852L9.99991 10.0142C9.59596 10.4181 8.94109 10.4181 8.53717 10.0142C8.13325 9.61025 8.13325 8.95537 8.53717 8.55146L11.4491 5.63879C12.5371 5.39265 13.7238 5.69313 14.5709 6.54022C15.8221 7.79139 15.8807 9.7835 14.7469 11.1041L13.6567 12.2083L11.0969 9.64852ZM5.42889 6.54022C6.55286 5.41625 8.27475 5.25463 9.57067 6.05534L7.80581 7.8201C6.99797 8.62794 6.99797 9.93771 7.80581 10.7456C8.58917 11.5289 9.8445 11.5526 10.6564 10.8168L10.7313 10.7456L12.9253 12.9396L10.7313 15.1337C10.3273 15.5376 9.67245 15.5376 9.26855 15.1337L5.42889 11.294C4.11615 9.98131 4.11615 7.85295 5.42889 6.54022Z" fill="url(#paint4_linear_6291_109635)" shape-rendering="crispEdges"/>
|
||||
</g>
|
||||
<path id="highlight" opacity="0.5" d="M0 0H15.5556L5.26663 20H0V0Z" fill="url(#paint5_linear_6291_109635)"/>
|
||||
</g>
|
||||
</g>
|
||||
<defs>
|
||||
<filter id="filter0_d_6291_109635" x="3.94434" y="5.30566" width="12.1111" height="10.8809" filterUnits="userSpaceOnUse" color-interpolation-filters="sRGB">
|
||||
<feFlood flood-opacity="0" result="BackgroundImageFix"/>
|
||||
<feColorMatrix in="SourceAlpha" type="matrix" values="0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 127 0" result="hardAlpha"/>
|
||||
<feOffset dy="0.25"/>
|
||||
<feGaussianBlur stdDeviation="0.25"/>
|
||||
<feComposite in2="hardAlpha" operator="out"/>
|
||||
<feColorMatrix type="matrix" values="0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0.2 0"/>
|
||||
<feBlend mode="normal" in2="BackgroundImageFix" result="effect1_dropShadow_6291_109635"/>
|
||||
<feBlend mode="normal" in="SourceGraphic" in2="effect1_dropShadow_6291_109635" result="shape"/>
|
||||
</filter>
|
||||
<linearGradient id="paint0_linear_6291_109635" x1="0" y1="0" x2="22.6412" y2="1.78551" gradientUnits="userSpaceOnUse">
|
||||
<stop stop-color="#FF692E"/>
|
||||
<stop offset="1" stop-color="#E04F16"/>
|
||||
</linearGradient>
|
||||
<linearGradient id="paint1_linear_6291_109635" x1="8.55422" y1="-1.28187e-07" x2="19.7802" y2="12.7346" gradientUnits="userSpaceOnUse">
|
||||
<stop stop-color="white" stop-opacity="0.95"/>
|
||||
<stop offset="1" stop-color="#E62E05"/>
|
||||
</linearGradient>
|
||||
<linearGradient id="paint2_linear_6291_109635" x1="0" y1="0" x2="22.6412" y2="1.78551" gradientUnits="userSpaceOnUse">
|
||||
<stop stop-color="#FF692E"/>
|
||||
<stop offset="1" stop-color="#E04F16"/>
|
||||
</linearGradient>
|
||||
<linearGradient id="paint3_linear_6291_109635" x1="8.55422" y1="-1.28187e-07" x2="19.7802" y2="12.7346" gradientUnits="userSpaceOnUse">
|
||||
<stop stop-color="white" stop-opacity="0.95"/>
|
||||
<stop offset="1" stop-color="#E62E05"/>
|
||||
</linearGradient>
|
||||
<linearGradient id="paint4_linear_6291_109635" x1="9.99989" y1="5.55566" x2="9.99989" y2="15.4366" gradientUnits="userSpaceOnUse">
|
||||
<stop stop-color="white"/>
|
||||
<stop offset="1" stop-color="white" stop-opacity="0.9"/>
|
||||
</linearGradient>
|
||||
<linearGradient id="paint5_linear_6291_109635" x1="-4.78632" y1="4.375" x2="16.2164" y2="10.4" gradientUnits="userSpaceOnUse">
|
||||
<stop stop-color="white" stop-opacity="0.12"/>
|
||||
<stop offset="1" stop-color="white" stop-opacity="0.3"/>
|
||||
</linearGradient>
|
||||
</defs>
|
||||
</svg>
After | Size: 9.2 KiB
@@ -0,0 +1,58 @@
<svg width="20" height="20" viewBox="0 0 20 20" fill="none" xmlns="http://www.w3.org/2000/svg">
|
||||
<g id="Verified">
|
||||
<mask id="mask0_6296_109593" style="mask-type:alpha" maskUnits="userSpaceOnUse" x="0" y="0" width="20" height="20">
|
||||
<g id="Mask">
|
||||
<path fill-rule="evenodd" clip-rule="evenodd" d="M8.08817 1.62521C9.14394 0.569383 10.8558 0.569374 11.9116 1.62521L12.8128 2.52641C12.9819 2.69542 13.2111 2.79037 13.4501 2.79037H14.5059C15.9991 2.79037 17.2095 4.00082 17.2095 5.49398V6.54981C17.2095 6.78882 17.3045 7.01805 17.4735 7.18706L18.3747 8.08826C19.4305 9.1441 19.4305 10.8559 18.3747 11.9118L17.4735 12.813C17.3045 12.982 17.2095 13.2112 17.2095 13.4502V14.506C17.2095 15.9992 15.9991 17.2096 14.5059 17.2096H13.4501C13.2111 17.2096 12.9819 17.3046 12.8128 17.4736L11.9116 18.3748C10.8558 19.4306 9.14403 19.4306 8.08817 18.3748L7.18696 17.4736C7.01795 17.3046 6.78873 17.2096 6.54972 17.2096H5.49389C4.00072 17.2096 2.79028 15.9992 2.79028 14.506V13.4502C2.79028 13.2112 2.69533 12.982 2.52632 12.813L1.62513 11.9118C0.569295 10.8559 0.569295 9.1441 1.62512 8.08826L2.52632 7.18706C2.69533 7.01806 2.79028 6.78882 2.79028 6.54981V5.49398C2.79028 4.00082 4.00072 2.79037 5.49389 2.79037H6.54972C6.78873 2.79037 7.01795 2.69542 7.18696 2.52641L8.08817 1.62521Z" fill="#003DC1"/>
|
||||
<path fill-rule="evenodd" clip-rule="evenodd" d="M8.08817 1.62521C9.14394 0.569383 10.8558 0.569374 11.9116 1.62521L12.8128 2.52641C12.9819 2.69542 13.2111 2.79037 13.4501 2.79037H14.5059C15.9991 2.79037 17.2095 4.00082 17.2095 5.49398V6.54981C17.2095 6.78882 17.3045 7.01805 17.4735 7.18706L18.3747 8.08826C19.4305 9.1441 19.4305 10.8559 18.3747 11.9118L17.4735 12.813C17.3045 12.982 17.2095 13.2112 17.2095 13.4502V14.506C17.2095 15.9992 15.9991 17.2096 14.5059 17.2096H13.4501C13.2111 17.2096 12.9819 17.3046 12.8128 17.4736L11.9116 18.3748C10.8558 19.4306 9.14403 19.4306 8.08817 18.3748L7.18696 17.4736C7.01795 17.3046 6.78873 17.2096 6.54972 17.2096H5.49389C4.00072 17.2096 2.79028 15.9992 2.79028 14.506V13.4502C2.79028 13.2112 2.69533 12.982 2.52632 12.813L1.62513 11.9118C0.569295 10.8559 0.569295 9.1441 1.62512 8.08826L2.52632 7.18706C2.69533 7.01806 2.79028 6.78882 2.79028 6.54981V5.49398C2.79028 4.00082 4.00072 2.79037 5.49389 2.79037H6.54972C6.78873 2.79037 7.01795 2.69542 7.18696 2.52641L8.08817 1.62521Z" fill="url(#paint0_linear_6296_109593)" fill-opacity="0.9"/>
|
||||
<path d="M8.27881 1.81585L8.27881 1.81585C9.2293 0.865317 10.7704 0.865301 11.721 1.81585L12.6222 2.71705L12.6222 2.71709C12.8418 2.9366 13.1395 3.05997 13.4501 3.05997H14.5059C15.8502 3.05997 16.9399 4.14972 16.9399 5.49398V6.54981C16.9399 6.86036 17.0633 7.15813 17.2828 7.37768L17.2829 7.3777L18.1841 8.2789L18.3747 8.08826L18.1841 8.27891C19.1346 9.22945 19.1346 10.7706 18.1841 11.7211L17.2829 12.6224C17.0633 12.8419 16.9399 13.1397 16.9399 13.4502V14.506C16.9399 15.8503 15.8502 16.94 14.5059 16.94H13.4501C13.1395 16.94 12.8418 17.0634 12.6222 17.2829L12.6222 17.2829L11.721 18.1841C10.7704 19.1347 9.22939 19.1347 8.27881 18.1841L7.37761 17.2829L7.37759 17.2829C7.15804 17.0634 6.86027 16.94 6.54972 16.94H5.49389C4.14962 16.94 3.05989 15.8503 3.05989 14.506V13.4502C3.05989 13.1398 2.93655 12.8419 2.71696 12.6224C2.71696 12.6223 2.71695 12.6223 2.71694 12.6223L1.81577 11.7211C0.865224 10.7706 0.865226 9.22945 1.81576 8.2789L2.71696 7.3777C2.71696 7.3777 2.71696 7.3777 2.71696 7.3777C2.93654 7.15813 3.05989 6.86033 3.05989 6.54981V5.49398C3.05989 4.14972 4.14963 3.05997 5.49389 3.05997H6.54972C6.86024 3.05997 7.15803 2.93662 7.3776 2.71706L7.37761 2.71705L8.27881 1.81585Z" stroke="url(#paint1_linear_6296_109593)" stroke-opacity="0.8" stroke-width="0.539216"/>
|
||||
</g>
|
||||
</mask>
|
||||
<g mask="url(#mask0_6296_109593)">
|
||||
<g id="badge-bg">
|
||||
<path fill-rule="evenodd" clip-rule="evenodd" d="M8.08817 1.62521C9.14394 0.569383 10.8558 0.569374 11.9116 1.62521L12.8128 2.52641C12.9819 2.69542 13.2111 2.79037 13.4501 2.79037H14.5059C15.9991 2.79037 17.2095 4.00082 17.2095 5.49398V6.54981C17.2095 6.78882 17.3045 7.01805 17.4735 7.18706L18.3747 8.08826C19.4305 9.1441 19.4305 10.8559 18.3747 11.9118L17.4735 12.813C17.3045 12.982 17.2095 13.2112 17.2095 13.4502V14.506C17.2095 15.9992 15.9991 17.2096 14.5059 17.2096H13.4501C13.2111 17.2096 12.9819 17.3046 12.8128 17.4736L11.9116 18.3748C10.8558 19.4306 9.14403 19.4306 8.08817 18.3748L7.18696 17.4736C7.01795 17.3046 6.78873 17.2096 6.54972 17.2096H5.49389C4.00072 17.2096 2.79028 15.9992 2.79028 14.506V13.4502C2.79028 13.2112 2.69533 12.982 2.52632 12.813L1.62513 11.9118C0.569295 10.8559 0.569295 9.1441 1.62512 8.08826L2.52632 7.18706C2.69533 7.01806 2.79028 6.78882 2.79028 6.54981V5.49398C2.79028 4.00082 4.00072 2.79037 5.49389 2.79037H6.54972C6.78873 2.79037 7.01795 2.69542 7.18696 2.52641L8.08817 1.62521Z" fill="#003DC1"/>
|
||||
<path fill-rule="evenodd" clip-rule="evenodd" d="M8.08817 1.62521C9.14394 0.569383 10.8558 0.569374 11.9116 1.62521L12.8128 2.52641C12.9819 2.69542 13.2111 2.79037 13.4501 2.79037H14.5059C15.9991 2.79037 17.2095 4.00082 17.2095 5.49398V6.54981C17.2095 6.78882 17.3045 7.01805 17.4735 7.18706L18.3747 8.08826C19.4305 9.1441 19.4305 10.8559 18.3747 11.9118L17.4735 12.813C17.3045 12.982 17.2095 13.2112 17.2095 13.4502V14.506C17.2095 15.9992 15.9991 17.2096 14.5059 17.2096H13.4501C13.2111 17.2096 12.9819 17.3046 12.8128 17.4736L11.9116 18.3748C10.8558 19.4306 9.14403 19.4306 8.08817 18.3748L7.18696 17.4736C7.01795 17.3046 6.78873 17.2096 6.54972 17.2096H5.49389C4.00072 17.2096 2.79028 15.9992 2.79028 14.506V13.4502C2.79028 13.2112 2.69533 12.982 2.52632 12.813L1.62513 11.9118C0.569295 10.8559 0.569295 9.1441 1.62512 8.08826L2.52632 7.18706C2.69533 7.01806 2.79028 6.78882 2.79028 6.54981V5.49398C2.79028 4.00082 4.00072 2.79037 5.49389 2.79037H6.54972C6.78873 2.79037 7.01795 2.69542 7.18696 2.52641L8.08817 1.62521Z" fill="url(#paint2_linear_6296_109593)" fill-opacity="0.9"/>
|
||||
<path d="M8.44172 1.97876L8.44173 1.97875C9.30224 1.11821 10.6975 1.11818 11.5581 1.97876L12.4593 2.87997L12.4593 2.88003C12.7221 3.1427 13.0784 3.29037 13.4501 3.29037H14.5059C15.723 3.29037 16.7095 4.27696 16.7095 5.49398V6.54981C16.7095 6.92148 16.8572 7.27785 17.1199 7.54057L17.1199 7.54061L18.0211 8.44182L18.3747 8.08826L18.0211 8.44182C18.8817 9.30239 18.8817 10.6976 18.0211 11.5582L17.1199 12.4594C16.8572 12.7222 16.7095 13.0786 16.7095 13.4502V14.506C16.7095 15.7231 15.723 16.7096 14.5059 16.7096H13.4501C13.0784 16.7096 12.7221 16.8573 12.4594 17.1199L12.4593 17.12L11.5581 18.0212C10.6975 18.8818 9.30233 18.8818 8.44172 18.0212L7.54052 17.12L7.54048 17.12C7.27775 16.8573 6.92139 16.7096 6.54972 16.7096H5.49389C4.27686 16.7096 3.29028 15.7231 3.29028 14.506V13.4502C3.29028 13.0787 3.14267 12.7222 2.87984 12.4594L1.97868 11.5582C1.11811 10.6976 1.11811 9.30238 1.97867 8.44181L2.87986 7.54062C2.87987 7.54062 2.87987 7.54061 2.87987 7.54061C3.14266 7.27784 3.29028 6.92143 3.29028 6.54981V5.49398C3.29028 4.27696 4.27687 3.29037 5.49389 3.29037H6.54972C6.92135 3.29037 7.27774 3.14273 7.54051 2.87998L7.54052 2.87997L8.44172 1.97876Z" stroke="url(#paint3_linear_6296_109593)" stroke-opacity="0.8"/>
|
||||
</g>
|
||||
<g id="check" filter="url(#filter0_d_6296_109593)">
|
||||
<path fill-rule="evenodd" clip-rule="evenodd" d="M13.4219 6.98132C13.8732 7.28924 13.9829 7.89545 13.667 8.33533L10.04 13.3858C9.87754 13.612 9.62408 13.7602 9.34287 13.7934C9.06166 13.8266 8.77923 13.7417 8.56605 13.5599L6.49346 11.7923C6.0789 11.4387 6.03689 10.8245 6.39963 10.4204C6.76238 10.0163 7.39252 9.97533 7.80709 10.3289L9.04316 11.3831L12.0328 7.22026C12.3487 6.78038 12.9706 6.6734 13.4219 6.98132Z" fill="url(#paint4_linear_6296_109593)" shape-rendering="crispEdges"/>
|
||||
</g>
|
||||
<path id="highlight" opacity="0.5" d="M0 0H15.5556L5.26663 20H0V0Z" fill="url(#paint5_linear_6296_109593)"/>
|
||||
</g>
|
||||
</g>
|
||||
<defs>
|
||||
<filter id="filter0_d_6296_109593" x="5.65283" y="6.55549" width="8.69458" height="7.995" filterUnits="userSpaceOnUse" color-interpolation-filters="sRGB">
|
||||
<feFlood flood-opacity="0" result="BackgroundImageFix"/>
|
||||
<feColorMatrix in="SourceAlpha" type="matrix" values="0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 127 0" result="hardAlpha"/>
|
||||
<feOffset dy="0.25"/>
|
||||
<feGaussianBlur stdDeviation="0.25"/>
|
||||
<feComposite in2="hardAlpha" operator="out"/>
|
||||
<feColorMatrix type="matrix" values="0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0.2 0"/>
|
||||
<feBlend mode="normal" in2="BackgroundImageFix" result="effect1_dropShadow_6296_109593"/>
|
||||
<feBlend mode="normal" in="SourceGraphic" in2="effect1_dropShadow_6296_109593" result="shape"/>
|
||||
</filter>
|
||||
<linearGradient id="paint0_linear_6296_109593" x1="16.302" y1="19.1667" x2="-0.37184" y2="14.8201" gradientUnits="userSpaceOnUse">
|
||||
<stop stop-color="#296DFF"/>
|
||||
<stop offset="1" stop-color="#5289FF"/>
|
||||
</linearGradient>
|
||||
<linearGradient id="paint1_linear_6296_109593" x1="8.67462" y1="0.833336" x2="18.9651" y2="12.5067" gradientUnits="userSpaceOnUse">
|
||||
<stop stop-color="white" stop-opacity="0.2"/>
|
||||
<stop offset="1" stop-color="#296DFF"/>
|
||||
</linearGradient>
|
||||
<linearGradient id="paint2_linear_6296_109593" x1="16.302" y1="19.1667" x2="-0.37184" y2="14.8201" gradientUnits="userSpaceOnUse">
|
||||
<stop stop-color="#296DFF"/>
|
||||
<stop offset="1" stop-color="#5289FF"/>
|
||||
</linearGradient>
|
||||
<linearGradient id="paint3_linear_6296_109593" x1="8.67462" y1="0.833336" x2="18.9651" y2="12.5067" gradientUnits="userSpaceOnUse">
|
||||
<stop stop-color="white" stop-opacity="0.2"/>
|
||||
<stop offset="1" stop-color="#296DFF"/>
|
||||
</linearGradient>
|
||||
<linearGradient id="paint4_linear_6296_109593" x1="10.0001" y1="6.80549" x2="10.0001" y2="13.8005" gradientUnits="userSpaceOnUse">
|
||||
<stop stop-color="white" stop-opacity="0.95"/>
|
||||
<stop offset="1" stop-color="white" stop-opacity="0.8"/>
|
||||
</linearGradient>
|
||||
<linearGradient id="paint5_linear_6296_109593" x1="-4.78632" y1="4.375" x2="16.2164" y2="10.4" gradientUnits="userSpaceOnUse">
|
||||
<stop stop-color="white" stop-opacity="0.12"/>
|
||||
<stop offset="1" stop-color="white" stop-opacity="0.2"/>
|
||||
</linearGradient>
|
||||
</defs>
|
||||
</svg>
After | Size: 9.6 KiB
@@ -0,0 +1,58 @@
<svg width="20" height="20" viewBox="0 0 20 20" fill="none" xmlns="http://www.w3.org/2000/svg">
|
||||
<g id="Verified">
|
||||
<mask id="mask0_6295_120949" style="mask-type:alpha" maskUnits="userSpaceOnUse" x="0" y="0" width="20" height="20">
|
||||
<g id="Mask">
|
||||
<path fill-rule="evenodd" clip-rule="evenodd" d="M8.08817 1.62512C9.14394 0.569299 10.8558 0.56929 11.9116 1.62512L12.8128 2.52633C12.9819 2.69533 13.2111 2.79028 13.4501 2.79028H14.5059C15.9991 2.79028 17.2095 4.00074 17.2095 5.4939V6.54972C17.2095 6.78874 17.3045 7.01796 17.4735 7.18697L18.3747 8.08818C19.4305 9.14401 19.4305 10.8559 18.3747 11.9117L17.4735 12.8129C17.3045 12.9819 17.2095 13.2112 17.2095 13.4502V14.5059C17.2095 15.9991 15.9991 17.2095 14.5059 17.2095H13.4501C13.2111 17.2095 12.9819 17.3045 12.8128 17.4735L11.9116 18.3747C10.8558 19.4305 9.14403 19.4305 8.08817 18.3747L7.18696 17.4735C7.01795 17.3045 6.78873 17.2095 6.54972 17.2095H5.49389C4.00072 17.2095 2.79028 15.9991 2.79028 14.5059V13.4502C2.79028 13.2112 2.69533 12.9819 2.52632 12.8129L1.62513 11.9117C0.569295 10.8559 0.569295 9.14401 1.62512 8.08818L2.52632 7.18697C2.69533 7.01797 2.79028 6.78874 2.79028 6.54972V5.4939C2.79028 4.00074 4.00072 2.79028 5.49389 2.79028H6.54972C6.78873 2.79028 7.01795 2.69533 7.18696 2.52633L8.08817 1.62512Z" fill="#B2CAFF"/>
|
||||
<path fill-rule="evenodd" clip-rule="evenodd" d="M8.08817 1.62512C9.14394 0.569299 10.8558 0.56929 11.9116 1.62512L12.8128 2.52633C12.9819 2.69533 13.2111 2.79028 13.4501 2.79028H14.5059C15.9991 2.79028 17.2095 4.00074 17.2095 5.4939V6.54972C17.2095 6.78874 17.3045 7.01796 17.4735 7.18697L18.3747 8.08818C19.4305 9.14401 19.4305 10.8559 18.3747 11.9117L17.4735 12.8129C17.3045 12.9819 17.2095 13.2112 17.2095 13.4502V14.5059C17.2095 15.9991 15.9991 17.2095 14.5059 17.2095H13.4501C13.2111 17.2095 12.9819 17.3045 12.8128 17.4735L11.9116 18.3747C10.8558 19.4305 9.14403 19.4305 8.08817 18.3747L7.18696 17.4735C7.01795 17.3045 6.78873 17.2095 6.54972 17.2095H5.49389C4.00072 17.2095 2.79028 15.9991 2.79028 14.5059V13.4502C2.79028 13.2112 2.69533 12.9819 2.52632 12.8129L1.62513 11.9117C0.569295 10.8559 0.569295 9.14401 1.62512 8.08818L2.52632 7.18697C2.69533 7.01797 2.79028 6.78874 2.79028 6.54972V5.4939C2.79028 4.00074 4.00072 2.79028 5.49389 2.79028H6.54972C6.78873 2.79028 7.01795 2.69533 7.18696 2.52633L8.08817 1.62512Z" fill="url(#paint0_linear_6295_120949)" fill-opacity="0.9"/>
|
||||
<path d="M8.27881 1.81577L8.27881 1.81576C9.2293 0.865233 10.7704 0.865217 11.721 1.81577L12.6222 2.71697L12.6222 2.71701C12.8418 2.93652 13.1395 3.05989 13.4501 3.05989H14.5059C15.8502 3.05989 16.9399 4.14963 16.9399 5.4939V6.54972C16.9399 6.86027 17.0633 7.15805 17.2828 7.3776L17.2829 7.37762L18.1841 8.27882L18.3747 8.08818L18.1841 8.27882C19.1346 9.22937 19.1346 10.7705 18.1841 11.7211L17.2829 12.6223C17.0633 12.8418 16.9399 13.1396 16.9399 13.4502V14.5059C16.9399 15.8502 15.8502 16.9399 14.5059 16.9399H13.4501C13.1395 16.9399 12.8418 17.0633 12.6222 17.2828L12.6222 17.2829L11.721 18.1841C10.7704 19.1346 9.22939 19.1346 8.27881 18.1841L7.37761 17.2829L7.37759 17.2828C7.15804 17.0633 6.86027 16.9399 6.54972 16.9399H5.49389C4.14962 16.9399 3.05989 15.8502 3.05989 14.5059V13.4502C3.05989 13.1397 2.93655 12.8418 2.71696 12.6223C2.71696 12.6223 2.71695 12.6223 2.71694 12.6222L1.81577 11.7211C0.865224 10.7705 0.865226 9.22936 1.81576 8.27882L2.71696 7.37762C2.71696 7.37762 2.71696 7.37762 2.71696 7.37762C2.93654 7.15805 3.05989 6.86024 3.05989 6.54972V5.4939C3.05989 4.14964 4.14963 3.05989 5.49389 3.05989H6.54972C6.86024 3.05989 7.15803 2.93653 7.3776 2.71698L7.37761 2.71697L8.27881 1.81577Z" stroke="url(#paint1_linear_6295_120949)" stroke-opacity="0.8" stroke-width="0.539216"/>
|
||||
</g>
|
||||
</mask>
|
||||
<g mask="url(#mask0_6295_120949)">
|
||||
<g id="badge-bg">
|
||||
<path fill-rule="evenodd" clip-rule="evenodd" d="M8.08817 1.62512C9.14394 0.569299 10.8558 0.56929 11.9116 1.62512L12.8128 2.52633C12.9819 2.69533 13.2111 2.79028 13.4501 2.79028H14.5059C15.9991 2.79028 17.2095 4.00074 17.2095 5.4939V6.54972C17.2095 6.78874 17.3045 7.01796 17.4735 7.18697L18.3747 8.08818C19.4305 9.14401 19.4305 10.8559 18.3747 11.9117L17.4735 12.8129C17.3045 12.9819 17.2095 13.2112 17.2095 13.4502V14.5059C17.2095 15.9991 15.9991 17.2095 14.5059 17.2095H13.4501C13.2111 17.2095 12.9819 17.3045 12.8128 17.4735L11.9116 18.3747C10.8558 19.4305 9.14403 19.4305 8.08817 18.3747L7.18696 17.4735C7.01795 17.3045 6.78873 17.2095 6.54972 17.2095H5.49389C4.00072 17.2095 2.79028 15.9991 2.79028 14.5059V13.4502C2.79028 13.2112 2.69533 12.9819 2.52632 12.8129L1.62513 11.9117C0.569295 10.8559 0.569295 9.14401 1.62512 8.08818L2.52632 7.18697C2.69533 7.01797 2.79028 6.78874 2.79028 6.54972V5.4939C2.79028 4.00074 4.00072 2.79028 5.49389 2.79028H6.54972C6.78873 2.79028 7.01795 2.69533 7.18696 2.52633L8.08817 1.62512Z" fill="#B2CAFF"/>
|
||||
<path fill-rule="evenodd" clip-rule="evenodd" d="M8.08817 1.62512C9.14394 0.569299 10.8558 0.56929 11.9116 1.62512L12.8128 2.52633C12.9819 2.69533 13.2111 2.79028 13.4501 2.79028H14.5059C15.9991 2.79028 17.2095 4.00074 17.2095 5.4939V6.54972C17.2095 6.78874 17.3045 7.01796 17.4735 7.18697L18.3747 8.08818C19.4305 9.14401 19.4305 10.8559 18.3747 11.9117L17.4735 12.8129C17.3045 12.9819 17.2095 13.2112 17.2095 13.4502V14.5059C17.2095 15.9991 15.9991 17.2095 14.5059 17.2095H13.4501C13.2111 17.2095 12.9819 17.3045 12.8128 17.4735L11.9116 18.3747C10.8558 19.4305 9.14403 19.4305 8.08817 18.3747L7.18696 17.4735C7.01795 17.3045 6.78873 17.2095 6.54972 17.2095H5.49389C4.00072 17.2095 2.79028 15.9991 2.79028 14.5059V13.4502C2.79028 13.2112 2.69533 12.9819 2.52632 12.8129L1.62513 11.9117C0.569295 10.8559 0.569295 9.14401 1.62512 8.08818L2.52632 7.18697C2.69533 7.01797 2.79028 6.78874 2.79028 6.54972V5.4939C2.79028 4.00074 4.00072 2.79028 5.49389 2.79028H6.54972C6.78873 2.79028 7.01795 2.69533 7.18696 2.52633L8.08817 1.62512Z" fill="url(#paint2_linear_6295_120949)" fill-opacity="0.9"/>
|
||||
<path d="M8.44172 1.97868L8.44173 1.97867C9.30224 1.11812 10.6975 1.1181 11.5581 1.97868L12.4593 2.87988L12.4593 2.87995C12.7221 3.14262 13.0784 3.29028 13.4501 3.29028H14.5059C15.723 3.29028 16.7095 4.27687 16.7095 5.4939V6.54972C16.7095 6.9214 16.8572 7.27776 17.1199 7.54049L17.1199 7.54053L18.0211 8.44173L18.3747 8.08818L18.0211 8.44174C18.8817 9.3023 18.8817 10.6976 18.0211 11.5582L17.1199 12.4594C16.8572 12.7221 16.7095 13.0785 16.7095 13.4502V14.5059C16.7095 15.723 15.723 16.7095 14.5059 16.7095H13.4501C13.0784 16.7095 12.7221 16.8573 12.4594 17.1198L12.4593 17.1199L11.5581 18.0211C10.6975 18.8817 9.30233 18.8817 8.44172 18.0211L7.54052 17.1199L7.54048 17.1199C7.27775 16.8572 6.92139 16.7095 6.54972 16.7095H5.49389C4.27686 16.7095 3.29028 15.723 3.29028 14.5059V13.4502C3.29028 13.0786 3.14267 12.7221 2.87984 12.4593L1.97868 11.5582C1.11811 10.6976 1.11811 9.3023 1.97867 8.44173L2.87986 7.54054C2.87987 7.54053 2.87987 7.54053 2.87987 7.54053C3.14266 7.27775 3.29028 6.92134 3.29028 6.54972V5.4939C3.29028 4.27688 4.27687 3.29028 5.49389 3.29028H6.54972C6.92135 3.29028 7.27774 3.14265 7.54051 2.87989L7.54052 2.87988L8.44172 1.97868Z" stroke="url(#paint3_linear_6295_120949)" stroke-opacity="0.8"/>
|
||||
</g>
|
||||
<g id="check" filter="url(#filter0_d_6295_120949)">
|
||||
<path fill-rule="evenodd" clip-rule="evenodd" d="M13.4219 6.98125C13.8732 7.28917 13.9829 7.89538 13.667 8.33526L10.04 13.3857C9.87754 13.6119 9.62408 13.7601 9.34287 13.7933C9.06166 13.8266 8.77923 13.7417 8.56605 13.5599L6.49346 11.7922C6.0789 11.4386 6.03689 10.8244 6.39963 10.4203C6.76238 10.0162 7.39252 9.97526 7.80709 10.3288L9.04316 11.3831L12.0328 7.22019C12.3487 6.78031 12.9706 6.67333 13.4219 6.98125Z" fill="url(#paint4_linear_6295_120949)" shape-rendering="crispEdges"/>
|
||||
</g>
|
||||
<path id="highlight" opacity="0.5" d="M0 0H15.5556L5.26663 20H0V0Z" fill="url(#paint5_linear_6295_120949)"/>
|
||||
</g>
|
||||
</g>
|
||||
<defs>
|
||||
<filter id="filter0_d_6295_120949" x="5.65283" y="6.55542" width="8.69458" height="7.99512" filterUnits="userSpaceOnUse" color-interpolation-filters="sRGB">
|
||||
<feFlood flood-opacity="0" result="BackgroundImageFix"/>
|
||||
<feColorMatrix in="SourceAlpha" type="matrix" values="0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 127 0" result="hardAlpha"/>
|
||||
<feOffset dy="0.25"/>
|
||||
<feGaussianBlur stdDeviation="0.25"/>
|
||||
<feComposite in2="hardAlpha" operator="out"/>
|
||||
<feColorMatrix type="matrix" values="0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0.2 0"/>
|
||||
<feBlend mode="normal" in2="BackgroundImageFix" result="effect1_dropShadow_6295_120949"/>
|
||||
<feBlend mode="normal" in="SourceGraphic" in2="effect1_dropShadow_6295_120949" result="shape"/>
|
||||
</filter>
|
||||
<linearGradient id="paint0_linear_6295_120949" x1="16.302" y1="19.1666" x2="-0.37184" y2="14.82" gradientUnits="userSpaceOnUse">
|
||||
<stop stop-color="#155AEF"/>
|
||||
<stop offset="1" stop-color="#5289FF"/>
|
||||
</linearGradient>
|
||||
<linearGradient id="paint1_linear_6295_120949" x1="8.67462" y1="0.833252" x2="18.9651" y2="12.5066" gradientUnits="userSpaceOnUse">
|
||||
<stop stop-color="white" stop-opacity="0.95"/>
|
||||
<stop offset="1" stop-color="#155AEF"/>
|
||||
</linearGradient>
|
||||
<linearGradient id="paint2_linear_6295_120949" x1="16.302" y1="19.1666" x2="-0.37184" y2="14.82" gradientUnits="userSpaceOnUse">
|
||||
<stop stop-color="#155AEF"/>
|
||||
<stop offset="1" stop-color="#5289FF"/>
|
||||
</linearGradient>
|
||||
<linearGradient id="paint3_linear_6295_120949" x1="8.67462" y1="0.833252" x2="18.9651" y2="12.5066" gradientUnits="userSpaceOnUse">
|
||||
<stop stop-color="white" stop-opacity="0.95"/>
|
||||
<stop offset="1" stop-color="#155AEF"/>
|
||||
</linearGradient>
|
||||
<linearGradient id="paint4_linear_6295_120949" x1="10.0001" y1="6.80542" x2="10.0001" y2="13.8004" gradientUnits="userSpaceOnUse">
|
||||
<stop stop-color="white"/>
|
||||
<stop offset="1" stop-color="white" stop-opacity="0.9"/>
|
||||
</linearGradient>
|
||||
<linearGradient id="paint5_linear_6295_120949" x1="-4.78632" y1="4.375" x2="16.2164" y2="10.4" gradientUnits="userSpaceOnUse">
|
||||
<stop stop-color="white" stop-opacity="0.12"/>
|
||||
<stop offset="1" stop-color="white" stop-opacity="0.3"/>
|
||||
</linearGradient>
|
||||
</defs>
|
||||
</svg>
After | Size: 9.7 KiB
@@ -0,0 +1,447 @@
{
|
||||
"icon": {
|
||||
"type": "element",
|
||||
"isRootNode": true,
|
||||
"name": "svg",
|
||||
"attributes": {
|
||||
"width": "20",
|
||||
"height": "20",
|
||||
"viewBox": "0 0 20 20",
|
||||
"fill": "none",
|
||||
"xmlns": "http://www.w3.org/2000/svg"
|
||||
},
|
||||
"children": [
|
||||
{
|
||||
"type": "element",
|
||||
"name": "g",
|
||||
"attributes": {
|
||||
"id": "Partner"
|
||||
},
|
||||
"children": [
|
||||
{
|
||||
"type": "element",
|
||||
"name": "mask",
|
||||
"attributes": {
|
||||
"id": "mask0_6296_109592",
|
||||
"style": "mask-type:alpha",
|
||||
"maskUnits": "userSpaceOnUse",
|
||||
"x": "1",
|
||||
"y": "0",
|
||||
"width": "18",
|
||||
"height": "20"
|
||||
},
|
||||
"children": [
|
||||
{
|
||||
"type": "element",
|
||||
"name": "g",
|
||||
"attributes": {
|
||||
"id": "Mask"
|
||||
},
|
||||
"children": [
|
||||
{
|
||||
"type": "element",
|
||||
"name": "path",
|
||||
"attributes": {
|
||||
"d": "M7.33333 1.5396C8.30481 0.978718 8.79055 0.698276 9.30696 0.58851C9.76388 0.491388 10.2361 0.491388 10.693 0.58851C11.2094 0.698276 11.6952 0.978718 12.6667 1.5396L15.9936 3.4604C16.9651 4.02128 17.4508 4.30172 17.8041 4.69407C18.1166 5.04121 18.3528 5.45018 18.4971 5.89444C18.6603 6.39655 18.6603 6.95744 18.6603 8.0792V11.9208C18.6603 13.0426 18.6603 13.6034 18.4971 14.1056C18.3528 14.5498 18.1166 14.9588 17.8041 15.3059C17.4508 15.6983 16.9651 15.9787 15.9936 16.5396L12.6667 18.4604C11.6952 19.0213 11.2094 19.3017 10.693 19.4115C10.2361 19.5086 9.76388 19.5086 9.30696 19.4115C8.79055 19.3017 8.30481 19.0213 7.33333 18.4604L4.00641 16.5396C3.03493 15.9787 2.5492 15.6983 2.19593 15.3059C1.88336 14.9588 1.64724 14.5498 1.50289 14.1056C1.33975 13.6034 1.33975 13.0426 1.33975 11.9208V8.0792C1.33975 6.95744 1.33975 6.39655 1.50289 5.89444C1.64724 5.45018 1.88336 5.04121 2.19593 4.69407C2.5492 4.30172 3.03493 4.02128 4.00641 3.4604L7.33333 1.5396Z",
|
||||
"fill": "#932F19"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "path",
|
||||
"attributes": {
|
||||
"d": "M7.33333 1.5396C8.30481 0.978718 8.79055 0.698276 9.30696 0.58851C9.76388 0.491388 10.2361 0.491388 10.693 0.58851C11.2094 0.698276 11.6952 0.978718 12.6667 1.5396L15.9936 3.4604C16.9651 4.02128 17.4508 4.30172 17.8041 4.69407C18.1166 5.04121 18.3528 5.45018 18.4971 5.89444C18.6603 6.39655 18.6603 6.95744 18.6603 8.0792V11.9208C18.6603 13.0426 18.6603 13.6034 18.4971 14.1056C18.3528 14.5498 18.1166 14.9588 17.8041 15.3059C17.4508 15.6983 16.9651 15.9787 15.9936 16.5396L12.6667 18.4604C11.6952 19.0213 11.2094 19.3017 10.693 19.4115C10.2361 19.5086 9.76388 19.5086 9.30696 19.4115C8.79055 19.3017 8.30481 19.0213 7.33333 18.4604L4.00641 16.5396C3.03493 15.9787 2.5492 15.6983 2.19593 15.3059C1.88336 14.9588 1.64724 14.5498 1.50289 14.1056C1.33975 13.6034 1.33975 13.0426 1.33975 11.9208V8.0792C1.33975 6.95744 1.33975 6.39655 1.50289 5.89444C1.64724 5.45018 1.88336 5.04121 2.19593 4.69407C2.5492 4.30172 3.03493 4.02128 4.00641 3.4604L7.33333 1.5396Z",
|
||||
"fill": "url(#paint0_linear_6296_109592)",
|
||||
"fill-opacity": "0.9"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "path",
|
||||
"attributes": {
|
||||
"d": "M7.47222 1.78016C8.45993 1.20991 8.90155 0.958665 9.36471 0.860217C9.78356 0.771189 10.2164 0.771189 10.6353 0.860217C11.0984 0.958665 11.5401 1.20991 12.5278 1.78016L15.8547 3.70096C16.8424 4.27121 17.2808 4.52805 17.5976 4.87994C17.8842 5.19815 18.1006 5.57304 18.2329 5.98028C18.3792 6.43061 18.3825 6.9387 18.3825 8.0792V11.9208C18.3825 13.0613 18.3792 13.5694 18.2329 14.0197C18.1006 14.427 17.8842 14.8018 17.5976 15.1201C17.2808 15.4719 16.8424 15.7288 15.8547 16.299L12.5278 18.2198C11.5401 18.7901 11.0984 19.0413 10.6353 19.1398C10.2164 19.2288 9.78356 19.2288 9.36471 19.1398C8.90155 19.0413 8.45993 18.7901 7.47222 18.2198L4.1453 16.299C3.1576 15.7288 2.7192 15.4719 2.40236 15.1201C2.11584 14.8018 1.89939 14.427 1.76707 14.0197C1.62075 13.5694 1.61752 13.0613 1.61752 11.9208V8.0792C1.61752 6.9387 1.62075 6.43061 1.76707 5.98028C1.89939 5.57304 2.11584 5.19815 2.40236 4.87994C2.7192 4.52805 3.1576 4.27121 4.1453 3.70096L7.47222 1.78016Z",
|
||||
"stroke": "url(#paint1_linear_6296_109592)",
|
||||
"stroke-opacity": "0.8",
|
||||
"stroke-width": "0.555556"
|
||||
},
|
||||
"children": []
|
||||
}
|
||||
]
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "g",
|
||||
"attributes": {
|
||||
"mask": "url(#mask0_6296_109592)"
|
||||
},
|
||||
"children": [
|
||||
{
|
||||
"type": "element",
|
||||
"name": "g",
|
||||
"attributes": {
|
||||
"id": "badge-bg"
|
||||
},
|
||||
"children": [
|
||||
{
|
||||
"type": "element",
|
||||
"name": "path",
|
||||
"attributes": {
|
||||
"d": "M7.33333 1.5396C8.30481 0.978718 8.79055 0.698276 9.30696 0.58851C9.76388 0.491388 10.2361 0.491388 10.693 0.58851C11.2094 0.698276 11.6952 0.978718 12.6667 1.5396L15.9936 3.4604C16.9651 4.02128 17.4508 4.30172 17.8041 4.69407C18.1166 5.04121 18.3528 5.45018 18.4971 5.89444C18.6603 6.39655 18.6603 6.95744 18.6603 8.0792V11.9208C18.6603 13.0426 18.6603 13.6034 18.4971 14.1056C18.3528 14.5498 18.1166 14.9588 17.8041 15.3059C17.4508 15.6983 16.9651 15.9787 15.9936 16.5396L12.6667 18.4604C11.6952 19.0213 11.2094 19.3017 10.693 19.4115C10.2361 19.5086 9.76388 19.5086 9.30696 19.4115C8.79055 19.3017 8.30481 19.0213 7.33333 18.4604L4.00641 16.5396C3.03493 15.9787 2.5492 15.6983 2.19593 15.3059C1.88336 14.9588 1.64724 14.5498 1.50289 14.1056C1.33975 13.6034 1.33975 13.0426 1.33975 11.9208V8.0792C1.33975 6.95744 1.33975 6.39655 1.50289 5.89444C1.64724 5.45018 1.88336 5.04121 2.19593 4.69407C2.5492 4.30172 3.03493 4.02128 4.00641 3.4604L7.33333 1.5396Z",
|
||||
"fill": "#932F19"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "path",
|
||||
"attributes": {
|
||||
"d": "M7.33333 1.5396C8.30481 0.978718 8.79055 0.698276 9.30696 0.58851C9.76388 0.491388 10.2361 0.491388 10.693 0.58851C11.2094 0.698276 11.6952 0.978718 12.6667 1.5396L15.9936 3.4604C16.9651 4.02128 17.4508 4.30172 17.8041 4.69407C18.1166 5.04121 18.3528 5.45018 18.4971 5.89444C18.6603 6.39655 18.6603 6.95744 18.6603 8.0792V11.9208C18.6603 13.0426 18.6603 13.6034 18.4971 14.1056C18.3528 14.5498 18.1166 14.9588 17.8041 15.3059C17.4508 15.6983 16.9651 15.9787 15.9936 16.5396L12.6667 18.4604C11.6952 19.0213 11.2094 19.3017 10.693 19.4115C10.2361 19.5086 9.76388 19.5086 9.30696 19.4115C8.79055 19.3017 8.30481 19.0213 7.33333 18.4604L4.00641 16.5396C3.03493 15.9787 2.5492 15.6983 2.19593 15.3059C1.88336 14.9588 1.64724 14.5498 1.50289 14.1056C1.33975 13.6034 1.33975 13.0426 1.33975 11.9208V8.0792C1.33975 6.95744 1.33975 6.39655 1.50289 5.89444C1.64724 5.45018 1.88336 5.04121 2.19593 4.69407C2.5492 4.30172 3.03493 4.02128 4.00641 3.4604L7.33333 1.5396Z",
|
||||
"fill": "url(#paint2_linear_6296_109592)",
|
||||
"fill-opacity": "0.9"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "path",
|
||||
"attributes": {
|
||||
"d": "M7.58333 1.97261C8.58402 1.39487 8.99036 1.16698 9.41092 1.07758C9.7993 0.99503 10.2007 0.99503 10.5891 1.07758C11.0096 1.16698 11.416 1.39487 12.4167 1.97261L15.7436 3.89341C16.7443 4.47116 17.1448 4.70911 17.4325 5.02863C17.6982 5.3237 17.8989 5.67133 18.0216 6.04895C18.1544 6.45786 18.1603 6.92371 18.1603 8.0792V11.9208C18.1603 13.0763 18.1544 13.5421 18.0216 13.951C17.8989 14.3287 17.6982 14.6763 17.4325 14.9714C17.1448 15.2909 16.7443 15.5288 15.7436 16.1066L12.4167 18.0274C11.416 18.6051 11.0096 18.833 10.5891 18.9224C10.2007 19.005 9.7993 19.005 9.41092 18.9224C8.99036 18.833 8.58402 18.6051 7.58333 18.0274L4.25641 16.1066C3.25572 15.5288 2.8552 15.2909 2.5675 14.9714C2.30182 14.6763 2.10112 14.3287 1.97842 13.951C1.84556 13.5421 1.83975 13.0763 1.83975 11.9208V8.0792C1.83975 6.92371 1.84556 6.45786 1.97842 6.04895C2.10112 5.67133 2.30182 5.3237 2.5675 5.02863C2.8552 4.70911 3.25572 4.47116 4.25641 3.89341L7.58333 1.97261Z",
|
||||
"stroke": "url(#paint3_linear_6296_109592)",
|
||||
"stroke-opacity": "0.8"
|
||||
},
|
||||
"children": []
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "g",
|
||||
"attributes": {
|
||||
"id": "handshake",
|
||||
"filter": "url(#filter0_d_6296_109592)"
|
||||
},
|
||||
"children": [
|
||||
{
|
||||
"type": "element",
|
||||
"name": "path",
|
||||
"attributes": {
|
||||
"d": "M11.0969 9.64841C10.895 9.44642 10.5675 9.44642 10.3656 9.64841L9.99991 10.0141C9.59596 10.418 8.94109 10.418 8.53717 10.0141C8.13325 9.61015 8.13325 8.95527 8.53717 8.55135L11.4491 5.63868C12.5371 5.39255 13.7238 5.69302 14.5709 6.54011C15.8221 7.79128 15.8807 9.78339 14.7469 11.104L13.6567 12.2081L11.0969 9.64841ZM5.42889 6.54011C6.55286 5.41614 8.27475 5.25452 9.57067 6.05524L7.80581 7.81999C6.99797 8.62783 6.99797 9.9376 7.80581 10.7454C8.58917 11.5288 9.8445 11.5525 10.6564 10.8167L10.7313 10.7454L12.9253 12.9395L10.7313 15.1336C10.3273 15.5375 9.67245 15.5375 9.26855 15.1336L5.42889 11.2939C4.11615 9.9812 4.11615 7.85284 5.42889 6.54011Z",
|
||||
"fill": "url(#paint4_linear_6296_109592)",
|
||||
"shape-rendering": "crispEdges"
|
||||
},
|
||||
"children": []
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "path",
|
||||
"attributes": {
|
||||
"id": "highlight",
|
||||
"opacity": "0.5",
|
||||
"d": "M0 0H15.5556L5.26663 20H0V0Z",
|
||||
"fill": "url(#paint5_linear_6296_109592)"
|
||||
},
|
||||
"children": []
|
||||
}
|
||||
]
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "defs",
|
||||
"attributes": {},
|
||||
"children": [
|
||||
{
|
||||
"type": "element",
|
||||
"name": "filter",
|
||||
"attributes": {
|
||||
"id": "filter0_d_6296_109592",
|
||||
"x": "3.94434",
|
||||
"y": "5.30556",
|
||||
"width": "12.1111",
|
||||
"height": "10.881",
|
||||
"filterUnits": "userSpaceOnUse",
|
||||
"color-interpolation-filters": "sRGB"
|
||||
},
|
||||
"children": [
|
||||
{
|
||||
"type": "element",
|
||||
"name": "feFlood",
|
||||
"attributes": {
|
||||
"flood-opacity": "0",
|
||||
"result": "BackgroundImageFix"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "feColorMatrix",
|
||||
"attributes": {
|
||||
"in": "SourceAlpha",
|
||||
"type": "matrix",
|
||||
"values": "0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 127 0",
|
||||
"result": "hardAlpha"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "feOffset",
|
||||
"attributes": {
|
||||
"dy": "0.25"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "feGaussianBlur",
|
||||
"attributes": {
|
||||
"stdDeviation": "0.25"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "feComposite",
|
||||
"attributes": {
|
||||
"in2": "hardAlpha",
|
||||
"operator": "out"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "feColorMatrix",
|
||||
"attributes": {
|
||||
"type": "matrix",
|
||||
"values": "0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0.2 0"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "feBlend",
|
||||
"attributes": {
|
||||
"mode": "normal",
|
||||
"in2": "BackgroundImageFix",
|
||||
"result": "effect1_dropShadow_6296_109592"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "feBlend",
|
||||
"attributes": {
|
||||
"mode": "normal",
|
||||
"in": "SourceGraphic",
|
||||
"in2": "effect1_dropShadow_6296_109592",
|
||||
"result": "shape"
|
||||
},
|
||||
"children": []
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "linearGradient",
|
||||
"attributes": {
|
||||
"id": "paint0_linear_6296_109592",
|
||||
"x1": "0",
|
||||
"y1": "0",
|
||||
"x2": "22.6412",
|
||||
"y2": "1.78551",
|
||||
"gradientUnits": "userSpaceOnUse"
|
||||
},
|
||||
"children": [
|
||||
{
|
||||
"type": "element",
|
||||
"name": "stop",
|
||||
"attributes": {
|
||||
"stop-color": "#FF692E"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "stop",
|
||||
"attributes": {
|
||||
"offset": "1",
|
||||
"stop-color": "#E04F16"
|
||||
},
|
||||
"children": []
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "linearGradient",
|
||||
"attributes": {
|
||||
"id": "paint1_linear_6296_109592",
|
||||
"x1": "8.55422",
|
||||
"y1": "-1.28187e-07",
|
||||
"x2": "19.7802",
|
||||
"y2": "12.7346",
|
||||
"gradientUnits": "userSpaceOnUse"
|
||||
},
|
||||
"children": [
|
||||
{
|
||||
"type": "element",
|
||||
"name": "stop",
|
||||
"attributes": {
|
||||
"stop-color": "white",
|
||||
"stop-opacity": "0.2"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "stop",
|
||||
"attributes": {
|
||||
"offset": "1",
|
||||
"stop-color": "#FF4405"
|
||||
},
|
||||
"children": []
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "linearGradient",
|
||||
"attributes": {
|
||||
"id": "paint2_linear_6296_109592",
|
||||
"x1": "0",
|
||||
"y1": "0",
|
||||
"x2": "22.6412",
|
||||
"y2": "1.78551",
|
||||
"gradientUnits": "userSpaceOnUse"
|
||||
},
|
||||
"children": [
|
||||
{
|
||||
"type": "element",
|
||||
"name": "stop",
|
||||
"attributes": {
|
||||
"stop-color": "#FF692E"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "stop",
|
||||
"attributes": {
|
||||
"offset": "1",
|
||||
"stop-color": "#E04F16"
|
||||
},
|
||||
"children": []
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "linearGradient",
|
||||
"attributes": {
|
||||
"id": "paint3_linear_6296_109592",
|
||||
"x1": "8.55422",
|
||||
"y1": "-1.28187e-07",
|
||||
"x2": "19.7802",
|
||||
"y2": "12.7346",
|
||||
"gradientUnits": "userSpaceOnUse"
|
||||
},
|
||||
"children": [
|
||||
{
|
||||
"type": "element",
|
||||
"name": "stop",
|
||||
"attributes": {
|
||||
"stop-color": "white",
|
||||
"stop-opacity": "0.2"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "stop",
|
||||
"attributes": {
|
||||
"offset": "1",
|
||||
"stop-color": "#FF4405"
|
||||
},
|
||||
"children": []
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "linearGradient",
|
||||
"attributes": {
|
||||
"id": "paint4_linear_6296_109592",
|
||||
"x1": "9.99989",
|
||||
"y1": "5.55556",
|
||||
"x2": "9.99989",
|
||||
"y2": "15.4365",
|
||||
"gradientUnits": "userSpaceOnUse"
|
||||
},
|
||||
"children": [
|
||||
{
|
||||
"type": "element",
|
||||
"name": "stop",
|
||||
"attributes": {
|
||||
"stop-color": "white",
|
||||
"stop-opacity": "0.95"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "stop",
|
||||
"attributes": {
|
||||
"offset": "1",
|
||||
"stop-color": "white",
|
||||
"stop-opacity": "0.8"
|
||||
},
|
||||
"children": []
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "linearGradient",
|
||||
"attributes": {
|
||||
"id": "paint5_linear_6296_109592",
|
||||
"x1": "-4.78632",
|
||||
"y1": "4.375",
|
||||
"x2": "16.2164",
|
||||
"y2": "10.4",
|
||||
"gradientUnits": "userSpaceOnUse"
|
||||
},
|
||||
"children": [
|
||||
{
|
||||
"type": "element",
|
||||
"name": "stop",
|
||||
"attributes": {
|
||||
"stop-color": "white",
|
||||
"stop-opacity": "0.12"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "stop",
|
||||
"attributes": {
|
||||
"offset": "1",
|
||||
"stop-color": "white",
|
||||
"stop-opacity": "0.2"
|
||||
},
|
||||
"children": []
|
||||
}
|
||||
]
|
||||
}
|
||||
]
|
||||
}
|
||||
]
|
||||
},
|
||||
"name": "PartnerDark"
|
||||
}
@@ -0,0 +1,16 @@
// GENERATED BY script
// DO NOT EDIT IT MANUALLY

import * as React from 'react'
import data from './PartnerDark.json'
import IconBase from '@/app/components/base/icons/IconBase'
import type { IconBaseProps, IconData } from '@/app/components/base/icons/IconBase'

const Icon = React.forwardRef<React.MutableRefObject<SVGElement>, Omit<IconBaseProps, 'data'>>((
  props,
  ref,
) => <IconBase {...props} ref={ref} data={data as IconData} />)

Icon.displayName = 'PartnerDark'

export default Icon
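For orientation: the generated wrapper above follows a data-driven icon pattern, passing the JSON tree added in this diff to IconBase, which presumably renders it back into SVG elements. A minimal consumer might look like the sketch below; the host component name, import path, and className are illustrative assumptions and not part of this diff.

// Hypothetical consumer of the generated icon component. The import path,
// component name, and className are assumptions for illustration only.
import * as React from 'react'
import PartnerDark from './PartnerDark'

const PluginBadge = () => (
  // PartnerDark spreads its props onto IconBase, which renders the 20x20
  // "Partner" badge described by PartnerDark.json.
  <PartnerDark className='h-4 w-4' />
)

export default PluginBadge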
@@ -0,0 +1,446 @@
|
||||
{
|
||||
"icon": {
|
||||
"type": "element",
|
||||
"isRootNode": true,
|
||||
"name": "svg",
|
||||
"attributes": {
|
||||
"width": "20",
|
||||
"height": "20",
|
||||
"viewBox": "0 0 20 20",
|
||||
"fill": "none",
|
||||
"xmlns": "http://www.w3.org/2000/svg"
|
||||
},
|
||||
"children": [
|
||||
{
|
||||
"type": "element",
|
||||
"name": "g",
|
||||
"attributes": {
|
||||
"id": "Partner"
|
||||
},
|
||||
"children": [
|
||||
{
|
||||
"type": "element",
|
||||
"name": "mask",
|
||||
"attributes": {
|
||||
"id": "mask0_6291_109635",
|
||||
"style": "mask-type:alpha",
|
||||
"maskUnits": "userSpaceOnUse",
|
||||
"x": "1",
|
||||
"y": "0",
|
||||
"width": "18",
|
||||
"height": "20"
|
||||
},
|
||||
"children": [
|
||||
{
|
||||
"type": "element",
|
||||
"name": "g",
|
||||
"attributes": {
|
||||
"id": "Mask"
|
||||
},
|
||||
"children": [
|
||||
{
|
||||
"type": "element",
|
||||
"name": "path",
|
||||
"attributes": {
|
||||
"d": "M7.33333 1.5396C8.30481 0.978718 8.79055 0.698276 9.30696 0.58851C9.76388 0.491388 10.2361 0.491388 10.693 0.58851C11.2094 0.698276 11.6952 0.978718 12.6667 1.5396L15.9936 3.4604C16.9651 4.02128 17.4508 4.30172 17.8041 4.69407C18.1166 5.04121 18.3528 5.45018 18.4971 5.89444C18.6603 6.39655 18.6603 6.95744 18.6603 8.0792V11.9208C18.6603 13.0426 18.6603 13.6034 18.4971 14.1056C18.3528 14.5498 18.1166 14.9588 17.8041 15.3059C17.4508 15.6983 16.9651 15.9787 15.9936 16.5396L12.6667 18.4604C11.6952 19.0213 11.2094 19.3017 10.693 19.4115C10.2361 19.5086 9.76388 19.5086 9.30696 19.4115C8.79055 19.3017 8.30481 19.0213 7.33333 18.4604L4.00641 16.5396C3.03493 15.9787 2.5492 15.6983 2.19593 15.3059C1.88336 14.9588 1.64724 14.5498 1.50289 14.1056C1.33975 13.6034 1.33975 13.0426 1.33975 11.9208V8.0792C1.33975 6.95744 1.33975 6.39655 1.50289 5.89444C1.64724 5.45018 1.88336 5.04121 2.19593 4.69407C2.5492 4.30172 3.03493 4.02128 4.00641 3.4604L7.33333 1.5396Z",
|
||||
"fill": "#F9DBAF"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "path",
|
||||
"attributes": {
|
||||
"d": "M7.33333 1.5396C8.30481 0.978718 8.79055 0.698276 9.30696 0.58851C9.76388 0.491388 10.2361 0.491388 10.693 0.58851C11.2094 0.698276 11.6952 0.978718 12.6667 1.5396L15.9936 3.4604C16.9651 4.02128 17.4508 4.30172 17.8041 4.69407C18.1166 5.04121 18.3528 5.45018 18.4971 5.89444C18.6603 6.39655 18.6603 6.95744 18.6603 8.0792V11.9208C18.6603 13.0426 18.6603 13.6034 18.4971 14.1056C18.3528 14.5498 18.1166 14.9588 17.8041 15.3059C17.4508 15.6983 16.9651 15.9787 15.9936 16.5396L12.6667 18.4604C11.6952 19.0213 11.2094 19.3017 10.693 19.4115C10.2361 19.5086 9.76388 19.5086 9.30696 19.4115C8.79055 19.3017 8.30481 19.0213 7.33333 18.4604L4.00641 16.5396C3.03493 15.9787 2.5492 15.6983 2.19593 15.3059C1.88336 14.9588 1.64724 14.5498 1.50289 14.1056C1.33975 13.6034 1.33975 13.0426 1.33975 11.9208V8.0792C1.33975 6.95744 1.33975 6.39655 1.50289 5.89444C1.64724 5.45018 1.88336 5.04121 2.19593 4.69407C2.5492 4.30172 3.03493 4.02128 4.00641 3.4604L7.33333 1.5396Z",
|
||||
"fill": "url(#paint0_linear_6291_109635)",
|
||||
"fill-opacity": "0.9"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "path",
|
||||
"attributes": {
|
||||
"d": "M7.47222 1.78016C8.45993 1.20991 8.90155 0.958665 9.36471 0.860217C9.78356 0.771189 10.2164 0.771189 10.6353 0.860217C11.0984 0.958665 11.5401 1.20991 12.5278 1.78016L15.8547 3.70096C16.8424 4.27121 17.2808 4.52805 17.5976 4.87994C17.8842 5.19815 18.1006 5.57304 18.2329 5.98028C18.3792 6.43061 18.3825 6.9387 18.3825 8.0792V11.9208C18.3825 13.0613 18.3792 13.5694 18.2329 14.0197C18.1006 14.427 17.8842 14.8018 17.5976 15.1201C17.2808 15.4719 16.8424 15.7288 15.8547 16.299L12.5278 18.2198C11.5401 18.7901 11.0984 19.0413 10.6353 19.1398C10.2164 19.2288 9.78356 19.2288 9.36471 19.1398C8.90155 19.0413 8.45993 18.7901 7.47222 18.2198L4.1453 16.299C3.1576 15.7288 2.7192 15.4719 2.40236 15.1201C2.11584 14.8018 1.89939 14.427 1.76707 14.0197C1.62075 13.5694 1.61752 13.0613 1.61752 11.9208V8.0792C1.61752 6.9387 1.62075 6.43061 1.76707 5.98028C1.89939 5.57304 2.11584 5.19815 2.40236 4.87994C2.7192 4.52805 3.1576 4.27121 4.1453 3.70096L7.47222 1.78016Z",
|
||||
"stroke": "url(#paint1_linear_6291_109635)",
|
||||
"stroke-opacity": "0.8",
|
||||
"stroke-width": "0.555556"
|
||||
},
|
||||
"children": []
|
||||
}
|
||||
]
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "g",
|
||||
"attributes": {
|
||||
"mask": "url(#mask0_6291_109635)"
|
||||
},
|
||||
"children": [
|
||||
{
|
||||
"type": "element",
|
||||
"name": "g",
|
||||
"attributes": {
|
||||
"id": "badge-bg"
|
||||
},
|
||||
"children": [
|
||||
{
|
||||
"type": "element",
|
||||
"name": "path",
|
||||
"attributes": {
|
||||
"d": "M7.33333 1.5396C8.30481 0.978718 8.79055 0.698276 9.30696 0.58851C9.76388 0.491388 10.2361 0.491388 10.693 0.58851C11.2094 0.698276 11.6952 0.978718 12.6667 1.5396L15.9936 3.4604C16.9651 4.02128 17.4508 4.30172 17.8041 4.69407C18.1166 5.04121 18.3528 5.45018 18.4971 5.89444C18.6603 6.39655 18.6603 6.95744 18.6603 8.0792V11.9208C18.6603 13.0426 18.6603 13.6034 18.4971 14.1056C18.3528 14.5498 18.1166 14.9588 17.8041 15.3059C17.4508 15.6983 16.9651 15.9787 15.9936 16.5396L12.6667 18.4604C11.6952 19.0213 11.2094 19.3017 10.693 19.4115C10.2361 19.5086 9.76388 19.5086 9.30696 19.4115C8.79055 19.3017 8.30481 19.0213 7.33333 18.4604L4.00641 16.5396C3.03493 15.9787 2.5492 15.6983 2.19593 15.3059C1.88336 14.9588 1.64724 14.5498 1.50289 14.1056C1.33975 13.6034 1.33975 13.0426 1.33975 11.9208V8.0792C1.33975 6.95744 1.33975 6.39655 1.50289 5.89444C1.64724 5.45018 1.88336 5.04121 2.19593 4.69407C2.5492 4.30172 3.03493 4.02128 4.00641 3.4604L7.33333 1.5396Z",
|
||||
"fill": "#F9DBAF"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "path",
|
||||
"attributes": {
|
||||
"d": "M7.33333 1.5396C8.30481 0.978718 8.79055 0.698276 9.30696 0.58851C9.76388 0.491388 10.2361 0.491388 10.693 0.58851C11.2094 0.698276 11.6952 0.978718 12.6667 1.5396L15.9936 3.4604C16.9651 4.02128 17.4508 4.30172 17.8041 4.69407C18.1166 5.04121 18.3528 5.45018 18.4971 5.89444C18.6603 6.39655 18.6603 6.95744 18.6603 8.0792V11.9208C18.6603 13.0426 18.6603 13.6034 18.4971 14.1056C18.3528 14.5498 18.1166 14.9588 17.8041 15.3059C17.4508 15.6983 16.9651 15.9787 15.9936 16.5396L12.6667 18.4604C11.6952 19.0213 11.2094 19.3017 10.693 19.4115C10.2361 19.5086 9.76388 19.5086 9.30696 19.4115C8.79055 19.3017 8.30481 19.0213 7.33333 18.4604L4.00641 16.5396C3.03493 15.9787 2.5492 15.6983 2.19593 15.3059C1.88336 14.9588 1.64724 14.5498 1.50289 14.1056C1.33975 13.6034 1.33975 13.0426 1.33975 11.9208V8.0792C1.33975 6.95744 1.33975 6.39655 1.50289 5.89444C1.64724 5.45018 1.88336 5.04121 2.19593 4.69407C2.5492 4.30172 3.03493 4.02128 4.00641 3.4604L7.33333 1.5396Z",
|
||||
"fill": "url(#paint2_linear_6291_109635)",
|
||||
"fill-opacity": "0.9"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "path",
|
||||
"attributes": {
|
||||
"d": "M7.58333 1.97261C8.58402 1.39487 8.99036 1.16698 9.41092 1.07758C9.7993 0.99503 10.2007 0.99503 10.5891 1.07758C11.0096 1.16698 11.416 1.39487 12.4167 1.97261L15.7436 3.89341C16.7443 4.47116 17.1448 4.70911 17.4325 5.02863C17.6982 5.3237 17.8989 5.67133 18.0216 6.04895C18.1544 6.45786 18.1603 6.92371 18.1603 8.0792V11.9208C18.1603 13.0763 18.1544 13.5421 18.0216 13.951C17.8989 14.3287 17.6982 14.6763 17.4325 14.9714C17.1448 15.2909 16.7443 15.5288 15.7436 16.1066L12.4167 18.0274C11.416 18.6051 11.0096 18.833 10.5891 18.9224C10.2007 19.005 9.7993 19.005 9.41092 18.9224C8.99036 18.833 8.58402 18.6051 7.58333 18.0274L4.25641 16.1066C3.25572 15.5288 2.8552 15.2909 2.5675 14.9714C2.30182 14.6763 2.10112 14.3287 1.97842 13.951C1.84556 13.5421 1.83975 13.0763 1.83975 11.9208V8.0792C1.83975 6.92371 1.84556 6.45786 1.97842 6.04895C2.10112 5.67133 2.30182 5.3237 2.5675 5.02863C2.8552 4.70911 3.25572 4.47116 4.25641 3.89341L7.58333 1.97261Z",
|
||||
"stroke": "url(#paint3_linear_6291_109635)",
|
||||
"stroke-opacity": "0.8"
|
||||
},
|
||||
"children": []
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "g",
|
||||
"attributes": {
|
||||
"id": "handshake",
|
||||
"filter": "url(#filter0_d_6291_109635)"
|
||||
},
|
||||
"children": [
|
||||
{
|
||||
"type": "element",
|
||||
"name": "path",
|
||||
"attributes": {
|
||||
"d": "M11.0969 9.64852C10.895 9.44652 10.5675 9.44652 10.3656 9.64852L9.99991 10.0142C9.59596 10.4181 8.94109 10.4181 8.53717 10.0142C8.13325 9.61025 8.13325 8.95537 8.53717 8.55146L11.4491 5.63879C12.5371 5.39265 13.7238 5.69313 14.5709 6.54022C15.8221 7.79139 15.8807 9.7835 14.7469 11.1041L13.6567 12.2083L11.0969 9.64852ZM5.42889 6.54022C6.55286 5.41625 8.27475 5.25463 9.57067 6.05534L7.80581 7.8201C6.99797 8.62794 6.99797 9.93771 7.80581 10.7456C8.58917 11.5289 9.8445 11.5526 10.6564 10.8168L10.7313 10.7456L12.9253 12.9396L10.7313 15.1337C10.3273 15.5376 9.67245 15.5376 9.26855 15.1337L5.42889 11.294C4.11615 9.98131 4.11615 7.85295 5.42889 6.54022Z",
|
||||
"fill": "url(#paint4_linear_6291_109635)",
|
||||
"shape-rendering": "crispEdges"
|
||||
},
|
||||
"children": []
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "path",
|
||||
"attributes": {
|
||||
"id": "highlight",
|
||||
"opacity": "0.5",
|
||||
"d": "M0 0H15.5556L5.26663 20H0V0Z",
|
||||
"fill": "url(#paint5_linear_6291_109635)"
|
||||
},
|
||||
"children": []
|
||||
}
|
||||
]
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "defs",
|
||||
"attributes": {},
|
||||
"children": [
|
||||
{
|
||||
"type": "element",
|
||||
"name": "filter",
|
||||
"attributes": {
|
||||
"id": "filter0_d_6291_109635",
|
||||
"x": "3.94434",
|
||||
"y": "5.30566",
|
||||
"width": "12.1111",
|
||||
"height": "10.8809",
|
||||
"filterUnits": "userSpaceOnUse",
|
||||
"color-interpolation-filters": "sRGB"
|
||||
},
|
||||
"children": [
|
||||
{
|
||||
"type": "element",
|
||||
"name": "feFlood",
|
||||
"attributes": {
|
||||
"flood-opacity": "0",
|
||||
"result": "BackgroundImageFix"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "feColorMatrix",
|
||||
"attributes": {
|
||||
"in": "SourceAlpha",
|
||||
"type": "matrix",
|
||||
"values": "0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 127 0",
|
||||
"result": "hardAlpha"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "feOffset",
|
||||
"attributes": {
|
||||
"dy": "0.25"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "feGaussianBlur",
|
||||
"attributes": {
|
||||
"stdDeviation": "0.25"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "feComposite",
|
||||
"attributes": {
|
||||
"in2": "hardAlpha",
|
||||
"operator": "out"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "feColorMatrix",
|
||||
"attributes": {
|
||||
"type": "matrix",
|
||||
"values": "0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0.2 0"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "feBlend",
|
||||
"attributes": {
|
||||
"mode": "normal",
|
||||
"in2": "BackgroundImageFix",
|
||||
"result": "effect1_dropShadow_6291_109635"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "feBlend",
|
||||
"attributes": {
|
||||
"mode": "normal",
|
||||
"in": "SourceGraphic",
|
||||
"in2": "effect1_dropShadow_6291_109635",
|
||||
"result": "shape"
|
||||
},
|
||||
"children": []
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "linearGradient",
|
||||
"attributes": {
|
||||
"id": "paint0_linear_6291_109635",
|
||||
"x1": "0",
|
||||
"y1": "0",
|
||||
"x2": "22.6412",
|
||||
"y2": "1.78551",
|
||||
"gradientUnits": "userSpaceOnUse"
|
||||
},
|
||||
"children": [
|
||||
{
|
||||
"type": "element",
|
||||
"name": "stop",
|
||||
"attributes": {
|
||||
"stop-color": "#FF692E"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "stop",
|
||||
"attributes": {
|
||||
"offset": "1",
|
||||
"stop-color": "#E04F16"
|
||||
},
|
||||
"children": []
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "linearGradient",
|
||||
"attributes": {
|
||||
"id": "paint1_linear_6291_109635",
|
||||
"x1": "8.55422",
|
||||
"y1": "-1.28187e-07",
|
||||
"x2": "19.7802",
|
||||
"y2": "12.7346",
|
||||
"gradientUnits": "userSpaceOnUse"
|
||||
},
|
||||
"children": [
|
||||
{
|
||||
"type": "element",
|
||||
"name": "stop",
|
||||
"attributes": {
|
||||
"stop-color": "white",
|
||||
"stop-opacity": "0.95"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "stop",
|
||||
"attributes": {
|
||||
"offset": "1",
|
||||
"stop-color": "#E62E05"
|
||||
},
|
||||
"children": []
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "linearGradient",
|
||||
"attributes": {
|
||||
"id": "paint2_linear_6291_109635",
|
||||
"x1": "0",
|
||||
"y1": "0",
|
||||
"x2": "22.6412",
|
||||
"y2": "1.78551",
|
||||
"gradientUnits": "userSpaceOnUse"
|
||||
},
|
||||
"children": [
|
||||
{
|
||||
"type": "element",
|
||||
"name": "stop",
|
||||
"attributes": {
|
||||
"stop-color": "#FF692E"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "stop",
|
||||
"attributes": {
|
||||
"offset": "1",
|
||||
"stop-color": "#E04F16"
|
||||
},
|
||||
"children": []
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "linearGradient",
|
||||
"attributes": {
|
||||
"id": "paint3_linear_6291_109635",
|
||||
"x1": "8.55422",
|
||||
"y1": "-1.28187e-07",
|
||||
"x2": "19.7802",
|
||||
"y2": "12.7346",
|
||||
"gradientUnits": "userSpaceOnUse"
|
||||
},
|
||||
"children": [
|
||||
{
|
||||
"type": "element",
|
||||
"name": "stop",
|
||||
"attributes": {
|
||||
"stop-color": "white",
|
||||
"stop-opacity": "0.95"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "stop",
|
||||
"attributes": {
|
||||
"offset": "1",
|
||||
"stop-color": "#E62E05"
|
||||
},
|
||||
"children": []
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "linearGradient",
|
||||
"attributes": {
|
||||
"id": "paint4_linear_6291_109635",
|
||||
"x1": "9.99989",
|
||||
"y1": "5.55566",
|
||||
"x2": "9.99989",
|
||||
"y2": "15.4366",
|
||||
"gradientUnits": "userSpaceOnUse"
|
||||
},
|
||||
"children": [
|
||||
{
|
||||
"type": "element",
|
||||
"name": "stop",
|
||||
"attributes": {
|
||||
"stop-color": "white"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "stop",
|
||||
"attributes": {
|
||||
"offset": "1",
|
||||
"stop-color": "white",
|
||||
"stop-opacity": "0.9"
|
||||
},
|
||||
"children": []
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "linearGradient",
|
||||
"attributes": {
|
||||
"id": "paint5_linear_6291_109635",
|
||||
"x1": "-4.78632",
|
||||
"y1": "4.375",
|
||||
"x2": "16.2164",
|
||||
"y2": "10.4",
|
||||
"gradientUnits": "userSpaceOnUse"
|
||||
},
|
||||
"children": [
|
||||
{
|
||||
"type": "element",
|
||||
"name": "stop",
|
||||
"attributes": {
|
||||
"stop-color": "white",
|
||||
"stop-opacity": "0.12"
|
||||
},
|
||||
"children": []
|
||||
},
|
||||
{
|
||||
"type": "element",
|
||||
"name": "stop",
|
||||
"attributes": {
|
||||
"offset": "1",
|
||||
"stop-color": "white",
|
||||
"stop-opacity": "0.3"
|
||||
},
|
||||
"children": []
|
||||
}
|
||||
]
|
||||
}
|
||||
]
|
||||
}
|
||||
]
|
||||
},
|
||||
"name": "PartnerLight"
|
||||
}
Some files were not shown because too many files have changed in this diff.