Mirror of https://github.com/langgenius/dify.git (synced 2026-03-29 01:49:57 +08:00)

Commit: Merge branch 'main' into feat/mcp
@@ -226,6 +226,11 @@ Deploy Dify to AWS with [CDK](https://aws.amazon.com/cdk/)

- [AWS CDK by @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)

#### Using Alibaba Cloud Computing Nest

Quickly deploy Dify to Alibaba cloud with [Alibaba Cloud Computing Nest](https://computenest.console.aliyun.com/service/instance/create/default?type=user&ServiceName=Dify%E7%A4%BE%E5%8C%BA%E7%89%88)

## Contributing

For those who'd like to contribute code, see our [Contribution Guide](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
@@ -209,6 +209,9 @@ docker compose up -d

- [AWS CDK بواسطة @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)

#### استخدام Alibaba Cloud للنشر

[بسرعة نشر Dify إلى سحابة علي بابا مع عش الحوسبة السحابية علي بابا](https://computenest.console.aliyun.com/service/instance/create/default?type=user&ServiceName=Dify%E7%A4%BE%E5%8C%BA%E7%89%88)

## المساهمة

لأولئك الذين يرغبون في المساهمة، انظر إلى [دليل المساهمة](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md) لدينا.
@@ -225,6 +225,11 @@ GitHub-এ ডিফাইকে স্টার দিয়ে রাখুন

- [AWS CDK by @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)

#### Alibaba Cloud ব্যবহার করে ডিপ্লয়

[Alibaba Cloud Computing Nest](https://computenest.console.aliyun.com/service/instance/create/default?type=user&ServiceName=Dify%E7%A4%BE%E5%8C%BA%E7%89%88)

## Contributing

যারা কোড অবদান রাখতে চান, তাদের জন্য আমাদের [অবদান নির্দেশিকা](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md) দেখুন।
@@ -221,6 +221,11 @@ docker compose up -d

##### AWS

- [AWS CDK by @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)

#### 使用 阿里云计算巢 部署

使用 [阿里云计算巢](https://computenest.console.aliyun.com/service/instance/create/default?type=user&ServiceName=Dify%E7%A4%BE%E5%8C%BA%E7%89%88) 将 Dify 一键部署到 阿里云

## Star History

[](https://star-history.com/#langgenius/dify&Date)
@@ -221,6 +221,11 @@ Bereitstellung von Dify auf AWS mit [CDK](https://aws.amazon.com/cdk/)

##### AWS

- [AWS CDK by @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)

#### Alibaba Cloud

[Alibaba Cloud Computing Nest](https://computenest.console.aliyun.com/service/instance/create/default?type=user&ServiceName=Dify%E7%A4%BE%E5%8C%BA%E7%89%88)

## Contributing

Falls Sie Code beitragen möchten, lesen Sie bitte unseren [Contribution Guide](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md). Gleichzeitig bitten wir Sie, Dify zu unterstützen, indem Sie es in den sozialen Medien teilen und auf Veranstaltungen und Konferenzen präsentieren.
@@ -221,6 +221,10 @@ Despliegue Dify en AWS usando [CDK](https://aws.amazon.com/cdk/)

##### AWS

- [AWS CDK por @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)

#### Alibaba Cloud

[Alibaba Cloud Computing Nest](https://computenest.console.aliyun.com/service/instance/create/default?type=user&ServiceName=Dify%E7%A4%BE%E5%8C%BA%E7%89%88)

## Contribuir

Para aquellos que deseen contribuir con código, consulten nuestra [Guía de contribución](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
@@ -219,6 +219,11 @@ Déployez Dify sur AWS en utilisant [CDK](https://aws.amazon.com/cdk/)

##### AWS

- [AWS CDK par @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)

#### Alibaba Cloud

[Alibaba Cloud Computing Nest](https://computenest.console.aliyun.com/service/instance/create/default?type=user&ServiceName=Dify%E7%A4%BE%E5%8C%BA%E7%89%88)

## Contribuer

Pour ceux qui souhaitent contribuer du code, consultez notre [Guide de contribution](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
@@ -220,6 +220,10 @@ docker compose up -d

##### AWS

- [@KevinZhaoによるAWS CDK](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)

#### Alibaba Cloud

[Alibaba Cloud Computing Nest](https://computenest.console.aliyun.com/service/instance/create/default?type=user&ServiceName=Dify%E7%A4%BE%E5%8C%BA%E7%89%88)

## 貢献

コードに貢献したい方は、[Contribution Guide](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md)を参照してください。
@@ -219,6 +219,11 @@ wa'logh nIqHom neH ghun deployment toy'wI' [CDK](https://aws.amazon.com/cdk/) lo

##### AWS

- [AWS CDK qachlot @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)

#### Alibaba Cloud

[Alibaba Cloud Computing Nest](https://computenest.console.aliyun.com/service/instance/create/default?type=user&ServiceName=Dify%E7%A4%BE%E5%8C%BA%E7%89%88)

## Contributing

For those who'd like to contribute code, see our [Contribution Guide](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
@@ -213,6 +213,11 @@ Dify를 Kubernetes에 배포하고 프리미엄 스케일링 설정을 구성했

##### AWS

- [KevinZhao의 AWS CDK](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)

#### Alibaba Cloud

[Alibaba Cloud Computing Nest](https://computenest.console.aliyun.com/service/instance/create/default?type=user&ServiceName=Dify%E7%A4%BE%E5%8C%BA%E7%89%88)

## 기여

코드에 기여하고 싶은 분들은 [기여 가이드](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md)를 참조하세요.
@@ -218,6 +218,11 @@ Implante o Dify na AWS usando [CDK](https://aws.amazon.com/cdk/)

##### AWS

- [AWS CDK por @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)

#### Alibaba Cloud

[Alibaba Cloud Computing Nest](https://computenest.console.aliyun.com/service/instance/create/default?type=user&ServiceName=Dify%E7%A4%BE%E5%8C%BA%E7%89%88)

## Contribuindo

Para aqueles que desejam contribuir com código, veja nosso [Guia de Contribuição](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
@@ -219,6 +219,11 @@ Uvedite Dify v AWS z uporabo [CDK](https://aws.amazon.com/cdk/)

##### AWS

- [AWS CDK by @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)

#### Alibaba Cloud

[Alibaba Cloud Computing Nest](https://computenest.console.aliyun.com/service/instance/create/default?type=user&ServiceName=Dify%E7%A4%BE%E5%8C%BA%E7%89%88)

## Prispevam

Za tiste, ki bi radi prispevali kodo, si oglejte naš [vodnik za prispevke](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md). Hkrati vas prosimo, da podprete Dify tako, da ga delite na družbenih medijih ter na dogodkih in konferencah.
@@ -212,6 +212,11 @@ Dify'ı bulut platformuna tek tıklamayla dağıtın [terraform](https://www.ter

##### AWS

- [AWS CDK tarafından @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)

#### Alibaba Cloud

[Alibaba Cloud Computing Nest](https://computenest.console.aliyun.com/service/instance/create/default?type=user&ServiceName=Dify%E7%A4%BE%E5%8C%BA%E7%89%88)

## Katkıda Bulunma

Kod katkısında bulunmak isteyenler için [Katkı Kılavuzumuza](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md) bakabilirsiniz.
@@ -224,6 +224,11 @@ Dify 的所有功能都提供相應的 API,因此您可以輕鬆地將 Dify

- [由 @KevinZhao 提供的 AWS CDK](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)

#### 使用 阿里云计算巢進行部署

[阿里云](https://computenest.console.aliyun.com/service/instance/create/default?type=user&ServiceName=Dify%E7%A4%BE%E5%8C%BA%E7%89%88)

## 貢獻

對於想要貢獻程式碼的開發者,請參閱我們的[貢獻指南](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md)。
@@ -214,6 +214,12 @@ Triển khai Dify trên AWS bằng [CDK](https://aws.amazon.com/cdk/)

##### AWS

- [AWS CDK bởi @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)

#### Alibaba Cloud

[Alibaba Cloud Computing Nest](https://computenest.console.aliyun.com/service/instance/create/default?type=user&ServiceName=Dify%E7%A4%BE%E5%8C%BA%E7%89%88)

## Đóng góp

Đối với những người muốn đóng góp mã, xem [Hướng dẫn Đóng góp](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md) của chúng tôi.
@@ -56,8 +56,7 @@ class InsertExploreAppListApi(Resource):
        parser.add_argument("position", type=int, required=True, nullable=False, location="json")
        args = parser.parse_args()

        with Session(db.engine) as session:
            app = session.execute(select(App).filter(App.id == args["app_id"])).scalar_one_or_none()
        app = db.session.execute(select(App).filter(App.id == args["app_id"])).scalar_one_or_none()
        if not app:
            raise NotFound(f"App '{args['app_id']}' is not found")
@@ -78,38 +77,38 @@ class InsertExploreAppListApi(Resource):
            select(RecommendedApp).filter(RecommendedApp.app_id == args["app_id"])
        ).scalar_one_or_none()

        if not recommended_app:
            recommended_app = RecommendedApp(
                app_id=app.id,
                description=desc,
                copyright=copy_right,
                privacy_policy=privacy_policy,
                custom_disclaimer=custom_disclaimer,
                language=args["language"],
                category=args["category"],
                position=args["position"],
            )

            db.session.add(recommended_app)

            app.is_public = True
            db.session.commit()

            return {"result": "success"}, 201
        else:
            recommended_app.description = desc
            recommended_app.copyright = copy_right
            recommended_app.privacy_policy = privacy_policy
            recommended_app.custom_disclaimer = custom_disclaimer
            recommended_app.language = args["language"]
            recommended_app.category = args["category"]
            recommended_app.position = args["position"]

            app.is_public = True

            db.session.commit()

            return {"result": "success"}, 200


class InsertExploreAppApi(Resource):
@@ -43,7 +43,6 @@ from core.model_runtime.errors.invoke import InvokeAuthorizationError
from core.plugin.impl.exc import PluginDaemonClientSideError
from core.rag.extractor.entity.extract_setting import ExtractSetting
from extensions.ext_database import db
from extensions.ext_redis import redis_client
from fields.document_fields import (
    dataset_and_document_fields,
    document_fields,
@@ -54,8 +53,6 @@ from libs.login import login_required
from models import Dataset, DatasetProcessRule, Document, DocumentSegment, UploadFile
from services.dataset_service import DatasetService, DocumentService
from services.entities.knowledge_entities.knowledge_entities import KnowledgeConfig
from tasks.add_document_to_index_task import add_document_to_index_task
from tasks.remove_document_from_index_task import remove_document_from_index_task


class DocumentResource(Resource):
@@ -862,77 +859,16 @@ class DocumentStatusApi(DocumentResource):
        DatasetService.check_dataset_permission(dataset, current_user)

        document_ids = request.args.getlist("document_id")
        for document_id in document_ids:
            document = self.get_document(dataset_id, document_id)

            indexing_cache_key = "document_{}_indexing".format(document.id)
            cache_result = redis_client.get(indexing_cache_key)
            if cache_result is not None:
                raise InvalidActionError(f"Document:{document.name} is being indexed, please try again later")
        try:
            DocumentService.batch_update_document_status(dataset, document_ids, action, current_user)
        except services.errors.document.DocumentIndexingError as e:
            raise InvalidActionError(str(e))
        except ValueError as e:
            raise InvalidActionError(str(e))
        except NotFound as e:
            raise NotFound(str(e))

            if action == "enable":
                if document.enabled:
                    continue
                document.enabled = True
                document.disabled_at = None
                document.disabled_by = None
                document.updated_at = datetime.now(UTC).replace(tzinfo=None)
                db.session.commit()

                # Set cache to prevent indexing the same document multiple times
                redis_client.setex(indexing_cache_key, 600, 1)

                add_document_to_index_task.delay(document_id)

            elif action == "disable":
                if not document.completed_at or document.indexing_status != "completed":
                    raise InvalidActionError(f"Document: {document.name} is not completed.")
                if not document.enabled:
                    continue

                document.enabled = False
                document.disabled_at = datetime.now(UTC).replace(tzinfo=None)
                document.disabled_by = current_user.id
                document.updated_at = datetime.now(UTC).replace(tzinfo=None)
                db.session.commit()

                # Set cache to prevent indexing the same document multiple times
                redis_client.setex(indexing_cache_key, 600, 1)

                remove_document_from_index_task.delay(document_id)

            elif action == "archive":
                if document.archived:
                    continue

                document.archived = True
                document.archived_at = datetime.now(UTC).replace(tzinfo=None)
                document.archived_by = current_user.id
                document.updated_at = datetime.now(UTC).replace(tzinfo=None)
                db.session.commit()

                if document.enabled:
                    # Set cache to prevent indexing the same document multiple times
                    redis_client.setex(indexing_cache_key, 600, 1)

                    remove_document_from_index_task.delay(document_id)

            elif action == "un_archive":
                if not document.archived:
                    continue
                document.archived = False
                document.archived_at = None
                document.archived_by = None
                document.updated_at = datetime.now(UTC).replace(tzinfo=None)
                db.session.commit()

                # Set cache to prevent indexing the same document multiple times
                redis_client.setex(indexing_cache_key, 600, 1)

                add_document_to_index_task.delay(document_id)

            else:
                raise InvalidActionError()
        return {"result": "success"}, 200
@@ -4,7 +4,7 @@ from werkzeug.exceptions import Forbidden, NotFound

import services.dataset_service
from controllers.service_api import api
from controllers.service_api.dataset.error import DatasetInUseError, DatasetNameDuplicateError
from controllers.service_api.dataset.error import DatasetInUseError, DatasetNameDuplicateError, InvalidActionError
from controllers.service_api.wraps import (
    DatasetApiResource,
    cloud_edition_billing_rate_limit_check,
@@ -17,7 +17,7 @@ from fields.dataset_fields import dataset_detail_fields
from fields.tag_fields import tag_fields
from libs.login import current_user
from models.dataset import Dataset, DatasetPermissionEnum
from services.dataset_service import DatasetPermissionService, DatasetService
from services.dataset_service import DatasetPermissionService, DatasetService, DocumentService
from services.entities.knowledge_entities.knowledge_entities import RetrievalModel
from services.tag_service import TagService
@@ -329,6 +329,56 @@ class DatasetApi(DatasetApiResource):
            raise DatasetInUseError()


class DocumentStatusApi(DatasetApiResource):
    """Resource for batch document status operations."""

    def patch(self, tenant_id, dataset_id, action):
        """
        Batch update document status.

        Args:
            tenant_id: tenant id
            dataset_id: dataset id
            action: action to perform (enable, disable, archive, un_archive)

        Returns:
            dict: A dictionary with a key 'result' and a value 'success'
            int: HTTP status code 200 indicating that the operation was successful.

        Raises:
            NotFound: If the dataset with the given ID does not exist.
            Forbidden: If the user does not have permission.
            InvalidActionError: If the action is invalid or cannot be performed.
        """
        dataset_id_str = str(dataset_id)
        dataset = DatasetService.get_dataset(dataset_id_str)

        if dataset is None:
            raise NotFound("Dataset not found.")

        # Check user's permission
        try:
            DatasetService.check_dataset_permission(dataset, current_user)
        except services.errors.account.NoPermissionError as e:
            raise Forbidden(str(e))

        # Check dataset model setting
        DatasetService.check_dataset_model_setting(dataset)

        # Get document IDs from request body
        data = request.get_json()
        document_ids = data.get("document_ids", [])

        try:
            DocumentService.batch_update_document_status(dataset, document_ids, action, current_user)
        except services.errors.document.DocumentIndexingError as e:
            raise InvalidActionError(str(e))
        except ValueError as e:
            raise InvalidActionError(str(e))

        return {"result": "success"}, 200


class DatasetTagsApi(DatasetApiResource):
    @validate_dataset_token
    @marshal_with(tag_fields)
@@ -457,6 +507,7 @@ class DatasetTagsBindingStatusApi(DatasetApiResource):

api.add_resource(DatasetListApi, "/datasets")
api.add_resource(DatasetApi, "/datasets/<uuid:dataset_id>")
api.add_resource(DocumentStatusApi, "/datasets/<uuid:dataset_id>/documents/status/<string:action>")
api.add_resource(DatasetTagsApi, "/datasets/tags")
api.add_resource(DatasetTagBindingApi, "/datasets/tags/binding")
api.add_resource(DatasetTagUnbindingApi, "/datasets/tags/unbinding")
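Given the route registered above, a client could exercise the batch endpoint roughly as follows (a sketch: the base URL and API key are placeholders, and the request is only built, not sent):

```python
import json
from urllib import request

API_BASE = "https://api.example.com/v1"  # placeholder base URL
API_KEY = "dataset-xxxxx"                # placeholder Dataset API key

def build_status_request(dataset_id: str, action: str, document_ids: list[str]) -> request.Request:
    """Build a PATCH request against /datasets/<id>/documents/status/<action>."""
    url = f"{API_BASE}/datasets/{dataset_id}/documents/status/{action}"
    body = json.dumps({"document_ids": document_ids}).encode("utf-8")
    return request.Request(
        url,
        data=body,
        method="PATCH",
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )
```

Sending the request with `urllib.request.urlopen` would then hit the `DocumentStatusApi.patch` handler shown in the diff.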
@@ -68,22 +68,17 @@ class MarkdownExtractor(BaseExtractor):
                continue
            header_match = re.match(r"^#+\s", line)
            if header_match:
                if current_header is not None:
                    markdown_tups.append((current_header, current_text))
                markdown_tups.append((current_header, current_text))
                current_header = line
                current_text = ""
            else:
                current_text += line + "\n"
        markdown_tups.append((current_header, current_text))

        if current_header is not None:
            # pass linting, assert keys are defined
            markdown_tups = [
                (re.sub(r"#", "", cast(str, key)).strip(), re.sub(r"<.*?>", "", value)) for key, value in markdown_tups
            ]
        else:
            markdown_tups = [(key, re.sub("\n", "", value)) for key, value in markdown_tups]
        markdown_tups = [
            (re.sub(r"#", "", cast(str, key)).strip() if key else None, re.sub(r"<.*?>", "", value))
            for key, value in markdown_tups
        ]

        return markdown_tups
@@ -8,5 +8,4 @@ EMPTY_VALUE_MAPPING = {
    SegmentType.ARRAY_STRING: [],
    SegmentType.ARRAY_NUMBER: [],
    SegmentType.ARRAY_OBJECT: [],
    SegmentType.ARRAY_FILE: [],
}
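A mapping like this gives each array segment type an empty default. A minimal sketch of how such a table might be consumed (with a stand-in enum; the copy-on-read helper is an assumption, not Dify's code):

```python
from enum import Enum

class SegmentType(Enum):
    ARRAY_STRING = "array[string]"
    ARRAY_NUMBER = "array[number]"
    ARRAY_OBJECT = "array[object]"
    ARRAY_FILE = "array[file]"

EMPTY_VALUE_MAPPING = {
    SegmentType.ARRAY_STRING: [],
    SegmentType.ARRAY_NUMBER: [],
    SegmentType.ARRAY_OBJECT: [],
    SegmentType.ARRAY_FILE: [],
}

def empty_value_for(seg_type: SegmentType):
    # Return a fresh copy so callers cannot mutate the shared default lists.
    return list(EMPTY_VALUE_MAPPING[seg_type])
```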
@@ -1,6 +1,5 @@
from typing import Any

from core.file import File
from core.variables import SegmentType

from .enums import Operation
@@ -86,8 +85,6 @@ def is_input_value_valid(*, variable_type: SegmentType, operation: Operation, va
            return isinstance(value, int | float)
        case SegmentType.ARRAY_OBJECT if operation == Operation.APPEND:
            return isinstance(value, dict)
        case SegmentType.ARRAY_FILE if operation == Operation.APPEND:
            return isinstance(value, File)

        # Array & Extend / Overwrite
        case SegmentType.ARRAY_ANY if operation in {Operation.EXTEND, Operation.OVER_WRITE}:
@@ -98,8 +95,6 @@ def is_input_value_valid(*, variable_type: SegmentType, operation: Operation, va
            return isinstance(value, list) and all(isinstance(item, int | float) for item in value)
        case SegmentType.ARRAY_OBJECT if operation in {Operation.EXTEND, Operation.OVER_WRITE}:
            return isinstance(value, list) and all(isinstance(item, dict) for item in value)
        case SegmentType.ARRAY_FILE if operation in {Operation.EXTEND, Operation.OVER_WRITE}:
            return isinstance(value, list) and all(isinstance(item, File) for item in value)

        case _:
            return False
@@ -101,8 +101,6 @@ def _build_variable_from_mapping(*, mapping: Mapping[str, Any], selector: Sequen
            result = ArrayNumberVariable.model_validate(mapping)
        case SegmentType.ARRAY_OBJECT if isinstance(value, list):
            result = ArrayObjectVariable.model_validate(mapping)
        case SegmentType.ARRAY_FILE if isinstance(value, list):
            result = ArrayFileVariable.model_validate(mapping)
        case _:
            raise VariableError(f"not supported value type {value_type}")
    if result.size > dify_config.MAX_VARIABLE_SIZE:
@@ -59,6 +59,7 @@ from services.external_knowledge_service import ExternalDatasetService
from services.feature_service import FeatureModel, FeatureService
from services.tag_service import TagService
from services.vector_service import VectorService
from tasks.add_document_to_index_task import add_document_to_index_task
from tasks.batch_clean_document_task import batch_clean_document_task
from tasks.clean_notion_document_task import clean_notion_document_task
from tasks.deal_dataset_vector_index_task import deal_dataset_vector_index_task
@@ -70,6 +71,7 @@ from tasks.document_indexing_update_task import document_indexing_update_task
from tasks.duplicate_document_indexing_task import duplicate_document_indexing_task
from tasks.enable_segments_to_index_task import enable_segments_to_index_task
from tasks.recover_document_indexing_task import recover_document_indexing_task
from tasks.remove_document_from_index_task import remove_document_from_index_task
from tasks.retry_document_indexing_task import retry_document_indexing_task
from tasks.sync_website_document_indexing_task import sync_website_document_indexing_task
@@ -434,7 +436,7 @@ class DatasetService:
            raise ValueError(ex.description)

        filtered_data["updated_by"] = user.id
        filtered_data["updated_at"] = datetime.datetime.now()
        filtered_data["updated_at"] = datetime.datetime.now(datetime.UTC).replace(tzinfo=None)

        # update Retrieval model
        filtered_data["retrieval_model"] = data["retrieval_model"]
@@ -976,12 +978,17 @@ class DocumentService:
            process_rule = knowledge_config.process_rule
            if process_rule:
                if process_rule.mode in ("custom", "hierarchical"):
                    if process_rule.rules:
                        dataset_process_rule = DatasetProcessRule(
                            dataset_id=dataset.id,
                            mode=process_rule.mode,
                            rules=process_rule.rules.model_dump_json() if process_rule.rules else None,
                            created_by=account.id,
                        )
                    else:
                        dataset_process_rule = dataset.latest_process_rule
                        if not dataset_process_rule:
                            raise ValueError("No process rule found.")
                elif process_rule.mode == "automatic":
                    dataset_process_rule = DatasetProcessRule(
                        dataset_id=dataset.id,
@@ -1603,6 +1610,191 @@ class DocumentService:
        if not isinstance(args["process_rule"]["rules"]["segmentation"]["max_tokens"], int):
            raise ValueError("Process rule segmentation max_tokens is invalid")

    @staticmethod
    def batch_update_document_status(dataset: Dataset, document_ids: list[str], action: str, user):
        """
        Batch update document status.

        Args:
            dataset (Dataset): The dataset object
            document_ids (list[str]): List of document IDs to update
            action (str): Action to perform (enable, disable, archive, un_archive)
            user: Current user performing the action

        Raises:
            DocumentIndexingError: If document is being indexed or not in correct state
            ValueError: If action is invalid
        """
        if not document_ids:
            return

        # Early validation of action parameter
        valid_actions = ["enable", "disable", "archive", "un_archive"]
        if action not in valid_actions:
            raise ValueError(f"Invalid action: {action}. Must be one of {valid_actions}")

        documents_to_update = []

        # First pass: validate all documents and prepare updates
        for document_id in document_ids:
            document = DocumentService.get_document(dataset.id, document_id)
            if not document:
                continue

            # Check if document is being indexed
            indexing_cache_key = f"document_{document.id}_indexing"
            cache_result = redis_client.get(indexing_cache_key)
            if cache_result is not None:
                raise DocumentIndexingError(f"Document:{document.name} is being indexed, please try again later")

            # Prepare update based on action
            update_info = DocumentService._prepare_document_status_update(document, action, user)
            if update_info:
                documents_to_update.append(update_info)

        # Second pass: apply all updates in a single transaction
        if documents_to_update:
            try:
                for update_info in documents_to_update:
                    document = update_info["document"]
                    updates = update_info["updates"]

                    # Apply updates to the document
                    for field, value in updates.items():
                        setattr(document, field, value)

                    db.session.add(document)

                # Batch commit all changes
                db.session.commit()
            except Exception as e:
                # Rollback on any error
                db.session.rollback()
                raise e

            # Execute async tasks and set Redis cache after successful commit
            # propagation_error is used to capture any errors for submitting async task execution
            propagation_error = None
            for update_info in documents_to_update:
                try:
                    # Execute async tasks after successful commit
                    if update_info["async_task"]:
                        task_info = update_info["async_task"]
                        task_func = task_info["function"]
                        task_args = task_info["args"]
                        task_func.delay(*task_args)
                except Exception as e:
                    # Log the error but do not rollback the transaction
                    logging.exception(f"Error executing async task for document {update_info['document'].id}")
                    # don't raise the error immediately, but capture it for later
                    propagation_error = e
                try:
                    # Set Redis cache if needed after successful commit
                    if update_info["set_cache"]:
                        document = update_info["document"]
                        indexing_cache_key = f"document_{document.id}_indexing"
                        redis_client.setex(indexing_cache_key, 600, 1)
                except Exception as e:
                    # Log the error but do not rollback the transaction
                    logging.exception(f"Error setting cache for document {update_info['document'].id}")
            # Raise any propagation error after all updates
            if propagation_error:
                raise propagation_error

    @staticmethod
    def _prepare_document_status_update(document, action: str, user):
        """
        Prepare document status update information.

        Args:
            document: Document object to update
            action: Action to perform
            user: Current user

        Returns:
            dict: Update information or None if no update needed
        """
        now = datetime.datetime.now(datetime.UTC).replace(tzinfo=None)

        if action == "enable":
            return DocumentService._prepare_enable_update(document, now)
        elif action == "disable":
            return DocumentService._prepare_disable_update(document, user, now)
        elif action == "archive":
            return DocumentService._prepare_archive_update(document, user, now)
        elif action == "un_archive":
            return DocumentService._prepare_unarchive_update(document, now)

        return None

    @staticmethod
    def _prepare_enable_update(document, now):
        """Prepare updates for enabling a document."""
        if document.enabled:
            return None

        return {
            "document": document,
            "updates": {"enabled": True, "disabled_at": None, "disabled_by": None, "updated_at": now},
            "async_task": {"function": add_document_to_index_task, "args": [document.id]},
            "set_cache": True,
        }

    @staticmethod
    def _prepare_disable_update(document, user, now):
        """Prepare updates for disabling a document."""
        if not document.completed_at or document.indexing_status != "completed":
            raise DocumentIndexingError(f"Document: {document.name} is not completed.")

        if not document.enabled:
            return None

        return {
            "document": document,
            "updates": {"enabled": False, "disabled_at": now, "disabled_by": user.id, "updated_at": now},
            "async_task": {"function": remove_document_from_index_task, "args": [document.id]},
            "set_cache": True,
        }

    @staticmethod
    def _prepare_archive_update(document, user, now):
        """Prepare updates for archiving a document."""
        if document.archived:
            return None

        update_info = {
            "document": document,
            "updates": {"archived": True, "archived_at": now, "archived_by": user.id, "updated_at": now},
            "async_task": None,
            "set_cache": False,
        }

        # Only set async task and cache if document is currently enabled
        if document.enabled:
            update_info["async_task"] = {"function": remove_document_from_index_task, "args": [document.id]}
            update_info["set_cache"] = True

        return update_info

    @staticmethod
    def _prepare_unarchive_update(document, now):
        """Prepare updates for unarchiving a document."""
        if not document.archived:
            return None

        update_info = {
            "document": document,
            "updates": {"archived": False, "archived_at": None, "archived_by": None, "updated_at": now},
            "async_task": None,
            "set_cache": False,
        }

        # Only re-index if the document is currently enabled
        if document.enabled:
            update_info["async_task"] = {"function": add_document_to_index_task, "args": [document.id]}
            update_info["set_cache"] = True

        return update_info


class SegmentService:
    @classmethod
@@ -22,7 +22,7 @@ class PluginDataMigration:
        cls.migrate_datasets()
        cls.migrate_db_records("embeddings", "provider_name", ModelProviderID)  # large table
        cls.migrate_db_records("dataset_collection_bindings", "provider_name", ModelProviderID)
        cls.migrate_db_records("tool_builtin_providers", "provider_name", ToolProviderID)
        cls.migrate_db_records("tool_builtin_providers", "provider", ToolProviderID)

    @classmethod
    def migrate_datasets(cls) -> None:
@@ -1,4 +1,5 @@
import os
from unittest.mock import MagicMock, patch

import pytest
from flask import Flask
@@ -11,6 +12,24 @@ PROJECT_DIR = os.path.abspath(os.path.join(ABS_PATH, os.pardir, os.pardir))

CACHED_APP = Flask(__name__)

# set global mock for Redis client
redis_mock = MagicMock()
redis_mock.get = MagicMock(return_value=None)
redis_mock.setex = MagicMock()
redis_mock.setnx = MagicMock()
redis_mock.delete = MagicMock()
redis_mock.lock = MagicMock()
redis_mock.exists = MagicMock(return_value=False)
redis_mock.set = MagicMock()
redis_mock.expire = MagicMock()
redis_mock.hgetall = MagicMock(return_value={})
redis_mock.hdel = MagicMock()
redis_mock.incr = MagicMock(return_value=1)

# apply the mock to the Redis client in the Flask app
redis_patcher = patch("extensions.ext_redis.redis_client", redis_mock)
redis_patcher.start()


@pytest.fixture
def app() -> Flask:
@@ -21,3 +40,19 @@ def app() -> Flask:
def _provide_app_context(app: Flask):
    with app.app_context():
        yield


@pytest.fixture(autouse=True)
def reset_redis_mock():
    """reset the Redis mock before each test"""
    redis_mock.reset_mock()
    redis_mock.get.return_value = None
    redis_mock.setex.return_value = None
    redis_mock.setnx.return_value = None
    redis_mock.delete.return_value = None
    redis_mock.exists.return_value = False
    redis_mock.set.return_value = None
    redis_mock.expire.return_value = None
    redis_mock.hgetall.return_value = {}
    redis_mock.hdel.return_value = None
    redis_mock.incr.return_value = 1
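The conftest above swaps the module-level Redis client for a `MagicMock` so service code under test never touches a real Redis. The same pattern in a self-contained sketch (`CacheService` and its method names are hypothetical, not Dify code):

```python
from unittest.mock import MagicMock

# A tiny service that talks to Redis, standing in for code that would
# normally import extensions.ext_redis.redis_client (hypothetical names).
class CacheService:
    def __init__(self, client):
        self.client = client

    def get_or_compute(self, key, compute):
        # Try the cache first; on a miss, compute and store with a 60s TTL.
        cached = self.client.get(key)
        if cached is not None:
            return cached
        value = compute()
        self.client.setex(key, 60, value)
        return value

redis_mock = MagicMock()
redis_mock.get.return_value = None  # simulate a cache miss

service = CacheService(redis_mock)
result = service.get_or_compute("k", lambda: "fresh")
```

Because `MagicMock` records every call, tests can assert on cache interactions (`setex` arguments, call counts) without any Redis server, which is exactly what the autouse `reset_redis_mock` fixture keeps clean between tests.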
@@ -0,0 +1,22 @@
from core.rag.extractor.markdown_extractor import MarkdownExtractor


def test_markdown_to_tups():
    markdown = """
this is some text without header

# title 1
this is balabala text

## title 2
this is more specific text.
"""
    extractor = MarkdownExtractor(file_path="dummy_path")
    updated_output = extractor.markdown_to_tups(markdown)
    assert len(updated_output) == 3
    key, header_value = updated_output[0]
    assert key is None
    assert header_value.strip() == "this is some text without header"
    title_1, value = updated_output[1]
    assert title_1.strip() == "title 1"
    assert value.strip() == "this is balabala text"
api/tests/unit_tests/services/test_dataset_service.py (new file, 1238 lines; diff suppressed because it is too large)
@@ -56,3 +56,5 @@ NEXT_PUBLIC_ENABLE_WEBSITE_JINAREADER=true
NEXT_PUBLIC_ENABLE_WEBSITE_FIRECRAWL=true
NEXT_PUBLIC_ENABLE_WEBSITE_WATERCRAWL=true

# The maximum tree node depth for workflows
NEXT_PUBLIC_MAX_TREE_DEPTH=50
@@ -1124,6 +1124,63 @@ import { Row, Col, Properties, Property, Heading, SubProperty, PropertyInstructi

<hr className='ml-0 mr-0' />

<Heading
  url='/datasets/{dataset_id}/documents/status/{action}'
  method='PATCH'
  title='Update Document Status'
  name='#batch_document_status'
/>
<Row>
  <Col>
    ### Path
    <Properties>
      <Property name='dataset_id' type='string' key='dataset_id'>
        Knowledge ID
      </Property>
      <Property name='action' type='string' key='action'>
        - `enable` - Enable document
        - `disable` - Disable document
        - `archive` - Archive document
        - `un_archive` - Unarchive document
      </Property>
    </Properties>

    ### Request Body
    <Properties>
      <Property name='document_ids' type='array[string]' key='document_ids'>
        List of document IDs
      </Property>
    </Properties>
  </Col>
  <Col sticky>
    <CodeGroup
      title="Request"
      tag="PATCH"
      label="/datasets/{dataset_id}/documents/status/{action}"
      targetCode={`curl --location --request PATCH '${props.apiBaseUrl}/datasets/{dataset_id}/documents/status/{action}' \\\n--header 'Authorization: Bearer {api_key}' \\\n--header 'Content-Type: application/json' \\\n--data-raw '{\n "document_ids": ["doc-id-1", "doc-id-2"]\n}'`}
    >
    ```bash {{ title: 'cURL' }}
    curl --location --request PATCH '${props.apiBaseUrl}/datasets/{dataset_id}/documents/status/{action}' \
    --header 'Authorization: Bearer {api_key}' \
    --header 'Content-Type: application/json' \
    --data-raw '{
      "document_ids": ["doc-id-1", "doc-id-2"]
    }'
    ```
    </CodeGroup>

    <CodeGroup title="Response">
    ```json {{ title: 'Response' }}
    {
      "result": "success"
    }
    ```
    </CodeGroup>
  </Col>
</Row>

<hr className='ml-0 mr-0' />

<Heading
  url='/datasets/{dataset_id}/documents/{document_id}/segments'
  method='POST'
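Besides cURL, the batch status endpoint documented above can be called from any HTTP client. A minimal stdlib-only Python sketch; `build_status_request` is a hypothetical helper, and the base URL and API key are placeholders supplied by the caller:

```python
import json
import urllib.request

def build_status_request(base_url, api_key, dataset_id, action, document_ids):
    """Build the PATCH request for /datasets/{dataset_id}/documents/status/{action};
    pass the result to urllib.request.urlopen to actually send it."""
    return urllib.request.Request(
        f"{base_url}/datasets/{dataset_id}/documents/status/{action}",
        data=json.dumps({"document_ids": document_ids}).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="PATCH",
    )
```

Per the docs, `action` is one of `enable`, `disable`, `archive`, or `un_archive`, and a successful call returns `{"result": "success"}`.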
@@ -881,6 +881,63 @@ import { Row, Col, Properties, Property, Heading, SubProperty, PropertyInstructi

<hr className='ml-0 mr-0' />

<Heading
  url='/datasets/{dataset_id}/documents/status/{action}'
  method='PATCH'
  title='ドキュメントステータスの更新'
  name='#batch_document_status'
/>
<Row>
  <Col>
    ### パス
    <Properties>
      <Property name='dataset_id' type='string' key='dataset_id'>
        ナレッジ ID
      </Property>
      <Property name='action' type='string' key='action'>
        - `enable` - ドキュメントを有効化
        - `disable` - ドキュメントを無効化
        - `archive` - ドキュメントをアーカイブ
        - `un_archive` - ドキュメントのアーカイブを解除
      </Property>
    </Properties>

    ### リクエストボディ
    <Properties>
      <Property name='document_ids' type='array[string]' key='document_ids'>
        ドキュメントIDのリスト
      </Property>
    </Properties>
  </Col>
  <Col sticky>
    <CodeGroup
      title="リクエスト"
      tag="PATCH"
      label="/datasets/{dataset_id}/documents/status/{action}"
      targetCode={`curl --location --request PATCH '${props.apiBaseUrl}/datasets/{dataset_id}/documents/status/{action}' \\\n--header 'Authorization: Bearer {api_key}' \\\n--header 'Content-Type: application/json' \\\n--data-raw '{\n "document_ids": ["doc-id-1", "doc-id-2"]\n}'`}
    >
    ```bash {{ title: 'cURL' }}
    curl --location --request PATCH '${props.apiBaseUrl}/datasets/{dataset_id}/documents/status/{action}' \
    --header 'Authorization: Bearer {api_key}' \
    --header 'Content-Type: application/json' \
    --data-raw '{
      "document_ids": ["doc-id-1", "doc-id-2"]
    }'
    ```
    </CodeGroup>

    <CodeGroup title="レスポンス">
    ```json {{ title: 'Response' }}
    {
      "result": "success"
    }
    ```
    </CodeGroup>
  </Col>
</Row>

<hr className='ml-0 mr-0' />

<Heading
  url='/datasets/{dataset_id}/documents/{document_id}/segments'
  method='POST'
@@ -2413,3 +2470,4 @@ import { Row, Col, Properties, Property, Heading, SubProperty, PropertyInstructi
  </tbody>
</table>

<div className="pb-4" />
@@ -1131,6 +1131,63 @@ import { Row, Col, Properties, Property, Heading, SubProperty, PropertyInstructi

<hr className='ml-0 mr-0' />

<Heading
  url='/datasets/{dataset_id}/documents/status/{action}'
  method='PATCH'
  title='更新文档状态'
  name='#batch_document_status'
/>
<Row>
  <Col>
    ### Path
    <Properties>
      <Property name='dataset_id' type='string' key='dataset_id'>
        知识库 ID
      </Property>
      <Property name='action' type='string' key='action'>
        - `enable` - 启用文档
        - `disable` - 禁用文档
        - `archive` - 归档文档
        - `un_archive` - 取消归档文档
      </Property>
    </Properties>

    ### Request Body
    <Properties>
      <Property name='document_ids' type='array[string]' key='document_ids'>
        文档ID列表
      </Property>
    </Properties>
  </Col>
  <Col sticky>
    <CodeGroup
      title="Request"
      tag="PATCH"
      label="/datasets/{dataset_id}/documents/status/{action}"
      targetCode={`curl --location --request PATCH '${props.apiBaseUrl}/datasets/{dataset_id}/documents/status/{action}' \\\n--header 'Authorization: Bearer {api_key}' \\\n--header 'Content-Type: application/json' \\\n--data-raw '{\n "document_ids": ["doc-id-1", "doc-id-2"]\n}'`}
    >
    ```bash {{ title: 'cURL' }}
    curl --location --request PATCH '${props.apiBaseUrl}/datasets/{dataset_id}/documents/status/{action}' \
    --header 'Authorization: Bearer {api_key}' \
    --header 'Content-Type: application/json' \
    --data-raw '{
      "document_ids": ["doc-id-1", "doc-id-2"]
    }'
    ```
    </CodeGroup>

    <CodeGroup title="Response">
    ```json {{ title: 'Response' }}
    {
      "result": "success"
    }
    ```
    </CodeGroup>
  </Col>
</Row>

<hr className='ml-0 mr-0' />

<Heading
  url='/datasets/{dataset_id}/documents/{document_id}/segments'
  method='POST'
@@ -274,7 +274,7 @@ const Chat: FC<ChatProps> = ({
        </div>
      </div>
      <div
        className={`absolute bottom-0 flex justify-center bg-chat-input-mask ${(hasTryToAsk || !noChatInput || !noStopResponding) && chatFooterClassName}`}
        className={`absolute bottom-0 z-10 flex justify-center bg-chat-input-mask ${(hasTryToAsk || !noChatInput || !noStopResponding) && chatFooterClassName}`}
        ref={chatFooterRef}
      >
        <div
@@ -271,9 +271,7 @@ const CodeBlock: any = memo(({ inline, className, children = '', ...props }: any
    const content = String(children).replace(/\n$/, '')
    switch (language) {
      case 'mermaid':
        if (isSVG)
          return <Flowchart PrimitiveCode={content} />
        break
        return <Flowchart PrimitiveCode={content} theme={theme as 'light' | 'dark'} />
      case 'echarts': {
        // Loading state: show loading indicator
        if (chartState === 'loading') {
@@ -428,7 +426,7 @@ const CodeBlock: any = memo(({ inline, className, children = '', ...props }: any
      <div className='flex h-8 items-center justify-between rounded-t-[10px] border-b border-divider-subtle bg-components-input-bg-normal p-1 pl-3'>
        <div className='system-xs-semibold-uppercase text-text-secondary'>{languageShowName}</div>
        <div className='flex items-center gap-1'>
          {(['mermaid', 'svg']).includes(language!) && <SVGBtn isSVG={isSVG} setIsSVG={setIsSVG} />}
          {language === 'svg' && <SVGBtn isSVG={isSVG} setIsSVG={setIsSVG} />}
          <ActionButton>
            <CopyIcon content={String(children).replace(/\n$/, '')} />
          </ActionButton>
@@ -1,5 +1,5 @@
import React, { useCallback, useEffect, useMemo, useRef, useState } from 'react'
import mermaid from 'mermaid'
import mermaid, { type MermaidConfig } from 'mermaid'
import { useTranslation } from 'react-i18next'
import { ExclamationTriangleIcon } from '@heroicons/react/24/outline'
import { MoonIcon, SunIcon } from '@heroicons/react/24/solid'
@@ -68,14 +68,13 @@ const THEMES = {
const initMermaid = () => {
  if (typeof window !== 'undefined' && !isMermaidInitialized) {
    try {
      mermaid.initialize({
      const config: MermaidConfig = {
        startOnLoad: false,
        fontFamily: 'sans-serif',
        securityLevel: 'loose',
        flowchart: {
          htmlLabels: true,
          useMaxWidth: true,
          diagramPadding: 10,
          curve: 'basis',
          nodeSpacing: 50,
          rankSpacing: 70,
@@ -94,10 +93,10 @@ const initMermaid = () => {
        mindmap: {
          useMaxWidth: true,
          padding: 10,
          diagramPadding: 20,
        },
        maxTextSize: 50000,
      })
      }
      mermaid.initialize(config)
      isMermaidInitialized = true
    }
    catch (error) {
@@ -113,7 +112,7 @@ const Flowchart = React.forwardRef((props: {
  theme?: 'light' | 'dark'
}, ref) => {
  const { t } = useTranslation()
  const [svgCode, setSvgCode] = useState<string | null>(null)
  const [svgString, setSvgString] = useState<string | null>(null)
  const [look, setLook] = useState<'classic' | 'handDrawn'>('classic')
  const [isInitialized, setIsInitialized] = useState(false)
  const [currentTheme, setCurrentTheme] = useState<'light' | 'dark'>(props.theme || 'light')
@@ -125,6 +124,7 @@ const Flowchart = React.forwardRef((props: {
  const [imagePreviewUrl, setImagePreviewUrl] = useState('')
  const [isCodeComplete, setIsCodeComplete] = useState(false)
  const codeCompletionCheckRef = useRef<NodeJS.Timeout>()
  const prevCodeRef = useRef<string>()

  // Create cache key from code, style and theme
  const cacheKey = useMemo(() => {
@@ -169,50 +169,18 @@ const Flowchart = React.forwardRef((props: {
   */
  const handleRenderError = (error: any) => {
    console.error('Mermaid rendering error:', error)
    const errorMsg = (error as Error).message

    if (errorMsg.includes('getAttribute')) {
      diagramCache.clear()
      mermaid.initialize({
        startOnLoad: false,
        securityLevel: 'loose',
      })
    // On any render error, assume the mermaid state is corrupted and force a re-initialization.
    try {
      diagramCache.clear() // Clear cache to prevent using potentially corrupted SVGs
      isMermaidInitialized = false // <-- THE FIX: Force re-initialization
      initMermaid() // Re-initialize with the default safe configuration
    }
    else {
      setErrMsg(`Rendering chart failed, please refresh and try again ${look === 'handDrawn' ? 'Or try using classic mode' : ''}`)
    }

    if (look === 'handDrawn') {
      try {
        // Clear possible cache issues
        diagramCache.delete(`${props.PrimitiveCode}-handDrawn-${currentTheme}`)

        // Reset mermaid configuration
        mermaid.initialize({
          startOnLoad: false,
          securityLevel: 'loose',
          theme: 'default',
          maxTextSize: 50000,
        })

        // Try rendering with standard mode
        setLook('classic')
        setErrMsg('Hand-drawn mode is not supported for this diagram. Switched to classic mode.')

        // Delay error clearing
        setTimeout(() => {
          if (containerRef.current) {
            // Try rendering again with standard mode, but can't call renderFlowchart directly due to circular dependency
            // Instead set state to trigger re-render
            setIsCodeComplete(true) // This will trigger useEffect re-render
          }
        }, 500)
      }
      catch (e) {
        console.error('Reset after handDrawn error failed:', e)
      }
    catch (reinitError) {
      console.error('Failed to re-initialize Mermaid after error:', reinitError)
    }

    setErrMsg(`Rendering failed: ${(error as Error).message || 'Unknown error. Please check the console.'}`)
    setIsLoading(false)
  }
@@ -223,51 +191,23 @@ const Flowchart = React.forwardRef((props: {
    setIsInitialized(true)
  }, [])

  // Update theme when prop changes
  // Update theme when prop changes, but allow internal override.
  const prevThemeRef = useRef<string>()
  useEffect(() => {
    if (props.theme)
    // Only react if the theme prop from the outside has actually changed.
    if (props.theme && props.theme !== prevThemeRef.current) {
      // When the global theme prop changes, it should act as the source of truth,
      // overriding any local theme selection.
      diagramCache.clear()
      setSvgString(null)
      setCurrentTheme(props.theme)
      // Reset look to classic for a consistent state after a global change.
      setLook('classic')
    }
    // Update the ref to the current prop value for the next render.
    prevThemeRef.current = props.theme
  }, [props.theme])

  // Validate mermaid code and check for completeness
  useEffect(() => {
    if (codeCompletionCheckRef.current)
      clearTimeout(codeCompletionCheckRef.current)

    // Reset code complete status when code changes
    setIsCodeComplete(false)

    // If no code or code is extremely short, don't proceed
    if (!props.PrimitiveCode || props.PrimitiveCode.length < 10)
      return

    // Check if code already in cache - if so we know it's valid
    if (diagramCache.has(cacheKey)) {
      setIsCodeComplete(true)
      return
    }

    // Initial check using the extracted isMermaidCodeComplete function
    const isComplete = isMermaidCodeComplete(props.PrimitiveCode)
    if (isComplete) {
      setIsCodeComplete(true)
      return
    }

    // Set a delay to check again in case code is still being generated
    codeCompletionCheckRef.current = setTimeout(() => {
      setIsCodeComplete(isMermaidCodeComplete(props.PrimitiveCode))
    }, 300)

    return () => {
      if (codeCompletionCheckRef.current)
        clearTimeout(codeCompletionCheckRef.current)
    }
  }, [props.PrimitiveCode, cacheKey])

  /**
   * Renders flowchart based on provided code
   */
  const renderFlowchart = useCallback(async (primitiveCode: string) => {
    if (!isInitialized || !containerRef.current) {
      setIsLoading(false)
@@ -275,15 +215,11 @@ const Flowchart = React.forwardRef((props: {
      return
    }

    // Don't render if code is not complete yet
    if (!isCodeComplete) {
      setIsLoading(true)
      return
    }

    // Return cached result if available
    const cacheKey = `${primitiveCode}-${look}-${currentTheme}`
    if (diagramCache.has(cacheKey)) {
      setSvgCode(diagramCache.get(cacheKey) || null)
      setErrMsg('')
      setSvgString(diagramCache.get(cacheKey) || null)
      setIsLoading(false)
      return
    }
@@ -294,17 +230,45 @@ const Flowchart = React.forwardRef((props: {
    try {
      let finalCode: string

      // Check if it's a gantt chart or mindmap
      const isGanttChart = primitiveCode.trim().startsWith('gantt')
      const isMindMap = primitiveCode.trim().startsWith('mindmap')
      const trimmedCode = primitiveCode.trim()
      const isGantt = trimmedCode.startsWith('gantt')
      const isMindMap = trimmedCode.startsWith('mindmap')
      const isSequence = trimmedCode.startsWith('sequenceDiagram')

      if (isGanttChart || isMindMap) {
        // For gantt charts and mindmaps, ensure each task is on its own line
        // and preserve exact whitespace/format
        finalCode = primitiveCode.trim()
      if (isGantt || isMindMap || isSequence) {
        if (isGantt) {
          finalCode = trimmedCode
            .split('\n')
            .map((line) => {
              // Gantt charts have specific syntax needs.
              const taskMatch = line.match(/^\s*([^:]+?)\s*:\s*(.*)/)
              if (!taskMatch)
                return line // Not a task line, return as is.

              const taskName = taskMatch[1].trim()
              let paramsStr = taskMatch[2].trim()

              // Rule 1: Correct multiple "after" dependencies ONLY if they exist.
              // This is a common mistake, e.g., "..., after task1, after task2, ..."
              const afterCount = (paramsStr.match(/after /g) || []).length
              if (afterCount > 1)
                paramsStr = paramsStr.replace(/,\s*after\s+/g, ' ')

              // Rule 2: Normalize spacing between parameters for consistency.
              const finalParams = paramsStr.replace(/\s*,\s*/g, ', ').trim()
              return `${taskName} :${finalParams}`
            })
            .join('\n')
        }
        else {
          // For mindmap and sequence charts, which are sensitive to syntax,
          // pass the code through directly.
          finalCode = trimmedCode
        }
      }
      else {
        // Step 1: Clean and prepare Mermaid code using the extracted prepareMermaidCode function
        // This function handles flowcharts appropriately.
        finalCode = prepareMermaidCode(primitiveCode, look)
      }
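The two gantt normalization rules above (collapse multiple `after` dependencies, normalize comma spacing) can be sketched in Python, with the regexes transliterated from the TypeScript; `normalize_gantt_line` is a hypothetical helper name:

```python
import re

def normalize_gantt_line(line: str) -> str:
    """Transliteration of the gantt task-line normalization above."""
    task_match = re.match(r"^\s*([^:]+?)\s*:\s*(.*)", line)
    if not task_match:
        return line  # Not a task line; return it unchanged.

    task_name = task_match.group(1).strip()
    params = task_match.group(2).strip()

    # Rule 1: collapse multiple "after" dependencies,
    # e.g. "after t1, after t2" becomes "after t1 t2".
    if len(re.findall(r"after ", params)) > 1:
        params = re.sub(r",\s*after\s+", " ", params)

    # Rule 2: normalize comma spacing between parameters.
    params = re.sub(r"\s*,\s*", ", ", params).strip()
    return f"{task_name} :{params}"
```

Lines without a colon (e.g. `gantt`, `section ...` headers without one) fall through the regex and pass through untouched, matching the `if (!taskMatch) return line` branch.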
@@ -319,13 +283,12 @@ const Flowchart = React.forwardRef((props: {
        THEMES,
      )

      // Step 4: Clean SVG code and convert to base64 using the extracted functions
      // Step 4: Clean up SVG code
      const cleanedSvg = cleanUpSvgCode(processedSvg)
      const base64Svg = await svgToBase64(cleanedSvg)

      if (base64Svg && typeof base64Svg === 'string') {
        diagramCache.set(cacheKey, base64Svg)
        setSvgCode(base64Svg)
      if (cleanedSvg && typeof cleanedSvg === 'string') {
        diagramCache.set(cacheKey, cleanedSvg)
        setSvgString(cleanedSvg)
      }

      setIsLoading(false)
@@ -334,12 +297,9 @@ const Flowchart = React.forwardRef((props: {
      // Error handling
      handleRenderError(error)
    }
  }, [chartId, isInitialized, cacheKey, isCodeComplete, look, currentTheme, t])
  }, [chartId, isInitialized, look, currentTheme, t])

  /**
   * Configure mermaid based on selected style and theme
   */
  const configureMermaid = useCallback(() => {
  const configureMermaid = useCallback((primitiveCode: string) => {
    if (typeof window !== 'undefined' && isInitialized) {
      const themeVars = THEMES[currentTheme]
      const config: any = {
@@ -361,23 +321,37 @@ const Flowchart = React.forwardRef((props: {
        mindmap: {
          useMaxWidth: true,
          padding: 10,
          diagramPadding: 20,
        },
      }

      const isFlowchart = primitiveCode.trim().startsWith('graph') || primitiveCode.trim().startsWith('flowchart')

      if (look === 'classic') {
        config.theme = currentTheme === 'dark' ? 'dark' : 'neutral'
        config.flowchart = {
          htmlLabels: true,
          useMaxWidth: true,
          diagramPadding: 12,
          nodeSpacing: 60,
          rankSpacing: 80,
          curve: 'linear',
          ranker: 'tight-tree',

        if (isFlowchart) {
          config.flowchart = {
            htmlLabels: true,
            useMaxWidth: true,
            nodeSpacing: 60,
            rankSpacing: 80,
            curve: 'linear',
            ranker: 'tight-tree',
          }
        }

        if (currentTheme === 'dark') {
          config.themeVariables = {
            background: themeVars.background,
            primaryColor: themeVars.primaryColor,
            primaryBorderColor: themeVars.primaryBorderColor,
            primaryTextColor: themeVars.primaryTextColor,
            secondaryColor: themeVars.secondaryColor,
            tertiaryColor: themeVars.tertiaryColor,
          }
        }
      }
      else {
      else { // look === 'handDrawn'
        config.theme = 'default'
        config.themeCSS = `
          .node rect { fill-opacity: 0.85; }
@@ -389,27 +363,17 @@ const Flowchart = React.forwardRef((props: {
        config.themeVariables = {
          fontSize: '14px',
          fontFamily: 'sans-serif',
          primaryBorderColor: currentTheme === 'dark' ? THEMES.dark.connectionColor : THEMES.light.connectionColor,
        }
        config.flowchart = {
          htmlLabels: true,
          useMaxWidth: true,
          diagramPadding: 10,
          nodeSpacing: 40,
          rankSpacing: 60,
          curve: 'basis',
        }
        config.themeVariables.primaryBorderColor = currentTheme === 'dark' ? THEMES.dark.connectionColor : THEMES.light.connectionColor

        if (currentTheme === 'dark' && !config.themeVariables) {
          config.themeVariables = {
            background: themeVars.background,
            primaryColor: themeVars.primaryColor,
            primaryBorderColor: themeVars.primaryBorderColor,
            primaryTextColor: themeVars.primaryTextColor,
            secondaryColor: themeVars.secondaryColor,
            tertiaryColor: themeVars.tertiaryColor,
            fontFamily: 'sans-serif',
        if (isFlowchart) {
          config.flowchart = {
            htmlLabels: true,
            useMaxWidth: true,
            nodeSpacing: 40,
            rankSpacing: 60,
            curve: 'basis',
          }
        }
      }
@@ -425,44 +389,50 @@ const Flowchart = React.forwardRef((props: {
    return false
  }, [currentTheme, isInitialized, look])

  // Effect for theme and style configuration
  // This is the main rendering effect.
  // It triggers whenever the code, theme, or style changes.
  useEffect(() => {
    if (diagramCache.has(cacheKey)) {
      setSvgCode(diagramCache.get(cacheKey) || null)
      setIsLoading(false)
      return
    }

    if (configureMermaid() && containerRef.current && isCodeComplete)
      renderFlowchart(props.PrimitiveCode)
  }, [look, props.PrimitiveCode, renderFlowchart, isInitialized, cacheKey, currentTheme, isCodeComplete, configureMermaid])

  // Effect for rendering with debounce
  useEffect(() => {
    if (diagramCache.has(cacheKey)) {
      setSvgCode(diagramCache.get(cacheKey) || null)
    if (!isInitialized)
      return

    // Don't render if code is too short
    if (!props.PrimitiveCode || props.PrimitiveCode.length < 10) {
      setIsLoading(false)
      setSvgString(null)
      return
    }

    // Use a timeout to handle streaming code and debounce rendering
    if (renderTimeoutRef.current)
      clearTimeout(renderTimeoutRef.current)

    if (isCodeComplete) {
      renderTimeoutRef.current = setTimeout(() => {
        if (isInitialized)
          renderFlowchart(props.PrimitiveCode)
      }, 300)
    }
    else {
      setIsLoading(true)
    }
    setIsLoading(true)

    renderTimeoutRef.current = setTimeout(() => {
      // Final validation before rendering
      if (!isMermaidCodeComplete(props.PrimitiveCode)) {
        setIsLoading(false)
        setErrMsg('Diagram code is not complete or invalid.')
        return
      }

      const cacheKey = `${props.PrimitiveCode}-${look}-${currentTheme}`
      if (diagramCache.has(cacheKey)) {
        setErrMsg('')
        setSvgString(diagramCache.get(cacheKey) || null)
        setIsLoading(false)
        return
      }

      if (configureMermaid(props.PrimitiveCode))
        renderFlowchart(props.PrimitiveCode)
    }, 300) // 300ms debounce

    return () => {
      if (renderTimeoutRef.current)
        clearTimeout(renderTimeoutRef.current)
    }
  }, [props.PrimitiveCode, renderFlowchart, isInitialized, cacheKey, isCodeComplete])
  }, [props.PrimitiveCode, look, currentTheme, isInitialized, configureMermaid, renderFlowchart])
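The 300 ms debounce above exists because the mermaid code arrives as a stream: each keystroke of new code cancels the previously scheduled render, so rendering only fires once input goes quiet. The pattern in a language-neutral Python sketch (`Debouncer` is a hypothetical name, not part of the codebase):

```python
import threading

class Debouncer:
    """Minimal debounce: each new call cancels the pending one,
    so fn only runs after `delay` seconds of quiet."""
    def __init__(self, delay: float):
        self.delay = delay
        self._timer: threading.Timer | None = None

    def call(self, fn, *args):
        if self._timer is not None:
            self._timer.cancel()  # cancel the previously scheduled run
        self._timer = threading.Timer(self.delay, fn, args)
        self._timer.start()
```

With three rapid calls, only the last one's arguments reach the callback, mirroring how only the final streamed code version gets rendered.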
  // Cleanup on unmount
  useEffect(() => {
@@ -471,14 +441,22 @@ const Flowchart = React.forwardRef((props: {
      containerRef.current.innerHTML = ''
      if (renderTimeoutRef.current)
        clearTimeout(renderTimeoutRef.current)
      if (codeCompletionCheckRef.current)
        clearTimeout(codeCompletionCheckRef.current)
    }
  }, [])

  const handlePreviewClick = async () => {
    if (svgString) {
      const base64 = await svgToBase64(svgString)
      setImagePreviewUrl(base64)
    }
  }

  const toggleTheme = () => {
    setCurrentTheme(prevTheme => prevTheme === 'light' ? Theme.dark : Theme.light)
    const newTheme = currentTheme === 'light' ? 'dark' : 'light'
    // Ensure a full, clean re-render cycle, consistent with global theme change.
    diagramCache.clear()
    setSvgString(null)
    setCurrentTheme(newTheme)
  }

  // Style classes for theme-dependent elements
@@ -527,14 +505,26 @@ const Flowchart = React.forwardRef((props: {
            <div
              key='classic'
              className={getLookButtonClass('classic')}
              onClick={() => setLook('classic')}
              onClick={() => {
                if (look !== 'classic') {
                  diagramCache.clear()
                  setSvgString(null)
                  setLook('classic')
                }
              }}
            >
              <div className="msh-segmented-item-label">{t('app.mermaid.classic')}</div>
            </div>
            <div
              key='handDrawn'
              className={getLookButtonClass('handDrawn')}
              onClick={() => setLook('handDrawn')}
              onClick={() => {
                if (look !== 'handDrawn') {
                  diagramCache.clear()
                  setSvgString(null)
                  setLook('handDrawn')
                }
              }}
            >
              <div className="msh-segmented-item-label">{t('app.mermaid.handDrawn')}</div>
            </div>
@@ -544,7 +534,7 @@ const Flowchart = React.forwardRef((props: {

      <div ref={containerRef} style={{ position: 'absolute', visibility: 'hidden', height: 0, overflow: 'hidden' }} />

      {isLoading && !svgCode && (
      {isLoading && !svgString && (
        <div className='px-[26px] py-4'>
          <LoadingAnim type='text'/>
          {!isCodeComplete && (
@@ -555,8 +545,8 @@ const Flowchart = React.forwardRef((props: {
        </div>
      )}

      {svgCode && (
        <div className={themeClasses.mermaidDiv} style={{ objectFit: 'cover' }} onClick={() => setImagePreviewUrl(svgCode)}>
      {svgString && (
        <div className={themeClasses.mermaidDiv} style={{ objectFit: 'cover' }} onClick={handlePreviewClick}>
          <div className="absolute bottom-2 left-2 z-[100]">
            <button
              onClick={(e) => {
@@ -571,11 +561,9 @@ const Flowchart = React.forwardRef((props: {
            </button>
          </div>

          <img
            src={svgCode}
            alt="mermaid_chart"
          <div
            style={{ maxWidth: '100%' }}
            onError={() => { setErrMsg('Chart rendering failed, please refresh and retry') }}
            dangerouslySetInnerHTML={{ __html: svgString }}
          />
        </div>
      )}
@@ -3,52 +3,31 @@ export function cleanUpSvgCode(svgCode: string): string {
}

/**
 * Preprocesses mermaid code to fix common syntax issues
 * Prepares mermaid code for rendering by sanitizing common syntax issues.
 * @param {string} mermaidCode - The mermaid code to prepare
 * @param {'classic' | 'handDrawn'} style - The rendering style
 * @returns {string} - The prepared mermaid code
 */
export function preprocessMermaidCode(code: string): string {
if (!code || typeof code !== 'string')
export const prepareMermaidCode = (mermaidCode: string, style: 'classic' | 'handDrawn'): string => {
if (!mermaidCode || typeof mermaidCode !== 'string')
return ''

// First check if this is a gantt chart
if (code.trim().startsWith('gantt')) {
// For gantt charts, we need to ensure each task is on its own line
// Split the code into lines and process each line separately
const lines = code.split('\n').map(line => line.trim())
return lines.join('\n')
}
let code = mermaidCode.trim()

return code
// Replace English colons with Chinese colons in section nodes to avoid parsing issues
.replace(/section\s+([^:]+):/g, (match, sectionName) => `section ${sectionName}:`)
// Fix common syntax issues
.replace(/fifopacket/g, 'rect')
// Ensure graph has direction
.replace(/^graph\s+((?:TB|BT|RL|LR)*)/, (match, direction) => {
return direction ? match : 'graph TD'
})
// Clean up empty lines and extra spaces
.trim()
}
// Security: Sanitize against javascript: protocol in click events (XSS vector)
code = code.replace(/(\bclick\s+\w+\s+")javascript:[^"]*(")/g, '$1#$2')

/**
 * Prepares mermaid code based on selected style
 */
export function prepareMermaidCode(code: string, style: 'classic' | 'handDrawn'): string {
let finalCode = preprocessMermaidCode(code)
// Convenience: Basic BR replacement. This is a common and safe operation.
code = code.replace(/<br\s*\/?>/g, '\n')

// Special handling for gantt charts and mindmaps
if (finalCode.trim().startsWith('gantt') || finalCode.trim().startsWith('mindmap')) {
// For gantt charts and mindmaps, preserve the structure exactly as is
return finalCode
}
let finalCode = code

// Hand-drawn style requires some specific clean-up.
if (style === 'handDrawn') {
finalCode = finalCode
// Remove style definitions that interfere with hand-drawn style
.replace(/style\s+[^\n]+/g, '')
.replace(/linkStyle\s+[^\n]+/g, '')
.replace(/^flowchart/, 'graph')
// Remove any styles that might interfere with hand-drawn style
.replace(/class="[^"]*"/g, '')
.replace(/fill="[^"]*"/g, '')
.replace(/stroke="[^"]*"/g, '')
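The two sanitization passes introduced above (neutralizing `javascript:` URLs in `click` bindings and converting `<br>` tags to newlines) can be exercised in isolation. A minimal sketch that reuses the same regexes; the wrapper function name is illustrative, not the full `prepareMermaidCode` pipeline:

```typescript
// Minimal sketch of the two sanitization passes from the diff.
// Only the regexes are taken from the source; `sanitizeMermaid` is a hypothetical wrapper.
const sanitizeMermaid = (code: string): string => {
  return code
    // Neutralize javascript: URLs in click bindings (XSS vector)
    .replace(/(\bclick\s+\w+\s+")javascript:[^"]*(")/g, '$1#$2')
    // Convert <br>, <br/>, <br /> to real newlines
    .replace(/<br\s*\/?>/g, '\n')
}

const unsafe = 'graph TD\nA[one<br/>two]\nclick A "javascript:alert(1)"'
console.log(sanitizeMermaid(unsafe))
// The click target becomes "#" and the <br/> becomes a newline
```

Note the first regex keeps the surrounding quotes via capture groups and only swaps the `javascript:` payload for a harmless `#`.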
@@ -82,7 +61,6 @@ export function svgToBase64(svgGraph: string): Promise<string> {
})
}
catch (error) {
console.error('Error converting SVG to base64:', error)
return Promise.resolve('')
}
}
@@ -115,13 +93,11 @@ export function processSvgForTheme(
}
else {
let i = 0
themes.dark.nodeColors.forEach(() => {
const regex = /fill="#[a-fA-F0-9]{6}"[^>]*class="node-[^"]*"/g
processedSvg = processedSvg.replace(regex, (match: string) => {
const colorIndex = i % themes.dark.nodeColors.length
i++
return match.replace(/fill="#[a-fA-F0-9]{6}"/, `fill="${themes.dark.nodeColors[colorIndex].bg}"`)
})
const nodeColorRegex = /fill="#[a-fA-F0-9]{6}"[^>]*class="node-[^"]*"/g
processedSvg = processedSvg.replace(nodeColorRegex, (match: string) => {
const colorIndex = i % themes.dark.nodeColors.length
i++
return match.replace(/fill="#[a-fA-F0-9]{6}"/, `fill="${themes.dark.nodeColors[colorIndex].bg}"`)
})

processedSvg = processedSvg
@@ -139,14 +115,12 @@ export function processSvgForTheme(
.replace(/stroke-width="1"/g, 'stroke-width="1.5"')
}
else {
themes.light.nodeColors.forEach(() => {
const regex = /fill="#[a-fA-F0-9]{6}"[^>]*class="node-[^"]*"/g
let i = 0
processedSvg = processedSvg.replace(regex, (match: string) => {
const colorIndex = i % themes.light.nodeColors.length
i++
return match.replace(/fill="#[a-fA-F0-9]{6}"/, `fill="${themes.light.nodeColors[colorIndex].bg}"`)
})
let i = 0
const nodeColorRegex = /fill="#[a-fA-F0-9]{6}"[^>]*class="node-[^"]*"/g
processedSvg = processedSvg.replace(nodeColorRegex, (match: string) => {
const colorIndex = i % themes.light.nodeColors.length
i++
return match.replace(/fill="#[a-fA-F0-9]{6}"/, `fill="${themes.light.nodeColors[colorIndex].bg}"`)
})

processedSvg = processedSvg
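The change above drops a redundant `forEach` wrapper (which re-ran the same global replace once per palette entry) in favor of a single pass that cycles through the palette with a counter. A standalone sketch of the single-pass version, using a made-up two-color palette in place of the component's real `themes` object:

```typescript
// Hypothetical palette; the real values live in the component's `themes` object.
const nodeColors = [{ bg: '#1f2937' }, { bg: '#374151' }]

const recolorNodes = (svg: string): string => {
  let i = 0
  // Same match pattern as the diff: a fill attribute followed by a node-* class
  const nodeColorRegex = /fill="#[a-fA-F0-9]{6}"[^>]*class="node-[^"]*"/g
  return svg.replace(nodeColorRegex, (match: string) => {
    const colorIndex = i % nodeColors.length
    i++
    // Only the fill attribute inside the matched fragment is rewritten
    return match.replace(/fill="#[a-fA-F0-9]{6}"/, `fill="${nodeColors[colorIndex].bg}"`)
  })
}

const input = '<rect fill="#ffffff" class="node-a"/><rect fill="#eeeeee" class="node-b"/>'
console.log(recolorNodes(input))
// → <rect fill="#1f2937" class="node-a"/><rect fill="#374151" class="node-b"/>
```

Because the counter lives outside the replacer, each matched node advances to the next palette color, wrapping around via the modulo.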
@@ -187,24 +161,10 @@ export function isMermaidCodeComplete(code: string): boolean {
// Check for basic syntax structure
const hasValidStart = /^(graph|flowchart|sequenceDiagram|classDiagram|classDef|class|stateDiagram|gantt|pie|er|journey|requirementDiagram|mindmap)/.test(trimmedCode)

// Check for balanced brackets and parentheses
const isBalanced = (() => {
const stack = []
const pairs = { '{': '}', '[': ']', '(': ')' }

for (const char of trimmedCode) {
if (char in pairs) {
stack.push(char)
}
else if (Object.values(pairs).includes(char)) {
const last = stack.pop()
if (pairs[last as keyof typeof pairs] !== char)
return false
}
}

return stack.length === 0
})()
// The balanced bracket check was too strict and produced false negatives for valid
// mermaid syntax like the asymmetric shape `A>B]`. Relying on Mermaid's own
// parser is more robust.
const isBalanced = true

// Check for common syntax errors
const hasNoSyntaxErrors = !trimmedCode.includes('undefined')
@@ -215,7 +175,7 @@ export function isMermaidCodeComplete(code: string): boolean {
return hasValidStart && isBalanced && hasNoSyntaxErrors
}
catch (error) {
console.debug('Mermaid code validation error:', error)
console.error('Mermaid code validation error:', error)
return false
}
}
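The comment in the hunk above says the stack-based check rejected valid mermaid such as the asymmetric node shape `A>B]`: a naive matcher sees the `]` with no `[` on the stack. That is easy to verify with a reconstruction of the deleted check (shown only to demonstrate the false negative, not as a recommendation to reinstate it):

```typescript
// Reconstruction of the removed bracket-balance check from the diff.
const isBracketBalanced = (code: string): boolean => {
  const stack: string[] = []
  const pairs: Record<string, string> = { '{': '}', '[': ']', '(': ')' }
  for (const char of code) {
    if (char in pairs) {
      stack.push(char)
    }
    else if (Object.values(pairs).includes(char)) {
      const last = stack.pop()
      // An unmatched or mismatched closer fails the check
      if (last === undefined || pairs[last] !== char)
        return false
    }
  }
  return stack.length === 0
}

// Valid mermaid asymmetric shape, wrongly rejected by the naive check:
console.log(isBracketBalanced('graph TD\nA>B]')) // → false
// Ordinary rectangular node, accepted:
console.log(isBracketBalanced('graph TD\nA[B]')) // → true
```

Mermaid's flowchart grammar uses `>text]` as a legal node shape, so bracket counting alone cannot validate it; deferring to the renderer's own parser avoids the false negative.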
@@ -161,7 +161,9 @@ const StepTwo = ({

const isInCreatePage = !datasetId || (datasetId && !currentDataset?.data_source_type)
const dataSourceType = isInCreatePage ? inCreatePageDataSourceType : currentDataset?.data_source_type
const [segmentationType, setSegmentationType] = useState<ProcessMode>(ProcessMode.general)
const [segmentationType, setSegmentationType] = useState<ProcessMode>(
currentDataset?.doc_form === ChunkingMode.parentChild ? ProcessMode.parentChild : ProcessMode.general,
)
const [segmentIdentifier, doSetSegmentIdentifier] = useState(DEFAULT_SEGMENT_IDENTIFIER)
const setSegmentIdentifier = useCallback((value: string, canEmpty?: boolean) => {
doSetSegmentIdentifier(value ? escape(value) : (canEmpty ? '' : DEFAULT_SEGMENT_IDENTIFIER))
@@ -207,7 +209,14 @@ const StepTwo = ({
}
if (value === ChunkingMode.parentChild && indexType === IndexingType.ECONOMICAL)
setIndexType(IndexingType.QUALIFIED)

setDocForm(value)

if (value === ChunkingMode.parentChild)
setSegmentationType(ProcessMode.parentChild)
else
setSegmentationType(ProcessMode.general)

// eslint-disable-next-line ts/no-use-before-define
currentEstimateMutation.reset()
}
@@ -503,6 +512,20 @@ const StepTwo = ({
setOverlap(overlap!)
setRules(rules.pre_processing_rules)
setDefaultConfig(rules)

if (documentDetail.dataset_process_rule.mode === 'hierarchical') {
setParentChildConfig({
chunkForContext: rules.parent_mode || 'paragraph',
parent: {
delimiter: escape(rules.segmentation.separator),
maxLength: rules.segmentation.max_tokens,
},
child: {
delimiter: escape(rules.subchunk_segmentation.separator),
maxLength: rules.subchunk_segmentation.max_tokens,
},
})
}
}
}
@@ -965,8 +988,8 @@ const StepTwo = ({
<div className='system-md-semibold mb-0.5 text-text-secondary'>{t('datasetSettings.form.retrievalSetting.title')}</div>
<div className='body-xs-regular text-text-tertiary'>
<a target='_blank' rel='noopener noreferrer'
href={docLink('/guides/knowledge-base/create-knowledge-and-upload-documents')}
className='text-text-accent'>{t('datasetSettings.form.retrievalSetting.learnMore')}</a>
href={docLink('/guides/knowledge-base/create-knowledge-and-upload-documents')}
className='text-text-accent'>{t('datasetSettings.form.retrievalSetting.learnMore')}</a>
{t('datasetSettings.form.retrievalSetting.longDescription')}
</div>
</div>
@@ -1130,7 +1153,7 @@ const StepTwo = ({
const indexForLabel = index + 1
return (
<PreviewSlice
key={child}
key={`C-${indexForLabel}-${child}`}
label={`C-${indexForLabel}`}
text={child}
tooltip={`Child-chunk-${indexForLabel} · ${child.length} Characters`}
@@ -82,9 +82,9 @@ Chat applications support session persistence, allowing previous chat history to

### ChatCompletionResponse
Returns the complete App result, `Content-Type` is `application/json`.
- `event` (string) 事件类型,固定为 `message`
- `task_id` (string) 任务 ID,用于请求跟踪和下方的停止响应接口
- `id` (string) 唯一ID
- `event` (string) Event type, always `message` in blocking mode.
- `task_id` (string) Task ID, used for request tracking and the below Stop Generate API
- `id` (string) Unique ID, same as `message_id`
- `message_id` (string) Unique message ID
- `conversation_id` (string) Conversation ID
- `mode` (string) App mode, fixed as `chat`
@@ -1,11 +1,11 @@
import { useCallback } from 'react'
import { apiPrefix } from '@/config'
import { API_PREFIX } from '@/config'
import { useSelector } from '@/context/app-context'

const useGetIcon = () => {
const currentWorkspace = useSelector(s => s.currentWorkspace)
const getIconUrl = useCallback((fileName: string) => {
return `${apiPrefix}/workspaces/current/plugin/icon?tenant_id=${currentWorkspace.id}&filename=${fileName}`
return `${API_PREFIX}/workspaces/current/plugin/icon?tenant_id=${currentWorkspace.id}&filename=${fileName}`
}, [currentWorkspace.id])

return {
@@ -12,6 +12,7 @@ import {
PortalToFollowElemTrigger,
} from '@/app/components/base/portal-to-follow-elem'
import cn from '@/utils/classnames'
import { useGlobalPublicStore } from '@/context/global-public-context'

type Props = {
source: PluginSource
@@ -40,6 +41,8 @@ const OperationDropdown: FC<Props> = ({
setOpen(!openRef.current)
}, [setOpen])

const { enable_marketplace } = useGlobalPublicStore(s => s.systemFeatures)

return (
<PortalToFollowElem
open={open}
@@ -77,13 +80,13 @@ const OperationDropdown: FC<Props> = ({
className='system-md-regular cursor-pointer rounded-lg px-3 py-1.5 text-text-secondary hover:bg-state-base-hover'
>{t('plugin.detailPanel.operation.checkUpdate')}</div>
)}
{(source === PluginSource.marketplace || source === PluginSource.github) && (
{(source === PluginSource.marketplace || source === PluginSource.github) && enable_marketplace && (
<a href={detailUrl} target='_blank' className='system-md-regular flex cursor-pointer items-center rounded-lg px-3 py-1.5 text-text-secondary hover:bg-state-base-hover'>
<span className='grow'>{t('plugin.detailPanel.operation.viewDetail')}</span>
<RiArrowRightUpLine className='h-3.5 w-3.5 shrink-0 text-text-tertiary' />
</a>
)}
{(source === PluginSource.marketplace || source === PluginSource.github) && (
{(source === PluginSource.marketplace || source === PluginSource.github) && enable_marketplace && (
<div className='my-1 h-px bg-divider-subtle'></div>
)}
<div
@@ -29,6 +29,7 @@ import { useAppContext } from '@/context/app-context'
import { gte } from 'semver'
import Tooltip from '@/app/components/base/tooltip'
import { getMarketplaceUrl } from '@/utils/var'
import { useGlobalPublicStore } from '@/context/global-public-context'

type Props = {
className?: string
@@ -75,6 +76,7 @@ const PluginItem: FC<Props> = ({
const getValueFromI18nObject = useRenderI18nObject()
const title = getValueFromI18nObject(label)
const descriptionText = getValueFromI18nObject(description)
const { enable_marketplace } = useGlobalPublicStore(s => s.systemFeatures)

return (
<div
@@ -165,7 +167,7 @@ const PluginItem: FC<Props> = ({
</a>
</>
}
{source === PluginSource.marketplace
{source === PluginSource.marketplace && enable_marketplace
&& <>
<a href={getMarketplaceUrl(`/plugins/${author}/${name}`, { theme })} target='_blank' className='flex items-center gap-0.5'>
<div className='system-2xs-medium-uppercase text-text-tertiary'>{t('plugin.from')} <span className='text-text-secondary'>marketplace</span></div>
@@ -35,7 +35,7 @@ import type { PluginDeclaration, PluginManifestInMarket } from '../types'
import { sleep } from '@/utils'
import { getDocsUrl } from '@/app/components/plugins/utils'
import { fetchBundleInfoFromMarketPlace, fetchManifestFromMarketPlace } from '@/service/plugins'
import { marketplaceApiPrefix } from '@/config'
import { MARKETPLACE_API_PREFIX } from '@/config'
import { SUPPORT_INSTALL_LOCAL_FILE_EXTENSIONS } from '@/config'
import I18n from '@/context/i18n'
import { noop } from 'lodash-es'
@@ -106,7 +106,7 @@ const PluginPage = ({
setManifest({
...plugin,
version: version.version,
icon: `${marketplaceApiPrefix}/plugins/${plugin.org}/${plugin.name}/icon`,
icon: `${MARKETPLACE_API_PREFIX}/plugins/${plugin.org}/${plugin.name}/icon`,
})
showInstallFromMarketplace()
return
@@ -6,7 +6,7 @@ import Item from './item'
import type { Plugin } from '@/app/components/plugins/types.ts'
import cn from '@/utils/classnames'
import Link from 'next/link'
import { marketplaceUrlPrefix } from '@/config'
import { MARKETPLACE_URL_PREFIX } from '@/config'
import { RiArrowRightUpLine, RiSearchLine } from '@remixicon/react'
import { noop } from 'lodash-es'

@@ -32,7 +32,7 @@ const List = forwardRef<ListRef, ListProps>(({
const { t } = useTranslation()
const hasFilter = !searchText
const hasRes = list.length > 0
const urlWithSearchText = `${marketplaceUrlPrefix}/?q=${searchText}&tags=${tags.join(',')}`
const urlWithSearchText = `${MARKETPLACE_URL_PREFIX}/?q=${searchText}&tags=${tags.join(',')}`
const nextToStickyELemRef = useRef<HTMLDivElement>(null)

const { handleScroll, scrollPosition } = useStickyScroll({
@@ -71,7 +71,7 @@ const List = forwardRef<ListRef, ListProps>(({
return (
<Link
className='system-sm-medium sticky bottom-0 z-10 flex h-8 cursor-pointer items-center rounded-b-lg border-[0.5px] border-t border-components-panel-border bg-components-panel-bg-blur px-4 py-1 text-text-accent-light-mode-only shadow-lg'
href={`${marketplaceUrlPrefix}/`}
href={`${MARKETPLACE_URL_PREFIX}/`}
target='_blank'
>
<span>{t('plugin.findMoreInMarketplace')}</span>
@@ -10,7 +10,7 @@ import type { ToolWithProvider } from '../../types'
import { BlockEnum } from '../../types'
import type { ToolDefaultValue, ToolValue } from '../types'
import { ViewType } from '../view-type-select'
import ActonItem from './action-item'
import ActionItem from './action-item'
import BlockIcon from '../../block-icon'
import { useTranslation } from 'react-i18next'

@@ -118,7 +118,7 @@ const Tool: FC<Props> = ({

{hasAction && !isFold && (
actions.map(action => (
<ActonItem
<ActionItem
key={action.name}
provider={payload}
payload={action}
@@ -408,7 +408,6 @@ export const NODE_WIDTH = 240
export const X_OFFSET = 60
export const NODE_WIDTH_X_OFFSET = NODE_WIDTH + X_OFFSET
export const Y_OFFSET = 39
export const MAX_TREE_DEPTH = 50
export const START_INITIAL_POSITION = { x: 80, y: 282 }
export const AUTO_LAYOUT_OFFSET = {
x: -42,

@@ -18,7 +18,6 @@ import {
} from '../utils'
import {
CUSTOM_NODE,
MAX_TREE_DEPTH,
} from '../constants'
import type { ToolNodeType } from '../nodes/tool/types'
import { useIsChatMode } from './use-workflow'
@@ -33,6 +32,7 @@ import { useDatasetsDetailStore } from '../datasets-detail-store/store'
import type { KnowledgeRetrievalNodeType } from '../nodes/knowledge-retrieval/types'
import type { DataSet } from '@/models/datasets'
import { fetchDatasets } from '@/service/datasets'
import { MAX_TREE_DEPTH } from '@/config'

export const useChecklist = (nodes: Node[], edges: Edge[]) => {
const { t } = useTranslation()
@@ -15,7 +15,7 @@ import { pluginManifestToCardPluginProps } from '@/app/components/plugins/instal
import { Badge as Badge2, BadgeState } from '@/app/components/base/badge/index'
import Link from 'next/link'
import { useTranslation } from 'react-i18next'
import { marketplaceUrlPrefix } from '@/config'
import { MARKETPLACE_URL_PREFIX } from '@/config'

export type SwitchPluginVersionProps = {
uniqueIdentifier: string
@@ -82,7 +82,7 @@ export const SwitchPluginVersion: FC<SwitchPluginVersionProps> = (props) => {
modalBottomLeft={
<Link
className='flex items-center justify-center gap-1'
href={`${marketplaceUrlPrefix}/plugins/${pluginDetail.declaration.author}/${pluginDetail.declaration.name}`}
href={`${MARKETPLACE_URL_PREFIX}/plugins/${pluginDetail.declaration.author}/${pluginDetail.declaration.name}`}
target='_blank'
>
<span className='system-xs-regular text-xs text-text-accent'>
@@ -37,7 +37,6 @@ const typeList = [
ChatVarType.ArrayString,
ChatVarType.ArrayNumber,
ChatVarType.ArrayObject,
ChatVarType.ArrayFile,
]

const objectPlaceholder = `# example
@@ -128,7 +127,6 @@ const ChatVariableModal = ({
case ChatVarType.ArrayString:
case ChatVarType.ArrayNumber:
case ChatVarType.ArrayObject:
case ChatVarType.ArrayFile:
return value?.filter(Boolean) || []
}
}
@@ -296,86 +294,84 @@ const ChatVariableModal = ({
</div>
</div>
{/* default value */}
{type !== ChatVarType.ArrayFile && (
<div className='mb-4'>
<div className='system-sm-semibold mb-1 flex h-6 items-center justify-between text-text-secondary'>
<div>{t('workflow.chatVariable.modal.value')}</div>
{(type === ChatVarType.ArrayString || type === ChatVarType.ArrayNumber) && (
<Button
variant='ghost'
size='small'
className='text-text-tertiary'
onClick={() => handleEditorChange(!editInJSON)}
>
{editInJSON ? <RiInputField className='mr-1 h-3.5 w-3.5' /> : <RiDraftLine className='mr-1 h-3.5 w-3.5' />}
{editInJSON ? t('workflow.chatVariable.modal.oneByOne') : t('workflow.chatVariable.modal.editInJSON')}
</Button>
)}
{type === ChatVarType.Object && (
<Button
variant='ghost'
size='small'
className='text-text-tertiary'
onClick={() => handleEditorChange(!editInJSON)}
>
{editInJSON ? <RiInputField className='mr-1 h-3.5 w-3.5' /> : <RiDraftLine className='mr-1 h-3.5 w-3.5' />}
{editInJSON ? t('workflow.chatVariable.modal.editInForm') : t('workflow.chatVariable.modal.editInJSON')}
</Button>
)}
</div>
<div className='flex'>
{type === ChatVarType.String && (
// Input will remove \n\r, so use Textarea just like description area
<textarea
className='system-sm-regular placeholder:system-sm-regular block h-20 w-full resize-none appearance-none rounded-lg border border-transparent bg-components-input-bg-normal p-2 caret-primary-600 outline-none placeholder:text-components-input-text-placeholder hover:border-components-input-border-hover hover:bg-components-input-bg-hover focus:border-components-input-border-active focus:bg-components-input-bg-active focus:shadow-xs'
value={value}
placeholder={t('workflow.chatVariable.modal.valuePlaceholder') || ''}
onChange={e => setValue(e.target.value)}
/>
)}
{type === ChatVarType.Number && (
<Input
placeholder={t('workflow.chatVariable.modal.valuePlaceholder') || ''}
value={value}
onChange={e => setValue(Number(e.target.value))}
type='number'
/>
)}
{type === ChatVarType.Object && !editInJSON && (
<ObjectValueList
list={objectValue}
onChange={setObjectValue}
/>
)}
{type === ChatVarType.ArrayString && !editInJSON && (
<ArrayValueList
isString
list={value || [undefined]}
onChange={setValue}
/>
)}
{type === ChatVarType.ArrayNumber && !editInJSON && (
<ArrayValueList
isString={false}
list={value || [undefined]}
onChange={setValue}
/>
)}
{editInJSON && (
<div className='w-full rounded-[10px] bg-components-input-bg-normal py-2 pl-3 pr-1' style={{ height: editorMinHeight }}>
<CodeEditor
isExpand
noWrapper
language={CodeLanguage.json}
value={editorContent}
placeholder={<div className='whitespace-pre'>{placeholder}</div>}
onChange={handleEditorValueChange}
/>
</div>
)}
</div>
<div className='mb-4'>
<div className='system-sm-semibold mb-1 flex h-6 items-center justify-between text-text-secondary'>
<div>{t('workflow.chatVariable.modal.value')}</div>
{(type === ChatVarType.ArrayString || type === ChatVarType.ArrayNumber) && (
<Button
variant='ghost'
size='small'
className='text-text-tertiary'
onClick={() => handleEditorChange(!editInJSON)}
>
{editInJSON ? <RiInputField className='mr-1 h-3.5 w-3.5' /> : <RiDraftLine className='mr-1 h-3.5 w-3.5' />}
{editInJSON ? t('workflow.chatVariable.modal.oneByOne') : t('workflow.chatVariable.modal.editInJSON')}
</Button>
)}
{type === ChatVarType.Object && (
<Button
variant='ghost'
size='small'
className='text-text-tertiary'
onClick={() => handleEditorChange(!editInJSON)}
>
{editInJSON ? <RiInputField className='mr-1 h-3.5 w-3.5' /> : <RiDraftLine className='mr-1 h-3.5 w-3.5' />}
{editInJSON ? t('workflow.chatVariable.modal.editInForm') : t('workflow.chatVariable.modal.editInJSON')}
</Button>
)}
</div>
)}
<div className='flex'>
{type === ChatVarType.String && (
// Input will remove \n\r, so use Textarea just like description area
<textarea
className='system-sm-regular placeholder:system-sm-regular block h-20 w-full resize-none appearance-none rounded-lg border border-transparent bg-components-input-bg-normal p-2 caret-primary-600 outline-none placeholder:text-components-input-text-placeholder hover:border-components-input-border-hover hover:bg-components-input-bg-hover focus:border-components-input-border-active focus:bg-components-input-bg-active focus:shadow-xs'
value={value}
placeholder={t('workflow.chatVariable.modal.valuePlaceholder') || ''}
onChange={e => setValue(e.target.value)}
/>
)}
{type === ChatVarType.Number && (
<Input
placeholder={t('workflow.chatVariable.modal.valuePlaceholder') || ''}
value={value}
onChange={e => setValue(Number(e.target.value))}
type='number'
/>
)}
{type === ChatVarType.Object && !editInJSON && (
<ObjectValueList
list={objectValue}
onChange={setObjectValue}
/>
)}
{type === ChatVarType.ArrayString && !editInJSON && (
<ArrayValueList
isString
list={value || [undefined]}
onChange={setValue}
/>
)}
{type === ChatVarType.ArrayNumber && !editInJSON && (
<ArrayValueList
isString={false}
list={value || [undefined]}
onChange={setValue}
/>
)}
{editInJSON && (
<div className='w-full rounded-[10px] bg-components-input-bg-normal py-2 pl-3 pr-1' style={{ height: editorMinHeight }}>
<CodeEditor
isExpand
noWrapper
language={CodeLanguage.json}
value={editorContent}
placeholder={<div className='whitespace-pre'>{placeholder}</div>}
onChange={handleEditorValueChange}
/>
</div>
)}
</div>
</div>
{/* description */}
<div className=''>
<div className='system-sm-semibold mb-1 flex h-6 items-center text-text-secondary'>{t('workflow.chatVariable.modal.description')}</div>

@@ -5,5 +5,4 @@ export enum ChatVarType {
ArrayString = 'array[string]',
ArrayNumber = 'array[number]',
ArrayObject = 'array[object]',
ArrayFile = 'array[file]',
}
@@ -9,6 +9,7 @@ import { ThemeProvider } from 'next-themes'
import './styles/globals.css'
import './styles/markdown.scss'
import GlobalPublicStoreProvider from '@/context/global-public-context'
import { DatasetAttr } from '@/types/feature'

export const viewport: Viewport = {
width: 'device-width',
@@ -25,6 +26,29 @@ const LocaleLayout = async ({
}) => {
const locale = await getLocaleOnServer()

const datasetMap: Record<DatasetAttr, string | undefined> = {
[DatasetAttr.DATA_API_PREFIX]: process.env.NEXT_PUBLIC_API_PREFIX,
[DatasetAttr.DATA_PUBLIC_API_PREFIX]: process.env.NEXT_PUBLIC_PUBLIC_API_PREFIX,
[DatasetAttr.DATA_MARKETPLACE_API_PREFIX]: process.env.NEXT_PUBLIC_MARKETPLACE_API_PREFIX,
[DatasetAttr.DATA_MARKETPLACE_URL_PREFIX]: process.env.NEXT_PUBLIC_MARKETPLACE_URL_PREFIX,
[DatasetAttr.DATA_PUBLIC_EDITION]: process.env.NEXT_PUBLIC_EDITION,
[DatasetAttr.DATA_PUBLIC_SUPPORT_MAIL_LOGIN]: process.env.NEXT_PUBLIC_SUPPORT_MAIL_LOGIN,
[DatasetAttr.DATA_PUBLIC_SENTRY_DSN]: process.env.NEXT_PUBLIC_SENTRY_DSN,
[DatasetAttr.DATA_PUBLIC_MAINTENANCE_NOTICE]: process.env.NEXT_PUBLIC_MAINTENANCE_NOTICE,
[DatasetAttr.DATA_PUBLIC_SITE_ABOUT]: process.env.NEXT_PUBLIC_SITE_ABOUT,
[DatasetAttr.DATA_PUBLIC_TEXT_GENERATION_TIMEOUT_MS]: process.env.NEXT_PUBLIC_TEXT_GENERATION_TIMEOUT_MS,
[DatasetAttr.DATA_PUBLIC_MAX_TOOLS_NUM]: process.env.NEXT_PUBLIC_MAX_TOOLS_NUM,
[DatasetAttr.DATA_PUBLIC_MAX_PARALLEL_LIMIT]: process.env.NEXT_PUBLIC_MAX_PARALLEL_LIMIT,
[DatasetAttr.DATA_PUBLIC_TOP_K_MAX_VALUE]: process.env.NEXT_PUBLIC_TOP_K_MAX_VALUE,
[DatasetAttr.DATA_PUBLIC_INDEXING_MAX_SEGMENTATION_TOKENS_LENGTH]: process.env.NEXT_PUBLIC_INDEXING_MAX_SEGMENTATION_TOKENS_LENGTH,
[DatasetAttr.DATA_PUBLIC_LOOP_NODE_MAX_COUNT]: process.env.NEXT_PUBLIC_LOOP_NODE_MAX_COUNT,
[DatasetAttr.DATA_PUBLIC_MAX_ITERATIONS_NUM]: process.env.NEXT_PUBLIC_MAX_ITERATIONS_NUM,
[DatasetAttr.DATA_PUBLIC_MAX_TREE_DEPTH]: process.env.NEXT_PUBLIC_MAX_TREE_DEPTH,
[DatasetAttr.DATA_PUBLIC_ENABLE_WEBSITE_JINAREADER]: process.env.NEXT_PUBLIC_ENABLE_WEBSITE_JINAREADER,
[DatasetAttr.DATA_PUBLIC_ENABLE_WEBSITE_FIRECRAWL]: process.env.NEXT_PUBLIC_ENABLE_WEBSITE_FIRECRAWL,
[DatasetAttr.DATA_PUBLIC_ENABLE_WEBSITE_WATERCRAWL]: process.env.NEXT_PUBLIC_ENABLE_WEBSITE_WATERCRAWL,
}

return (
<html lang={locale ?? 'en'} className="h-full" suppressHydrationWarning>
<head>
@@ -35,25 +59,7 @@ const LocaleLayout = async ({
</head>
<body
className="color-scheme h-full select-auto"
data-api-prefix={process.env.NEXT_PUBLIC_API_PREFIX}
data-pubic-api-prefix={process.env.NEXT_PUBLIC_PUBLIC_API_PREFIX}
data-marketplace-api-prefix={process.env.NEXT_PUBLIC_MARKETPLACE_API_PREFIX}
data-marketplace-url-prefix={process.env.NEXT_PUBLIC_MARKETPLACE_URL_PREFIX}
data-public-edition={process.env.NEXT_PUBLIC_EDITION}
data-public-support-mail-login={process.env.NEXT_PUBLIC_SUPPORT_MAIL_LOGIN}
data-public-sentry-dsn={process.env.NEXT_PUBLIC_SENTRY_DSN}
data-public-maintenance-notice={process.env.NEXT_PUBLIC_MAINTENANCE_NOTICE}
data-public-site-about={process.env.NEXT_PUBLIC_SITE_ABOUT}
data-public-text-generation-timeout-ms={process.env.NEXT_PUBLIC_TEXT_GENERATION_TIMEOUT_MS}
data-public-max-tools-num={process.env.NEXT_PUBLIC_MAX_TOOLS_NUM}
data-public-max-parallel-limit={process.env.NEXT_PUBLIC_MAX_PARALLEL_LIMIT}
data-public-top-k-max-value={process.env.NEXT_PUBLIC_TOP_K_MAX_VALUE}
data-public-indexing-max-segmentation-tokens-length={process.env.NEXT_PUBLIC_INDEXING_MAX_SEGMENTATION_TOKENS_LENGTH}
data-public-loop-node-max-count={process.env.NEXT_PUBLIC_LOOP_NODE_MAX_COUNT}
data-public-max-iterations-num={process.env.NEXT_PUBLIC_MAX_ITERATIONS_NUM}
data-public-enable-website-jinareader={process.env.NEXT_PUBLIC_ENABLE_WEBSITE_JINAREADER}
data-public-enable-website-firecrawl={process.env.NEXT_PUBLIC_ENABLE_WEBSITE_FIRECRAWL}
data-public-enable-website-watercrawl={process.env.NEXT_PUBLIC_ENABLE_WEBSITE_WATERCRAWL}
{...datasetMap}
>
<BrowserInitor>
<SentryInitor>
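The layout refactor above replaces twenty hand-written `data-*` attributes on `<body>` with one typed `Record` spread. In JSX, spreading a record keyed by attribute names emits one attribute per defined entry, and keys whose value is `undefined` (unset env vars) render no attribute at all. A minimal sketch of that behavior outside React, with a hypothetical subset of keys (the real keys come from the `DatasetAttr` enum in `@/types/feature`):

```typescript
// Hypothetical subset of the DatasetAttr keys; illustrative values.
const datasetMap: Record<string, string | undefined> = {
  'data-api-prefix': '/console/api',
  'data-public-edition': 'SELF_HOSTED',
  'data-public-sentry-dsn': undefined, // unset env vars simply drop out
}

// Roughly what the JSX spread produces: one attribute per defined entry.
const renderAttrs = (attrs: Record<string, string | undefined>): string =>
  Object.entries(attrs)
    .filter(([, v]) => v !== undefined)
    .map(([k, v]) => `${k}="${v}"`)
    .join(' ')

console.log(renderAttrs(datasetMap))
// → data-api-prefix="/console/api" data-public-edition="SELF_HOSTED"
```

Typing the record as `Record<DatasetAttr, string | undefined>` also makes the compiler flag any enum key that is missing from the map.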
@@ -2,7 +2,7 @@ import { useTranslation } from 'react-i18next'
import { useSearchParams } from 'next/navigation'
import style from '../page.module.css'
import Button from '@/app/components/base/button'
import { apiPrefix } from '@/config'
import { API_PREFIX } from '@/config'
import classNames from '@/utils/classnames'
import { getPurifyHref } from '@/utils'

@@ -15,7 +15,7 @@ export default function SocialAuth(props: SocialAuthProps) {
const searchParams = useSearchParams()

const getOAuthLink = (href: string) => {
const url = getPurifyHref(`${apiPrefix}${href}`)
const url = getPurifyHref(`${API_PREFIX}${href}`)
if (searchParams.has('invite_token'))
return `${url}?${searchParams.toString()}`
@@ -1,49 +1,44 @@
import { InputVarType } from '@/app/components/workflow/types'
import { AgentStrategy } from '@/types/app'
import { PromptRole } from '@/models/debug'
import { DatasetAttr } from '@/types/feature'

export let apiPrefix = ''
export let publicApiPrefix = ''
export let marketplaceApiPrefix = ''
export let marketplaceUrlPrefix = ''

// NEXT_PUBLIC_API_PREFIX=/console/api NEXT_PUBLIC_PUBLIC_API_PREFIX=/api npm run start
if (process.env.NEXT_PUBLIC_API_PREFIX && process.env.NEXT_PUBLIC_PUBLIC_API_PREFIX) {
apiPrefix = process.env.NEXT_PUBLIC_API_PREFIX
publicApiPrefix = process.env.NEXT_PUBLIC_PUBLIC_API_PREFIX
}
else if (
globalThis.document?.body?.getAttribute('data-api-prefix')
&& globalThis.document?.body?.getAttribute('data-pubic-api-prefix')
) {
// Not build can not get env from process.env.NEXT_PUBLIC_ in browser https://nextjs.org/docs/basic-features/environment-variables#exposing-environment-variables-to-the-browser
apiPrefix = globalThis.document.body.getAttribute('data-api-prefix') as string
publicApiPrefix = globalThis.document.body.getAttribute('data-pubic-api-prefix') as string
}
else {
// const domainParts = globalThis.location?.host?.split('.');
// in production env, the host is dify.app . In other env, the host is [dev].dify.app
// const env = domainParts.length === 2 ? 'ai' : domainParts?.[0];
apiPrefix = 'http://localhost:5001/console/api'
publicApiPrefix = 'http://localhost:5001/api' // avoid browser private mode api cross origin
marketplaceApiPrefix = 'http://localhost:5002/api'
const getBooleanConfig = (envVar: string | undefined, dataAttrKey: DatasetAttr, defaultValue: boolean = true) => {
if (envVar !== undefined && envVar !== '')
return envVar === 'true'
const attrValue = globalThis.document?.body?.getAttribute(dataAttrKey)
if (attrValue !== undefined && attrValue !== '')
return attrValue === 'true'
return defaultValue
}

if (process.env.NEXT_PUBLIC_MARKETPLACE_API_PREFIX && process.env.NEXT_PUBLIC_MARKETPLACE_URL_PREFIX) {
marketplaceApiPrefix = process.env.NEXT_PUBLIC_MARKETPLACE_API_PREFIX
marketplaceUrlPrefix = process.env.NEXT_PUBLIC_MARKETPLACE_URL_PREFIX
}
else {
marketplaceApiPrefix = globalThis.document?.body?.getAttribute('data-marketplace-api-prefix') || ''
marketplaceUrlPrefix = globalThis.document?.body?.getAttribute('data-marketplace-url-prefix') || ''
const getNumberConfig = (envVar: string | undefined, dataAttrKey: DatasetAttr, defaultValue: number) => {
if (envVar)
return Number.parseInt(envVar)

const attrValue = globalThis.document?.body?.getAttribute(dataAttrKey)
if (attrValue)
return Number.parseInt(attrValue)
return defaultValue
}

export const API_PREFIX: string = apiPrefix
export const PUBLIC_API_PREFIX: string = publicApiPrefix
export const MARKETPLACE_API_PREFIX: string = marketplaceApiPrefix
export const MARKETPLACE_URL_PREFIX: string = marketplaceUrlPrefix
const getStringConfig = (envVar: string | undefined, dataAttrKey: DatasetAttr, defaultValue: string) => {
if (envVar)
return envVar

const attrValue = globalThis.document?.body?.getAttribute(dataAttrKey)
if (attrValue)
return attrValue
return defaultValue
}

export const API_PREFIX = getStringConfig(process.env.NEXT_PUBLIC_API_PREFIX, DatasetAttr.DATA_API_PREFIX, 'http://localhost:5001/console/api')
export const PUBLIC_API_PREFIX = getStringConfig(process.env.NEXT_PUBLIC_PUBLIC_API_PREFIX, DatasetAttr.DATA_PUBLIC_API_PREFIX, 'http://localhost:5001/api')
export const MARKETPLACE_API_PREFIX = getStringConfig(process.env.NEXT_PUBLIC_MARKETPLACE_API_PREFIX, DatasetAttr.DATA_MARKETPLACE_API_PREFIX, 'http://localhost:5002/api')
export const MARKETPLACE_URL_PREFIX = getStringConfig(process.env.NEXT_PUBLIC_MARKETPLACE_URL_PREFIX, DatasetAttr.DATA_MARKETPLACE_URL_PREFIX, '')

const EDITION = getStringConfig(process.env.NEXT_PUBLIC_EDITION, DatasetAttr.DATA_PUBLIC_EDITION, 'SELF_HOSTED')

const EDITION = process.env.NEXT_PUBLIC_EDITION || globalThis.document?.body?.getAttribute('data-public-edition') || 'SELF_HOSTED'
export const IS_CE_EDITION = EDITION === 'SELF_HOSTED'
export const IS_CLOUD_EDITION = EDITION === 'CLOUD'
@@ -162,15 +157,6 @@ export const ANNOTATION_DEFAULT = {
score_threshold: 0.9,
}

export let maxToolsNum = 10

if (process.env.NEXT_PUBLIC_MAX_TOOLS_NUM && process.env.NEXT_PUBLIC_MAX_TOOLS_NUM !== '')
maxToolsNum = Number.parseInt(process.env.NEXT_PUBLIC_MAX_TOOLS_NUM)
else if (globalThis.document?.body?.getAttribute('data-public-max-tools-num') && globalThis.document.body.getAttribute('data-public-max-tools-num') !== '')
maxToolsNum = Number.parseInt(globalThis.document.body.getAttribute('data-public-max-tools-num') as string)

export const MAX_TOOLS_NUM = maxToolsNum

export const DEFAULT_AGENT_SETTING = {
enabled: false,
max_iteration: 10,
@@ -269,15 +255,6 @@ export const VAR_REGEX = /\{\{(#[a-zA-Z0-9_-]{1,50}(\.[a-zA-Z_]\w{0,29}){1,10}#)

export const resetReg = () => VAR_REGEX.lastIndex = 0

export let textGenerationTimeoutMs = 60000

if (process.env.NEXT_PUBLIC_TEXT_GENERATION_TIMEOUT_MS && process.env.NEXT_PUBLIC_TEXT_GENERATION_TIMEOUT_MS !== '')
textGenerationTimeoutMs = Number.parseInt(process.env.NEXT_PUBLIC_TEXT_GENERATION_TIMEOUT_MS)
else if (globalThis.document?.body?.getAttribute('data-public-text-generation-timeout-ms') && globalThis.document.body.getAttribute('data-public-text-generation-timeout-ms') !== '')
textGenerationTimeoutMs = Number.parseInt(globalThis.document.body.getAttribute('data-public-text-generation-timeout-ms') as string)

export const TEXT_GENERATION_TIMEOUT_MS = textGenerationTimeoutMs

export const DISABLE_UPLOAD_IMAGE_AS_ICON = process.env.NEXT_PUBLIC_DISABLE_UPLOAD_IMAGE_AS_ICON === 'true'

export const GITHUB_ACCESS_TOKEN = process.env.NEXT_PUBLIC_GITHUB_ACCESS_TOKEN || ''
@@ -286,32 +263,13 @@ export const SUPPORT_INSTALL_LOCAL_FILE_EXTENSIONS = '.difypkg,.difybndl'
export const FULL_DOC_PREVIEW_LENGTH = 50

export const JSON_SCHEMA_MAX_DEPTH = 10
let loopNodeMaxCount = 100

if (process.env.NEXT_PUBLIC_LOOP_NODE_MAX_COUNT && process.env.NEXT_PUBLIC_LOOP_NODE_MAX_COUNT !== '')
loopNodeMaxCount = Number.parseInt(process.env.NEXT_PUBLIC_LOOP_NODE_MAX_COUNT)
else if (globalThis.document?.body?.getAttribute('data-public-loop-node-max-count') && globalThis.document.body.getAttribute('data-public-loop-node-max-count') !== '')
loopNodeMaxCount = Number.parseInt(globalThis.document.body.getAttribute('data-public-loop-node-max-count') as string)
export const MAX_TOOLS_NUM = getNumberConfig(process.env.NEXT_PUBLIC_MAX_TOOLS_NUM, DatasetAttr.DATA_PUBLIC_MAX_TOOLS_NUM, 10)
export const TEXT_GENERATION_TIMEOUT_MS = getNumberConfig(process.env.NEXT_PUBLIC_TEXT_GENERATION_TIMEOUT_MS, DatasetAttr.DATA_PUBLIC_TEXT_GENERATION_TIMEOUT_MS, 60000)
export const LOOP_NODE_MAX_COUNT = getNumberConfig(process.env.NEXT_PUBLIC_LOOP_NODE_MAX_COUNT, DatasetAttr.DATA_PUBLIC_LOOP_NODE_MAX_COUNT, 100)
export const MAX_ITERATIONS_NUM = getNumberConfig(process.env.NEXT_PUBLIC_MAX_ITERATIONS_NUM, DatasetAttr.DATA_PUBLIC_MAX_ITERATIONS_NUM, 99)
export const MAX_TREE_DEPTH = getNumberConfig(process.env.NEXT_PUBLIC_MAX_TREE_DEPTH, DatasetAttr.DATA_PUBLIC_MAX_TREE_DEPTH, 50)

export const LOOP_NODE_MAX_COUNT = loopNodeMaxCount

let maxIterationsNum = 99

if (process.env.NEXT_PUBLIC_MAX_ITERATIONS_NUM && process.env.NEXT_PUBLIC_MAX_ITERATIONS_NUM !== '')
maxIterationsNum = Number.parseInt(process.env.NEXT_PUBLIC_MAX_ITERATIONS_NUM)
else if (globalThis.document?.body?.getAttribute('data-public-max-iterations-num') && globalThis.document.body.getAttribute('data-public-max-iterations-num') !== '')
maxIterationsNum = Number.parseInt(globalThis.document.body.getAttribute('data-public-max-iterations-num') as string)

export const MAX_ITERATIONS_NUM = maxIterationsNum

export const ENABLE_WEBSITE_JINAREADER = process.env.NEXT_PUBLIC_ENABLE_WEBSITE_JINAREADER !== undefined
? process.env.NEXT_PUBLIC_ENABLE_WEBSITE_JINAREADER === 'true'
: globalThis.document?.body?.getAttribute('data-public-enable-website-jinareader') === 'true' || true

export const ENABLE_WEBSITE_FIRECRAWL = process.env.NEXT_PUBLIC_ENABLE_WEBSITE_FIRECRAWL !== undefined
? process.env.NEXT_PUBLIC_ENABLE_WEBSITE_FIRECRAWL === 'true'
: globalThis.document?.body?.getAttribute('data-public-enable-website-firecrawl') === 'true' || true

export const ENABLE_WEBSITE_WATERCRAWL = process.env.NEXT_PUBLIC_ENABLE_WEBSITE_WATERCRAWL !== undefined
? process.env.NEXT_PUBLIC_ENABLE_WEBSITE_WATERCRAWL === 'true'
: globalThis.document?.body?.getAttribute('data-public-enable-website-watercrawl') === 'true' || true
export const ENABLE_WEBSITE_JINAREADER = getBooleanConfig(process.env.NEXT_PUBLIC_ENABLE_WEBSITE_JINAREADER, DatasetAttr.DATA_PUBLIC_ENABLE_WEBSITE_JINAREADER, true)
export const ENABLE_WEBSITE_FIRECRAWL = getBooleanConfig(process.env.NEXT_PUBLIC_ENABLE_WEBSITE_FIRECRAWL, DatasetAttr.DATA_PUBLIC_ENABLE_WEBSITE_FIRECRAWL, true)
export const ENABLE_WEBSITE_WATERCRAWL = getBooleanConfig(process.env.NEXT_PUBLIC_ENABLE_WEBSITE_WATERCRAWL, DatasetAttr.DATA_PUBLIC_ENABLE_WEBSITE_WATERCRAWL, false)
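The central refactor in this hunk replaces per-value `let` variables plus `if/else` blocks with shared helpers that resolve each setting in a fixed order: `NEXT_PUBLIC_*` env var first, then a `<body>` data attribute, then a hard-coded default. A minimal standalone sketch of that resolution order (the `resolveNumber` name is mine, and `readAttr` stands in for `globalThis.document?.body?.getAttribute` so the sketch runs outside a browser):

```typescript
// Sketch of the env var -> body data attribute -> default lookup order
// used by getNumberConfig/getBooleanConfig/getStringConfig in this diff.
const resolveNumber = (
  envVar: string | undefined,
  readAttr: () => string | null, // stand-in for document.body.getAttribute
  defaultValue: number,
): number => {
  if (envVar) // truthy check, so '' and undefined both fall through
    return Number.parseInt(envVar)
  const attrValue = readAttr()
  if (attrValue)
    return Number.parseInt(attrValue)
  return defaultValue
}

console.log(resolveNumber('15', () => '20', 10)) // 15: env var wins
console.log(resolveNumber(undefined, () => '20', 10)) // 20: attribute fallback
console.log(resolveNumber(undefined, () => null, 10)) // 10: default applies
```

Because the helpers use truthiness rather than a strict `undefined` check, an empty-string env var also falls through to the data attribute, which matches the old per-value code's explicit `!== ''` guards.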
@@ -35,4 +35,5 @@ export NEXT_PUBLIC_ENABLE_WEBSITE_WATERCRAWL=${ENABLE_WEBSITE_WATERCRAWL:-true}
export NEXT_PUBLIC_LOOP_NODE_MAX_COUNT=${LOOP_NODE_MAX_COUNT}
export NEXT_PUBLIC_MAX_PARALLEL_LIMIT=${MAX_PARALLEL_LIMIT}
export NEXT_PUBLIC_MAX_ITERATIONS_NUM=${MAX_ITERATIONS_NUM}
export NEXT_PUBLIC_MAX_TREE_DEPTH=${MAX_TREE_DEPTH}
pm2 start /app/web/server.js --name dify-web --cwd /app/web -i ${PM2_INSTANCES} --no-daemon
@@ -1,12 +1,12 @@
const translation = {
daysInWeek: {
Tue: '星期二',
Wed: '星期三',
Fri: '自由',
Mon: '懷念',
Sun: '太陽',
Sat: '星期六',
Thu: '星期四',
Sun: '日',
Mon: '一',
Tue: '二',
Wed: '三',
Thu: '四',
Fri: '五',
Sat: '六',
},
months: {
January: '一月',
@@ -1,4 +1,4 @@
import { apiPrefix } from '@/config'
import { API_PREFIX } from '@/config'
import { fetchWithRetry } from '@/utils'

const LOCAL_STORAGE_KEY = 'is_other_tab_refreshing'
@@ -46,7 +46,7 @@ async function getNewAccessToken(timeout: number): Promise<void> {
// it can lead to an infinite loop if the refresh attempt also returns 401.
// To avoid this, handle token refresh separately in a dedicated function
// that does not call baseFetch and uses a single retry mechanism.
const [error, ret] = await fetchWithRetry(globalThis.fetch(`${apiPrefix}/refresh-token`, {
const [error, ret] = await fetchWithRetry(globalThis.fetch(`${API_PREFIX}/refresh-token`, {
method: 'POST',
headers: {
'Content-Type': 'application/json;utf-8',
@@ -386,7 +386,7 @@ html[data-theme="dark"] {
--color-background-gradient-bg-fill-chat-bg-2: #1d1d20;
--color-background-gradient-bg-fill-chat-bubble-bg-1: #c8ceda14;
--color-background-gradient-bg-fill-chat-bubble-bg-2: #c8ceda05;
--color-background-gradient-bg-fill-chat-bubble-bg-3: #a5bddb;
--color-background-gradient-bg-fill-chat-bubble-bg-3: #27314d;
--color-background-gradient-bg-fill-debug-bg-1: #c8ceda14;
--color-background-gradient-bg-fill-debug-bg-2: #18181b0a;
@@ -97,3 +97,26 @@ export const defaultSystemFeatures: SystemFeatures = {
allow_email_password_login: false,
},
}

export enum DatasetAttr {
DATA_API_PREFIX = 'data-api-prefix',
DATA_PUBLIC_API_PREFIX = 'data-public-api-prefix',
DATA_MARKETPLACE_API_PREFIX = 'data-marketplace-api-prefix',
DATA_MARKETPLACE_URL_PREFIX = 'data-marketplace-url-prefix',
DATA_PUBLIC_EDITION = 'data-public-edition',
DATA_PUBLIC_SUPPORT_MAIL_LOGIN = 'data-public-support-mail-login',
DATA_PUBLIC_SENTRY_DSN = 'data-public-sentry-dsn',
DATA_PUBLIC_MAINTENANCE_NOTICE = 'data-public-maintenance-notice',
DATA_PUBLIC_SITE_ABOUT = 'data-public-site-about',
DATA_PUBLIC_TEXT_GENERATION_TIMEOUT_MS = 'data-public-text-generation-timeout-ms',
DATA_PUBLIC_MAX_TOOLS_NUM = 'data-public-max-tools-num',
DATA_PUBLIC_MAX_PARALLEL_LIMIT = 'data-public-max-parallel-limit',
DATA_PUBLIC_TOP_K_MAX_VALUE = 'data-public-top-k-max-value',
DATA_PUBLIC_INDEXING_MAX_SEGMENTATION_TOKENS_LENGTH = 'data-public-indexing-max-segmentation-tokens-length',
DATA_PUBLIC_LOOP_NODE_MAX_COUNT = 'data-public-loop-node-max-count',
DATA_PUBLIC_MAX_ITERATIONS_NUM = 'data-public-max-iterations-num',
DATA_PUBLIC_MAX_TREE_DEPTH = 'data-public-max-tree-depth',
DATA_PUBLIC_ENABLE_WEBSITE_JINAREADER = 'data-public-enable-website-jinareader',
DATA_PUBLIC_ENABLE_WEBSITE_FIRECRAWL = 'data-public-enable-website-firecrawl',
DATA_PUBLIC_ENABLE_WEBSITE_WATERCRAWL = 'data-public-enable-website-watercrawl',
}
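Each `DatasetAttr` value names the `<body>` data attribute that the web container's entrypoint mirrors from a `NEXT_PUBLIC_*` env var. For the `data-public-*` keys the two names follow a mechanical convention; the `envVarFor` helper below is hypothetical (not part of this change) and only illustrates that naming pairing:

```typescript
// Hypothetical illustration of the data-attribute <-> env-var naming
// convention for the data-public-* keys. Not part of the diff itself.
enum DatasetAttr {
  DATA_PUBLIC_MAX_TOOLS_NUM = 'data-public-max-tools-num',
  DATA_PUBLIC_LOOP_NODE_MAX_COUNT = 'data-public-loop-node-max-count',
}

// 'data-public-max-tools-num' -> 'NEXT_PUBLIC_MAX_TOOLS_NUM'
const envVarFor = (attr: DatasetAttr): string =>
  `NEXT_${String(attr).replace(/^data-/, '').replace(/-/g, '_').toUpperCase()}`

console.log(envVarFor(DatasetAttr.DATA_PUBLIC_MAX_TOOLS_NUM)) // NEXT_PUBLIC_MAX_TOOLS_NUM
```

Note the convention is not universal: `DATA_API_PREFIX` maps to `NEXT_PUBLIC_API_PREFIX`, not a `NEXT_API_PREFIX`, so the sketch only covers the `data-public-*` family.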