Compare commits


1557 Commits

Author SHA1 Message Date
37f7d5732a external knowledge api 2024-09-18 15:29:30 +08:00
dcb033d221 Merge branch 'main' into feat/external-knowledge
# Conflicts:
#	api/core/rag/datasource/retrieval_service.py
#	api/models/dataset.py
#	api/services/dataset_service.py
2024-09-18 14:40:43 +08:00
9f894bb3b3 external knowledge api 2024-09-18 14:36:51 +08:00
Qun cf645c3ba1 feat: Add ComfyUI tool for Stable Diffusion (#8160) 2024-09-18 10:56:29 +08:00
e896d1e9d7 chore: update the .gitignore file to include opensearch,pgvector,and myscale (#8470) 2024-09-17 22:54:22 +08:00
6dba68f62d feat: Add base URL settings and secure_ascii options to the Brave search tool (#8463)
Co-authored-by: crazywoola <427733928@qq.com>
2024-09-15 17:38:43 +08:00
3d083b758f feat: add flux dev of siliconflow image-gen tool (#8450) 2024-09-15 17:14:12 +08:00
aa5b2db10a chore: workflow BRANCH, PARALLEL i18n (#8452) 2024-09-15 17:13:39 +08:00
b73faae0d0 fix(RunOnce): change to form submission instead of onKeyDown and onClick (#8460) 2024-09-15 17:09:47 +08:00
4788e1c8c8 [Python SDK] Add KnowledgeBaseClient and the corresponding test cases. (#8465)
Co-authored-by: Wang Ying <wangying@xkool.org>
2024-09-15 17:08:52 +08:00
bf16de50fe fix: internal error when tool authorization (#8449) 2024-09-14 21:50:02 +08:00
7e611ffbf3 multi-retrieval uses dataset's top-k (#8416) 2024-09-14 21:48:44 +08:00
65162a87b6 fix:docker-compose.middleware.yaml start the Weaviate container by default (#8446) (#8447) 2024-09-14 21:48:24 +08:00
445497cf89 add svg render & Image preview optimization (#8387)
Co-authored-by: crazywoola <427733928@qq.com>
2024-09-14 19:24:53 +08:00
fa1af8e47b add WorkflowClient.get_result, increase version number (#8435)
Co-authored-by: wangying <wangying@xkool.org>
2024-09-14 19:06:37 +08:00
624331472a fix: Improve scrolling behavior for Conversation Opener (#8437)
Co-authored-by: crazywoola <427733928@qq.com>
2024-09-14 19:05:19 +08:00
72b7f8a949 Bugfix/fix feishu plugins (#8443)
Co-authored-by: 黎斌 <libin.23@bytedance.com>
2024-09-14 18:59:06 +08:00
88c9834ef2 chore(workflow): Optimize the iteration when selecting a variable from a branch in the output variable causes iteration index err (#8440) 2024-09-14 18:02:43 +08:00
d882348f39 fix: delete the delay for the tooltips inside the add tool panel (#8436) 2024-09-14 17:24:31 +08:00
b6ad7a1e06 Fix: https://github.com/langgenius/dify/issues/8190 (Update Model nam… (#8426)
Co-authored-by: Yuanbo Li <ybalbert@amazon.com>
2024-09-14 17:14:18 +08:00
6f7625fa47 chore: update Jina embedding model (#8376) 2024-09-14 16:21:17 +08:00
de7bc22649 fix: sys_var startwith 'sys.' not 'sys' #8421 (#8422)
Co-authored-by: wuling <wuling@ke.com>
2024-09-14 15:16:12 +08:00
52857dc0a6 feat: allow users to specify timeout for text generations and workflows by environment variable (#8395) 2024-09-14 14:11:45 +08:00
032dd93b2f Fix: operation position of answer in logs (#8411)
Co-authored-by: Yi <yxiaoisme@gmail.com>
2024-09-14 14:08:31 +08:00
5b18e851d2 fix: when the variable does not exist, an error should be prompted (#8413)
Co-authored-by: Chen(MAC) <chenchen404@outlook.com>
2024-09-14 14:08:10 +08:00
f01602b570 fix(workflow): the answer node after the iteration node containing the answer was output prematurely (#8419) 2024-09-14 14:02:09 +08:00
0123498452 fix:logs and rm unused codes in CacheEmbedding (#8409) 2024-09-14 12:56:45 +08:00
f55e06d8bf fix: resolve runtime error when self.folder is None (#8401)
Co-authored-by: 陈长君 <chenchangjun@shuwen.com>
2024-09-14 11:07:16 +08:00
b613b11422 Fix: Support Bedrock cross region inference #8190 (Update Model name to distinguish between different region groups) (#8402)
Co-authored-by: Yuanbo Li <ybalbert@amazon.com>
2024-09-14 11:06:20 +08:00
8efae1cba2 fix(docker): aliyun oss path env key (#8394) 2024-09-14 09:52:59 +08:00
bf55b1910f fix: pyproject.toml typo (#8396) 2024-09-14 09:45:49 +08:00
71b4480c4a fix: o1-mini 65563 -> 65536 (#8388) 2024-09-14 02:39:58 +08:00
b6b1057a18 fix: sandbox issue related httpx and requests (#8397) 2024-09-14 02:02:55 +08:00
5b98acde2f chore: improve usage of stripping prefix or suffix of string with Ruff 0.6.5 (#8392) 2024-09-13 23:34:39 +08:00
aad6f340b3 fix (#8322 followup): resolve the violation of pylint rules (#8391) 2024-09-13 23:19:36 +08:00
a1104ab97e chore: refurbish python code by applying Pylint linter rules (#8322) 2024-09-13 22:42:08 +08:00
1ab81b4972 support hunyuan-turbo (#8372)
Co-authored-by: sunkesi <sunkesi@hosecloud.com>
2024-09-13 20:21:48 +08:00
06b66216d7 chore: update firecrawl scrape to V1 api (#8367) 2024-09-13 20:02:00 +08:00
cd3eaed335 fix(workflow): both parallel and single branch errors occur in if-else (#8378) 2024-09-13 19:55:54 +08:00
9d80d7def7 fix: edit load balancing not pass id (#8370) 2024-09-13 17:15:03 +08:00
Joe 84ac5ccc8f fix: add before send to remove langfuse defaultErrorResponse (#8361) 2024-09-13 16:08:08 +08:00
5dfd7abb2b fix: when edit load balancing config not pass the empty filed value hidden (#8366) 2024-09-13 16:05:26 +08:00
24af4b9313 fix: o1-series model encounters an error when the generate mode is blocking (#8363) 2024-09-13 15:37:54 +08:00
6613b8f2e0 chore: fix unnecessary string concatenation in single line (#8311) 2024-09-13 14:24:49 +08:00
08c486452f fix: score_threshold handling in vector search methods (#8356) 2024-09-13 14:24:35 +08:00
a45ac6ab98 fix: ark token usage is none (#8351) 2024-09-13 14:19:24 +08:00
80a322aaa2 chore: update version to 0.8.2 in packaging and docker-compose files (#8352) 2024-09-13 13:45:13 +08:00
Joe 82f7875a52 feat: add langfuse sentry ignore error (#8353) 2024-09-13 13:44:19 +08:00
4637ddaa7f feat: add o1-series models support in Agent App (ReACT only) (#8350) 2024-09-13 13:08:27 +08:00
8d2269f762 fix: copy and paste shortcut in the textarea of the workflow run panel (#8345) 2024-09-13 12:20:56 +08:00
5f03e66489 Feature/service api workflow logs (#8323) 2024-09-13 11:03:57 +08:00
a9c1f1a041 fix(workflow): fix var-selector not update when edges change (#8259)
Co-authored-by: Chen(MAC) <chenchen404@outlook.com>
2024-09-13 11:03:39 +08:00
49cee773c5 fixed score threshold is none (#8342) 2024-09-13 10:21:58 +08:00
89e81873c4 merge error 2024-09-13 09:49:24 +08:00
c78828ab7c chore: update Dify version to 0.8.1 (#8329) 2024-09-13 02:48:24 +08:00
e90d3c29ab feat: add OpenAI o1 series models support (#8328) 2024-09-13 02:15:19 +08:00
153807f243 fix: response_format label (#8326) 2024-09-12 23:17:29 +08:00
5db0b56c5b docs: update lambda_translate_utils.yaml (#8293) 2024-09-12 20:33:07 +08:00
404db1ae5b Fix VariableEntityType Bug external-data-tool -> external_data_tool (#8299) 2024-09-12 20:27:55 +08:00
02c4b1af71 chore:add Azure openai api version 2024-08-01-preview (#8291) 2024-09-12 20:22:57 +08:00
aa11659062 Revert "Feat: update app published time after clicking publish button" (#8320) 2024-09-12 20:06:06 +08:00
d4985fb3aa Fix: Support Bedrock cross region inference [#8190](https://github.com/langgenius/dify/issues/8190) (#8317) 2024-09-12 19:15:20 +08:00
8815511ccb chore: apply flake8-pytest-style linter rules (#8307) 2024-09-12 18:09:16 +08:00
40fb4d16ef chore: refurbish Python code by applying refurb linter rules (#8296) 2024-09-12 15:50:49 +08:00
c69f5b07ba chore: apply ruff E501 line-too-long linter rule (#8275)
Co-authored-by: -LAN- <laipz8200@outlook.com>
2024-09-12 14:00:36 +08:00
56c90e212a fix(workflow): missing content in the answer node stream output during iterations (#8292)
Co-authored-by: -LAN- <laipz8200@outlook.com>
2024-09-12 13:59:48 +08:00
0f14873255 chore: cleanup ruff flake8-simplify linter rules (#8286)
Co-authored-by: -LAN- <laipz8200@outlook.com>
2024-09-12 12:55:45 +08:00
0bb7569d46 fix: markdown paragraph margin (#8289) 2024-09-12 11:28:14 +08:00
ec57922bb6 fix(workflow/hooks/use-shortcuts): resolve issue of copy shortcut not working in workflow debug and preview panel (#8249)
Co-authored-by: Yi <yxiaoisme@gmail.com>
2024-09-12 10:39:18 +08:00
781d294f49 chore: cleanup pycodestyle E rules (#8269) 2024-09-11 18:55:00 +08:00
f515af2232 let claude models in bedrock support the response_format parameter (#8220)
Co-authored-by: duyalei <>
2024-09-11 18:24:50 +08:00
fe8191b899 enhance: improve empty data display for detail panel (#8266) 2024-09-11 18:24:18 +08:00
4d2cd6703b chore: remove useless code (#8198) 2024-09-11 18:19:34 +08:00
9ca0e56a8a external dataset binding 2024-09-11 16:59:19 +08:00
292220c596 chore: apply pep8-naming rules for naming convention (#8261) 2024-09-11 16:40:52 +08:00
53f37a6704 fix:ollama text embedding 500 error (#8252) 2024-09-11 16:23:19 +08:00
75c1a82556 Update Gitlab query field, add query by path (#8244) 2024-09-11 16:09:53 +08:00
c5b3777d93 editor can also create api key (#8214) 2024-09-11 16:07:15 +08:00
678bbf8fe8 fix: upload img icon mis-align in the chat input area (#8263) 2024-09-11 15:58:20 +08:00
342607f4a4 fix: truthy value (#8208) 2024-09-11 15:44:53 +08:00
5f4cdd66fa fix(workflow): IF-ELSE nodes connected to the same subsequent node cause execution to stop (#8247) 2024-09-11 12:28:32 +08:00
91942e37ff fix: workflow parallel limit in ifelse node (#8242) 2024-09-11 11:30:33 +08:00
60913970dc fix: CHECK_UPDATE_URL comment (#8235) 2024-09-11 10:58:35 +08:00
82c42b9ec5 fix:error when adding the ollama embedding model (#8236)
Co-authored-by: crazywoola <427733928@qq.com>
2024-09-11 10:25:45 +08:00
2a3d8c25bc fix: improving the regionalization of translation (#8231)
Co-authored-by: crazywoola <427733928@qq.com>
2024-09-11 08:55:32 +08:00
cee0c51dbb feat: add from_variable_selector for stream chunk / message event (#8228) 2024-09-10 22:15:50 +08:00
fdbbdb706f fix(workflow): answers are output simultaneously across different branches in the question classifier node. (#8225) 2024-09-10 21:11:35 +08:00
f6dfe23cf8 fix(workflow): in multi-parallel execution with multiple conditional branches (#8221) 2024-09-10 21:09:18 +08:00
ffd4bf8bf0 fix(docker/docker-compose.yaml): Set default value for REDIS_SENTINEL_SOCKET_TIMEOUT and CELERY_SENTINEL_SOCKET_TIMEOUT (#8218) 2024-09-10 18:47:59 +08:00
bb3002b173 revert page column (#8217) 2024-09-10 18:21:22 +08:00
d4dc54447a fix the tooltip in tool nodes (#8215) 2024-09-10 17:53:44 +08:00
d109881410 chore(api/models): apply ruff reformatting (#7600)
Co-authored-by: -LAN- <laipz8200@outlook.com>
2024-09-10 17:08:06 +08:00
d1605952b0 fix: input chat input wrong padding (#8207) 2024-09-10 17:01:32 +08:00
2cf1187b32 chore(api/core): apply ruff reformatting (#7624) 2024-09-10 17:00:20 +08:00
178730266d chore: translate i18n files (#8202)
Co-authored-by: takatost <5485478+takatost@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-09-10 16:13:26 +08:00
dabfd74622 feat: Parallel Execution of Nodes in Workflows (#8192)
Co-authored-by: StyleZhang <jasonapring2015@outlook.com>
Co-authored-by: Yi <yxiaoisme@gmail.com>
Co-authored-by: -LAN- <laipz8200@outlook.com>
2024-09-10 15:23:16 +08:00
5da0182800 docs: replace docker-compose with docker compose (#8195) 2024-09-10 15:02:52 +08:00
ed37439ef7 refactor(api/core): Improve type hints and apply ruff formatter in agent runner and model manager. (#8166) 2024-09-10 15:00:25 +08:00
af92f19291 filter excel empty sheet (#8194) 2024-09-10 14:55:08 +08:00
86f7f245e4 fix: The length of the tag should be between 1 and 50 (#8187) (#8188) 2024-09-10 14:07:06 +08:00
2d690801d1 nvidia rerank top n missed (#8185) 2024-09-10 13:17:48 +08:00
fede54be77 fix: Version '2.6.2-2' for 'expat' was not found (#8182) 2024-09-10 13:00:37 +08:00
85ff82a694 code merge error (#8183)
Co-authored-by: crazywoola <427733928@qq.com>
2024-09-10 12:52:50 +08:00
c8df92d0eb add volcengine tos storage (#8164) 2024-09-10 09:19:47 +08:00
144d30d7ef chore: bump super-linter to v7 (#8148) 2024-09-10 09:13:48 +08:00
4313d92e6b feat(api/core/model_runtime/entities/defaults.py): Add TOP_K in default parameters. (#8167) 2024-09-10 09:11:31 +08:00
0695543f63 Fix variable typo (cont) (#8161) 2024-09-09 23:46:13 +08:00
0bec6a037c update qwen-long (#8157) 2024-09-09 19:09:42 +08:00
3ff9a1f24a Update LICENSE - remove 'SaaS' from restriction term definition (#8143) 2024-09-09 16:52:55 +08:00
a771eea4f6 fix: html raw render (#8138) 2024-09-09 16:12:59 +08:00
e7c77d961b Merge branch 'main' into feat/external-knowledge
# Conflicts:
#	api/controllers/console/auth/data_source_oauth.py
2024-09-09 15:54:43 +08:00
61a0ca9e0d chore: translate i18n files (#8135)
Co-authored-by: zxhlyh <16177003+zxhlyh@users.noreply.github.com>
Co-authored-by: crazywoola <427733928@qq.com>
2024-09-09 15:54:00 +08:00
551b33c8e5 fix: user-select style and pre-create iframe in embed.js (#8093) 2024-09-09 15:40:56 +08:00
fa34b9aed6 Modify model parameters in Spark LLMs and zhipuai LLMs (#8078)
Co-authored-by: Charlie.Wei <luowei@cvte.com>
2024-09-09 15:36:47 +08:00
bbb609179f chore: offline n to 1 retrieval (#8134) 2024-09-09 15:32:02 +08:00
a27d4d58ec fix: ollama text embedding 500 error (#8131) 2024-09-09 15:27:49 +08:00
50d92f0fd4 add dify-sandbox health check in docker-compose.yaml (#8121) (#8124) 2024-09-09 14:39:06 +08:00
a15791e788 Fix: tongyi code wrapper is not stable (#7871)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: crazywoola <427733928@qq.com>
2024-09-09 11:15:17 +08:00
954580a4af feat: support more model types and builtin tools on aws/sagemaker (#8061)
Co-authored-by: Yuanbo Li <ybalbert@amazon.com>
2024-09-09 10:34:11 +08:00
ab7d79275e fix: Claude can not validate credentials (#8109) 2024-09-09 10:22:42 +08:00
d3658166fb Translate billing to PT-BR (#8105)
Co-authored-by: crazywoola <427733928@qq.com>
2024-09-09 10:16:22 +08:00
54b72bdd0a chore: keep dify compose file consistent format (#8102) 2024-09-09 08:30:03 +08:00
d28446301f feat:add fishaudio in xinference (#8100) 2024-09-08 23:58:02 +08:00
9050f92e5b fix: parameter input (#8076) 2024-09-08 15:43:55 +08:00
feefeb44d7 fix LangSmith project config error (#7996) 2024-09-08 13:25:27 +08:00
Zhi d542b15cc0 feat: support redis sentinel mode (#7756) 2024-09-08 13:23:51 +08:00
2d7954c7da Fix variable typo (#8084) 2024-09-08 13:14:11 +08:00
b1918dae5e fix: knowledge input (#8065) 2024-09-07 17:53:39 +08:00
031a0b576d fix: i18n typo (#8077) 2024-09-07 16:59:38 +08:00
0cef25ef8c Revert "fix: parameter rule" (#8070) 2024-09-07 10:44:56 +08:00
cdb08be951 fix: overflow issues in chat history (#8062) 2024-09-06 19:20:18 +08:00
900fd82a92 fix: parameter rule (#8064) 2024-09-06 19:15:24 +08:00
44f963f281 If else add regexmatch (#8059)
Co-authored-by: 罗威 <luowei@cvte.com>
2024-09-06 18:35:51 +08:00
01858e1caf ifElse node add regex match (#8007) 2024-09-06 17:44:09 +08:00
2060db8e11 fix: change milvus init args from (host, port) to (url, token) (#8019)
Signed-off-by: ChengZi <chen.zhang@zilliz.com>
2024-09-06 17:32:48 +08:00
9ded063417 chore: #7348, support query conversations by updated_at (#8047) 2024-09-06 17:31:51 +08:00
d72da2777c fix the tooltip in tools node (#8055) 2024-09-06 17:28:22 +08:00
89aede80cc Add OCI(Oracle Cloud Infrastructure) Generative AI Service as a Model Provider (#7775)
Co-authored-by: Walter Jin <jinshuhaicc@gmail.com>
Co-authored-by: crazywoola <427733928@qq.com>
Co-authored-by: walter from vm <walter.jin@oracle.com>
2024-09-06 14:15:40 +08:00
e0d3cd91c6 support huawei cloud obs storage (#7980) (#7981) 2024-09-06 14:00:47 +08:00
1a054ac1f4 Update milvus-standalone version and expose required ports for the container. (#7709)
Co-authored-by: Jyong <76649700+JohnJyong@users.noreply.github.com>
Co-authored-by: crazywoola <427733928@qq.com>
2024-09-06 12:01:59 +08:00
3230f4a0ec Message rendering (#6868)
Co-authored-by: luowei <glpat-EjySCyNjWiLqAED-YmwM>
Co-authored-by: crazywoola <427733928@qq.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-09-05 21:00:09 +08:00
dadca0f91a Fix/datasets api description error (#8025) 2024-09-05 16:45:44 +08:00
d489b8b3e0 feat: return page number of pdf documents upon retrieval (#7749) 2024-09-05 16:43:26 +08:00
bd0992275c feat: support fish audio TTS (#7982) 2024-09-05 14:18:39 +08:00
3e7597f2bd feat: add gpt-4o-2024-08-06 and json_schema for azure openAI service (#7648) 2024-09-04 21:56:08 +08:00
0e71f6db84 fix splitter length missed (#7987) 2024-09-04 21:47:12 +08:00
f6b9982c23 Fix the exception when obtaining the token during concurrent calls to the Wenxin model (#7976)
Co-authored-by: puqs1 <puqs1@lenovo.com>
2024-09-04 21:44:57 +08:00
fb113a9479 chore: translate i18n files (#7965)
Co-authored-by: JohnJyong <76649700+JohnJyong@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: Hanqing Zhao <sherry9277@gmail.com>
Co-authored-by: crazywoola <427733928@qq.com>
2024-09-04 17:45:12 +08:00
15791510c8 fix wrong error message (#7972) 2024-09-04 16:46:41 +08:00
0f72a8e89d chore: refactor the baichuan model (#7953) 2024-09-04 16:22:31 +08:00
14af87527f Feat:remove estimation of embedding cost (#7950)
Co-authored-by: jyong <718720800@qq.com>
2024-09-04 14:41:47 +08:00
83e84865be feat: add health check for pg and redis in docker-compose.middleware.yaml (#7961) (#7962) 2024-09-04 14:25:46 +08:00
c2a3c5a748 fix: get commit sha failed in translate action (#7959) 2024-09-04 13:13:21 +08:00
83494cb4f5 fix:empty voice occurs when xinference CosyVoice tts model (#7958) 2024-09-04 13:04:31 +08:00
0bc19c3fbf Feat: update app published time after clicking publish button (#7801) 2024-09-04 13:03:06 +08:00
571415d1a4 fix: split text keep separator (#7930) 2024-09-04 12:59:10 +08:00
7b2cf8215f chore: fix inverted index japanese translation (#7957) 2024-09-04 12:44:59 +08:00
Joe fee4d3f6ca feat: ops trace add llm model (#7306) 2024-09-04 10:39:00 +08:00
161cc0cda9 Revert "fix: an issue of keyword search feature in application log list" (#7949) 2024-09-04 10:00:55 +08:00
71bff9fcf3 chore: #7943 i18n (#7948) 2024-09-04 09:42:25 +08:00
80d14c9b22 fix(api): Code-Based Extension cause error on position map sorting (#7934)
Signed-off-by: 陳鈞 <jim60105@gmail.com>
2024-09-04 08:41:12 +08:00
c5bdf08558 Chore/add roadmap (#7943) 2024-09-04 08:33:02 +08:00
596f160a1e Chore/add default step 1x url (#7933) 2024-09-04 08:32:22 +08:00
d8b6c053a2 fix rerank model value is empty string (#7937) 2024-09-03 21:25:21 +08:00
4b262cae58 chore: #7603 i18n (#7931) 2024-09-03 19:19:52 +08:00
1a5116cba0 Fix/segment create with api (#7928) 2024-09-03 18:14:47 +08:00
01581dd35f improve the notion table extract (#7925) 2024-09-03 17:52:07 +08:00
7fdd964379 fix: frontend handle sometimes server not generate the wrong follow up data struct (#7916) 2024-09-03 14:09:46 +08:00
0cfcc97e9d feat: support auto generate i18n translate (#6964)
Co-authored-by: crazywoola <427733928@qq.com>
2024-09-03 10:17:05 +08:00
8986be0aab chore: Update versions to 0.7.3 (#7895) 2024-09-03 09:49:32 +08:00
f76bbbf5e6 chore(Dockerfile): Bump expat to 2.6.2-2 (#7904) 2024-09-03 09:48:30 +08:00
fe217da05c fix: correct typo in the setting screen (#7897) 2024-09-02 22:49:56 +08:00
80aa7c4019 feat: allow users to use the app icon as the answer icon (#7888)
Co-authored-by: crazywoola <427733928@qq.com>
2024-09-02 20:00:41 +08:00
6f33351eb3 ignore linked images when image id is none (#7890) 2024-09-02 19:37:05 +08:00
35f13c7327 Add Russian language (#7860)
Co-authored-by: d8rt8v <alex@ydertev.ru>
Co-authored-by: crazywoola <427733928@qq.com>
2024-09-02 19:09:41 +08:00
a8b9e01b3e fix: fixed typo on loading reranking_mode (#7887) 2024-09-02 16:18:47 +08:00
7193e189f3 Add perplexity search as a new tool (#7861) 2024-09-02 14:48:13 +08:00
3f2a806abe fix: glm models prices and max_tokens correction (#7882) 2024-09-02 14:29:09 +08:00
5e4907e940 fix: layout shift on app card hover (#7872) 2024-09-02 11:05:54 +08:00
omr bf63c5d1e3 fix typo: langauge -> language (#7875) 2024-09-02 08:41:45 +08:00
78989e9049 Add ALIYUN_OSS_PATH configuration for Aliyun OSS (#7864)
Co-authored-by: seayon <zhaoxuyang@shouqianba.com>
2024-09-01 21:30:17 +08:00
1510bdbcf6 refactor: Remove typecasting by any (#7862) 2024-09-01 14:58:12 +08:00
024d688b77 fix(RetrievalConfig): Fix score threshold assignment for zero value (#7865) 2024-09-01 14:57:50 +08:00
ef82a29e23 fix: crash when ECharts accesses undefined objects (#7853) 2024-09-01 14:52:27 +08:00
1f56a20b62 feat: support auth by api key for ark provider (#7845) 2024-08-31 10:56:32 +08:00
0c2a62f847 fix: correct http timeout configs' default values and ignorance by HttpRequestNode (#7762) 2024-08-30 19:09:10 +08:00
ea748b50f2 fix: an issue of keyword search feature in application log list (#7816) 2024-08-30 18:48:05 +08:00
62bfc4dba6 fix: tooltip size sets improperly (#7836) 2024-08-30 18:13:54 +08:00
Zhi ceb2b150ff enhance: include workspace name in create-tenant command (#7834) 2024-08-30 15:53:50 +08:00
dc015c380a feat: add zhipu glm_4_plus and glm_4v_plus model (#7824) 2024-08-30 15:08:31 +08:00
c9e0f0bf20 fix: correct typo in environment variable description (#7817) 2024-08-30 00:03:40 +08:00
bd6d4d0553 fix: filter out installed apps without an app (#7799) 2024-08-29 19:03:08 +08:00
f0273f00e1 Fixed when testing the openai compatible interface model, an error is reported when no object is returned (#7808) 2024-08-29 18:58:19 +08:00
962cdbbebd chore: add app generator overload (#7792) 2024-08-29 16:04:01 +08:00
2c51e3a327 fix: webapp sso setting may not the latest value when refresh (#7795) 2024-08-29 15:57:43 +08:00
8e311cc45c fixed permission is None (#7788) 2024-08-29 12:46:42 +08:00
c441bea4d1 fix: datasets permission is missing (#7787) 2024-08-29 12:46:33 +08:00
ad30668eb6 Sync Input component from feat/attachments branch (#7782) 2024-08-29 11:23:16 +08:00
62f4801523 Update ssrf_proxy related doc link in docker-compose file (#7778) 2024-08-29 11:22:39 +08:00
ec1408346e docs: navigate to open issues in contributing documents (#7781) 2024-08-29 11:18:49 +08:00
0e0a703496 chore: ignore openai error record in sentry (#7770) 2024-08-28 23:26:11 +08:00
54b693d5b1 feat: update saas billing hint. (#7760) 2024-08-28 18:55:47 +08:00
1262277714 chore: improve http executor configs (#7730) 2024-08-28 17:46:37 +08:00
3a67fc6c5a feat: add support for array types in available variable list (#7715) 2024-08-28 17:30:13 +08:00
26abbe8e5b feat(Tools): add a tool to query the stock price from Alpha Vantage (#7019) (#7752) 2024-08-28 17:27:20 +08:00
5d0914daea fix: not able to pass array of string/number/object into variable aggregator groups (#7757) 2024-08-28 17:25:20 +08:00
7541a492b7 fix: crawl options max length cannot be set to 0 (#7758)
Co-authored-by: Yi <yxiaoisme@gmail.com>
2024-08-28 17:16:07 +08:00
3a071b8db9 fix: datasets permission is missing (#7751) 2024-08-28 15:36:11 +08:00
9342b4b951 Update package "libldap-2.5-0" for docker build. (#7726) 2024-08-28 14:44:05 +08:00
4682e0ac7c fix(storage): 🐛 HeadBucket Operation Permission (#7733)
Co-authored-by: 莫岳恒 <moyueheng@datagrand.com>
2024-08-28 13:57:45 +08:00
7cfebffbb8 chore: update default endpoint for ark provider (#7741) 2024-08-28 13:56:50 +08:00
693fe912f2 Fix annotation reply settings (#7696) 2024-08-28 09:42:54 +08:00
bc3a8e0ca2 feat: store created_by and updated_by for apps, modelconfigs, and sites (#7613) 2024-08-28 08:47:30 +08:00
e38334cfd2 fix: doc_language return null when document segment settings (#7719) 2024-08-28 08:45:51 +08:00
92cab33b73 feat(Tools): add feishu document and message plugins (#6435)
Co-authored-by: 黎斌 <libin.23@bytedance.com>
2024-08-27 20:21:42 +08:00
3f467613fc feat: support configs for code execution request (#7704) 2024-08-27 19:38:33 +08:00
205d33a813 Fix: read properties of undefined issue (#7708)
Co-authored-by: libing <libing@healink.cn>
2024-08-27 19:23:56 +08:00
da326baa5e fix: tongyi Error: 'NoneType' object is not subscriptable (#7705) 2024-08-27 16:56:06 +08:00
d9198b5646 feat: remove unused code (#7702) 2024-08-27 16:47:34 +08:00
60001a62c4 fixed chunk_overlap is None (#7703) 2024-08-27 16:38:06 +08:00
ee7d5e7206 feat: support Moonshot and GLM models tool call for volc ark provider (#7666) 2024-08-27 14:43:37 +08:00
2726fb3d5d feat:dailymessages (#7603) 2024-08-27 12:53:27 +08:00
d7aa4076c9 feat: display account name on the logs page for the apps (#7668)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-08-27 12:40:44 +08:00
122ce41020 feat: rewrite Elasticsearch index and search code to achieve Elasticsearch vector and full-text search (#7641)
Co-authored-by: haokai <haokai@shuwen.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: Bowen Liang <bowenliang@apache.org>
Co-authored-by: wellCh4n <wellCh4n@foxmail.com>
2024-08-27 11:43:44 +08:00
e7afee1176 Langfuse view button (#7684) 2024-08-27 11:25:56 +08:00
88730906ec fix: empty knowledge add file (#7690) 2024-08-27 11:25:27 +08:00
a15080a1d7 bug: (#7586 followup) fix config of CODE_MAX_STRING_LENGTH (#7683) 2024-08-27 10:38:24 +08:00
35431bce0d fix dataset_id and index_node_id idx missed in document_segments tabl… (#7681) 2024-08-27 10:25:24 +08:00
7b7576ad55 Add Azure AI Studio as provider (#7549)
Co-authored-by: Hélio Lúcio <canais.hlucio@voegol.com.br>
2024-08-27 09:52:59 +08:00
162faee4f2 fix: set score_threshold to zero if it is None for MyScale vectordb (#7640)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-08-27 09:47:16 +08:00
Zhi b7ff98d7ff fix: Remove useless debug information. (#7647) 2024-08-26 20:40:26 +08:00
0474f0c906 chore: Update version to 0.7.2 (#7646) 2024-08-26 20:11:55 +08:00
430e100142 refactor: Add @staticmethod decorator in api/core (#7652) 2024-08-26 19:45:03 +08:00
1473083a41 catch openai rate limit error (#7658) 2024-08-26 19:36:44 +08:00
7cda73f192 Proposal to revise Japanese expressions (#7664) 2024-08-26 19:05:49 +08:00
7c2bb31a55 [fix] openai's tool role does not support name parameter. (#7659) 2024-08-26 18:52:34 +08:00
ba82023445 fix: support float type for tool parameter's default value (#7644) 2024-08-26 17:10:54 +08:00
13be84e4d4 chore(api/controllers): Apply Ruff Formatter. (#7645) 2024-08-26 15:29:10 +08:00
7ae728a9a3 fix nltk averaged_perceptron_tagger download and fix score limit is none (#7582)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-08-26 15:14:05 +08:00
a7743a4f47 add:save_model_credentials error log (#7630) 2024-08-26 14:46:29 +08:00
Zhi 103ff28530 feat: speed up the Docker build for dify-api for Chinese developers. (#7626) 2024-08-26 14:45:28 +08:00
Zhi 8dfdb37de3 fix: use LOG_LEVEL for celery startup (#7628) 2024-08-26 14:44:58 +08:00
17fd773a30 chore(api/services): apply ruff reformatting (#7599)
Co-authored-by: -LAN- <laipz8200@outlook.com>
2024-08-26 13:43:57 +08:00
979422cdc6 chore(api/tasks): apply ruff reformatting (#7594) 2024-08-26 13:38:37 +08:00
3be756eaed feat: tooltip (#7634) 2024-08-26 13:00:02 +08:00
1ba3d3acd6 feat: replace show/hide workflow_steps with switch (#7627) 2024-08-26 11:00:57 +08:00
23cedc3f1c Web app now supports SSO config (#7137) 2024-08-25 18:47:16 +08:00
Joe 741c548f3c feat: web sso (#7135) 2024-08-25 18:47:02 +08:00
556f4ad5df feat: add siliconflow text2img tool (#7612) 2024-08-25 14:39:58 +08:00
561a61e7fe Improve MIME type detection for image URLs (#6531)
Co-authored-by: seayon <zhaoxuyang@shouqianba.com>
2024-08-25 13:36:16 +08:00
47919983bf fix: typo in comment (#7606) 2024-08-25 09:56:08 +08:00
efc136cce5 feat: Introduce Ark SDK v3 and ensure compatibility with models of SDK v2 (#7579)
Co-authored-by: crazywoola <427733928@qq.com>
2024-08-24 19:29:45 +08:00
b035c02f78 chore(api/tests): apply ruff reformat #7590 (#7591)
Co-authored-by: -LAN- <laipz8200@outlook.com>
2024-08-23 23:52:25 +08:00
2da63654e5 chore(api/configs): apply ruff reformat (#7590) 2024-08-23 23:46:01 +08:00
3ace01cfb3 chore: cleanup and rearrange unclassified configs into feature config groups (#7586) 2024-08-23 22:40:07 +08:00
e3d7c7c6f9 fix(onebot): use yarl to format url (#7589) 2024-08-23 22:22:42 +08:00
8807d880dc Feat: add OneBot protocol tool (#7583) 2024-08-23 19:16:30 +08:00
70d6ab0bf5 Update stable_diffusion.py (#7536) 2024-08-23 18:58:13 +08:00
e42848f4b7 Do not pass query parameter when the value is empty (#7585) 2024-08-23 18:50:38 +08:00
25386af41a fix: knowledge setting "knowledge name" input width (#7584) 2024-08-23 17:20:19 +08:00
f29685f8a1 fix score_threshold is none, return all top K documents (#7581) 2024-08-23 16:59:34 +08:00
a63e15081f update nltk version 2024-08-23 16:43:47 +08:00
ad13011043 add JSON Mode support for moonshot models (#7568) 2024-08-23 16:24:45 +08:00
df69ad9f0e Langfuse view button (#7578) 2024-08-23 16:23:26 +08:00
9864b35465 langfuser add view button (#7571) 2024-08-23 15:53:49 +08:00
6025002971 add qwen text-embedding-v3 support. (#7567) 2024-08-23 15:32:38 +08:00
0c38a8fdd4 Fix: voice language (#7570) 2024-08-23 15:25:07 +08:00
fb75bd9790 chore: improve the check time of variable name in conversation and env var (#7572) 2024-08-23 15:08:34 +08:00
399d7cd596 chore: improve the check time of variable name (#7569) 2024-08-23 14:30:26 +08:00
0a7ab9a47d fix: incorrect duplication when no target node is selected (#7539) 2024-08-23 13:16:15 +08:00
9618f86980 fix: workflow context menu popup issue (#7530) 2024-08-23 13:14:17 +08:00
a71fc18530 feat: set workflow zoom range for shortcut (#7563) 2024-08-23 13:11:55 +08:00
3ac8a2871e chore: #6554 i18n (#7562) 2024-08-23 11:16:37 +08:00
a24717765e feat: forward zhipu finish_reason (#7560) 2024-08-23 11:15:38 +08:00
a40073afa4 Add N-to-1 warning translation for JP (#7553) 2024-08-23 08:34:22 +08:00
e6b117e33f fix: correct response structure in openapi documentation of app (#7556) 2024-08-23 08:33:41 +08:00
3e6a6bf396 fix: wrong usage of created_at on the modal for API Key (#7548) 2024-08-23 08:21:31 +08:00
931e6f1625 Added Space between Chinese and English within tools' description (#7545) 2024-08-22 19:20:13 +08:00
4ce47284dc update nltk version to 3.8.1 (#7544) 2024-08-22 18:17:21 +08:00
f5dcc6092b feat: CONTRIBUTING_VI i18n (#7532) 2024-08-22 18:08:36 +08:00
0724640bbb fix rerank mode is none 2024-08-22 15:36:47 +08:00
cb70e12827 fix rerank mode is none 2024-08-22 15:33:43 +08:00
fef4e09dfc docs: update certbot/README.md (#7528) 2024-08-22 13:36:15 +08:00
60ef7ba855 fix: add missed modifications of <AppIcon /> (#7512) 2024-08-22 13:32:59 +08:00
6f968bafb2 feat: update the "tag delete" confirm modal (#7522) 2024-08-22 11:33:20 +08:00
9f6aab11d4 fix: tag input state lost issue (#7500) 2024-08-22 10:26:09 +08:00
0006c6f0fd fix(storage): 🐛 Create S3 bucket if it doesn't exist (#7514)
Co-authored-by: 莫岳恒 <moyueheng@datagrand.com>
2024-08-22 09:45:42 +08:00
2c427e04be Feat/7134 use dataset api create a dataset with permission (#7508) 2024-08-21 20:25:45 +08:00
f53454f81d add finish_reason to the LLM node output (#7498) 2024-08-21 17:29:30 +08:00
784b11ce19 Chore/remove python dependencies selector (#7494) 2024-08-21 16:57:14 +08:00
715eb8fa32 fix rerank mode is none (#7496) 2024-08-21 16:42:28 +08:00
067b956b2c merge migration 2024-08-21 16:25:18 +08:00
a02118d5bc Fix/incorrect code template (#7490) 2024-08-21 15:31:13 +08:00
85fc0fdb51 chore: support CODE_MAX_PRECISION (#7484) 2024-08-21 15:11:56 +08:00
f7af8c7cc7 feat: gpt-4o-mini-2024-07-18 support json schema (#7489) 2024-08-21 15:11:29 +08:00
0c99a3d0c5 fix the issue of the refine_switches at param being invalid in the Novita.AI tool (#7485) 2024-08-21 15:09:05 +08:00
66dfb5c89a fix: json schema not saved correctly (#7487) 2024-08-21 14:58:14 +08:00
6435b4eb44 Separate CODE_MAX_DEPTH and set it as an environment variable (#7474) 2024-08-21 12:48:25 +08:00
4e7b6aec3a feat: support pinning, including, and excluding for model providers and tools (#7419)
Co-authored-by: GareArc <chen4851@purude.edu>
2024-08-21 11:16:43 +08:00
6c25d7bed3 chore: improve the copywrite of the assigner node append mode description (#7467) 2024-08-21 10:34:25 +08:00
028fd52c9b fix: image icon not showing correctly on left panel in workflow web app page (#7466) 2024-08-21 10:29:16 +08:00
9a715f6b68 fix(tool): tool node error (#7459)
Co-authored-by: hobo.l <hobo.l@binance.com>
2024-08-21 09:04:54 +08:00
8c32f8c77d chore: #7348 i18n (#7451) 2024-08-21 09:03:51 +08:00
b7778de224 fix: document error message can not be cleared (#7453) 2024-08-20 19:30:57 +08:00
c70d69322b feat: support dialogue count in chatflow (#7440) 2024-08-20 18:28:39 +08:00
e35e251863 feat: Sort conversations by updated_at desc (#7348)
Co-authored-by: wangpj <wangpj@hundsunc.om>
Co-authored-by: JzoNg <jzongcode@gmail.com>
Co-authored-by: -LAN- <laipz8200@outlook.com>
2024-08-20 17:55:44 +08:00
eae53e11e6 refactor(api/models/workflow.py): Add __init__ to Workflow (#7443) 2024-08-20 17:52:21 +08:00
4f5f27cf2b refactor(api/core/workflow/enums.py): Rename SystemVariable to SystemVariableKey. (#7445) 2024-08-20 17:52:06 +08:00
5e42e90abc fix(api/services/workflow/workflow_converter.py): Add NoneType checkers & format file. (#7446) 2024-08-20 17:51:49 +08:00
a10b207de2 refactor(api/core/app/app_config/entities.py): Move Type to outside and add EXTERNAL_DATA_TOOL. (#7444) 2024-08-20 17:30:14 +08:00
e2d214e030 chore: add and update theme related css variables values (#7442) 2024-08-20 16:33:40 +08:00
e7762b731c external knowledge 2024-08-20 16:18:35 +08:00
4f64a5d36d refactor(api/core/workflow/nodes/variable_assigner): Split into multi files. (#7434) 2024-08-20 15:40:19 +08:00
0d4753785f chore: remove .idea and .vscode from root path (#7437) 2024-08-20 15:37:29 +08:00
2e9084f369 chore(database): Rename table name from workflow__conversation_variables to workflow_conversation_variables. (#7432) 2024-08-20 14:34:03 +08:00
0f90e6df75 add pgvector full text search settting (#7427) 2024-08-20 13:20:19 +08:00
f6c8390b0b external knowledge 2024-08-20 12:47:51 +08:00
4fd57929df Merge branch 'main' into feat/external-knowledge 2024-08-20 12:46:37 +08:00
517cdb2ca4 add external knowledge 2024-08-20 11:13:29 +08:00
53146ad685 feat: support line break of tooltip content (#7424) 2024-08-20 11:03:55 +08:00
0223fc6fd5 feat: add pgvector full_text_search (#7396) 2024-08-20 11:01:13 +08:00
218380ba43 fix:end of day (#7426) 2024-08-20 10:57:33 +08:00
afd23f7ad8 chore: #7196 i18n (#7416) 2024-08-20 10:21:24 +08:00
6991a243aa chore: correct _tts_invoke_streaming max length (#7423) 2024-08-20 10:20:04 +08:00
1f944c6eeb feat(api): support wenxin bge-large and tao embedding model. (#7393) 2024-08-19 22:25:09 +08:00
31f9977411 Web app support sending message using numpad enter (#7414) 2024-08-19 22:24:21 +08:00
3d27d15f00 chore(*): Bump version 0.7.1 (#7389) 2024-08-19 21:24:56 +08:00
ab6499e5b7 upgrade: sandbox to 0.2.6 (#7410) 2024-08-19 21:24:15 +08:00
4ff4859036 add CrossRef builtin tool: doi query and title query (#7406) 2024-08-19 19:14:20 +08:00
53cf756207 feat: OpenRouter add gpt-4o-2024-08-06 model (#7409) 2024-08-19 19:14:08 +08:00
0087afc2e3 fix(api/core/model_runtime/model_providers/__base/large_language_model.py): Add TEXT type checker (#7407) 2024-08-19 18:45:30 +08:00
bd07e1d2fd fix:start of the period should be YYYY-MM-DD 00:00 (#7371) 2024-08-19 18:12:41 +08:00
8b06105fa1 Feat: shortcut hook (#7385) 2024-08-19 18:11:11 +08:00
68dc6d5bc3 chore: rearrange api python dependencies (#7391) 2024-08-19 14:05:41 +08:00
acd72e3ab2 feat: support xinference's auth system (#7369) 2024-08-19 12:41:56 +08:00
bbb6fcc4f0 chore: update ruff from 0.5.x to 0.6.x (#7384) 2024-08-19 09:21:11 +08:00
fbf31b5d52 feat: custom app icon (#7196)
Co-authored-by: crazywoola <427733928@qq.com>
2024-08-19 09:16:33 +08:00
a0c689c273 feat: add jina tokenizer tool (#7375) 2024-08-19 09:15:46 +08:00
bfd905602f feat(api): support wenxin text embedding (#7377) 2024-08-19 09:15:19 +08:00
a0a67873aa chore: optimize ark model parameters (#7378) 2024-08-19 08:44:19 +08:00
6cd8ab0cbc chore: add LOG_FILE to docker-compose (#7372) 2024-08-17 18:22:57 +08:00
5350b1d938 fix(api/services/workflow/workflow_converter.py): Add conversation variable to workflow. (#7257) 2024-08-17 10:30:12 +08:00
baaa3f7f42 add base url for moonshot model (#7360) 2024-08-17 10:28:09 +08:00
4d4af00399 fix: keywords (#7357) 2024-08-16 20:43:55 +08:00
3a33062405 feat: support siliconflow rerank (#7337) 2024-08-16 20:21:41 +08:00
7d4a0a417a add workflowClient ,fix rename bug (#7352) 2024-08-16 20:21:08 +08:00
5a729a69cd feat: tools/gitlab (#7329)
Co-authored-by: crazywoola <427733928@qq.com>
2024-08-16 16:54:49 +08:00
dbc1ae45de chore: update docstrings (#7343) 2024-08-16 14:19:01 +08:00
9e6b755f62 feat: show path variable friendly in tool edit (#7344) 2024-08-16 14:09:25 +08:00
a2fafee53a chore(api/libs/bearer_data_source.py): Remove expired file. (#7300) 2024-08-16 10:33:51 +08:00
c7df6783df Revert "feat: support pinning, including, and excluding for Model Providers and Tools" (#7324) 2024-08-15 23:51:00 +08:00
fcb6921b57 enh:setfocus after voice input (#7317) 2024-08-15 22:12:51 +08:00
135dcfa3e5 fix: not show correct iteration times number in run history (#7318) 2024-08-15 21:02:41 +08:00
acfab01dcf fix editor auth (#7297) 2024-08-15 20:36:51 +08:00
6fdbc7dbf3 fix error when use farui-plus model (#7316)
Co-authored-by: 雪风 <xuefeng@shifaedu.cn>
2024-08-15 20:14:13 +08:00
d1a6702aa4 Update PerfXCloud Model List (#7212)
Co-authored-by: xhb <466010723@qq.com>
2024-08-15 19:42:15 +08:00
28944ef6c1 chore: delete unused resources POSTGRES_MAX_CONNECTIONS (#7315) 2024-08-15 19:36:31 +08:00
6e7f5fae09 add some api to DifyClient (#7314) 2024-08-15 19:26:59 +08:00
ed85d8281a fix: null annotation (#7313) 2024-08-15 19:20:14 +08:00
f3d3a3a5db chore: #7222 i18n (#7312) 2024-08-15 17:56:29 +08:00
c89697c49c fix(elasticsearch): docker env (#7270) 2024-08-15 17:53:28 +08:00
9414143b5f chore(api/libs): Apply ruff format. (#7301) 2024-08-15 17:53:12 +08:00
d07b2b9915 Fix: missing default value of type array object in conversation variable modal (#7309) 2024-08-15 17:28:12 +08:00
04131f86df fix: inability-to-add-node-and-change-the-edge (#7303) 2024-08-15 17:26:11 +08:00
2d89b7d0a9 fix(api/services/app_dsl_service.py): Add conversation variables. (#7304) 2024-08-15 16:46:48 +08:00
603a89055c Feat/7023 dify editor can resize the image (#7296) 2024-08-15 14:23:56 +08:00
3f9720bca0 fix(api/core/app/segments/segments.py): Fix file to markdown. (#7293) 2024-08-15 13:09:49 +08:00
7619850855 feat: support pinning, including, and excluding for Model Providers and Tools (#7283) 2024-08-15 12:58:38 +08:00
3571292fbf chore(api): Introduce Ruff Formatter. (#7291) 2024-08-15 12:54:05 +08:00
8f16165f92 chore(api/core): Improve FileVar's type hint and imports. (#7290) 2024-08-15 12:43:18 +08:00
6ff7fd80a1 feat: support OPENAI json_schema (#7258) 2024-08-15 11:29:19 +08:00
5aa373dc04 feat: add chatgpt-4o-latest (#7289) 2024-08-15 11:19:10 +08:00
32dc963556 feat(api/workflow): Add Conversation.dialogue_count (#7275) 2024-08-15 10:53:05 +08:00
8f5d8397f9 fix: can not input param value in tool test modal (#7281) 2024-08-15 10:31:34 +08:00
681ec6f845 Add jp translation for variable aggregator (#7277) 2024-08-15 09:47:51 +08:00
d2ccd8ba53 fix: #7222 docstrings (#7276) 2024-08-15 09:47:26 +08:00
yu5 7f67cb93ec fix ja-JP translation of secret values (#7279) 2024-08-15 09:44:02 +08:00
d29b32fce2 fix: typo in upstage/llm/_position.yaml (#7286) 2024-08-15 08:39:35 +08:00
pp 101db126c8 fix: missed rerank_mode when convert to DatasetEntity (#7269) 2024-08-15 00:41:12 +08:00
ba79088ffc Fix SQL parser Error in MyScale vdb. (#7255) 2024-08-14 16:41:18 +08:00
3a27166c2e chore: allow download audio/video through HTTP node (#7224) 2024-08-14 16:25:59 +08:00
429e85f5d6 Fix: support hide env & conversation var in prompt editor (#7256) 2024-08-14 15:14:39 +08:00
b5d472fad7 test(*): Avoid import from api in tests. (#7251) 2024-08-14 14:09:26 +08:00
52383d0161 add support for tongyi-farui (#7248)
Co-authored-by: 雪风 <xuefeng@shifaedu.cn>
2024-08-14 14:09:13 +08:00
48d2febebf fix(api/core/tools/entities/tool_entities.py): Fix type define. (#7250) 2024-08-14 14:08:54 +08:00
ca085034de doc: add missing params (#7242) 2024-08-13 22:31:27 +08:00
f6c12b10ac chore: update package versions to 0.7.0 (#7236) 2024-08-13 22:28:06 +08:00
5b77ef01d4 chore(api/services/app_dsl_service.py): Bump DSL version to 0.1.1 (#7235) 2024-08-13 18:20:41 +08:00
5d85fad522 Revert yarn.lock (#7234) 2024-08-13 18:19:36 +08:00
2fe2e350ce add secondary sort_key when using order_by and paginate at the same time (#7225) 2024-08-13 17:39:51 +08:00
986fd5bfc6 Add gitlab support (#7179)
Co-authored-by: crazywoola <427733928@qq.com>
2024-08-13 17:36:45 +08:00
f104b930cf feat: support elasticsearch vector database (#3558)
Co-authored-by: miendinh <miendinh@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: crazywoola <427733928@qq.com>
2024-08-13 17:36:20 +08:00
4423710a13 Add ECharts feature ( #6385 ) (#6961) 2024-08-13 17:35:12 +08:00
9381c08c43 chore: not use step_boundary field (#7231) 2024-08-13 17:22:33 +08:00
0f59d76997 fix: add context_size and max_chunks to Tongyi embedding to resolve issue #7189 (#7227) 2024-08-13 16:35:22 +08:00
b3743a9ae5 chore: refactor searXNG tool (#7220) 2024-08-13 15:34:29 +08:00
13d061911b Error Exception Message Of "Message Not Exists.", Should be "Suggested Questions Is Disabled." (#7219) 2024-08-13 15:17:18 +08:00
935e72d449 Feat: conversation variable & variable assigner node (#7222)
Signed-off-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: Joel <iamjoel007@gmail.com>
Co-authored-by: -LAN- <laipz8200@outlook.com>
2024-08-13 14:44:10 +08:00
8b55bd5828 fix: display notion document title correctly (#7215) 2024-08-13 14:05:57 +08:00
a12ddc47e7 feat: add support of speech2text function for OpenAI-API-compatible and Siliconflow (#7197) 2024-08-12 21:38:59 +08:00
57ce8449b0 feat: Support NEXT_TELEMETRY_DISABLED (#7181) 2024-08-12 19:15:41 +08:00
67b9fdaad7 siliconflow support bge-3 && bce-v1 embedding (#7198) 2024-08-12 19:14:43 +08:00
f9cf418f0f Fix/workflow run single step (#7194) 2024-08-12 17:14:17 +08:00
dfa7fe1289 chore: #7177 README_VI (#7182) 2024-08-12 15:57:21 +08:00
d2471cf6f9 Fix jp translation with new dify-doc (#7185) 2024-08-12 15:49:46 +08:00
a68df696ec Fix issue with incorrect port number in app_base_url due to NAT networks (#6653) 2024-08-12 12:26:12 +08:00
4b93df5a30 Docs/remove all meeting schedule links (#7177) 2024-08-12 11:28:26 +08:00
12dd3c0277 fix: leave chat page show assign to readonly property problem (#7178) 2024-08-12 11:25:10 +08:00
ccb6ddd840 chore: bump Ruff to 0.5.7 (#7174) 2024-08-12 10:24:48 +08:00
c48584fbb1 chore: update localization link in README (#7168) 2024-08-12 10:19:13 +08:00
f2cb1fb09f Fix : Workflow "start" paste url not support s3 pre-signed URL (#6855)
Co-authored-by: Yuanbo Li <ybalbert@amazon.com>
2024-08-11 16:45:15 +08:00
ac60182c91 fix: solving http-request-tool bugs in workflow (#6685) 2024-08-11 16:32:06 +08:00
700d37be8d Add explanatory comment to NGINX_ENABLE_CERTBOT_CHALLENGE key in .env.example (#7154) 2024-08-10 16:58:50 +08:00
8b5761efb2 change secret input to text input in searxng (#7160) 2024-08-10 16:58:22 +08:00
ef4d85f5c0 feat: README_VI (#7158) 2024-08-10 11:48:03 +08:00
5b32f2e0dd Feat: Add model provider Text Embedding Inference for embedding and rerank (#7132) 2024-08-09 19:12:13 +08:00
4cbeb6815b Fix: Wrong cutoff length lead to empty input in openai compatible embedding model. (#7133) 2024-08-09 19:11:57 +08:00
2c188a45c8 feat(app/log): Add Referenced Content in Application Logs (#7082) 2024-08-09 19:06:08 +08:00
d338f69837 feat: add decode option to json process tools (#7138) 2024-08-09 19:05:27 +08:00
7ebad74372 chore: #6515 i18n (#7149) 2024-08-09 19:05:00 +08:00
e2a13a9945 chore: #5487 i18n (#7150) 2024-08-09 19:04:50 +08:00
c6b0dc6a29 update dataset embedding model, update document status to be indexing (#7145) 2024-08-09 16:47:15 +08:00
f667ef98cb Feat/update tools length (#7141) 2024-08-09 16:07:37 +08:00
Joe 425174e82f feat: update ops trace (#7102) 2024-08-09 15:22:16 +08:00
Joe 7201b56a6d fix: workflow log run time error (#7130) 2024-08-09 14:46:31 +08:00
34cab0e0b7 Fix: account delete function & confirm issues (#7129) 2024-08-09 11:37:34 +08:00
4dfa8eedb8 Feat/tool-D-ID (#6278) 2024-08-09 11:05:33 +08:00
633808de06 chore: improve Vietnamese (vi-VN) translations (#7127) 2024-08-09 10:07:12 +08:00
f4591f97aa Update i18n/ja-JP/dataset-documents.ts "embeddedSpend" value. (#7124) 2024-08-09 08:33:18 +08:00
07511dfaf4 update stepfun model (#7118)
Co-authored-by: chenlinfeng <chenlinfeng@step.ai>
Co-authored-by: Tfsh <tianfs_fight@163.com>
2024-08-08 20:40:37 +08:00
7944ce0594 feat: wenxin add yi-34b-chat (#7117) 2024-08-08 20:01:21 +08:00
ad682c394d fix annotation reply is null (#7103) 2024-08-08 19:07:50 +08:00
7210613551 feat: Postgres max connections (#7067) 2024-08-08 17:25:23 +08:00
83acb53c08 feat: add zhipu embedding-3 (#7100) 2024-08-08 17:08:46 +08:00
a7162240e6 feat: add text-embedding functon and LLM models to Siliconflow (#7090) 2024-08-08 17:08:28 +08:00
12095f8cd6 extract docx filter comment element (#7092) 2024-08-08 16:53:29 +08:00
925f0d2e09 fix: workflow search blocks (#7097) 2024-08-08 15:33:02 +08:00
b6d206e095 feat: app icon enhancements (#7095) 2024-08-08 15:29:11 +08:00
5542ee4d0d workflow logs support workflow run id filter (#6833) 2024-08-08 14:54:02 +08:00
4ffa706e4f feat: add a builtin tool to support regex extraction. (#7081) (#7087) 2024-08-08 14:23:57 +08:00
169cde6c3c add nltk punkt resource (#7063) 2024-08-08 14:23:22 +08:00
34a9dbe826 Feat/add 360-zhinao provider (#7069) 2024-08-08 14:23:08 +08:00
8e23e24bd5 feat: Poetry requests timeout (#7086) 2024-08-08 14:17:29 +08:00
f288d367ac Add price info for zhipu models (#7084) 2024-08-08 14:17:05 +08:00
5e2fa13126 feat: support glm-4-long (#7070)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-08-08 10:54:39 +08:00
1571a8afd4 fix: workflow delete node shortcut (#7076) 2024-08-08 10:42:08 +08:00
67a2f14cef fix(api/core/workflow/nodes/tool/tool_node.py): Keep None value in tool params. (#7066) 2024-08-07 19:36:21 +08:00
a0d5b61c2a update enterprise inquiry link to point to chatflow (#7064) 2024-08-07 19:27:54 +08:00
Joe d7bb422a5c fix: hunyuan assistant_prompt_message pydantic error (#7062) 2024-08-07 18:31:40 +08:00
40c6f3c724 fix: add redis lock to AnalyticdbVector init (#6859)
Co-authored-by: xiaozeyu <xiaozeyu.xzy@alibaba-inc.com>
2024-08-07 17:32:06 +08:00
df8f8c9a2c feat(api/core/rag/datasource/vdb/analyticdb/analyticdb_vector.py): Checking config before init analyticdb (#7050) 2024-08-07 17:31:36 +08:00
536c43257b refactor(*): Update hard-code '[__HIDDEN__]' to the constant. (#7048) 2024-08-07 17:30:56 +08:00
4e8f6b3603 chore(api/core/app/segments/segments.py): Remove todo tags. (#7052) 2024-08-07 17:29:35 +08:00
80c94f02e9 add vector field for other vectordb (#7051) 2024-08-07 17:14:03 +08:00
aad02113c6 fix(api/core/app/segments): Allow to contains boolean value in object segment (#7047) 2024-08-07 16:20:22 +08:00
72c75b75cf feat: Add hyperlink parsing to the DOCX document. (#7017) 2024-08-07 16:01:14 +08:00
ffa992acf7 Add support for i18n Farsi language (fa-IR) (#7044) 2024-08-07 15:44:42 +08:00
99b78dd198 feat: add gpt-4o-2024-08-06 (#7046) 2024-08-07 15:35:57 +08:00
7f81a86e9e chore: show non-English characters in exported DSL files (#7042) 2024-08-07 14:03:15 +08:00
4c4f6e362f chore: lint code to remove unused imports and variables (#7033) 2024-08-07 13:04:44 +08:00
1a302ca957 feat: add disabled support to tooltip-plus component (#7036) 2024-08-07 11:26:47 +08:00
11f9d2f124 doc: correct typos in mdx files (#7029) 2024-08-07 09:03:45 +08:00
a93bc83c8d Provide output data also in json property of workflow tool (#6924) (#7027) 2024-08-07 08:54:51 +08:00
3516989738 fix: typos in wenxin llm (#7021) 2024-08-06 22:33:03 +08:00
26991443ed fix: Fix incorrect context size for jina-reranker-v2 model (#7006) 2024-08-06 21:08:29 +08:00
eece50acec fix: tran list issue (#7009)
Co-authored-by: libing <libing@healink.cn>
2024-08-06 21:01:38 +08:00
28d4e5b045 Fix/reranking mode is null (#7012) 2024-08-06 19:12:04 +08:00
c110888aee feat: agent app support generate prompt (#7007) 2024-08-06 17:43:54 +08:00
c53875ce8c fix #6902 .docx handles images within tables and handles cross-column tables (#6951) 2024-08-06 17:14:24 +08:00
7f18c06b0a fix: code-block-missing-checks (#7002) 2024-08-06 16:11:14 +08:00
96dcf0fe8a fix: code tool fails when null property exists in object (#6988) 2024-08-06 16:11:00 +08:00
0c22e4e3d1 Feat/new confirm (#6984) 2024-08-06 14:31:13 +08:00
bd3ed89516 feat: add function calling for deepseek models (#6990) 2024-08-06 13:37:27 +08:00
1c043b8426 Chores: fix name typo (#6987) 2024-08-06 13:33:21 +08:00
23ed15d19f feat:nvidia add nemotron4-340b and microsoft/phi-3 (#6973) 2024-08-06 10:16:41 +08:00
312d905c9b chore: update duckduckgo tool (#6983) 2024-08-06 10:16:04 +08:00
cba9319cc7 fix doc (#6974) 2024-08-06 10:10:55 +08:00
d839f1ada7 version to 0.6.16 (#6972) 2024-08-05 23:33:37 +08:00
6da14c2d48 security: fix api image security issues (#6971) 2024-08-05 20:21:08 +08:00
a34285196b Revise the wrong pricing of certain LLM models. (#6967) 2024-08-05 18:41:44 +08:00
e4587b2151 chore: MAX_TREE_DEPTH spelling mistake (#6965) 2024-08-05 18:41:08 +08:00
ea30174057 chore: optimize streaming tts of xinference (#6966) 2024-08-05 18:23:23 +08:00
dd676866aa chore: exclude .txt extension in create_by_text API (#6956) 2024-08-05 15:52:07 +08:00
f0d10553b4 Fixed a bug where permission was clearly displaye… (#6934) 2024-08-05 13:19:01 +08:00
ef616c604a fix: The permissions issue of the editor role accessing some backend … (#6945)
Co-authored-by: liuzhenghua-jk <liuzhenghua-jk@360shuke.com>
2024-08-05 12:55:55 +08:00
2288efbf48 Fix: tag & settings modal in dataset card in Firefox (#6953) 2024-08-05 12:51:26 +08:00
f656e1bae2 fix: ensure db migration in docker entry script running with upgrade-db command for proper locking (#6946) 2024-08-05 10:55:26 +08:00
5a7fc8cd8c chore: fix markdown format and one typo (#6939) 2024-08-05 08:29:59 +08:00
141e4e0276 fix: restore xinference secret field (#6941)
Co-authored-by: liuzhenghua-jk <liuzhenghua-jk@360shuke.com>
2024-08-04 22:32:24 +08:00
20d3e1d297 Fix increase_usage of total_price in agent_runner (#6688) 2024-08-04 14:42:22 +08:00
79715345ef fix: import workflow errors (#6937) 2024-08-04 14:34:39 +08:00
dff3f41ef6 Workflow TTS playback node filtering issue. (#6877) 2024-08-04 14:28:56 +08:00
5e634a59a2 compatible xinference reranker server (#6927) 2024-08-04 13:49:38 +08:00
Joe 26e46d365c fix: workflow trace user_id error (#6932) 2024-08-04 03:28:50 +08:00
bcd7c8e921 fix: sending app trace data to other app trace provider (#6931) 2024-08-04 00:05:51 +08:00
70283f5b9f dep: support for Python 3.12 (#6771) 2024-08-02 21:14:36 +08:00
2e941bb91c add new provider Solar (#6884) 2024-08-02 20:48:09 +08:00
541bf1db5a feat: add the tool Serper for Google search. (#6786) (#6790) 2024-08-02 20:37:04 +08:00
048bc4c06e fix update dataset failed when embedding model is not exist (#6920) 2024-08-02 20:30:22 +08:00
4d0a6cc382 fix(nodes/knowledge-retrieval): workflow knowledge retrieval rerank model check (#6918) 2024-08-02 20:30:05 +08:00
6feea0d75b fix: default rerank model check (#6917) 2024-08-02 18:27:06 +08:00
Joe f97a51ce24 fix: reranking disable timer error (#6910) 2024-08-02 16:34:50 +08:00
df530b53e5 fix: system model setting missing space between buttons (#6912) 2024-08-02 16:26:16 +08:00
6aa02f8c63 dep: bump pgvecto-rs client from 0.1.x to 0.2.x (#6891) 2024-08-02 15:51:23 +08:00
7ab04e17e7 fix: return code in service api (#6911) 2024-08-02 15:48:58 +08:00
bf3f1027c8 fix: code execution node not display clear reasons when sandbox res error (#6830) 2024-08-02 15:36:44 +08:00
62cc4077bb Fix: webapp color theme (#6908) 2024-08-02 15:08:14 +08:00
e683461416 fix: knowledge save button visible (#6905) 2024-08-02 14:20:20 +08:00
33dab4fe54 fix: multiple retrieval default weighted score (#6897) 2024-08-02 14:05:27 +08:00
8166a8caf5 feat: update llama3.1 parameters for openrouter (#6901) 2024-08-02 13:13:34 +08:00
44801df8f8 fix score threshold limit be None (#6900) 2024-08-02 12:10:51 +08:00
56af1a0adf perf: change ollama embedded api request (#6876) 2024-08-02 12:04:47 +08:00
f8617db012 fix tongyi tool calls (#6896) 2024-08-02 10:03:43 +08:00
2ab9af3b38 delete weight_type in knowledge retrieval node (#6892) 2024-08-01 21:38:59 +08:00
24a89f7753 Modify/modify jp doc (#6889) 2024-08-01 20:33:35 +08:00
cc4785f094 fix: xinference reranker return_documents (#6888) 2024-08-01 19:57:53 +08:00
ian 093f902335 fix: Change API key authentication failure response code from 404 to 401 (#6885) 2024-08-01 17:41:35 +08:00
104c797dd0 feat: Add support for i18n Turkish language (tr-TR) (#6886)
Co-authored-by: hursit <hursit.topal@enuygun.com>
2024-08-01 17:30:35 +08:00
a9cd6df97e Remove tts (blocking call) (#6869) 2024-08-01 14:50:22 +08:00
f31142e758 Azure 4o mini options (#6873) 2024-08-01 14:04:18 +08:00
9ae88ede12 chore: n to 1 retrieval (#6839) 2024-08-01 13:45:18 +08:00
792f908afb Revert "feat:Azure gpt4o mini" (#6870) 2024-08-01 13:32:03 +08:00
29e3c3061c fix: remote image not display in answer node (#6867) 2024-08-01 13:21:49 +08:00
14367ddc09 feat:Azure gpt4o mini (#6866) 2024-08-01 13:03:08 +08:00
8157fccf6d delete weight_type (#6865) 2024-08-01 13:02:33 +08:00
cbf7f21ade Add azure gpt4omini (#6862)
Co-authored-by: luowei <glpat-EjySCyNjWiLqAED-YmwM>
Co-authored-by: crazywoola <427733928@qq.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-08-01 12:57:52 +08:00
9c4f3be0f3 Fix keyboard shortcut conflict between workflow and browser (#6863) 2024-08-01 12:57:30 +08:00
f6e8e120a1 support xinference tts (#6746) 2024-08-01 11:59:15 +08:00
Joe 08f922d8c9 fix: anthropic max token NoneType error (#6858) 2024-08-01 11:30:00 +08:00
e9d6a43907 fix: model parameter selector (#6861) 2024-08-01 11:23:53 +08:00
feb4576ee7 chore: update SQLAlchemy configuration with custom naming convention (#6854) 2024-08-01 11:16:49 +08:00
56b43f62d1 feat: nvidia add llama3.1 model (#6844) 2024-07-31 21:24:02 +08:00
4b410494b3 Add model parameter enable_enhance for hunyuan llm model (#6847)
Co-authored-by: sun <sun@centen.cn>
2024-07-31 20:04:43 +08:00
13f5867a16 add unstructured profiles (#6846) 2024-07-31 19:39:38 +08:00
Joe df9bd36cab fix: claude-3-5-sonnet-20240620 max token error (#6843) 2024-07-31 18:34:44 +08:00
77c071e26f chore: upgrade slider ui (#6838) 2024-07-31 17:46:43 +08:00
af76381b98 fix notion internal setting (#6836) 2024-07-31 17:17:46 +08:00
35d0534eb9 chore: fix ssrf doc url (#6828) 2024-07-31 17:14:53 +08:00
4be12b29b9 fix: improved error handling for spider tool (#6835) 2024-07-31 17:11:52 +08:00
dd64e65ea0 fix: edit segment missing space between buttons (#6826)
Co-authored-by: xc Dou <xcdou@192.168.88.89>
2024-07-31 15:11:07 +08:00
c23aa50bea Add AWS builtin Tools (#6721)
Co-authored-by: Yuanbo Li <ybalbert@amazon.com>
Co-authored-by: crazywoola <427733928@qq.com>
2024-07-31 14:41:42 +08:00
8eb0d0fddd feat: support Celery auto-scale (#6249)
Co-authored-by: takatost <takatost@gmail.com>
2024-07-31 14:34:44 +08:00
8904745129 feat: tag filter adapts to scrolling (#6819) 2024-07-31 13:55:32 +08:00
936ac8826d Add docker-compose certbot configurations with backward compatibility (#6702)
Co-authored-by: Your Name <you@example.com>
2024-07-31 13:21:56 +08:00
545d3c5a93 chore: Add processId field for metrics of threads/db-pool-stat/health (#6797)
Co-authored-by: 老潮 <zhangyongsheng@3vjia.com>
Co-authored-by: takatost <takatost@users.noreply.github.com>
Co-authored-by: takatost <takatost@gmail.com>
2024-07-31 00:21:16 +08:00
3c371a6cb0 fix: workflow api (#6810) 2024-07-30 23:51:48 +08:00
9ce5cea911 feat: bedrock invoke enhancement (#6808) 2024-07-30 21:57:18 +08:00
98d9837fbc fix wrong charset when decoding Chinese content (#6774)
Co-authored-by: zhangwb <zhangwb@zts.com.cn>
2024-07-30 21:32:45 +08:00
53a89bbbc7 chore : option card (#6800) 2024-07-30 17:33:08 +08:00
0a744a73b3 fix: eco knowledge retrieval method (#6798) 2024-07-30 16:59:03 +08:00
0675c5f716 chore: add shortcut keys and hints for the shortcuts (#6779) 2024-07-30 16:18:58 +08:00
72963d1f13 fix: nonetype in webscraper validation (#6788) 2024-07-30 14:45:14 +08:00
028261f760 improve issue templates (#6785) 2024-07-30 14:17:45 +08:00
a98284b1ef refactor(api): Switch to dify_config (#6750)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-07-30 11:15:26 +08:00
daa31b2cb3 chore: remove redundant version pinning for indirect dependencies (#6772) 2024-07-30 08:45:57 +08:00
b414ea41d6 dep: initial support for Milvus 2.4.x (#6084) 2024-07-29 19:56:45 +08:00
Joe f78d0082ae feat: implement function dispatch table for trace processing (#6628) 2024-07-29 18:47:25 +08:00
3e18d32ce5 add deepseek-coder-v2 in siliconflow (#6149) 2024-07-29 18:45:19 +08:00
94d68b6a08 upgrade deepseek params (#6744) 2024-07-29 18:31:56 +08:00
c9ff0e3961 Add model hunyuan-embedding (#6657)
Co-authored-by: sun <sun@centen.cn>
2024-07-29 18:30:52 +08:00
8dd68e2034 fix(api/core/moderation/output_moderation.py): Fix config call. (#6769) 2024-07-29 18:30:29 +08:00
2cd662c43b chore: n to 1 retrieval legacy text (#6767) 2024-07-29 18:09:44 +08:00
4945184f8c fix: n to 1 retrieval legacy text (#6760) 2024-07-29 16:03:47 +08:00
cb01bf2986 chore: set logging level to debug when reading YAML files and falling back to default value in case of None (#6758) 2024-07-29 13:40:18 +08:00
f43e27814c Fix action button size (#6753) 2024-07-29 13:19:15 +08:00
20268708cc chore: improve position map conversion and tolerate empty position yaml file (#6541) 2024-07-29 10:32:11 +08:00
c8da4a1b7e Add jp translation for new features (#6749) 2024-07-29 08:58:48 +08:00
829472a1d7 switch to dify_config with Pydantic in files, moderation and app (#6747)
Signed-off-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: -LAN- <laipz8200@outlook.com>
2024-07-29 02:57:45 +08:00
e23461c837 Fix/6615 40 varchar limit on DatasetCollectionBinding and Embedding model name (#6723) 2024-07-28 09:42:58 +08:00
21f6caacd4 feat: enhance the firecrawl tool (#6705) 2024-07-27 15:00:06 +08:00
082c46a903 chore: migrate to poetry in devcontainer commands (#6724) 2024-07-27 14:49:34 +08:00
6a3bef8378 feat(api/core/app/segments): Update segment types and variables (#6734)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-07-27 14:43:51 +08:00
b6c3010f02 refactor(api/core/workflow/nodes/base_node.py): Update extract_variable_selector_to_variable_mapping method signature. (#6733)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-07-27 14:43:25 +08:00
90d2c01218 Feat/6725 can not get image url from cogview tool (#6728) 2024-07-27 00:07:31 +08:00
83af50368f fix(api/core/model_runtime/model_providers/azure_openai/llm/llm.py): Try to skip if delta.delta is None. (#6727)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-07-27 00:05:21 +08:00
cf258b7a67 add xlsx support hyperlink extract (#6722) 2024-07-26 19:26:52 +08:00
5d77dc4f58 feat(api/core/app/segments/parser.py): Remove blank segment in convert_template (#6709)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-07-26 18:19:33 +08:00
Joe
e4542215cc fix: tongyi empty tool_calls is not supported in message (#6719) 2024-07-26 18:10:13 +08:00
3d3677e912 Feat/model provider novita (#6717)
Co-authored-by: takatost <takatost@gmail.com>
2024-07-26 17:37:21 +08:00
427f48be6b fix(answer/operation): feedback status in the logs (#6716) 2024-07-26 17:36:36 +08:00
c6996a48a4 refactor(api/core/app/segments): Support more kinds of Segments. (#6706)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-07-26 15:03:56 +08:00
6b50bb0fe6 fix OpenAI TTS issues #6655 (#6696) 2024-07-26 14:55:49 +08:00
80b3871c55 fix(log/list): Incorrect field 'app_id' causing annotations to fail t… (#6697) 2024-07-26 11:24:32 +08:00
4839523e53 Fix appId missing in annotations (#6699) 2024-07-26 11:21:51 +08:00
ecb9c311b5 chore: make prompt generator max tokens configurable (#6693) 2024-07-26 10:20:23 +08:00
bd97ce9489 fix: doc link in knowledge base (#6691) 2024-07-26 09:14:08 +08:00
79cb23e8ac security/SSRF vulns (#6682) 2024-07-25 20:50:26 +08:00
c5ac004f15 [seanguo] fix: unsupported filename in windows & add Mistral Large 2 (#6679) 2024-07-25 19:26:46 +08:00
5fbfa0f2c8 Update bug_report.yml (#6678) 2024-07-25 18:59:04 +08:00
78a339a794 modify llama3-1 yaml filename to support Windows pull operations (#6677) 2024-07-25 18:58:55 +08:00
f904df4b63 Add french and jp translation for new feature (#6675) 2024-07-25 18:55:16 +08:00
5e4ac11df3 fix: code block segmentation problem of markdown document (#6465) 2024-07-25 17:24:37 +08:00
16b4f560cd fix bugs(when using Oracle23ai as Vector DB) (#6658) 2024-07-25 17:07:14 +08:00
75e6576c67 refactor(api/core/app/segments): implement to_object in ObjectVariable and ArrayVariable. (#6671)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-07-25 17:06:38 +08:00
0b4c26578e Enhance database URI security and add URL encoding (#6668) 2024-07-25 16:48:00 +08:00
ebcc07e3e9 feat: support max_retries in jina requests (#6585) 2024-07-25 13:10:39 +08:00
55c2b61921 fix(api/fields/workflow_fields.py): Add check in environment variables (#6621) 2024-07-25 11:30:52 +08:00
ca696fe94c Add support of tool-call for model provider "hunyuan" (#6656)
Co-authored-by: sun <sun@centen.cn>
2024-07-25 11:27:58 +08:00
585444c50c chore: fix type annotations (#6600) 2024-07-25 11:21:51 +08:00
9815aab7a3 [seanguo] feat: add llama 3.1 support in bedrock (#6645) 2024-07-25 11:20:37 +08:00
349ec0db77 fix tencent_cos_storage image-preview error is not a byte (#6652) 2024-07-25 11:20:20 +08:00
a876baf0a9 Resolve variable type parameter error (#6646) 2024-07-25 11:15:54 +08:00
91fd8521c3 fix reranking model field error (#6654) 2024-07-25 10:07:55 +08:00
4ec9a87e46 fix(api/core/workflow/nodes/iteration/iteration_node.py): Extend output in iteration if output is a array. (#6647)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-07-25 00:32:39 +08:00
fb5e3662d5 Chores: add missing profile for middleware docker compose cmd and fix ssrf-proxy doc link (#6372)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-07-24 19:36:06 +08:00
31efe10c75 refactor(api/core/workflow/workflow_engine_manager.py): Remove (#6630)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-07-24 19:35:40 +08:00
72bc9d5f2b feat(api/core/app/segments/variables.py): Support description in Variable. (#6636)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2024-07-24 19:35:22 +08:00
600f13436d remove requirement for a rerank model when retrieval_model is multiple (#6640) 2024-07-24 19:34:41 +08:00
Joe
b347a2f839 Feat/user session id search (#6638) 2024-07-24 19:34:23 +08:00
47b5bd7243 fix: value is not an array (#6632) 2024-07-24 19:14:04 +08:00
d4c55748f1 doc: fix about model features (#6619) 2024-07-24 19:12:10 +08:00
0625db0bf5 chore: optimize asynchronous workflow deletion performance of app related data (#6639) 2024-07-24 19:00:37 +08:00
05141ede16 chore: optimize asynchronous deletion performance of app related data (#6634) 2024-07-24 18:15:03 +08:00
c112188207 feat: added ActionButton component (#6631) 2024-07-24 18:09:44 +08:00
5af2df0cd5 fix: qwen fc error (#6620)
Co-authored-by: dufei <du_fei@venusgroup.com.cn>
2024-07-24 16:56:06 +08:00
f324374b95 Fix/6615 40 varchar limit on model name (#6623) 2024-07-24 16:23:16 +08:00
2aad128883 Fix: DSL backup (#6616) 2024-07-24 15:02:30 +08:00
3c78fdec1c Fix: reset button in embedded chatbot (#6611) 2024-07-24 14:46:52 +08:00
6fe9aa69cc feat: n to 1 retrieval legacy (#6554) 2024-07-24 12:50:48 +08:00
e4bb943fe5 Feat/delete single dataset retrival (#6570) 2024-07-24 12:50:11 +08:00
0fb741f269 fix: downgraded sentry-sdk to 1.44.1 due to claude LLM token returning 0 (#6597) 2024-07-24 04:49:03 +08:00
4c85393a1d feat: add GroqCloud llama3.1 series models support (#6596) 2024-07-24 00:41:58 +08:00
d5c2680fde feat: support llama3.1 series models for openrouter provider (#6595) 2024-07-24 00:37:48 +08:00
49729647ea bump to 0.6.15 (#6592) 2024-07-23 22:46:42 +08:00
85a883e281 fix(variables): NoneVariable should inherit from NoneSegment. (#6584) 2024-07-23 21:46:08 +08:00
Joe
8123a00e97 feat: update prompt generate (#6516) 2024-07-23 19:52:14 +08:00
0f6a064c08 chore: enhance auto-generate prompt (#6564) 2024-07-23 19:51:38 +08:00
2bc0632d0d fix(segments): Support NoneType. (#6581) 2024-07-23 17:59:32 +08:00
75445a0c66 fix audio not working during development because React's useEffect is triggered twice (#6126) 2024-07-23 17:24:29 +08:00
6a9d202414 chore: layout UI upgrade (#6577) 2024-07-23 17:11:02 +08:00
ad7552ea8d fix(api/core/workflow/nodes/llm/llm_node.py): Fix LLM Node error. (#6576) 2024-07-23 17:09:16 +08:00
c0ada940bd fix: tool params not work as expected when develop a tool (#6550) 2024-07-23 17:00:39 +08:00
1690788827 fix: name 'current_app' is not defined in recommended_app_service (#6574) 2024-07-23 16:48:21 +08:00
7c55c39085 feat: add tencent asr (#6091) 2024-07-23 16:38:39 +08:00
f17d4fe412 fix: extract only 'like' feedback to calculate User Satisfaction (#6553) 2024-07-23 16:32:36 +08:00
f019bc4bd7 feat(variables): Support to_object. (#6572) 2024-07-23 16:22:06 +08:00
cfc408095c fix(api/nodes): Fallback to get_any in some nodes that use object or array. (#6566) 2024-07-23 15:51:07 +08:00
6b5fac3004 fix: fetch context error in llm node (#6562) 2024-07-23 15:04:51 +08:00
0569c547ee fix the issue of MILVUS_DATABASE has no effect. (#6424) 2024-07-23 15:03:55 +08:00
06fc1bce9e Add search by full text when using Oracle23ai as vector DB (#6559) 2024-07-23 15:03:21 +08:00
093b8ca475 fix: escape double quotation marks in the vector DB search query (#6506) 2024-07-23 15:02:25 +08:00
5fcc2caeed feat: add Mingdao HAP tool, implementing reading and maintaining HAP application worksheet data. (#6257)
Co-authored-by: takatost <takatost@gmail.com>
2024-07-23 14:34:19 +08:00
f30a51e673 fix: chat flow chat with annotation or moderation but answer empty (#6202)
Co-authored-by: jinqi.guo <jinqi.guo@ubtrobot.com>
2024-07-23 14:13:58 +08:00
642723d09e chore(deps): bump sentry-sdk from 1.39.2 to 2.8.0 in /api (#6517)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-23 13:48:23 +08:00
155e708540 Revert "chore: improve prompt auto generator" (#6556) 2024-07-23 13:35:35 +08:00
d726473c6d Revert "chore: use node specify llm to auto generate prompt" (#6555) 2024-07-23 13:31:32 +08:00
e80412df23 feat: rename template (#6547) 2024-07-23 10:07:54 +08:00
66765acf00 Update help.yml (#6546) 2024-07-23 10:01:19 +08:00
7208ea1da9 fix: template (#6545) 2024-07-23 09:58:47 +08:00
5e2f3ec6f0 update discussion template (#6544) 2024-07-23 09:44:27 +08:00
cd7fa8027a fix(api/core/model_manager.py): Avoid mutation during iteration. (#6536) 2024-07-22 22:58:22 +08:00
617847e3c0 fix(api/services/app_generate_service.py): Remove wrong type hints. (#6535) 2024-07-22 22:58:07 +08:00
71a7211411 Feat/add email support for pro and team (#6533) 2024-07-22 19:56:46 +08:00
dc7335cdf8 chore: use node specify llm to auto generate prompt (#6525) 2024-07-22 18:16:33 +08:00
a7c1e4c7ae chore: remove support email from readme (#6530) 2024-07-22 17:23:19 +08:00
87594008f8 fix: iteration node bg color (#6523) 2024-07-22 15:43:24 +08:00
5e6fc58db3 Feat/environment variables in workflow (#6515)
Co-authored-by: JzoNg <jzongcode@gmail.com>
2024-07-22 15:29:39 +08:00
87d583f454 fix: privilege for editor role (#6521) 2024-07-22 15:01:25 +08:00
a67831773f refactor: handle missing position file gracefully (#6513) 2024-07-22 13:24:32 +08:00
5b89b6fe2d allow custom base_url of dify api server (#6510) 2024-07-22 13:24:24 +08:00
a6350daa02 chore: improve prompt auto generator (#6514) 2024-07-22 11:44:12 +08:00
dfb6f4fec6 fix: extract tool calls correctly while arguments is empty (#6503) 2024-07-22 07:43:18 +08:00
f38034e455 clean vector collection redis cache (#6494) 2024-07-21 15:09:09 +08:00
c57b3931d5 refactor(api): switch to dify_config in controllers/console (#6485) 2024-07-21 01:11:40 +08:00
f73a3a58ae update delete embeddings by id (#6489) 2024-07-20 09:04:21 +08:00
1e0e573165 update clean embedding cache query logic (#6483) 2024-07-20 01:29:25 +08:00
Joe
27e08a8e2e Fix/extra table tracing app config (#6487) 2024-07-20 00:53:31 +08:00
49ef9ef225 feat(tool): getimg.ai integration (#6260) 2024-07-19 20:32:42 +08:00
c013086e64 fix: next suggest question logic problem (#6451)
Co-authored-by: evenyan <yikun.yan@ubtrobot.com>
2024-07-19 20:26:11 +08:00
48f872a68c fix: build error (#6480) 2024-07-19 18:37:42 +08:00
4f9f175f25 fix: correct gpt-4o-mini max token (#6472)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-07-19 18:24:58 +08:00
47e5dc218a Update CONTRIBUTING_CN "安装常见问题解答" link. (#6470) 2024-07-19 17:06:32 +08:00
90372932fe Update CONTRIBUTING "installation FAQ" link. (#6471) 2024-07-19 17:05:30 +08:00
0bb2b285da Update CONTRIBUTING_JA "installation FAQ" link. (#6469) 2024-07-19 17:05:20 +08:00
3da854fe40 chore: upgrade some components to the new UI (#6468) 2024-07-19 16:39:49 +08:00
57729823a0 fix wrong method usage (#6459) 2024-07-19 13:48:13 +08:00
9e168f9d1c feat: support gpt-4o-mini for openrouter provider (#6447) 2024-07-19 13:09:41 +08:00
ea45496a74 update ernie models (#6454) 2024-07-19 13:08:39 +08:00
a5fcd91ba5 chore: make text generation timeout duration configurable (#6450) 2024-07-19 12:54:15 +08:00
2ba05b041f refactor(myscale): Set the default value of the myscale vector db in DifyConfig. (#6441) 2024-07-19 10:57:45 +08:00
8e49146a35 [EMERGENCY] Fix Anthropic header issue (#6445) 2024-07-19 07:38:15 +08:00
dad3fd2dc1 feat: add gpt-4o-mini (#6442) 2024-07-19 01:53:43 +08:00
284ef52bba feat: passing the inputs values using difyChatbotConfig (#6376) 2024-07-18 21:54:16 +08:00
e493ce9981 update clean embedding cache logic (#6434) 2024-07-18 20:25:28 +08:00
7b45a5d452 fix: Unable to display images generated by Dall-E 3 (#6155) 2024-07-18 19:37:04 +08:00
4a026fa352 Enhancement: add model provider - Amazon Sagemaker (#6255)
Co-authored-by: Yuanbo Li <ybalbert@amazon.com>
Co-authored-by: crazywoola <427733928@qq.com>
2024-07-18 19:32:31 +08:00
dc847ba145 Fix the vector retrieval sorting issue (#6431)
Co-authored-by: weifj <weifj@tuyuansu.com.cn>
2024-07-18 19:25:41 +08:00
c0ec40e483 fix(api/core/tools/provider/builtin/spider/tools/scraper_crawler.yaml): Fix wrong placeholder config in scraper crawler tool. (#6432) 2024-07-18 19:23:18 +08:00
929c22a4e8 fix: tools edit modal schema edit issue (#6396) 2024-07-18 19:02:23 +08:00
ba181197c2 feat: api_key support for xinference (#6417)
Signed-off-by: themanforfree <themanforfree@gmail.com>
2024-07-18 18:58:46 +08:00
218930c897 fix tool icon get failed (#6375)
Co-authored-by: songyawen <songyawen@zkme.xyz>
2024-07-18 18:55:48 +08:00
c8f5dfcf17 refactor(rag): switch to dify_config. (#6410)
Co-authored-by: -LAN- <laipz8200@outlook.com>
2024-07-18 18:40:36 +08:00
27c8deb4ec feat: add custom tool timeout config to docker-compose.yaml and .env (#6419)
Signed-off-by: forrestsocool <sensensudo@gmail.com>
2024-07-18 18:40:17 +08:00
4ae4895ebe feat: add frontend unit test framework (#6426) 2024-07-18 17:35:10 +08:00
afe95fa780 feat: support get workflow task execution status (#6411) 2024-07-18 15:06:14 +08:00
166a40c66e fix: improve separation element in prompt log and TTS buttons in the operation (#6413) 2024-07-18 14:44:34 +08:00
588615b20e feat: Spider web scraper & crawler tool (#5725) 2024-07-18 14:29:33 +08:00
d5dca46854 feat: add a Tianditu tool (#6320)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-07-18 13:04:03 +08:00
23e5eeec00 feat: added custom secure_ascii to the json_process tool (#6401) 2024-07-18 08:43:14 +08:00
287b42997d fix inconsistent label (#6404) 2024-07-18 08:37:16 +08:00
5236cb1888 fix: kill signal is not passed to the main process (#6159) 2024-07-18 07:50:54 +08:00
3b5b548af3 Add Stepfun LLM Support (#6346) 2024-07-18 07:47:18 +08:00
4782fb50c4 Support new Claude-3.5 Sonnet max token limit (#6335) 2024-07-18 07:47:06 +08:00
f55876bcc5 fix web import url is too long (#6402) 2024-07-18 01:14:36 +08:00
8a80af39c9 refactor(models&tools): switch to dify_config in models and tools. (#6394)
Co-authored-by: Poorandy <andymonicamua1@gmail.com>
2024-07-17 22:26:18 +08:00
35f4a264d6 fix: default duration (#6393) 2024-07-17 21:19:04 +08:00
6c798cbdaf fix: tool authorization setting panel not validate required fields (#6387) 2024-07-17 21:10:28 +08:00
279f1c986f embed.js add esc exit and fix avoid infinite nesting (#6360)
Co-authored-by: luowei <glpat-EjySCyNjWiLqAED-YmwM>
2024-07-17 20:52:44 +08:00
443e96777b fix: updating with an empty document caused the existing collection to be deleted (#6392) 2024-07-17 20:38:32 +08:00
65bc4e0fc0 Fix issues related to search apps, notification duration, and loading icon on the explore page (#6374) 2024-07-17 20:24:31 +08:00
a6dbd26f75 Add the API documentation for streaming TTS (Text-to-Speech) (#6382) 2024-07-17 19:44:16 +08:00
f3f052ba36 fix: rename model from ernie-4.0-8k-Latest to ernie-4.0-8k-latest (#6383) 2024-07-17 19:07:47 +08:00
1bc90b992b Feat/optimize clean dataset logic (#6384) 2024-07-17 17:36:11 +08:00
fc37887a21 refactor(api/core/workflow/nodes/http_request): Remove mask_authorization_header because it's always true. (#6379) 2024-07-17 16:52:14 +08:00
984658f5e9 fix: workflow sync before export (#6380) 2024-07-17 16:51:48 +08:00
4ed1476531 fix: incorrect config key name (#6371)
Co-authored-by: LionYuYu <lyu@theknotww.com>
2024-07-17 15:52:51 +08:00
ca69e1a2f5 Add multilingual support for TTS (Text-to-Speech) functionality. (#6369) 2024-07-17 14:41:29 +08:00
20f73cb756 fix: default model set wrong (#6327) (#6332)
Co-authored-by: maiyouming <maiyouming@yafex.cn>
2024-07-17 14:14:12 +08:00
4e2fba404d WebscraperTool bypass cloudflare site by cloudscraper (#6337) 2024-07-17 14:13:57 +08:00
7943f7f697 chore: fix legacy API usages of Query.get() by Session.get() in SqlAlchemy 2 (#6340) 2024-07-17 13:54:35 +08:00
7c397f5722 update celery beat scheduler time to env (#6352) 2024-07-17 02:31:30 +08:00
06fcc0c650 Fix tts api err (#6349)
Co-authored-by: luowei <glpat-EjySCyNjWiLqAED-YmwM>
Co-authored-by: crazywoola <427733928@qq.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-07-16 21:53:57 +08:00
0de224b153 fix wrong usage of RetrievalMethod enum (#6345) 2024-07-16 19:09:04 +08:00
ed9e692263 feat: bedrock model runtime enhancement (#6299) 2024-07-16 15:54:39 +08:00
cc0c826f36 Add tool: Google Translate (#6156) 2024-07-16 15:28:33 +08:00
0099ef6896 fix: better qr code panel and webapp url regen confirmation (#6321) 2024-07-16 15:06:49 +08:00
55d7374ab7 Docs: Translate (#6329) 2024-07-16 15:01:25 +08:00
988aa4b5da update clean_unused_datasets_task timedelta (#6324) 2024-07-16 13:43:04 +08:00
c5d06e7943 dep: bump Pydantic from 2.7 to 2.8 (#6273) 2024-07-16 13:40:58 +08:00
23e8043160 fix: prompt editor new line (#6310) 2024-07-16 11:23:26 +08:00
d66d7146a3 chore: update Azure GA version 2024-06-01 (#6307) 2024-07-16 10:32:18 +08:00
eabfd84ceb bump to 0.6.14 (#6294) 2024-07-15 21:01:09 +08:00
d320d1468d Feat/delete file when clean document (#5882) 2024-07-15 19:57:05 +08:00
b47fa27a35 fix: zhipuai validate error when user's api key not support for chatglm_turbo in issue #6289 (#6290) 2024-07-15 19:27:18 +08:00
68ad9a91b2 fix: validateColorHex: cannot read properties of undefined (reading 'length') (#6242) 2024-07-15 19:26:00 +08:00
c17a4165c1 6282 i18n add support for Italian (#6288) 2024-07-15 19:25:07 +08:00
96c171805a Update bedrock.yaml (#6281) 2024-07-15 16:53:03 +08:00
9a536979ab feat(frontend): workflow import dsl from url (#6286) 2024-07-15 16:24:03 +08:00
46a5294d94 feat(backend): support import DSL from URL (#6287) 2024-07-15 16:23:40 +08:00
ec181649ae Update model provider configuration for Triton Inference Server and X… (#6274) 2024-07-15 15:07:28 +08:00
4fdcb30ff8 fix: custom tool input number fail (#6200)
Co-authored-by: jinqi.guo <jinqi.guo@ubtrobot.com>
2024-07-14 22:11:13 +08:00
07add06c59 Feat/add zhipu CogView 3 tool (#6210) 2024-07-13 17:39:17 +08:00
a7b33b55e8 Fix mermaid render (#6088)
Co-authored-by: 靖谦 <jingqian@kaiwu.cloud>
2024-07-12 20:09:24 +08:00
0cbbaf3f68 fix: markdown proc will remove image (#5855) 2024-07-12 20:07:22 +08:00
c564f32ab6 fix: remove the maximum length limit of "paragraph" variable (#6234) 2024-07-12 19:58:42 +08:00
7c2c949f01 Update ernie_bot.py (#6236) 2024-07-12 19:54:53 +08:00
066168da52 fix: model-provider-card-style (#6246) 2024-07-12 17:17:07 +08:00
1df71ec64d refactor(api): switch to dify_config with Pydantic in controllers and schedule (#6237) 2024-07-12 16:51:43 +08:00
a9ee52f2d7 Fix/firecrawl parameters issue (#6213) 2024-07-12 12:59:50 +08:00
7b225a5ab0 refactor(services/tasks): Switch to dify_config with Pydantic (#6203) 2024-07-12 12:25:38 +08:00
d7a6f25c63 fix: differentiate prompts fields based on function_calling_type (#5880) 2024-07-12 11:07:38 +08:00
f46792334c chore: remove underscore in util class name and css variable (#6221) 2024-07-12 11:07:24 +08:00
ee3936916f upgrade deepseek params (#6215) 2024-07-12 10:55:44 +08:00
109de52fe2 Fix: When editing an Agent, selecting custom tools does not allow filtering by labels. (#6197)
Co-authored-by: dufei <du_fei@venusgroup.com.cn>
2024-07-12 09:02:25 +08:00
10dd0f3fa0 fix document error for "/workflows/:task_id/stop" (#6209) 2024-07-12 08:33:50 +08:00
2f064c68bc Create ernie-4.0-turbo-8k-preview (#6132) 2024-07-11 20:20:07 +08:00
079583eaa4 fix: Correct environment variable name (#6184)
Co-authored-by: liuzhenghua-jk <liuzhenghua-jk@360shuke.com>
2024-07-11 20:16:52 +08:00
0e82072323 Fix if_else node compatibility with historical workflows. (#6186) 2024-07-11 17:13:16 +08:00
678ad6b7eb Fix/file stream azure blob (#6196) 2024-07-11 17:01:03 +08:00
63e34e5227 feat: support MyScale vector database (#6092) 2024-07-11 15:21:59 +08:00
c606295ea6 fix: data not updated (#6161) 2024-07-11 11:09:14 +08:00
27d72e30ad fix: can add a custom tool without a name (#6172) 2024-07-11 11:08:50 +08:00
5660878f7b chore: update the tool's doc (#6167) 2024-07-11 11:02:58 +08:00
12e55b2cac chore: update i18n for #6069 (#6163) 2024-07-11 10:02:35 +08:00
97e094dfd8 chore: update i18n for #5943 (#6162) 2024-07-10 23:28:02 +08:00
9622fbb62f feat: app rate limit (#5844)
Co-authored-by: liuzhenghua-jk <liuzhenghua-jk@360shuke.com>
Co-authored-by: takatost <takatost@gmail.com>
2024-07-10 21:31:35 +08:00
cc8dc6d35e Revert "chore: update the tool's doc" (#6153) 2024-07-10 19:57:12 +08:00
215661ef91 feat: add PerfXCloud, Qwen series #6116 (#6117) 2024-07-10 18:26:10 +08:00
Joe
5a3e09518c feat: add if elif (#6094) 2024-07-10 18:22:51 +08:00
ebba124c5c feat: workflow if-else support elif (#6072) 2024-07-10 18:20:13 +08:00
a62325ac87 feat: add until className defines (#6141) 2024-07-10 15:34:56 +08:00
1d2ab2126c chore: update the tool's doc (#6122) 2024-07-10 12:42:34 +08:00
b07dea836c feat(embed): enhance config and add custom styling support (#5781) 2024-07-10 09:27:24 +08:00
f9d00e0498 chore: use poetry for linter tools installation and bump Ruff from 0.4 to 0.5 (#6081) 2024-07-09 23:06:23 +08:00
757ceda063 chore(deps): bump braces from 3.0.2 to 3.0.3 in /web (#6098)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-09 23:05:12 +08:00
d27e3ab99d chore: remove unresolved reference (#6110) 2024-07-09 23:04:44 +08:00
Joe
ce930f19b9 fix dataset operator (#6064)
Co-authored-by: JzoNg <jzongcode@gmail.com>
2024-07-09 17:47:54 +08:00
3b14939d66 Chore: new tailwind vars (#6100) 2024-07-09 16:37:59 +08:00
279caf033c fix: node-title-is-overflow-in-checklist (#5870) 2024-07-09 15:12:41 +08:00
eff280f3e7 feat: tailwind related improvement (#6085) 2024-07-09 15:05:40 +08:00
7c70eb87bc feat: support AnalyticDB vector store (#5586)
Co-authored-by: xiaozeyu <xiaozeyu.xzy@alibaba-inc.com>
2024-07-09 13:32:04 +08:00
6ef401a9f0 feat: add tts-streaming config and feature (#5492) 2024-07-09 11:33:58 +08:00
b29a36f461 Feat: add index bar to select tool panel of workflow (#6066)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-07-09 09:43:34 +08:00
17f22347ae bump to 0.6.13 (#6078) 2024-07-08 23:23:07 +08:00
22aaf8960b fix: Inconsistency Between Actual and Debug Input Variables (#6055) 2024-07-08 22:27:55 +08:00
0046ef7707 refactor: revamp picker block (#4227)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-07-08 21:56:09 +08:00
68b1d063f7 chore: remove tsne unused code (#6077) 2024-07-08 21:25:19 +08:00
5e6c3001bd fix: relative in overflow div (#5998) 2024-07-08 18:35:12 +08:00
7ed4e963aa chore(action): move docker login above Set up QEMU in build-push action workflow (#6073) 2024-07-08 18:32:29 +08:00
001d868cbd remove clunky welcome message (#6069) 2024-07-08 18:17:41 +08:00
6610b4cee5 feat: add request_params field to jina_reader tool (#5610) 2024-07-08 18:11:50 +08:00
cbbe28f40d fix azure stream download (#6063) 2024-07-08 17:13:16 +08:00
603187393a chore: hide tracing introduce detail (#6049) 2024-07-08 10:50:18 +08:00
411e938e3b Address the issue of the absence of poetry in the development container. (#6036)
Co-authored-by: ox01024@163.com <Waffle>
2024-07-08 09:44:07 +08:00
610da4f662 Fix authorization header validation to handle bearer types correctly - "authorization config header is required" error (#6040) 2024-07-08 00:09:59 +08:00
3ec80f9dda Fix/6034 get random order of categories in explore and workflow is missing in zh hant (#6043) 2024-07-07 17:06:47 +08:00
Mab
91c5818236 Modify slack webhook url validation to allow workflow (#6041) (#6042)
Co-authored-by: Shunsuke Mabuchi <mabuchs@amazon.co.jp>
2024-07-07 14:09:20 +08:00
c436454cd4 fix(configs): Update pydantic settings in config files (#6023) 2024-07-07 12:18:15 +08:00
a877d4831d Fix/incorrect parameter extractor memory (#6038) 2024-07-07 12:17:34 +08:00
d522308a29 chore: optimize memory fetch performance (#6039) 2024-07-07 08:54:24 +08:00
85744b72e5 feat: support moonshot and glm base models for volcengine provider (#6029) 2024-07-07 01:17:33 +08:00
f0b7051e1a Optimize db config (#6011) 2024-07-07 01:06:51 +08:00
3b23d6764f fix: token count includes base64 string of input images (#5868) 2024-07-06 16:53:32 +08:00
9b7c74a5d9 chore: skip pip upgrade preparation in api dockerfile (#5999) 2024-07-06 14:17:34 +08:00
4d105d7bd7 feat(*): Switch to dify_config. (#6025) 2024-07-06 12:05:13 +08:00
eee779a923 fix: the input field of tool panel not worked as expected (#6003) 2024-07-06 09:54:30 +08:00
ab847c81fa Add 2 firecrawl tools : Scrape and Search (#6016)
Co-authored-by: -LAN- <laipz8200@outlook.com>
2024-07-06 09:45:39 +08:00
b217ee414f test(test_rerank): Remove duplicate test cases. (#6024) 2024-07-06 09:44:50 +08:00
23dc6edb99 chore: optimize memory messages fetch count limit (#6021) 2024-07-06 03:25:38 +08:00
79df8825c8 Revert "feat: knowledge admin role" (#6018) 2024-07-05 21:31:34 +08:00
71c50b7e20 feat: add Llama 3 and Mixtral model options to ddgo_ai.yaml (#5979)
Signed-off-by: K8sCat <k8scat@gmail.com>
2024-07-05 21:11:15 +08:00
af98fd29bf fix: add status_code 304 (#6000)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: crazywoola <427733928@qq.com>
2024-07-05 21:10:33 +08:00
cddea83e65 6014 i18n add support for spanish (#6017) 2024-07-05 21:05:33 +08:00
9f16739518 [Feature] Support loading for mermaid. (#6004)
Co-authored-by: 靖谦 <jingqian@kaiwu.cloud>
2024-07-05 21:01:50 +08:00
Joe
3f0da88ff7 fix: update workflow trace query (#6010) 2024-07-05 18:37:26 +08:00
cc63af8e72 Removed firecrawl-py, fixed and improved firecrawl tool (#5896)
Co-authored-by: -LAN- <laipz8200@outlook.com>
2024-07-05 18:04:51 +08:00
bf2268b0af fix API tool's schema not support array (#6006) 2024-07-05 17:11:59 +08:00
00b4cc3cd4 feat: implement forgot password feature (#5534) 2024-07-05 13:38:51 +08:00
f546db5437 fix: document truncation and loss in notion document sync (#5631)
Co-authored-by: Aurelius Huang <cm.huang@aftership.com>
2024-07-05 11:48:17 +08:00
f8aaa57f31 feat: add retry mechanism for zhipuai (#5926) 2024-07-05 10:49:18 +08:00
cabcf94be3 fix: TENCENT_VECTOR_DB_REPLICAS can be set to 0 (#5968)
Co-authored-by: jianglin <jianglin@wangxiaobao.com>
2024-07-05 08:32:28 +08:00
2d6624cf9e typo: Update README.md (#5987) 2024-07-04 22:50:27 +08:00
02982df0d4 fix: Fix some type error in http executor. (#5915) 2024-07-04 19:34:37 +08:00
421a24c38d refactor(api/app.py): Simplify the retrieval of debug settings. (#5916) 2024-07-04 18:19:06 +08:00
d7f75d17cc Chore/remove-unused-code (#5917) 2024-07-04 18:18:26 +08:00
Joe
5d9ad430af feat: knowledge admin role (#5965)
Co-authored-by: JzoNg <jzongcode@gmail.com>
Co-authored-by: jyong <718720800@qq.com>
2024-07-04 16:21:40 +08:00
46eca01fa3 fix: no json output vars in front-page tool (#5943) 2024-07-04 16:15:56 +08:00
4d9c22bfc6 refactor: optimize-the-performance-of-var-reference-picker (#5918) 2024-07-04 15:33:36 +08:00
52e59cf4df chore: enhance firecrawl user experience (#5958) 2024-07-04 15:26:38 +08:00
Joe
688b8fe114 fix: langfuse logical operator error (#5948) 2024-07-04 13:47:15 +08:00
aecdfa2d5c feat: add claude3 function calling (#5889) 2024-07-03 22:21:02 +08:00
cb8feb732f refactor: Create a dify_config with Pydantic. (#5938) 2024-07-03 21:09:23 +08:00
c490bdfbf9 fix: zhipuai pytest correction (#5934) 2024-07-03 19:19:33 +08:00
e7494d632c docs(api/core/tools/docs/en_US/tool_scale_out.md): Format by markdownlint. (#5903) 2024-07-03 13:13:38 +08:00
e3006f98c9 chore: remove dify SaaS URL in default configs (#5888) 2024-07-02 22:49:18 +08:00
b34baf1e3a feat: pr template (#5886) 2024-07-02 22:38:17 +08:00
372dc7ac1a fix bug : TencentVectorDBConfig Add TENCENT_VECTOR_DB_DATABASE (#5879) 2024-07-02 21:31:14 +08:00
66a62e6c13 refactor(api/core/app/apps/base_app_generator.py): improve input validation and sanitization in BaseAppGenerator (#5866) 2024-07-02 18:58:07 +08:00
04c0a9ad45 chore: click area that trigger showing tracing config is too large (#5878) 2024-07-02 18:28:41 +08:00
0944ca9d91 Fix/remove tsne position test (#5858)
Co-authored-by: StyleZhang <jasonapring2015@outlook.com>
2024-07-02 17:57:42 +08:00
d468f8b75c Fix/docker env namings (#5857)
Co-authored-by: dahuahua <38651850@qq.com>
2024-07-02 16:14:34 +08:00
6e256507d3 doc: docker-compose won't start due to wrong README (#5859) 2024-07-02 16:10:22 +08:00
a33ce09e6e fix: retrieval setting document link 404 (#5861) 2024-07-02 16:02:17 +08:00
9f973bb703 chore: remove .env.example duplicate key (#5853) 2024-07-02 15:10:18 +08:00
6d0a605c5f fix: not show opening question if the opening message is empty (#5856) 2024-07-02 15:04:02 +08:00
a948bf6ee8 fix: react.js error 185 maximum update depth exceeded in streaming responses during conversation (#5849)
Co-authored-by: 林星宇 <xy@udxx.cn>
2024-07-02 14:26:56 +08:00
b8999e367a Ensure *.sh are LF-style, so that they can be used directly by Docker for Windows (#5793) 2024-07-02 13:38:18 +08:00
Joe
59ad091e69 feat: add export permission (#5841) 2024-07-02 13:37:37 +08:00
Joe
598e030a7e feat: update LangfuseConfig host config (#5846) 2024-07-02 13:14:07 +08:00
774a17cedf fix: unable to select workspace at the bottom (#5785)
Co-authored-by: wxfanghongtai <wxfanghongtai@gf.com.cn>
2024-07-02 13:10:50 +08:00
d889e1b233 fix: output variable name may be duplicate (#5845) 2024-07-02 13:02:59 +08:00
32d85fb896 chore: Update some type hints in config. (#5833) 2024-07-02 08:50:02 +08:00
af308b99a3 sync delete app table record when delete app (#5819) 2024-07-02 08:48:29 +08:00
49d9c60a53 chore: update i18n for #5811 (#5838) 2024-07-02 08:47:53 +08:00
ed83df972f Chore/remove extra docker middleware variables (#5836)
Co-authored-by: dahuahua <38651850@qq.com>
2024-07-01 23:34:00 +08:00
3124728e03 Fix/docker nginx https config (#5832)
Co-authored-by: dahuahua <38651850@qq.com>
2024-07-01 23:15:26 +08:00
af469ea5bd add provision scripts repo link for azure to readme (#5820) 2024-07-01 20:44:47 +08:00
2a27568537 Enhance: tools wecom bot support markdown message (#5791) 2024-07-01 18:19:47 +08:00
1d3e96ffa6 add support oracle oci object storage (#5616) 2024-07-01 17:21:44 +08:00
Joe
b7b1396c51 fix: ops trace slow db (#5812) 2024-07-01 17:09:53 +08:00
71bcf75d9a Feat/add delete knowledge confirm (#5810) 2024-07-01 17:06:51 +08:00
850c2273ee feat: Nominatim OpenStreetMap search tool (#5789) 2024-07-01 16:34:32 +08:00
78d41a27cc feat: knowledge used by app can still be removed (#5811) 2024-07-01 16:14:49 +08:00
0f8625cac2 fix: ssrf proxy and nginx entrypoint command in docker-compose files (#5803) 2024-07-01 14:48:27 +08:00
cb09dbef66 feat: correctly delete applications using Celery workers (#5787) 2024-07-01 14:21:17 +08:00
5692f9b33b fix: signin url (#5800) 2024-07-01 14:13:32 +08:00
fdfbbde10d [seanguo] modify bedrock Claude3 invoke method to converse API (#5768)
Co-authored-by: Chenhe Gu <guchenhe@gmail.com>
2024-07-01 04:36:13 +08:00
a27462d58b Chore/improve docker compose (#5784) 2024-07-01 01:11:33 +08:00
91da622df5 chore: merge CODE_EXECUTION_API_KEY into SANDBOX_API_KEY in the docker-compose.yaml (#5779) 2024-06-30 21:39:48 +08:00
373b5047fd chore: fulfill default value in docker compose yaml (#5778) 2024-06-30 21:17:53 +08:00
36610b6acf fix: can’t change exec permissions after mounting docker-entrypoint.sh for nginx and ssrf-proxy services causing startup failures (#5776) 2024-06-30 20:18:53 +08:00
eab0ac3a13 chore: remove port expose in docker compose (#5754)
Co-authored-by: Chenhe Gu <guchenhe@gmail.com>
2024-06-30 10:31:31 +08:00
Joe
f637ae4794 fix: langsmith message_trace end_user_data session_id error (#5759) 2024-06-30 01:12:16 +08:00
Joe
ffb07eb24b fix: workflow trace none type error (#5758) 2024-06-29 23:32:52 +08:00
f101fcd0e7 fix: missing process data in parameter extractor (#5755) 2024-06-29 23:29:43 +08:00
fc0f75d13b Docs/add docker dotenv notes (#5750) 2024-06-29 22:09:59 +08:00
1e045a0187 fix: slow sql of ops tracing (#5749) 2024-06-29 20:28:30 +08:00
cdf64d4ee2 Update docker-compose.yaml (#5745) 2024-06-29 18:35:32 +08:00
8fd75e6965 bump to 0.6.12-fix1 (#5743) 2024-06-29 17:43:20 +08:00
0b8faade6f fix: empty env SMTP_PORT caused an error when launching (#5742) 2024-06-29 17:34:12 +08:00
d56cedfc67 fix: app config does not use empty string in the env (#5741) 2024-06-29 17:15:25 +08:00
906857b28a fix: couldn't log in or resetup after a failed setup (#5739) 2024-06-29 17:07:21 +08:00
9513155fa4 chore: support both $$ and $ latex format (#5723) 2024-06-29 11:24:25 +08:00
a6356be348 Rename README to README.md (#5727) 2024-06-29 00:53:14 +08:00
f33ef92f0c Chore/set entrypoint scripts permissions (#5726) 2024-06-29 00:48:34 +08:00
d435230059 add README for new docker/ directory (#5724) 2024-06-29 00:29:44 +08:00
6d0cea5fe6 bump to 0.6.12 (#5712) 2024-06-28 22:00:19 +08:00
2996358cf2 Ignore new middleware.env docker file (#5715) 2024-06-28 21:14:18 +08:00
0bf4817474 fix: _convert_prompt_message_to_dict parameters err (#5716) 2024-06-28 21:00:00 +08:00
8e5569f773 fix: fix-app-site-missing command (#5714) 2024-06-28 20:33:53 +08:00
d30c13891b feat: add fix-app-site-missing command (#5711) 2024-06-28 20:20:23 +08:00
a2c260fba0 add docker-legacy and docker/nginx/conf.d/default.conf to .gitignore (#5707) 2024-06-28 19:47:24 +08:00
017d2c804b fix: do not remove (#5706) 2024-06-28 19:33:42 +08:00
b5efc79bc5 build(deps): bump braces from 3.0.2 to 3.0.3 in /web (#5705)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-28 18:21:15 +08:00
e2037cf22b fix: yarn lock file missing (#5703) 2024-06-28 17:52:44 +08:00
68ac433218 feat: add support Spark4.0 (#5688) 2024-06-28 17:39:11 +08:00
b7d23408ad Correction of Typo in French (#5699) 2024-06-28 17:38:37 +08:00
488e3c3d56 Chore/improve deployment flow (#4299)
Co-authored-by: 天魂 <365125264@qq.com>
2024-06-28 17:37:52 +08:00
Joe
dd5f3873da feat: change TRACE_QUEUE_MANAGER_INTERVAL default value (#5698)
Co-authored-by: takatost <takatost@gmail.com>
2024-06-28 17:34:58 +08:00
73ce945d40 Feat/add json process tool (#5555) 2024-06-28 11:57:32 +08:00
d37ee498cd fix: do not remove (#5682)
Co-authored-by: hobo.l <hobo.l@binance.com>
2024-06-28 11:19:34 +08:00
b3d6726f65 Feature/add qwen llm (#5659) 2024-06-28 11:06:29 +08:00
f9e4b4e74c Fix docker command (#5681) 2024-06-28 01:23:01 +08:00
2b080b5cfc feature: Add presence_penalty and frequency_penalty parameters to the … (#5637)
Co-authored-by: liuzhenghua-jk <liuzhenghua-jk@360shuke.com>
2024-06-28 00:27:20 +08:00
Joe
e8b8f6c6dd Feat/fix ops trace (#5672)
Co-authored-by: takatost <takatost@gmail.com>
2024-06-28 00:24:37 +08:00
f0ea540b34 feat: xxo enhancement. (#5671) 2024-06-27 17:58:45 +08:00
2a13ef9ae0 chore: rearrange python dependencies in groups (#5603) 2024-06-27 17:52:54 +08:00
91e6ef8655 chore: delete unused resource (#5667) 2024-06-27 17:51:33 +08:00
2119d59da8 fix: knowledge retrieval score threshold setting (#5658) 2024-06-27 14:26:14 +08:00
3ccad33194 feat: add jina new pre-defined rerankers, include: jina-reranker-v2 (#5657) 2024-06-27 13:45:35 +08:00
bafc8a0bde fix: tool call message role according to credentials (#5625)
Co-authored-by: sunxichen <sun.xc@digitalcnzz.com>
2024-06-27 12:35:27 +08:00
92c56fdf2b fix: HTTP request header is overwritten when user set Content-Type (#5628) 2024-06-27 12:31:37 +08:00
dcb72e0067 chore: apply flake8-comprehensions Ruff rules to improve collection comprehensions (#5652)
Co-authored-by: -LAN- <laipz8200@outlook.com>
2024-06-27 11:21:31 +08:00
2e718b85e9 fix(api): language list (#5649)
Co-authored-by: hobo.l <hobo.l@binance.com>
2024-06-27 08:46:53 +08:00
17d2f0bb0d fix(api/configs): Ignore empty environment variables when loading config. (#5647) 2024-06-26 21:39:19 +08:00
0dfdb61ee9 fix: type error in config (#5643) 2024-06-26 21:01:16 +08:00
8d9a459083 fix: remove obsoleted 'version' elements in compose files (#5553) 2024-06-26 20:11:51 +08:00
89a7c70730 chore: add a secondary confirmation dialog when the user delete the tool (#5634) 2024-06-26 19:27:22 +08:00
e1a72e0e2b fix: ro-RO is not a valid language (#5635) 2024-06-26 18:56:31 +08:00
4c0a31d38b FR: #4048 - Add color customization to the chatbot (#4885)
Co-authored-by: crazywoola <427733928@qq.com>
2024-06-26 17:51:00 +08:00
8fa6cb5e03 feat: tracing fe (#5487) 2024-06-26 17:33:57 +08:00
Joe
4e2de638af feat: add ops trace (#5483)
Co-authored-by: takatost <takatost@gmail.com>
2024-06-26 17:33:29 +08:00
31a061ebaa chore: cleanup test_delete_by_document_id method in opensearch vdb test (#5619) 2024-06-26 17:21:36 +08:00
f7234c93af chore(pyproject.toml): Add Ruff formatter config. (#5627) 2024-06-26 16:30:28 +08:00
b7d5849191 Fix link to documentation of nodes (#5623) 2024-06-26 16:28:05 +08:00
af9448e6f2 feat: undo/redo for workflow editor (#3927)
Co-authored-by: StyleZhang <jasonapring2015@outlook.com>
2024-06-26 14:37:12 +08:00
d0fe56a98e fix: populate app configs to system environment variables (#5590) 2024-06-26 14:27:49 +08:00
b8926ea267 fix: DuckDuckGo image search tool error (#5606) 2024-06-26 13:21:40 +08:00
43335b5c87 delete the deprecated method (#5612) 2024-06-26 12:51:50 +08:00
af6e3869ff fix: context icon in chat (#5604) 2024-06-26 10:40:38 +08:00
964f0e1400 fix: Modify the incorrect configuration name for Google storage (#5595)
Co-authored-by: Wenming Pan <pwm@google.com>
2024-06-26 07:54:22 +08:00
b2e2298822 feat: update issue template (#5592) 2024-06-25 22:54:23 +08:00
87ee3e627f chore: fix typo in config descriptions (#5585) 2024-06-25 21:19:56 +08:00
45a3ea6fed fix: add support for FILE type in ToolParameterConverter (#5578) 2024-06-25 18:47:59 +08:00
7c9e88dfb3 Fix/single run panel show parent scrollbar (#5574)
Co-authored-by: StyleZhang <jasonapring2015@outlook.com>
2024-06-25 16:41:49 +08:00
2a0f03a511 refactor: extract cors configs into dify config and cleanup the config class (#5507)
Co-authored-by: takatost <takatost@gmail.com>
2024-06-25 15:48:02 +08:00
ec1d3ddee2 feat: support importing and overwriting workflow DSL (#5511)
Co-authored-by: StyleZhang <jasonapring2015@outlook.com>
2024-06-25 15:46:12 +08:00
cdc2a6f637 The firecrawl tool now supports self-hosting (#5528)
Co-authored-by: takatost <takatost@gmail.com>
2024-06-25 15:15:21 +08:00
023dba9475 fix: revert CI path filters (#5561) 2024-06-24 23:46:15 +08:00
f8d97be932 fix: useless CI style checks (#5559) 2024-06-24 23:31:54 +08:00
0c352eef2d fix: fe tool filter missing different languages handling (#5558) 2024-06-24 23:10:59 +08:00
877a2c144b feat: support predefined models for openrouter (#5494) 2024-06-24 16:31:53 +08:00
f7900f298f chore: refactor the http executor node (#5212) 2024-06-24 16:14:59 +08:00
1e28a8c033 chore: add create_json_message api for tools (#5440) 2024-06-24 15:46:16 +08:00
ba67206bb9 fix(api/model_runtime/azure/llm): Switch to tool_call. (#5541) 2024-06-24 15:35:21 +08:00
41ceb6a4eb Fix: position of log modal (#5538) 2024-06-24 14:57:35 +08:00
13fcd7a901 feat: Add program_name attribute to TiDB connection (#5499)
Signed-off-by: Xiaoguang Sun <sunxiaoguang@gmail.com>
2024-06-24 14:41:07 +08:00
756d9a4bc2 add opensearch default value (#5536) 2024-06-24 14:33:31 +08:00
4a031de0d9 feat: make Citations and Attributions display enabled by default (#5508)
Co-authored-by: StyleZhang <jasonapring2015@outlook.com>
2024-06-24 13:03:49 +08:00
54b8d98cde Fix: resolve issue with embedding model field visibility toggling on datasets page (#5451) 2024-06-24 12:58:36 +08:00
f220d294e0 Fix: custom disclaim (#5535) 2024-06-24 12:50:08 +08:00
8294e97113 Chore: chat log refactor (#5523) 2024-06-24 12:29:14 +08:00
47a5d4527b feat: use root dir to start python and celery (#5515) 2024-06-24 09:53:26 +08:00
dcec9d7bb7 feat: add new features to enhance image and link handling in Jina tool (#5517) 2024-06-24 01:06:26 +08:00
ea29007bc0 fix: apply best practices for the latest buildkit (#5527) 2024-06-24 00:45:33 +08:00
3a626cd251 fix: added error handling for novita ai tool query (#5506)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-06-23 17:59:03 +08:00
e9ce0b10de feat: add Asia/Ho_Chi_Minh timezone (#5521) 2024-06-23 16:47:44 +08:00
c5d64baba4 fix: correct typos (#5510) 2024-06-22 23:01:02 +08:00
29ca6815ae chore: use singular style in middleware config class name (#5502) 2024-06-22 18:26:38 +08:00
5217f7cf69 refactor: extract hosted service configs into dify config (#5504) 2024-06-22 17:41:17 +08:00
57063095c1 fix: summary of duckduckgo_search (#5488) 2024-06-22 13:58:30 +08:00
48757e581e fix: zhipu tool calling, this PR fixes the bug described in issue #5496 (#5469)
Co-authored-by: vccler <vccler@163.com>
Co-authored-by: -LAN- <laipz8200@outlook.com>
2024-06-22 12:41:24 +08:00
LXM
e8ad0339a3 fix: tongyi json output (#5396) 2024-06-22 12:25:23 +08:00
3bbd75f1f2 fix: firecrawl apikey not start with fc- (#5498) 2024-06-22 11:52:53 +08:00
f67b164b0d refactor: extract db configs and celery configs into dify config (#5491) 2024-06-22 10:29:56 +08:00
b05cc3a1e4 refactor: extract storage provider configs into dify configs (#5443) 2024-06-22 10:07:03 +08:00
8890978ad3 chore: use singular style in config class name (#5489) 2024-06-22 09:54:25 +08:00
9a5c423d59 chore: remove pip support for api service (#5453)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: Bowen Liang <liangbowen@gf.com.cn>
2024-06-22 02:05:50 +08:00
6a09409ec9 Add Oracle23ai as a vector datasource (#5342)
Co-authored-by: walter from vm <walter.jin@oracle.com>
2024-06-22 01:48:07 +08:00
27f0ae8416 build: support Poetry as the dependencies tool in api's Dockerfile (#5105)
Co-authored-by: takatost <takatost@gmail.com>
2024-06-22 01:34:08 +08:00
91d38a535f fix: max_tokens of qwen-plus & qwen-plus-chat (#5480) 2024-06-21 16:49:33 +08:00
95c882934e feat: add support for Vertex AI claude-3-5-sonnet@20240620 (#5475)
Co-authored-by: Wenming Pan <pwm@google.com>
2024-06-21 16:45:56 +08:00
e88f5607ac fix: view workflow log detail page crash (#5474) 2024-06-21 14:46:06 +08:00
5d4d65a85b fix: button (#5470) 2024-06-21 14:17:45 +08:00
92ddb410cd feat: option to hide workflow steps (#5436) 2024-06-21 12:51:10 +08:00
1336b844fd feat(api/auth): switch-to-stateful-authentication (#5438) 2024-06-21 12:39:07 +08:00
26b6fd2236 feat: add support for bedrock claude-3-5-sonnet-20240620 (#5461) 2024-06-21 10:21:35 +08:00
ff0f02d809 feat: add support for claude-3-5-sonnet-20240620 (#5452) 2024-06-21 00:23:15 +08:00
b04715d48c fix/i18n: correct indexMethodHighQualityTip (#5431) 2024-06-20 23:12:02 +08:00
da68ea6812 fix: some types of buttons ui breaks (#5437) 2024-06-20 17:08:07 +08:00
7e3f194031 fix: in iteration node picker may show the wrong var type (#5435) 2024-06-20 16:53:10 +08:00
65d34ebb96 refactor: extract vdb configs into pydantic-setting based dify configs (#5426) 2024-06-20 16:24:10 +08:00
142dc0afd7 refactor: Remove unused code in large_language_model.py (#5433) 2024-06-20 16:20:40 +08:00
39c14ec7c1 improve: unify Excel file parsing for both xls and xlsx formats via Pandas (#4965) 2024-06-20 16:14:49 +08:00
0d20df9a51 fix: add notion page in knowledge (#5430) 2024-06-20 15:48:38 +08:00
3db110c0b9 fix: annotation id not pass to update setting (#5429) 2024-06-20 15:25:25 +08:00
23fa3dedc4 fix(core): Fix incorrect type hints. (#5427) 2024-06-20 15:16:21 +08:00
e4259a8f13 fix: workflow note node copy & link style (#5428) 2024-06-20 15:08:12 +08:00
51d34f5936 fix: button component will refresh page (#5420) 2024-06-20 12:44:27 +08:00
a51ec2094f fix: sentry config float type err (#5416) 2024-06-20 11:28:52 +08:00
940c2faea1 fix: prompt editor insert cursor position (#5415) 2024-06-20 11:19:43 +08:00
aed56b1a8f fix: Revert "feat: initial support for Milvus 2.4.x (#3795)" downgrading to 2.3.x for Linux arm64 installation failure (#5414) 2024-06-20 11:18:05 +08:00
a88aa20824 fix: optional parameter missing default value None in http request node (#5413) 2024-06-20 11:07:01 +08:00
2328ed8ffa feat: new icons (#5412) 2024-06-20 11:05:08 +08:00
0105129fa8 fix bug: tencent vdb #5378 (#5408) 2024-06-20 10:37:39 +08:00
b78faa461f Corrected an error in the APi docs (#5398) 2024-06-19 19:21:16 +08:00
b1db581ebe feat: update template (#5395) 2024-06-19 17:43:45 +08:00
e05183c7d2 fix: unnecessary data fetch when switching apps category on explore page (#5155) 2024-06-19 17:33:19 +08:00
c923684edd chore: extract retrieval method literal values into enum (#5060) 2024-06-19 16:05:27 +08:00
9d5a89eab6 feat: add log date timezone (#4623)
Co-authored-by: liuzhenghua-jk <liuzhenghua-jk@360shuke.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: takatost <takatost@gmail.com>
2024-06-19 15:51:00 +08:00
bdf3ea4369 docs(api/README): Remove unnecessary = (#5380) 2024-06-19 15:17:13 +08:00
4c37847ea4 Fix: use new button (#5384) 2024-06-19 14:53:19 +08:00
a3bd5eba02 refactor: refactor the button component using forwardRef (#4379)
Co-authored-by: KVOJJJin <jzongcode@gmail.com>
2024-06-19 14:13:16 +08:00
bb33ffc332 feat: initial support for Milvus 2.4.x (#3795) 2024-06-19 13:55:44 +08:00
3cc6093e4b feat: introduce pydantic-settings for config definition and validation (#5202)
Co-authored-by: -LAN- <laipz8200@outlook.com>
2024-06-19 13:41:12 +08:00
d160d1ed02 feat: support opensearch approximate k-NN (#5322) 2024-06-19 12:44:33 +08:00
a651e7e2da Add sample environment variables for Aliyun OSS (#5366)
Signed-off-by: denverdino <denverdino@gmail.com>
2024-06-19 12:37:30 +08:00
e785cbb81d Fix: multi image preview sign (#5376)
Co-authored-by: huangyusong <huangyusong@yingzi.com>
2024-06-19 12:36:40 +08:00
2b0c779173 feat: default timezone to user's local timezone in activate form (#5374) 2024-06-19 10:27:06 +08:00
a965d1ac98 fix: remove conversation_id description in completion-messages docs (#5372) 2024-06-19 10:11:46 +08:00
0e3113b7ce feat: allow non-english wikipedias to be searched (#5371) 2024-06-19 10:06:47 +08:00
fad36d0cfd docs: Add notes for running Postgres Docker on Windows WSL2 Ubuntu … (#5373) 2024-06-18 23:33:27 +08:00
7d5ebbb611 docs(readme): Optimize the content in the readme file (#5364)
Co-authored-by: 开坦克的贝塔 <k@aircode.io>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-06-18 18:33:22 +08:00
85eee0dfbb Update README.md (#5359) 2024-06-18 18:21:45 +08:00
369a395ee9 fix: resolve issue with cot_agent_runner not analyzing user-uploaded images correctly (#5360)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-06-18 18:15:41 +08:00
4e3d76a1d1 chore: add novita_client to pyproject.toml (#5349) 2024-06-18 14:52:20 +08:00
7450b9acf3 dep: bump chromadb from 0.5.0 to 0.5.1 (#5345) 2024-06-18 14:05:14 +08:00
c7d378555a chore: set build system to Poetry and remove unnecessary settings with package mode disabled (#5263) 2024-06-18 13:27:03 +08:00
5f0ce5811a feat: add flask upgrade-db command for running db upgrade with redis lock (#5333) 2024-06-18 13:26:01 +08:00
9b7fdadce4 fix: wrong token usage in iteration node for streaming result (#5336) 2024-06-18 13:08:40 +08:00
132f5fb3de feat: add Novita AI image generation tool, implemented model search, text-to-image and create tile functionalities (#5308)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-06-18 11:08:25 +08:00
3828d4cd22 feat: support Latex (#5001)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-06-18 10:43:47 +08:00
c7641be093 fix: workflow results in FAIL status due to null reference error (#5332) 2024-06-18 09:33:33 +08:00
8266842809 chore: update llm.py (#5335) 2024-06-18 09:29:14 +08:00
d7213b12cc fix: extract params by function calling for models supporting tool call (#5334) 2024-06-17 23:25:29 +08:00
c163521b9e Update and fix the model param of Deepseek (#5329) 2024-06-17 21:40:04 +08:00
7305713b97 fix: allow special characters in email (#5327)
Co-authored-by: crazywoola <427733928@qq.com>
2024-06-17 21:32:59 +08:00
edffa5666d fix: got unknown type of prompt message in multi-round ReAct agent chat (#5245) 2024-06-17 21:20:17 +08:00
54756cd3b2 chore(core/workflow/utils/variable_template_parser): Refactor VariableTemplateParser class for better readability and maintainability. (#5328) 2024-06-17 21:18:56 +08:00
b73ec87afc fix(core/workflow): Handle special values in node run result outputs (#5321) 2024-06-17 20:41:57 +08:00
61f4f08744 Add bedrock command r models (#4521)
Co-authored-by: Justin Wu <justin.wu@ringcentral.com>
Co-authored-by: Chenhe Gu <guchenhe@gmail.com>
2024-06-17 20:37:46 +08:00
07387e9586 add the filename length limit (#5326) 2024-06-17 20:36:54 +08:00
147a39b984 feat: support tencent cos storage (#5297) 2024-06-17 19:18:52 +08:00
7a758a35fe fix: pin tenacity to 8.3.0 (#5319) 2024-06-17 18:03:42 +08:00
f146bebe5a fix: update Member field error (#5295) 2024-06-17 17:22:16 +08:00
be3512aa57 fix: unable to reindex documents (#5276) 2024-06-17 17:19:43 +08:00
cc4a4ec796 feat: permission and security fixes (#5266) 2024-06-17 16:06:32 +08:00
a1d8c86ee3 chore: upgrade next to 14.1.1 (#5310) 2024-06-17 15:50:41 +08:00
61ebcd8adb Fix: workflow result display (#5299) 2024-06-17 14:36:17 +08:00
24282236f0 fix: unchecked require_summary of duckduckgo search raising an error (#5303) 2024-06-17 14:18:49 +08:00
5a99aeb864 fix(core): Reorder field_validator and classmethod to fit Pydantic V2. (#5257) 2024-06-17 10:04:28 +08:00
e95f8fa3dc Dalle3 add seed (#5288)
Co-authored-by: luowei <glpat-EjySCyNjWiLqAED-YmwM>
Co-authored-by: crazywoola <427733928@qq.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-06-17 09:27:27 +08:00
9a64aa76c1 fix: typo and check (#5287) 2024-06-17 09:15:43 +08:00
42029791e4 fix: add event handler to delete the site when the related app deleted (#5282) 2024-06-17 08:47:26 +08:00
4f60fe7bc6 Fixed wrong /text-to-audio curl example (#5286) 2024-06-17 08:45:51 +08:00
baf5490504 Fix: z-index of delete account modal (#5277) 2024-06-16 20:42:47 +08:00
013bffc161 fix: copyright with latest time (#5271) 2024-06-16 14:39:29 +08:00
c03e6ee41b Feat: support delete account (#5208)
Co-authored-by: crazywoola <427733928@qq.com>
2024-06-16 10:26:39 +08:00
d94279ae75 fix: casting non-string type value for tool parameter options (#5267) 2024-06-16 09:47:20 +08:00
3a423e8ce7 fix: vision model always with low quality (#5253) 2024-06-16 09:46:17 +08:00
37c87164dc fix: respect the interface language specified by the user on the activation success screen (#5258) 2024-06-16 09:37:19 +08:00
4b54843ed7 fix: run agent with Vertex AI Gemini models (#5260)
Co-authored-by: Wenming Pan <pwm@google.com>
2024-06-16 09:36:31 +08:00
ef55d0da78 chore: add icon in .idea (#5259)
Signed-off-by: Gallardot <gallardot@apache.org>
2024-06-16 09:25:11 +08:00
9961cdd7c8 fix: modal z-index cleanup (#5234) 2024-06-15 21:09:19 +08:00
2e842333b1 fix: correct typos in the icons for microsoft (#5243) 2024-06-15 21:02:47 +08:00
6ccde0452a feat: Added hindi translation i18n (#5240) 2024-06-15 21:01:03 +08:00
795714bc2f feat(Tools): Add Serply Web/Job/Scholar/News Search tool for more options (#5186)
Co-authored-by: teampen <136991215+teampen@users.noreply.github.com>
2024-06-15 20:09:33 +08:00
d9bee03ff6 fix: embedding job fails using IAM role (#5252) 2024-06-15 18:57:54 +08:00
4f0488abb5 fix: wrong order of history prompts in ReAct agent mode (#5236) 2024-06-15 10:53:30 +08:00
12c815c597 fix: ExtractSetting optional value missing None as default val (#5238) 2024-06-15 02:58:47 +08:00
d098bdc59b version to 0.6.11 (#5224) 2024-06-15 02:46:24 +08:00
ba5f8afaa8 Feat/firecrawl data source (#5232)
Co-authored-by: Nicolas <nicolascamara29@gmail.com>
Co-authored-by: chenhe <guchenhe@gmail.com>
Co-authored-by: takatost <takatost@gmail.com>
2024-06-15 02:46:02 +08:00
918ebe1620 update tooltip (#5235) 2024-06-15 02:21:46 +08:00
6be0027853 fix: note editor italic (#5230) 2024-06-14 22:31:39 +08:00
bc757f1ddc fix: z-index (#5229) 2024-06-14 22:31:19 +08:00
8da035aac6 Update README.md (#5228) 2024-06-14 22:31:01 +08:00
ef6034abfd fix: allow the name and icon of the web app to be set independently of that of the bot itself (#5225) 2024-06-14 22:16:11 +08:00
0391282b5e fix: initialize site with customized icon and icon_background (#5227) 2024-06-14 22:15:50 +08:00
28554350de feat: support firecrawl frontend code (#5226) 2024-06-14 22:02:41 +08:00
8d1386df0f feat(Tools): Add Feishu multi-dimensional table operation function (#5213)
Co-authored-by: 黎斌 <libin.23@bytedance.com>
Co-authored-by: takatost <takatost@gmail.com>
2024-06-14 21:19:20 +08:00
e7752e8135 chore: development script for syncing Poetry lockfile (#5170) 2024-06-14 20:54:07 +08:00
43c19007e0 fix: workspace member's last_active should be last_active_time, but not last_login_time (#4906) 2024-06-14 20:49:19 +08:00
c6b791d070 fix: number variable cause type error in openai moderation (#5222) 2024-06-14 20:43:03 +08:00
8bcc5a36bb feat: new editor user permission profile (#4435)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: crazywoola <427733928@qq.com>
2024-06-14 20:34:25 +08:00
cdb6c801c1 Fix: http_request delete method not working (#4975) 2024-06-14 20:07:22 +08:00
511ead4b8d Update README, deploy dify with YAML file on Kubernetes (#5131) 2024-06-14 19:53:40 +08:00
4080f7b8ad feat: support tencent vector db (#3568) 2024-06-14 19:25:17 +08:00
9ed21737d5 fix: add repo check for build-push.yml (#5141) 2024-06-14 19:15:27 +08:00
337bad8525 feat: Add Optional API Key, Proxy Server, and Bypass Cache Parameters to Jina Tools (#5197) 2024-06-14 19:09:25 +08:00
Bin
0f35d07052 support ERNIE-4.0-8K-Latest (#5216) 2024-06-14 18:45:24 +08:00
7f44e88eda fix(model_providers/ollama): Fix OllamaLargeLanguageModel to correctly set the stop option (#5217) 2024-06-14 18:26:14 +08:00
b7ff765d8d Add novita.ai as model provider (#4961) 2024-06-14 18:23:06 +08:00
c28d709d7f feat: workflow add note node (#5164) 2024-06-14 17:08:11 +08:00
d7fbae286a add aws s3 iam check (#5174) 2024-06-14 15:19:59 +08:00
0633aae7dc feat: allow to use IAM Role for Bedrock (#5188) 2024-06-14 15:18:42 +08:00
f87f11e92c chore: make the Celery command more noticeable (#5203) 2024-06-14 15:06:07 +08:00
2b04388361 chore: remove bump-pydantic dependency (#5177) 2024-06-14 15:05:17 +08:00
3c0f21d174 fix: workflow as tool create error by type misuse (#5205) 2024-06-14 15:01:09 +08:00
8e2f8ffb9e Modify docs in JP (#5185) 2024-06-14 14:06:23 +08:00
e68d1b88de Fix: conversation id display & support copy (#5195) 2024-06-14 13:58:51 +08:00
ed53ef29f4 fix(core/tools): Fix the issue with iterating over None in _transform_tool_parameters_type. (#5190) 2024-06-14 11:25:48 +08:00
4289f17be2 Chore: refactor embedded chatbot (#5125) 2024-06-14 08:42:41 +08:00
54e02b8147 chore(deps): bump authlib from 1.2.0 to 1.3.1 in /api (#5115)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: takatost <takatost@gmail.com>
2024-06-14 03:55:40 +08:00
7f98c2ea3f refactor: Delete the dataset to verify whether it is in use (#5112) 2024-06-14 03:25:38 +08:00
7189a4c379 chore(deps): bump azure-identity from 1.15.0 to 1.16.1 in /api (#5116)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: takatost <takatost@gmail.com>
2024-06-14 03:24:32 +08:00
415022aa14 fix: pydantic2 error (#5172) 2024-06-14 03:05:04 +08:00
edf2047f04 fix: milvus_vector default dataset index_struct type from weaviate to milvus (#5098) 2024-06-14 02:36:01 +08:00
b85ae146a7 fix: JSON mode with an image doesn't work for Gemini (#5169) 2024-06-14 02:32:09 +08:00
5ec7d85629 fix: issues by pydantic2 upgrade (#5171) 2024-06-14 02:28:28 +08:00
f13af5a811 fix(model_providers/vertex_ai): Vertex AI Anthropic models authentication failed (#4971) 2024-06-14 01:34:31 +08:00
f976740b57 improve: modernizing validation by migrating pydantic from 1.x to 2.x (#4592) 2024-06-14 01:05:37 +08:00
e8afc416dd improve: CI experience (#5168) 2024-06-13 23:16:28 +08:00
0cccf9c67d feat: introduce APP_MAX_EXECUTION_TIME (#5167) 2024-06-13 23:08:05 +08:00
cdc08a434f feat: support Chroma vector store (#5015) 2024-06-13 18:02:18 +08:00
3f18369ad2 Fix: google storage init with sa and download (#5054) 2024-06-13 17:36:34 +08:00
db976a1f74 Upgrade boto3 library to support EKS Pod Identity. (#5064) 2024-06-13 17:36:14 +08:00
e61f5d029a chore(docs): fix minor small typos (#5124) 2024-06-13 17:36:01 +08:00
eaca892c4e fix: front end error when same tool is called twice at once (#5068) 2024-06-13 17:16:59 +08:00
015c26d303 fix: style misalignment and inconsistency (#5149) 2024-06-13 16:32:42 +08:00
8210637bc5 feat: support jina-clip-v1 embedding model (#5146) 2024-06-13 16:31:18 +08:00
790543131a chore:add some new api version for azure openai (#5142) 2024-06-13 16:30:47 +08:00
a40f68cf94 chore: update qdrant_vector.py (#5128) 2024-06-13 15:35:14 +08:00
adc948e87c fix(api/core/model_runtime/model_providers/baichuan,localai): Parse ToolPromptMessage. #4943 (#5138)
Co-authored-by: -LAN- <laipz8200@outlook.com>
2024-06-13 13:08:30 +08:00
742b08e1d5 chore: update question classifier prompt (#5137)
Signed-off-by: 0xff-dev <stevenshuang521@gmail.com>
2024-06-13 13:04:51 +08:00
79e8489942 feat: support siliconflow (#5129) 2024-06-13 12:59:41 +08:00
d6fa130cb5 remove dalle3 seed (#5136)
Co-authored-by: luowei <glpat-EjySCyNjWiLqAED-YmwM>
Co-authored-by: crazywoola <427733928@qq.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-06-13 08:05:55 +08:00
0c92f81efc chore: sync pyproject.toml from requirements.txt (#5130) 2024-06-13 00:06:05 +08:00
11fd4a5dcc Fix: fix load_yaml logging, Avoid setting the log level to warning (#5019)
Co-authored-by: huangyusong <huangyusong@yingzi.com>
2024-06-12 19:27:01 +08:00
b399e8a359 fixed a typo and grammar error in sampled app (#5061) 2024-06-12 18:02:22 +08:00
e04fc9b304 fix: select field not work when it is not required (#5101) 2024-06-12 17:46:53 +08:00
ea69dc2a7e feat: support hunyuan llm models (#5013)
Co-authored-by: takatost <takatost@users.noreply.github.com>
Co-authored-by: Bowen Liang <bowenliang@apache.org>
2024-06-12 17:24:23 +08:00
ecc7f130b4 fix(typo): misspelling (#5094) 2024-06-12 17:01:21 +08:00
95443bd551 chore: workflow syncing modal (#5108) 2024-06-12 16:35:19 +08:00
0ce97e6315 feat: support doubao llm function calling (#5100) 2024-06-12 15:43:50 +08:00
25b0a97851 build: use Poetry as default build system for dependency installation in CI jobs (#5088) 2024-06-12 14:43:03 +08:00
28997772a5 fix: remote_url doesn't work for gemini (#5090) 2024-06-12 13:14:53 +08:00
b7c72f7a97 dalle3 add style consistency parameter (#5067)
Co-authored-by: luowei <glpat-EjySCyNjWiLqAED-YmwM>
Co-authored-by: crazywoola <427733928@qq.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-06-12 12:59:03 +08:00
9f7b38c068 fix: #4970 (#5093) 2024-06-12 11:29:38 +08:00
3b36ba797f feat: add duckduckgo img search, translate, ai chat (#5074) 2024-06-12 10:04:10 +08:00
4d2e6c3391 fix: Google HL Parameter for SearchApi (#5071) 2024-06-12 08:30:01 +08:00
3520d35f38 fix: autoHeightTextarea dimensions in Firefox (#4891) 2024-06-12 08:24:58 +08:00
5f104bab57 Fix: infinite loading not work when message is too short (#5075) 2024-06-12 08:23:39 +08:00
2050a8b8f0 feat: add glm4 new models and zhipu embedding-2 (#5089) 2024-06-12 08:22:17 +08:00
e3544c6ef7 fix: dependency package versions are not synchronized to requirements.txt (#5084) 2024-06-11 22:21:18 +08:00
472b976946 Arabic README.md (#5078) 2024-06-11 18:43:54 +08:00
f62f71a81a build: initial support for poetry build tool (#4513)
Co-authored-by: Bowen Liang <bowenliang@apache.org>
2024-06-11 13:11:28 +08:00
f426e1b3bd 🔧 Fix(docker/volumes/ssrf_proxy/squid.conf): The squid process on ssrf_proxy docker service crashes at startup (#5050) 2024-06-11 12:32:05 +08:00
5f870ac950 chore: update maas model provider description (#5056) 2024-06-11 11:22:22 +08:00
415816cf35 feat: add dataset delete endpoint (#5048) 2024-06-11 11:21:38 +08:00
9103112555 fix: wrong link to web app repo in chatflow mode (#5062) 2024-06-11 11:20:52 +08:00
5986841e27 fix: issue where an error occurs when invoking TTS without selecting a voice (#5046) 2024-06-09 20:28:24 +08:00
2573b138bf fix: update presence_penalty configuration for wenxin AI ernie-4.0-8k and ernie-3.5-8k models (#5039) 2024-06-09 14:44:11 +08:00
308ce66af5 🔧 fix docker-compose ssrf_proxy service WARNING: You should probably remove '::/0' from the ACL named 'all' (#5005) 2024-06-09 14:39:52 +08:00
bdad993901 improve: generalize vector factory classes and vector type (#5033) 2024-06-08 22:29:24 +08:00
3b62ab564a feat: feature modal style (#5032) 2024-06-08 07:32:34 +08:00
d319d9fc5e fix(style): some style issues (#5029) 2024-06-07 20:59:39 +08:00
ea5c8a72e2 Fix language setting not success (#5023) 2024-06-07 20:02:08 +08:00
3b60c28b3a handle external images when extracting docx images (#5024) 2024-06-07 20:00:39 +08:00
ea0219a5d5 Fix: z-index in header (#5017) 2024-06-07 16:01:33 +08:00
481e7bc6b9 Fix/azure blob new version (#5004) 2024-06-06 23:36:13 +08:00
1ccba85c91 fix: modal z-index and cleanup (#4978) 2024-06-06 22:28:13 +08:00
2539e56514 fix: some base models cannot be selected in Azure OpenAI Service setting page (#4985) 2024-06-06 22:27:57 +08:00
3929d289e0 feat: set default memory messages limit to infinite (#5002) 2024-06-06 17:39:44 +08:00
52585aea74 fix: typo in sd3 (#5000) 2024-06-06 17:08:49 +08:00
73dee84cab fix: add handling for non-string type in variable template parser (#4996) 2024-06-06 16:38:13 +08:00
efecdccf35 feat: support login by given mail (#4991) 2024-06-06 15:01:58 +08:00
da5f2e168a fix: llm selector position is incorrect in non-workflow apps (#4982) 2024-06-06 10:47:36 +08:00
Joe
5cdb95be1f fix: gemini timeout error (#4955) 2024-06-06 10:19:03 +08:00
7fa735a43b chore: rename vdb tests for PGVector and PGvectoRS (#4973) 2024-06-06 07:22:49 +08:00
3579fd1b09 feat: add create tenant command (#4974) 2024-06-06 00:42:00 +08:00
237b8fe3d9 add meta.doc_id index for tidb (#4963) 2024-06-05 20:45:43 +08:00
02e4de5166 fix some tidb bugs (#4960) 2024-06-05 19:14:18 +08:00
64c8093c1e Typo in Knowledge settings (#4958) 2024-06-05 18:31:24 +08:00
0797f9bc05 feat: support tidb vector (#4588) 2024-06-05 18:19:53 +08:00
602c4e51ec fix: duckduckgo search does not work (#4949)
Co-authored-by: Jyong <76649700+johnjyong@users.noreply.github.com>
2024-06-05 17:33:58 +08:00
YC
9f8ca75a81 fixing a bug of handling header row when parsing xls file, and tune xls/xlsx parsing result to be more structured (#3600) 2024-06-05 15:28:43 +08:00
80a87f36ea fix: missing iterator in task pipeline (#4948) 2024-06-05 15:10:20 +08:00
63addc9258 fix: missing dataset patch parameters in settings modal (#4901) 2024-06-05 14:21:59 +08:00
f32b440c4a chore: fix indention violations by applying E111 to E117 ruff rules (#4925) 2024-06-05 14:05:15 +08:00
6b6afb7708 fix: import error in web/app/components/header/account-setting/model-provider-page/declarations.ts (#4944) 2024-06-05 14:01:12 +08:00
a4041cb40b fix: end node limit in next step (#4945) 2024-06-05 14:00:47 +08:00
7749b71fff Optimize knowledge retrieval performance by batching dataset queries. (#4917) 2024-06-05 13:30:32 +08:00
3006124e6d feat: pricing page add llm load balancing info (#4942) 2024-06-05 11:31:44 +08:00
3d276f4a7f change "Import from text file" to "Import from file" (#4935) 2024-06-05 09:29:29 +08:00
b20d173324 perf: optimize feature model_load_balancing_enabled value fetch speed… (#4933) 2024-06-05 02:06:19 +08:00
f44d1e62d2 fix: bedrock get_num_tokens prompt_messages parameter name err (#4932) 2024-06-05 01:53:05 +08:00
21ac2afb3a fix: question classifier instruction npe (#4931) 2024-06-05 01:27:58 +08:00
f7dd327bc2 version to 0.6.10 (#4929) 2024-06-05 01:12:20 +08:00
09298a32e7 fix: vanna CVE-2024-5565 by disable visualize of ask func (#4930) 2024-06-05 00:46:22 +08:00
37f292ea91 feat: model load balancing (#4926) 2024-06-05 00:13:29 +08:00
d1dbbc1e33 feat: backend model load balancing support (#4927) 2024-06-05 00:13:04 +08:00
52ec152dd3 fix: incorrect parameters transforming while validating (#4928) 2024-06-05 00:01:30 +08:00
c7bddb637b support instruction in classifier node (#4913) 2024-06-04 20:07:54 +08:00
4e3b0c5aea support rename document (#4915) 2024-06-04 20:07:40 +08:00
b6631cd878 modify rerank and splitter code directory (#4924) 2024-06-04 20:07:25 +08:00
c212700341 fix: router replace in Explore page (#4918) 2024-06-04 19:41:54 +08:00
e121788ff5 chore: make the error msg more clear when validate app token (#4919)
Co-authored-by: Jyong <76649700+johnjyong@users.noreply.github.com>
2024-06-04 18:04:10 +08:00
96460d5ea3 feat: document support rename in dataset (#4732) 2024-06-04 15:10:34 +08:00
9cf9720efa Fix/azure blob token expire (#4914) 2024-06-04 14:30:23 +08:00
2d9f55b632 feat: Add Vanna.AI as a builtin tool (#4878)
Co-authored-by: Yeuoly <admin@srmxy.cn>
2024-06-04 14:05:29 +08:00
7133a16511 chore: refactor the serpapi's google search tool (#4834) 2024-06-04 14:05:05 +08:00
a38dfc006e feat: question classify node support use var in instruction (#4710) 2024-06-04 14:01:40 +08:00
86e7c7321f Fixed a bug where any content in the 'fetch' was converted to True (#4400) 2024-06-04 13:27:23 +08:00
58db719a2c dep: bump pandas from 1.x to 2.x (#4820) 2024-06-04 13:24:28 +08:00
9abeb99b32 chore: modify tools/JinaReader label to Jina (#4908) 2024-06-04 13:21:05 +08:00
d828a7fc35 fix azure blob token expire (#4911) 2024-06-04 13:04:56 +08:00
c6f9ea4434 chore: update page.tsx (#4897) 2024-06-04 10:19:49 +08:00
fb6843815c chore: separate style checks into multiple jobs triggering on file changes (#4876) 2024-06-04 03:03:18 +08:00
b97181a793 chore(deps): bump azure-storage-blob from 12.9.0 to 12.13.0 in /api (#4695)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-04 02:57:33 +08:00
5d15aca85f chore: remove unused code and class in text splitter (#4864) 2024-06-04 02:54:09 +08:00
b98a1a3303 feat: added Anthropic Claude3 models to Google Cloud Vertex AI (#4870)
Co-authored-by: pwm <pwm@google.com>
2024-06-04 02:52:46 +08:00
696c5308a9 chore: optimize nvidia nim credential schema and info (#4898) 2024-06-04 02:26:26 +08:00
3542d55e67 improve: generalize tool parameter converter (#4786) 2024-06-03 21:26:58 +08:00
3c8a120e51 add-nvidia-mim (#4882) 2024-06-03 21:10:18 +08:00
cd24308f20 chore: add issue link template for IDEA (#4866) 2024-06-03 13:39:54 +08:00
69190e088e fix: update npm version to fix Incorrect argument types in createChatMessage (#4865) 2024-06-03 08:22:27 +08:00
d058a234ba Fixed workflow tts feature audition (#4867) 2024-06-03 00:22:14 +08:00
41e536109b fix: Incorrect argument types in createChatMessage (#4861) 2024-06-02 21:03:42 +08:00
f916aa0f92 chore: upgrade sandbox (#4839) 2024-06-02 11:30:14 +08:00
cdbc260571 Bugfix: Vertex AI vision model not support image (#4853) 2024-06-02 11:11:09 +08:00
b234710af9 chore: fix invalid escape sequences by applying W605 rule (#4851) 2024-06-02 10:02:37 +08:00
23498883d4 chore: skip explicit installing jinja2 as testing dependency (#4845) 2024-06-02 09:49:20 +08:00
a47e8d0da2 test: CI test for db migration scripts on changes (#4739) 2024-05-31 16:45:34 +08:00
6dd0e07af8 test: triggering tests on changes and allow cancelling in-progress CI test jobs (#4743) 2024-05-31 16:42:14 +08:00
b1c9671a60 fix: status query not stop when leaving document embedding detail page (#4754) 2024-05-31 16:07:48 +08:00
7aaa1ff270 chore: increase workflow max steps to 500 (#4835) 2024-05-31 15:16:35 +08:00
85698ca4f7 chore: cleanup tools, remove useless code (#4833) 2024-05-31 14:19:59 +08:00
176d91937d fix 'NoneType' and new ContentType supported. (#4818) 2024-05-31 14:19:33 +08:00
e0da0744b5 add: ollama keep alive parameter added. issue #4024 (#4655) 2024-05-31 12:22:02 +08:00
0b4902bdc2 fix: workflow app run (#4831) 2024-05-31 12:15:25 +08:00
e9904e66e6 chore: Enable case-insensitive search for large models (#4817) 2024-05-31 08:55:37 +08:00
3de8e8fd6a Feat/i18n workflow (#4819) 2024-05-30 21:03:32 +08:00
38a470a873 fix: app_count of dataset is error when apps was deleted (#4810) 2024-05-30 19:23:46 +08:00
4308a79e89 fix: revision styles for workflow (#4087) 2024-05-30 19:10:14 +08:00
93d3350c8c update sd-webui api parameters to v1.9.3 (#4798)
Co-authored-by: Your Name <chen@krasus.red>
2024-05-30 19:04:47 +08:00
615c009c42 fix: remove redundant props (#4787) 2024-05-30 18:58:08 +08:00
a325a294bd feat: opportunistic tls flag for smtp (#4794) 2024-05-30 18:56:46 +08:00
4b91383efc feat: workflow variable aggregator support group (#4811)
Co-authored-by: Yeuoly <admin@srmxy.cn>
2024-05-30 18:54:58 +08:00
18ab63bd37 add: i18n: update korean (#4813) 2024-05-30 17:40:35 +08:00
a7fb1ffcd8 feat: show more usage info in billing page (#4808) 2024-05-30 16:15:38 +08:00
11f173693b fix: some filed in model param selector has no left spacing (#4803) 2024-05-30 14:49:41 +08:00
5b2cd8d03a chore: node help link (#4795) 2024-05-30 14:24:53 +08:00
b10e67be3b Add SearchApi tools (#4648) 2024-05-30 11:11:17 +08:00
d41c077fac chore: improve node user experience (#4792) 2024-05-30 10:53:02 +08:00
3175a2c76a fix: in tool and http node of iteration can not show item var correctly (#4791) 2024-05-30 10:40:27 +08:00
3b60b712ec feat: Add logging warning when MAIL_TYPE is not set (#4771) 2024-05-29 18:06:16 +08:00
afed3610fc fix organize agent's history messages without recalculating tokens (#4324)
Co-authored-by: chenyongzhao <chenyz@mama.cn>
2024-05-29 15:25:20 +08:00
74f38eacda feat: support define tags in tool yaml (#4763) 2024-05-29 15:19:14 +08:00
b189faca52 feat: update ernie model (#4756) 2024-05-29 14:57:23 +08:00
d4cd6149ac fix: incorrect workflow max call depth (#4759) 2024-05-29 14:52:28 +08:00
e1cd9aef8f feat: support baichuan3 turbo, baichuan3 turbo 128k, and baichuan4 (#4762) 2024-05-29 14:46:04 +08:00
ba37275503 fix: confusing chart description (#4760) 2024-05-29 14:36:33 +08:00
e01b44af61 style: fix annotation panel display misalignment (#4750) 2024-05-29 14:23:44 +08:00
72a90074bc Add WORKFLOW_CALL_MAX_DEPTH env var. (#4713) 2024-05-29 13:39:11 +08:00
705a6e3a8e Fix/4742 ollama num gpu option not consistent with allowed values (#4751) 2024-05-29 13:33:35 +08:00
f4a240d225 style: the 'all' of add tool panel should contain workflow tools (#4755) 2024-05-29 13:04:23 +08:00
793f0c1dd6 fix: Corrected schema link in model_runtime's README.md (#4757) 2024-05-29 13:03:21 +08:00
008edd0eeb fix: optimize sticky header styles z-index in tools - ProviderList component (#4746) 2024-05-29 08:36:11 +08:00
9e6b6e7b82 fix: workflow run sequence number slow sql (#4737) 2024-05-28 20:41:52 +08:00
164d6e47b9 Show tool i18n name on chat panel (#4724) 2024-05-28 18:58:02 +08:00
88b4d69278 fix: Correct context size for banchuan2-53b and banchuan2-turbo (#4721) 2024-05-28 16:37:44 +08:00
5bcbcd3c57 fix: retrieval value greater than 1 caused UI problem (#4718) 2024-05-28 16:01:19 +08:00
1b2d862973 add error msg for hit test (#4704) 2024-05-28 14:54:53 +08:00
e6f6a59f3b style: update VarPanel to use whitespace-pre-wrap for value display (#4684) 2024-05-28 14:54:29 +08:00
e198bc9b9a fix: workflow as tool garbled (#4707) 2024-05-28 14:51:42 +08:00
b7f81f0999 fix: the new node name is generated based on the original node when duplicating (#4675) 2024-05-28 13:50:43 +08:00
eb8dc15ad6 fix: Input fields in the model provider's settings modal do not switch sequence via keyboard navigation (Tab key) (#4662) 2024-05-28 11:34:44 +08:00
2ee3a1b6f3 fix: key-value-table styles (#4678) 2024-05-28 10:57:40 +08:00
0960b17fbc Add workflow translations for ja-JP (#4698)
Co-authored-by: crazywoola <427733928@qq.com>
2024-05-28 10:27:35 +08:00
6534566b7e feat: add América/São Paulo tz (#4701) 2024-05-28 10:12:18 +08:00
e60350d95d version to 0.6.9 (#4692) 2024-05-27 22:48:34 +08:00
f40743183e 🔧 Add env variable for time signature (#4650) 2024-05-27 22:20:49 +08:00
e852a21634 Feat/workflow phase2 (#4687) 2024-05-27 22:01:11 +08:00
45deaee762 feat: workflow new nodes (#4683)
Co-authored-by: Joel <iamjoel007@gmail.com>
Co-authored-by: Patryk Garstecki <patryk20120@yahoo.pl>
Co-authored-by: Sebastian.W <thiner@gmail.com>
Co-authored-by: 呆萌闷油瓶 <253605712@qq.com>
Co-authored-by: takatost <takatost@users.noreply.github.com>
Co-authored-by: rechardwang <wh_goodjob@163.com>
Co-authored-by: Nite Knite <nkCoding@gmail.com>
Co-authored-by: Chenhe Gu <guchenhe@gmail.com>
Co-authored-by: Joshua <138381132+joshua20231026@users.noreply.github.com>
Co-authored-by: Weaxs <459312872@qq.com>
Co-authored-by: Ikko Eltociear Ashimine <eltociear@gmail.com>
Co-authored-by: leejoo0 <81673835+leejoo0@users.noreply.github.com>
Co-authored-by: JzoNg <jzongcode@gmail.com>
Co-authored-by: sino <sino2322@gmail.com>
Co-authored-by: Vikey Chen <vikeytk@gmail.com>
Co-authored-by: wanghl <Wang-HL@users.noreply.github.com>
Co-authored-by: Haolin Wang-汪皓临 <haolin.wang@atlaslovestravel.com>
Co-authored-by: Zixuan Cheng <61724187+Theysua@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: Bowen Liang <bowenliang@apache.org>
Co-authored-by: Bowen Liang <liangbowen@gf.com.cn>
Co-authored-by: fanghongtai <42790567+fanghongtai@users.noreply.github.com>
Co-authored-by: wxfanghongtai <wxfanghongtai@gf.com.cn>
Co-authored-by: Matri <qjp@bithuman.io>
Co-authored-by: Benjamin <benjaminx@gmail.com>
2024-05-27 21:57:08 +08:00
444fdb79dc fix typo: stopParameerRule -> stopParameterRule (#4681) 2024-05-27 20:42:07 +08:00
140dd873f1 fix: show exception message when sandbox execution fails (#4663) 2024-05-27 18:06:15 +08:00
27dae156db fix: colon in file mistral.mistral-small-2402-v1:0 (#4673) 2024-05-27 13:15:20 +08:00
2deb23e00e fix: Show rerank in system for localai (#4652) 2024-05-27 12:09:51 +08:00
8152bc6fbf Feat/upgrade check i18n scripts (#4671) 2024-05-27 10:36:34 +08:00
af026c5953 fix: node.js sdk: if request is a GET, data must not exist (#4618) 2024-05-27 08:48:07 +08:00
11275cbaaf fix: z-index (#4065) 2024-05-27 08:47:27 +08:00
fe9bf5fc4a [seanguo] add support of amazon titan v2 and modify the price of amazon titan v1 (#4643)
Co-authored-by: Chenhe Gu <guchenhe@gmail.com>
2024-05-26 23:30:22 +08:00
cd4924d472 Fix tts audition (#4656)
Co-authored-by: luowei <glpat-EjySCyNjWiLqAED-YmwM>
Co-authored-by: crazywoola <427733928@qq.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-05-26 21:00:36 +08:00
f56b984d97 Fix Unnecessary Newline Characters in Extracted Tool Response Text (#4646)
Co-authored-by: kronus <kronus@istarshine.com>
2024-05-25 15:24:59 +08:00
f804adbff3 feat: Support for Vertex AI - load Default Application Configuration (#4641)
Co-authored-by: miendinh <miendinh@users.noreply.github.com>
Co-authored-by: crazywoola <427733928@qq.com>
2024-05-25 13:40:25 +08:00
109aabc6f2 fix: incorrect handling when http header value contain multiple colons. (#4574) 2024-05-25 13:20:06 +08:00
ad620f02c7 fix: the date is incorrect if the db field is timestamp and the TZ is not the UTC (#4624)
Co-authored-by: liuzhenghua-jk <liuzhenghua-jk@360shuke.com>
2024-05-25 13:11:18 +08:00
5bd432a85f Fix tts audition (#4637)
Co-authored-by: luowei <glpat-EjySCyNjWiLqAED-YmwM>
Co-authored-by: crazywoola <427733928@qq.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-05-25 12:55:04 +08:00
026175c8f7 feat: update notion extractor (#3898)
Co-authored-by: duyalei <>
2024-05-24 20:30:48 +08:00
f156014daa update lite8k/speed8k/128k max_token to newest (#4636)
Co-authored-by: Your Name <chen@krasus.red>
2024-05-24 19:33:42 +08:00
2cd55e456a fix: WORKFLOW_MAX_EXECUTION_STEPS spell error in config.py (#4642) 2024-05-24 16:32:26 +08:00
8c2ca60c8b feat: Add WORKFLOW_MAX_EXECUTION_TIME env var (#4632) 2024-05-24 15:27:12 +08:00
3fda2245a4 improve: extract method for safe loading yaml file and avoid using PyYaml's FullLoader (#4031) 2024-05-24 12:08:12 +08:00
296887754f Support for Vertex AI (#4586) 2024-05-24 12:01:40 +08:00
9ae72cdcf4 feat: Add Gemini Flash (#4616) 2024-05-24 11:43:06 +08:00
16fec084f5 Fix/4630 bug api suggested (#4633) 2024-05-24 11:11:12 +08:00
10c61da686 feat: add confirm ui (#4625) 2024-05-23 20:15:51 +08:00
24624491cd add qdrant metadata.doc_id index when create qdrant collection (#4570) 2024-05-23 18:11:01 +08:00
233c4150d1 support images and tables extract from docx (#4619) 2024-05-23 18:05:23 +08:00
5893ebec55 fix: code node garbled in Javascript (#4615) 2024-05-23 17:18:57 +08:00
11642192d1 chore: add https://api.openai.com placeholder in OpenAI api base (#4604) 2024-05-23 12:56:05 +08:00
e57bdd4e58 chore:update gpt-3.5-turbo and gpt-4-turbo parameter for azure (#4596) 2024-05-23 11:51:38 +08:00
461488e9bf Add Azure OpenAI API version for GPT4o support (#4569)
Co-authored-by: wwwc <wwwc@outlook.com>
2024-05-22 17:43:16 +08:00
2988b67c24 fix: hide automatic button on automatic result page (#4494) 2024-05-22 16:44:20 +08:00
4f62541bfb chore: remove model provider free token link (#4579) 2024-05-22 16:42:49 +08:00
24576a39e5 fix: some google search result raise exception (#4567) 2024-05-22 14:28:52 +08:00
3ab19be9ea Fix bedrock claude wrong pricing (#4572)
Co-authored-by: Justin Wu <justin.wu@ringcentral.com>
2024-05-22 14:28:28 +08:00
5b009a5afb chore(api): Use channel from UI as API query parameter (#4562) 2024-05-22 14:28:03 +08:00
3efb5fe7e2 Refactor part of the ProviderManager code to improve readability (#4524) 2024-05-22 11:18:03 +08:00
ee53f98d8c Hide the copy button when there is no content to copy (#4546) 2024-05-22 11:15:13 +08:00
d5a33a0323 feat:add gpt-4o for azure (#4568) 2024-05-22 11:02:43 +08:00
f25927855e add qdrant metadata.doc_id index (#4559) 2024-05-22 01:42:08 +08:00
c873035084 oauth2 supports. (#4551) 2024-05-21 17:52:41 +08:00
f32bba6531 Update requirements.txt with latest OSS package with AuthV4 support (#4425) 2024-05-20 17:26:07 +08:00
40bc936739 chore: update yfinance dependency to version 0.2.40 (#4517) 2024-05-20 16:47:05 +08:00
6b5685ef0c feat: Jina Search & Jina Reader CSS selectors (#4523) 2024-05-20 16:40:46 +08:00
e8e213ad1e chore: apply and fix flake8-bugbear lint rules (#4496) 2024-05-20 16:34:13 +08:00
5f4df34829 improve: generalize transformations and scripts of runner and preloads into TemplateTransformer (#4487) 2024-05-20 15:56:26 +08:00
c255a20d7c allow to config max segmentation tokens length for RAG document using environment variable (#4375) 2024-05-20 13:20:27 +08:00
b5204111da Add UNSTRUCTURED_API_KEY env support (#4369) 2024-05-20 13:14:17 +08:00
3a51f2a778 fix: workaround db migration error when adding custom_disclaimer column to recommended_apps (#4518)
Co-authored-by: takatost <takatost@gmail.com>
2024-05-20 12:33:21 +08:00
4086f5051c feat:Provide parameter config for mask_sensitive_info of MiniMax mode… (#4294)
Co-authored-by: 老潮 <zhangyongsheng@3vjia.com>
Co-authored-by: takatost <takatost@users.noreply.github.com>
Co-authored-by: takatost <takatost@gmail.com>
2024-05-20 10:15:27 +08:00
5440108431 fix: read llm node's first prompt role by optional chaining (#4510) 2024-05-20 08:13:37 +08:00
46bd53a929 chore: sort categories in recommended app service response (#4498) 2024-05-19 22:44:29 +08:00
a10c2ccd41 fix: files data missed for message (#4512) 2024-05-19 22:42:52 +08:00
1cca100a48 fix:modify spelling errors: lanuage ->language in schema.md (#4499)
Co-authored-by: wxfanghongtai <wxfanghongtai@gf.com.cn>
2024-05-19 18:31:05 +08:00
04ad46dd31 chore: skip unnecessary key checks prior to accessing a dictionary (#4497) 2024-05-19 18:30:45 +08:00
aa13d14019 Feat/chat custom disclaimer (#4306) 2024-05-18 10:52:48 +08:00
b1f003646b Update docker-compose.yaml- New DEBUG variable (#4476)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: Bowen Liang <bowenliang@apache.org>
2024-05-18 10:35:28 +08:00
e90eccdf92 Fix: HTTP request node PARAMS parameters, if ':' appears in the value… (#4403)
Co-authored-by: Haolin Wang-汪皓临 <haolin.wang@atlaslovestravel.com>
2024-05-18 10:35:01 +08:00
ba06447cd5 chore: update docker-compose.yaml (#4492) 2024-05-18 10:27:12 +08:00
9808520992 fix: copy button is always displayed on the chat logs page (#4488) 2024-05-17 22:26:19 +08:00
8b931b085c fix: app logo (#4483) 2024-05-17 18:02:00 +08:00
528faceb35 fix: cot agent token usage is empty (#4474) 2024-05-17 14:45:20 +08:00
c2a8fa91b1 fix: cot agent duplicate messages (#4470) 2024-05-17 13:32:02 +08:00
091fba74cb enhance: claude stream tool call (#4469) 2024-05-17 12:43:58 +08:00
083ef2e6fc improve: extract Code Node provider for each supported scripting language (#4164) 2024-05-17 11:58:12 +08:00
de3a7603ac fix: workflow add next node from knowledge retrieval node (#4467) 2024-05-17 11:42:03 +08:00
0ac5d621b6 add llm: ernie-character-8k of wenxin (#4448) 2024-05-16 18:31:07 +08:00
bdd409970f fix the wrong env variable AZURE_BLOB_CONTAINER_NAME (#4455) 2024-05-16 18:30:52 +08:00
d8f38f79f2 feat: add pre ping for sqlalchemy configuration (#4454) 2024-05-16 17:07:21 +08:00
0f1172f55b fix: self node type shouldn't show in the picker (#4445) 2024-05-16 13:25:09 +08:00
3df47b7b59 fix: wrong category name in examples of question classifier completion prompt (#4421) 2024-05-16 13:04:57 +08:00
6e9066ebf4 feat: support doubao llm and embedding models (#4431) 2024-05-16 11:41:24 +08:00
dd94931116 Remove useless code (#4416) 2024-05-15 16:14:49 +08:00
da81233d61 Custom sqlalchemy database uri scheme is supported (#4367) 2024-05-15 15:27:15 +08:00
a76ae2d756 chore: remove useless code in knowledge_retrieval_node (#4412) 2024-05-15 15:24:40 +08:00
97b65f9b4b Optimize webscraper (#4392)
Co-authored-by: luowei <glpat-EjySCyNjWiLqAED-YmwM>
Co-authored-by: crazywoola <427733928@qq.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-05-15 15:23:16 +08:00
c0fe414e0a fix: workflow delete edge when node is selected (#4414) 2024-05-15 15:12:36 +08:00
182dadd433 chore: remove model as tool (#4409) 2024-05-15 12:25:04 +08:00
1d0f88264f Fix HTTP REQUEST NODE is always waiting but endpoint has responded (#4395) 2024-05-15 11:05:46 +08:00
332baca538 FIX: fix the temperature value of ollama model (#4027) 2024-05-15 08:05:54 +08:00
e2a78888b9 Fix: setup google-storage client (#4296)
Co-authored-by: kotamat <kota1681@gmail.com>
Co-authored-by: crazywoola <427733928@qq.com>
2024-05-15 08:05:41 +08:00
5d6d0e63c5 docs: Add CONTRIBUTING_JA.md (#4383) 2024-05-15 07:48:19 +08:00
2eb468f885 fix: add timeout to SMTPClient to prevent worker blocking (#4352) 2024-05-14 23:44:53 +08:00
98140ae5d9 fix the issue where MILVUS_DATABASE has no effect. (#4353) 2024-05-14 19:54:31 +08:00
d1ccb22d8a feat: Use Romanian & other langs in QA (#4205)
Co-authored-by: crazywoola <427733928@qq.com>
2024-05-14 17:48:24 +08:00
66c8070da8 fix: Jinja switch not aligned in vertical direction (#4374) 2024-05-14 16:29:41 +08:00
3271e3e803 improve the code readability of http_executor node (#4360) 2024-05-14 16:11:12 +08:00
16d47923c3 fix: requests timeout (#4370) 2024-05-14 16:01:23 +08:00
6f1633fa75 fix: delete end node (#4372) 2024-05-14 15:51:08 +08:00
08e4103fa1 Create README_KR.md (#4364) 2024-05-14 15:36:03 +08:00
eee95190cc version to 0.6.8 (#4347) 2024-05-14 03:18:26 +08:00
e8311357ff feat: gpt-4o (#4346) 2024-05-14 02:52:41 +08:00
0f14fdd4c9 fix: handleUpdateWorkflowCanvas is not a function (#4343) 2024-05-13 20:36:23 +08:00
ece0f08a2b add yi models (#4335)
Co-authored-by: 陈力坤 <likunchen@caixin.com>
2024-05-13 17:40:53 +08:00
5edb3d55e5 feat: i18n: add korean language (ko-KR) (#4333) 2024-05-13 15:20:44 +08:00
63382f758e fix typo (#4329) 2024-05-13 15:20:16 +08:00
bbef964eb5 improve: code upgrade (#4231) 2024-05-13 14:39:14 +08:00
e6db7ad1d5 chore: update gmpy2_pkcs10aep_cipher.py (#4314) 2024-05-13 10:45:29 +08:00
8cc492721b fix: minimax streaming function_call message (#4271) 2024-05-11 21:07:22 +08:00
a80fe20456 add-some-new-models-hosted-on-nvidia (#4303) 2024-05-11 21:05:31 +08:00
f7986805c6 Update README.md to remove outdated badge (#4302) 2024-05-11 20:48:15 +08:00
aa5ca90f00 fix: text generation app not show copy button (#4304) 2024-05-11 20:39:17 +08:00
4af00e4a45 feat: support copy run text result in debug panel in workflow (#4300) 2024-05-11 16:59:17 +08:00
c01c95d77f fix: chatflow run progress problem (#4298) 2024-05-11 16:23:31 +08:00
20a9037d5b fix: align versions of react typing package (#4297) 2024-05-11 15:39:56 +08:00
34d3998566 fix: webapps not show number type input field (#4292) 2024-05-11 14:42:04 +08:00
198d6c00d6 Update docker-compose.yaml (#4288) 2024-05-11 13:41:12 +08:00
1663df8a05 feat: hide run detail in webapps and installed apps (#4289) 2024-05-11 13:40:27 +08:00
d8926a2571 feat: hide node detail outputs in webapp & installed app in explore (#3954) 2024-05-11 13:40:11 +08:00
4796f9d914 feat:add gpt-4-turbo for azure (#4287) 2024-05-11 13:02:56 +08:00
a588df4371 Add rerank model type for LocalAI provider (#3952) 2024-05-11 11:29:28 +08:00
2c1c660c6e fix(Backend:http_executor): 🔧 prevent splitting JSON data as v… (#4276) 2024-05-11 11:23:35 +08:00
13f4ed6e0e fix: workflow zoomin/out shortcuts (#4283) 2024-05-11 10:38:12 +08:00
1e451991db fix: deutsch edit app (#4270) 2024-05-11 10:07:54 +08:00
749b236d3d fix: do nothing if switch to current app (#4249)
Co-authored-by: langyong <langyong@lixiang.com>
2024-05-11 08:50:46 +08:00
00ce372b71 fix: hook dependency (#4242) 2024-05-11 08:43:37 +08:00
370e1c1a17 fix(frontend): 🔧 add privacy policy spaces (#4277) 2024-05-11 08:42:03 +08:00
28495273b4 feat: Add storage type and Google Storage settings to worker (#4266) 2024-05-10 18:54:08 +08:00
36a9c5cc6b fix: remove unexpected zip and add FlipForward arrow icon (#4263) 2024-05-10 18:52:41 +08:00
228de1f12a fix: miss usage of os.path.join for URL assembly and add tests on yarl (#4224) 2024-05-10 18:14:48 +08:00
01555463d2 feat: llm support jinja fe (#4260) 2024-05-10 18:14:05 +08:00
6b99075dc8 fix: system default model name length (#4245) (#4246)
Co-authored-by: takatost <takatost@gmail.com>
2024-05-10 18:12:18 +08:00
8578ee0864 feat: support LLM jinja2 template prompt (#3968)
Co-authored-by: Joel <iamjoel007@gmail.com>
2024-05-10 18:08:32 +08:00
897e07f639 question classifier prompt optimize (#4262) 2024-05-10 17:22:46 +08:00
875249eb00 Feat/vector db pgvector (#3879) 2024-05-10 17:20:30 +08:00
4d5a4e4cef correct comparison chart (#4254) 2024-05-10 14:54:38 +08:00
86a6e6bd04 feat: increase max steps to 50 in workflow (#4252) 2024-05-10 14:50:00 +08:00
8f3042e5b3 feat: Add draft hash check in workflow (#4251) 2024-05-10 14:48:29 +08:00
a1ab87107b chore: workflow sync with hash (#4250) 2024-05-10 14:48:20 +08:00
f49c99937c fix: workflow end node deletion (#4240) 2024-05-10 10:38:05 +08:00
9b24f12bf5 feat: workflow interaction (#4214) 2024-05-09 17:18:51 +08:00
487ce7c82a fix: add missing translations (#4212) 2024-05-09 15:38:51 +08:00
cc835d523c refactor: install form (#4154) 2024-05-09 15:38:09 +08:00
64c3bc070a version to 0.6.7 (#4208) 2024-05-09 13:58:25 +08:00
7405b2e819 modify spelling errors: bulild -> build (#4206) 2024-05-09 13:49:19 +08:00
ca5081e327 fix delete log annotation (#4201)
Co-authored-by: langyong <langyong@lixiang.com>
2024-05-09 12:53:06 +08:00
a79941df22 fix: button widths (#4145) 2024-05-09 12:52:07 +08:00
8137d63000 fix: workflow http node timeout & url check (#4175) 2024-05-08 13:20:26 +08:00
4aa21242b6 feat: add volcengine maas model provider (#4142) 2024-05-08 12:45:53 +08:00
8ce93faf08 Typo on deepseek.yaml and yi.yaml (#4170) 2024-05-08 10:52:04 +08:00
903ece6160 Fix:typo Incorrect Japanese 2 (#4167) 2024-05-08 09:04:37 +08:00
9f440c11e0 feat: DeepSeek (#4162) 2024-05-08 00:28:16 +08:00
58bd5627bf Add-Deepseek (#4157) 2024-05-07 22:45:38 +08:00
97dcb8977a fix: stop event propagation when deleting selected workflow var node (#4158) 2024-05-07 21:00:43 +08:00
2fdd64c1b5 feat: add proxy configuration for Cohere model (#4152) 2024-05-07 18:12:13 +08:00
591b993685 fix dataset segment update api not effect issue (#4151) 2024-05-07 17:47:20 +08:00
543a00e597 feat: update model_provider jina to support custom url and model (#4110)
Co-authored-by: Gimling <huangjl@ruyi.ai>
Co-authored-by: takatost <takatost@gmail.com>
2024-05-07 17:43:24 +08:00
f361c7004d feat: support vision models from xinference (#4094)
Co-authored-by: Yeuoly <admin@srmxy.cn>
2024-05-07 17:37:36 +08:00
bb7c62777d Add support for local ai speech to text (#3921)
Co-authored-by: Yeuoly <admin@srmxy.cn>
2024-05-07 17:14:24 +08:00
d51f52a649 fix: http authorization leakage (#4146) 2024-05-07 16:56:25 +08:00
e353809680 question classifier optimize (#4147) 2024-05-07 16:44:27 +08:00
c2f0f958ef fix: passing in 0 as a numeric variable will be converted to null (#4148) 2024-05-07 16:38:23 +08:00
087b7a6607 azure_openai add gpt-4-turbo-2024-04-09 model (#4144)
Co-authored-by: luowei <glpat-EjySCyNjWiLqAED-YmwM>
Co-authored-by: crazywoola <427733928@qq.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-05-07 15:55:23 +08:00
6271463240 feat(Languages): 👽 add pl-PL language (#4128) 2024-05-07 15:41:57 +08:00
6f1911533c bug fix: update minimax model_apis (#4116) 2024-05-07 14:40:24 +08:00
d5d8b98d82 feat: support openai stream usage (#4140) 2024-05-07 13:49:45 +08:00
e7fe7ec0f6 feat: support time format (#4138) 2024-05-07 13:02:00 +08:00
049abd698f improve: test CodeExecutor with code templates and extract CodeLanguage enum (#4098) 2024-05-07 12:37:18 +08:00
45d21677a0 Improved Japanese translation (#4119) 2024-05-07 12:25:01 +08:00
76bec6ce7f feat: add http node max size env (#4137) 2024-05-07 12:07:56 +08:00
6563cb6ec6 fix: prevent http node overwrite on open (#4127) 2024-05-07 10:08:18 +08:00
13cd409575 feat: support aliyun oss auth v4 (#3886)
Co-authored-by: owen <owen@owen.hawk-toad.ts.net>
2024-05-06 11:56:04 +08:00
13292ff73e 🦄 refactor(dataset svc): delete check none (#4101)
Co-authored-by: baxiang <baxiang@lixiang.com>
2024-05-06 11:45:26 +08:00
3f8e2456f7 fix: typo in get-automatic-res.tsx (#4097) 2024-05-06 11:36:19 +08:00
822ee7db88 fix: correct the license link (#4093) 2024-05-06 11:35:16 +08:00
94a650475d improve: menu collapse readability (#4099)
Co-authored-by: rongjun.qiu <qiurj@hengtonggroup.com.cn>
2024-05-06 11:34:56 +08:00
03cf00422a Urgent Correction: Resolving Critical License Documentation Error in Dify's Japanese README (#4075)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-05-06 11:28:32 +08:00
51a9e678f0 Leptonai integrate (#4079) 2024-05-05 14:37:47 +08:00
ad76ee76a8 Update bedrock.yaml add Region Asia Pacific (Sydney) (#4016) 2024-05-05 10:49:17 +08:00
630136b5b7 Revert "fix: hydration warning (#3897)" (#4059) 2024-05-04 18:00:23 +08:00
b5f101bdac fix: transform None into correct dest type (#4077) 2024-05-04 16:34:42 +08:00
5940564d84 feat: add a new built-in tool of Slack Incoming Webhook (#4067) 2024-05-04 16:17:34 +08:00
67902b5da7 fix: agent log timezone (#4076) 2024-05-04 16:17:15 +08:00
c0476c7881 Feat: frontend support timezone of timestamp (#4070) 2024-05-04 16:15:32 +08:00
f68b6b0e5e Fix typo: writeOpner -> writeOpener (#4060) 2024-05-03 18:55:47 +08:00
44857702ae test: add integration tests on CodeExecutor with the sandbox service (#4015) 2024-05-03 08:54:40 +08:00
b1399cd5f9 fix: unable to fetch CoT agent runner log (#4052) 2024-05-03 08:54:15 +08:00
6f1e4a19a2 fix: workflow avg user interaction. (#4056) 2024-05-02 20:24:40 +08:00
93393e005e version to 0.6.6 (#4050) 2024-05-02 16:06:40 +08:00
4ea2755fce test: remove explicit env settings for CI pytests (#4041) 2024-05-02 00:49:39 +08:00
ecb51a83d4 chore(deps): bump semver from 5.7.1 to 5.7.2 in /web (#4022)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-30 18:47:05 +08:00
093b5c0e63 fix: typo of jinja2 (#4019) 2024-04-30 18:39:02 +08:00
bf42b0ae44 fix: lodash version has warning (#4020)
Co-authored-by: nite-knite <nkCoding@gmail.com>
2024-04-30 18:11:49 +08:00
342b4fd19d chore(deps): bump word-wrap from 1.2.3 to 1.2.5 in /web
Bumps [word-wrap](https://github.com/jonschlinkert/word-wrap) from 1.2.3 to 1.2.5.
- [Release notes](https://github.com/jonschlinkert/word-wrap/releases)
- [Commits](https://github.com/jonschlinkert/word-wrap/compare/1.2.3...1.2.5)

---
updated-dependencies:
- dependency-name: word-wrap
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-04-30 09:39:10 +00:00
cbdb861ee4 add glm-3-turbo max_tokens parameter setting (#4017)
Co-authored-by: 陈力坤 <likunchen@caixin.com>
2024-04-30 17:08:04 +08:00
da5a8b9a59 feat: support question classifier node output (#4000) 2024-04-30 17:07:29 +08:00
1e6e8b446d feat: support minimax abab6.5, abab6.5s (#4012) 2024-04-30 17:02:01 +08:00
c1fdaa6ae0 fix: prompt undefined caused match problem (#4010) 2024-04-30 16:31:36 +08:00
142814d451 chore: skip deprecated field_schema param in creating payload index on Qdrant (#3903) 2024-04-30 16:16:10 +08:00
704755d005 fix: submitCodeExecutionTask (#4006) 2024-04-30 16:01:03 +08:00
d1263700c0 Update the description and labels in Judge0ce tool (#3990)
Co-authored-by: crazywoola <427733928@qq.com>
2024-04-30 14:58:29 +08:00
0704fe9695 fix(web): copy button visible at chat page normally (#4005)
Co-authored-by: rongjun.qiu <qiurj@hengtonggroup.com.cn>
2024-04-30 14:55:57 +08:00
1d3f1d88ef Enabled Notion integration setup in Docker Compose Deployment (#3919) 2024-04-30 14:48:39 +08:00
8b3edac091 fix: prompt editor insert quickly (#4004) 2024-04-30 14:25:21 +08:00
05cab85579 fix: workflow disable shortcuts when feature panel occured (#4001) 2024-04-30 13:35:49 +08:00
b72fbe200d chore: add sandbox tag (#3997) 2024-04-30 12:35:19 +08:00
b1194da6a5 fix: ci (#3983) 2024-04-29 18:59:37 +08:00
338e4669e5 add storage factory (#3922) 2024-04-29 18:22:03 +08:00
c5e2659771 Feat/install process refinement (#3982) 2024-04-29 17:55:52 +08:00
1d432728ac add default value for QDRANT_GRPC_PORT (#3976) 2024-04-29 15:28:34 +08:00
2fd702a319 Fix: password check in page of install (#3978) 2024-04-29 15:27:45 +08:00
f26ad16af7 Add new tool: Firecrawl (#3819)
Co-authored-by: crazywoola <427733928@qq.com>
Co-authored-by: Yeuoly <admin@srmxy.cn>
2024-04-29 14:20:36 +08:00
8f2ae51fe5 feat: add support for request timeout settings in the HTTP request node. (#3854)
Co-authored-by: Yeuoly <admin@srmxy.cn>
2024-04-29 13:59:07 +08:00
2f84d00300 fix-nvidia-llama3 (#3973) 2024-04-29 13:41:15 +08:00
b82a2d97ef fix: db connections not being released during workflow execution (#3971) 2024-04-29 12:42:09 +08:00
3e9dbe3e0a add pgvecto_rs support and upgrade SQLAlchemy (#3833) 2024-04-29 11:58:17 +08:00
975b2fb79e delete duplicate check get_dataset (#3966)
Co-authored-by: baxiang <baxiang@lixiang.com>
2024-04-29 11:57:26 +08:00
fa509ce64e feat: rename var name sync to used jinjia code (#3964) 2024-04-29 11:34:30 +08:00
99292edd46 chore: update @types/react (#3939) 2024-04-28 19:01:09 +08:00
3e992cb23c feat: code transform node editor support insert var by add slash or left brace (#3946)
Co-authored-by: StyleZhang <jasonapring2015@outlook.com>
2024-04-28 17:51:58 +08:00
e7b4d024ee optimize: code node has a bad error message (#3949) 2024-04-28 17:40:29 +08:00
ff67a6d338 feat: llm text stream support for workflow app (#3798)
Co-authored-by: JzoNg <jzongcode@gmail.com>
2024-04-28 17:37:00 +08:00
8e4989ed03 feat: workflow remove preview mode (#3941) 2024-04-28 17:09:56 +08:00
0940f01634 enhancement:support Qdrant gRPC mode (#3929) 2024-04-28 15:33:32 +08:00
9d1cb1bc92 improvement: Optimizing the experience of the app list page (#3885) 2024-04-28 13:52:45 +08:00
0ca4e30b19 feat: add start commands to devcontainer (#3902) 2024-04-28 12:30:56 +08:00
ba88f8a6f0 fix: code full screen in web app cause error (#3935) 2024-04-28 11:59:57 +08:00
aefe0cbf51 fix: api doc example error (#3925) 2024-04-28 10:18:07 +08:00
9ad489d133 feat: Add google storage support (#3887)
Co-authored-by: miendinh <miendinh@users.noreply.github.com>
2024-04-27 18:26:52 +08:00
661b30784e chore: skip warning messages when pytest auto-collecting the vdb test class by removing Test prefix (#3906) 2024-04-27 16:36:09 +08:00
43a5ba9415 feat: add support for Bedrock LLAMA3 (#3890) 2024-04-27 13:13:09 +08:00
08a65d74d5 fix: hydration warning (#3897) 2024-04-26 21:34:29 +08:00
cefe156811 feat: replicate supports default version. (#3884) 2024-04-26 21:16:22 +08:00
3b5b4d628b Add support for Traditional Chinese language (#3899)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: crazywoola <427733928@qq.com>
2024-04-26 21:10:23 +08:00
8746e48df0 chore: integrate code-inspector-plugin (#3900) 2024-04-26 21:00:29 +08:00
0ec8b57825 add together ai model setting (#3895) 2024-04-26 20:43:17 +08:00
045827043d test: improve vector store tests (#3855) 2024-04-26 19:18:42 +08:00
4d66a86579 fix: fetch page name of notion wiki (#3847) 2024-04-26 18:04:37 +08:00
2a8881d0e8 fix: tool webscraper - too many redirects in case target url does not… (#3831)
Co-authored-by: miendinh <miendinh@users.noreply.github.com>
2024-04-26 17:58:46 +08:00
ffc60bb917 add the comment in entrypoint.sh (#3882) 2024-04-26 17:19:49 +08:00
2e454c770b fix: copy invite link for HTTPS has duplicate origin (#3877) 2024-04-26 15:19:30 +08:00
7d711135bc fix: full screen editor not follow panel width (#3876) 2024-04-26 14:23:13 +08:00
f62b2b5b45 optimize the knowledge failed documents query (#3870)
Co-authored-by: luowei <glpat-EjySCyNjWiLqAED-YmwM>
Co-authored-by: crazywoola <427733928@qq.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-04-26 11:47:23 +08:00
7919596a21 fix: UP031 style rule violation (#3866) 2024-04-26 11:24:08 +08:00
9b4898efeb fix: chat api doc does not show title in English version (#3864) 2024-04-26 10:32:45 +08:00
45dd1683fd test: add tests covering all methods of vector store (#3849) 2024-04-25 22:27:30 +08:00
8bca908f15 refactor: config file (#3852) 2024-04-25 22:26:45 +08:00
9cbb8ddd7f fix: billing tenant account role. (#3850) 2024-04-25 21:55:08 +08:00
1be222af2e fix: using api can not execute relyt vector database (#3766)
Co-authored-by: jingsi <jingsi@leadincloud.com>
2024-04-25 19:46:20 +08:00
bf9fc8fef4 Reduce tool redundancy for [Judge0 CE] (#3837)
Co-authored-by: crazywoola <427733928@qq.com>
2024-04-25 19:20:54 +08:00
86e7330fa2 test: refactor vdb tests by visitor design pattern (#3838) 2024-04-25 18:55:49 +08:00
34bfb715e1 fix: citations always appear in the chatflow app (#3844) 2024-04-25 18:31:38 +08:00
019d7069f8 fix: debug run not show total right tokens (#3843) 2024-04-25 18:22:30 +08:00
c54fcfb45d extract enum type for tenant account role (#3788) 2024-04-25 18:20:08 +08:00
cde87cb225 fix: model parameter default value (#3841) 2024-04-25 18:04:37 +08:00
12435774ca feat: query prompt template support in chatflow (#3791)
Co-authored-by: Joel <iamjoel007@gmail.com>
2024-04-25 18:01:53 +08:00
80b9507e7a feat: add aliyun oss storage (#3690)
Co-authored-by: henrybit <qipenghui3056@sina.com>
2024-04-25 16:57:19 +08:00
0ac0f0ffd0 version to 0.6.5 (#3834) 2024-04-25 16:50:37 +08:00
3d14aba4b4 Fix: event of click away in message-log-modal (#3828) 2024-04-25 15:58:03 +08:00
64f694865c Update EN,KL,JA,FR,ES documentation Llama2 to Llama3 model support (#3827) 2024-04-25 15:52:00 +08:00
d36b728088 fix: workflow sync data (#3824) 2024-04-25 14:02:06 +08:00
1a7b4c42ab fix: event of keyboard "enter" in text generator app (#3823) 2024-04-25 13:58:06 +08:00
2a64ce740e chore: remove anthropic pay entrance (#3822) 2024-04-25 13:18:59 +08:00
78988ed60e fix:still enable SSL verification when using qdrant based on HTTP protocol (#3805) 2024-04-25 13:04:31 +08:00
2832adda88 fix: missing url field when searching special keywords (#3820) 2024-04-25 12:33:58 +08:00
a4e4fb4094 fix: credentials validate failed for groqcloud model provider (#3817) 2024-04-25 12:09:44 +08:00
777ec64635 feat: add log_file environment variable (#3793) 2024-04-24 21:55:14 +08:00
9cec8c1750 test: add unit tests for vector stores of Milvus, Qdrant and Weaviate (#3688) 2024-04-24 21:52:42 +08:00
8ca5aa1190 use pymilvus 2.3.7 (#3790) 2024-04-24 18:37:08 +08:00
4d8f1b9ca4 feat: test all unit tests (#3787)
Co-authored-by: Joel <iamjoel007@gmail.com>
2024-04-24 17:33:01 +08:00
3da179f77b feat: add conversation_id and user_id in chatflow/workflow system vars (#3771)
Co-authored-by: Joel <iamjoel007@gmail.com>
2024-04-24 17:20:01 +08:00
a34e8cb0bd test: add test for PKCS1OAEP_Cipher with gmpy2 (#3760) 2024-04-24 17:15:31 +08:00
b249767c5c Fix: redirection of app remove (#3770) 2024-04-24 17:11:51 +08:00
89a7434565 fix: handle inputs show the focus ui together in tools node (#3763) 2024-04-24 15:53:07 +08:00
3b537cbdeb fix: endpoint for 'Update a document from a file' (#3751) 2024-04-24 15:25:53 +08:00
731464f5b8 fix: workflow sync (#3756) 2024-04-24 15:19:19 +08:00
1ad70f8721 feat: support prompt messages sorting (#3757) 2024-04-24 15:09:01 +08:00
2ea8c73cd8 fix: type num of variable converted to str (#3758) 2024-04-24 15:07:56 +08:00
f257f2c396 Knowledge optimization (#3755)
Co-authored-by: crazywoola <427733928@qq.com>
Co-authored-by: JzoNg <jzongcode@gmail.com>
2024-04-24 15:02:29 +08:00
3cd8e6f5c6 fix: llm editor readonly cover error (#3752) 2024-04-24 13:28:22 +08:00
0715db7681 chore: add selector for use app store (#3746) 2024-04-24 13:07:20 +08:00
a39de8a686 fix: workflow restore (#3750) 2024-04-24 13:05:33 +08:00
ccaf335466 fix: rollback gmpy2 to 2.1.5 (#3745) 2024-04-24 12:53:23 +08:00
40e36e9b52 fix: toggling AppDetailNav causes unnecessary component rerenders (#3718) 2024-04-24 12:07:28 +08:00
9eebe9d54e fix: workflow node variable (#3743) 2024-04-24 11:41:12 +08:00
a23a191615 feat: add copy button to code (#3719) 2024-04-24 09:34:51 +08:00
7d9c5586f9 Update "@formatjs/intl-localematcher" to version 0.5.4 in package.json (#3726) 2024-04-24 09:06:23 +08:00
f07c89bba4 Update README_JA.md (#3727) 2024-04-24 09:04:27 +08:00
59cba930e5 bedrock llm Model file name change (#3714)
Co-authored-by: heshunchang <shuncanghe@clouditera.com>
Co-authored-by: crazywoola <427733928@qq.com>
2024-04-23 18:57:34 +08:00
39ae56e136 fix: workflow connection (#3713) 2024-04-23 18:02:15 +08:00
f92130338b feat: prompt editor support auto height by content height and fix some bugs (#3712) 2024-04-23 17:46:59 +08:00
2867d29021 fix: milvus usage with create_collection (#3683) 2024-04-23 17:37:40 +08:00
f76ac8bdee enhance:speedup xinference audio transcription (#3636) 2024-04-23 17:09:30 +08:00
83caffe000 fix: workflow restore (#3711) 2024-04-23 17:02:23 +08:00
96160837d2 fix: cannot change file uploader method (#3710) 2024-04-23 17:02:12 +08:00
3480f1c59e refactor: tool parameter cache (#3703) 2024-04-23 15:22:42 +08:00
65ac4f69af fix: workflow shortcuts (#3701) 2024-04-23 14:45:57 +08:00
2c50fab3dd fix: skip dataset icon (#3696) 2024-04-23 12:41:41 +08:00
9525ccac4f Localize links to localized READMEs (#3689) 2024-04-23 09:30:32 +08:00
ff76c4bd5d Add new tool: Judge0 CE (#3684)
Co-authored-by: crazywoola <427733928@qq.com>
2024-04-23 09:07:21 +08:00
5dacf77627 fix: Added prevention of click event propagation for overlay layer (#3666)
Co-authored-by: crazywoola <427733928@qq.com>
2024-04-22 19:53:20 +08:00
2a213c6af7 fix: incorrect type parser (#3682) 2024-04-22 19:32:41 +08:00
b2535e7db6 chore: update description of code interpreter tool (#3679)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2024-04-22 19:19:16 +08:00
28236147ee feat: add support for bedrock Mistral AI model (#3676)
Co-authored-by: Chenhe Gu <guchenhe@gmail.com>
2024-04-22 17:24:02 +08:00
4969783383 add groq llama3 (#3673) 2024-04-22 15:21:09 +08:00
3,731 changed files with 222,494 additions and 61,104 deletions


@@ -1,4 +1,4 @@
# Devlopment with devcontainer
# Development with devcontainer
This project includes a devcontainer configuration that allows you to open the project in a container with a fully configured development environment.
Both frontend and backend environments are initialized when the container is started.
## GitHub Codespaces
@@ -33,5 +33,5 @@ Performance Impact: While usually minimal, programs running inside a devcontaine
if you see such error message when you open this project in codespaces:
![Alt text](troubleshooting.png)
a simple workaround is change `/signin` endpoint into another one, then login with github account and close the tab, then change it back to `/signin` endpoint. Then all things will be fine.
a simple workaround is change `/signin` endpoint into another one, then login with GitHub account and close the tab, then change it back to `/signin` endpoint. Then all things will be fine.
The reason is `signin` endpoint is not allowed in codespaces, details can be found [here](https://github.com/orgs/community/discussions/5204)


@@ -32,8 +32,8 @@
]
}
},
"postStartCommand": "cd api && pip install -r requirements.txt",
"postCreateCommand": "cd web && npm install"
"postStartCommand": "./.devcontainer/post_start_command.sh",
"postCreateCommand": "./.devcontainer/post_create_command.sh"
// Features to add to the dev container. More info: https://containers.dev/features.
// "features": {},


@@ -0,0 +1,11 @@
#!/bin/bash
cd web && npm install
pipx install poetry
echo 'alias start-api="cd /workspaces/dify/api && poetry run python -m flask run --host 0.0.0.0 --port=5001 --debug"' >> ~/.bashrc
echo 'alias start-worker="cd /workspaces/dify/api && poetry run python -m celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail,ops_trace,app_deletion"' >> ~/.bashrc
echo 'alias start-web="cd /workspaces/dify/web && npm run dev"' >> ~/.bashrc
echo 'alias start-containers="cd /workspaces/dify/docker && docker-compose -f docker-compose.middleware.yaml -p dify up -d"' >> ~/.bashrc
source /home/vscode/.bashrc
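Assuming this script has run and a new bash session has loaded ~/.bashrc, the aliases defined above cover the usual local workflow; the commands, ports, and queue names below are taken directly from the alias definitions, nothing extra is assumed:

start-containers   # bring up the middleware stack via docker-compose.middleware.yaml
start-api          # run the Flask API with Poetry on port 5001 in debug mode
start-worker       # run the Celery worker for the dataset, generation, mail, ops_trace and app_deletion queues
start-web          # run the web frontend dev server (npm run dev)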


@@ -0,0 +1,3 @@
#!/bin/bash
poetry install -C api
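Here -C is Poetry's --directory option (Poetry 1.2 or later), so the line above is roughly equivalent to running the install from inside the API project; a manual equivalent for reference:

cd api
poetry install   # resolve and install dependencies from pyproject.toml / poetry.lock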

.gitattributes

@@ -0,0 +1,7 @@
# Ensure that .sh scripts use LF as line separator, even if they are checked out
# to Windows(NTFS) file-system, by a user of Docker for Window.
# These .sh scripts will be run from the Container after `docker compose up -d`.
# If they appear to be CRLF style, Dash from the Container will fail to execute
# them.
*.sh text eol=lf
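A quick way to confirm the attribute takes effect after this change; the path below is only an example of a shell script tracked in the repo, and the renormalize step is optional:

git check-attr eol -- docker/entrypoint.sh   # expected: docker/entrypoint.sh: eol: lf
git add --renormalize .                      # re-apply line-ending settings to files already checked in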

.github/DISCUSSION_TEMPLATE/general.yml

@@ -0,0 +1,24 @@
title: "General Discussion"
body:
- type: checkboxes
attributes:
label: Self Checks
description: "To make sure we get to you in time, please check the following :)"
options:
- label: I have searched for existing issues [search for existing issues](https://github.com/langgenius/dify/issues), including closed ones.
required: true
- label: I confirm that I am using English to submit this report (我已阅读并同意 [Language Policy](https://github.com/langgenius/dify/issues/1542)).
required: true
- label: "[FOR CHINESE USERS] 请务必使用英文提交 Issue否则会被关闭。谢谢:"
required: true
- label: "Please do not modify this template :) and fill in all the required fields."
required: true
- type: textarea
attributes:
label: Content
placeholder: Please describe the content you would like to discuss.
validations:
required: true
- type: markdown
attributes:
value: Please limit one request per issue.

.github/DISCUSSION_TEMPLATE/help.yml

@@ -0,0 +1,30 @@
title: "Help"
body:
- type: checkboxes
attributes:
label: Self Checks
description: "To make sure we get to you in time, please check the following :)"
options:
- label: I have searched for existing issues [search for existing issues](https://github.com/langgenius/dify/issues), including closed ones.
required: true
- label: I confirm that I am using English to submit this report (我已阅读并同意 [Language Policy](https://github.com/langgenius/dify/issues/1542)).
required: true
- label: "[FOR CHINESE USERS] 请务必使用英文提交 Issue否则会被关闭。谢谢:"
required: true
- label: "Please do not modify this template :) and fill in all the required fields."
required: true
- type: textarea
attributes:
label: 1. Is this request related to a challenge you're experiencing? Tell me about your story.
placeholder: Please describe the specific scenario or problem you're facing as clearly as possible. For instance "I was trying to use [feature] for [specific task], and [what happened]... It was frustrating because...."
validations:
required: true
- type: textarea
attributes:
label: 2. Additional context or comments
placeholder: (Any other information, comments, documentations, links, or screenshots that would provide more clarity. This is the place to add anything else not covered above.)
validations:
required: false
- type: markdown
attributes:
value: Please limit one request per issue.


@@ -0,0 +1,37 @@
title: Suggestions for New Features
body:
- type: checkboxes
attributes:
label: Self Checks
description: "To make sure we get to you in time, please check the following :)"
options:
- label: I have searched for existing issues [search for existing issues](https://github.com/langgenius/dify/issues), including closed ones.
required: true
- label: I confirm that I am using English to submit this report (我已阅读并同意 [Language Policy](https://github.com/langgenius/dify/issues/1542)).
required: true
- label: "[FOR CHINESE USERS] 请务必使用英文提交 Issue否则会被关闭。谢谢:"
required: true
- label: "Please do not modify this template :) and fill in all the required fields."
required: true
- type: textarea
attributes:
label: 1. Is this request related to a challenge you're experiencing? Tell me about your story.
placeholder: Please describe the specific scenario or problem you're facing as clearly as possible. For instance "I was trying to use [feature] for [specific task], and [what happened]... It was frustrating because...."
validations:
required: true
- type: textarea
attributes:
label: 2. Additional context or comments
placeholder: (Any other information, comments, documentations, links, or screenshots that would provide more clarity. This is the place to add anything else not covered above.)
validations:
required: false
- type: checkboxes
attributes:
label: 3. Can you help us with this feature?
description: Let us know! This is not a commitment, but a starting point for collaboration.
options:
- label: I am interested in contributing to this feature.
required: false
- type: markdown
attributes:
value: Please limit one request per issue.


@@ -8,19 +8,20 @@ body:
label: Self Checks
description: "To make sure we get to you in time, please check the following :)"
options:
- label: This is only for bug report, if you would like to ask a quesion, please head to [Discussions](https://github.com/langgenius/dify/discussions/categories/general).
- label: This is only for bug report, if you would like to ask a question, please head to [Discussions](https://github.com/langgenius/dify/discussions/categories/general).
required: true
- label: I have searched for existing issues [search for existing issues](https://github.com/langgenius/dify/issues), including closed ones.
required: true
- label: I confirm that I am using English to submit this report (我已阅读并同意 [Language Policy](https://github.com/langgenius/dify/issues/1542)).
required: true
- label: "Pleas do not modify this template :) and fill in all the required fields."
- label: "[FOR CHINESE USERS] 请务必使用英文提交 Issue否则会被关闭。谢谢:"
required: true
- label: "Please do not modify this template :) and fill in all the required fields."
required: true
- type: input
attributes:
label: Dify version
placeholder: 0.3.21
description: See about section in Dify console
validations:
required: true
@@ -40,7 +41,7 @@ body:
- type: textarea
attributes:
label: Steps to reproduce
description: We highly suggest including screenshots and a bug report log.
description: We highly suggest including screenshots and a bug report log. Please use the right markdown syntax for code blocks.
placeholder: Having detailed steps helps us reproduce the bug.
validations:
required: true

View File

@ -1,7 +1,7 @@
name: "📚 Documentation Issue"
description: Report issues in our documentation
labels:
- ducumentation
- documentation
body:
- type: checkboxes
attributes:
@ -12,7 +12,9 @@ body:
required: true
- label: I confirm that I am using English to submit report (我已阅读并同意 [Language Policy](https://github.com/langgenius/dify/issues/1542)).
required: true
- label: "Pleas do not modify this template :) and fill in all the required fields."
- label: "[FOR CHINESE USERS] 请务必使用英文提交 Issue否则会被关闭。谢谢:"
required: true
- label: "Please do not modify this template :) and fill in all the required fields."
required: true
- type: textarea
attributes:

View File

@ -12,35 +12,25 @@ body:
required: true
- label: I confirm that I am using English to submit this report (我已阅读并同意 [Language Policy](https://github.com/langgenius/dify/issues/1542)).
required: true
- label: "Pleas do not modify this template :) and fill in all the required fields."
- label: "[FOR CHINESE USERS] 请务必使用英文提交 Issue否则会被关闭。谢谢:"
required: true
- label: "Please do not modify this template :) and fill in all the required fields."
required: true
- type: textarea
attributes:
label: 1. Is this request related to a challenge you're experiencing?
label: 1. Is this request related to a challenge you're experiencing? Tell me about your story.
placeholder: Please describe the specific scenario or problem you're facing as clearly as possible. For instance "I was trying to use [feature] for [specific task], and [what happened]... It was frustrating because...."
validations:
required: true
- type: textarea
attributes:
label: 2. Describe the feature you'd like to see
placeholder: Think about what you want to achieve and how this feature will help you. Sketches, flow diagrams, or any visual representation will be a major plus.
validations:
required: true
- type: textarea
attributes:
label: 3. How will this feature improve your workflow or experience?
placeholder: Tell us how this change will benefit your work. This helps us prioritize based on user impact.
validations:
required: true
- type: textarea
attributes:
label: 4. Additional context or comments
label: 2. Additional context or comments
placeholder: (Any other information, comments, documentations, links, or screenshots that would provide more clarity. This is the place to add anything else not covered above.)
validations:
required: false
- type: checkboxes
attributes:
label: 5. Can you help us with this feature?
label: 3. Can you help us with this feature?
description: Let us know! This is not a commitment, but a starting point for collaboration.
options:
- label: I am interested in contributing to this feature.

View File

@ -12,12 +12,13 @@ body:
required: true
- label: I confirm that I am using English to submit this report (我已阅读并同意 [Language Policy](https://github.com/langgenius/dify/issues/1542)).
required: true
- label: "Pleas do not modify this template :) and fill in all the required fields."
- label: "[FOR CHINESE USERS] 请务必使用英文提交 Issue否则会被关闭。谢谢:"
required: true
- label: "Please do not modify this template :) and fill in all the required fields."
required: true
- type: input
attributes:
label: Dify version
placeholder: 0.3.21
description: Hover over system tray icon or look at Settings
validations:
required: true

View File

@ -1,13 +1,21 @@
# Checklist:
> [!IMPORTANT]
> Please review the checklist below before submitting your pull request.
- [ ] Please open an issue before creating a PR or link to an existing issue
- [ ] I have performed a self-review of my own code
- [ ] I have commented my code, particularly in hard-to-understand areas
- [ ] I ran `dev/reformat`(backend) and `cd web && npx lint-staged`(frontend) to appease the lint gods
# Description
Please include a summary of the change and which issue is fixed. Please also include relevant motivation and context. List any dependencies that are required for this change.
Describe the big picture of your changes here to communicate to the maintainers why we should accept this pull request. If it fixes a bug or resolves a feature request, be sure to link to that issue. Close issue syntax: `Fixes #<issue number>`, see [documentation](https://docs.github.com/en/issues/tracking-your-work-with-issues/linking-a-pull-request-to-an-issue#linking-a-pull-request-to-an-issue-using-a-keyword) for more details.
Fixes # (issue)
Fixes
## Type of Change
Please delete options that are not relevant.
- [ ] Bug fix (non-breaking change which fixes an issue)
- [ ] New feature (non-breaking change which adds functionality)
- [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected)
@ -15,18 +23,12 @@ Please delete options that are not relevant.
- [ ] Improvement, including but not limited to code refactoring, performance optimization, and UI/UX improvement
- [ ] Dependency upgrade
# How Has This Been Tested?
# Testing Instructions
Please describe the tests that you ran to verify your changes. Provide instructions so we can reproduce. Please also list any relevant details for your test configuration
- [ ] TODO
- [ ] Test A
- [ ] Test B
# Suggested Checklist:
- [ ] I have performed a self-review of my own code
- [ ] I have commented my code, particularly in hard-to-understand areas
- [ ] My changes generate no new warnings
- [ ] I ran `dev/reformat`(backend) and `cd web && npx lint-staged`(frontend) to appease the lint gods
- [ ] `optional` I have made corresponding changes to the documentation
- [ ] `optional` I have added tests that prove my fix is effective or that my feature works
- [ ] `optional` New and existing unit tests pass locally with my changes

View File

@ -4,59 +4,92 @@ on:
pull_request:
branches:
- main
paths:
- api/**
- docker/**
concurrency:
group: api-tests-${{ github.head_ref || github.run_id }}
cancel-in-progress: true
jobs:
test:
name: API Tests
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ["3.10", "3.11", "3.12"]
env:
OPENAI_API_KEY: sk-IamNotARealKeyJustForMockTestKawaiiiiiiiiii
AZURE_OPENAI_API_BASE: https://difyai-openai.openai.azure.com
AZURE_OPENAI_API_KEY: xxxxb1707exxxxxxxxxxaaxxxxxf94
ANTHROPIC_API_KEY: sk-ant-api11-IamNotARealKeyJustForMockTestKawaiiiiiiiiii-NotBaka-ASkksz
CHATGLM_API_BASE: http://a.abc.com:11451
XINFERENCE_SERVER_URL: http://a.abc.com:11451
XINFERENCE_GENERATION_MODEL_UID: generate
XINFERENCE_CHAT_MODEL_UID: chat
XINFERENCE_EMBEDDINGS_MODEL_UID: embedding
XINFERENCE_RERANK_MODEL_UID: rerank
GOOGLE_API_KEY: abcdefghijklmnopqrstuvwxyz
HUGGINGFACE_API_KEY: hf-awuwuwuwuwuwuwuwuwuwuwuwuwuwuwuwuwu
HUGGINGFACE_TEXT_GEN_ENDPOINT_URL: a
HUGGINGFACE_TEXT2TEXT_GEN_ENDPOINT_URL: b
HUGGINGFACE_EMBEDDINGS_ENDPOINT_URL: c
MOCK_SWITCH: true
CODE_MAX_STRING_LENGTH: 80000
python-version:
- "3.10"
- "3.11"
- "3.12"
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Install APT packages
uses: awalsh128/cache-apt-pkgs-action@v1
with:
packages: ffmpeg
- name: Install Poetry
uses: abatilo/actions-poetry@v3
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
cache: 'pip'
cache: 'poetry'
cache-dependency-path: |
./api/requirements.txt
./api/requirements-dev.txt
api/pyproject.toml
api/poetry.lock
- name: Poetry check
run: |
poetry check -C api --lock
poetry show -C api
- name: Install dependencies
run: pip install -r ./api/requirements.txt -r ./api/requirements-dev.txt
run: poetry install -C api --with dev
- name: Run Unit tests
run: poetry run -C api bash dev/pytest/pytest_unit_tests.sh
- name: Run ModelRuntime
run: dev/pytest/pytest_model_runtime.sh
run: poetry run -C api bash dev/pytest/pytest_model_runtime.sh
- name: Run Tool
run: dev/pytest/pytest_tools.sh
run: poetry run -C api bash dev/pytest/pytest_tools.sh
- name: Set up dotenvs
run: |
cp docker/.env.example docker/.env
cp docker/middleware.env.example docker/middleware.env
- name: Expose Service Ports
run: sh .github/workflows/expose_service_ports.sh
- name: Set up Sandbox
uses: hoverkraft-tech/compose-action@v2.0.0
with:
compose-file: |
docker/docker-compose.middleware.yaml
services: |
sandbox
ssrf_proxy
- name: Run Workflow
run: dev/pytest/pytest_workflow.sh
run: poetry run -C api bash dev/pytest/pytest_workflow.sh
- name: Set up Vector Stores (Weaviate, Qdrant, PGVector, Milvus, PgVecto-RS, Chroma, MyScale, ElasticSearch)
uses: hoverkraft-tech/compose-action@v2.0.0
with:
compose-file: |
docker/docker-compose.yaml
services: |
weaviate
qdrant
etcd
minio
milvus-standalone
pgvecto-rs
pgvector
chroma
elasticsearch
- name: Test Vector Stores
run: poetry run -C api bash dev/pytest/pytest_vdb.sh
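For contributors who want to mirror this job locally before pushing, the same Poetry and Compose commands can be chained from the repository root. The sketch below only reuses commands that appear in the workflow above; running every suite in a single pass, and invoking `docker compose` directly instead of the compose-action, are assumptions rather than a documented procedure.

```bash
# Minimal local mirror of the API tests job (assumes Docker, Poetry and Python 3.10+ are installed).
poetry install -C api --with dev                     # same dependency step as the workflow
poetry run -C api bash dev/pytest/pytest_unit_tests.sh
poetry run -C api bash dev/pytest/pytest_model_runtime.sh
poetry run -C api bash dev/pytest/pytest_tools.sh

# Bring up the middleware services the workflow tests depend on, then run the workflow suite.
cp docker/.env.example docker/.env
cp docker/middleware.env.example docker/middleware.env
docker compose -f docker/docker-compose.middleware.yaml up -d sandbox ssrf_proxy
poetry run -C api bash dev/pytest/pytest_workflow.sh
```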

View File

@ -8,6 +8,10 @@ on:
release:
types: [published]
concurrency:
group: build-push-${{ github.head_ref || github.run_id }}
cancel-in-progress: true
env:
DOCKERHUB_USER: ${{ secrets.DOCKERHUB_USER }}
DOCKERHUB_TOKEN: ${{ secrets.DOCKERHUB_TOKEN }}
@ -15,24 +19,34 @@ env:
DIFY_API_IMAGE_NAME: ${{ vars.DIFY_API_IMAGE_NAME || 'langgenius/dify-api' }}
jobs:
build-and-push:
runs-on: ubuntu-latest
if: github.event.pull_request.draft == false
build:
runs-on: ${{ matrix.platform == 'linux/arm64' && 'arm64_runner' || 'ubuntu-latest' }}
if: github.repository == 'langgenius/dify'
strategy:
matrix:
include:
- service_name: "web"
image_name_env: "DIFY_WEB_IMAGE_NAME"
context: "web"
- service_name: "api"
- service_name: "build-api-amd64"
image_name_env: "DIFY_API_IMAGE_NAME"
context: "api"
steps:
- name: Set up QEMU
uses: docker/setup-qemu-action@v3
platform: linux/amd64
- service_name: "build-api-arm64"
image_name_env: "DIFY_API_IMAGE_NAME"
context: "api"
platform: linux/arm64
- service_name: "build-web-amd64"
image_name_env: "DIFY_WEB_IMAGE_NAME"
context: "web"
platform: linux/amd64
- service_name: "build-web-arm64"
image_name_env: "DIFY_WEB_IMAGE_NAME"
context: "web"
platform: linux/arm64
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
steps:
- name: Prepare
run: |
platform=${{ matrix.platform }}
echo "PLATFORM_PAIR=${platform//\//-}" >> $GITHUB_ENV
- name: Login to Docker Hub
uses: docker/login-action@v2
@ -40,7 +54,72 @@ jobs:
username: ${{ env.DOCKERHUB_USER }}
password: ${{ env.DOCKERHUB_TOKEN }}
- name: Extract metadata (tags, labels) for Docker
- name: Set up QEMU
uses: docker/setup-qemu-action@v3
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Extract metadata for Docker
id: meta
uses: docker/metadata-action@v5
with:
images: ${{ env[matrix.image_name_env] }}
- name: Build Docker image
id: build
uses: docker/build-push-action@v6
with:
context: "{{defaultContext}}:${{ matrix.context }}"
platforms: ${{ matrix.platform }}
build-args: COMMIT_SHA=${{ fromJSON(steps.meta.outputs.json).labels['org.opencontainers.image.revision'] }}
labels: ${{ steps.meta.outputs.labels }}
outputs: type=image,name=${{ env[matrix.image_name_env] }},push-by-digest=true,name-canonical=true,push=true
cache-from: type=gha,scope=${{ matrix.service_name }}
cache-to: type=gha,mode=max,scope=${{ matrix.service_name }}
- name: Export digest
run: |
mkdir -p /tmp/digests
digest="${{ steps.build.outputs.digest }}"
touch "/tmp/digests/${digest#sha256:}"
- name: Upload digest
uses: actions/upload-artifact@v4
with:
name: digests-${{ matrix.context }}-${{ env.PLATFORM_PAIR }}
path: /tmp/digests/*
if-no-files-found: error
retention-days: 1
create-manifest:
needs: build
runs-on: ubuntu-latest
if: github.repository == 'langgenius/dify'
strategy:
matrix:
include:
- service_name: "merge-api-images"
image_name_env: "DIFY_API_IMAGE_NAME"
context: "api"
- service_name: "merge-web-images"
image_name_env: "DIFY_WEB_IMAGE_NAME"
context: "web"
steps:
- name: Download digests
uses: actions/download-artifact@v4
with:
path: /tmp/digests
pattern: digests-${{ matrix.context }}-*
merge-multiple: true
- name: Login to Docker Hub
uses: docker/login-action@v2
with:
username: ${{ env.DOCKERHUB_USER }}
password: ${{ env.DOCKERHUB_TOKEN }}
- name: Extract metadata for Docker
id: meta
uses: docker/metadata-action@v5
with:
@ -51,14 +130,12 @@ jobs:
type=sha,enable=true,priority=100,prefix=,suffix=,format=long
type=raw,value=${{ github.ref_name }},enable=${{ startsWith(github.ref, 'refs/tags/') }}
- name: Build and push
uses: docker/build-push-action@v5
with:
context: "{{defaultContext}}:${{ matrix.context }}"
platforms: ${{ startsWith(github.ref, 'refs/tags/') && 'linux/amd64,linux/arm64' || 'linux/amd64' }}
build-args: COMMIT_SHA=${{ fromJSON(steps.meta.outputs.json).labels['org.opencontainers.image.revision'] }}
push: true
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
cache-from: type=gha
cache-to: type=gha,mode=max
- name: Create manifest list and push
working-directory: /tmp/digests
run: |
docker buildx imagetools create $(jq -cr '.tags | map("-t " + .) | join(" ")' <<< "$DOCKER_METADATA_OUTPUT_JSON") \
$(printf '${{ env[matrix.image_name_env] }}@sha256:%s ' *)
- name: Inspect image
run: |
docker buildx imagetools inspect ${{ env[matrix.image_name_env] }}:${{ steps.meta.outputs.version }}
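As a rough illustration of what the manifest step expands to: the `jq` expression turns the metadata action's tag list into repeated `-t` flags, and the trailing `printf` appends one `image@digest` reference per platform artifact. The values below are placeholders, not output from a real run.

```bash
# Hypothetical expansion of the "Create manifest list and push" step; image tag and digests are placeholders.
IMAGE=langgenius/dify-api
COMMIT_SHA=0123456789abcdef0123456789abcdef01234567                                  # placeholder
AMD64_DIGEST=sha256:1111111111111111111111111111111111111111111111111111111111111111 # placeholder
ARM64_DIGEST=sha256:2222222222222222222222222222222222222222222222222222222222222222 # placeholder

docker buildx imagetools create \
  -t "$IMAGE:$COMMIT_SHA" \
  "$IMAGE@$AMD64_DIGEST" \
  "$IMAGE@$ARM64_DIGEST"

# Confirm the resulting manifest list covers both architectures.
docker buildx imagetools inspect "$IMAGE:$COMMIT_SHA"
```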

63
.github/workflows/db-migration-test.yml vendored Normal file
View File

@ -0,0 +1,63 @@
name: DB Migration Test
on:
pull_request:
branches:
- main
paths:
- api/migrations/**
concurrency:
group: db-migration-test-${{ github.ref }}
cancel-in-progress: true
jobs:
db-migration-test:
runs-on: ubuntu-latest
strategy:
matrix:
python-version:
- "3.10"
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Install Poetry
uses: abatilo/actions-poetry@v3
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
cache: 'poetry'
cache-dependency-path: |
api/pyproject.toml
api/poetry.lock
- name: Install dependencies
run: poetry install -C api
- name: Prepare middleware env
run: |
cd docker
cp middleware.env.example middleware.env
- name: Set up Middlewares
uses: hoverkraft-tech/compose-action@v2.0.0
with:
compose-file: |
docker/docker-compose.middleware.yaml
services: |
db
redis
- name: Prepare configs
run: |
cd api
cp .env.example .env
- name: Run DB Migration
run: |
cd api
poetry run python -m flask upgrade-db
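The same migration check can be reproduced locally with only the middleware database and the API config. The commands below are taken from the workflow steps above; the direct `docker compose` call merely stands in for the compose-action and is an assumption about how you run it on your machine.

```bash
# Local mirror of the DB migration test (assumes Docker and Poetry are installed).
poetry install -C api
cp docker/middleware.env.example docker/middleware.env
docker compose -f docker/docker-compose.middleware.yaml up -d db redis
cp api/.env.example api/.env
cd api && poetry run python -m flask upgrade-db
```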

11
.github/workflows/expose_service_ports.sh vendored Executable file
View File

@ -0,0 +1,11 @@
#!/bin/bash
yq eval '.services.weaviate.ports += ["8080:8080"]' -i docker/docker-compose.yaml
yq eval '.services.qdrant.ports += ["6333:6333"]' -i docker/docker-compose.yaml
yq eval '.services.chroma.ports += ["8000:8000"]' -i docker/docker-compose.yaml
yq eval '.services["milvus-standalone"].ports += ["19530:19530"]' -i docker/docker-compose.yaml
yq eval '.services.pgvector.ports += ["5433:5432"]' -i docker/docker-compose.yaml
yq eval '.services["pgvecto-rs"].ports += ["5431:5432"]' -i docker/docker-compose.yaml
yq eval '.services["elasticsearch"].ports += ["9200:9200"]' -i docker/docker-compose.yaml
echo "Ports exposed for sandbox, weaviate, qdrant, chroma, milvus, pgvector, pgvecto-rs, elasticsearch"

View File

@ -6,7 +6,7 @@ on:
- main
concurrency:
group: dep-${{ github.head_ref || github.run_id }}
group: style-${{ github.head_ref || github.run_id }}
cancel-in-progress: true
jobs:
@ -18,54 +18,97 @@ jobs:
- name: Checkout code
uses: actions/checkout@v4
- name: Check changed files
id: changed-files
uses: tj-actions/changed-files@v45
with:
files: api/**
- name: Install Poetry
uses: abatilo/actions-poetry@v3
- name: Set up Python
uses: actions/setup-python@v5
if: steps.changed-files.outputs.any_changed == 'true'
with:
python-version: '3.10'
- name: Python dependencies
run: pip install ruff dotenv-linter
if: steps.changed-files.outputs.any_changed == 'true'
run: poetry install -C api --only lint
- name: Ruff check
run: ruff check ./api
if: steps.changed-files.outputs.any_changed == 'true'
run: poetry run -C api ruff check ./api
- name: Dotenv check
run: dotenv-linter ./api/.env.example ./web/.env.example
if: steps.changed-files.outputs.any_changed == 'true'
run: poetry run -C api dotenv-linter ./api/.env.example ./web/.env.example
- name: Ruff formatter check
if: steps.changed-files.outputs.any_changed == 'true'
run: poetry run -C api ruff format --check ./api
- name: Lint hints
if: failure()
run: echo "Please run 'dev/reformat' to fix the fixable linting errors."
test:
name: ESLint and SuperLinter
web-style:
name: Web Style
runs-on: ubuntu-latest
needs: python-style
defaults:
run:
working-directory: ./web
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Check changed files
id: changed-files
uses: tj-actions/changed-files@v45
with:
fetch-depth: 0
files: web/**
- name: Setup NodeJS
uses: actions/setup-node@v4
if: steps.changed-files.outputs.any_changed == 'true'
with:
node-version: 20
cache: yarn
cache-dependency-path: ./web/package.json
- name: Web dependencies
run: |
cd ./web
yarn install --frozen-lockfile
if: steps.changed-files.outputs.any_changed == 'true'
run: yarn install --frozen-lockfile
- name: Web style check
run: |
cd ./web
yarn run lint
if: steps.changed-files.outputs.any_changed == 'true'
run: yarn run lint
superlinter:
name: SuperLinter
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Check changed files
id: changed-files
uses: tj-actions/changed-files@v45
with:
files: |
**.sh
**.yaml
**.yml
**Dockerfile
dev/**
- name: Super-linter
uses: super-linter/super-linter/slim@v6
uses: super-linter/super-linter/slim@v7
if: steps.changed-files.outputs.any_changed == 'true'
env:
BASH_SEVERITY: warning
DEFAULT_BRANCH: main
@ -74,6 +117,8 @@ jobs:
IGNORE_GITIGNORED_FILES: true
VALIDATE_BASH: true
VALIDATE_BASH_EXEC: true
VALIDATE_GITHUB_ACTIONS: true
# FIXME: temporarily disabled until api-docker.yaml's run script is fixed for shellcheck
# VALIDATE_GITHUB_ACTIONS: true
VALIDATE_DOCKERFILE_HADOLINT: true
VALIDATE_XML: true
VALIDATE_YAML: true
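The failure hint above points at `dev/reformat`, and the individual checks can also be reproduced locally with the same Poetry and Yarn commands the jobs run. The sketch below simply strings them together and assumes the `lint` dependency group is defined in `api/pyproject.toml`, as the workflow implies.

```bash
# Local equivalent of the python-style checks.
poetry install -C api --only lint
poetry run -C api ruff check ./api
poetry run -C api ruff format --check ./api
poetry run -C api dotenv-linter ./api/.env.example ./web/.env.example

# Local equivalent of the web-style checks.
cd web && yarn install --frozen-lockfile && yarn run lint
```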

View File

@ -4,6 +4,13 @@ on:
pull_request:
branches:
- main
paths:
- sdks/**
concurrency:
group: sdk-tests-${{ github.head_ref || github.run_id }}
cancel-in-progress: true
jobs:
build:
name: unit test for Node.js SDK

View File

@ -0,0 +1,54 @@
name: Check i18n Files and Create PR
on:
pull_request:
types: [closed]
branches: [main]
jobs:
check-and-update:
if: github.event.pull_request.merged == true
runs-on: ubuntu-latest
defaults:
run:
working-directory: web
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 2 # last 2 commits
- name: Check for file changes in i18n/en-US
id: check_files
run: |
recent_commit_sha=$(git rev-parse HEAD)
second_recent_commit_sha=$(git rev-parse HEAD~1)
changed_files=$(git diff --name-only $recent_commit_sha $second_recent_commit_sha -- 'i18n/en-US/*.ts')
echo "Changed files: $changed_files"
if [ -n "$changed_files" ]; then
echo "FILES_CHANGED=true" >> $GITHUB_ENV
else
echo "FILES_CHANGED=false" >> $GITHUB_ENV
fi
- name: Set up Node.js
if: env.FILES_CHANGED == 'true'
uses: actions/setup-node@v2
with:
node-version: 'lts/*'
- name: Install dependencies
if: env.FILES_CHANGED == 'true'
run: yarn install --frozen-lockfile
- name: Run npm script
if: env.FILES_CHANGED == 'true'
run: npm run auto-gen-i18n
- name: Create Pull Request
if: env.FILES_CHANGED == 'true'
uses: peter-evans/create-pull-request@v6
with:
commit-message: Update i18n files based on en-US changes
title: 'chore: translate i18n files'
body: This PR was automatically created to update i18n files based on changes in en-US locale.
branch: chore/automated-i18n-updates
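The generation step can also be run by hand before merging an en-US locale change, which avoids waiting for the automated PR. The commands mirror the job's steps and assume a recent Node.js LTS with Yarn available; reviewing the diff afterwards is a suggested extra step, not part of the workflow.

```bash
# Regenerate the other locales from i18n/en-US and review the result locally.
cd web
yarn install --frozen-lockfile
npm run auto-gen-i18n
git diff -- i18n/
```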

34
.gitignore vendored
View File

@ -134,13 +134,31 @@ dmypy.json
web/.vscode/settings.json
# Intellij IDEA Files
.idea/
.idea/*
!.idea/vcs.xml
!.idea/icon.png
.ideaDataSources/
*.iml
api/.idea
api/.env
api/storage/*
docker-legacy/volumes/app/storage/*
docker-legacy/volumes/db/data/*
docker-legacy/volumes/redis/data/*
docker-legacy/volumes/weaviate/*
docker-legacy/volumes/qdrant/*
docker-legacy/volumes/etcd/*
docker-legacy/volumes/minio/*
docker-legacy/volumes/milvus/*
docker-legacy/volumes/chroma/*
docker-legacy/volumes/opensearch/data/*
docker-legacy/volumes/pgvectors/data/*
docker-legacy/volumes/pgvector/data/*
docker/volumes/app/storage/*
docker/volumes/certbot/*
docker/volumes/db/data/*
docker/volumes/redis/data/*
docker/volumes/weaviate/*
@ -148,6 +166,16 @@ docker/volumes/qdrant/*
docker/volumes/etcd/*
docker/volumes/minio/*
docker/volumes/milvus/*
docker/volumes/chroma/*
docker/volumes/opensearch/data/*
docker/volumes/myscale/data/*
docker/volumes/myscale/log/*
docker/volumes/unstructured/*
docker/volumes/pgvector/data/*
docker/volumes/pgvecto_rs/data/*
docker/nginx/conf.d/default.conf
docker/middleware.env
sdks/python-client/build
sdks/python-client/dist
@ -156,3 +184,7 @@ sdks/python-client/dify_client.egg-info
.vscode/*
!.vscode/launch.json
pyrightconfig.json
api/.vscode
.idea/
.vscode

View File

@ -4,11 +4,11 @@ We need to be nimble and ship fast given where we are, but we also want to make
This guide, like Dify itself, is a constant work in progress. We highly appreciate your understanding if at times it lags behind the actual project, and welcome any feedback for us to improve.
In terms of licensing, please take a minute to read our short [License and Contributor Agreement](./license). The community also adheres to the [code of conduct](https://github.com/langgenius/.github/blob/main/CODE_OF_CONDUCT.md).
In terms of licensing, please take a minute to read our short [License and Contributor Agreement](./LICENSE). The community also adheres to the [code of conduct](https://github.com/langgenius/.github/blob/main/CODE_OF_CONDUCT.md).
## Before you jump in
[Find](https://github.com/langgenius/dify/issues?q=is:issue+is:closed) an existing issue, or [open](https://github.com/langgenius/dify/issues/new/choose) a new one. We categorize issues into 2 types:
[Find](https://github.com/langgenius/dify/issues?q=is:issue+is:open) an existing issue, or [open](https://github.com/langgenius/dify/issues/new/choose) a new one. We categorize issues into 2 types:
### Feature requests:
@ -81,7 +81,7 @@ Dify requires the following dependencies to build, make sure they're installed o
Dify is composed of a backend and a frontend. Navigate to the backend directory by `cd api/`, then follow the [Backend README](api/README.md) to install it. In a separate terminal, navigate to the frontend directory by `cd web/`, then follow the [Frontend README](web/README.md) to install.
Check the [installation FAQ](https://docs.dify.ai/getting-started/faq/install-faq) for a list of common issues and steps to troubleshoot.
Check the [installation FAQ](https://docs.dify.ai/learn-more/faq/self-host-faq) for a list of common issues and steps to troubleshoot.
### 5. Visit dify in your browser

View File

@ -2,17 +2,17 @@
考虑到我们的现状,我们需要灵活快速地交付,但我们也希望确保像你这样的贡献者在贡献过程中获得尽可能顺畅的体验。我们为此编写了这份贡献指南,旨在让你熟悉代码库和我们与贡献者的合作方式,以便你能快速进入有趣的部分。
这份指南,就像 Dify 本身一样,是一个不断改进的工作。如果有时它落后于实际项目,我们非常感谢你的理解,并欢迎任何反馈以供我们改进。
这份指南,就像 Dify 本身一样,是一个不断改进的工作。如果有时它落后于实际项目,我们非常感谢你的理解,并欢迎提供任何反馈以供我们改进。
在许可方面,请花一分钟阅读我们简短的[许可证和贡献者协议](./license)。社区还遵守[行为准则](https://github.com/langgenius/.github/blob/main/CODE_OF_CONDUCT.md)。
在许可方面,请花一分钟阅读我们简短的 [许可证和贡献者协议](./LICENSE)。社区还遵守 [行为准则](https://github.com/langgenius/.github/blob/main/CODE_OF_CONDUCT.md)。
## 在开始之前
[查找](https://github.com/langgenius/dify/issues?q=is:issue+is:closed)现有问题,或[创建](https://github.com/langgenius/dify/issues/new/choose)一个新问题。我们将问题分为两类:
[查找](https://github.com/langgenius/dify/issues?q=is:issue+is:open)现有问题,或 [创建](https://github.com/langgenius/dify/issues/new/choose) 一个新问题。我们将问题分为两类:
### 功能请求:
* 如果您要提出新的功能请求,请解释所提议的功能的目标,并尽可能提供详细的上下文。[@perzeusss](https://github.com/perzeuss)制作了一个很好的[功能请求助手](https://udify.app/chat/MK2kVSnw1gakVwMX),可以帮助您起草需求。随时尝试一下。
* 如果您要提出新的功能请求,请解释所提议的功能的目标,并尽可能提供详细的上下文。[@perzeusss](https://github.com/perzeuss) 制作了一个很好的 [功能请求助手](https://udify.app/chat/MK2kVSnw1gakVwMX),可以帮助您起草需求。随时尝试一下。
* 如果您想从现有问题中选择一个,请在其下方留下评论表示您的意愿。
@ -20,45 +20,44 @@
根据所提议的功能所属的领域不同,您可能需要与不同的团队成员交流。以下是我们团队成员目前正在从事的各个领域的概述:
| Member | Scope |
| 团队成员 | 工作范围 |
| ------------------------------------------------------------ | ---------------------------------------------------- |
| [@yeuoly](https://github.com/Yeuoly) | Architecting Agents |
| [@jyong](https://github.com/JohnJyong) | RAG pipeline design |
| [@GarfieldDai](https://github.com/GarfieldDai) | Building workflow orchestrations |
| [@iamjoel](https://github.com/iamjoel) & [@zxhlyh](https://github.com/zxhlyh) | Making our frontend a breeze to use |
| [@guchenhe](https://github.com/guchenhe) & [@crazywoola](https://github.com/crazywoola) | Developer experience, points of contact for anything |
| [@takatost](https://github.com/takatost) | Overall product direction and architecture |
| [@yeuoly](https://github.com/Yeuoly) | 架构 Agents |
| [@jyong](https://github.com/JohnJyong) | RAG 流水线设计 |
| [@GarfieldDai](https://github.com/GarfieldDai) | 构建 workflow 编排 |
| [@iamjoel](https://github.com/iamjoel) & [@zxhlyh](https://github.com/zxhlyh) | 让我们的前端更易用 |
| [@guchenhe](https://github.com/guchenhe) & [@crazywoola](https://github.com/crazywoola) | 开发人员体验, 综合事项联系人 |
| [@takatost](https://github.com/takatost) | 产品整体方向和架构 |
How we prioritize:
事项优先级:
| Feature Type | Priority |
| 功能类型 | 优先级 |
| ------------------------------------------------------------ | --------------- |
| High-Priority Features as being labeled by a team member | High Priority |
| Popular feature requests from our [community feedback board](https://github.com/langgenius/dify/discussions/categories/feedbacks) | Medium Priority |
| Non-core features and minor enhancements | Low Priority |
| Valuable but not immediate | Future-Feature |
| 被团队成员标记为高优先级的功能 | 高优先级 |
| [community feedback board](https://github.com/langgenius/dify/discussions/categories/feedbacks) 内反馈的常见功能请求 | 中等优先级 |
| 非核心功能和小幅改进 | 低优先级 |
| 有价值但不紧急 | 未来功能 |
### 其他任何事情(例如bug报告、性能优化、拼写错误更正):
### 其他任何事情(例如 bug 报告、性能优化、拼写错误更正):
* 立即开始编码。
How we prioritize:
事项优先级:
| Issue Type | Priority |
| Issue 类型 | 优先级 |
| ------------------------------------------------------------ | --------------- |
| Bugs in core functions (cannot login, applications not working, security loopholes) | Critical |
| Non-critical bugs, performance boosts | Medium Priority |
| Minor fixes (typos, confusing but working UI) | Low Priority |
| 核心功能的 Bugs(例如无法登录、应用无法工作、安全漏洞) | 紧急 |
| 非紧急 bugs, 性能提升 | 中等优先级 |
| 小幅修复(错别字, 能正常工作但存在误导的 UI) | 低优先级 |
## 安装
以下是设置Dify进行开发的步骤
以下是设置 Dify 进行开发的步骤:
### 1. Fork该仓库
### 1. Fork 该仓库
### 2. 克隆仓库
从终端克隆fork的仓库:
从终端克隆代码仓库:
```
git clone git@github.com:<github_username>/dify.git
@ -76,72 +75,72 @@ Dify 依赖以下工具和库:
### 4. 安装
Dify由后端和前端组成。通过`cd api/`导航到后端目录,然后按照[后端README](api/README.md)进行安装。在另一个终端中,通过`cd web/`导航到前端目录,然后按照[前端README](web/README.md)进行安装。
Dify 由后端和前端组成。通过 `cd api/` 导航到后端目录,然后按照 [后端 README](api/README.md) 进行安装。在另一个终端中,通过 `cd web/` 导航到前端目录,然后按照 [前端 README](web/README.md) 进行安装。
查看[安装常见问题解答](https://docs.dify.ai/getting-started/faq/install-faq)以获取常见问题列表和故障排除步骤。
查看 [安装常见问题解答](https://docs.dify.ai/v/zh-hans/learn-more/faq/install-faq) 以获取常见问题列表和故障排除步骤。
### 5. 在浏览器中访问Dify
### 5. 在浏览器中访问 Dify
为了验证您的设置,打开浏览器并访问[http://localhost:3000](http://localhost:3000)(默认或您自定义的URL和端口)。现在您应该看到Dify正在运行。
为了验证您的设置,打开浏览器并访问 [http://localhost:3000](http://localhost:3000)(默认或您自定义的 URL 和端口)。现在您应该看到 Dify 正在运行。
## 开发
如果您要添加模型提供程序,请参考[此指南](https://github.com/langgenius/dify/blob/main/api/core/model_runtime/README.md)。
如果您要添加模型提供程序,请参考 [此指南](https://github.com/langgenius/dify/blob/main/api/core/model_runtime/README.md)。
如果您要向AgentWorkflow添加工具提供程序请参考[此指南](./api/core/tools/README.md)。
如果您要向 AgentWorkflow 添加工具提供程序,请参考 [此指南](./api/core/tools/README.md)。
为了帮助您快速了解您的贡献在哪个部分以下是Dify后端和前端的简要注释大纲
为了帮助您快速了解您的贡献在哪个部分,以下是 Dify 后端和前端的简要注释大纲:
### 后端
Dify的后端使用Python编写使用[Flask](https://flask.palletsprojects.com/en/3.0.x/)框架。它使用[SQLAlchemy](https://www.sqlalchemy.org/)作为ORM使用[Celery](https://docs.celeryq.dev/en/stable/getting-started/introduction.html)作为任务队列。授权逻辑通过Flask-login进行处理。
Dify 的后端使用 Python 编写,使用 [Flask](https://flask.palletsprojects.com/en/3.0.x/) 框架。它使用 [SQLAlchemy](https://www.sqlalchemy.org/) 作为 ORM使用 [Celery](https://docs.celeryq.dev/en/stable/getting-started/introduction.html) 作为任务队列。授权逻辑通过 Flask-login 进行处理。
```
[api/]
├── constants // Constant settings used throughout code base.
├── controllers // API route definitions and request handling logic.
├── core // Core application orchestration, model integrations, and tools.
├── docker // Docker & containerization related configurations.
├── events // Event handling and processing
├── extensions // Extensions with 3rd party frameworks/platforms.
├── fields // field definitions for serialization/marshalling.
├── libs // Reusable libraries and helpers.
├── migrations // Scripts for database migration.
├── models // Database models & schema definitions.
├── services // Specifies business logic.
├── storage // Private key storage.
├── tasks // Handling of async tasks and background jobs.
├── constants // 用于整个代码库的常量设置。
├── controllers // API 路由定义和请求处理逻辑。
├── core // 核心应用编排、模型集成和工具。
├── docker // Docker 和容器化相关配置。
├── events // 事件处理和处理。
├── extensions // 与第三方框架/平台的扩展。
├── fields // 用于序列化/封装的字段定义。
├── libs // 可重用的库和助手。
├── migrations // 数据库迁移脚本。
├── models // 数据库模型和架构定义。
├── services // 指定业务逻辑。
├── storage // 私钥存储。
├── tasks // 异步任务和后台作业的处理。
└── tests
```
### 前端
该网站使用基于Typescript[Next.js](https://nextjs.org/)模板进行引导,并使用[Tailwind CSS](https://tailwindcss.com/)进行样式设计。[React-i18next](https://react.i18next.com/)用于国际化。
该网站使用基于 Typescript[Next.js](https://nextjs.org/) 模板进行引导,并使用 [Tailwind CSS](https://tailwindcss.com/) 进行样式设计。[React-i18next](https://react.i18next.com/) 用于国际化。
```
[web/]
├── app // layouts, pages, and components
│ ├── (commonLayout) // common layout used throughout the app
│ ├── (shareLayout) // layouts specifically shared across token-specific sessions
│ ├── activate // activate page
│ ├── components // shared by pages and layouts
│ ├── install // install page
│ ├── signin // signin page
│ └── styles // globally shared styles
├── assets // Static assets
├── bin // scripts ran at build step
├── config // adjustable settings and options
├── context // shared contexts used by different portions of the app
├── dictionaries // Language-specific translate files
├── docker // container configurations
├── hooks // Reusable hooks
├── i18n // Internationalization configuration
├── models // describes data models & shapes of API responses
├── public // meta assets like favicon
├── service // specifies shapes of API actions
├── app // 布局、页面和组件
│ ├── (commonLayout) // 整个应用通用的布局
│ ├── (shareLayout) // 在特定会话中共享的布局
│ ├── activate // 激活页面
│ ├── components // 页面和布局共享的组件
│ ├── install // 安装页面
│ ├── signin // 登录页面
│ └── styles // 全局共享的样式
├── assets // 静态资源
├── bin // 构建步骤运行的脚本
├── config // 可调整的设置和选项
├── context // 应用中不同部分使用的共享上下文
├── dictionaries // 语言特定的翻译文件
├── docker // 容器配置
├── hooks // 可重用的钩子
├── i18n // 国际化配置
├── models // 描述数据模型和 API 响应的形状
├── public // favicon 等元资源
├── service // 定义 API 操作的形状
├── test
├── types // descriptions of function params and return values
└── utils // Shared utility functions
├── types // 函数参数和返回值的描述
└── utils // 共享的实用函数
```
## 提交你的 PR

160
CONTRIBUTING_JA.md Normal file
View File

@ -0,0 +1,160 @@
Dify にコントリビュートしたいとお考えなのですね。それは素晴らしいことです。
私たちは、LLM アプリケーションの構築と管理のための最も直感的なワークフローを設計するという壮大な野望を持っています。人数も資金も限られている新興企業として、コミュニティからの支援は本当に重要です。
私たちは現状を鑑み、機敏かつ迅速に開発をする必要がありますが、同時にあなた様のようなコントリビューターの方々に、可能な限りスムーズな貢献体験をしていただきたいと思っています。そのためにこのコントリビュートガイドを作成しました。
コードベースやコントリビュータの方々と私たちがどのように仕事をしているのかに慣れていただき、楽しいパートにすぐに飛び込めるようにすることが目的です。
このガイドは Dify そのものと同様に、継続的に改善されています。実際のプロジェクトに遅れをとることがあるかもしれませんが、ご理解のほどよろしくお願いいたします。
ライセンスに関しては、私たちの短い[ライセンスおよびコントリビューター規約](./LICENSE)をお読みください。また、コミュニティは[行動規範](https://github.com/langgenius/.github/blob/main/CODE_OF_CONDUCT.md)を遵守しています。
## 飛び込む前に
[既存の Issue](https://github.com/langgenius/dify/issues?q=is:issue+is:open) を探すか、[新しい Issue](https://github.com/langgenius/dify/issues/new/choose) を作成してください。私たちは Issue を 2 つのタイプに分類しています。
### 機能リクエスト
* 新しい機能要望を出す場合は、提案する機能が何を実現するものなのかを説明し、可能な限り多くのコンテキストを含めてください。[@perzeusss](https://github.com/perzeuss)は、あなた様の要望を書き出すのに役立つ [Feature Request Copilot](https://udify.app/chat/MK2kVSnw1gakVwMX) を作ってくれました。気軽に試してみてください。
* 既存の課題から 1 つ選びたい場合は、その下にコメントを書いてください。
関連する方向で作業しているチームメンバーが参加します。すべてが良好であれば、コーディングを開始する許可が与えられます。私たちが変更を提案した場合にあなた様の作業が無駄になることがないよう、それまでこの機能の作業を控えていただくようお願いいたします。
提案された機能がどの分野に属するかによって、あなた様は異なるチーム・メンバーと話をするかもしれません。以下は、各チームメンバーが現在取り組んでいる分野の概要です。
| Member | Scope |
| --------------------------------------------------------------------------------------- | ------------------------------------ |
| [@yeuoly](https://github.com/Yeuoly) | エージェントアーキテクチャ |
| [@jyong](https://github.com/JohnJyong) | RAG パイプライン設計 |
| [@GarfieldDai](https://github.com/GarfieldDai) | workflow orchestrations の構築 |
| [@iamjoel](https://github.com/iamjoel) & [@zxhlyh](https://github.com/zxhlyh) | フロントエンドを使いやすくする |
| [@guchenhe](https://github.com/guchenhe) & [@crazywoola](https://github.com/crazywoola) | 開発者体験、何でも相談できる窓口 |
| [@takatost](https://github.com/takatost) | 全体的な製品の方向性とアーキテクチャ |
優先順位の付け方:
| Feature Type | Priority |
| --------------------------------------------------------------------------------------------------------------------- | --------------- |
| チームメンバーによってラベル付けされた優先度の高い機能 | High Priority |
| [community feedback board](https://github.com/langgenius/dify/discussions/categories/feedbacks)の人気の機能リクエスト | Medium Priority |
| 非コア機能とマイナーな機能強化 | Low Priority |
| 価値はあるが即効性はない | Future-Feature |
### その他 (バグレポート、パフォーマンスの最適化、誤字の修正など)
* すぐにコーディングを始めてください
優先順位の付け方:
| Issue Type | Priority |
| -------------------------------------------------------------------------------------- | --------------- |
| コア機能のバグ(ログインできない、アプリケーションが動作しない、セキュリティの抜け穴) | Critical |
| 致命的でないバグ、パフォーマンス向上 | Medium Priority |
| 細かな修正(誤字脱字、機能はするが分かりにくい UI) | Low Priority |
## インストール
以下の手順で 、Difyのセットアップをしてください。
### 1. このリポジトリをフォークする
### 2. リポジトリをクローンする
フォークしたリポジトリをターミナルからクローンします。
```
git clone git@github.com:<github_username>/dify.git
```
### 3. 依存関係の確認
Dify を構築するには次の依存関係が必要です。それらがシステムにインストールされていることを確認してください。
- [Docker](https://www.docker.com/)
- [Docker Compose](https://docs.docker.com/compose/install/)
- [Node.js v18.x (LTS)](http://nodejs.org)
- [npm](https://www.npmjs.com/) version 8.x.x or [Yarn](https://yarnpkg.com/)
- [Python](https://www.python.org/) version 3.10.x
### 4. インストール
Dify はバックエンドとフロントエンドから構成されています。
まず`cd api/`でバックエンドのディレクトリに移動し、[Backend README](api/README.md)に従ってインストールします。
次に別のターミナルで、`cd web/`でフロントエンドのディレクトリに移動し、[Frontend README](web/README.md)に従ってインストールしてください。
よくある問題とトラブルシューティングの手順については、[installation FAQ](https://docs.dify.ai/v/japanese/learn-more/faq/install-faq) を確認してください。
### 5. ブラウザで dify にアクセスする
設定を確認するために、ブラウザで[http://localhost:3000](http://localhost:3000)(デフォルト、または自分で設定した URL とポート)にアクセスしてください。Dify が起動して実行中であることが確認できるはずです。
## 開発中
モデルプロバイダーを追加する場合は、[このガイド](https://github.com/langgenius/dify/blob/main/api/core/model_runtime/README.md)が役立ちます。
Agent や Workflow にツールプロバイダーを追加する場合は、[このガイド](./api/core/tools/README.md)が役立ちます。
Dify のバックエンドとフロントエンドの概要を簡単に説明します。
### バックエンド
Dify のバックエンドは[Flask](https://flask.palletsprojects.com/en/3.0.x/)を使って Python で書かれています。ORM には[SQLAlchemy](https://www.sqlalchemy.org/)を、タスクキューには[Celery](https://docs.celeryq.dev/en/stable/getting-started/introduction.html)を使っています。認証ロジックは Flask-login 経由で行われます。
```
[api/]
├── constants // コードベース全体で使用される定数設定
├── controllers // APIルート定義とリクエスト処理ロジック
├── core // アプリケーションの中核的な管理、モデル統合、およびツール
├── docker // Dockerおよびコンテナ関連の設定
├── events // イベントのハンドリングと処理
├── extensions // 第三者のフレームワーク/プラットフォームとの拡張
├── fields // シリアライゼーション/マーシャリング用のフィールド定義
├── libs // 再利用可能なライブラリとヘルパー
├── migrations // データベースマイグレーションスクリプト
├── models // データベースモデルとスキーマ定義
├── services // ビジネスロジックの定義
├── storage // 秘密鍵の保存
├── tasks // 非同期タスクとバックグラウンドジョブの処理
└── tests // テスト関連のファイル
```
### フロントエンド
このウェブサイトは、Typescriptベースの[Next.js](https://nextjs.org/)テンプレートを使ってブートストラップされ、[Tailwind CSS](https://tailwindcss.com/)を使ってスタイリングされています。国際化には[React-i18next](https://react.i18next.com/)を使用しています。
```
[web/]
├── app // レイアウト、ページ、コンポーネント
│ ├── (commonLayout) // アプリ全体で共通のレイアウト
│ ├── (shareLayout) // トークン特有のセッションで共有されるレイアウト
│ ├── activate // アクティベートページ
│ ├── components // ページやレイアウトで共有されるコンポーネント
│ ├── install // インストールページ
│ ├── signin // サインインページ
│ └── styles // グローバルに共有されるスタイル
├── assets // 静的アセット
├── bin // ビルドステップで実行されるスクリプト
├── config // 調整可能な設定とオプション
├── context // アプリの異なる部分で使用される共有コンテキスト
├── dictionaries // 言語別の翻訳ファイル
├── docker // コンテナ設定
├── hooks // 再利用可能なフック
├── i18n // 国際化設定
├── models // データモデルとAPIレスポンスの形状を記述
├── public // ファビコンなどのメタアセット
├── service // APIアクションの形状を指定
├── test
├── types // 関数のパラメータと戻り値の記述
└── utils // 共有ユーティリティ関数
```
## PR を投稿する
いよいよ、私たちのリポジトリにプルリクエスト (PR) を提出する時が来ました。主要な機能については、まず `deploy/dev` ブランチにマージしてテストしてから `main` ブランチにマージします。
マージ競合などの問題が発生した場合、またはプル リクエストを開く方法がわからない場合は、[GitHub's pull request tutorial](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests) をチェックしてみてください。
これで完了です!あなた様の PR がマージされると、[README](https://github.com/langgenius/dify/blob/main/README.md) にコントリビューターとして紹介されます。
## ヘルプを得る
コントリビュート中に行き詰まったり、疑問が生じたりした場合は、GitHub の関連する issue から質問していただくか、[Discord](https://discord.gg/8Tpq4AcN9c)でチャットしてください。

156
CONTRIBUTING_VI.md Normal file
View File

@ -0,0 +1,156 @@
Thật tuyệt vời khi bạn muốn đóng góp cho Dify! Chúng tôi rất mong chờ được thấy những gì bạn sẽ làm. Là một startup với nguồn nhân lực và tài chính hạn chế, chúng tôi có tham vọng lớn là thiết kế quy trình trực quan nhất để xây dựng và quản lý các ứng dụng LLM. Mọi sự giúp đỡ từ cộng đồng đều rất quý giá đối với chúng tôi.
Chúng tôi cần linh hoạt và làm việc nhanh chóng, nhưng đồng thời cũng muốn đảm bảo các cộng tác viên như bạn có trải nghiệm đóng góp thuận lợi nhất có thể. Chúng tôi đã tạo ra hướng dẫn đóng góp này nhằm giúp bạn làm quen với codebase và cách chúng tôi làm việc với các cộng tác viên, để bạn có thể nhanh chóng bắt tay vào phần thú vị.
Hướng dẫn này, cũng như bản thân Dify, đang trong quá trình cải tiến liên tục. Chúng tôi rất cảm kích sự thông cảm của bạn nếu đôi khi nó không theo kịp dự án thực tế, và chúng tôi luôn hoan nghênh mọi phản hồi để cải thiện.
Về vấn đề cấp phép, xin vui lòng dành chút thời gian đọc qua [Thỏa thuận Cấp phép và Đóng góp](./LICENSE) ngắn gọn của chúng tôi. Cộng đồng cũng tuân thủ [quy tắc ứng xử](https://github.com/langgenius/.github/blob/main/CODE_OF_CONDUCT.md).
## Trước khi bắt đầu
[Tìm kiếm](https://github.com/langgenius/dify/issues?q=is:issue+is:open) một vấn đề hiện có, hoặc [tạo mới](https://github.com/langgenius/dify/issues/new/choose) một vấn đề. Chúng tôi phân loại các vấn đề thành 2 loại:
### Yêu cầu tính năng:
* Nếu bạn đang tạo một yêu cầu tính năng mới, chúng tôi muốn bạn giải thích tính năng đề xuất sẽ đạt được điều gì và cung cấp càng nhiều thông tin chi tiết càng tốt. [@perzeusss](https://github.com/perzeuss) đã tạo một [Trợ lý Yêu cầu Tính năng](https://udify.app/chat/MK2kVSnw1gakVwMX) rất hữu ích để giúp bạn soạn thảo nhu cầu của mình. Hãy thử dùng nó nhé.
* Nếu bạn muốn chọn một vấn đề từ danh sách hiện có, chỉ cần để lại bình luận dưới vấn đề đó nói rằng bạn sẽ làm.
Một thành viên trong nhóm làm việc trong lĩnh vực liên quan sẽ được thông báo. Nếu mọi thứ ổn, họ sẽ cho phép bạn bắt đầu code. Chúng tôi yêu cầu bạn chờ đợi cho đến lúc đó trước khi bắt tay vào làm tính năng, để không lãng phí công sức của bạn nếu chúng tôi đề xuất thay đổi.
Tùy thuộc vào lĩnh vực mà tính năng đề xuất thuộc về, bạn có thể nói chuyện với các thành viên khác nhau trong nhóm. Dưới đây là danh sách các lĩnh vực mà các thành viên trong nhóm chúng tôi đang làm việc hiện tại:
| Thành viên | Phạm vi |
| ------------------------------------------------------------ | ---------------------------------------------------- |
| [@yeuoly](https://github.com/Yeuoly) | Thiết kế kiến trúc Agents |
| [@jyong](https://github.com/JohnJyong) | Thiết kế quy trình RAG |
| [@GarfieldDai](https://github.com/GarfieldDai) | Xây dựng quy trình làm việc |
| [@iamjoel](https://github.com/iamjoel) & [@zxhlyh](https://github.com/zxhlyh) | Làm cho giao diện người dùng dễ sử dụng |
| [@guchenhe](https://github.com/guchenhe) & [@crazywoola](https://github.com/crazywoola) | Trải nghiệm nhà phát triển, đầu mối liên hệ cho mọi vấn đề |
| [@takatost](https://github.com/takatost) | Định hướng và kiến trúc tổng thể sản phẩm |
Cách chúng tôi ưu tiên:
| Loại tính năng | Mức độ ưu tiên |
| ------------------------------------------------------------ | -------------- |
| Tính năng ưu tiên cao được gắn nhãn bởi thành viên trong nhóm | Ưu tiên cao |
| Yêu cầu tính năng phổ biến từ [bảng phản hồi cộng đồng](https://github.com/langgenius/dify/discussions/categories/feedbacks) của chúng tôi | Ưu tiên trung bình |
| Tính năng không quan trọng và cải tiến nhỏ | Ưu tiên thấp |
| Có giá trị nhưng không cấp bách | Tính năng tương lai |
### Những vấn đề khác (ví dụ: báo cáo lỗi, tối ưu hiệu suất, sửa lỗi chính tả):
* Bắt đầu code ngay lập tức.
Cách chúng tôi ưu tiên:
| Loại vấn đề | Mức độ ưu tiên |
| ------------------------------------------------------------ | -------------- |
| Lỗi trong các chức năng chính (không thể đăng nhập, ứng dụng không hoạt động, lỗ hổng bảo mật) | Nghiêm trọng |
| Lỗi không quan trọng, cải thiện hiệu suất | Ưu tiên trung bình |
| Sửa lỗi nhỏ (lỗi chính tả, giao diện người dùng gây nhầm lẫn nhưng vẫn hoạt động) | Ưu tiên thấp |
## Cài đặt
Dưới đây là các bước để thiết lập Dify cho việc phát triển:
### 1. Fork repository này
### 2. Clone repository
Clone repository đã fork từ terminal của bạn:
```
git clone git@github.com:<tên_người_dùng_github>/dify.git
```
### 3. Kiểm tra các phụ thuộc
Dify yêu cầu các phụ thuộc sau để build, hãy đảm bảo chúng đã được cài đặt trên hệ thống của bạn:
- [Docker](https://www.docker.com/)
- [Docker Compose](https://docs.docker.com/compose/install/)
- [Node.js v18.x (LTS)](http://nodejs.org)
- [npm](https://www.npmjs.com/) phiên bản 8.x.x hoặc [Yarn](https://yarnpkg.com/)
- [Python](https://www.python.org/) phiên bản 3.10.x
### 4. Cài đặt
Dify bao gồm một backend và một frontend. Đi đến thư mục backend bằng lệnh `cd api/`, sau đó làm theo hướng dẫn trong [README của Backend](api/README.md) để cài đặt. Trong một terminal khác, đi đến thư mục frontend bằng lệnh `cd web/`, sau đó làm theo hướng dẫn trong [README của Frontend](web/README.md) để cài đặt.
Kiểm tra [FAQ về cài đặt](https://docs.dify.ai/learn-more/faq/self-host-faq) để xem danh sách các vấn đề thường gặp và các bước khắc phục.
### 5. Truy cập Dify trong trình duyệt của bạn
Để xác nhận cài đặt của bạn, hãy truy cập [http://localhost:3000](http://localhost:3000) (địa chỉ mặc định, hoặc URL và cổng bạn đã cấu hình) trong trình duyệt. Bạn sẽ thấy Dify đang chạy.
## Phát triển
Nếu bạn đang thêm một nhà cung cấp mô hình, [hướng dẫn này](https://github.com/langgenius/dify/blob/main/api/core/model_runtime/README.md) dành cho bạn.
Nếu bạn đang thêm một nhà cung cấp công cụ cho Agent hoặc Workflow, [hướng dẫn này](./api/core/tools/README.md) dành cho bạn.
Để giúp bạn nhanh chóng định hướng phần đóng góp của mình, dưới đây là một bản phác thảo ngắn gọn về cấu trúc backend & frontend của Dify:
### Backend
Backend của Dify được viết bằng Python sử dụng [Flask](https://flask.palletsprojects.com/en/3.0.x/). Nó sử dụng [SQLAlchemy](https://www.sqlalchemy.org/) cho ORM và [Celery](https://docs.celeryq.dev/en/stable/getting-started/introduction.html) cho hàng đợi tác vụ. Logic xác thực được thực hiện thông qua Flask-login.
```
[api/]
├── constants // Các cài đặt hằng số được sử dụng trong toàn bộ codebase.
├── controllers // Định nghĩa các route API và logic xử lý yêu cầu.
├── core // Điều phối ứng dụng cốt lõi, tích hợp mô hình và công cụ.
├── docker // Cấu hình liên quan đến Docker & containerization.
├── events // Xử lý và xử lý sự kiện
├── extensions // Mở rộng với các framework/nền tảng bên thứ 3.
├── fields // Định nghĩa trường cho serialization/marshalling.
├── libs // Thư viện và tiện ích có thể tái sử dụng.
├── migrations // Script cho việc di chuyển cơ sở dữ liệu.
├── models // Mô hình cơ sở dữ liệu & định nghĩa schema.
├── services // Xác định logic nghiệp vụ.
├── storage // Lưu trữ khóa riêng tư.
├── tasks // Xử lý các tác vụ bất đồng bộ và công việc nền.
└── tests
```
### Frontend
Website được khởi tạo trên boilerplate [Next.js](https://nextjs.org/) bằng Typescript và sử dụng [Tailwind CSS](https://tailwindcss.com/) cho styling. [React-i18next](https://react.i18next.com/) được sử dụng cho việc quốc tế hóa.
```
[web/]
├── app // layouts, pages và components
│ ├── (commonLayout) // layout chung được sử dụng trong toàn bộ ứng dụng
│ ├── (shareLayout) // layouts được chia sẻ cụ thể cho các phiên dựa trên token
│ ├── activate // trang kích hoạt
│ ├── components // được chia sẻ bởi các trang và layouts
│ ├── install // trang cài đặt
│ ├── signin // trang đăng nhập
│ └── styles // styles được chia sẻ toàn cục
├── assets // Tài nguyên tĩnh
├── bin // scripts chạy ở bước build
├── config // cài đặt và tùy chọn có thể điều chỉnh
├── context // contexts được chia sẻ bởi các phần khác nhau của ứng dụng
├── dictionaries // File dịch cho từng ngôn ngữ
├── docker // cấu hình container
├── hooks // Hooks có thể tái sử dụng
├── i18n // Cấu hình quốc tế hóa
├── models // mô tả các mô hình dữ liệu & hình dạng của phản hồi API
├── public // tài nguyên meta như favicon
├── service // xác định hình dạng của các hành động API
├── test
├── types // mô tả các tham số hàm và giá trị trả về
└── utils // Các hàm tiện ích được chia sẻ
```
## Gửi PR của bạn
Cuối cùng, đã đến lúc mở một pull request (PR) đến repository của chúng tôi. Đối với các tính năng lớn, chúng tôi sẽ merge chúng vào nhánh `deploy/dev` để kiểm tra trước khi đưa vào nhánh `main`. Nếu bạn gặp vấn đề như xung đột merge hoặc không biết cách mở pull request, hãy xem [hướng dẫn về pull request của GitHub](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests).
Và thế là xong! Khi PR của bạn được merge, bạn sẽ được giới thiệu là một người đóng góp trong [README](https://github.com/langgenius/dify/blob/main/README.md) của chúng tôi.
## Nhận trợ giúp
Nếu bạn gặp khó khăn hoặc có câu hỏi cấp bách trong quá trình đóng góp, hãy đặt câu hỏi của bạn trong vấn đề GitHub liên quan, hoặc tham gia [Discord](https://discord.gg/8Tpq4AcN9c) của chúng tôi để trò chuyện nhanh chóng.

View File

@ -4,7 +4,7 @@ Dify is licensed under the Apache License 2.0, with the following additional con
1. Dify may be utilized commercially, including as a backend service for other applications or as an application development platform for enterprises. Should the conditions below be met, a commercial license must be obtained from the producer:
a. Multi-tenant SaaS service: Unless explicitly authorized by Dify in writing, you may not use the Dify source code to operate a multi-tenant environment.
a. Multi-tenant service: Unless explicitly authorized by Dify in writing, you may not use the Dify source code to operate a multi-tenant environment.
- Tenant Definition: Within the context of Dify, one tenant corresponds to one workspace. The workspace provides a separated area for each tenant's data and configurations.
b. LOGO and copyright information: In the process of using Dify's frontend components, you may not remove or modify the LOGO or copyright information in the Dify console or applications. This restriction is inapplicable to uses of Dify that do not involve its frontend components.

View File

@ -4,7 +4,7 @@
<a href="https://cloud.dify.ai">Dify Cloud</a> ·
<a href="https://docs.dify.ai/getting-started/install-self-hosted">Self-hosting</a> ·
<a href="https://docs.dify.ai">Documentation</a> ·
<a href="https://cal.com/guchenhe/60-min-meeting">Enterprise inquiry</a>
<a href="https://udify.app/chat/22L1zSxg6yW1cWQg">Enterprise inquiry</a>
</p>
<p align="center">
@ -29,19 +29,19 @@
</p>
<p align="center">
<a href="./README.md"><img alt="Commits last month" src="https://img.shields.io/badge/English-d9d9d9"></a>
<a href="./README_CN.md"><img alt="Commits last month" src="https://img.shields.io/badge/简体中文-d9d9d9"></a>
<a href="./README_JA.md"><img alt="Commits last month" src="https://img.shields.io/badge/日本語-d9d9d9"></a>
<a href="./README_ES.md"><img alt="Commits last month" src="https://img.shields.io/badge/Español-d9d9d9"></a>
<a href="./README_FR.md"><img alt="Commits last month" src="https://img.shields.io/badge/Français-d9d9d9"></a>
<a href="./README_KL.md"><img alt="Commits last month" src="https://img.shields.io/badge/Klingon-d9d9d9"></a>
<a href="./README.md"><img alt="README in English" src="https://img.shields.io/badge/English-d9d9d9"></a>
<a href="./README_CN.md"><img alt="简体中文版自述文件" src="https://img.shields.io/badge/简体中文-d9d9d9"></a>
<a href="./README_JA.md"><img alt="日本語のREADME" src="https://img.shields.io/badge/日本語-d9d9d9"></a>
<a href="./README_ES.md"><img alt="README en Español" src="https://img.shields.io/badge/Español-d9d9d9"></a>
<a href="./README_FR.md"><img alt="README en Français" src="https://img.shields.io/badge/Français-d9d9d9"></a>
<a href="./README_KL.md"><img alt="README tlhIngan Hol" src="https://img.shields.io/badge/Klingon-d9d9d9"></a>
<a href="./README_KR.md"><img alt="README in Korean" src="https://img.shields.io/badge/한국어-d9d9d9"></a>
<a href="./README_AR.md"><img alt="README بالعربية" src="https://img.shields.io/badge/العربية-d9d9d9"></a>
<a href="./README_TR.md"><img alt="Türkçe README" src="https://img.shields.io/badge/Türkçe-d9d9d9"></a>
<a href="./README_VI.md"><img alt="README Tiếng Việt" src="https://img.shields.io/badge/Ti%E1%BA%BFng%20Vi%E1%BB%87t-d9d9d9"></a>
</p>
#
<p align="center">
<a href="https://trendshift.io/repositories/2152" target="_blank"><img src="https://trendshift.io/api/badge/repositories/2152" alt="langgenius%2Fdify | Trendshift" style="width: 250px; height: 55px;" width="250" height="55"/></a>
</p>
Dify is an open-source LLM app development platform. Its intuitive interface combines AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production. Here's a list of the core features:
</br> </br>
@ -54,7 +54,7 @@ Dify is an open-source LLM app development platform. Its intuitive interface com
**2. Comprehensive model support**:
Seamless integration with hundreds of proprietary / open-source LLMs from dozens of inference providers and self-hosted solutions, covering GPT, Mistral, Llama2, and any OpenAI API-compatible models. A full list of supported model providers can be found [here](https://docs.dify.ai/getting-started/readme/model-providers).
Seamless integration with hundreds of proprietary / open-source LLMs from dozens of inference providers and self-hosted solutions, covering GPT, Mistral, Llama3, and any OpenAI API-compatible models. A full list of supported model providers can be found [here](https://docs.dify.ai/getting-started/readme/model-providers).
![providers-v5](https://github.com/langgenius/dify/assets/13230914/5a17bdbe-097a-4100-8363-40255b70f6e3)
@ -66,7 +66,7 @@ Dify is an open-source LLM app development platform. Its intuitive interface com
Extensive RAG capabilities that cover everything from document ingestion to retrieval, with out-of-box support for text extraction from PDFs, PPTs, and other common document formats.
**5. Agent capabilities**:
You can define agents based on LLM Function Calling or ReAct, and add pre-built or custom tools for the agent. Dify provides 50+ built-in tools for AI agents, such as Google Search, DELL·E, Stable Diffusion and WolframAlpha.
You can define agents based on LLM Function Calling or ReAct, and add pre-built or custom tools for the agent. Dify provides 50+ built-in tools for AI agents, such as Google Search, DALL·E, Stable Diffusion and WolframAlpha.
**6. LLMOps**:
Monitor and analyze application logs and performance over time. You could continuously improve prompts, datasets, and models based on production data and annotations.
@ -109,7 +109,7 @@ Dify is an open-source LLM app development platform. Its intuitive interface com
<td align="center">Agent</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
@ -127,7 +127,7 @@ Dify is an open-source LLM app development platform. Its intuitive interface com
<td align="center"></td>
</tr>
<tr>
<td align="center">Enterprise Feature (SSO/Access control)</td>
<td align="center">Enterprise Features (SSO/Access control)</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
@ -152,7 +152,7 @@ Quickly get Dify running in your environment with this [starter guide](#quick-st
Use our [documentation](https://docs.dify.ai) for further references and more in-depth instructions.
- **Dify for enterprise / organizations</br>**
We provide additional enterprise-centric features. [Schedule a meeting with us](https://cal.com/guchenhe/30min) or [send us an email](mailto:business@dify.ai?subject=[GitHub]Business%20License%20Inquiry) to discuss enterprise needs. </br>
We provide additional enterprise-centric features. [Log your questions for us through this chatbot](https://udify.app/chat/22L1zSxg6yW1cWQg) or [send us an email](mailto:business@dify.ai?subject=[GitHub]Business%20License%20Inquiry) to discuss enterprise needs. </br>
> For startups and small businesses using AWS, check out [Dify Premium on AWS Marketplace](https://aws.amazon.com/marketplace/pp/prodview-t22mebxzwjhu6) and deploy it to your own AWS VPC with one-click. It's an affordable AMI offering with the option to create apps with custom logo and branding.
@ -176,6 +176,7 @@ The easiest way to start the Dify server is to run our [docker-compose.yml](dock
```bash
cd docker
cp .env.example .env
docker compose up -d
```
@ -185,13 +186,19 @@ After running, you can access the Dify dashboard in your browser at [http://loca
## Next steps
If you need to customize the configuration, please refer to the comments in our [docker-compose.yml](docker/docker-compose.yaml) file and manually set the environment configuration. After making the changes, please run `docker-compose up -d` again. You can see the full list of environment variables [here](https://docs.dify.ai/getting-started/install-self-hosted/environments).
If you need to customize the configuration, please refer to the comments in our [.env.example](docker/.env.example) file and update the corresponding values in your `.env` file. Additionally, you might need to make adjustments to the `docker-compose.yaml` file itself, such as changing image versions, port mappings, or volume mounts, based on your specific deployment environment and requirements. After making any changes, please re-run `docker-compose up -d`. You can find the full list of available environment variables [here](https://docs.dify.ai/getting-started/install-self-hosted/environments).
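As a concrete but hypothetical example of that flow, changing a single value in `.env` and recreating the containers might look like the following; the variable name is only illustrative and should be checked against `.env.example`.

```bash
cd docker
cp .env.example .env          # skip if .env already exists
# Illustrative only: adjust whichever variable you need, e.g. the exposed HTTP port.
sed -i 's/^EXPOSE_NGINX_PORT=.*/EXPOSE_NGINX_PORT=8080/' .env
docker compose up -d          # recreates the affected containers with the new settings
```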
If you'd like to configure a highly-available setup, there are community-contributed [Helm Charts](https://helm.sh/) which allow Dify to be deployed on Kubernetes.
If you'd like to configure a highly-available setup, there are community-contributed [Helm Charts](https://helm.sh/) and YAML files which allow Dify to be deployed on Kubernetes.
- [Helm Chart by @LeoQuote](https://github.com/douban/charts/tree/master/charts/dify)
- [Helm Chart by @BorisPolonsky](https://github.com/BorisPolonsky/dify-helm)
- [YAML file by @Winson-030](https://github.com/Winson-030/dify-kubernetes)
#### Using Terraform for Deployment
##### Azure Global
Deploy Dify to Azure with a single click using [terraform](https://www.terraform.io/).
- [Azure Terraform by @nikawang](https://github.com/nikawang/dify-azure-terraform)
## Contributing
@ -211,27 +218,9 @@ At the same time, please consider supporting Dify by sharing it on social media
* [Github Discussion](https://github.com/langgenius/dify/discussions). Best for: sharing feedback and asking questions.
* [GitHub Issues](https://github.com/langgenius/dify/issues). Best for: bugs you encounter using Dify.AI, and feature proposals. See our [Contribution Guide](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
* [Email](mailto:support@dify.ai?subject=[GitHub]Questions%20About%20Dify). Best for: questions you have about using Dify.AI.
* [Discord](https://discord.gg/FngNHpbcY7). Best for: sharing your applications and hanging out with the community.
* [Twitter](https://twitter.com/dify_ai). Best for: sharing your applications and hanging out with the community.
Or, schedule a meeting directly with a team member:
<table>
<tr>
<th>Point of Contact</th>
<th>Purpose</th>
</tr>
<tr>
<td><a href='https://cal.com/guchenhe/15min' target='_blank'><img class="schedule-button" src='https://github.com/langgenius/dify/assets/13230914/9ebcd111-1205-4d71-83d5-948d70b809f5' alt='Git-Hub-README-Button-3x' style="width: 180px; height: auto; object-fit: contain;"/></a></td>
<td>Business enquiries & product feedback</td>
</tr>
<tr>
<td><a href='https://cal.com/pinkbanana' target='_blank'><img class="schedule-button" src='https://github.com/langgenius/dify/assets/13230914/d1edd00a-d7e4-4513-be6c-e57038e143fd' alt='Git-Hub-README-Button-2x' style="width: 180px; height: auto; object-fit: contain;"/></a></td>
<td>Contributions, issues & feature requests</td>
</tr>
</table>
## Star history
[![Star History Chart](https://api.star-history.com/svg?repos=langgenius/dify&type=Date)](https://star-history.com/#langgenius/dify&Date)

README_AR.md Normal file

@ -0,0 +1,218 @@
![cover-v5-optimized](https://github.com/langgenius/dify/assets/13230914/f9e19af5-61ba-4119-b926-d10c4c06ebab)
<p align="center">
<a href="https://cloud.dify.ai">Dify Cloud</a> ·
<a href="https://docs.dify.ai/getting-started/install-self-hosted">الاستضافة الذاتية</a> ·
<a href="https://docs.dify.ai">التوثيق</a> ·
<a href="https://udify.app/chat/22L1zSxg6yW1cWQg">استفسار الشركات (للإنجليزية فقط)</a>
</p>
<p align="center">
<a href="https://dify.ai" target="_blank">
<img alt="Static Badge" src="https://img.shields.io/badge/Product-F04438"></a>
<a href="https://dify.ai/pricing" target="_blank">
<img alt="Static Badge" src="https://img.shields.io/badge/free-pricing?logo=free&color=%20%23155EEF&label=pricing&labelColor=%20%23528bff"></a>
<a href="https://discord.gg/FngNHpbcY7" target="_blank">
<img src="https://img.shields.io/discord/1082486657678311454?logo=discord&labelColor=%20%235462eb&logoColor=%20%23f5f5f5&color=%20%235462eb"
alt="chat on Discord"></a>
<a href="https://twitter.com/intent/follow?screen_name=dify_ai" target="_blank">
<img src="https://img.shields.io/twitter/follow/dify_ai?logo=X&color=%20%23f5f5f5"
alt="follow on Twitter"></a>
<a href="https://hub.docker.com/u/langgenius" target="_blank">
<img alt="Docker Pulls" src="https://img.shields.io/docker/pulls/langgenius/dify-web?labelColor=%20%23FDB062&color=%20%23f79009"></a>
<a href="https://github.com/langgenius/dify/graphs/commit-activity" target="_blank">
<img alt="Commits last month" src="https://img.shields.io/github/commit-activity/m/langgenius/dify?labelColor=%20%2332b583&color=%20%2312b76a"></a>
<a href="https://github.com/langgenius/dify/" target="_blank">
<img alt="Issues closed" src="https://img.shields.io/github/issues-search?query=repo%3Alanggenius%2Fdify%20is%3Aclosed&label=issues%20closed&labelColor=%20%237d89b0&color=%20%235d6b98"></a>
<a href="https://github.com/langgenius/dify/discussions/" target="_blank">
<img alt="Discussion posts" src="https://img.shields.io/github/discussions/langgenius/dify?labelColor=%20%239b8afb&color=%20%237a5af8"></a>
</p>
<p align="center">
<a href="./README.md"><img alt="README in English" src="https://img.shields.io/badge/English-d9d9d9"></a>
<a href="./README_CN.md"><img alt="简体中文版自述文件" src="https://img.shields.io/badge/简体中文-d9d9d9"></a>
<a href="./README_JA.md"><img alt="日本語のREADME" src="https://img.shields.io/badge/日本語-d9d9d9"></a>
<a href="./README_ES.md"><img alt="README en Español" src="https://img.shields.io/badge/Español-d9d9d9"></a>
<a href="./README_FR.md"><img alt="README en Français" src="https://img.shields.io/badge/Français-d9d9d9"></a>
<a href="./README_KL.md"><img alt="README tlhIngan Hol" src="https://img.shields.io/badge/Klingon-d9d9d9"></a>
<a href="./README_KR.md"><img alt="README in Korean" src="https://img.shields.io/badge/한국어-d9d9d9"></a>
<a href="./README_AR.md"><img alt="README بالعربية" src="https://img.shields.io/badge/العربية-d9d9d9"></a>
<a href="./README_TR.md"><img alt="Türkçe README" src="https://img.shields.io/badge/Türkçe-d9d9d9"></a>
<a href="./README_VI.md"><img alt="README Tiếng Việt" src="https://img.shields.io/badge/Ti%E1%BA%BFng%20Vi%E1%BB%87t-d9d9d9"></a>
</p>
<div style="text-align: right;">
مشروع Dify هو منصة تطوير تطبيقات الذكاء الصناعي مفتوحة المصدر. تجمع واجهته البديهية بين سير العمل الذكي بالذكاء الاصطناعي وخط أنابيب RAG وقدرات الوكيل وإدارة النماذج وميزات الملاحظة وأكثر من ذلك، مما يتيح لك الانتقال بسرعة من المرحلة التجريبية إلى الإنتاج. إليك قائمة بالميزات الأساسية:
</br> </br>
**1. سير العمل**: قم ببناء واختبار سير عمل الذكاء الاصطناعي القوي على قماش بصري، مستفيدًا من جميع الميزات التالية وأكثر.
https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa
**2. الدعم الشامل للنماذج**: تكامل سلس مع مئات من LLMs الخاصة / مفتوحة المصدر من عشرات من موفري التحليل والحلول المستضافة ذاتيًا، مما يغطي GPT و Mistral و Llama3 وأي نماذج متوافقة مع واجهة OpenAI API. يمكن العثور على قائمة كاملة بمزودي النموذج المدعومين [هنا](https://docs.dify.ai/getting-started/readme/model-providers).
![providers-v5](https://github.com/langgenius/dify/assets/13230914/5a17bdbe-097a-4100-8363-40255b70f6e3)
**3. بيئة التطوير للأوامر**: واجهة بيئة التطوير المبتكرة لصياغة الأمر ومقارنة أداء النموذج، وإضافة ميزات إضافية مثل تحويل النص إلى كلام إلى تطبيق قائم على الدردشة.
**4. خط أنابيب RAG**: قدرات RAG الواسعة التي تغطي كل شيء من استيعاب الوثائق إلى الاسترجاع، مع الدعم الفوري لاستخراج النص من ملفات PDF و PPT وتنسيقات الوثائق الشائعة الأخرى.
**5. قدرات الوكيل**: يمكنك تعريف الوكلاء بناءً على أمر وظيفة LLM أو ReAct، وإضافة أدوات مدمجة أو مخصصة للوكيل. توفر Dify أكثر من 50 أداة مدمجة لوكلاء الذكاء الاصطناعي، مثل البحث في Google و DALL·E وStable Diffusion و WolframAlpha.
**6. الـ LLMOps**: راقب وتحلل سجلات التطبيق والأداء على مر الزمن. يمكنك تحسين الأوامر والبيانات والنماذج باستمرار استنادًا إلى البيانات الإنتاجية والتعليقات.
**7.الواجهة الخلفية (Backend) كخدمة**: تأتي جميع عروض Dify مع APIs مطابقة، حتى يمكنك دمج Dify بسهولة في منطق أعمالك الخاص.
## مقارنة الميزات
<table style="width: 100%;">
<tr>
<th align="center">الميزة</th>
<th align="center">Dify.AI</th>
<th align="center">LangChain</th>
<th align="center">Flowise</th>
<th align="center">OpenAI Assistants API</th>
</tr>
<tr>
<td align="center">نهج البرمجة</td>
<td align="center">موجّه لـ تطبيق + واجهة برمجة تطبيق (API)</td>
<td align="center">برمجة Python</td>
<td align="center">موجه لتطبيق</td>
<td align="center">واجهة برمجة تطبيق (API)</td>
</tr>
<tr>
<td align="center">LLMs المدعومة</td>
<td align="center">تنوع غني</td>
<td align="center">تنوع غني</td>
<td align="center">تنوع غني</td>
<td align="center">فقط OpenAI</td>
</tr>
<tr>
<td align="center">محرك RAG</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">الوكيل</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">سير العمل</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">الملاحظة</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">ميزات الشركات (SSO / مراقبة الوصول)</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">نشر محلي</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
</table>
## استخدام Dify
- **سحابة </br>**
نحن نستضيف [خدمة Dify Cloud](https://dify.ai) لأي شخص لتجربتها بدون أي إعدادات. توفر كل قدرات النسخة التي تمت استضافتها ذاتيًا، وتتضمن 200 أمر GPT-4 مجانًا في خطة الصندوق الرملي.
- **استضافة ذاتية لنسخة المجتمع Dify</br>**
ابدأ سريعًا في تشغيل Dify في بيئتك باستخدام [دليل البدء السريع](#البدء السريع).
استخدم [توثيقنا](https://docs.dify.ai) للمزيد من المراجع والتعليمات الأعمق.
- **مشروع Dify للشركات / المؤسسات</br>**
نحن نوفر ميزات إضافية مركزة على الشركات. [جدول اجتماع معنا](https://cal.com/guchenhe/30min) أو [أرسل لنا بريدًا إلكترونيًا](mailto:business@dify.ai?subject=[GitHub]Business%20License%20Inquiry) لمناقشة احتياجات الشركات. </br>
> بالنسبة للشركات الناشئة والشركات الصغيرة التي تستخدم خدمات AWS، تحقق من [Dify Premium على AWS Marketplace](https://aws.amazon.com/marketplace/pp/prodview-t22mebxzwjhu6) ونشرها في شبكتك الخاصة على AWS VPC بنقرة واحدة. إنها عرض AMI بأسعار معقولة مع خيار إنشاء تطبيقات بشعار وعلامة تجارية مخصصة.
## البقاء قدمًا
قم بإضافة نجمة إلى Dify على GitHub وتلق تنبيهًا فوريًا بالإصدارات الجديدة.
![نجمنا](https://github.com/langgenius/dify/assets/13230914/b823edc1-6388-4e25-ad45-2f6b187adbb4)
## البداية السريعة
> قبل تثبيت Dify، تأكد من أن جهازك يلبي الحد الأدنى من متطلبات النظام التالية:
>
>- معالج >= 2 نواة
>- ذاكرة وصول عشوائي (RAM) >= 4 جيجابايت
</br>
أسهل طريقة لبدء تشغيل خادم Dify هي تشغيل ملف [docker-compose.yml](docker/docker-compose.yaml) الخاص بنا. قبل تشغيل أمر التثبيت، تأكد من تثبيت [Docker](https://docs.docker.com/get-docker/) و [Docker Compose](https://docs.docker.com/compose/install/) على جهازك:
```bash
cd docker
cp .env.example .env
docker compose up -d
```
بعد التشغيل، يمكنك الوصول إلى لوحة تحكم Dify في متصفحك على [http://localhost/install](http://localhost/install) وبدء عملية التهيئة.
> إذا كنت ترغب في المساهمة في Dify أو القيام بتطوير إضافي، فانظر إلى [دليلنا للنشر من الشفرة (code) المصدرية](https://docs.dify.ai/getting-started/install-self-hosted/local-source-code)
## الخطوات التالية
إذا كنت بحاجة إلى تخصيص الإعدادات، فيرجى الرجوع إلى التعليقات في ملف [.env.example](docker/.env.example) وتحديث القيم المقابلة في ملف `.env`. بالإضافة إلى ذلك، قد تحتاج إلى إجراء تعديلات على ملف `docker-compose.yaml` نفسه، مثل تغيير إصدارات الصور أو تعيينات المنافذ أو نقاط تحميل وحدات التخزين، بناءً على بيئة النشر ومتطلباتك الخاصة. بعد إجراء أي تغييرات، يرجى إعادة تشغيل `docker-compose up -d`. يمكنك العثور على قائمة كاملة بمتغيرات البيئة المتاحة [هنا](https://docs.dify.ai/getting-started/install-self-hosted/environments).
إذا كنت ترغب في إعداد نشر عالي التوافر، فهناك [Helm Charts](https://helm.sh/) وملفات YAML مساهم بها من المجتمع تتيح نشر Dify على Kubernetes.
- [رسم بياني Helm من قبل @LeoQuote](https://github.com/douban/charts/tree/master/charts/dify)
- [رسم بياني Helm من قبل @BorisPolonsky](https://github.com/BorisPolonsky/dify-helm)
- [ملف YAML من قبل @Winson-030](https://github.com/Winson-030/dify-kubernetes)
#### استخدام Terraform للتوزيع
##### Azure Global
استخدم [terraform](https://www.terraform.io/) لنشر Dify على Azure بنقرة واحدة.
- [Azure Terraform بواسطة @nikawang](https://github.com/nikawang/dify-azure-terraform)
## المساهمة
لأولئك الذين يرغبون في المساهمة، انظر إلى [دليل المساهمة](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md) لدينا.
في الوقت نفسه، يرجى النظر في دعم Dify عن طريق مشاركته على وسائل التواصل الاجتماعي وفي الفعاليات والمؤتمرات.
> نحن نبحث عن مساهمين لمساعدة في ترجمة Dify إلى لغات أخرى غير اللغة الصينية المندرين أو الإنجليزية. إذا كنت مهتمًا بالمساعدة، يرجى الاطلاع على [README للترجمة](https://github.com/langgenius/dify/blob/main/web/i18n/README.md) لمزيد من المعلومات، واترك لنا تعليقًا في قناة `global-users` على [خادم المجتمع على Discord](https://discord.gg/8Tpq4AcN9c).
**المساهمون**
<a href="https://github.com/langgenius/dify/graphs/contributors">
<img src="https://contrib.rocks/image?repo=langgenius/dify" />
</a>
## المجتمع والاتصال
* [مناقشة Github](https://github.com/langgenius/dify/discussions). الأفضل لـ: مشاركة التعليقات وطرح الأسئلة.
* [المشكلات على GitHub](https://github.com/langgenius/dify/issues). الأفضل لـ: الأخطاء التي تواجهها في استخدام Dify.AI، واقتراحات الميزات. انظر [دليل المساهمة](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
* [Discord](https://discord.gg/FngNHpbcY7). الأفضل لـ: مشاركة تطبيقاتك والترفيه مع المجتمع.
* [تويتر](https://twitter.com/dify_ai). الأفضل لـ: مشاركة تطبيقاتك والترفيه مع المجتمع.
## تاريخ النجمة
[![Star History Chart](https://api.star-history.com/svg?repos=langgenius/dify&type=Date)](https://star-history.com/#langgenius/dify&Date)
## الكشف عن الأمان
لحماية خصوصيتك، يرجى تجنب نشر مشكلات الأمان على GitHub. بدلاً من ذلك، أرسل أسئلتك إلى security@dify.ai وسنقدم لك إجابة أكثر تفصيلاً.
## الرخصة
هذا المستودع متاح تحت [رخصة البرنامج الحر Dify](LICENSE)، والتي تعتبر بشكل أساسي Apache 2.0 مع بعض القيود الإضافية.


@ -4,7 +4,7 @@
<a href="https://cloud.dify.ai">Dify 云服务</a> ·
<a href="https://docs.dify.ai/getting-started/install-self-hosted">自托管</a> ·
<a href="https://docs.dify.ai">文档</a> ·
<a href="https://cal.com/guchenhe/dify-demo">预约演示</a>
<a href="https://udify.app/chat/22L1zSxg6yW1cWQg">(需用英文)常见问题解答 / 联系团队</a>
</div>
<p align="center">
@ -29,12 +29,16 @@
</p>
<div align="center">
<a href="./README.md"><img alt="上个月的提交次数" src="https://img.shields.io/badge/英文-d9d9d9"></a>
<a href="./README_CN.md"><img alt="上个月的提交次数" src="https://img.shields.io/badge/简体中文-d9d9d9"></a>
<a href="./README_JA.md"><img alt="上个月的提交次数" src="https://img.shields.io/badge/日本語-d9d9d9"></a>
<a href="./README_ES.md"><img alt="上个月的提交次数" src="https://img.shields.io/badge/西班牙语-d9d9d9"></a>
<a href="./README_KL.md"><img alt="上个月的提交次数" src="https://img.shields.io/badge/法语-d9d9d9"></a>
<a href="./README_FR.md"><img alt="上个月的提交次数" src="https://img.shields.io/badge/克林贡语-d9d9d9"></a>
<a href="./README.md"><img alt="README in English" src="https://img.shields.io/badge/English-d9d9d9"></a>
<a href="./README_CN.md"><img alt="简体中文版自述文件" src="https://img.shields.io/badge/简体中文-d9d9d9"></a>
<a href="./README_JA.md"><img alt="日本語のREADME" src="https://img.shields.io/badge/日本語-d9d9d9"></a>
<a href="./README_ES.md"><img alt="README en Español" src="https://img.shields.io/badge/Español-d9d9d9"></a>
<a href="./README_FR.md"><img alt="README en Français" src="https://img.shields.io/badge/Français-d9d9d9"></a>
<a href="./README_KL.md"><img alt="README tlhIngan Hol" src="https://img.shields.io/badge/Klingon-d9d9d9"></a>
<a href="./README_KR.md"><img alt="README in Korean" src="https://img.shields.io/badge/한국어-d9d9d9"></a>
<a href="./README_AR.md"><img alt="README بالعربية" src="https://img.shields.io/badge/العربية-d9d9d9"></a>
<a href="./README_TR.md"><img alt="Türkçe README" src="https://img.shields.io/badge/Türkçe-d9d9d9"></a>
<a href="./README_VI.md"><img alt="README Tiếng Việt" src="https://img.shields.io/badge/Ti%E1%BA%BFng%20Vi%E1%BB%87t-d9d9d9"></a>
</div>
@ -68,7 +72,7 @@ Dify 是一个开源的 LLM 应用开发平台。其直观的界面结合了 AI
广泛的 RAG 功能,涵盖从文档摄入到检索的所有内容,支持从 PDF、PPT 和其他常见文档格式中提取文本的开箱即用的支持。
**5. Agent 智能体**:
您可以基于 LLM 函数调用或 ReAct 定义 Agent,并为 Agent 添加预构建或自定义工具。Dify 为 AI Agent 提供了50多种内置工具,如谷歌搜索、DELL·E、Stable Diffusion 和 WolframAlpha 等。
您可以基于 LLM 函数调用或 ReAct 定义 Agent,并为 Agent 添加预构建或自定义工具。Dify 为 AI Agent 提供了50多种内置工具,如谷歌搜索、DALL·E、Stable Diffusion 和 WolframAlpha 等。
**6. LLMOps**:
随时间监视和分析应用程序日志和性能。您可以根据生产数据和标注持续改进提示、数据集和模型。
@ -111,7 +115,7 @@ Dify 是一个开源的 LLM 应用开发平台。其直观的界面结合了 AI
<td align="center">Agent</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
@ -154,7 +158,7 @@ Dify 是一个开源的 LLM 应用开发平台。其直观的界面结合了 AI
使用我们的[文档](https://docs.dify.ai)进行进一步的参考和更深入的说明。
- **面向企业/组织的 Dify</br>**
我们提供额外的面向企业的功能。[与我们安排会议](https://cal.com/guchenhe/30min)或[给我们发送电子邮件](mailto:business@dify.ai?subject=[GitHub]Business%20License%20Inquiry)讨论企业需求。 </br>
我们提供额外的面向企业的功能。[给我们发送电子邮件](mailto:business@dify.ai?subject=[GitHub]Business%20License%20Inquiry)讨论企业需求。 </br>
> 对于使用 AWS 的初创公司和中小型企业,请查看 [AWS Marketplace 上的 Dify 高级版](https://aws.amazon.com/marketplace/pp/prodview-t22mebxzwjhu6),并使用一键部署到您自己的 AWS VPC。它是一个价格实惠的 AMI 产品,提供了使用自定义徽标和品牌创建应用程序的选项。
## 保持领先
@ -178,21 +182,29 @@ Dify 是一个开源的 LLM 应用开发平台。其直观的界面结合了 AI
```bash
cd docker
cp .env.example .env
docker compose up -d
```
运行后,可以在浏览器上访问 [http://localhost/install](http://localhost/install) 进入 Dify 控制台并开始初始化安装操作。
### 自定义配置
如果您需要自定义配置,请参考 [.env.example](docker/.env.example) 文件中的注释,并更新 `.env` 文件中对应的值。此外,您可能需要根据您的具体部署环境和需求对 `docker-compose.yaml` 文件本身进行调整,例如更改镜像版本、端口映射或卷挂载。完成任何更改后,请重新运行 `docker-compose up -d`。您可以在[此处](https://docs.dify.ai/getting-started/install-self-hosted/environments)找到可用环境变量的完整列表。
#### 使用 Helm Chart 部署
使用 [Helm Chart](https://helm.sh/) 版本,可以在 Kubernetes 上部署 Dify。
使用 [Helm Chart](https://helm.sh/) 版本或者 YAML 文件,可以在 Kubernetes 上部署 Dify。
- [Helm Chart by @LeoQuote](https://github.com/douban/charts/tree/master/charts/dify)
- [Helm Chart by @BorisPolonsky](https://github.com/BorisPolonsky/dify-helm)
- [YAML 文件 by @Winson-030](https://github.com/Winson-030/dify-kubernetes)
### 配置
#### 使用 Terraform 部署
如果您需要自定义配置,请参考我们的 [docker-compose.yml](docker/docker-compose.yaml) 文件中的注释,并手动设置环境配置。更改后,请再次运行 `docker-compose up -d`。您可以在我们的[文档](https://docs.dify.ai/getting-started/install-self-hosted/environments)中查看所有环境变量的完整列表。
##### Azure Global
使用 [terraform](https://www.terraform.io/) 一键部署 Dify 到 Azure。
- [Azure Terraform by @nikawang](https://github.com/nikawang/dify-azure-terraform)
## Star History


@ -4,7 +4,7 @@
<a href="https://cloud.dify.ai">Dify Cloud</a> ·
<a href="https://docs.dify.ai/getting-started/install-self-hosted">Auto-alojamiento</a> ·
<a href="https://docs.dify.ai">Documentación</a> ·
<a href="https://cal.com/guchenhe/dify-demo">Programar demostración</a>
<a href="https://udify.app/chat/22L1zSxg6yW1cWQg">Consultas empresariales (en inglés)</a>
</p>
<p align="center">
@ -29,12 +29,16 @@
</p>
<p align="center">
<a href="./README.md"><img alt="Actividad de Commits el último mes" src="https://img.shields.io/badge/Inglés-d9d9d9"></a>
<a href="./README_CN.md"><img alt="Actividad de Commits el último mes" src="https://img.shields.io/badge/简体中文-d9d9d9"></a>
<a href="./README_JA.md"><img alt="Actividad de Commits el último mes" src="https://img.shields.io/badge/日本語-d9d9d9"></a>
<a href="./README_ES.md"><img alt="Actividad de Commits el último mes" src="https://img.shields.io/badge/Español-d9d9d9"></a>
<a href="./README_KL.md"><img alt="Actividad de Commits el último mes" src="https://img.shields.io/badge/Français-d9d9d9"></a>
<a href="./README_FR.md"><img alt="Actividad de Commits el último mes" src="https://img.shields.io/badge/Klingon-d9d9d9"></a>
<a href="./README.md"><img alt="README in English" src="https://img.shields.io/badge/English-d9d9d9"></a>
<a href="./README_CN.md"><img alt="简体中文版自述文件" src="https://img.shields.io/badge/简体中文-d9d9d9"></a>
<a href="./README_JA.md"><img alt="日本語のREADME" src="https://img.shields.io/badge/日本語-d9d9d9"></a>
<a href="./README_ES.md"><img alt="README en Español" src="https://img.shields.io/badge/Español-d9d9d9"></a>
<a href="./README_FR.md"><img alt="README en Français" src="https://img.shields.io/badge/Français-d9d9d9"></a>
<a href="./README_KL.md"><img alt="README tlhIngan Hol" src="https://img.shields.io/badge/Klingon-d9d9d9"></a>
<a href="./README_KR.md"><img alt="README in Korean" src="https://img.shields.io/badge/한국어-d9d9d9"></a>
<a href="./README_AR.md"><img alt="README بالعربية" src="https://img.shields.io/badge/العربية-d9d9d9"></a>
<a href="./README_TR.md"><img alt="Türkçe README" src="https://img.shields.io/badge/Türkçe-d9d9d9"></a>
<a href="./README_VI.md"><img alt="README Tiếng Việt" src="https://img.shields.io/badge/Ti%E1%BA%BFng%20Vi%E1%BB%87t-d9d9d9"></a>
</p>
#
@ -54,7 +58,7 @@ Dify es una plataforma de desarrollo de aplicaciones de LLM de código abierto.
**2. Soporte de modelos completo**:
Integración perfecta con cientos de LLMs propietarios / de código abierto de docenas de proveedores de inferencia y soluciones auto-alojadas, que cubren GPT, Mistral, Llama2 y cualquier modelo compatible con la API de OpenAI. Se puede encontrar una lista completa de proveedores de modelos admitidos [aquí](https://docs.dify.ai/getting-started/readme/model-providers).
Integración perfecta con cientos de LLMs propietarios / de código abierto de docenas de proveedores de inferencia y soluciones auto-alojadas, que cubren GPT, Mistral, Llama3 y cualquier modelo compatible con la API de OpenAI. Se puede encontrar una lista completa de proveedores de modelos admitidos [aquí](https://docs.dify.ai/getting-started/readme/model-providers).
![proveedores-v5](https://github.com/langgenius/dify/assets/13230914/5a17bdbe-097a-4100-8363-40255b70f6e3)
@ -68,7 +72,7 @@ Dify es una plataforma de desarrollo de aplicaciones de LLM de código abierto.
**5. Capacidades de agente**:
Puedes definir agent
es basados en LLM Function Calling o ReAct, y agregar herramientas preconstruidas o personalizadas para el agente. Dify proporciona más de 50 herramientas integradas para agentes de IA, como Búsqueda de Google, DELL·E, Difusión Estable y WolframAlpha.
es basados en LLM Function Calling o ReAct, y agregar herramientas preconstruidas o personalizadas para el agente. Dify proporciona más de 50 herramientas integradas para agentes de IA, como Búsqueda de Google, DALL·E, Difusión Estable y WolframAlpha.
**6. LLMOps**:
Supervisa y analiza registros de aplicaciones y rendimiento a lo largo del tiempo. Podrías mejorar continuamente prompts, conjuntos de datos y modelos basados en datos de producción y anotaciones.
@ -111,7 +115,7 @@ es basados en LLM Function Calling o ReAct, y agregar herramientas preconstruida
<td align="center">Agente</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
@ -154,7 +158,7 @@ Pon rápidamente Dify en funcionamiento en tu entorno con esta [guía de inicio
Usa nuestra [documentación](https://docs.dify.ai) para más referencias e instrucciones más detalladas.
- **Dify para Empresas / Organizaciones</br>**
Proporcionamos características adicionales centradas en la empresa. [Programa una reunión con nosotros](https://cal.com/guchenhe/30min) o [envíanos un correo electrónico](mailto:business@dify.ai?subject=[GitHub]Business%20License%20Inquiry) para discutir las necesidades empresariales. </br>
Proporcionamos características adicionales centradas en la empresa. [Envíanos un correo electrónico](mailto:business@dify.ai?subject=[GitHub]Business%20License%20Inquiry) para discutir las necesidades empresariales. </br>
> Para startups y pequeñas empresas que utilizan AWS, echa un vistazo a [Dify Premium en AWS Marketplace](https://aws.amazon.com/marketplace/pp/prodview-t22mebxzwjhu6) e impleméntalo en tu propio VPC de AWS con un clic. Es una AMI asequible que ofrece la opción de crear aplicaciones con logotipo y marca personalizados.
@ -178,6 +182,7 @@ La forma más fácil de iniciar el servidor de Dify es ejecutar nuestro archivo
```bash
cd docker
cp .env.example .env
docker compose up -d
```
@ -187,14 +192,21 @@ Después de ejecutarlo, puedes acceder al panel de control de Dify en tu navegad
## Próximos pasos
Si necesitas personalizar la configuración, consulta los comentarios en nuestro archivo [docker-compose.yml](docker/docker-compose.yaml) y configura manualmente la configuración del entorno. Después de realizar los cambios, ejecuta `docker-compose up -d` nuevamente. Puedes ver la lista completa de variables de entorno [aquí](https://docs.dify.ai/getting-started/install-self-hosted/environments).
Si necesita personalizar la configuración, consulte los comentarios en nuestro archivo [.env.example](docker/.env.example) y actualice los valores correspondientes en su archivo `.env`. Además, es posible que deba realizar ajustes en el propio archivo `docker-compose.yaml`, como cambiar las versiones de las imágenes, las asignaciones de puertos o los montajes de volúmenes, según su entorno de implementación y requisitos específicos. Después de realizar cualquier cambio, vuelva a ejecutar `docker-compose up -d`. Puede encontrar la lista completa de variables de entorno disponibles [aquí](https://docs.dify.ai/getting-started/install-self-hosted/environments).
Si deseas configurar una instalación altamente disponible, hay [Gráficos Helm](https://helm.sh/) contribuidos por la comunidad que permiten implementar Dify en Kubernetes.
Si desea configurar una configuración de alta disponibilidad, la comunidad proporciona [Gráficos Helm](https://helm.sh/) y archivos YAML, a través de los cuales puede desplegar Dify en Kubernetes.
- [Gráfico Helm por @LeoQuote](https://github.com/douban/charts/tree/master/charts/dify)
- [Gráfico Helm por @BorisPolonsky](https://github.com/BorisPolonsky/dify-helm)
- [Ficheros YAML por @Winson-030](https://github.com/Winson-030/dify-kubernetes)
#### Uso de Terraform para el despliegue
##### Azure Global
Utiliza [terraform](https://www.terraform.io/) para desplegar Dify en Azure con un solo clic.
- [Azure Terraform por @nikawang](https://github.com/nikawang/dify-azure-terraform)
## Contribuir
@ -215,27 +227,9 @@ Al mismo tiempo, considera apoyar a Dify compartiéndolo en redes sociales y en
* [Discusión en GitHub](https://github.com/langgenius/dify/discussions). Lo mejor para: compartir comentarios y hacer preguntas.
* [Reporte de problemas en GitHub](https://github.com/langgenius/dify/issues). Lo mejor para: errores que encuentres usando Dify.AI y propuestas de características. Consulta nuestra [Guía de contribución](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
* [Correo electrónico](mailto:support@dify.ai?subject=[GitHub]Questions%20About%20Dify). Lo mejor para: preguntas que tengas sobre el uso de Dify.AI.
* [Discord](https://discord.gg/FngNHpbcY7). Lo mejor para: compartir tus aplicaciones y pasar el rato con la comunidad.
* [Twitter](https://twitter.com/dify_ai). Lo mejor para: compartir tus aplicaciones y pasar el rato con la comunidad.
O, programa una reunión directamente con un miembro del equipo:
<table>
<tr>
<th>Punto de Contacto</th>
<th>Propósito</th>
</tr>
<tr>
<td><a href='https://cal.com/guchenhe/15min' target='_blank'><img class="schedule-button" src='https://github.com/langgenius/dify/assets/13230914/9ebcd111-1205-4d71-83d5-948d70b809f5' alt='Git-Hub-README-Button-3x' style="width: 180px; height: auto; object-fit: contain;"/></a></td>
<td>Consultas comerciales y retroalimentación del producto</td>
</tr>
<tr>
<td><a href='https://cal.com/pinkbanana' target='_blank'><img class="schedule-button" src='https://github.com/langgenius/dify/assets/13230914/d1edd00a-d7e4-4513-be6c-e57038e143fd' alt='Git-Hub-README-Button-2x' style="width: 180px; height: auto; object-fit: contain;"/></a></td>
<td>Contribuciones, problemas y solicitudes de características</td>
</tr>
</table>
## Historial de Estrellas
[![Gráfico de Historial de Estrellas](https://api.star-history.com/svg?repos=langgenius/dify&type=Date)](https://star-history.com/#langgenius/dify&Date)
@ -247,4 +241,4 @@ Para proteger tu privacidad, evita publicar problemas de seguridad en GitHub. En
## Licencia
Este repositorio está disponible bajo la [Licencia de Código Abierto de Dify](LICENSE), que es esencialmente Apache 2.0 con algunas restricciones adicionales.
Este repositorio está disponible bajo la [Licencia de Código Abierto de Dify](LICENSE), que es esencialmente Apache 2.0 con algunas restricciones adicionales.


@ -4,7 +4,7 @@
<a href="https://cloud.dify.ai">Dify Cloud</a> ·
<a href="https://docs.dify.ai/getting-started/install-self-hosted">Auto-hébergement</a> ·
<a href="https://docs.dify.ai">Documentation</a> ·
<a href="https://cal.com/guchenhe/dify-demo">Planifier une démo</a>
<a href="https://udify.app/chat/22L1zSxg6yW1cWQg">Demande dentreprise (en anglais seulement)</a>
</p>
<p align="center">
@ -29,12 +29,16 @@
</p>
<p align="center">
<a href="./README.md"><img alt="Commits le mois dernier" src="https://img.shields.io/badge/Anglais-d9d9d9"></a>
<a href="./README_CN.md"><img alt="Commits le mois dernier" src="https://img.shields.io/badge/简体中文-d9d9d9"></a>
<a href="./README_JA.md"><img alt="Commits le mois dernier" src="https://img.shields.io/badge/日本語-d9d9d9"></a>
<a href="./README_ES.md"><img alt="Commits le mois dernier" src="https://img.shields.io/badge/Español-d9d9d9"></a>
<a href="./README_KL.md"><img alt="Commits le mois dernier" src="https://img.shields.io/badge/Français-d9d9d9"></a>
<a href="./README_FR.md"><img alt="Commits le mois dernier" src="https://img.shields.io/badge/Klingon-d9d9d9"></a>
<a href="./README.md"><img alt="README in English" src="https://img.shields.io/badge/English-d9d9d9"></a>
<a href="./README_CN.md"><img alt="简体中文版自述文件" src="https://img.shields.io/badge/简体中文-d9d9d9"></a>
<a href="./README_JA.md"><img alt="日本語のREADME" src="https://img.shields.io/badge/日本語-d9d9d9"></a>
<a href="./README_ES.md"><img alt="README en Español" src="https://img.shields.io/badge/Español-d9d9d9"></a>
<a href="./README_FR.md"><img alt="README en Français" src="https://img.shields.io/badge/Français-d9d9d9"></a>
<a href="./README_KL.md"><img alt="README tlhIngan Hol" src="https://img.shields.io/badge/Klingon-d9d9d9"></a>
<a href="./README_KR.md"><img alt="README in Korean" src="https://img.shields.io/badge/한국어-d9d9d9"></a>
<a href="./README_AR.md"><img alt="README بالعربية" src="https://img.shields.io/badge/العربية-d9d9d9"></a>
<a href="./README_TR.md"><img alt="Türkçe README" src="https://img.shields.io/badge/Türkçe-d9d9d9"></a>
<a href="./README_VI.md"><img alt="README Tiếng Việt" src="https://img.shields.io/badge/Ti%E1%BA%BFng%20Vi%E1%BB%87t-d9d9d9"></a>
</p>
#
@ -54,7 +58,7 @@ Dify est une plateforme de développement d'applications LLM open source. Son in
**2. Prise en charge complète des modèles**:
Intégration transparente avec des centaines de LLM propriétaires / open source provenant de dizaines de fournisseurs d'inférence et de solutions auto-hébergées, couvrant GPT, Mistral, Llama2, et tous les modèles compatibles avec l'API OpenAI. Une liste complète des fournisseurs de modèles pris en charge se trouve [ici](https://docs.dify.ai/getting-started/readme/model-providers).
Intégration transparente avec des centaines de LLM propriétaires / open source provenant de dizaines de fournisseurs d'inférence et de solutions auto-hébergées, couvrant GPT, Mistral, Llama3, et tous les modèles compatibles avec l'API OpenAI. Une liste complète des fournisseurs de modèles pris en charge se trouve [ici](https://docs.dify.ai/getting-started/readme/model-providers).
![providers-v5](https://github.com/langgenius/dify/assets/13230914/5a17bdbe-097a-4100-8363-40255b70f6e3)
@ -68,7 +72,7 @@ Dify est une plateforme de développement d'applications LLM open source. Son in
**5. Capac
ités d'agent**:
Vous pouvez définir des agents basés sur l'appel de fonction LLM ou ReAct, et ajouter des outils pré-construits ou personnalisés pour l'agent. Dify fournit plus de 50 outils intégrés pour les agents d'IA, tels que la recherche Google, DELL·E, Stable Diffusion et WolframAlpha.
Vous pouvez définir des agents basés sur l'appel de fonction LLM ou ReAct, et ajouter des outils pré-construits ou personnalisés pour l'agent. Dify fournit plus de 50 outils intégrés pour les agents d'IA, tels que la recherche Google, DALL·E, Stable Diffusion et WolframAlpha.
**6. LLMOps**:
Surveillez et analysez les journaux d'application et les performances au fil du temps. Vous pouvez continuellement améliorer les prompts, les ensembles de données et les modèles en fonction des données de production et des annotations.
@ -111,7 +115,7 @@ ités d'agent**:
<td align="center">Agent</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
@ -154,7 +158,7 @@ Lancez rapidement Dify dans votre environnement avec ce [guide de démarrage](#q
Utilisez notre [documentation](https://docs.dify.ai) pour plus de références et des instructions plus détaillées.
- **Dify pour les entreprises / organisations</br>**
Nous proposons des fonctionnalités supplémentaires adaptées aux entreprises. [Planifiez une réunion avec nous](https://cal.com/guchenhe/30min) ou [envoyez-nous un e-mail](mailto:business@dify.ai?subject=[GitHub]Business%20License%20Inquiry) pour discuter des besoins de l'entreprise. </br>
Nous proposons des fonctionnalités supplémentaires adaptées aux entreprises. [Envoyez-nous un e-mail](mailto:business@dify.ai?subject=[GitHub]Business%20License%20Inquiry) pour discuter des besoins de l'entreprise. </br>
> Pour les startups et les petites entreprises utilisant AWS, consultez [Dify Premium sur AWS Marketplace](https://aws.amazon.com/marketplace/pp/prodview-t22mebxzwjhu6) et déployez-le dans votre propre VPC AWS en un clic. C'est une offre AMI abordable avec la possibilité de créer des applications avec un logo et une marque personnalisés.
@ -178,6 +182,7 @@ La manière la plus simple de démarrer le serveur Dify est d'exécuter notre fi
```bash
cd docker
cp .env.example .env
docker compose up -d
```
@ -187,14 +192,19 @@ Après l'exécution, vous pouvez accéder au tableau de bord Dify dans votre nav
## Prochaines étapes
Si vous devez personnaliser la configuration, veuillez vous référer aux commentaires dans notre fichier [docker-compose.yml](docker/docker-compose.yaml) et définir manuellement la configuration de l'environnement. Après avoir apporté les modifications, veuillez exécuter à nouveau `docker-compose up -d`. Vous pouvez voir la liste complète des variables d'environnement [ici](https://docs.dify.ai/getting-started/install-self-hosted/environments).
Si vous devez personnaliser la configuration, veuillez vous référer aux commentaires dans notre fichier [.env.example](docker/.env.example) et mettre à jour les valeurs correspondantes dans votre fichier `.env`. De plus, vous devrez peut-être apporter des modifications au fichier `docker-compose.yaml` lui-même, comme changer les versions d'image, les mappages de ports ou les montages de volumes, en fonction de votre environnement de déploiement et de vos exigences spécifiques. Après avoir effectué des modifications, veuillez réexécuter `docker-compose up -d`. Vous pouvez trouver la liste complète des variables d'environnement disponibles [ici](https://docs.dify.ai/getting-started/install-self-hosted/environments).
Si vous souhaitez configurer une installation hautement disponible, il existe des [Helm Charts](https://helm.sh/) contribués par la communauté qui permettent de déployer Dify sur Kubernetes.
Si vous souhaitez configurer une configuration haute disponibilité, la communauté fournit des [Helm Charts](https://helm.sh/) et des fichiers YAML, à travers lesquels vous pouvez déployer Dify sur Kubernetes.
- [Helm Chart par @LeoQuote](https://github.com/douban/charts/tree/master/charts/dify)
- [Helm Chart par @BorisPolonsky](https://github.com/BorisPolonsky/dify-helm)
- [Fichier YAML par @Winson-030](https://github.com/Winson-030/dify-kubernetes)
#### Utilisation de Terraform pour le déploiement
##### Azure Global
Utilisez [terraform](https://www.terraform.io/) pour déployer Dify sur Azure en un clic.
- [Azure Terraform par @nikawang](https://github.com/nikawang/dify-azure-terraform)
## Contribuer
@ -215,27 +225,9 @@ Dans le même temps, veuillez envisager de soutenir Dify en le partageant sur le
* [Discussion GitHub](https://github.com/langgenius/dify/discussions). Meilleur pour: partager des commentaires et poser des questions.
* [Problèmes GitHub](https://github.com/langgenius/dify/issues). Meilleur pour: les bogues que vous rencontrez en utilisant Dify.AI et les propositions de fonctionnalités. Consultez notre [Guide de contribution](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
* [E-mail](mailto:support@dify.ai?subject=[GitHub]Questions%20About%20Dify). Meilleur pour: les questions que vous avez sur l'utilisation de Dify.AI.
* [Discord](https://discord.gg/FngNHpbcY7). Meilleur pour: partager vos applications et passer du temps avec la communauté.
* [Twitter](https://twitter.com/dify_ai). Meilleur pour: partager vos applications et passer du temps avec la communauté.
Ou, planifiez directement une réunion avec un membre de l'équipe:
<table>
<tr>
<th>Point de contact</th>
<th>Objectif</th>
</tr>
<tr>
<td><a href='https://cal.com/guchenhe/15min' target='_blank'><img class="schedule-button" src='https://github.com/langgenius/dify/assets/13230914/9ebcd111-1205-4d71-83d5-948d70b809f5' alt='Git-Hub-README-Button-3x' style="width: 180px; height: auto; object-fit: contain;"/></a></td>
<td>Demandes commerciales & retours produit</td>
</tr>
<tr>
<td><a href='https://cal.com/pinkbanana' target='_blank'><img class="schedule-button" src='https://github.com/langgenius/dify/assets/13230914/d1edd00a-d7e4-4513-be6c-e57038e143fd' alt='Git-Hub-README-Button-2x' style="width: 180px; height: auto; object-fit: contain;"/></a></td>
<td>Contributions, problèmes & demandes de fonctionnalités</td>
</tr>
</table>
## Historique des étoiles
[![Graphique de l'historique des étoiles](https://api.star-history.com/svg?repos=langgenius/dify&type=Date)](https://star-history.com/#langgenius/dify&Date)


@ -2,9 +2,9 @@
<p align="center">
<a href="https://cloud.dify.ai">Dify Cloud</a> ·
<a href="https://docs.dify.ai/getting-started/install-self-hosted">自己ホスティング</a> ·
<a href="https://docs.dify.ai/getting-started/install-self-hosted">セルフホスティング</a> ·
<a href="https://docs.dify.ai">ドキュメント</a> ·
<a href="https://cal.com/guchenhe/dify-demo">デモのスケジュール</a>
<a href="https://udify.app/chat/22L1zSxg6yW1cWQg">企業のお問い合わせ(英語のみ)</a>
</p>
<p align="center">
@ -29,12 +29,16 @@
</p>
<p align="center">
<a href="./README.md"><img alt="先月のコミット" src="https://img.shields.io/badge/English-d9d9d9"></a>
<a href="./README_CN.md"><img alt="先月のコミット" src="https://img.shields.io/badge/简体中文-d9d9d9"></a>
<a href="./README_JA.md"><img alt="先月のコミット" src="https://img.shields.io/badge/日本語-d9d9d9"></a>
<a href="./README_ES.md"><img alt="先月のコミット" src="https://img.shields.io/badge/Español-d9d9d9"></a>
<a href="./README_KL.md"><img alt="先月のコミット" src="https://img.shields.io/badge/Français-d9d9d9"></a>
<a href="./README_FR.md"><img alt="先月のコミット" src="https://img.shields.io/badge/Klingon-d9d9d9"></a>
<a href="./README.md"><img alt="README in English" src="https://img.shields.io/badge/English-d9d9d9"></a>
<a href="./README_CN.md"><img alt="简体中文版自述文件" src="https://img.shields.io/badge/简体中文-d9d9d9"></a>
<a href="./README_JA.md"><img alt="日本語のREADME" src="https://img.shields.io/badge/日本語-d9d9d9"></a>
<a href="./README_ES.md"><img alt="README en Español" src="https://img.shields.io/badge/Español-d9d9d9"></a>
<a href="./README_FR.md"><img alt="README en Français" src="https://img.shields.io/badge/Français-d9d9d9"></a>
<a href="./README_KL.md"><img alt="README tlhIngan Hol" src="https://img.shields.io/badge/Klingon-d9d9d9"></a>
<a href="./README_KR.md"><img alt="README in Korean" src="https://img.shields.io/badge/한국어-d9d9d9"></a>
<a href="./README_AR.md"><img alt="README بالعربية" src="https://img.shields.io/badge/العربية-d9d9d9"></a>
<a href="./README_TR.md"><img alt="Türkçe README" src="https://img.shields.io/badge/Türkçe-d9d9d9"></a>
<a href="./README_VI.md"><img alt="README Tiếng Việt" src="https://img.shields.io/badge/Ti%E1%BA%BFng%20Vi%E1%BB%87t-d9d9d9"></a>
</p>
#
@ -43,39 +47,37 @@
<a href="https://trendshift.io/repositories/2152" target="_blank"><img src="https://trendshift.io/api/badge/repositories/2152" alt="langgenius%2Fdify | Trendshift" style="width: 250px; height: 55px;" width="250" height="55"/></a>
</p>
DifyはオープンソースのLLMアプリケーション開発プラットフォームです。直感的なインターフェスには、AIワークフロー、RAGパイプライン、エージェント機能、モデル管理、観測機能などが組み合わさっており、プロトタイプから本番までの移行を迅速に行うことができます。以下は、主要機能のリストです:
DifyはオープンソースのLLMアプリケーション開発プラットフォームです。直感的なインターフェスには、AIワークフロー、RAGパイプライン、エージェント機能、モデル管理、観測機能などが組み合わさっており、プロトタイプから生産まで迅速に進めることができます。以下の機能が含まれます:
</br> </br>
**1. ワークフロー**:
ビジュアルキャンバス上で強力なAIワークフローを構築しテストし、以下の機能を活用してプロトタイプを超えることができます。
強力なAIワークフローをビジュアルキャンバス上で構築しテストできます。すべての機能、および以下の機能を使用できます。
https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa
**2. 網羅的なモデルサポート**:
数百のプロプライエタリ/オープンソースのLLMと、数十の推論プロバイダーおよびセルフホスティングソリューションとのシームレスな統合を提供します。GPT、Mistral、Llama2、およびOpenAI API互換のモデルをカバーします。サポートされているモデルプロバイダーの完全なリストは[こちら](https://docs
.dify.ai/getting-started/readme/model-providers)をご覧ください。
**2. 総合的なモデルサポート**:
数百のプロプライエタリ/オープンソースのLLMと、数十の推論プロバイダーおよびセルフホスティングソリューションとのシームレスな統合を提供します。GPT、Mistral、Llama3、OpenAI API互換性のあるすべてのモデルを統合されています。サポートされているモデルプロバイダーの完全なリストは[こちら](https://docs.dify.ai/getting-started/readme/model-providers)をご覧ください。
![providers-v5](https://github.com/langgenius/dify/assets/13230914/5a17bdbe-097a-4100-8363-40255b70f6e3)
**3. プロンプトIDE**:
チャットベースのアプリにテキスト読み上げなどの追加機能を追加するプロンプト作成、モデルパフォーマンス比較する直感的なインターフェース
プロンプト作成、モデルパフォーマンス比較が行え、チャットベースのアプリに音声合成などの機能も追加できます
**4. RAGパイプライン**:
文書の取り込みから取得までをカバーする幅広いRAG機能で、PDF、PPTなどの一般的なドキュメント形式からのテキスト抽出に対するアウトオブボックスのサポートを提供します。
ドキュメントの取り込みから検索までをカバーする広範なRAG機能を提供します。ほかにもPDF、PPT、その他の一般的なドキュメントフォーマットからのテキスト抽出のサポートも提供します。
**5. エージェント機能**:
LLM関数呼び出しまたはReActに基づいてエージェント定義し、エージェント向けの事前構築済みまたはカスタムツールを追加できます。Difyには、Google検索、DELL·E、Stable Diffusion、WolframAlphaなどのAIエージェント用の50以上の組み込みツールが用意されています。
LLM Function CallingやReActに基づエージェント定義が可能で、AIエージェント用のプリビルトまたはカスタムツールを追加できます。Difyには、Google検索、DALL·E、Stable Diffusion、WolframAlphaなどのAIエージェント用の50以上の組み込みツールが提供します。
**6. LLMOps**:
アプリケーションログパフォーマンスを時間の経過とともにモニタリングおよび分析します。本番データと注釈に基づいて、プロンプト、データセット、およびモデルを継続的に改善できます。
アプリケーションログパフォーマンスを監視と分析し、生産のデータと注釈に基づいて、プロンプト、データセット、モデルを継続的に改善できます。
**7. Backend-as-a-Service**:
Difyのすべての提供には、それに対応するAPIが付属しており、独自のビジネスロジックにDifyをシームレスに統合できます。
すべての機能はAPIを提供されており、Difyを自分のビジネスロジックに簡単に統合できます。
## 機能比較
@ -96,9 +98,9 @@ DifyはオープンソースのLLMアプリケーション開発プラットフ
</tr>
<tr>
<td align="center">サポートされているLLM</td>
<td align="center">豊富なバリエーション</td>
<td align="center">豊富なバリエーション</td>
<td align="center">豊富なバリエーション</td>
<td align="center">バラエティ豊か</td>
<td align="center">バラエティ豊か</td>
<td align="center">バラエティ豊か</td>
<td align="center">OpenAIのみ</td>
</tr>
<tr>
@ -112,7 +114,7 @@ DifyはオープンソースのLLMアプリケーション開発プラットフ
<td align="center">エージェント</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
@ -148,39 +150,38 @@ DifyはオープンソースのLLMアプリケーション開発プラットフ
## Difyの使用方法
- **クラウド </br>**
[こちら](https://dify.ai)のDify Cloudサービスを利用して、セットアップ不要で誰でも試すことができます。サンドボックスプランは、200回の無料のGPT-4呼び出しが含まれています。
[こちら](https://dify.ai)のDify Cloudサービスを利用して、セットアップ不要で試すことができます。サンドボックスプランは、200回のGPT-4呼び出しが無料で含まれています。
- **Dify Community Editionのセルフホスティング</br>**
この[スターターガイド](#quick-start)を使用して、環境でDifyをすばやく実行できます。
さらなる参照や詳細な手順については、[ドキュメント](https://docs.dify.ai)をご覧ください。
この[スターガイド](#quick-start)を使用して、ローカル環境でDifyを簡単に実行できます。
詳しくは[ドキュメント](https://docs.dify.ai)をご覧ください。
- **エンタープライズ/組織向けのDify</br>**
追加のエンタープライズ向け機能を提供しています。[こちらからミーティ
ングを予約](https://cal.com/guchenhe/30min)したり、[メールを送信](mailto:business@dify.ai?subject=[GitHub]Business%20License%20Inquiry)してエンタープライズのニーズについて相談してください。 </br>
> AWSを使用しているスタートアップや中小企業の場合は、[AWS Marketplace](https://aws.amazon.com/marketplace/pp/prodview-t22mebxzwjhu6)のDify Premiumをチェックして、ワンクリックで独自のAWS VPCにデプロイできます。カスタムロゴとブランディングでアプリを作成するオプションを備えた手頃な価格のAMIオファリングです。
- **企業/組織向けのDify</br>**
企業中心の機能を提供しています。[メールを送信](mailto:business@dify.ai?subject=[GitHub]Business%20License%20Inquiry)して企業のニーズについて相談してください。 </br>
> AWSを使用しているスタートアップ企業や中小企業の場合は、[AWS Marketplace](https://aws.amazon.com/marketplace/pp/prodview-t22mebxzwjhu6)のDify Premiumをチェックして、ワンクリックで自分のAWS VPCにデプロイできます。さらに、手頃な価格のAMIオファリングとして、ロゴやブランディングをカスタマイズしてアプリケーションを作成するオプションがあります。
## 先を見る
## 最新の情報を入手
GitHubでDifyにスターを付け新しいリリースをすぐに通知されます。
GitHubでDifyにスターを付けることで、Difyに関する新しいニュースを受け取れます。
![star-us](https://github.com/langgenius/dify/assets/13230914/b823edc1-6388-4e25-ad45-2f6b187adbb4)
## クイックスタート
> Difyをインストールする前に、マシンが以下の最小システム要件を満たしていることを確認してください
> Difyをインストールする前に、お使いのマシンが以下の最小システム要件を満たしていることを確認してください:
>
>- CPU >= 2コア
>- RAM >= 4GB
</br>
Difyサーバーを起動する最も簡単な方法は、当社の[docker-compose.yml](docker/docker-compose.yaml)ファイルを実行することです。インストールコマンドを実行する前に、マシンに[Docker](https://docs.docker.com/get-docker/)と[Docker Compose](https://docs.docker.com/compose/install/)がインストールされていることを確認してください。
Difyサーバーを起動する最も簡単な方法は、[docker-compose.yml](docker/docker-compose.yaml)ファイルを実行することです。インストールコマンドを実行する前に、マシンに[Docker](https://docs.docker.com/get-docker/)と[Docker Compose](https://docs.docker.com/compose/install/)がインストールされていることを確認してください。
```bash
cd docker
cp .env.example .env
docker compose up -d
```
@ -190,12 +191,19 @@ docker compose up -d
## 次のステップ
環境設定をカスタマイズする場合は、[docker-compose.yml](docker/docker-compose.yaml)ファイル内のコメントを参照して、環境設定を手動で設定してください。変更を加えた後は、再び `docker-compose up -d` を実行してください。環境変数の完全なリストは[こちら](https://docs.dify.ai/getting-started/install-self-hosted/environments)をご覧ください
設定をカスタマイズする必要がある場合は、[.env.example](docker/.env.example) ファイルのコメントを参照し、`.env` ファイルの対応する値を更新してください。さらに、デプロイ環境や要件に応じて、`docker-compose.yaml` ファイル自体を調整する必要がある場合があります。たとえば、イメージのバージョン、ポートのマッピング、ボリュームのマウントなどを変更します。変更を加えた後は、`docker-compose up -d`実行してください。利用可能な環境変数の全一覧は、[こちら](https://docs.dify.ai/getting-started/install-self-hosted/environments)で確認できます
高可用性のセットアップを構成する場合、コミュニティによって提供されている[Helm Charts](https://helm.sh/)があり、これによりKubernetes上にDifyを展開できます。
高可用性設定を設定する必要がある場合、コミュニティ[Helm Charts](https://helm.sh/)とYAMLファイルにより、DifyをKubernetesにデプロイすることができます。
- [Helm Chart by @LeoQuote](https://github.com/douban/charts/tree/master/charts/dify)
- [Helm Chart by @BorisPolonsky](https://github.com/BorisPolonsky/dify-helm)
- [YAML file by @Winson-030](https://github.com/Winson-030/dify-kubernetes)
#### Terraformを使用したデプロイ
##### Azure Global
[terraform](https://www.terraform.io/) を使用して、AzureにDifyをワンクリックでデプロイします。
- [nikawangのAzure Terraform](https://github.com/nikawang/dify-azure-terraform)
## 貢献
@ -215,35 +223,12 @@ docker compose up -d
## コミュニティ & お問い合わせ
* [Github Discussion](https://github.com/langgenius/dify/discussions). 主に: フィードバックの共有や質問。
* [GitHub Issues](https://github.com/langgenius/dify/issues). 主に: Dify.AI使用中に遭遇したバグや機能提案。
* [Email](mailto:support@dify.ai?subject=[GitHub]Questions%20About%20Dify). 主に: Dify.AIの使用に関する質問。
* [GitHub Issues](https://github.com/langgenius/dify/issues). 主に: Dify.AI使用する際に発生するエラーや問題については、[貢献ガイド](CONTRIBUTING_JA.md)を参照してください
* [Discord](https://discord.gg/FngNHpbcY7). 主に: アプリケーションの共有やコミュニティとの交流。
* [Twitter](https://twitter.com/dify_ai). 主に: アプリケーションの共有やコミュニティとの交流。
または、直接チームメンバーとミーティングをスケジュールします:
<table>
<tr>
<th>連絡先</th>
<th>目的</th>
</tr>
<tr>
<td><a href='https://cal.com
/guchenhe/30min'>ミーティング</a></td>
<td>無料の30分間のミーティングをスケジュールしてください。</td>
</tr>
<tr>
<td><a href='mailto:support@dify.ai?subject=[GitHub]Technical%20Support'>技術サポート</a></td>
<td>技術的な問題やサポートに関する質問</td>
</tr>
<tr>
<td><a href='mailto:business@dify.ai?subject=[GitHub]Business%20License%20Inquiry'>営業担当</a></td>
<td>法人ライセンスに関するお問い合わせ</td>
</tr>
</table>
## ライセンス
プロジェクトはMITライセンスの下で利用可能です。[LICENSE](LICENSE)をご参照ください
このリポジトリは、Dify Open Source License にいくつかの追加制限を加えた[Difyオープンソースライセンス](LICENSE)の下で利用可能です


@ -4,7 +4,7 @@
<a href="https://cloud.dify.ai">Dify Cloud</a> ·
<a href="https://docs.dify.ai/getting-started/install-self-hosted">Self-hosting</a> ·
<a href="https://docs.dify.ai">Documentation</a> ·
<a href="https://cal.com/guchenhe/dify-demo">Schedule demo</a>
<a href="https://udify.app/chat/22L1zSxg6yW1cWQg">Commercial enquiries</a>
</p>
<p align="center">
@ -29,12 +29,16 @@
</p>
<p align="center">
<a href="./README.md"><img alt="Commits last month" src="https://img.shields.io/badge/English-d9d9d9"></a>
<a href="./README_CN.md"><img alt="Commits last month" src="https://img.shields.io/badge/简体中文-d9d9d9"></a>
<a href="./README_JA.md"><img alt="Commits last month" src="https://img.shields.io/badge/日本語-d9d9d9"></a>
<a href="./README_ES.md"><img alt="Commits last month" src="https://img.shields.io/badge/Español-d9d9d9"></a>
<a href="./README_KL.md"><img alt="Commits last month" src="https://img.shields.io/badge/Français-d9d9d9"></a>
<a href="./README_FR.md"><img alt="Commits last month" src="https://img.shields.io/badge/Klingon-d9d9d9"></a>
<a href="./README.md"><img alt="README in English" src="https://img.shields.io/badge/English-d9d9d9"></a>
<a href="./README_CN.md"><img alt="简体中文版自述文件" src="https://img.shields.io/badge/简体中文-d9d9d9"></a>
<a href="./README_JA.md"><img alt="日本語のREADME" src="https://img.shields.io/badge/日本語-d9d9d9"></a>
<a href="./README_ES.md"><img alt="README en Español" src="https://img.shields.io/badge/Español-d9d9d9"></a>
<a href="./README_FR.md"><img alt="README en Français" src="https://img.shields.io/badge/Français-d9d9d9"></a>
<a href="./README_KL.md"><img alt="README tlhIngan Hol" src="https://img.shields.io/badge/Klingon-d9d9d9"></a>
<a href="./README_KR.md"><img alt="README in Korean" src="https://img.shields.io/badge/한국어-d9d9d9"></a>
<a href="./README_AR.md"><img alt="README بالعربية" src="https://img.shields.io/badge/العربية-d9d9d9"></a>
<a href="./README_TR.md"><img alt="Türkçe README" src="https://img.shields.io/badge/Türkçe-d9d9d9"></a>
<a href="./README_VI.md"><img alt="README Tiếng Việt" src="https://img.shields.io/badge/Ti%E1%BA%BFng%20Vi%E1%BB%87t-d9d9d9"></a>
</p>
#
@ -54,7 +58,7 @@ Dify is an open-source LLM app development platform. Its intuitive interface com
**2. Comprehensive model support**:
Seamless integration with hundreds of proprietary / open-source LLMs from dozens of inference providers and self-hosted solutions, covering GPT, Mistral, Llama2, and any OpenAI API-compatible models. A full list of supported model providers can be found [here](https://docs.dify.ai/getting-started/readme/model-providers).
Seamless integration with hundreds of proprietary / open-source LLMs from dozens of inference providers and self-hosted solutions, covering GPT, Mistral, Llama3, and any OpenAI API-compatible models. A full list of supported model providers can be found [here](https://docs.dify.ai/getting-started/readme/model-providers).
![providers-v5](https://github.com/langgenius/dify/assets/13230914/5a17bdbe-097a-4100-8363-40255b70f6e3)
@ -66,7 +70,7 @@ Dify is an open-source LLM app development platform. Its intuitive interface com
Extensive RAG capabilities that cover everything from document ingestion to retrieval, with out-of-box support for text extraction from PDFs, PPTs, and other common document formats.
**5. Agent capabilities**:
You can define agents based on LLM Function Calling or ReAct, and add pre-built or custom tools for the agent. Dify provides 50+ built-in tools for AI agents, such as Google Search, DELL·E, Stable Diffusion and WolframAlpha.
You can define agents based on LLM Function Calling or ReAct, and add pre-built or custom tools for the agent. Dify provides 50+ built-in tools for AI agents, such as Google Search, DALL·E, Stable Diffusion and WolframAlpha.
**6. LLMOps**:
Monitor and analyze application logs and performance over time. You could continuously improve prompts, datasets, and models based on production data and annotations.
@ -111,7 +115,7 @@ Dify is an open-source LLM app development platform. Its intuitive interface com
<td align="center">Agent</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
@ -154,7 +158,7 @@ Quickly get Dify running in your environment with this [starter guide](#quick-st
Use our [documentation](https://docs.dify.ai) for further references and more in-depth instructions.
- **Dify for Enterprise / Organizations</br>**
We provide additional enterprise-centric features. [Schedule a meeting with us](https://cal.com/guchenhe/30min) or [send us an email](mailto:business@dify.ai?subject=[GitHub]Business%20License%20Inquiry) to discuss enterprise needs. </br>
We provide additional enterprise-centric features. [Send us an email](mailto:business@dify.ai?subject=[GitHub]Business%20License%20Inquiry) to discuss enterprise needs. </br>
> For startups and small businesses using AWS, check out [Dify Premium on AWS Marketplace](https://aws.amazon.com/marketplace/pp/prodview-t22mebxzwjhu6) and deploy it to your own AWS VPC with one-click. It's an affordable AMI offering with the option to create apps with custom logo and branding.
@ -178,6 +182,7 @@ The easiest way to start the Dify server is to run our [docker-compose.yml](dock
```bash
cd docker
cp .env.example .env
docker compose up -d
```
@ -187,12 +192,19 @@ After running, you can access the Dify dashboard in your browser at [http://loca
## Next steps
If you need to customize the configuration, please refer to the comments in our [docker-compose.yml](docker/docker-compose.yaml) file and manually set the environment configuration. After making the changes, please run `docker-compose up -d` again. You can see the full list of environment variables [here](https://docs.dify.ai/getting-started/install-self-hosted/environments).
If you need to customize the configuration, please refer to the comments in our [.env.example](docker/.env.example) file and update the corresponding values in your `.env` file. Additionally, you might need to make adjustments to the `docker-compose.yaml` file itself, such as changing image versions, port mappings, or volume mounts, based on your specific deployment environment and requirements. After making any changes, please re-run `docker-compose up -d`. You can find the full list of available environment variables [here](https://docs.dify.ai/getting-started/install-self-hosted/environments).
If you'd like to configure a highly-available setup, there are community-contributed [Helm Charts](https://helm.sh/) which allow Dify to be deployed on Kubernetes.
If you'd like to configure a highly-available setup, there are community-contributed [Helm Charts](https://helm.sh/) and YAML files which allow Dify to be deployed on Kubernetes.
- [Helm Chart by @LeoQuote](https://github.com/douban/charts/tree/master/charts/dify)
- [Helm Chart by @BorisPolonsky](https://github.com/BorisPolonsky/dify-helm)
- [YAML file by @Winson-030](https://github.com/Winson-030/dify-kubernetes)
#### Using Terraform for Deployment
##### Azure Global
Deploy Dify to Azure with a single click using [terraform](https://www.terraform.io/).
- [Azure Terraform atorlugu @nikawang](https://github.com/nikawang/dify-azure-terraform)
## Contributing
@ -215,27 +227,9 @@ At the same time, please consider supporting Dify by sharing it on social media
* [Github Discussion](https://github.com/langgenius/dify/discussions). Best for: sharing feedback and asking questions.
* [GitHub Issues](https://github.com/langgenius/dify/issues). Best for: bugs you encounter using Dify.AI, and feature proposals. See our [Contribution Guide](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
* [Email](mailto:support@dify.ai?subject=[GitHub]Questions%20About%20Dify). Best for: questions you have about using Dify.AI.
* [Discord](https://discord.gg/FngNHpbcY7). Best for: sharing your applications and hanging out with the community.
* [Twitter](https://twitter.com/dify_ai). Best for: sharing your applications and hanging out with the community.
Or, schedule a meeting directly with a team member:
<table>
<tr>
<th>Point of Contact</th>
<th>Purpose</th>
</tr>
<tr>
<td><a href='https://cal.com/guchenhe/15min' target='_blank'><img class="schedule-button" src='https://github.com/langgenius/dify/assets/13230914/9ebcd111-1205-4d71-83d5-948d70b809f5' alt='Git-Hub-README-Button-3x' style="width: 180px; height: auto; object-fit: contain;"/></a></td>
<td>Business enquiries & product feedback</td>
</tr>
<tr>
<td><a href='https://cal.com/pinkbanana' target='_blank'><img class="schedule-button" src='https://github.com/langgenius/dify/assets/13230914/d1edd00a-d7e4-4513-be6c-e57038e143fd' alt='Git-Hub-README-Button-2x' style="width: 180px; height: auto; object-fit: contain;"/></a></td>
<td>Contributions, issues & feature requests</td>
</tr>
</table>
## Star History
[![Star History Chart](https://api.star-history.com/svg?repos=langgenius/dify&type=Date)](https://star-history.com/#langgenius/dify&Date)
@ -247,4 +241,4 @@ To protect your privacy, please avoid posting security issues on GitHub. Instead
## License
This repository is available under the [Dify Open Source License](LICENSE), which is essentially Apache 2.0 with a few additional restrictions.
This repository is available under the [Dify Open Source License](LICENSE), which is essentially Apache 2.0 with a few additional restrictions.

README_KR.md Normal file

@ -0,0 +1,235 @@
![cover-v5-optimized](https://github.com/langgenius/dify/assets/13230914/f9e19af5-61ba-4119-b926-d10c4c06ebab)
<p align="center">
<a href="https://cloud.dify.ai">Dify 클라우드</a> ·
<a href="https://docs.dify.ai/getting-started/install-self-hosted">셀프-호스팅</a> ·
<a href="https://docs.dify.ai">문서</a> ·
<a href="https://udify.app/chat/22L1zSxg6yW1cWQg">기업 문의 (영어만 가능)</a>
</p>
<p align="center">
<a href="https://dify.ai" target="_blank">
<img alt="Static Badge" src="https://img.shields.io/badge/Product-F04438"></a>
<a href="https://dify.ai/pricing" target="_blank">
<img alt="Static Badge" src="https://img.shields.io/badge/free-pricing?logo=free&color=%20%23155EEF&label=pricing&labelColor=%20%23528bff"></a>
<a href="https://discord.gg/FngNHpbcY7" target="_blank">
<img src="https://img.shields.io/discord/1082486657678311454?logo=discord&labelColor=%20%235462eb&logoColor=%20%23f5f5f5&color=%20%235462eb"
alt="chat on Discord"></a>
<a href="https://twitter.com/intent/follow?screen_name=dify_ai" target="_blank">
<img src="https://img.shields.io/twitter/follow/dify_ai?logo=X&color=%20%23f5f5f5"
alt="follow on Twitter"></a>
<a href="https://hub.docker.com/u/langgenius" target="_blank">
<img alt="Docker Pulls" src="https://img.shields.io/docker/pulls/langgenius/dify-web?labelColor=%20%23FDB062&color=%20%23f79009"></a>
<a href="https://github.com/langgenius/dify/graphs/commit-activity" target="_blank">
<img alt="Commits last month" src="https://img.shields.io/github/commit-activity/m/langgenius/dify?labelColor=%20%2332b583&color=%20%2312b76a"></a>
<a href="https://github.com/langgenius/dify/" target="_blank">
<img alt="Issues closed" src="https://img.shields.io/github/issues-search?query=repo%3Alanggenius%2Fdify%20is%3Aclosed&label=issues%20closed&labelColor=%20%237d89b0&color=%20%235d6b98"></a>
<a href="https://github.com/langgenius/dify/discussions/" target="_blank">
<img alt="Discussion posts" src="https://img.shields.io/github/discussions/langgenius/dify?labelColor=%20%239b8afb&color=%20%237a5af8"></a>
</p>
<p align="center">
<a href="./README.md"><img alt="README in English" src="https://img.shields.io/badge/English-d9d9d9"></a>
<a href="./README_CN.md"><img alt="简体中文版自述文件" src="https://img.shields.io/badge/简体中文-d9d9d9"></a>
<a href="./README_JA.md"><img alt="日本語のREADME" src="https://img.shields.io/badge/日本語-d9d9d9"></a>
<a href="./README_ES.md"><img alt="README en Español" src="https://img.shields.io/badge/Español-d9d9d9"></a>
<a href="./README_FR.md"><img alt="README en Français" src="https://img.shields.io/badge/Français-d9d9d9"></a>
<a href="./README_KL.md"><img alt="README tlhIngan Hol" src="https://img.shields.io/badge/Klingon-d9d9d9"></a>
<a href="./README_KR.md"><img alt="README in Korean" src="https://img.shields.io/badge/한국어-d9d9d9"></a>
<a href="./README_AR.md"><img alt="README بالعربية" src="https://img.shields.io/badge/العربية-d9d9d9"></a>
<a href="./README_TR.md"><img alt="Türkçe README" src="https://img.shields.io/badge/Türkçe-d9d9d9"></a>
<a href="./README_VI.md"><img alt="README Tiếng Việt" src="https://img.shields.io/badge/Ti%E1%BA%BFng%20Vi%E1%BB%87t-d9d9d9"></a>
</p>
Dify는 오픈 소스 LLM 앱 개발 플랫폼입니다. 직관적인 인터페이스를 통해 AI 워크플로우, RAG 파이프라인, 에이전트 기능, 모델 관리, 관찰 기능 등을 결합하여 프로토타입에서 프로덕션까지 빠르게 전환할 수 있습니다. 주요 기능 목록은 다음과 같습니다:</br> </br>
**1. 워크플로우**:
다음 기능들을 비롯한 다양한 기능을 활용하여 시각적 캔버스에서 강력한 AI 워크플로우를 구축하고 테스트하세요.
https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa
**2. 포괄적인 모델 지원**:
수십 개의 추론 제공업체와 자체 호스팅 솔루션에서 제공하는 수백 개의 독점 및 오픈 소스 LLM과 원활하게 통합되며, GPT, Mistral, Llama3 및 모든 OpenAI API 호환 모델을 포함합니다. 지원되는 모델 제공업체의 전체 목록은 [여기](https://docs.dify.ai/getting-started/readme/model-providers)에서 확인할 수 있습니다.
![providers-v5](https://github.com/langgenius/dify/assets/13230914/5a17bdbe-097a-4100-8363-40255b70f6e3)
**3. 통합 개발환경**:
프롬프트를 작성하고, 모델 성능을 비교하며, 텍스트-음성 변환과 같은 추가 기능을 채팅 기반 앱에 추가할 수 있는 직관적인 인터페이스를 제공합니다.
**4. RAG 파이프라인**:
문서 수집부터 검색까지 모든 것을 다루며, PDF, PPT 및 기타 일반적인 문서 형식에서 텍스트 추출을 위한 기본 지원이 포함되어 있는 광범위한 RAG 기능을 제공합니다.
**5. 에이전트 기능**:
LLM 함수 호출 또는 ReAct를 기반으로 에이전트를 정의하고 에이전트에 대해 사전 구축된 도구나 사용자 정의 도구를 추가할 수 있습니다. Dify는 Google Search, DALL·E, Stable Diffusion, WolframAlpha 등 AI 에이전트를 위한 50개 이상의 내장 도구를 제공합니다.
**6. LLMOps**:
시간 경과에 따른 애플리케이션 로그와 성능을 모니터링하고 분석합니다. 생산 데이터와 주석을 기반으로 프롬프트, 데이터세트, 모델을 지속적으로 개선할 수 있습니다.
**7. Backend-as-a-Service**:
Dify의 모든 제품에는 해당 API가 함께 제공되므로 Dify를 자신의 비즈니스 로직에 쉽게 통합할 수 있습니다.
## 기능 비교
<table style="width: 100%;">
<tr>
<th align="center">기능</th>
<th align="center">Dify.AI</th>
<th align="center">LangChain</th>
<th align="center">Flowise</th>
<th align="center">OpenAI Assistants API</th>
</tr>
<tr>
<td align="center">프로그래밍 접근 방식</td>
<td align="center">API + 앱 중심</td>
<td align="center">Python 코드</td>
<td align="center">앱 중심</td>
<td align="center">API 중심</td>
</tr>
<tr>
<td align="center">지원되는 LLMs</td>
<td align="center">다양한 종류</td>
<td align="center">다양한 종류</td>
<td align="center">다양한 종류</td>
<td align="center">OpenAI 전용</td>
</tr>
<tr>
<td align="center">RAG 엔진</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">에이전트</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">워크플로우</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">가시성</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">기업용 기능 (SSO/접근 제어)</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">로컬 배포</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
</table>
## Dify 사용하기
- **클라우드 </br>**
우리는 누구나 설정이 필요 없이 사용해 볼 수 있도록 [Dify 클라우드](https://dify.ai) 서비스를 호스팅합니다. 이는 자체 배포 버전의 모든 기능을 제공하며, 샌드박스 플랜에서 무료로 200회의 GPT-4 호출을 포함합니다.
- **셀프-호스팅 Dify 커뮤니티 에디션</br>**
환경에서 Dify를 빠르게 실행하려면 이 [스타터 가이드를](#quick-start) 참조하세요.
추가 참조 및 더 심층적인 지침은 [문서](https://docs.dify.ai)를 사용하세요.
- **기업 / 조직을 위한 Dify</br>**
우리는 추가적인 기업 중심 기능을 제공합니다. [이 챗봇을 통해 질문을 남기거나](https://udify.app/chat/22L1zSxg6yW1cWQg) [이메일을 보내](mailto:business@dify.ai?subject=[GitHub]Business%20License%20Inquiry) 기업 요구 사항을 논의하십시오. </br>
> AWS를 사용하는 스타트업 및 중소기업의 경우 [AWS Marketplace에서 Dify Premium](https://aws.amazon.com/marketplace/pp/prodview-t22mebxzwjhu6)을 확인하고 한 번의 클릭으로 자체 AWS VPC에 배포하십시오. 맞춤형 로고와 브랜딩이 포함된 앱을 생성할 수 있는 옵션이 포함된 저렴한 AMI 제품입니다.
## 앞서가기
GitHub에서 Dify에 별표를 찍어 새로운 릴리스를 즉시 알림 받으세요.
![star-us](https://github.com/langgenius/dify/assets/13230914/b823edc1-6388-4e25-ad45-2f6b187adbb4)
## 빠른 시작
>Dify를 설치하기 전에 컴퓨터가 다음과 같은 최소 시스템 요구 사항을 충족하는지 확인하세요 :
>- CPU >= 2 Core
>- RAM >= 4GB
</br>
Dify 서버를 시작하는 가장 쉬운 방법은 [docker-compose.yml](docker/docker-compose.yaml) 파일을 실행하는 것입니다. 설치 명령을 실행하기 전에 [Docker](https://docs.docker.com/get-docker/) 및 [Docker Compose](https://docs.docker.com/compose/install/)가 머신에 설치되어 있는지 확인하세요.
```bash
cd docker
cp .env.example .env
docker compose up -d
```
실행 후 브라우저의 [http://localhost/install](http://localhost/install) 에서 Dify 대시보드에 액세스하고 초기화 프로세스를 시작할 수 있습니다.
> Dify에 기여하거나 추가 개발을 하고 싶다면 소스 코드에서 [배포에 대한 가이드](https://docs.dify.ai/getting-started/install-self-hosted/local-source-code)를 참조하세요.
## 다음 단계
구성을 사용자 정의해야 하는 경우 [.env.example](docker/.env.example) 파일의 주석을 참조하고 `.env` 파일에서 해당 값을 업데이트하십시오. 또한 특정 배포 환경 및 요구 사항에 따라 `docker-compose.yaml` 파일 자체를 조정해야 할 수도 있습니다. 예를 들어 이미지 버전, 포트 매핑 또는 볼륨 마운트를 변경합니다. 변경 한 후 `docker-compose up -d`를 다시 실행하십시오. 사용 가능한 환경 변수의 전체 목록은 [여기](https://docs.dify.ai/getting-started/install-self-hosted/environments)에서 찾을 수 있습니다.
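예를 들어, `.env`를 수정한 뒤 변경 사항을 반영하는 과정은 대략 다음과 같습니다(`docker` 디렉터리에서 실행한다고 가정한 예시입니다).
```bash
cd docker
# .env에서 필요한 값을 수정합니다 (예: 포트 매핑, 이미지 버전)
vim .env
# 변경된 설정으로 컨테이너를 다시 생성합니다
docker compose up -d
```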
고가용성 구성이 필요한 경우, Dify를 Kubernetes에 배포할 수 있도록 커뮤니티가 제공하는 [Helm Charts](https://helm.sh/)와 YAML 파일이 있습니다.
- [Helm Chart by @LeoQuote](https://github.com/douban/charts/tree/master/charts/dify)
- [Helm Chart by @BorisPolonsky](https://github.com/BorisPolonsky/dify-helm)
- [YAML file by @Winson-030](https://github.com/Winson-030/dify-kubernetes)
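참고로 Helm 차트를 사용하는 설치는 대략 아래와 같은 형태입니다. 저장소 URL과 릴리스 이름은 예시(가정)이며, 실제 값은 사용하려는 차트의 문서를 따르세요.
```bash
# 예시: @BorisPolonsky의 dify-helm 차트를 사용한다고 가정
helm repo add dify https://borispolonsky.github.io/dify-helm
helm repo update
helm install my-dify dify/dify
```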
#### Terraform을 사용한 배포
##### Azure Global
[terraform](https://www.terraform.io/)을 사용하여 Azure에 Dify를 원클릭으로 배포하세요.
- [nikawang의 Azure Terraform](https://github.com/nikawang/dify-azure-terraform)
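일반적인 Terraform 배포 흐름은 대략 다음과 같습니다. 저장소 최상위에 Terraform 구성이 있다고 가정한 예시이며, 필요한 변수와 인증 설정은 해당 저장소의 문서를 따르세요.
```bash
# 예시: Azure 자격 증명이 이미 구성되어 있다고 가정
git clone https://github.com/nikawang/dify-azure-terraform.git
cd dify-azure-terraform
terraform init
terraform plan
terraform apply
```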
## 기여
코드에 기여하고 싶은 분들은 [기여 가이드](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md)를 참조하세요.
동시에 Dify를 소셜 미디어와 행사 및 컨퍼런스에 공유하여 지원하는 것을 고려해 주시기 바랍니다.
> 우리는 Dify를 중국어나 영어 이외의 언어로 번역하는 데 도움을 줄 수 있는 기여자를 찾고 있습니다. 도움을 주고 싶으시다면 [i18n README](https://github.com/langgenius/dify/blob/main/web/i18n/README.md)에서 더 많은 정보를 확인하시고 [Discord 커뮤니티 서버](https://discord.gg/8Tpq4AcN9c)의 `global-users` 채널에 댓글을 남겨주세요.
**기여자**
<a href="https://github.com/langgenius/dify/graphs/contributors">
<img src="https://contrib.rocks/image?repo=langgenius/dify" />
</a>
## 커뮤니티 & 연락처
* [Github 토론](https://github.com/langgenius/dify/discussions). 피드백 공유 및 질문하기에 적합합니다.
* [GitHub 이슈](https://github.com/langgenius/dify/issues). Dify.AI 사용 중 발견한 버그와 기능 제안에 적합합니다. [기여 가이드](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md)를 참조하세요.
* [디스코드](https://discord.gg/FngNHpbcY7). 애플리케이션 공유 및 커뮤니티와 소통하기에 적합합니다.
* [트위터](https://twitter.com/dify_ai). 애플리케이션 공유 및 커뮤니티와 소통하기에 적합합니다.
## Star 히스토리
[![Star History Chart](https://api.star-history.com/svg?repos=langgenius/dify&type=Date)](https://star-history.com/#langgenius/dify&Date)
## 보안 공개
개인정보 보호를 위해 보안 문제를 GitHub에 게시하지 마십시오. 대신 security@dify.ai로 질문을 보내주시면 더 자세한 답변을 드리겠습니다.
## 라이선스
이 저장소는 기본적으로 몇 가지 추가 제한 사항이 있는 Apache 2.0인 [Dify 오픈 소스 라이선스](LICENSE)에 따라 사용할 수 있습니다.

README_TR.md Normal file

@ -0,0 +1,237 @@
![cover-v5-optimized](https://github.com/langgenius/dify/assets/13230914/f9e19af5-61ba-4119-b926-d10c4c06ebab)
<p align="center">
<a href="https://cloud.dify.ai">Dify Bulut</a> ·
<a href="https://docs.dify.ai/getting-started/install-self-hosted">Kendi Sunucunuzda Barındırma</a> ·
<a href="https://docs.dify.ai">Dokümantasyon</a> ·
<a href="https://udify.app/chat/22L1zSxg6yW1cWQg">Yalnızca İngilizce: Kurumsal Sorgulama</a>
</p>
<p align="center">
<a href="https://dify.ai" target="_blank">
<img alt="Statik Rozet" src="https://img.shields.io/badge/Ürün-F04438"></a>
<a href="https://dify.ai/pricing" target="_blank">
<img alt="Statik Rozet" src="https://img.shields.io/badge/ücretsiz-fiyatlandırma?logo=free&color=%20%23155EEF&label=fiyatlandirma&labelColor=%20%23528bff"></a>
<a href="https://discord.gg/FngNHpbcY7" target="_blank">
<img src="https://img.shields.io/discord/1082486657678311454?logo=discord&labelColor=%20%235462eb&logoColor=%20%23f5f5f5&color=%20%235462eb"
alt="Discord'da sohbet et"></a>
<a href="https://twitter.com/intent/follow?screen_name=dify_ai" target="_blank">
<img src="https://img.shields.io/twitter/follow/dify_ai?logo=X&color=%20%23f5f5f5"
alt="Twitter'da takip et"></a>
<a href="https://hub.docker.com/u/langgenius" target="_blank">
<img alt="Docker Çekmeleri" src="https://img.shields.io/docker/pulls/langgenius/dify-web?labelColor=%20%23FDB062&color=%20%23f79009"></a>
<a href="https://github.com/langgenius/dify/graphs/commit-activity" target="_blank">
<img alt="Geçen ay yapılan commitler" src="https://img.shields.io/github/commit-activity/m/langgenius/dify?labelColor=%20%2332b583&color=%20%2312b76a"></a>
<a href="https://github.com/langgenius/dify/" target="_blank">
<img alt="Kapatılan sorunlar" src="https://img.shields.io/github/issues-search?query=repo%3Alanggenius%2Fdify%20is%3Aclosed&label=kapatilan%20sorunlar&labelColor=%20%237d89b0&color=%20%235d6b98"></a>
<a href="https://github.com/langgenius/dify/discussions/" target="_blank">
<img alt="Tartışma gönderileri" src="https://img.shields.io/github/discussions/langgenius/dify?labelColor=%20%239b8afb&color=%20%237a5af8"></a>
</p>
<p align="center">
<a href="./README.md"><img alt="README in English" src="https://img.shields.io/badge/English-d9d9d9"></a>
<a href="./README_CN.md"><img alt="简体中文版自述文件" src="https://img.shields.io/badge/简体中文-d9d9d9"></a>
<a href="./README_JA.md"><img alt="日本語のREADME" src="https://img.shields.io/badge/日本語-d9d9d9"></a>
<a href="./README_ES.md"><img alt="README en Español" src="https://img.shields.io/badge/Español-d9d9d9"></a>
<a href="./README_FR.md"><img alt="README en Français" src="https://img.shields.io/badge/Français-d9d9d9"></a>
<a href="./README_KL.md"><img alt="README tlhIngan Hol" src="https://img.shields.io/badge/Klingon-d9d9d9"></a>
<a href="./README_KR.md"><img alt="README in Korean" src="https://img.shields.io/badge/한국어-d9d9d9"></a>
<a href="./README_AR.md"><img alt="README بالعربية" src="https://img.shields.io/badge/العربية-d9d9d9"></a>
<a href="./README_TR.md"><img alt="Türkçe README" src="https://img.shields.io/badge/Türkçe-d9d9d9"></a>
<a href="./README_VI.md"><img alt="README Tiếng Việt" src="https://img.shields.io/badge/Ti%E1%BA%BFng%20Vi%E1%BB%87t-d9d9d9"></a>
</p>
Dify, açık kaynaklı bir LLM uygulama geliştirme platformudur. Sezgisel arayüzü, AI iş akışı, RAG pipeline'ı, ajan yetenekleri, model yönetimi, gözlemlenebilirlik özellikleri ve daha fazlasını birleştirerek, prototipten üretime hızlıca geçmenizi sağlar. İşte temel özelliklerin bir listesi:
</br> </br>
**1. Workflow**:
Görsel bir arayüz üzerinde güçlü AI iş akışları oluşturun ve test edin, aşağıdaki tüm özellikleri ve daha fazlasını kullanarak.
https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa
**2. Kapsamlı model desteği**:
Çok sayıda çıkarım sağlayıcısı ve kendi kendine barındırılan çözümlerden yüzlerce özel / açık kaynaklı LLM ile sorunsuz entegrasyon sağlar. GPT, Mistral, Llama3 ve OpenAI API uyumlu tüm modelleri kapsar. Desteklenen model sağlayıcılarının tam listesine [buradan](https://docs.dify.ai/getting-started/readme/model-providers) ulaşabilirsiniz.
![providers-v5](https://github.com/langgenius/dify/assets/13230914/5a17bdbe-097a-4100-8363-40255b70f6e3)
**3. Prompt IDE**:
Komut istemlerini oluşturmak, model performansını karşılaştırmak ve sohbet tabanlı uygulamalara metin-konuşma gibi ek özellikler eklemek için kullanıcı dostu bir arayüz.
**4. RAG Pipeline**:
Belge alımından bilgi çekmeye kadar geniş kapsamlı RAG yetenekleri. PDF'ler, PPT'ler ve diğer yaygın belge formatlarından metin çıkarma için hazır destek sunar.
**5. Ajan yetenekleri**:
LLM Fonksiyon Çağırma veya ReAct'a dayalı ajanlar tanımlayabilir ve bu ajanlara önceden hazırlanmış veya özel araçlar ekleyebilirsiniz. Dify, AI ajanları için Google Arama, DALL·E, Stable Diffusion ve WolframAlpha gibi 50'den fazla yerleşik araç sağlar.
**6. LLMOps**:
Uygulama loglarını ve performans metriklerini zaman içinde izleme ve analiz etme imkanı. Üretim ortamından elde edilen verilere ve kullanıcı geri bildirimlerine dayanarak, prompt'ları, veri setlerini ve modelleri sürekli olarak optimize edebilirsiniz. Bu sayede, AI uygulamanızın performansını ve doğruluğunu sürekli olarak artırabilirsiniz.
**7. Hizmet Olarak Backend**:
Dify'ın tüm özellikleri ilgili API'lerle birlikte gelir, böylece Dify'ı kendi iş mantığınıza kolayca entegre edebilirsiniz.
## Özellik karşılaştırması
<table style="width: 100%;">
<tr>
<th align="center">Özellik</th>
<th align="center">Dify.AI</th>
<th align="center">LangChain</th>
<th align="center">Flowise</th>
<th align="center">OpenAI Assistants API</th>
</tr>
<tr>
<td align="center">Programlama Yaklaşımı</td>
<td align="center">API + Uygulama odaklı</td>
<td align="center">Python Kodu</td>
<td align="center">Uygulama odaklı</td>
<td align="center">API odaklı</td>
</tr>
<tr>
<td align="center">Desteklenen LLM'ler</td>
<td align="center">Zengin Çeşitlilik</td>
<td align="center">Zengin Çeşitlilik</td>
<td align="center">Zengin Çeşitlilik</td>
<td align="center">Yalnızca OpenAI</td>
</tr>
<tr>
<td align="center">RAG Motoru</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Ajan</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">İş Akışı</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Gözlemlenebilirlik</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Kurumsal Özellikler (SSO/Erişim kontrolü)</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Yerel Dağıtım</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
</table>
## Dify'ı Kullanma
- **Cloud </br>**
Herkesin sıfır kurulumla denemesi için bir [Dify Cloud](https://dify.ai) hizmeti sunuyoruz. Bu hizmet, kendi kendine dağıtılan versiyonun tüm yeteneklerini sağlar ve sandbox planında 200 ücretsiz GPT-4 çağrısı içerir.
- **Dify Topluluk Sürümünü Kendi Sunucunuzda Barındırma</br>**
Bu [başlangıç kılavuzu](#quick-start) ile Dify'ı kendi ortamınızda hızlıca çalıştırın.
Daha fazla referans ve detaylı talimatlar için [dokümantasyonumuzu](https://docs.dify.ai) kullanın.
- **Kurumlar / organizasyonlar için Dify</br>**
Ek kurumsal odaklı özellikler sunuyoruz. Kurumsal ihtiyaçları görüşmek için [bize bir e-posta gönderin](mailto:business@dify.ai?subject=[GitHub]Business%20License%20Inquiry). </br>
> AWS kullanan startuplar ve küçük işletmeler için, [AWS Marketplace'deki Dify Premium'a](https://aws.amazon.com/marketplace/pp/prodview-t22mebxzwjhu6) göz atın ve tek tıklamayla kendi AWS VPC'nize dağıtın. Bu, özel logo ve marka ile uygulamalar oluşturma seçeneğine sahip uygun fiyatlı bir AMI teklifidir.
## Güncel Kalma
GitHub'da Dify'a yıldız verin ve yeni sürümlerden anında haberdar olun.
![bizi-yıldızlayın](https://github.com/langgenius/dify/assets/13230914/b823edc1-6388-4e25-ad45-2f6b187adbb4)
## Hızlı başlangıç
> Dify'ı kurmadan önce, makinenizin aşağıdaki minimum sistem gereksinimlerini karşıladığından emin olun:
>
>- CPU >= 2 Çekirdek
>- RAM >= 4GB
</br>
Dify sunucusunu başlatmanın en kolay yolu, [docker-compose.yml](docker/docker-compose.yaml) dosyamızı çalıştırmaktır. Kurulum komutunu çalıştırmadan önce, makinenizde [Docker](https://docs.docker.com/get-docker/) ve [Docker Compose](https://docs.docker.com/compose/install/)'un kurulu olduğundan emin olun:
```bash
cd docker
cp .env.example .env
docker compose up -d
```
Çalıştırdıktan sonra, tarayıcınızda [http://localhost/install](http://localhost/install) adresinden Dify kontrol paneline erişebilir ve başlangıç ayarları sürecini başlatabilirsiniz.
> Eğer Dify'a katkıda bulunmak veya ek geliştirmeler yapmak isterseniz, [kaynak koddan dağıtım kılavuzumuza](https://docs.dify.ai/getting-started/install-self-hosted/local-source-code) başvurun.
## Sonraki adımlar
Yapılandırmayı özelleştirmeniz gerekiyorsa, lütfen [.env.example](docker/.env.example) dosyamızdaki yorumlara bakın ve `.env` dosyanızdaki ilgili değerleri güncelleyin. Ayrıca, spesifik dağıtım ortamınıza ve gereksinimlerinize bağlı olarak `docker-compose.yaml` dosyasının kendisinde de, imaj sürümlerini, port eşlemelerini veya hacim bağlantılarını değiştirmek gibi ayarlamalar yapmanız gerekebilir. Herhangi bir değişiklik yaptıktan sonra, lütfen `docker-compose up -d` komutunu tekrar çalıştırın. Kullanılabilir tüm ortam değişkenlerinin tam listesini [burada](https://docs.dify.ai/getting-started/install-self-hosted/environments) bulabilirsiniz.
Yüksek kullanılabilirliğe sahip bir kurulum yapılandırmak isterseniz, Dify'ın Kubernetes üzerine dağıtılmasına olanak tanıyan topluluk katkılı [Helm Charts](https://helm.sh/) ve YAML dosyaları mevcuttur.
- [@LeoQuote tarafından Helm Chart](https://github.com/douban/charts/tree/master/charts/dify)
- [@BorisPolonsky tarafından Helm Chart](https://github.com/BorisPolonsky/dify-helm)
- [@Winson-030 tarafından YAML dosyası](https://github.com/Winson-030/dify-kubernetes)
#### Dağıtım için Terraform Kullanımı
##### Azure Global
[Terraform](https://www.terraform.io/) kullanarak Dify'ı Azure'a tek tıklamayla dağıtın.
- [@nikawang tarafından Azure Terraform](https://github.com/nikawang/dify-azure-terraform)
## Katkıda Bulunma
Kod katkısında bulunmak isteyenler için [Katkı Kılavuzumuza](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md) bakabilirsiniz.
Aynı zamanda, lütfen Dify'ı sosyal medyada, etkinliklerde ve konferanslarda paylaşarak desteklemeyi düşünün.
> Dify'ı Mandarin veya İngilizce dışındaki dillere çevirmemize yardımcı olacak katkıda bulunanlara ihtiyacımız var. Yardımcı olmakla ilgileniyorsanız, lütfen daha fazla bilgi için [i18n README](https://github.com/langgenius/dify/blob/main/web/i18n/README.md) dosyasına bakın ve [Discord Topluluk Sunucumuzdaki](https://discord.gg/8Tpq4AcN9c) `global-users` kanalında bize bir yorum bırakın.
**Katkıda Bulunanlar**
<a href="https://github.com/langgenius/dify/graphs/contributors">
<img src="https://contrib.rocks/image?repo=langgenius/dify" />
</a>
## Topluluk & iletişim
* [Github Tartışmaları](https://github.com/langgenius/dify/discussions). En uygun: geri bildirim paylaşmak ve soru sormak için.
* [GitHub Sorunları](https://github.com/langgenius/dify/issues). En uygun: Dify.AI kullanırken karşılaştığınız hatalar ve özellik önerileri için. [Katkı Kılavuzumuza](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md) bakın.
* [Discord](https://discord.gg/FngNHpbcY7). En uygun: uygulamalarınızı paylaşmak ve toplulukla vakit geçirmek için.
* [Twitter](https://twitter.com/dify_ai). En uygun: uygulamalarınızı paylaşmak ve toplulukla vakit geçirmek için.
## Star history
[![Star History Chart](https://api.star-history.com/svg?repos=langgenius/dify&type=Date)](https://star-history.com/#langgenius/dify&Date)
## Güvenlik açıklaması
Gizliliğinizi korumak için, lütfen güvenlik sorunlarını GitHub'da paylaşmaktan kaçının. Bunun yerine, sorularınızı security@dify.ai adresine gönderin ve size daha detaylı bir cevap vereceğiz.
## Lisans
Bu depo, temel olarak Apache 2.0 lisansı ve birkaç ek kısıtlama içeren [Dify Açık Kaynak Lisansı](LICENSE) altında kullanıma sunulmuştur.

README_VI.md Normal file

@ -0,0 +1,234 @@
![cover-v5-optimized](https://github.com/langgenius/dify/assets/13230914/f9e19af5-61ba-4119-b926-d10c4c06ebab)
<p align="center">
<a href="https://cloud.dify.ai">Dify Cloud</a> ·
<a href="https://docs.dify.ai/getting-started/install-self-hosted">Tự triển khai</a> ·
<a href="https://docs.dify.ai">Tài liệu</a> ·
<a href="https://udify.app/chat/22L1zSxg6yW1cWQg">Yêu cầu doanh nghiệp</a>
</p>
<p align="center">
<a href="https://dify.ai" target="_blank">
<img alt="Static Badge" src="https://img.shields.io/badge/Product-F04438"></a>
<a href="https://dify.ai/pricing" target="_blank">
<img alt="Static Badge" src="https://img.shields.io/badge/free-pricing?logo=free&color=%20%23155EEF&label=pricing&labelColor=%20%23528bff"></a>
<a href="https://discord.gg/FngNHpbcY7" target="_blank">
<img src="https://img.shields.io/discord/1082486657678311454?logo=discord&labelColor=%20%235462eb&logoColor=%20%23f5f5f5&color=%20%235462eb"
alt="chat trên Discord"></a>
<a href="https://twitter.com/intent/follow?screen_name=dify_ai" target="_blank">
<img src="https://img.shields.io/twitter/follow/dify_ai?logo=X&color=%20%23f5f5f5"
alt="theo dõi trên Twitter"></a>
<a href="https://hub.docker.com/u/langgenius" target="_blank">
<img alt="Docker Pulls" src="https://img.shields.io/docker/pulls/langgenius/dify-web?labelColor=%20%23FDB062&color=%20%23f79009"></a>
<a href="https://github.com/langgenius/dify/graphs/commit-activity" target="_blank">
<img alt="Commits tháng trước" src="https://img.shields.io/github/commit-activity/m/langgenius/dify?labelColor=%20%2332b583&color=%20%2312b76a"></a>
<a href="https://github.com/langgenius/dify/" target="_blank">
<img alt="Vấn đề đã đóng" src="https://img.shields.io/github/issues-search?query=repo%3Alanggenius%2Fdify%20is%3Aclosed&label=issues%20closed&labelColor=%20%237d89b0&color=%20%235d6b98"></a>
<a href="https://github.com/langgenius/dify/discussions/" target="_blank">
<img alt="Bài thảo luận" src="https://img.shields.io/github/discussions/langgenius/dify?labelColor=%20%239b8afb&color=%20%237a5af8"></a>
</p>
<p align="center">
<a href="./README.md"><img alt="README in English" src="https://img.shields.io/badge/English-d9d9d9"></a>
<a href="./README_CN.md"><img alt="简体中文版自述文件" src="https://img.shields.io/badge/简体中文-d9d9d9"></a>
<a href="./README_JA.md"><img alt="日本語のREADME" src="https://img.shields.io/badge/日本語-d9d9d9"></a>
<a href="./README_ES.md"><img alt="README en Español" src="https://img.shields.io/badge/Español-d9d9d9"></a>
<a href="./README_FR.md"><img alt="README en Français" src="https://img.shields.io/badge/Français-d9d9d9"></a>
<a href="./README_KL.md"><img alt="README tlhIngan Hol" src="https://img.shields.io/badge/Klingon-d9d9d9"></a>
<a href="./README_KR.md"><img alt="README in Korean" src="https://img.shields.io/badge/한국어-d9d9d9"></a>
<a href="./README_AR.md"><img alt="README بالعربية" src="https://img.shields.io/badge/العربية-d9d9d9"></a>
<a href="./README_TR.md"><img alt="Türkçe README" src="https://img.shields.io/badge/Türkçe-d9d9d9"></a>
<a href="./README_VI.md"><img alt="README Tiếng Việt" src="https://img.shields.io/badge/Ti%E1%BA%BFng%20Vi%E1%BB%87t-d9d9d9"></a>
</p>
Dify là một nền tảng phát triển ứng dụng LLM mã nguồn mở. Giao diện trực quan kết hợp quy trình làm việc AI, mô hình RAG, khả năng tác nhân, quản lý mô hình, tính năng quan sát và hơn thế nữa, cho phép bạn nhanh chóng chuyển từ nguyên mẫu sang sản phẩm. Đây là danh sách các tính năng cốt lõi:
</br> </br>
**1. Quy trình làm việc**:
Xây dựng và kiểm tra các quy trình làm việc AI mạnh mẽ trên một canvas trực quan, tận dụng tất cả các tính năng sau đây và hơn thế nữa.
https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa
**2. Hỗ trợ mô hình toàn diện**:
Tích hợp liền mạch với hàng trăm mô hình LLM độc quyền / mã nguồn mở từ hàng chục nhà cung cấp suy luận và giải pháp tự lưu trữ, bao gồm GPT, Mistral, Llama3, và bất kỳ mô hình tương thích API OpenAI nào. Danh sách đầy đủ các nhà cung cấp mô hình được hỗ trợ có thể được tìm thấy [tại đây](https://docs.dify.ai/getting-started/readme/model-providers).
![providers-v5](https://github.com/langgenius/dify/assets/13230914/5a17bdbe-097a-4100-8363-40255b70f6e3)
**3. IDE Prompt**:
Giao diện trực quan để tạo prompt, so sánh hiệu suất mô hình và thêm các tính năng bổ sung như chuyển văn bản thành giọng nói cho một ứng dụng dựa trên trò chuyện.
**4. Mô hình RAG**:
Khả năng RAG mở rộng bao gồm mọi thứ từ nhập tài liệu đến truy xuất, với hỗ trợ sẵn có cho việc trích xuất văn bản từ PDF, PPT và các định dạng tài liệu phổ biến khác.
**5. Khả năng tác nhân**:
Bạn có thể định nghĩa các tác nhân dựa trên LLM Function Calling hoặc ReAct, và thêm các công cụ được xây dựng sẵn hoặc tùy chỉnh cho tác nhân. Dify cung cấp hơn 50 công cụ tích hợp sẵn cho các tác nhân AI, như Google Search, DALL·E, Stable Diffusion và WolframAlpha.
**6. LLMOps**:
Giám sát và phân tích nhật ký và hiệu suất ứng dụng theo thời gian. Bạn có thể liên tục cải thiện prompt, bộ dữ liệu và mô hình dựa trên dữ liệu sản xuất và chú thích.
**7. Backend-as-a-Service**:
Tất cả các dịch vụ của Dify đều đi kèm với các API tương ứng, vì vậy bạn có thể dễ dàng tích hợp Dify vào logic kinh doanh của riêng mình.
## So sánh tính năng
<table style="width: 100%;">
<tr>
<th align="center">Tính năng</th>
<th align="center">Dify.AI</th>
<th align="center">LangChain</th>
<th align="center">Flowise</th>
<th align="center">OpenAI Assistants API</th>
</tr>
<tr>
<td align="center">Phương pháp lập trình</td>
<td align="center">Hướng API + Ứng dụng</td>
<td align="center">Mã Python</td>
<td align="center">Hướng ứng dụng</td>
<td align="center">Hướng API</td>
</tr>
<tr>
<td align="center">LLMs được hỗ trợ</td>
<td align="center">Đa dạng phong phú</td>
<td align="center">Đa dạng phong phú</td>
<td align="center">Đa dạng phong phú</td>
<td align="center">Chỉ OpenAI</td>
</tr>
<tr>
<td align="center">RAG Engine</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Agent</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Quy trình làm việc</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Khả năng quan sát</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Tính năng doanh nghiệp (SSO/Kiểm soát truy cập)</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Triển khai cục bộ</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
</table>
## Sử dụng Dify
- **Cloud </br>**
Chúng tôi lưu trữ dịch vụ [Dify Cloud](https://dify.ai) cho bất kỳ ai muốn thử mà không cần cài đặt. Nó cung cấp tất cả các khả năng của phiên bản tự triển khai và bao gồm 200 lượt gọi GPT-4 miễn phí trong gói sandbox.
- **Tự triển khai Dify Community Edition</br>**
Nhanh chóng chạy Dify trong môi trường của bạn với [hướng dẫn bắt đầu](#quick-start) này.
Sử dụng [tài liệu](https://docs.dify.ai) của chúng tôi để tham khảo thêm và nhận hướng dẫn chi tiết hơn.
- **Dify cho doanh nghiệp / tổ chức</br>**
Chúng tôi cung cấp các tính năng bổ sung tập trung vào doanh nghiệp. [Ghi lại câu hỏi của bạn cho chúng tôi thông qua chatbot này](https://udify.app/chat/22L1zSxg6yW1cWQg) hoặc [gửi email cho chúng tôi](mailto:business@dify.ai?subject=[GitHub]Business%20License%20Inquiry) để thảo luận về nhu cầu doanh nghiệp. </br>
> Đối với các công ty khởi nghiệp và doanh nghiệp nhỏ sử dụng AWS, hãy xem [Dify Premium trên AWS Marketplace](https://aws.amazon.com/marketplace/pp/prodview-t22mebxzwjhu6) và triển khai nó vào AWS VPC của riêng bạn chỉ với một cú nhấp chuột. Đây là một AMI giá cả phải chăng với tùy chọn tạo ứng dụng với logo và thương hiệu tùy chỉnh.
## Luôn cập nhật
Yêu thích Dify trên GitHub và được thông báo ngay lập tức về các bản phát hành mới.
![star-us](https://github.com/langgenius/dify/assets/13230914/b823edc1-6388-4e25-ad45-2f6b187adbb4)
## Bắt đầu nhanh
> Trước khi cài đặt Dify, hãy đảm bảo máy của bạn đáp ứng các yêu cầu hệ thống tối thiểu sau:
>
>- CPU >= 2 Core
>- RAM >= 4GB
</br>
Cách dễ nhất để khởi động máy chủ Dify là chạy tệp [docker-compose.yml](docker/docker-compose.yaml) của chúng tôi. Trước khi chạy lệnh cài đặt, hãy đảm bảo rằng [Docker](https://docs.docker.com/get-docker/) và [Docker Compose](https://docs.docker.com/compose/install/) đã được cài đặt trên máy của bạn:
```bash
cd docker
cp .env.example .env
docker compose up -d
```
Sau khi chạy, bạn có thể truy cập bảng điều khiển Dify trong trình duyệt của bạn tại [http://localhost/install](http://localhost/install) và bắt đầu quá trình khởi tạo.
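Nếu muốn kiểm tra nhanh các container có đang chạy hay không, bạn có thể dùng lệnh sau (ví dụ minh họa, chạy trong thư mục `docker`):
```bash
cd docker
docker compose ps
```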
> Nếu bạn muốn đóng góp cho Dify hoặc phát triển thêm, hãy tham khảo [hướng dẫn triển khai từ mã nguồn](https://docs.dify.ai/getting-started/install-self-hosted/local-source-code) của chúng tôi
## Các bước tiếp theo
Nếu bạn cần tùy chỉnh cấu hình, vui lòng tham khảo các nhận xét trong tệp [.env.example](docker/.env.example) của chúng tôi và cập nhật các giá trị tương ứng trong tệp `.env` của bạn. Ngoài ra, bạn có thể cần điều chỉnh tệp `docker-compose.yaml`, chẳng hạn như thay đổi phiên bản hình ảnh, ánh xạ cổng hoặc gắn kết khối lượng, dựa trên môi trường triển khai cụ thể và yêu cầu của bạn. Sau khi thực hiện bất kỳ thay đổi nào, vui lòng chạy lại `docker-compose up -d`. Bạn có thể tìm thấy danh sách đầy đủ các biến môi trường có sẵn [tại đây](https://docs.dify.ai/getting-started/install-self-hosted/environments).
Nếu bạn muốn cấu hình một cài đặt có độ sẵn sàng cao, có các [Helm Charts](https://helm.sh/) và tệp YAML do cộng đồng đóng góp cho phép Dify được triển khai trên Kubernetes.
- [Helm Chart bởi @LeoQuote](https://github.com/douban/charts/tree/master/charts/dify)
- [Helm Chart bởi @BorisPolonsky](https://github.com/BorisPolonsky/dify-helm)
- [Tệp YAML bởi @Winson-030](https://github.com/Winson-030/dify-kubernetes)
#### Sử dụng Terraform để Triển khai
##### Azure Global
Triển khai Dify lên Azure chỉ với một cú nhấp chuột bằng cách sử dụng [terraform](https://www.terraform.io/).
- [Azure Terraform bởi @nikawang](https://github.com/nikawang/dify-azure-terraform)
## Đóng góp
Đối với những người muốn đóng góp mã, xem [Hướng dẫn Đóng góp](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md) của chúng tôi.
Đồng thời, vui lòng xem xét hỗ trợ Dify bằng cách chia sẻ nó trên mạng xã hội và tại các sự kiện và hội nghị.
> Chúng tôi đang tìm kiếm người đóng góp để giúp dịch Dify sang các ngôn ngữ khác ngoài tiếng Trung hoặc tiếng Anh. Nếu bạn quan tâm đến việc giúp đỡ, vui lòng xem [README i18n](https://github.com/langgenius/dify/blob/main/web/i18n/README.md) để biết thêm thông tin và để lại bình luận cho chúng tôi trong kênh `global-users` của [Máy chủ Cộng đồng Discord](https://discord.gg/8Tpq4AcN9c) của chúng tôi.
**Người đóng góp**
<a href="https://github.com/langgenius/dify/graphs/contributors">
<img src="https://contrib.rocks/image?repo=langgenius/dify" />
</a>
## Cộng đồng & liên hệ
* [Thảo luận GitHub](https://github.com/langgenius/dify/discussions). Tốt nhất cho: chia sẻ phản hồi và đặt câu hỏi.
* [Vấn đề GitHub](https://github.com/langgenius/dify/issues). Tốt nhất cho: lỗi bạn gặp phải khi sử dụng Dify.AI và đề xuất tính năng. Xem [Hướng dẫn Đóng góp](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md) của chúng tôi.
* [Discord](https://discord.gg/FngNHpbcY7). Tốt nhất cho: chia sẻ ứng dụng của bạn và giao lưu với cộng đồng.
* [Twitter](https://twitter.com/dify_ai). Tốt nhất cho: chia sẻ ứng dụng của bạn và giao lưu với cộng đồng.
## Lịch sử Yêu thích
[![Biểu đồ Lịch sử Yêu thích](https://api.star-history.com/svg?repos=langgenius/dify&type=Date)](https://star-history.com/#langgenius/dify&Date)
## Tiết lộ bảo mật
Để bảo vệ quyền riêng tư của bạn, vui lòng tránh đăng các vấn đề bảo mật trên GitHub. Thay vào đó, hãy gửi câu hỏi của bạn đến security@dify.ai và chúng tôi sẽ cung cấp cho bạn câu trả lời chi tiết hơn.
## Giấy phép
Kho lưu trữ này có sẵn theo [Giấy phép Mã nguồn Mở Dify](LICENSE), về cơ bản là Apache 2.0 với một vài hạn chế bổ sung.


@ -8,4 +8,7 @@ logs
*.log*
# jetbrains
.idea
.idea
# venv
.venv


@ -1,6 +1,3 @@
# Server Edition
EDITION=SELF_HOSTED
# Your App secret key will be used for securely signing the session cookie
# Make sure you are changing this key for your deployment with a strong key.
# You can generate a strong key using `openssl rand -base64 42`.
@ -20,6 +17,9 @@ APP_WEB_URL=http://127.0.0.1:3000
# Files URL
FILES_URL=http://127.0.0.1:5001
# The time in seconds after the signature is rejected
FILES_ACCESS_TIMEOUT=300
# celery configuration
CELERY_BROKER_URL=redis://:difyai123456@localhost:6379/1
@ -39,9 +39,10 @@ DB_DATABASE=dify
# Storage configuration
# use for store upload files, private keys...
# storage type: local, s3, azure-blob
# storage type: local, s3, azure-blob, google-storage, tencent-cos, huawei-obs, volcengine-tos
STORAGE_TYPE=local
STORAGE_LOCAL_PATH=storage
S3_USE_AWS_MANAGED_IAM=false
S3_ENDPOINT=https://your-bucket-name.storage.s3.cloudflare.com
S3_BUCKET_NAME=your-bucket-name
S3_ACCESS_KEY=your-access-key
@ -52,12 +53,51 @@ AZURE_BLOB_ACCOUNT_NAME=your-account-name
AZURE_BLOB_ACCOUNT_KEY=your-account-key
AZURE_BLOB_CONTAINER_NAME=your-container-name
AZURE_BLOB_ACCOUNT_URL=https://<your_account_name>.blob.core.windows.net
# Aliyun oss Storage configuration
ALIYUN_OSS_BUCKET_NAME=your-bucket-name
ALIYUN_OSS_ACCESS_KEY=your-access-key
ALIYUN_OSS_SECRET_KEY=your-secret-key
ALIYUN_OSS_ENDPOINT=your-endpoint
ALIYUN_OSS_AUTH_VERSION=v1
ALIYUN_OSS_REGION=your-region
# Don't start with '/'. OSS doesn't support leading slash in object names.
ALIYUN_OSS_PATH=your-path
# Google Storage configuration
GOOGLE_STORAGE_BUCKET_NAME=your-bucket-name
GOOGLE_STORAGE_SERVICE_ACCOUNT_JSON_BASE64=your-google-service-account-json-base64-string
# Tencent COS Storage configuration
TENCENT_COS_BUCKET_NAME=your-bucket-name
TENCENT_COS_SECRET_KEY=your-secret-key
TENCENT_COS_SECRET_ID=your-secret-id
TENCENT_COS_REGION=your-region
TENCENT_COS_SCHEME=your-scheme
# Huawei OBS Storage Configuration
HUAWEI_OBS_BUCKET_NAME=your-bucket-name
HUAWEI_OBS_SECRET_KEY=your-secret-key
HUAWEI_OBS_ACCESS_KEY=your-access-key
HUAWEI_OBS_SERVER=your-server-url
# OCI Storage configuration
OCI_ENDPOINT=your-endpoint
OCI_BUCKET_NAME=your-bucket-name
OCI_ACCESS_KEY=your-access-key
OCI_SECRET_KEY=your-secret-key
OCI_REGION=your-region
# Volcengine tos Storage configuration
VOLCENGINE_TOS_ENDPOINT=your-endpoint
VOLCENGINE_TOS_BUCKET_NAME=your-bucket-name
VOLCENGINE_TOS_ACCESS_KEY=your-access-key
VOLCENGINE_TOS_SECRET_KEY=your-secret-key
VOLCENGINE_TOS_REGION=your-region
# CORS configuration
WEB_API_CORS_ALLOW_ORIGINS=http://127.0.0.1:3000,*
CONSOLE_CORS_ALLOW_ORIGINS=http://127.0.0.1:3000,*
# Vector database configuration, support: weaviate, qdrant, milvus, relyt
# Vector database configuration, support: weaviate, qdrant, milvus, myscale, relyt, pgvecto_rs, pgvector, chroma, opensearch, tidb_vector
VECTOR_STORE=weaviate
# Weaviate configuration
@ -70,13 +110,22 @@ WEAVIATE_BATCH_SIZE=100
QDRANT_URL=http://localhost:6333
QDRANT_API_KEY=difyai123456
QDRANT_CLIENT_TIMEOUT=20
QDRANT_GRPC_ENABLED=false
QDRANT_GRPC_PORT=6334
# Milvus configuration
MILVUS_HOST=127.0.0.1
MILVUS_PORT=19530
MILVUS_URI=http://127.0.0.1:19530
MILVUS_TOKEN=
MILVUS_USER=root
MILVUS_PASSWORD=Milvus
MILVUS_SECURE=false
# MyScale configuration
MYSCALE_HOST=127.0.0.1
MYSCALE_PORT=8123
MYSCALE_USER=default
MYSCALE_PASSWORD=
MYSCALE_DATABASE=default
MYSCALE_FTS_PARAMS=
# Relyt configuration
RELYT_HOST=127.0.0.1
@ -85,6 +134,67 @@ RELYT_USER=postgres
RELYT_PASSWORD=postgres
RELYT_DATABASE=postgres
# Tencent configuration
TENCENT_VECTOR_DB_URL=http://127.0.0.1
TENCENT_VECTOR_DB_API_KEY=dify
TENCENT_VECTOR_DB_TIMEOUT=30
TENCENT_VECTOR_DB_USERNAME=dify
TENCENT_VECTOR_DB_DATABASE=dify
TENCENT_VECTOR_DB_SHARD=1
TENCENT_VECTOR_DB_REPLICAS=2
# ElasticSearch configuration
ELASTICSEARCH_HOST=127.0.0.1
ELASTICSEARCH_PORT=9200
ELASTICSEARCH_USERNAME=elastic
ELASTICSEARCH_PASSWORD=elastic
# PGVECTO_RS configuration
PGVECTO_RS_HOST=localhost
PGVECTO_RS_PORT=5431
PGVECTO_RS_USER=postgres
PGVECTO_RS_PASSWORD=difyai123456
PGVECTO_RS_DATABASE=postgres
# PGVector configuration
PGVECTOR_HOST=127.0.0.1
PGVECTOR_PORT=5433
PGVECTOR_USER=postgres
PGVECTOR_PASSWORD=postgres
PGVECTOR_DATABASE=postgres
# Tidb Vector configuration
TIDB_VECTOR_HOST=xxx.eu-central-1.xxx.aws.tidbcloud.com
TIDB_VECTOR_PORT=4000
TIDB_VECTOR_USER=xxx.root
TIDB_VECTOR_PASSWORD=xxxxxx
TIDB_VECTOR_DATABASE=dify
# Chroma configuration
CHROMA_HOST=127.0.0.1
CHROMA_PORT=8000
CHROMA_TENANT=default_tenant
CHROMA_DATABASE=default_database
CHROMA_AUTH_PROVIDER=chromadb.auth.token_authn.TokenAuthenticationServerProvider
CHROMA_AUTH_CREDENTIALS=difyai123456
# AnalyticDB configuration
ANALYTICDB_KEY_ID=your-ak
ANALYTICDB_KEY_SECRET=your-sk
ANALYTICDB_REGION_ID=cn-hangzhou
ANALYTICDB_INSTANCE_ID=gp-ab123456
ANALYTICDB_ACCOUNT=testaccount
ANALYTICDB_PASSWORD=testpassword
ANALYTICDB_NAMESPACE=dify
ANALYTICDB_NAMESPACE_PASSWORD=difypassword
# OpenSearch configuration
OPENSEARCH_HOST=127.0.0.1
OPENSEARCH_PORT=9200
OPENSEARCH_USER=admin
OPENSEARCH_PASSWORD=admin
OPENSEARCH_SECURE=true
# Upload configuration
UPLOAD_FILE_SIZE_LIMIT=15
UPLOAD_FILE_BATCH_LIMIT=5
@ -92,6 +202,7 @@ UPLOAD_IMAGE_FILE_SIZE_LIMIT=10
# Model Configuration
MULTIMODAL_SEND_IMAGE_FORMAT=base64
PROMPT_GENERATION_MAX_TOKENS=512
# Mail configuration, support: resend, smtp
MAIL_TYPE=
@ -100,10 +211,11 @@ RESEND_API_KEY=
RESEND_API_URL=https://api.resend.com
# smtp configuration
SMTP_SERVER=smtp.gmail.com
SMTP_PORT=587
SMTP_PORT=465
SMTP_USERNAME=123
SMTP_PASSWORD=abc
SMTP_USE_TLS=false
SMTP_USE_TLS=true
SMTP_OPPORTUNISTIC_TLS=false
# Sentry configuration
SENTRY_DSN=
@ -118,30 +230,13 @@ NOTION_CLIENT_SECRET=you-client-secret
NOTION_CLIENT_ID=you-client-id
NOTION_INTERNAL_SECRET=you-internal-secret
# Hosted Model Credentials
HOSTED_OPENAI_API_KEY=
HOSTED_OPENAI_API_BASE=
HOSTED_OPENAI_API_ORGANIZATION=
HOSTED_OPENAI_TRIAL_ENABLED=false
HOSTED_OPENAI_QUOTA_LIMIT=200
HOSTED_OPENAI_PAID_ENABLED=false
HOSTED_AZURE_OPENAI_ENABLED=false
HOSTED_AZURE_OPENAI_API_KEY=
HOSTED_AZURE_OPENAI_API_BASE=
HOSTED_AZURE_OPENAI_QUOTA_LIMIT=200
HOSTED_ANTHROPIC_API_BASE=
HOSTED_ANTHROPIC_API_KEY=
HOSTED_ANTHROPIC_TRIAL_ENABLED=false
HOSTED_ANTHROPIC_QUOTA_LIMIT=600000
HOSTED_ANTHROPIC_PAID_ENABLED=false
ETL_TYPE=dify
UNSTRUCTURED_API_URL=
UNSTRUCTURED_API_KEY=
SSRF_PROXY_HTTP_URL=
SSRF_PROXY_HTTPS_URL=
SSRF_DEFAULT_MAX_RETRIES=3
BATCH_UPLOAD_LIMIT=10
KEYWORD_DATA_SOURCE_TYPE=database
@ -160,3 +255,38 @@ CODE_MAX_NUMBER_ARRAY_LENGTH=1000
# API Tool configuration
API_TOOL_DEFAULT_CONNECT_TIMEOUT=10
API_TOOL_DEFAULT_READ_TIMEOUT=60
# HTTP Node configuration
HTTP_REQUEST_MAX_CONNECT_TIMEOUT=300
HTTP_REQUEST_MAX_READ_TIMEOUT=600
HTTP_REQUEST_MAX_WRITE_TIMEOUT=600
HTTP_REQUEST_NODE_MAX_BINARY_SIZE=10485760
HTTP_REQUEST_NODE_MAX_TEXT_SIZE=1048576
# Log file path
LOG_FILE=
# Indexing configuration
INDEXING_MAX_SEGMENTATION_TOKENS_LENGTH=1000
# Workflow runtime configuration
WORKFLOW_MAX_EXECUTION_STEPS=500
WORKFLOW_MAX_EXECUTION_TIME=1200
WORKFLOW_CALL_MAX_DEPTH=5
# App configuration
APP_MAX_EXECUTION_TIME=1200
APP_MAX_ACTIVE_REQUESTS=0
# Celery beat configuration
CELERY_BEAT_SCHEDULER_TIME=1
# Position configuration
POSITION_TOOL_PINS=
POSITION_TOOL_INCLUDES=
POSITION_TOOL_EXCLUDES=
POSITION_PROVIDER_PINS=
POSITION_PROVIDER_INCLUDES=
POSITION_PROVIDER_EXCLUDES=

api/.idea/icon.png generated Normal file
Binary file not shown.

api/.idea/vcs.xml generated Normal file

@ -0,0 +1,17 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="IssueNavigationConfiguration">
<option name="links">
<list>
<IssueNavigationLink>
<option name="issueRegexp" value="#(\d+)" />
<option name="linkRegexp" value="https://github.com/langgenius/dify/issues/$1" />
</IssueNavigationLink>
</list>
</option>
</component>
<component name="VcsDirectoryMappings">
<mapping directory="" vcs="Git" />
<mapping directory="$PROJECT_DIR$/.." vcs="Git" />
</component>
</project>


@ -1,42 +1,54 @@
{
// Use IntelliSense to learn about possible attributes.
// Hover to view descriptions of existing attributes.
// For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
"version": "0.2.0",
"configurations": [
{
"name": "Python: Celery",
"type": "debugpy",
"request": "launch",
"module": "celery",
"justMyCode": true,
"args": ["-A", "app.celery", "worker", "-P", "gevent", "-c", "1", "--loglevel", "info", "-Q", "dataset,generation,mail"],
"envFile": "${workspaceFolder}/.env",
"env": {
"FLASK_APP": "app.py",
"FLASK_DEBUG": "1",
"GEVENT_SUPPORT": "True"
},
"console": "integratedTerminal"
},
{
"name": "Python: Flask",
"type": "debugpy",
"request": "launch",
"python": "${workspaceFolder}/.venv/bin/python",
"cwd": "${workspaceFolder}",
"envFile": ".env",
"module": "flask",
"justMyCode": true,
"jinja": true,
"env": {
"FLASK_APP": "app.py",
"GEVENT_SUPPORT": "True"
},
"args": [
"run",
"--host=0.0.0.0",
"--port=5001"
]
},
{
"name": "Python: Celery",
"type": "debugpy",
"request": "launch",
"python": "${workspaceFolder}/.venv/bin/python",
"cwd": "${workspaceFolder}",
"module": "celery",
"justMyCode": true,
"envFile": ".env",
"console": "integratedTerminal",
"env": {
"FLASK_APP": "app.py",
"FLASK_DEBUG": "1",
"GEVENT_SUPPORT": "True"
},
"args": [
"run",
"--host=0.0.0.0",
"--port=5001",
"--debug"
],
"jinja": true,
"justMyCode": true
}
"-A",
"app.celery",
"worker",
"-P",
"gevent",
"-c",
"1",
"--loglevel",
"info",
"-Q",
"dataset,generation,mail,ops_trace,app_deletion"
]
},
]
}


@ -1,49 +1,81 @@
# base image
FROM python:3.10-slim-bookworm AS base
LABEL maintainer="takatost@gmail.com"
WORKDIR /app/api
# install packages
FROM base as packages
# Install Poetry
ENV POETRY_VERSION=1.8.3
# if you located in China, you can use aliyun mirror to speed up
# RUN pip install --no-cache-dir poetry==${POETRY_VERSION} -i https://mirrors.aliyun.com/pypi/simple/
RUN pip install --no-cache-dir poetry==${POETRY_VERSION}
# Configure Poetry
ENV POETRY_CACHE_DIR=/tmp/poetry_cache
ENV POETRY_NO_INTERACTION=1
ENV POETRY_VIRTUALENVS_IN_PROJECT=true
ENV POETRY_VIRTUALENVS_CREATE=true
ENV POETRY_REQUESTS_TIMEOUT=15
FROM base AS packages
# if you located in China, you can use aliyun mirror to speed up
# RUN sed -i 's@deb.debian.org@mirrors.aliyun.com@g' /etc/apt/sources.list.d/debian.sources
RUN apt-get update \
&& apt-get install -y --no-install-recommends gcc g++ libc-dev libffi-dev libgmp-dev libmpfr-dev libmpc-dev
COPY requirements.txt /requirements.txt
RUN --mount=type=cache,target=/root/.cache/pip \
pip install --prefix=/pkg -r requirements.txt
# Install Python dependencies
COPY pyproject.toml poetry.lock ./
RUN poetry install --sync --no-cache --no-root
# production stage
FROM base AS production
ENV FLASK_APP app.py
ENV EDITION SELF_HOSTED
ENV DEPLOY_ENV PRODUCTION
ENV CONSOLE_API_URL http://127.0.0.1:5001
ENV CONSOLE_WEB_URL http://127.0.0.1:3000
ENV SERVICE_API_URL http://127.0.0.1:5001
ENV APP_WEB_URL http://127.0.0.1:3000
ENV FLASK_APP=app.py
ENV EDITION=SELF_HOSTED
ENV DEPLOY_ENV=PRODUCTION
ENV CONSOLE_API_URL=http://127.0.0.1:5001
ENV CONSOLE_WEB_URL=http://127.0.0.1:3000
ENV SERVICE_API_URL=http://127.0.0.1:5001
ENV APP_WEB_URL=http://127.0.0.1:3000
EXPOSE 5001
# set timezone
ENV TZ UTC
ENV TZ=UTC
WORKDIR /app/api
RUN apt-get update \
&& apt-get install -y --no-install-recommends curl wget vim nodejs ffmpeg libgmp-dev libmpfr-dev libmpc-dev \
&& apt-get autoremove \
&& apt-get install -y --no-install-recommends curl nodejs libgmp-dev libmpfr-dev libmpc-dev \
# if you located in China, you can use aliyun mirror to speed up
# && echo "deb http://mirrors.aliyun.com/debian testing main" > /etc/apt/sources.list \
&& echo "deb http://deb.debian.org/debian testing main" > /etc/apt/sources.list \
&& apt-get update \
# For Security
&& apt-get install -y --no-install-recommends zlib1g=1:1.3.dfsg+really1.3.1-1 expat=2.6.3-1 libldap-2.5-0=2.5.18+dfsg-3 perl=5.38.2-5 libsqlite3-0=3.46.0-1 \
&& apt-get autoremove -y \
&& rm -rf /var/lib/apt/lists/*
COPY --from=packages /pkg /usr/local
# Copy Python environment and packages
ENV VIRTUAL_ENV=/app/api/.venv
COPY --from=packages ${VIRTUAL_ENV} ${VIRTUAL_ENV}
ENV PATH="${VIRTUAL_ENV}/bin:${PATH}"
# Download nltk data
RUN python -c "import nltk; nltk.download('punkt'); nltk.download('averaged_perceptron_tagger')"
# Copy source code
COPY . /app/api/
# Copy entrypoint
COPY docker/entrypoint.sh /entrypoint.sh
RUN chmod +x /entrypoint.sh
ARG COMMIT_SHA
ENV COMMIT_SHA ${COMMIT_SHA}
ENTRYPOINT ["/bin/bash", "/entrypoint.sh"]
ARG COMMIT_SHA
ENV COMMIT_SHA=${COMMIT_SHA}
ENTRYPOINT ["/bin/bash", "/entrypoint.sh"]


@ -2,69 +2,88 @@
## Usage
> [!IMPORTANT]
> In the v0.6.12 release, we deprecated `pip` as the package management tool for Dify API Backend service and replaced it with `poetry`.
1. Start the docker-compose stack
The backend requires some middleware, including PostgreSQL, Redis, and Weaviate, which can be started together using `docker-compose`.
```bash
cd ../docker
docker-compose -f docker-compose.middleware.yaml -p dify up -d
cp middleware.env.example middleware.env
# change the profile to other vector database if you are not using weaviate
docker compose -f docker-compose.middleware.yaml --profile weaviate -p dify up -d
cd ../api
```
2. Copy `.env.example` to `.env`
3. Generate a `SECRET_KEY` in the `.env` file.
```bash
```bash for Linux
sed -i "/^SECRET_KEY=/c\SECRET_KEY=$(openssl rand -base64 42)" .env
```
4. If you use Anaconda, create a new environment and activate it
```bash
conda create --name dify python=3.10
conda activate dify
```bash for Mac
secret_key=$(openssl rand -base64 42)
sed -i '' "/^SECRET_KEY=/c\\
SECRET_KEY=${secret_key}" .env
```
4. Create environment.
Dify API service uses [Poetry](https://python-poetry.org/docs/) to manage dependencies. You can execute `poetry shell` to activate the environment.
5. Install dependencies
```bash
pip install -r requirements.txt
poetry env use 3.10
poetry install
```
If a contributor has not yet updated the dependencies in `pyproject.toml`, you can run the following shell commands instead.
```bash
poetry shell # activate current environment
poetry add $(cat requirements.txt) # install dependencies of production and update pyproject.toml
poetry add $(cat requirements-dev.txt) --group dev # install dependencies of development and update pyproject.toml
```
6. Run migrate
Before the first launch, migrate the database to the latest version.
```bash
flask db upgrade
poetry run python -m flask db upgrade
```
⚠️ If you encounter problems with jieba, for example
7. Start backend
```
> flask db upgrade
Error: While importing 'app', an ImportError was raised:
```
Please run the following command instead.
```
pip install -r requirements.txt --upgrade --force-reinstall
```
7. Start backend:
```bash
flask run --host 0.0.0.0 --port=5001 --debug
poetry run python -m flask run --host 0.0.0.0 --port=5001 --debug
```
8. Setup your application by visiting http://localhost:5001/console/api/setup or other apis...
9. If you need to debug local async processing, please start the worker service by running
`celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail`.
The started celery app handles the async tasks, e.g. dataset importing and documents indexing.
8. Start Dify [web](../web) service.
9. Setup your application by visiting `http://localhost:3000`...
10. If you need to debug local async processing, please start the worker service.
```bash
poetry run python -m celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail,ops_trace,app_deletion
```
The started celery app handles the async tasks, e.g. dataset importing and documents indexing.
## Testing
1. Install dependencies for both the backend and the test environment
```bash
pip install -r requirements.txt -r requirements-dev.txt
```
2. Run the tests locally with mocked system environment variables in `tool.pytest_env` section in `pyproject.toml`
```bash
dev/pytest/pytest_all_tests.sh
poetry install --with dev
```
2. Run the tests locally with mocked system environment variables in `tool.pytest_env` section in `pyproject.toml`
```bash
cd ../
poetry run -C api bash dev/pytest/pytest_all_tests.sh
```
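If you only need to iterate on a subset of the tests rather than the full suite, something like the following should also work (the test path below is illustrative; point it at the module you are working on):
```bash
cd api
poetry run pytest tests/unit_tests -x -q
```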


@ -1,28 +1,29 @@
import os
import sys
from logging.handlers import RotatingFileHandler
if not os.environ.get("DEBUG") or os.environ.get("DEBUG").lower() != 'true':
if os.environ.get("DEBUG", "false").lower() != "true":
from gevent import monkey
monkey.patch_all()
# if os.environ.get("VECTOR_STORE") == 'milvus':
import grpc.experimental.gevent
grpc.experimental.gevent.init_gevent()
import json
import logging
import sys
import threading
import time
import warnings
from logging.handlers import RotatingFileHandler
from flask import Flask, Response, request
from flask_cors import CORS
from werkzeug.exceptions import Unauthorized
import contexts
from commands import register_commands
from config import CloudEditionConfig, Config
from configs import dify_config
# DO NOT REMOVE BELOW
from events import event_handlers
@ -42,6 +43,8 @@ from extensions import (
from extensions.ext_database import db
from extensions.ext_login import login_manager
from libs.passport import PassportService
# TODO: Find a way to avoid importing models here
from models import account, dataset, model, source, task, tool, tools, web
from services.account_service import AccountService
@ -54,7 +57,7 @@ warnings.simplefilter("ignore", ResourceWarning)
if os.name == "nt":
os.system('tzutil /s "UTC"')
else:
os.environ['TZ'] = 'UTC'
os.environ["TZ"] = "UTC"
time.tzset()
@ -67,7 +70,7 @@ class DifyApp(Flask):
# -------------
config_type = os.getenv('EDITION', default='SELF_HOSTED') # ce edition first
config_type = os.getenv("EDITION", default="SELF_HOSTED") # ce edition first
# ----------------------------
@ -75,21 +78,33 @@ config_type = os.getenv('EDITION', default='SELF_HOSTED') # ce edition first
# ----------------------------
def create_app(test_config=None) -> Flask:
app = DifyApp(__name__)
def create_flask_app_with_configs() -> Flask:
"""
create a raw flask app
with configs loaded from .env file
"""
dify_app = DifyApp(__name__)
dify_app.config.from_mapping(dify_config.model_dump())
if test_config:
app.config.from_object(test_config)
else:
if config_type == "CLOUD":
app.config.from_object(CloudEditionConfig())
else:
app.config.from_object(Config())
# populate configs into system environment variables
for key, value in dify_app.config.items():
if isinstance(value, str):
os.environ[key] = value
elif isinstance(value, int | float | bool):
os.environ[key] = str(value)
elif value is None:
os.environ[key] = ""
app.secret_key = app.config['SECRET_KEY']
return dify_app
def create_app() -> Flask:
app = create_flask_app_with_configs()
app.secret_key = app.config["SECRET_KEY"]
log_handlers = None
log_file = app.config.get('LOG_FILE')
log_file = app.config.get("LOG_FILE")
if log_file:
log_dir = os.path.dirname(log_file)
os.makedirs(log_dir, exist_ok=True)
@ -97,17 +112,31 @@ def create_app(test_config=None) -> Flask:
RotatingFileHandler(
filename=log_file,
maxBytes=1024 * 1024 * 1024,
backupCount=5
backupCount=5,
),
logging.StreamHandler(sys.stdout)
logging.StreamHandler(sys.stdout),
]
logging.basicConfig(
level=app.config.get('LOG_LEVEL'),
format=app.config.get('LOG_FORMAT'),
datefmt=app.config.get('LOG_DATEFORMAT'),
handlers=log_handlers
)
logging.basicConfig(
level=app.config.get("LOG_LEVEL"),
format=app.config.get("LOG_FORMAT"),
datefmt=app.config.get("LOG_DATEFORMAT"),
handlers=log_handlers,
force=True,
)
log_tz = app.config.get("LOG_TZ")
if log_tz:
from datetime import datetime
import pytz
timezone = pytz.timezone(log_tz)
def time_converter(seconds):
return datetime.utcfromtimestamp(seconds).astimezone(timezone).timetuple()
for handler in logging.root.handlers:
handler.formatter.converter = time_converter
initialize_extensions(app)
register_blueprints(app)
register_commands(app)
@ -135,36 +164,39 @@ def initialize_extensions(app):
@login_manager.request_loader
def load_user_from_request(request_from_flask_login):
"""Load user based on the request."""
if request.blueprint in ['console', 'inner_api']:
# Check if the user_id contains a dot, indicating the old format
auth_header = request.headers.get('Authorization', '')
if not auth_header:
auth_token = request.args.get('_token')
if not auth_token:
raise Unauthorized('Invalid Authorization token.')
else:
if ' ' not in auth_header:
raise Unauthorized('Invalid Authorization header format. Expected \'Bearer <api-key>\' format.')
auth_scheme, auth_token = auth_header.split(None, 1)
auth_scheme = auth_scheme.lower()
if auth_scheme != 'bearer':
raise Unauthorized('Invalid Authorization header format. Expected \'Bearer <api-key>\' format.')
decoded = PassportService().verify(auth_token)
user_id = decoded.get('user_id')
return AccountService.load_user(user_id)
else:
if request.blueprint not in {"console", "inner_api"}:
return None
# Check if the user_id contains a dot, indicating the old format
auth_header = request.headers.get("Authorization", "")
if not auth_header:
auth_token = request.args.get("_token")
if not auth_token:
raise Unauthorized("Invalid Authorization token.")
else:
if " " not in auth_header:
raise Unauthorized("Invalid Authorization header format. Expected 'Bearer <api-key>' format.")
auth_scheme, auth_token = auth_header.split(None, 1)
auth_scheme = auth_scheme.lower()
if auth_scheme != "bearer":
raise Unauthorized("Invalid Authorization header format. Expected 'Bearer <api-key>' format.")
decoded = PassportService().verify(auth_token)
user_id = decoded.get("user_id")
account = AccountService.load_logged_in_account(account_id=user_id, token=auth_token)
if account:
contexts.tenant_id.set(account.current_tenant_id)
return account
@login_manager.unauthorized_handler
def unauthorized_handler():
"""Handle unauthorized requests."""
return Response(json.dumps({
'code': 'unauthorized',
'message': "Unauthorized."
}), status=401, content_type="application/json")
return Response(
json.dumps({"code": "unauthorized", "message": "Unauthorized."}),
status=401,
content_type="application/json",
)
# register blueprint routers
@ -175,38 +207,36 @@ def register_blueprints(app):
from controllers.service_api import bp as service_api_bp
from controllers.web import bp as web_bp
CORS(service_api_bp,
allow_headers=['Content-Type', 'Authorization', 'X-App-Code'],
methods=['GET', 'PUT', 'POST', 'DELETE', 'OPTIONS', 'PATCH']
)
CORS(
service_api_bp,
allow_headers=["Content-Type", "Authorization", "X-App-Code"],
methods=["GET", "PUT", "POST", "DELETE", "OPTIONS", "PATCH"],
)
app.register_blueprint(service_api_bp)
CORS(web_bp,
resources={
r"/*": {"origins": app.config['WEB_API_CORS_ALLOW_ORIGINS']}},
supports_credentials=True,
allow_headers=['Content-Type', 'Authorization', 'X-App-Code'],
methods=['GET', 'PUT', 'POST', 'DELETE', 'OPTIONS', 'PATCH'],
expose_headers=['X-Version', 'X-Env']
)
CORS(
web_bp,
resources={r"/*": {"origins": app.config["WEB_API_CORS_ALLOW_ORIGINS"]}},
supports_credentials=True,
allow_headers=["Content-Type", "Authorization", "X-App-Code"],
methods=["GET", "PUT", "POST", "DELETE", "OPTIONS", "PATCH"],
expose_headers=["X-Version", "X-Env"],
)
app.register_blueprint(web_bp)
CORS(console_app_bp,
resources={
r"/*": {"origins": app.config['CONSOLE_CORS_ALLOW_ORIGINS']}},
supports_credentials=True,
allow_headers=['Content-Type', 'Authorization'],
methods=['GET', 'PUT', 'POST', 'DELETE', 'OPTIONS', 'PATCH'],
expose_headers=['X-Version', 'X-Env']
)
CORS(
console_app_bp,
resources={r"/*": {"origins": app.config["CONSOLE_CORS_ALLOW_ORIGINS"]}},
supports_credentials=True,
allow_headers=["Content-Type", "Authorization"],
methods=["GET", "PUT", "POST", "DELETE", "OPTIONS", "PATCH"],
expose_headers=["X-Version", "X-Env"],
)
app.register_blueprint(console_app_bp)
CORS(files_bp,
allow_headers=['Content-Type'],
methods=['GET', 'PUT', 'POST', 'DELETE', 'OPTIONS', 'PATCH']
)
CORS(files_bp, allow_headers=["Content-Type"], methods=["GET", "PUT", "POST", "DELETE", "OPTIONS", "PATCH"])
app.register_blueprint(files_bp)
app.register_blueprint(inner_api_bp)
@ -216,28 +246,29 @@ def register_blueprints(app):
app = create_app()
celery = app.extensions["celery"]
if app.config['TESTING']:
if app.config.get("TESTING"):
print("App is running in TESTING mode")
@app.after_request
def after_request(response):
"""Add Version headers to the response."""
response.set_cookie('remember_token', '', expires=0)
response.headers.add('X-Version', app.config['CURRENT_VERSION'])
response.headers.add('X-Env', app.config['DEPLOY_ENV'])
response.set_cookie("remember_token", "", expires=0)
response.headers.add("X-Version", app.config["CURRENT_VERSION"])
response.headers.add("X-Env", app.config["DEPLOY_ENV"])
return response
@app.route('/health')
@app.route("/health")
def health():
return Response(json.dumps({
'status': 'ok',
'version': app.config['CURRENT_VERSION']
}), status=200, content_type="application/json")
return Response(
json.dumps({"pid": os.getpid(), "status": "ok", "version": app.config["CURRENT_VERSION"]}),
status=200,
content_type="application/json",
)
@app.route('/threads')
@app.route("/threads")
def threads():
num_threads = threading.active_count()
threads = threading.enumerate()
@ -248,30 +279,34 @@ def threads():
thread_id = thread.ident
is_alive = thread.is_alive()
thread_list.append({
'name': thread_name,
'id': thread_id,
'is_alive': is_alive
})
thread_list.append(
{
"name": thread_name,
"id": thread_id,
"is_alive": is_alive,
}
)
return {
'thread_num': num_threads,
'threads': thread_list
"pid": os.getpid(),
"thread_num": num_threads,
"threads": thread_list,
}
@app.route('/db-pool-stat')
@app.route("/db-pool-stat")
def pool_stat():
engine = db.engine
return {
'pool_size': engine.pool.size(),
'checked_in_connections': engine.pool.checkedin(),
'checked_out_connections': engine.pool.checkedout(),
'overflow_connections': engine.pool.overflow(),
'connection_timeout': engine.pool.timeout(),
'recycle_time': db.engine.pool._recycle
"pid": os.getpid(),
"pool_size": engine.pool.size(),
"checked_in_connections": engine.pool.checkedin(),
"checked_out_connections": engine.pool.checkedout(),
"overflow_connections": engine.pool.overflow(),
"connection_timeout": engine.pool.timeout(),
"recycle_time": db.engine.pool._recycle,
}
if __name__ == '__main__':
app.run(host='0.0.0.0', port=5001)
if __name__ == "__main__":
app.run(host="0.0.0.0", port=5001)
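Once the app is running with the host and port shown above, the debug endpoints added in this diff can be exercised as follows; a minimal sketch assuming a local deployment:

import requests

base = "http://localhost:5001"
print(requests.get(f"{base}/health").json())        # {"pid": ..., "status": "ok", "version": ...}
print(requests.get(f"{base}/threads").json())       # {"pid": ..., "thread_num": ..., "threads": [...]}
print(requests.get(f"{base}/db-pool-stat").json())  # SQLAlchemy connection pool statistics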


@ -1,14 +1,21 @@
import base64
import json
import logging
import secrets
from typing import Optional
import click
from flask import current_app
from werkzeug.exceptions import NotFound
from configs import dify_config
from constants.languages import languages
from core.rag.datasource.vdb.vector_factory import Vector
from core.rag.datasource.vdb.vector_type import VectorType
from core.rag.models.document import Document
from events.app_event import app_was_created
from extensions.ext_database import db
from extensions.ext_redis import redis_client
from libs.helper import email as email_validate
from libs.password import hash_password, password_pattern, valid_password
from libs.rsa import generate_key_pair
@ -17,34 +24,32 @@ from models.dataset import Dataset, DatasetCollectionBinding, DocumentSegment
from models.dataset import Document as DatasetDocument
from models.model import Account, App, AppAnnotationSetting, AppMode, Conversation, MessageAnnotation
from models.provider import Provider, ProviderModel
from services.account_service import RegisterService, TenantService
@click.command('reset-password', help='Reset the account password.')
@click.option('--email', prompt=True, help='The email address of the account whose password you need to reset')
@click.option('--new-password', prompt=True, help='the new password.')
@click.option('--password-confirm', prompt=True, help='the new password confirm.')
@click.command("reset-password", help="Reset the account password.")
@click.option("--email", prompt=True, help="The email address of the account whose password you need to reset")
@click.option("--new-password", prompt=True, help="the new password.")
@click.option("--password-confirm", prompt=True, help="the new password confirm.")
def reset_password(email, new_password, password_confirm):
"""
Reset password of owner account
Only available in SELF_HOSTED mode
"""
if str(new_password).strip() != str(password_confirm).strip():
click.echo(click.style('sorry. The two passwords do not match.', fg='red'))
click.echo(click.style("sorry. The two passwords do not match.", fg="red"))
return
account = db.session.query(Account). \
filter(Account.email == email). \
one_or_none()
account = db.session.query(Account).filter(Account.email == email).one_or_none()
if not account:
click.echo(click.style('sorry. the account: [{}] not exist .'.format(email), fg='red'))
click.echo(click.style("sorry. the account: [{}] not exist .".format(email), fg="red"))
return
try:
valid_password(new_password)
except:
click.echo(
click.style('sorry. The passwords must match {} '.format(password_pattern), fg='red'))
click.echo(click.style("sorry. The passwords must match {} ".format(password_pattern), fg="red"))
return
# generate password salt
@ -57,80 +62,87 @@ def reset_password(email, new_password, password_confirm):
account.password = base64_password_hashed
account.password_salt = base64_salt
db.session.commit()
click.echo(click.style('Congratulations!, password has been reset.', fg='green'))
click.echo(click.style("Congratulations! Password has been reset.", fg="green"))
@click.command('reset-email', help='Reset the account email.')
@click.option('--email', prompt=True, help='The old email address of the account whose email you need to reset')
@click.option('--new-email', prompt=True, help='the new email.')
@click.option('--email-confirm', prompt=True, help='the new email confirm.')
@click.command("reset-email", help="Reset the account email.")
@click.option("--email", prompt=True, help="The old email address of the account whose email you need to reset")
@click.option("--new-email", prompt=True, help="the new email.")
@click.option("--email-confirm", prompt=True, help="the new email confirm.")
def reset_email(email, new_email, email_confirm):
"""
Replace account email
:return:
"""
if str(new_email).strip() != str(email_confirm).strip():
click.echo(click.style('Sorry, new email and confirm email do not match.', fg='red'))
click.echo(click.style("Sorry, new email and confirm email do not match.", fg="red"))
return
account = db.session.query(Account). \
filter(Account.email == email). \
one_or_none()
account = db.session.query(Account).filter(Account.email == email).one_or_none()
if not account:
click.echo(click.style('sorry. the account: [{}] not exist .'.format(email), fg='red'))
click.echo(click.style("sorry. the account: [{}] not exist .".format(email), fg="red"))
return
try:
email_validate(new_email)
except:
click.echo(
click.style('sorry. {} is not a valid email. '.format(email), fg='red'))
click.echo(click.style("sorry. {} is not a valid email. ".format(email), fg="red"))
return
account.email = new_email
db.session.commit()
click.echo(click.style('Congratulations!, email has been reset.', fg='green'))
click.echo(click.style("Congratulations!, email has been reset.", fg="green"))
@click.command('reset-encrypt-key-pair', help='Reset the asymmetric key pair of workspace for encrypt LLM credentials. '
'After the reset, all LLM credentials will become invalid, '
'requiring re-entry.'
'Only support SELF_HOSTED mode.')
@click.confirmation_option(prompt=click.style('Are you sure you want to reset encrypt key pair?'
' this operation cannot be rolled back!', fg='red'))
@click.command(
"reset-encrypt-key-pair",
help="Reset the asymmetric key pair of workspace for encrypt LLM credentials. "
"After the reset, all LLM credentials will become invalid, "
"requiring re-entry."
"Only support SELF_HOSTED mode.",
)
@click.confirmation_option(
prompt=click.style(
"Are you sure you want to reset encrypt key pair? this operation cannot be rolled back!", fg="red"
)
)
def reset_encrypt_key_pair():
"""
Reset the asymmetric key pair of the workspace used to encrypt LLM credentials.
After the reset, all LLM credentials will become invalid, requiring re-entry.
Only supports SELF_HOSTED mode.
"""
if current_app.config['EDITION'] != 'SELF_HOSTED':
click.echo(click.style('Sorry, only support SELF_HOSTED mode.', fg='red'))
if dify_config.EDITION != "SELF_HOSTED":
click.echo(click.style("Sorry, only support SELF_HOSTED mode.", fg="red"))
return
tenants = db.session.query(Tenant).all()
for tenant in tenants:
if not tenant:
click.echo(click.style('Sorry, no workspace found. Please enter /install to initialize.', fg='red'))
click.echo(click.style("Sorry, no workspace found. Please enter /install to initialize.", fg="red"))
return
tenant.encrypt_public_key = generate_key_pair(tenant.id)
db.session.query(Provider).filter(Provider.provider_type == 'custom', Provider.tenant_id == tenant.id).delete()
db.session.query(Provider).filter(Provider.provider_type == "custom", Provider.tenant_id == tenant.id).delete()
db.session.query(ProviderModel).filter(ProviderModel.tenant_id == tenant.id).delete()
db.session.commit()
click.echo(click.style('Congratulations! '
'the asymmetric key pair of workspace {} has been reset.'.format(tenant.id), fg='green'))
click.echo(
click.style(
"Congratulations! The asymmetric key pair of workspace {} has been reset.".format(tenant.id),
fg="green",
)
)
@click.command('vdb-migrate', help='migrate vector db.')
@click.option('--scope', default='all', prompt=False, help='The scope of vector database to migrate, Default is All.')
@click.command("vdb-migrate", help="migrate vector db.")
@click.option("--scope", default="all", prompt=False, help="The scope of vector database to migrate, Default is All.")
def vdb_migrate(scope: str):
if scope in ['knowledge', 'all']:
if scope in {"knowledge", "all"}:
migrate_knowledge_vector_database()
if scope in ['annotation', 'all']:
if scope in {"annotation", "all"}:
migrate_annotation_vector_database()
@ -138,7 +150,7 @@ def migrate_annotation_vector_database():
"""
Migrate annotation data to the target vector database.
"""
click.echo(click.style('Start migrate annotation data.', fg='green'))
click.echo(click.style("Start migrate annotation data.", fg="green"))
create_count = 0
skipped_count = 0
total_count = 0
@ -146,167 +158,197 @@ def migrate_annotation_vector_database():
while True:
try:
# get apps info
apps = db.session.query(App).filter(
App.status == 'normal'
).order_by(App.created_at.desc()).paginate(page=page, per_page=50)
apps = (
db.session.query(App)
.filter(App.status == "normal")
.order_by(App.created_at.desc())
.paginate(page=page, per_page=50)
)
except NotFound:
break
page += 1
for app in apps:
total_count = total_count + 1
click.echo(f'Processing the {total_count} app {app.id}. '
+ f'{create_count} created, {skipped_count} skipped.')
click.echo(
f"Processing the {total_count} app {app.id}. " + f"{create_count} created, {skipped_count} skipped."
)
try:
click.echo('Create app annotation index: {}'.format(app.id))
app_annotation_setting = db.session.query(AppAnnotationSetting).filter(
AppAnnotationSetting.app_id == app.id
).first()
click.echo("Create app annotation index: {}".format(app.id))
app_annotation_setting = (
db.session.query(AppAnnotationSetting).filter(AppAnnotationSetting.app_id == app.id).first()
)
if not app_annotation_setting:
skipped_count = skipped_count + 1
click.echo('App annotation setting is disabled: {}'.format(app.id))
click.echo("App annotation setting is disabled: {}".format(app.id))
continue
# get dataset_collection_binding info
dataset_collection_binding = db.session.query(DatasetCollectionBinding).filter(
DatasetCollectionBinding.id == app_annotation_setting.collection_binding_id
).first()
dataset_collection_binding = (
db.session.query(DatasetCollectionBinding)
.filter(DatasetCollectionBinding.id == app_annotation_setting.collection_binding_id)
.first()
)
if not dataset_collection_binding:
click.echo('App annotation collection binding is not exist: {}'.format(app.id))
click.echo("App annotation collection binding is not exist: {}".format(app.id))
continue
annotations = db.session.query(MessageAnnotation).filter(MessageAnnotation.app_id == app.id).all()
dataset = Dataset(
id=app.id,
tenant_id=app.tenant_id,
indexing_technique='high_quality',
indexing_technique="high_quality",
embedding_model_provider=dataset_collection_binding.provider_name,
embedding_model=dataset_collection_binding.model_name,
collection_binding_id=dataset_collection_binding.id
collection_binding_id=dataset_collection_binding.id,
)
documents = []
if annotations:
for annotation in annotations:
document = Document(
page_content=annotation.question,
metadata={
"annotation_id": annotation.id,
"app_id": app.id,
"doc_id": annotation.id
}
metadata={"annotation_id": annotation.id, "app_id": app.id, "doc_id": annotation.id},
)
documents.append(document)
vector = Vector(dataset, attributes=['doc_id', 'annotation_id', 'app_id'])
vector = Vector(dataset, attributes=["doc_id", "annotation_id", "app_id"])
click.echo(f"Start to migrate annotation, app_id: {app.id}.")
try:
vector.delete()
click.echo(
click.style(f'Successfully delete vector index for app: {app.id}.',
fg='green'))
click.echo(click.style(f"Successfully delete vector index for app: {app.id}.", fg="green"))
except Exception as e:
click.echo(
click.style(f'Failed to delete vector index for app {app.id}.',
fg='red'))
click.echo(click.style(f"Failed to delete vector index for app {app.id}.", fg="red"))
raise e
if documents:
try:
click.echo(click.style(
f'Start to created vector index with {len(documents)} annotations for app {app.id}.',
fg='green'))
vector.create(documents)
click.echo(
click.style(f'Successfully created vector index for app {app.id}.', fg='green'))
click.style(
f"Start to created vector index with {len(documents)} annotations for app {app.id}.",
fg="green",
)
)
vector.create(documents)
click.echo(click.style(f"Successfully created vector index for app {app.id}.", fg="green"))
except Exception as e:
click.echo(click.style(f'Failed to created vector index for app {app.id}.', fg='red'))
click.echo(click.style(f"Failed to created vector index for app {app.id}.", fg="red"))
raise e
click.echo(f'Successfully migrated app annotation {app.id}.')
click.echo(f"Successfully migrated app annotation {app.id}.")
create_count += 1
except Exception as e:
click.echo(
click.style('Create app annotation index error: {} {}'.format(e.__class__.__name__, str(e)),
fg='red'))
click.style(
"Create app annotation index error: {} {}".format(e.__class__.__name__, str(e)), fg="red"
)
)
continue
click.echo(
click.style(f'Congratulations! Create {create_count} app annotation indexes, and skipped {skipped_count} apps.',
fg='green'))
click.style(
f"Congratulations! Create {create_count} app annotation indexes, and skipped {skipped_count} apps.",
fg="green",
)
)
def migrate_knowledge_vector_database():
"""
Migrate vector database data to the target vector database.
"""
click.echo(click.style('Start migrate vector db.', fg='green'))
click.echo(click.style("Start migrate vector db.", fg="green"))
create_count = 0
skipped_count = 0
total_count = 0
config = current_app.config
vector_type = config.get('VECTOR_STORE')
vector_type = dify_config.VECTOR_STORE
page = 1
while True:
try:
datasets = db.session.query(Dataset).filter(Dataset.indexing_technique == 'high_quality') \
.order_by(Dataset.created_at.desc()).paginate(page=page, per_page=50)
datasets = (
db.session.query(Dataset)
.filter(Dataset.indexing_technique == "high_quality")
.order_by(Dataset.created_at.desc())
.paginate(page=page, per_page=50)
)
except NotFound:
break
page += 1
for dataset in datasets:
total_count = total_count + 1
click.echo(f'Processing the {total_count} dataset {dataset.id}. '
+ f'{create_count} created, {skipped_count} skipped.')
click.echo(
f"Processing the {total_count} dataset {dataset.id}. {create_count} created, {skipped_count} skipped."
)
try:
click.echo('Create dataset vdb index: {}'.format(dataset.id))
click.echo("Create dataset vdb index: {}".format(dataset.id))
if dataset.index_struct_dict:
if dataset.index_struct_dict['type'] == vector_type:
if dataset.index_struct_dict["type"] == vector_type:
skipped_count = skipped_count + 1
continue
collection_name = ''
if vector_type == "weaviate":
collection_name = ""
if vector_type == VectorType.WEAVIATE:
dataset_id = dataset.id
collection_name = Dataset.gen_collection_name_by_id(dataset_id)
index_struct_dict = {
"type": 'weaviate',
"vector_store": {"class_prefix": collection_name}
}
index_struct_dict = {"type": VectorType.WEAVIATE, "vector_store": {"class_prefix": collection_name}}
dataset.index_struct = json.dumps(index_struct_dict)
elif vector_type == "qdrant":
elif vector_type == VectorType.QDRANT:
if dataset.collection_binding_id:
dataset_collection_binding = db.session.query(DatasetCollectionBinding). \
filter(DatasetCollectionBinding.id == dataset.collection_binding_id). \
one_or_none()
dataset_collection_binding = (
db.session.query(DatasetCollectionBinding)
.filter(DatasetCollectionBinding.id == dataset.collection_binding_id)
.one_or_none()
)
if dataset_collection_binding:
collection_name = dataset_collection_binding.collection_name
else:
raise ValueError('Dataset Collection Bindings is not exist!')
raise ValueError("Dataset Collection Bindings is not exist!")
else:
dataset_id = dataset.id
collection_name = Dataset.gen_collection_name_by_id(dataset_id)
index_struct_dict = {
"type": 'qdrant',
"vector_store": {"class_prefix": collection_name}
}
index_struct_dict = {"type": VectorType.QDRANT, "vector_store": {"class_prefix": collection_name}}
dataset.index_struct = json.dumps(index_struct_dict)
elif vector_type == "milvus":
elif vector_type == VectorType.MILVUS:
dataset_id = dataset.id
collection_name = Dataset.gen_collection_name_by_id(dataset_id)
index_struct_dict = {"type": VectorType.MILVUS, "vector_store": {"class_prefix": collection_name}}
dataset.index_struct = json.dumps(index_struct_dict)
elif vector_type == VectorType.RELYT:
dataset_id = dataset.id
collection_name = Dataset.gen_collection_name_by_id(dataset_id)
index_struct_dict = {"type": "relyt", "vector_store": {"class_prefix": collection_name}}
dataset.index_struct = json.dumps(index_struct_dict)
elif vector_type == VectorType.TENCENT:
dataset_id = dataset.id
collection_name = Dataset.gen_collection_name_by_id(dataset_id)
index_struct_dict = {"type": VectorType.TENCENT, "vector_store": {"class_prefix": collection_name}}
dataset.index_struct = json.dumps(index_struct_dict)
elif vector_type == VectorType.PGVECTOR:
dataset_id = dataset.id
collection_name = Dataset.gen_collection_name_by_id(dataset_id)
index_struct_dict = {"type": VectorType.PGVECTOR, "vector_store": {"class_prefix": collection_name}}
dataset.index_struct = json.dumps(index_struct_dict)
elif vector_type == VectorType.OPENSEARCH:
dataset_id = dataset.id
collection_name = Dataset.gen_collection_name_by_id(dataset_id)
index_struct_dict = {
"type": 'milvus',
"vector_store": {"class_prefix": collection_name}
"type": VectorType.OPENSEARCH,
"vector_store": {"class_prefix": collection_name},
}
dataset.index_struct = json.dumps(index_struct_dict)
elif vector_type == "relyt":
elif vector_type == VectorType.ANALYTICDB:
dataset_id = dataset.id
collection_name = Dataset.gen_collection_name_by_id(dataset_id)
index_struct_dict = {
"type": 'relyt',
"vector_store": {"class_prefix": collection_name}
"type": VectorType.ANALYTICDB,
"vector_store": {"class_prefix": collection_name},
}
dataset.index_struct = json.dumps(index_struct_dict)
elif vector_type == VectorType.ELASTICSEARCH:
dataset_id = dataset.id
index_name = Dataset.gen_collection_name_by_id(dataset_id)
index_struct_dict = {"type": "elasticsearch", "vector_store": {"class_prefix": index_name}}
dataset.index_struct = json.dumps(index_struct_dict)
else:
raise ValueError(f"Vector store {config.get('VECTOR_STORE')} is not supported.")
raise ValueError(f"Vector store {vector_type} is not supported.")
vector = Vector(dataset)
click.echo(f"Start to migrate dataset {dataset.id}.")
@ -314,29 +356,41 @@ def migrate_knowledge_vector_database():
try:
vector.delete()
click.echo(
click.style(f'Successfully delete vector index {collection_name} for dataset {dataset.id}.',
fg='green'))
click.style(
f"Successfully delete vector index {collection_name} for dataset {dataset.id}.", fg="green"
)
)
except Exception as e:
click.echo(
click.style(f'Failed to delete vector index {collection_name} for dataset {dataset.id}.',
fg='red'))
click.style(
f"Failed to delete vector index {collection_name} for dataset {dataset.id}.", fg="red"
)
)
raise e
dataset_documents = db.session.query(DatasetDocument).filter(
DatasetDocument.dataset_id == dataset.id,
DatasetDocument.indexing_status == 'completed',
DatasetDocument.enabled == True,
DatasetDocument.archived == False,
).all()
dataset_documents = (
db.session.query(DatasetDocument)
.filter(
DatasetDocument.dataset_id == dataset.id,
DatasetDocument.indexing_status == "completed",
DatasetDocument.enabled == True,
DatasetDocument.archived == False,
)
.all()
)
documents = []
segments_count = 0
for dataset_document in dataset_documents:
segments = db.session.query(DocumentSegment).filter(
DocumentSegment.document_id == dataset_document.id,
DocumentSegment.status == 'completed',
DocumentSegment.enabled == True
).all()
segments = (
db.session.query(DocumentSegment)
.filter(
DocumentSegment.document_id == dataset_document.id,
DocumentSegment.status == "completed",
DocumentSegment.enabled == True,
)
.all()
)
for segment in segments:
document = Document(
@ -346,7 +400,7 @@ def migrate_knowledge_vector_database():
"doc_hash": segment.index_node_hash,
"document_id": segment.document_id,
"dataset_id": segment.dataset_id,
}
},
)
documents.append(document)
@ -354,37 +408,44 @@ def migrate_knowledge_vector_database():
if documents:
try:
click.echo(click.style(
f'Start to created vector index with {len(documents)} documents of {segments_count} segments for dataset {dataset.id}.',
fg='green'))
click.echo(
click.style(
f"Start to created vector index with {len(documents)} documents of {segments_count}"
f" segments for dataset {dataset.id}.",
fg="green",
)
)
vector.create(documents)
click.echo(
click.style(f'Successfully created vector index for dataset {dataset.id}.', fg='green'))
click.style(f"Successfully created vector index for dataset {dataset.id}.", fg="green")
)
except Exception as e:
click.echo(click.style(f'Failed to created vector index for dataset {dataset.id}.', fg='red'))
click.echo(click.style(f"Failed to created vector index for dataset {dataset.id}.", fg="red"))
raise e
db.session.add(dataset)
db.session.commit()
click.echo(f'Successfully migrated dataset {dataset.id}.')
click.echo(f"Successfully migrated dataset {dataset.id}.")
create_count += 1
except Exception as e:
db.session.rollback()
click.echo(
click.style('Create dataset index error: {} {}'.format(e.__class__.__name__, str(e)),
fg='red'))
click.style("Create dataset index error: {} {}".format(e.__class__.__name__, str(e)), fg="red")
)
continue
click.echo(
click.style(f'Congratulations! Create {create_count} dataset indexes, and skipped {skipped_count} datasets.',
fg='green'))
click.style(
f"Congratulations! Create {create_count} dataset indexes, and skipped {skipped_count} datasets.", fg="green"
)
)
@click.command('convert-to-agent-apps', help='Convert Agent Assistant to Agent App.')
@click.command("convert-to-agent-apps", help="Convert Agent Assistant to Agent App.")
def convert_to_agent_apps():
"""
Convert Agent Assistant to Agent App.
"""
click.echo(click.style('Start convert to agent apps.', fg='green'))
click.echo(click.style("Start convert to agent apps.", fg="green"))
proceeded_app_ids = []
@ -419,7 +480,7 @@ def convert_to_agent_apps():
break
for app in apps:
click.echo('Converting app: {}'.format(app.id))
click.echo("Converting app: {}".format(app.id))
try:
app.mode = AppMode.AGENT_CHAT.value
@ -431,13 +492,180 @@ def convert_to_agent_apps():
)
db.session.commit()
click.echo(click.style('Converted app: {}'.format(app.id), fg='green'))
click.echo(click.style("Converted app: {}".format(app.id), fg="green"))
except Exception as e:
click.echo(
click.style('Convert app error: {} {}'.format(e.__class__.__name__,
str(e)), fg='red'))
click.echo(click.style("Convert app error: {} {}".format(e.__class__.__name__, str(e)), fg="red"))
click.echo(click.style('Congratulations! Converted {} agent apps.'.format(len(proceeded_app_ids)), fg='green'))
click.echo(click.style("Congratulations! Converted {} agent apps.".format(len(proceeded_app_ids)), fg="green"))
@click.command("add-qdrant-doc-id-index", help="add qdrant doc_id index.")
@click.option("--field", default="metadata.doc_id", prompt=False, help="index field , default is metadata.doc_id.")
def add_qdrant_doc_id_index(field: str):
click.echo(click.style("Start add qdrant doc_id index.", fg="green"))
vector_type = dify_config.VECTOR_STORE
if vector_type != "qdrant":
click.echo(click.style("Sorry, only support qdrant vector store.", fg="red"))
return
create_count = 0
try:
bindings = db.session.query(DatasetCollectionBinding).all()
if not bindings:
click.echo(click.style("Sorry, no dataset collection bindings found.", fg="red"))
return
import qdrant_client
from qdrant_client.http.exceptions import UnexpectedResponse
from qdrant_client.http.models import PayloadSchemaType
from core.rag.datasource.vdb.qdrant.qdrant_vector import QdrantConfig
for binding in bindings:
if dify_config.QDRANT_URL is None:
raise ValueError("Qdrant url is required.")
qdrant_config = QdrantConfig(
endpoint=dify_config.QDRANT_URL,
api_key=dify_config.QDRANT_API_KEY,
root_path=current_app.root_path,
timeout=dify_config.QDRANT_CLIENT_TIMEOUT,
grpc_port=dify_config.QDRANT_GRPC_PORT,
prefer_grpc=dify_config.QDRANT_GRPC_ENABLED,
)
try:
client = qdrant_client.QdrantClient(**qdrant_config.to_qdrant_params())
# create payload index
client.create_payload_index(binding.collection_name, field, field_schema=PayloadSchemaType.KEYWORD)
create_count += 1
except UnexpectedResponse as e:
# Collection does not exist, so return
if e.status_code == 404:
click.echo(
click.style(f"Collection not found, collection_name:{binding.collection_name}.", fg="red")
)
continue
# Some other error occurred, so re-raise the exception
else:
click.echo(
click.style(
f"Failed to create qdrant index, collection_name:{binding.collection_name}.", fg="red"
)
)
except Exception as e:
click.echo(click.style("Failed to create qdrant client.", fg="red"))
click.echo(click.style(f"Congratulations! Create {create_count} collection indexes.", fg="green"))
@click.command("create-tenant", help="Create account and tenant.")
@click.option("--email", prompt=True, help="The email address of the tenant account.")
@click.option("--name", prompt=True, help="The workspace name of the tenant account.")
@click.option("--language", prompt=True, help="Account language, default: en-US.")
def create_tenant(email: str, language: Optional[str] = None, name: Optional[str] = None):
"""
Create tenant account
"""
if not email:
click.echo(click.style("Sorry, email is required.", fg="red"))
return
# Create account
email = email.strip()
if "@" not in email:
click.echo(click.style("Sorry, invalid email address.", fg="red"))
return
account_name = email.split("@")[0]
if language not in languages:
language = "en-US"
name = name.strip()
# generate random password
new_password = secrets.token_urlsafe(16)
# register account
account = RegisterService.register(email=email, name=account_name, password=new_password, language=language)
TenantService.create_owner_tenant_if_not_exist(account, name)
click.echo(
click.style(
"Congratulations! Account and tenant created.\nAccount: {}\nPassword: {}".format(email, new_password),
fg="green",
)
)
@click.command("upgrade-db", help="upgrade the database")
def upgrade_db():
click.echo("Preparing database migration...")
lock = redis_client.lock(name="db_upgrade_lock", timeout=60)
if lock.acquire(blocking=False):
try:
click.echo(click.style("Start database migration.", fg="green"))
# run db migration
import flask_migrate
flask_migrate.upgrade()
click.echo(click.style("Database migration successful!", fg="green"))
except Exception as e:
logging.exception(f"Database migration failed, error: {e}")
finally:
lock.release()
else:
click.echo("Database migration skipped")
@click.command("fix-app-site-missing", help="Fix app related site missing issue.")
def fix_app_site_missing():
"""
Fix app related site missing issue.
"""
click.echo(click.style("Start fix app related site missing issue.", fg="green"))
failed_app_ids = []
while True:
sql = """select apps.id as id from apps left join sites on sites.app_id=apps.id
where sites.id is null limit 1000"""
with db.engine.begin() as conn:
rs = conn.execute(db.text(sql))
processed_count = 0
for i in rs:
processed_count += 1
app_id = str(i.id)
if app_id in failed_app_ids:
continue
try:
app = db.session.query(App).filter(App.id == app_id).first()
tenant = app.tenant
if tenant:
accounts = tenant.get_accounts()
if not accounts:
print("Fix app {} failed.".format(app.id))
continue
account = accounts[0]
print("Fix app {} related site missing issue.".format(app.id))
app_was_created.send(app, account=account)
except Exception as e:
failed_app_ids.append(app_id)
click.echo(click.style("Fix app {} related site missing issue failed!".format(app_id), fg="red"))
logging.exception(f"Fix app related site missing issue failed, error: {e}")
continue
if not processed_count:
break
click.echo(click.style("Congratulations! Fix app related site missing issue successful!", fg="green"))
def register_commands(app):
@ -446,3 +674,7 @@ def register_commands(app):
app.cli.add_command(reset_encrypt_key_pair)
app.cli.add_command(vdb_migrate)
app.cli.add_command(convert_to_agent_apps)
app.cli.add_command(add_qdrant_doc_id_index)
app.cli.add_command(create_tenant)
app.cli.add_command(upgrade_db)
app.cli.add_command(fix_app_site_missing)
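With the commands registered on app.cli, they become available on the Flask CLI; a minimal sketch of invoking one programmatically, assuming `app` is the Flask instance built by create_app() with register_commands applied (the email and workspace name are placeholders):

runner = app.test_cli_runner()
result = runner.invoke(
    args=["create-tenant", "--email", "owner@example.com", "--name", "My Workspace", "--language", "en-US"]
)
print(result.output)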


@ -1,357 +0,0 @@
import os
import dotenv
dotenv.load_dotenv()
DEFAULTS = {
'DB_USERNAME': 'postgres',
'DB_PASSWORD': '',
'DB_HOST': 'localhost',
'DB_PORT': '5432',
'DB_DATABASE': 'dify',
'DB_CHARSET': '',
'REDIS_HOST': 'localhost',
'REDIS_PORT': '6379',
'REDIS_DB': '0',
'REDIS_USE_SSL': 'False',
'OAUTH_REDIRECT_PATH': '/console/api/oauth/authorize',
'OAUTH_REDIRECT_INDEX_PATH': '/',
'CONSOLE_WEB_URL': 'https://cloud.dify.ai',
'CONSOLE_API_URL': 'https://cloud.dify.ai',
'SERVICE_API_URL': 'https://api.dify.ai',
'APP_WEB_URL': 'https://udify.app',
'FILES_URL': '',
'S3_ADDRESS_STYLE': 'auto',
'STORAGE_TYPE': 'local',
'STORAGE_LOCAL_PATH': 'storage',
'CHECK_UPDATE_URL': 'https://updates.dify.ai',
'DEPLOY_ENV': 'PRODUCTION',
'SQLALCHEMY_POOL_SIZE': 30,
'SQLALCHEMY_MAX_OVERFLOW': 10,
'SQLALCHEMY_POOL_RECYCLE': 3600,
'SQLALCHEMY_ECHO': 'False',
'SENTRY_TRACES_SAMPLE_RATE': 1.0,
'SENTRY_PROFILES_SAMPLE_RATE': 1.0,
'WEAVIATE_GRPC_ENABLED': 'True',
'WEAVIATE_BATCH_SIZE': 100,
'QDRANT_CLIENT_TIMEOUT': 20,
'CELERY_BACKEND': 'database',
'LOG_LEVEL': 'INFO',
'LOG_FILE': '',
'LOG_FORMAT': '%(asctime)s.%(msecs)03d %(levelname)s [%(threadName)s] [%(filename)s:%(lineno)d] - %(message)s',
'LOG_DATEFORMAT': '%Y-%m-%d %H:%M:%S',
'HOSTED_OPENAI_QUOTA_LIMIT': 200,
'HOSTED_OPENAI_TRIAL_ENABLED': 'False',
'HOSTED_OPENAI_TRIAL_MODELS': 'gpt-3.5-turbo,gpt-3.5-turbo-1106,gpt-3.5-turbo-instruct,gpt-3.5-turbo-16k,gpt-3.5-turbo-16k-0613,gpt-3.5-turbo-0613,gpt-3.5-turbo-0125,text-davinci-003',
'HOSTED_OPENAI_PAID_ENABLED': 'False',
'HOSTED_OPENAI_PAID_MODELS': 'gpt-4,gpt-4-turbo-preview,gpt-4-turbo-2024-04-09,gpt-4-1106-preview,gpt-4-0125-preview,gpt-3.5-turbo,gpt-3.5-turbo-16k,gpt-3.5-turbo-16k-0613,gpt-3.5-turbo-1106,gpt-3.5-turbo-0613,gpt-3.5-turbo-0125,gpt-3.5-turbo-instruct,text-davinci-003',
'HOSTED_AZURE_OPENAI_ENABLED': 'False',
'HOSTED_AZURE_OPENAI_QUOTA_LIMIT': 200,
'HOSTED_ANTHROPIC_QUOTA_LIMIT': 600000,
'HOSTED_ANTHROPIC_TRIAL_ENABLED': 'False',
'HOSTED_ANTHROPIC_PAID_ENABLED': 'False',
'HOSTED_MODERATION_ENABLED': 'False',
'HOSTED_MODERATION_PROVIDERS': '',
'HOSTED_FETCH_APP_TEMPLATES_MODE': 'remote',
'HOSTED_FETCH_APP_TEMPLATES_REMOTE_DOMAIN': 'https://tmpl.dify.ai',
'CLEAN_DAY_SETTING': 30,
'UPLOAD_FILE_SIZE_LIMIT': 15,
'UPLOAD_FILE_BATCH_LIMIT': 5,
'UPLOAD_IMAGE_FILE_SIZE_LIMIT': 10,
'OUTPUT_MODERATION_BUFFER_SIZE': 300,
'MULTIMODAL_SEND_IMAGE_FORMAT': 'base64',
'INVITE_EXPIRY_HOURS': 72,
'BILLING_ENABLED': 'False',
'CAN_REPLACE_LOGO': 'False',
'ETL_TYPE': 'dify',
'KEYWORD_STORE': 'jieba',
'BATCH_UPLOAD_LIMIT': 20,
'CODE_EXECUTION_ENDPOINT': 'http://sandbox:8194',
'CODE_EXECUTION_API_KEY': 'dify-sandbox',
'TOOL_ICON_CACHE_MAX_AGE': 3600,
'MILVUS_DATABASE': 'default',
'KEYWORD_DATA_SOURCE_TYPE': 'database',
'INNER_API': 'False',
'ENTERPRISE_ENABLED': 'False',
}
def get_env(key):
return os.environ.get(key, DEFAULTS.get(key))
def get_bool_env(key):
value = get_env(key)
return value.lower() == 'true' if value is not None else False
def get_cors_allow_origins(env, default):
cors_allow_origins = []
if get_env(env):
for origin in get_env(env).split(','):
cors_allow_origins.append(origin)
else:
cors_allow_origins = [default]
return cors_allow_origins
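For reference, the removed helpers resolve a value from the environment first and fall back to the DEFAULTS table; a minimal sketch of their behavior (the env var set here is only an example):

os.environ["SQLALCHEMY_ECHO"] = "True"

get_env("DB_PORT")               # "5432" from DEFAULTS when the env var is unset
get_bool_env("SQLALCHEMY_ECHO")  # True (case-insensitive "true" check)
get_bool_env("UNSET_FLAG")       # False when neither the env var nor a default exists
get_cors_allow_origins("WEB_API_CORS_ALLOW_ORIGINS", "*")  # ["*"] unless a comma-separated list is set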
class Config:
"""Application configuration class."""
def __init__(self):
# ------------------------
# General Configurations.
# ------------------------
self.CURRENT_VERSION = "0.6.4"
self.COMMIT_SHA = get_env('COMMIT_SHA')
self.EDITION = "SELF_HOSTED"
self.DEPLOY_ENV = get_env('DEPLOY_ENV')
self.TESTING = False
self.LOG_LEVEL = get_env('LOG_LEVEL')
self.LOG_FILE = get_env('LOG_FILE')
self.LOG_FORMAT = get_env('LOG_FORMAT')
self.LOG_DATEFORMAT = get_env('LOG_DATEFORMAT')
# The backend URL prefix of the console API.
# used to concatenate the login authorization callback or notion integration callback.
self.CONSOLE_API_URL = get_env('CONSOLE_API_URL')
# The front-end URL prefix of the console web.
# used to concatenate some front-end addresses and for CORS configuration use.
self.CONSOLE_WEB_URL = get_env('CONSOLE_WEB_URL')
# WebApp Url prefix.
# used to display WebAPP API Base Url to the front-end.
self.APP_WEB_URL = get_env('APP_WEB_URL')
# Service API Url prefix.
# used to display Service API Base Url to the front-end.
self.SERVICE_API_URL = get_env('SERVICE_API_URL')
# File preview or download Url prefix.
# used to display File preview or download Url to the front-end or as Multi-model inputs;
# Url is signed and has expiration time.
self.FILES_URL = get_env('FILES_URL') if get_env('FILES_URL') else self.CONSOLE_API_URL
# Your App secret key will be used for securely signing the session cookie
# Make sure you are changing this key for your deployment with a strong key.
# You can generate a strong key using `openssl rand -base64 42`.
# Alternatively you can set it with `SECRET_KEY` environment variable.
self.SECRET_KEY = get_env('SECRET_KEY')
# Enable or disable the inner API.
self.INNER_API = get_bool_env('INNER_API')
# The inner API key is used to authenticate the inner API.
self.INNER_API_KEY = get_env('INNER_API_KEY')
# cors settings
self.CONSOLE_CORS_ALLOW_ORIGINS = get_cors_allow_origins(
'CONSOLE_CORS_ALLOW_ORIGINS', self.CONSOLE_WEB_URL)
self.WEB_API_CORS_ALLOW_ORIGINS = get_cors_allow_origins(
'WEB_API_CORS_ALLOW_ORIGINS', '*')
# check update url
self.CHECK_UPDATE_URL = get_env('CHECK_UPDATE_URL')
# ------------------------
# Database Configurations.
# ------------------------
db_credentials = {
key: get_env(key) for key in
['DB_USERNAME', 'DB_PASSWORD', 'DB_HOST', 'DB_PORT', 'DB_DATABASE', 'DB_CHARSET']
}
db_extras = f"?client_encoding={db_credentials['DB_CHARSET']}" if db_credentials['DB_CHARSET'] else ""
self.SQLALCHEMY_DATABASE_URI = f"postgresql://{db_credentials['DB_USERNAME']}:{db_credentials['DB_PASSWORD']}@{db_credentials['DB_HOST']}:{db_credentials['DB_PORT']}/{db_credentials['DB_DATABASE']}{db_extras}"
self.SQLALCHEMY_ENGINE_OPTIONS = {
'pool_size': int(get_env('SQLALCHEMY_POOL_SIZE')),
'max_overflow': int(get_env('SQLALCHEMY_MAX_OVERFLOW')),
'pool_recycle': int(get_env('SQLALCHEMY_POOL_RECYCLE'))
}
self.SQLALCHEMY_ECHO = get_bool_env('SQLALCHEMY_ECHO')
# ------------------------
# Redis Configurations.
# ------------------------
self.REDIS_HOST = get_env('REDIS_HOST')
self.REDIS_PORT = get_env('REDIS_PORT')
self.REDIS_USERNAME = get_env('REDIS_USERNAME')
self.REDIS_PASSWORD = get_env('REDIS_PASSWORD')
self.REDIS_DB = get_env('REDIS_DB')
self.REDIS_USE_SSL = get_bool_env('REDIS_USE_SSL')
# ------------------------
# Celery worker Configurations.
# ------------------------
self.CELERY_BROKER_URL = get_env('CELERY_BROKER_URL')
self.CELERY_BACKEND = get_env('CELERY_BACKEND')
self.CELERY_RESULT_BACKEND = 'db+{}'.format(self.SQLALCHEMY_DATABASE_URI) \
if self.CELERY_BACKEND == 'database' else self.CELERY_BROKER_URL
self.BROKER_USE_SSL = self.CELERY_BROKER_URL.startswith('rediss://')
# ------------------------
# File Storage Configurations.
# ------------------------
self.STORAGE_TYPE = get_env('STORAGE_TYPE')
self.STORAGE_LOCAL_PATH = get_env('STORAGE_LOCAL_PATH')
self.S3_ENDPOINT = get_env('S3_ENDPOINT')
self.S3_BUCKET_NAME = get_env('S3_BUCKET_NAME')
self.S3_ACCESS_KEY = get_env('S3_ACCESS_KEY')
self.S3_SECRET_KEY = get_env('S3_SECRET_KEY')
self.S3_REGION = get_env('S3_REGION')
self.S3_ADDRESS_STYLE = get_env('S3_ADDRESS_STYLE')
self.AZURE_BLOB_ACCOUNT_NAME = get_env('AZURE_BLOB_ACCOUNT_NAME')
self.AZURE_BLOB_ACCOUNT_KEY = get_env('AZURE_BLOB_ACCOUNT_KEY')
self.AZURE_BLOB_CONTAINER_NAME = get_env('AZURE_BLOB_CONTAINER_NAME')
self.AZURE_BLOB_ACCOUNT_URL = get_env('AZURE_BLOB_ACCOUNT_URL')
# ------------------------
# Vector Store Configurations.
# Currently, only support: qdrant, milvus, zilliz, weaviate, relyt
# ------------------------
self.VECTOR_STORE = get_env('VECTOR_STORE')
self.KEYWORD_STORE = get_env('KEYWORD_STORE')
# qdrant settings
self.QDRANT_URL = get_env('QDRANT_URL')
self.QDRANT_API_KEY = get_env('QDRANT_API_KEY')
self.QDRANT_CLIENT_TIMEOUT = get_env('QDRANT_CLIENT_TIMEOUT')
# milvus / zilliz setting
self.MILVUS_HOST = get_env('MILVUS_HOST')
self.MILVUS_PORT = get_env('MILVUS_PORT')
self.MILVUS_USER = get_env('MILVUS_USER')
self.MILVUS_PASSWORD = get_env('MILVUS_PASSWORD')
self.MILVUS_SECURE = get_env('MILVUS_SECURE')
self.MILVUS_DATABASE = get_env('MILVUS_DATABASE')
# weaviate settings
self.WEAVIATE_ENDPOINT = get_env('WEAVIATE_ENDPOINT')
self.WEAVIATE_API_KEY = get_env('WEAVIATE_API_KEY')
self.WEAVIATE_GRPC_ENABLED = get_bool_env('WEAVIATE_GRPC_ENABLED')
self.WEAVIATE_BATCH_SIZE = int(get_env('WEAVIATE_BATCH_SIZE'))
# relyt settings
self.RELYT_HOST = get_env('RELYT_HOST')
self.RELYT_PORT = get_env('RELYT_PORT')
self.RELYT_USER = get_env('RELYT_USER')
self.RELYT_PASSWORD = get_env('RELYT_PASSWORD')
self.RELYT_DATABASE = get_env('RELYT_DATABASE')
# ------------------------
# Mail Configurations.
# ------------------------
self.MAIL_TYPE = get_env('MAIL_TYPE')
self.MAIL_DEFAULT_SEND_FROM = get_env('MAIL_DEFAULT_SEND_FROM')
self.RESEND_API_KEY = get_env('RESEND_API_KEY')
self.RESEND_API_URL = get_env('RESEND_API_URL')
# SMTP settings
self.SMTP_SERVER = get_env('SMTP_SERVER')
self.SMTP_PORT = get_env('SMTP_PORT')
self.SMTP_USERNAME = get_env('SMTP_USERNAME')
self.SMTP_PASSWORD = get_env('SMTP_PASSWORD')
self.SMTP_USE_TLS = get_bool_env('SMTP_USE_TLS')
# ------------------------
# Workspace Configurations.
# ------------------------
self.INVITE_EXPIRY_HOURS = int(get_env('INVITE_EXPIRY_HOURS'))
# ------------------------
# Sentry Configurations.
# ------------------------
self.SENTRY_DSN = get_env('SENTRY_DSN')
self.SENTRY_TRACES_SAMPLE_RATE = float(get_env('SENTRY_TRACES_SAMPLE_RATE'))
self.SENTRY_PROFILES_SAMPLE_RATE = float(get_env('SENTRY_PROFILES_SAMPLE_RATE'))
# ------------------------
# Business Configurations.
# ------------------------
# multi model send image format, support base64, url, default is base64
self.MULTIMODAL_SEND_IMAGE_FORMAT = get_env('MULTIMODAL_SEND_IMAGE_FORMAT')
# Dataset Configurations.
self.CLEAN_DAY_SETTING = get_env('CLEAN_DAY_SETTING')
# File upload Configurations.
self.UPLOAD_FILE_SIZE_LIMIT = int(get_env('UPLOAD_FILE_SIZE_LIMIT'))
self.UPLOAD_FILE_BATCH_LIMIT = int(get_env('UPLOAD_FILE_BATCH_LIMIT'))
self.UPLOAD_IMAGE_FILE_SIZE_LIMIT = int(get_env('UPLOAD_IMAGE_FILE_SIZE_LIMIT'))
# Moderation in app Configurations.
self.OUTPUT_MODERATION_BUFFER_SIZE = int(get_env('OUTPUT_MODERATION_BUFFER_SIZE'))
# Notion integration setting
self.NOTION_CLIENT_ID = get_env('NOTION_CLIENT_ID')
self.NOTION_CLIENT_SECRET = get_env('NOTION_CLIENT_SECRET')
self.NOTION_INTEGRATION_TYPE = get_env('NOTION_INTEGRATION_TYPE')
self.NOTION_INTERNAL_SECRET = get_env('NOTION_INTERNAL_SECRET')
self.NOTION_INTEGRATION_TOKEN = get_env('NOTION_INTEGRATION_TOKEN')
# ------------------------
# Platform Configurations.
# ------------------------
self.HOSTED_OPENAI_API_KEY = get_env('HOSTED_OPENAI_API_KEY')
self.HOSTED_OPENAI_API_BASE = get_env('HOSTED_OPENAI_API_BASE')
self.HOSTED_OPENAI_API_ORGANIZATION = get_env('HOSTED_OPENAI_API_ORGANIZATION')
self.HOSTED_OPENAI_TRIAL_ENABLED = get_bool_env('HOSTED_OPENAI_TRIAL_ENABLED')
self.HOSTED_OPENAI_TRIAL_MODELS = get_env('HOSTED_OPENAI_TRIAL_MODELS')
self.HOSTED_OPENAI_QUOTA_LIMIT = int(get_env('HOSTED_OPENAI_QUOTA_LIMIT'))
self.HOSTED_OPENAI_PAID_ENABLED = get_bool_env('HOSTED_OPENAI_PAID_ENABLED')
self.HOSTED_OPENAI_PAID_MODELS = get_env('HOSTED_OPENAI_PAID_MODELS')
self.HOSTED_AZURE_OPENAI_ENABLED = get_bool_env('HOSTED_AZURE_OPENAI_ENABLED')
self.HOSTED_AZURE_OPENAI_API_KEY = get_env('HOSTED_AZURE_OPENAI_API_KEY')
self.HOSTED_AZURE_OPENAI_API_BASE = get_env('HOSTED_AZURE_OPENAI_API_BASE')
self.HOSTED_AZURE_OPENAI_QUOTA_LIMIT = int(get_env('HOSTED_AZURE_OPENAI_QUOTA_LIMIT'))
self.HOSTED_ANTHROPIC_API_BASE = get_env('HOSTED_ANTHROPIC_API_BASE')
self.HOSTED_ANTHROPIC_API_KEY = get_env('HOSTED_ANTHROPIC_API_KEY')
self.HOSTED_ANTHROPIC_TRIAL_ENABLED = get_bool_env('HOSTED_ANTHROPIC_TRIAL_ENABLED')
self.HOSTED_ANTHROPIC_QUOTA_LIMIT = int(get_env('HOSTED_ANTHROPIC_QUOTA_LIMIT'))
self.HOSTED_ANTHROPIC_PAID_ENABLED = get_bool_env('HOSTED_ANTHROPIC_PAID_ENABLED')
self.HOSTED_MINIMAX_ENABLED = get_bool_env('HOSTED_MINIMAX_ENABLED')
self.HOSTED_SPARK_ENABLED = get_bool_env('HOSTED_SPARK_ENABLED')
self.HOSTED_ZHIPUAI_ENABLED = get_bool_env('HOSTED_ZHIPUAI_ENABLED')
self.HOSTED_MODERATION_ENABLED = get_bool_env('HOSTED_MODERATION_ENABLED')
self.HOSTED_MODERATION_PROVIDERS = get_env('HOSTED_MODERATION_PROVIDERS')
# fetch app templates mode, remote, builtin, db(only for dify SaaS), default: remote
self.HOSTED_FETCH_APP_TEMPLATES_MODE = get_env('HOSTED_FETCH_APP_TEMPLATES_MODE')
self.HOSTED_FETCH_APP_TEMPLATES_REMOTE_DOMAIN = get_env('HOSTED_FETCH_APP_TEMPLATES_REMOTE_DOMAIN')
self.ETL_TYPE = get_env('ETL_TYPE')
self.UNSTRUCTURED_API_URL = get_env('UNSTRUCTURED_API_URL')
self.BILLING_ENABLED = get_bool_env('BILLING_ENABLED')
self.CAN_REPLACE_LOGO = get_bool_env('CAN_REPLACE_LOGO')
self.BATCH_UPLOAD_LIMIT = get_env('BATCH_UPLOAD_LIMIT')
self.CODE_EXECUTION_ENDPOINT = get_env('CODE_EXECUTION_ENDPOINT')
self.CODE_EXECUTION_API_KEY = get_env('CODE_EXECUTION_API_KEY')
self.API_COMPRESSION_ENABLED = get_bool_env('API_COMPRESSION_ENABLED')
self.TOOL_ICON_CACHE_MAX_AGE = get_env('TOOL_ICON_CACHE_MAX_AGE')
self.KEYWORD_DATA_SOURCE_TYPE = get_env('KEYWORD_DATA_SOURCE_TYPE')
self.ENTERPRISE_ENABLED = get_bool_env('ENTERPRISE_ENABLED')
class CloudEditionConfig(Config):
def __init__(self):
super().__init__()
self.EDITION = "CLOUD"
self.GITHUB_CLIENT_ID = get_env('GITHUB_CLIENT_ID')
self.GITHUB_CLIENT_SECRET = get_env('GITHUB_CLIENT_SECRET')
self.GOOGLE_CLIENT_ID = get_env('GOOGLE_CLIENT_ID')
self.GOOGLE_CLIENT_SECRET = get_env('GOOGLE_CLIENT_SECRET')
self.OAUTH_REDIRECT_PATH = get_env('OAUTH_REDIRECT_PATH')

api/configs/__init__.py Normal file

@ -0,0 +1,3 @@
from .app_config import DifyConfig
dify_config = DifyConfig()

api/configs/app_config.py Normal file

@ -0,0 +1,38 @@
from pydantic_settings import SettingsConfigDict
from configs.deploy import DeploymentConfig
from configs.enterprise import EnterpriseFeatureConfig
from configs.extra import ExtraServiceConfig
from configs.feature import FeatureConfig
from configs.middleware import MiddlewareConfig
from configs.packaging import PackagingInfo
class DifyConfig(
# Packaging info
PackagingInfo,
# Deployment configs
DeploymentConfig,
# Feature configs
FeatureConfig,
# Middleware configs
MiddlewareConfig,
# Extra service configs
ExtraServiceConfig,
# Enterprise feature configs
# **Before using, please contact business@dify.ai by email to inquire about licensing matters.**
EnterpriseFeatureConfig,
):
model_config = SettingsConfigDict(
# read from dotenv format config file
env_file=".env",
env_file_encoding="utf-8",
frozen=True,
# ignore extra attributes
extra="ignore",
)
# Before adding any config,
# please consider to arrange it in the proper config group of existed or added
# for better readability and maintainability.
# Thanks for your concentration and consideration.
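The resulting dify_config singleton is what the updated commands above read from (dify_config.EDITION, dify_config.VECTOR_STORE, and so on); a minimal sketch of how settings resolve, with the values shown being the defaults unless overridden via .env or the environment:

from configs import dify_config

dify_config.EDITION       # "SELF_HOSTED" by default
dify_config.TESTING       # False by default
dify_config.VECTOR_STORE  # whichever vector store name is configured in the environment
# The settings are frozen, so fields cannot be reassigned at runtime.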


@ -0,0 +1,33 @@
from pydantic import Field
from pydantic_settings import BaseSettings
class DeploymentConfig(BaseSettings):
"""
Deployment configs
"""
APPLICATION_NAME: str = Field(
description="application name",
default="langgenius/dify",
)
DEBUG: bool = Field(
description="whether to enable debug mode.",
default=False,
)
TESTING: bool = Field(
description="",
default=False,
)
EDITION: str = Field(
description="deployment edition",
default="SELF_HOSTED",
)
DEPLOY_ENV: str = Field(
description="deployment environment, default to PRODUCTION.",
default="PRODUCTION",
)


@ -0,0 +1,20 @@
from pydantic import Field
from pydantic_settings import BaseSettings
class EnterpriseFeatureConfig(BaseSettings):
"""
Enterprise feature configs.
**Before using, please contact business@dify.ai by email to inquire about licensing matters.**
"""
ENTERPRISE_ENABLED: bool = Field(
description="whether to enable enterprise features."
"Before using, please contact business@dify.ai by email to inquire about licensing matters.",
default=False,
)
CAN_REPLACE_LOGO: bool = Field(
description="whether to allow replacing enterprise logo.",
default=False,
)


@ -0,0 +1,10 @@
from configs.extra.notion_config import NotionConfig
from configs.extra.sentry_config import SentryConfig
class ExtraServiceConfig(
# place the configs in alphabet order
NotionConfig,
SentryConfig,
):
pass


@ -0,0 +1,35 @@
from typing import Optional
from pydantic import Field
from pydantic_settings import BaseSettings
class NotionConfig(BaseSettings):
"""
Notion integration configs
"""
NOTION_CLIENT_ID: Optional[str] = Field(
description="Notion client ID",
default=None,
)
NOTION_CLIENT_SECRET: Optional[str] = Field(
description="Notion client secret key",
default=None,
)
NOTION_INTEGRATION_TYPE: Optional[str] = Field(
description="Notion integration type, default to None, available values: internal.",
default=None,
)
NOTION_INTERNAL_SECRET: Optional[str] = Field(
description="Notion internal secret key",
default=None,
)
NOTION_INTEGRATION_TOKEN: Optional[str] = Field(
description="Notion integration token",
default=None,
)


@ -0,0 +1,25 @@
from typing import Optional
from pydantic import Field, NonNegativeFloat
from pydantic_settings import BaseSettings
class SentryConfig(BaseSettings):
"""
Sentry configs
"""
SENTRY_DSN: Optional[str] = Field(
description="Sentry DSN",
default=None,
)
SENTRY_TRACES_SAMPLE_RATE: NonNegativeFloat = Field(
description="Sentry trace sample rate",
default=1.0,
)
SENTRY_PROFILES_SAMPLE_RATE: NonNegativeFloat = Field(
description="Sentry profiles sample rate",
default=1.0,
)


@ -0,0 +1,631 @@
from typing import Annotated, Optional
from pydantic import AliasChoices, Field, HttpUrl, NegativeInt, NonNegativeInt, PositiveInt, computed_field
from pydantic_settings import BaseSettings
from configs.feature.hosted_service import HostedServiceConfig
class SecurityConfig(BaseSettings):
"""
Secret Key configs
"""
SECRET_KEY: Optional[str] = Field(
description="Your App secret key will be used for securely signing the session cookie"
"Make sure you are changing this key for your deployment with a strong key."
"You can generate a strong key using `openssl rand -base64 42`."
"Alternatively you can set it with `SECRET_KEY` environment variable.",
default=None,
)
RESET_PASSWORD_TOKEN_EXPIRY_HOURS: PositiveInt = Field(
description="Expiry time in hours for reset token",
default=24,
)
class AppExecutionConfig(BaseSettings):
"""
App Execution configs
"""
APP_MAX_EXECUTION_TIME: PositiveInt = Field(
description="execution timeout in seconds for app execution",
default=1200,
)
APP_MAX_ACTIVE_REQUESTS: NonNegativeInt = Field(
description="max active request per app, 0 means unlimited",
default=0,
)
class CodeExecutionSandboxConfig(BaseSettings):
"""
Code Execution Sandbox configs
"""
CODE_EXECUTION_ENDPOINT: HttpUrl = Field(
description="endpoint URL of code execution service",
default="http://sandbox:8194",
)
CODE_EXECUTION_API_KEY: str = Field(
description="API key for code execution service",
default="dify-sandbox",
)
CODE_EXECUTION_CONNECT_TIMEOUT: Optional[float] = Field(
description="connect timeout in seconds for code execution request",
default=10.0,
)
CODE_EXECUTION_READ_TIMEOUT: Optional[float] = Field(
description="read timeout in seconds for code execution request",
default=60.0,
)
CODE_EXECUTION_WRITE_TIMEOUT: Optional[float] = Field(
description="write timeout in seconds for code execution request",
default=10.0,
)
CODE_MAX_NUMBER: PositiveInt = Field(
description="max depth for code execution",
default=9223372036854775807,
)
CODE_MIN_NUMBER: NegativeInt = Field(
description="",
default=-9223372036854775807,
)
CODE_MAX_DEPTH: PositiveInt = Field(
description="max depth for code execution",
default=5,
)
CODE_MAX_PRECISION: PositiveInt = Field(
description="max precision digits for float type in code execution",
default=20,
)
CODE_MAX_STRING_LENGTH: PositiveInt = Field(
description="max string length for code execution",
default=80000,
)
CODE_MAX_STRING_ARRAY_LENGTH: PositiveInt = Field(
description="",
default=30,
)
CODE_MAX_OBJECT_ARRAY_LENGTH: PositiveInt = Field(
description="",
default=30,
)
CODE_MAX_NUMBER_ARRAY_LENGTH: PositiveInt = Field(
description="",
default=1000,
)
class EndpointConfig(BaseSettings):
"""
Module URL configs
"""
CONSOLE_API_URL: str = Field(
description="The backend URL prefix of the console API."
"used to concatenate the login authorization callback or notion integration callback.",
default="",
)
CONSOLE_WEB_URL: str = Field(
description="The front-end URL prefix of the console web."
"used to concatenate some front-end addresses and for CORS configuration use.",
default="",
)
SERVICE_API_URL: str = Field(
description="Service API Url prefix. used to display Service API Base Url to the front-end.",
default="",
)
APP_WEB_URL: str = Field(
description="WebApp Url prefix. used to display WebAPP API Base Url to the front-end.",
default="",
)
class FileAccessConfig(BaseSettings):
"""
File Access configs
"""
FILES_URL: str = Field(
description="File preview or download Url prefix."
" used to display File preview or download Url to the front-end or as Multi-model inputs;"
"Url is signed and has expiration time.",
validation_alias=AliasChoices("FILES_URL", "CONSOLE_API_URL"),
alias_priority=1,
default="",
)
FILES_ACCESS_TIMEOUT: int = Field(
description="timeout in seconds for file accessing",
default=300,
)
class FileUploadConfig(BaseSettings):
"""
File Uploading configs
"""
UPLOAD_FILE_SIZE_LIMIT: NonNegativeInt = Field(
description="size limit in Megabytes for uploading files",
default=15,
)
UPLOAD_FILE_BATCH_LIMIT: NonNegativeInt = Field(
description="batch size limit for uploading files",
default=5,
)
UPLOAD_IMAGE_FILE_SIZE_LIMIT: NonNegativeInt = Field(
description="image file size limit in Megabytes for uploading files",
default=10,
)
BATCH_UPLOAD_LIMIT: NonNegativeInt = Field(
description="", # todo: to be clarified
default=20,
)
class HttpConfig(BaseSettings):
"""
HTTP configs
"""
API_COMPRESSION_ENABLED: bool = Field(
description="whether to enable HTTP response compression of gzip",
default=False,
)
inner_CONSOLE_CORS_ALLOW_ORIGINS: str = Field(
description="",
validation_alias=AliasChoices("CONSOLE_CORS_ALLOW_ORIGINS", "CONSOLE_WEB_URL"),
default="",
)
@computed_field
@property
def CONSOLE_CORS_ALLOW_ORIGINS(self) -> list[str]:
return self.inner_CONSOLE_CORS_ALLOW_ORIGINS.split(",")
inner_WEB_API_CORS_ALLOW_ORIGINS: str = Field(
description="",
validation_alias=AliasChoices("WEB_API_CORS_ALLOW_ORIGINS"),
default="*",
)
@computed_field
@property
def WEB_API_CORS_ALLOW_ORIGINS(self) -> list[str]:
return self.inner_WEB_API_CORS_ALLOW_ORIGINS.split(",")
HTTP_REQUEST_MAX_CONNECT_TIMEOUT: Annotated[
PositiveInt, Field(ge=10, description="connect timeout in seconds for HTTP request")
] = 10
HTTP_REQUEST_MAX_READ_TIMEOUT: Annotated[
PositiveInt, Field(ge=60, description="read timeout in seconds for HTTP request")
] = 60
HTTP_REQUEST_MAX_WRITE_TIMEOUT: Annotated[
PositiveInt, Field(ge=10, description="read timeout in seconds for HTTP request")
] = 20
HTTP_REQUEST_NODE_MAX_BINARY_SIZE: PositiveInt = Field(
description="",
default=10 * 1024 * 1024,
)
HTTP_REQUEST_NODE_MAX_TEXT_SIZE: PositiveInt = Field(
description="",
default=1 * 1024 * 1024,
)
SSRF_PROXY_HTTP_URL: Optional[str] = Field(
description="HTTP URL for SSRF proxy",
default=None,
)
SSRF_PROXY_HTTPS_URL: Optional[str] = Field(
description="HTTPS URL for SSRF proxy",
default=None,
)
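The computed CONSOLE_CORS_ALLOW_ORIGINS and WEB_API_CORS_ALLOW_ORIGINS properties in HttpConfig above replace the old get_cors_allow_origins helper: the raw comma-separated env value is stored on the inner_* field and exposed as a list. A minimal sketch, with placeholder origins:

import os

os.environ["CONSOLE_CORS_ALLOW_ORIGINS"] = "https://console.example.com,https://admin.example.com"
cfg = HttpConfig()
cfg.CONSOLE_CORS_ALLOW_ORIGINS  # ["https://console.example.com", "https://admin.example.com"]
cfg.WEB_API_CORS_ALLOW_ORIGINS  # ["*"] unless WEB_API_CORS_ALLOW_ORIGINS is set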
class InnerAPIConfig(BaseSettings):
"""
Inner API configs
"""
INNER_API: bool = Field(
description="whether to enable the inner API",
default=False,
)
INNER_API_KEY: Optional[str] = Field(
description="The inner API key is used to authenticate the inner API",
default=None,
)
class LoggingConfig(BaseSettings):
"""
Logging configs
"""
LOG_LEVEL: str = Field(
description="Log output level, default to INFO. It is recommended to set it to ERROR for production.",
default="INFO",
)
LOG_FILE: Optional[str] = Field(
description="logging output file path",
default=None,
)
LOG_FORMAT: str = Field(
description="log format",
default="%(asctime)s.%(msecs)03d %(levelname)s [%(threadName)s] [%(filename)s:%(lineno)d] - %(message)s",
)
LOG_DATEFORMAT: Optional[str] = Field(
description="log date format",
default=None,
)
LOG_TZ: Optional[str] = Field(
description="specify log timezone, eg: America/New_York",
default=None,
)
class ModelLoadBalanceConfig(BaseSettings):
"""
Model load balance configs
"""
MODEL_LB_ENABLED: bool = Field(
description="whether to enable model load balancing",
default=False,
)
class BillingConfig(BaseSettings):
"""
Platform Billing Configurations
"""
BILLING_ENABLED: bool = Field(
description="whether to enable billing",
default=False,
)
class UpdateConfig(BaseSettings):
"""
Update configs
"""
CHECK_UPDATE_URL: str = Field(
description="url for checking updates",
default="https://updates.dify.ai",
)
class WorkflowConfig(BaseSettings):
"""
Workflow feature configs
"""
WORKFLOW_MAX_EXECUTION_STEPS: PositiveInt = Field(
description="max execution steps in single workflow execution",
default=500,
)
WORKFLOW_MAX_EXECUTION_TIME: PositiveInt = Field(
description="max execution time in seconds in single workflow execution",
default=1200,
)
WORKFLOW_CALL_MAX_DEPTH: PositiveInt = Field(
description="max depth of calling in single workflow execution",
default=5,
)
MAX_VARIABLE_SIZE: PositiveInt = Field(
description="The maximum size in bytes of a variable. default to 5KB.",
default=5 * 1024,
)
class OAuthConfig(BaseSettings):
"""
oauth configs
"""
OAUTH_REDIRECT_PATH: str = Field(
description="redirect path for OAuth",
default="/console/api/oauth/authorize",
)
GITHUB_CLIENT_ID: Optional[str] = Field(
description="GitHub client id for OAuth",
default=None,
)
GITHUB_CLIENT_SECRET: Optional[str] = Field(
description="GitHub client secret key for OAuth",
default=None,
)
GOOGLE_CLIENT_ID: Optional[str] = Field(
description="Google client id for OAuth",
default=None,
)
GOOGLE_CLIENT_SECRET: Optional[str] = Field(
description="Google client secret key for OAuth",
default=None,
)
class ModerationConfig(BaseSettings):
"""
Moderation in app configs.
"""
MODERATION_BUFFER_SIZE: PositiveInt = Field(
description="buffer size for moderation",
default=300,
)
class ToolConfig(BaseSettings):
"""
Tool configs
"""
TOOL_ICON_CACHE_MAX_AGE: PositiveInt = Field(
description="max age in seconds for tool icon caching",
default=3600,
)
class MailConfig(BaseSettings):
"""
Mail Configurations
"""
MAIL_TYPE: Optional[str] = Field(
description="Mail provider type name, default to None, available values are `smtp` and `resend`.",
default=None,
)
MAIL_DEFAULT_SEND_FROM: Optional[str] = Field(
description="default email address for sending from ",
default=None,
)
RESEND_API_KEY: Optional[str] = Field(
description="API key for Resend",
default=None,
)
RESEND_API_URL: Optional[str] = Field(
description="API URL for Resend",
default=None,
)
SMTP_SERVER: Optional[str] = Field(
description="smtp server host",
default=None,
)
SMTP_PORT: Optional[int] = Field(
description="smtp server port",
default=465,
)
SMTP_USERNAME: Optional[str] = Field(
description="smtp server username",
default=None,
)
SMTP_PASSWORD: Optional[str] = Field(
description="smtp server password",
default=None,
)
SMTP_USE_TLS: bool = Field(
description="whether to use TLS connection to smtp server",
default=False,
)
SMTP_OPPORTUNISTIC_TLS: bool = Field(
description="whether to use opportunistic TLS connection to smtp server",
default=False,
)
class RagEtlConfig(BaseSettings):
"""
RAG ETL Configurations.
"""
ETL_TYPE: str = Field(
description="RAG ETL type name, default to `dify`, available values are `dify` and `Unstructured`. ",
default="dify",
)
KEYWORD_DATA_SOURCE_TYPE: str = Field(
description="source type for keyword data, default to `database`, available values are `database` .",
default="database",
)
UNSTRUCTURED_API_URL: Optional[str] = Field(
description="API URL for Unstructured",
default=None,
)
UNSTRUCTURED_API_KEY: Optional[str] = Field(
description="API key for Unstructured",
default=None,
)
class DataSetConfig(BaseSettings):
"""
Dataset configs
"""
CLEAN_DAY_SETTING: PositiveInt = Field(
description="interval in days for cleaning up dataset",
default=30,
)
DATASET_OPERATOR_ENABLED: bool = Field(
description="whether to enable dataset operator",
default=False,
)
class WorkspaceConfig(BaseSettings):
"""
Workspace configs
"""
INVITE_EXPIRY_HOURS: PositiveInt = Field(
description="workspaces invitation expiration in hours",
default=72,
)
class IndexingConfig(BaseSettings):
"""
Indexing configs.
"""
INDEXING_MAX_SEGMENTATION_TOKENS_LENGTH: PositiveInt = Field(
description="max segmentation token length for indexing",
default=1000,
)
class ImageFormatConfig(BaseSettings):
MULTIMODAL_SEND_IMAGE_FORMAT: str = Field(
description="multi model send image format, support base64, url, default is base64",
default="base64",
)
class CeleryBeatConfig(BaseSettings):
CELERY_BEAT_SCHEDULER_TIME: int = Field(
description="the time of the celery scheduler, default to 1 day",
default=1,
)
class PositionConfig(BaseSettings):
POSITION_PROVIDER_PINS: str = Field(
description="The heads of model providers",
default="",
)
POSITION_PROVIDER_INCLUDES: str = Field(
description="The included model providers",
default="",
)
POSITION_PROVIDER_EXCLUDES: str = Field(
description="The excluded model providers",
default="",
)
POSITION_TOOL_PINS: str = Field(
description="The heads of tools",
default="",
)
POSITION_TOOL_INCLUDES: str = Field(
description="The included tools",
default="",
)
POSITION_TOOL_EXCLUDES: str = Field(
description="The excluded tools",
default="",
)
@computed_field
def POSITION_PROVIDER_PINS_LIST(self) -> list[str]:
return [item.strip() for item in self.POSITION_PROVIDER_PINS.split(",") if item.strip() != ""]
@computed_field
def POSITION_PROVIDER_INCLUDES_SET(self) -> set[str]:
return {item.strip() for item in self.POSITION_PROVIDER_INCLUDES.split(",") if item.strip() != ""}
@computed_field
def POSITION_PROVIDER_EXCLUDES_SET(self) -> set[str]:
return {item.strip() for item in self.POSITION_PROVIDER_EXCLUDES.split(",") if item.strip() != ""}
@computed_field
def POSITION_TOOL_PINS_LIST(self) -> list[str]:
return [item.strip() for item in self.POSITION_TOOL_PINS.split(",") if item.strip() != ""]
@computed_field
def POSITION_TOOL_INCLUDES_SET(self) -> set[str]:
return {item.strip() for item in self.POSITION_TOOL_INCLUDES.split(",") if item.strip() != ""}
@computed_field
def POSITION_TOOL_EXCLUDES_SET(self) -> set[str]:
return {item.strip() for item in self.POSITION_TOOL_EXCLUDES.split(",") if item.strip() != ""}
class FeatureConfig(
# place the configs in alphabet order
AppExecutionConfig,
BillingConfig,
CodeExecutionSandboxConfig,
DataSetConfig,
EndpointConfig,
FileAccessConfig,
FileUploadConfig,
HttpConfig,
ImageFormatConfig,
InnerAPIConfig,
IndexingConfig,
LoggingConfig,
MailConfig,
ModelLoadBalanceConfig,
ModerationConfig,
OAuthConfig,
RagEtlConfig,
SecurityConfig,
ToolConfig,
UpdateConfig,
WorkflowConfig,
WorkspaceConfig,
PositionConfig,
# hosted services config
HostedServiceConfig,
CeleryBeatConfig,
):
pass

View File

@ -0,0 +1,208 @@
from typing import Optional
from pydantic import Field, NonNegativeInt
from pydantic_settings import BaseSettings
class HostedOpenAiConfig(BaseSettings):
"""
Hosted OpenAI service config
"""
HOSTED_OPENAI_API_KEY: Optional[str] = Field(
description="",
default=None,
)
HOSTED_OPENAI_API_BASE: Optional[str] = Field(
description="",
default=None,
)
HOSTED_OPENAI_API_ORGANIZATION: Optional[str] = Field(
description="",
default=None,
)
HOSTED_OPENAI_TRIAL_ENABLED: bool = Field(
description="",
default=False,
)
HOSTED_OPENAI_TRIAL_MODELS: str = Field(
description="",
default="gpt-3.5-turbo,"
"gpt-3.5-turbo-1106,"
"gpt-3.5-turbo-instruct,"
"gpt-3.5-turbo-16k,"
"gpt-3.5-turbo-16k-0613,"
"gpt-3.5-turbo-0613,"
"gpt-3.5-turbo-0125,"
"text-davinci-003",
)
HOSTED_OPENAI_QUOTA_LIMIT: NonNegativeInt = Field(
description="",
default=200,
)
HOSTED_OPENAI_PAID_ENABLED: bool = Field(
description="",
default=False,
)
HOSTED_OPENAI_PAID_MODELS: str = Field(
description="",
default="gpt-4,"
"gpt-4-turbo-preview,"
"gpt-4-turbo-2024-04-09,"
"gpt-4-1106-preview,"
"gpt-4-0125-preview,"
"gpt-3.5-turbo,"
"gpt-3.5-turbo-16k,"
"gpt-3.5-turbo-16k-0613,"
"gpt-3.5-turbo-1106,"
"gpt-3.5-turbo-0613,"
"gpt-3.5-turbo-0125,"
"gpt-3.5-turbo-instruct,"
"text-davinci-003",
)
class HostedAzureOpenAiConfig(BaseSettings):
"""
Hosted Azure OpenAI service config
"""
HOSTED_AZURE_OPENAI_ENABLED: bool = Field(
description="",
default=False,
)
HOSTED_AZURE_OPENAI_API_KEY: Optional[str] = Field(
description="",
default=None,
)
HOSTED_AZURE_OPENAI_API_BASE: Optional[str] = Field(
description="",
default=None,
)
HOSTED_AZURE_OPENAI_QUOTA_LIMIT: NonNegativeInt = Field(
description="",
default=200,
)
class HostedAnthropicConfig(BaseSettings):
"""
Hosted Anthropic service config
"""
HOSTED_ANTHROPIC_API_BASE: Optional[str] = Field(
description="",
default=None,
)
HOSTED_ANTHROPIC_API_KEY: Optional[str] = Field(
description="",
default=None,
)
HOSTED_ANTHROPIC_TRIAL_ENABLED: bool = Field(
description="",
default=False,
)
HOSTED_ANTHROPIC_QUOTA_LIMIT: NonNegativeInt = Field(
description="",
default=600000,
)
HOSTED_ANTHROPIC_PAID_ENABLED: bool = Field(
description="",
default=False,
)
class HostedMinmaxConfig(BaseSettings):
"""
Hosted Minimax service config
"""
HOSTED_MINIMAX_ENABLED: bool = Field(
description="",
default=False,
)
class HostedSparkConfig(BaseSettings):
"""
Hosted Spark service config
"""
HOSTED_SPARK_ENABLED: bool = Field(
description="",
default=False,
)
class HostedZhipuAIConfig(BaseSettings):
"""
Hosted ZhipuAI service config
"""
HOSTED_ZHIPUAI_ENABLED: bool = Field(
description="",
default=False,
)
class HostedModerationConfig(BaseSettings):
"""
Hosted Moderation service config
"""
HOSTED_MODERATION_ENABLED: bool = Field(
description="",
default=False,
)
HOSTED_MODERATION_PROVIDERS: str = Field(
description="",
default="",
)
class HostedFetchAppTemplateConfig(BaseSettings):
"""
Hosted app template fetching config
"""
HOSTED_FETCH_APP_TEMPLATES_MODE: str = Field(
description="the mode for fetching app templates,"
" default to remote,"
" available values: remote, db, builtin",
default="remote",
)
HOSTED_FETCH_APP_TEMPLATES_REMOTE_DOMAIN: str = Field(
description="the domain for fetching remote app templates",
default="https://tmpl.dify.ai",
)
class HostedServiceConfig(
# place the configs in alphabet order
HostedAnthropicConfig,
HostedAzureOpenAiConfig,
HostedFetchAppTemplateConfig,
HostedMinmaxConfig,
HostedOpenAiConfig,
HostedSparkConfig,
HostedZhipuAIConfig,
# moderation
HostedModerationConfig,
):
pass

View File

@ -0,0 +1,225 @@
from typing import Any, Optional
from urllib.parse import quote_plus
from pydantic import Field, NonNegativeInt, PositiveFloat, PositiveInt, computed_field
from pydantic_settings import BaseSettings
from configs.middleware.cache.redis_config import RedisConfig
from configs.middleware.storage.aliyun_oss_storage_config import AliyunOSSStorageConfig
from configs.middleware.storage.amazon_s3_storage_config import S3StorageConfig
from configs.middleware.storage.azure_blob_storage_config import AzureBlobStorageConfig
from configs.middleware.storage.google_cloud_storage_config import GoogleCloudStorageConfig
from configs.middleware.storage.huawei_obs_storage_config import HuaweiCloudOBSStorageConfig
from configs.middleware.storage.oci_storage_config import OCIStorageConfig
from configs.middleware.storage.tencent_cos_storage_config import TencentCloudCOSStorageConfig
from configs.middleware.storage.volcengine_tos_storage_config import VolcengineTOSStorageConfig
from configs.middleware.vdb.analyticdb_config import AnalyticdbConfig
from configs.middleware.vdb.chroma_config import ChromaConfig
from configs.middleware.vdb.elasticsearch_config import ElasticsearchConfig
from configs.middleware.vdb.milvus_config import MilvusConfig
from configs.middleware.vdb.myscale_config import MyScaleConfig
from configs.middleware.vdb.opensearch_config import OpenSearchConfig
from configs.middleware.vdb.oracle_config import OracleConfig
from configs.middleware.vdb.pgvector_config import PGVectorConfig
from configs.middleware.vdb.pgvectors_config import PGVectoRSConfig
from configs.middleware.vdb.qdrant_config import QdrantConfig
from configs.middleware.vdb.relyt_config import RelytConfig
from configs.middleware.vdb.tencent_vector_config import TencentVectorDBConfig
from configs.middleware.vdb.tidb_vector_config import TiDBVectorConfig
from configs.middleware.vdb.weaviate_config import WeaviateConfig
class StorageConfig(BaseSettings):
STORAGE_TYPE: str = Field(
description="storage type,"
" default to `local`,"
" available values are `local`, `s3`, `azure-blob`, `aliyun-oss`, `google-storage`.",
default="local",
)
STORAGE_LOCAL_PATH: str = Field(
description="local storage path",
default="storage",
)
class VectorStoreConfig(BaseSettings):
VECTOR_STORE: Optional[str] = Field(
description="vector store type",
default=None,
)
class KeywordStoreConfig(BaseSettings):
KEYWORD_STORE: str = Field(
description="keyword store type",
default="jieba",
)
class DatabaseConfig(BaseSettings):
DB_HOST: str = Field(
description="db host",
default="localhost",
)
DB_PORT: PositiveInt = Field(
description="db port",
default=5432,
)
DB_USERNAME: str = Field(
description="db username",
default="postgres",
)
DB_PASSWORD: str = Field(
description="db password",
default="",
)
DB_DATABASE: str = Field(
description="db database",
default="dify",
)
DB_CHARSET: str = Field(
description="db charset",
default="",
)
DB_EXTRAS: str = Field(
description="db extras options. Example: keepalives_idle=60&keepalives=1",
default="",
)
SQLALCHEMY_DATABASE_URI_SCHEME: str = Field(
description="db uri scheme",
default="postgresql",
)
@computed_field
@property
def SQLALCHEMY_DATABASE_URI(self) -> str:
db_extras = (
f"{self.DB_EXTRAS}&client_encoding={self.DB_CHARSET}" if self.DB_CHARSET else self.DB_EXTRAS
).strip("&")
db_extras = f"?{db_extras}" if db_extras else ""
return (
f"{self.SQLALCHEMY_DATABASE_URI_SCHEME}://"
f"{quote_plus(self.DB_USERNAME)}:{quote_plus(self.DB_PASSWORD)}@{self.DB_HOST}:{self.DB_PORT}/{self.DB_DATABASE}"
f"{db_extras}"
)
SQLALCHEMY_POOL_SIZE: NonNegativeInt = Field(
description="pool size of SqlAlchemy",
default=30,
)
SQLALCHEMY_MAX_OVERFLOW: NonNegativeInt = Field(
description="max overflows for SqlAlchemy",
default=10,
)
SQLALCHEMY_POOL_RECYCLE: NonNegativeInt = Field(
description="SqlAlchemy pool recycle",
default=3600,
)
SQLALCHEMY_POOL_PRE_PING: bool = Field(
description="whether to enable pool pre-ping in SqlAlchemy",
default=False,
)
SQLALCHEMY_ECHO: bool | str = Field(
description="whether to enable SqlAlchemy echo",
default=False,
)
@computed_field
@property
def SQLALCHEMY_ENGINE_OPTIONS(self) -> dict[str, Any]:
return {
"pool_size": self.SQLALCHEMY_POOL_SIZE,
"max_overflow": self.SQLALCHEMY_MAX_OVERFLOW,
"pool_recycle": self.SQLALCHEMY_POOL_RECYCLE,
"pool_pre_ping": self.SQLALCHEMY_POOL_PRE_PING,
"connect_args": {"options": "-c timezone=UTC"},
}
class CeleryConfig(DatabaseConfig):
CELERY_BACKEND: str = Field(
description="Celery backend, available values are `database`, `redis`",
default="database",
)
CELERY_BROKER_URL: Optional[str] = Field(
description="CELERY_BROKER_URL",
default=None,
)
CELERY_USE_SENTINEL: Optional[bool] = Field(
description="Whether to use Redis Sentinel mode",
default=False,
)
CELERY_SENTINEL_MASTER_NAME: Optional[str] = Field(
description="Redis Sentinel master name",
default=None,
)
CELERY_SENTINEL_SOCKET_TIMEOUT: Optional[PositiveFloat] = Field(
description="Redis Sentinel socket timeout",
default=0.1,
)
@computed_field
@property
def CELERY_RESULT_BACKEND(self) -> str | None:
return (
"db+{}".format(self.SQLALCHEMY_DATABASE_URI)
if self.CELERY_BACKEND == "database"
else self.CELERY_BROKER_URL
)
@computed_field
@property
def BROKER_USE_SSL(self) -> bool:
return self.CELERY_BROKER_URL.startswith("rediss://") if self.CELERY_BROKER_URL else False
class MiddlewareConfig(
# place the configs in alphabet order
CeleryConfig,
DatabaseConfig,
KeywordStoreConfig,
RedisConfig,
# configs of storage and storage providers
StorageConfig,
AliyunOSSStorageConfig,
AzureBlobStorageConfig,
GoogleCloudStorageConfig,
TencentCloudCOSStorageConfig,
HuaweiCloudOBSStorageConfig,
VolcengineTOSStorageConfig,
S3StorageConfig,
OCIStorageConfig,
# configs of vdb and vdb providers
VectorStoreConfig,
AnalyticdbConfig,
ChromaConfig,
MilvusConfig,
MyScaleConfig,
OpenSearchConfig,
OracleConfig,
PGVectorConfig,
PGVectoRSConfig,
QdrantConfig,
RelytConfig,
TencentVectorDBConfig,
TiDBVectorConfig,
WeaviateConfig,
ElasticsearchConfig,
):
pass

View File

@ -0,0 +1,70 @@
from typing import Optional
from pydantic import Field, NonNegativeInt, PositiveFloat, PositiveInt
from pydantic_settings import BaseSettings
class RedisConfig(BaseSettings):
"""
Redis configs
"""
REDIS_HOST: str = Field(
description="Redis host",
default="localhost",
)
REDIS_PORT: PositiveInt = Field(
description="Redis port",
default=6379,
)
REDIS_USERNAME: Optional[str] = Field(
description="Redis username",
default=None,
)
REDIS_PASSWORD: Optional[str] = Field(
description="Redis password",
default=None,
)
REDIS_DB: NonNegativeInt = Field(
description="Redis database id, default to 0",
default=0,
)
REDIS_USE_SSL: bool = Field(
description="whether to use SSL for Redis connection",
default=False,
)
REDIS_USE_SENTINEL: Optional[bool] = Field(
description="Whether to use Redis Sentinel mode",
default=False,
)
REDIS_SENTINELS: Optional[str] = Field(
description="Redis Sentinel nodes",
default=None,
)
REDIS_SENTINEL_SERVICE_NAME: Optional[str] = Field(
description="Redis Sentinel service name",
default=None,
)
REDIS_SENTINEL_USERNAME: Optional[str] = Field(
description="Redis Sentinel username",
default=None,
)
REDIS_SENTINEL_PASSWORD: Optional[str] = Field(
description="Redis Sentinel password",
default=None,
)
REDIS_SENTINEL_SOCKET_TIMEOUT: Optional[PositiveFloat] = Field(
description="Redis Sentinel socket timeout",
default=0.1,
)

View File

@ -0,0 +1,45 @@
from typing import Optional
from pydantic import Field
from pydantic_settings import BaseSettings
class AliyunOSSStorageConfig(BaseSettings):
"""
Aliyun storage configs
"""
ALIYUN_OSS_BUCKET_NAME: Optional[str] = Field(
description="Aliyun OSS bucket name",
default=None,
)
ALIYUN_OSS_ACCESS_KEY: Optional[str] = Field(
description="Aliyun OSS access key",
default=None,
)
ALIYUN_OSS_SECRET_KEY: Optional[str] = Field(
description="Aliyun OSS secret key",
default=None,
)
ALIYUN_OSS_ENDPOINT: Optional[str] = Field(
description="Aliyun OSS endpoint URL",
default=None,
)
ALIYUN_OSS_REGION: Optional[str] = Field(
description="Aliyun OSS region",
default=None,
)
ALIYUN_OSS_AUTH_VERSION: Optional[str] = Field(
description="Aliyun OSS authentication version",
default=None,
)
ALIYUN_OSS_PATH: Optional[str] = Field(
description="Aliyun OSS path",
default=None,
)

View File

@ -0,0 +1,45 @@
from typing import Optional
from pydantic import Field
from pydantic_settings import BaseSettings
class S3StorageConfig(BaseSettings):
"""
S3 storage configs
"""
S3_ENDPOINT: Optional[str] = Field(
description="S3 storage endpoint",
default=None,
)
S3_REGION: Optional[str] = Field(
description="S3 storage region",
default=None,
)
S3_BUCKET_NAME: Optional[str] = Field(
description="S3 storage bucket name",
default=None,
)
S3_ACCESS_KEY: Optional[str] = Field(
description="S3 storage access key",
default=None,
)
S3_SECRET_KEY: Optional[str] = Field(
description="S3 storage secret key",
default=None,
)
S3_ADDRESS_STYLE: str = Field(
description="S3 storage address style",
default="auto",
)
S3_USE_AWS_MANAGED_IAM: bool = Field(
description="whether to use aws managed IAM for S3",
default=False,
)

View File

@ -0,0 +1,30 @@
from typing import Optional
from pydantic import Field
from pydantic_settings import BaseSettings
class AzureBlobStorageConfig(BaseSettings):
"""
Azure Blob storage configs
"""
AZURE_BLOB_ACCOUNT_NAME: Optional[str] = Field(
description="Azure Blob account name",
default=None,
)
AZURE_BLOB_ACCOUNT_KEY: Optional[str] = Field(
description="Azure Blob account key",
default=None,
)
AZURE_BLOB_CONTAINER_NAME: Optional[str] = Field(
description="Azure Blob container name",
default=None,
)
AZURE_BLOB_ACCOUNT_URL: Optional[str] = Field(
description="Azure Blob account URL",
default=None,
)

View File

@ -0,0 +1,20 @@
from typing import Optional
from pydantic import Field
from pydantic_settings import BaseSettings
class GoogleCloudStorageConfig(BaseSettings):
"""
Google Cloud storage configs
"""
GOOGLE_STORAGE_BUCKET_NAME: Optional[str] = Field(
description="Google Cloud storage bucket name",
default=None,
)
GOOGLE_STORAGE_SERVICE_ACCOUNT_JSON_BASE64: Optional[str] = Field(
description="Google Cloud storage service account json base64",
default=None,
)

View File

@ -0,0 +1,29 @@
from typing import Optional
from pydantic import BaseModel, Field
class HuaweiCloudOBSStorageConfig(BaseModel):
"""
Huawei Cloud OBS storage configs
"""
HUAWEI_OBS_BUCKET_NAME: Optional[str] = Field(
description="Huawei Cloud OBS bucket name",
default=None,
)
HUAWEI_OBS_ACCESS_KEY: Optional[str] = Field(
description="Huawei Cloud OBS Access key",
default=None,
)
HUAWEI_OBS_SECRET_KEY: Optional[str] = Field(
description="Huawei Cloud OBS Secret key",
default=None,
)
HUAWEI_OBS_SERVER: Optional[str] = Field(
description="Huawei Cloud OBS server URL",
default=None,
)

View File

@ -0,0 +1,35 @@
from typing import Optional
from pydantic import Field
from pydantic_settings import BaseSettings
class OCIStorageConfig(BaseSettings):
"""
OCI storage configs
"""
OCI_ENDPOINT: Optional[str] = Field(
description="OCI storage endpoint",
default=None,
)
OCI_REGION: Optional[str] = Field(
description="OCI storage region",
default=None,
)
OCI_BUCKET_NAME: Optional[str] = Field(
description="OCI storage bucket name",
default=None,
)
OCI_ACCESS_KEY: Optional[str] = Field(
description="OCI storage access key",
default=None,
)
OCI_SECRET_KEY: Optional[str] = Field(
description="OCI storage secret key",
default=None,
)

View File

@ -0,0 +1,35 @@
from typing import Optional
from pydantic import Field
from pydantic_settings import BaseSettings
class TencentCloudCOSStorageConfig(BaseSettings):
"""
Tencent Cloud COS storage configs
"""
TENCENT_COS_BUCKET_NAME: Optional[str] = Field(
description="Tencent Cloud COS bucket name",
default=None,
)
TENCENT_COS_REGION: Optional[str] = Field(
description="Tencent Cloud COS region",
default=None,
)
TENCENT_COS_SECRET_ID: Optional[str] = Field(
description="Tencent Cloud COS secret id",
default=None,
)
TENCENT_COS_SECRET_KEY: Optional[str] = Field(
description="Tencent Cloud COS secret key",
default=None,
)
TENCENT_COS_SCHEME: Optional[str] = Field(
description="Tencent Cloud COS scheme",
default=None,
)

View File

@ -0,0 +1,34 @@
from typing import Optional
from pydantic import BaseModel, Field
class VolcengineTOSStorageConfig(BaseModel):
"""
Volcengine tos storage configs
"""
VOLCENGINE_TOS_BUCKET_NAME: Optional[str] = Field(
description="Volcengine TOS Bucket Name",
default=None,
)
VOLCENGINE_TOS_ACCESS_KEY: Optional[str] = Field(
description="Volcengine TOS Access Key",
default=None,
)
VOLCENGINE_TOS_SECRET_KEY: Optional[str] = Field(
description="Volcengine TOS Secret Key",
default=None,
)
VOLCENGINE_TOS_ENDPOINT: Optional[str] = Field(
description="Volcengine TOS Endpoint URL",
default=None,
)
VOLCENGINE_TOS_REGION: Optional[str] = Field(
description="Volcengine TOS Region",
default=None,
)

View File

@ -0,0 +1,37 @@
from typing import Optional
from pydantic import BaseModel, Field
class AnalyticdbConfig(BaseModel):
"""
Configuration for connecting to AnalyticDB.
Refer to the following documentation for details on obtaining credentials:
https://www.alibabacloud.com/help/en/analyticdb-for-postgresql/getting-started/create-an-instance-instances-with-vector-engine-optimization-enabled
"""
ANALYTICDB_KEY_ID: Optional[str] = Field(
default=None, description="The Access Key ID provided by Alibaba Cloud for authentication."
)
ANALYTICDB_KEY_SECRET: Optional[str] = Field(
default=None, description="The Secret Access Key corresponding to the Access Key ID for secure access."
)
ANALYTICDB_REGION_ID: Optional[str] = Field(
default=None, description="The region where the AnalyticDB instance is deployed (e.g., 'cn-hangzhou')."
)
ANALYTICDB_INSTANCE_ID: Optional[str] = Field(
default=None,
description="The unique identifier of the AnalyticDB instance you want to connect to (e.g., 'gp-ab123456')..",
)
ANALYTICDB_ACCOUNT: Optional[str] = Field(
default=None, description="The account name used to log in to the AnalyticDB instance."
)
ANALYTICDB_PASSWORD: Optional[str] = Field(
default=None, description="The password associated with the AnalyticDB account for authentication."
)
ANALYTICDB_NAMESPACE: Optional[str] = Field(
default=None, description="The namespace within AnalyticDB for schema isolation."
)
ANALYTICDB_NAMESPACE_PASSWORD: Optional[str] = Field(
default=None, description="The password for accessing the specified namespace within the AnalyticDB instance."
)

View File

@ -0,0 +1,40 @@
from typing import Optional
from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings
class ChromaConfig(BaseSettings):
"""
Chroma configs
"""
CHROMA_HOST: Optional[str] = Field(
description="Chroma host",
default=None,
)
CHROMA_PORT: PositiveInt = Field(
description="Chroma port",
default=8000,
)
CHROMA_TENANT: Optional[str] = Field(
description="Chroma database",
default=None,
)
CHROMA_DATABASE: Optional[str] = Field(
description="Chroma database",
default=None,
)
CHROMA_AUTH_PROVIDER: Optional[str] = Field(
description="Chroma authentication provider",
default=None,
)
CHROMA_AUTH_CREDENTIALS: Optional[str] = Field(
description="Chroma authentication credentials",
default=None,
)

View File

@ -0,0 +1,30 @@
from typing import Optional
from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings
class ElasticsearchConfig(BaseSettings):
"""
Elasticsearch configs
"""
ELASTICSEARCH_HOST: Optional[str] = Field(
description="Elasticsearch host",
default="127.0.0.1",
)
ELASTICSEARCH_PORT: PositiveInt = Field(
description="Elasticsearch port",
default=9200,
)
ELASTICSEARCH_USERNAME: Optional[str] = Field(
description="Elasticsearch username",
default="elastic",
)
ELASTICSEARCH_PASSWORD: Optional[str] = Field(
description="Elasticsearch password",
default="elastic",
)

View File

@ -0,0 +1,35 @@
from typing import Optional
from pydantic import Field
from pydantic_settings import BaseSettings
class MilvusConfig(BaseSettings):
"""
Milvus configs
"""
MILVUS_URI: Optional[str] = Field(
description="Milvus uri",
default="http://127.0.0.1:19530",
)
MILVUS_TOKEN: Optional[str] = Field(
description="Milvus token",
default=None,
)
MILVUS_USER: Optional[str] = Field(
description="Milvus user",
default=None,
)
MILVUS_PASSWORD: Optional[str] = Field(
description="Milvus password",
default=None,
)
MILVUS_DATABASE: str = Field(
description="Milvus database, default to `default`",
default="default",
)

View File

@ -0,0 +1,37 @@
from pydantic import BaseModel, Field, PositiveInt
class MyScaleConfig(BaseModel):
"""
MyScale configs
"""
MYSCALE_HOST: str = Field(
description="MyScale host",
default="localhost",
)
MYSCALE_PORT: PositiveInt = Field(
description="MyScale port",
default=8123,
)
MYSCALE_USER: str = Field(
description="MyScale user",
default="default",
)
MYSCALE_PASSWORD: str = Field(
description="MyScale password",
default="",
)
MYSCALE_DATABASE: str = Field(
description="MyScale database name",
default="default",
)
MYSCALE_FTS_PARAMS: str = Field(
description="MyScale fts index parameters",
default="",
)

View File

@ -0,0 +1,35 @@
from typing import Optional
from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings
class OpenSearchConfig(BaseSettings):
"""
OpenSearch configs
"""
OPENSEARCH_HOST: Optional[str] = Field(
description="OpenSearch host",
default=None,
)
OPENSEARCH_PORT: PositiveInt = Field(
description="OpenSearch port",
default=9200,
)
OPENSEARCH_USER: Optional[str] = Field(
description="OpenSearch user",
default=None,
)
OPENSEARCH_PASSWORD: Optional[str] = Field(
description="OpenSearch password",
default=None,
)
OPENSEARCH_SECURE: bool = Field(
description="whether to use SSL connection for OpenSearch",
default=False,
)

View File

@ -0,0 +1,35 @@
from typing import Optional
from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings
class OracleConfig(BaseSettings):
"""
ORACLE configs
"""
ORACLE_HOST: Optional[str] = Field(
description="ORACLE host",
default=None,
)
ORACLE_PORT: Optional[PositiveInt] = Field(
description="ORACLE port",
default=1521,
)
ORACLE_USER: Optional[str] = Field(
description="ORACLE user",
default=None,
)
ORACLE_PASSWORD: Optional[str] = Field(
description="ORACLE password",
default=None,
)
ORACLE_DATABASE: Optional[str] = Field(
description="ORACLE database",
default=None,
)

View File

@ -0,0 +1,35 @@
from typing import Optional
from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings
class PGVectorConfig(BaseSettings):
"""
PGVector configs
"""
PGVECTOR_HOST: Optional[str] = Field(
description="PGVector host",
default=None,
)
PGVECTOR_PORT: Optional[PositiveInt] = Field(
description="PGVector port",
default=5433,
)
PGVECTOR_USER: Optional[str] = Field(
description="PGVector user",
default=None,
)
PGVECTOR_PASSWORD: Optional[str] = Field(
description="PGVector password",
default=None,
)
PGVECTOR_DATABASE: Optional[str] = Field(
description="PGVector database",
default=None,
)

View File

@ -0,0 +1,35 @@
from typing import Optional
from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings
class PGVectoRSConfig(BaseSettings):
"""
PGVectoRS configs
"""
PGVECTO_RS_HOST: Optional[str] = Field(
description="PGVectoRS host",
default=None,
)
PGVECTO_RS_PORT: Optional[PositiveInt] = Field(
description="PGVectoRS port",
default=5431,
)
PGVECTO_RS_USER: Optional[str] = Field(
description="PGVectoRS user",
default=None,
)
PGVECTO_RS_PASSWORD: Optional[str] = Field(
description="PGVectoRS password",
default=None,
)
PGVECTO_RS_DATABASE: Optional[str] = Field(
description="PGVectoRS database",
default=None,
)

View File

@ -0,0 +1,35 @@
from typing import Optional
from pydantic import Field, NonNegativeInt, PositiveInt
from pydantic_settings import BaseSettings
class QdrantConfig(BaseSettings):
"""
Qdrant configs
"""
QDRANT_URL: Optional[str] = Field(
description="Qdrant url",
default=None,
)
QDRANT_API_KEY: Optional[str] = Field(
description="Qdrant api key",
default=None,
)
QDRANT_CLIENT_TIMEOUT: NonNegativeInt = Field(
description="Qdrant client timeout in seconds",
default=20,
)
QDRANT_GRPC_ENABLED: bool = Field(
description="whether enable grpc support for Qdrant connection",
default=False,
)
QDRANT_GRPC_PORT: PositiveInt = Field(
description="Qdrant grpc port",
default=6334,
)

View File

@ -0,0 +1,35 @@
from typing import Optional
from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings
class RelytConfig(BaseSettings):
"""
Relyt configs
"""
RELYT_HOST: Optional[str] = Field(
description="Relyt host",
default=None,
)
RELYT_PORT: PositiveInt = Field(
description="Relyt port",
default=9200,
)
RELYT_USER: Optional[str] = Field(
description="Relyt user",
default=None,
)
RELYT_PASSWORD: Optional[str] = Field(
description="Relyt password",
default=None,
)
RELYT_DATABASE: Optional[str] = Field(
description="Relyt database",
default="default",
)

View File

@ -0,0 +1,50 @@
from typing import Optional
from pydantic import Field, NonNegativeInt, PositiveInt
from pydantic_settings import BaseSettings
class TencentVectorDBConfig(BaseSettings):
"""
Tencent Vector configs
"""
TENCENT_VECTOR_DB_URL: Optional[str] = Field(
description="Tencent Vector URL",
default=None,
)
TENCENT_VECTOR_DB_API_KEY: Optional[str] = Field(
description="Tencent Vector API key",
default=None,
)
TENCENT_VECTOR_DB_TIMEOUT: PositiveInt = Field(
description="Tencent Vector timeout in seconds",
default=30,
)
TENCENT_VECTOR_DB_USERNAME: Optional[str] = Field(
description="Tencent Vector username",
default=None,
)
TENCENT_VECTOR_DB_PASSWORD: Optional[str] = Field(
description="Tencent Vector password",
default=None,
)
TENCENT_VECTOR_DB_SHARD: PositiveInt = Field(
description="Tencent Vector sharding number",
default=1,
)
TENCENT_VECTOR_DB_REPLICAS: NonNegativeInt = Field(
description="Tencent Vector replicas",
default=2,
)
TENCENT_VECTOR_DB_DATABASE: Optional[str] = Field(
description="Tencent Vector Database",
default=None,
)

View File

@ -0,0 +1,35 @@
from typing import Optional
from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings
class TiDBVectorConfig(BaseSettings):
"""
TiDB Vector configs
"""
TIDB_VECTOR_HOST: Optional[str] = Field(
description="TiDB Vector host",
default=None,
)
TIDB_VECTOR_PORT: Optional[PositiveInt] = Field(
description="TiDB Vector port",
default=4000,
)
TIDB_VECTOR_USER: Optional[str] = Field(
description="TiDB Vector user",
default=None,
)
TIDB_VECTOR_PASSWORD: Optional[str] = Field(
description="TiDB Vector password",
default=None,
)
TIDB_VECTOR_DATABASE: Optional[str] = Field(
description="TiDB Vector database",
default=None,
)

View File

@ -0,0 +1,30 @@
from typing import Optional
from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings
class WeaviateConfig(BaseSettings):
"""
Weaviate configs
"""
WEAVIATE_ENDPOINT: Optional[str] = Field(
description="Weaviate endpoint URL",
default=None,
)
WEAVIATE_API_KEY: Optional[str] = Field(
description="Weaviate API key",
default=None,
)
WEAVIATE_GRPC_ENABLED: bool = Field(
description="whether to enable gRPC for Weaviate connection",
default=True,
)
WEAVIATE_BATCH_SIZE: PositiveInt = Field(
description="Weaviate batch size",
default=100,
)

View File

@ -0,0 +1,18 @@
from pydantic import Field
from pydantic_settings import BaseSettings
class PackagingInfo(BaseSettings):
"""
Packaging build information
"""
CURRENT_VERSION: str = Field(
description="Dify version",
default="0.8.2",
)
COMMIT_SHA: str = Field(
description="SHA-1 checksum of the git commit used to build the app",
default="",
)

View File

@ -0,0 +1 @@
HIDDEN_VALUE = "[__HIDDEN__]"

View File

@ -1,27 +1,30 @@
languages = ['en-US', 'zh-Hans', 'pt-BR', 'es-ES', 'fr-FR', 'de-DE', 'ja-JP', 'ko-KR', 'ru-RU', 'it-IT', 'uk-UA', 'vi-VN']
language_timezone_mapping = {
'en-US': 'America/New_York',
'zh-Hans': 'Asia/Shanghai',
'pt-BR': 'America/Sao_Paulo',
'es-ES': 'Europe/Madrid',
'fr-FR': 'Europe/Paris',
'de-DE': 'Europe/Berlin',
'ja-JP': 'Asia/Tokyo',
'ko-KR': 'Asia/Seoul',
'ru-RU': 'Europe/Moscow',
'it-IT': 'Europe/Rome',
'uk-UA': 'Europe/Kyiv',
'vi-VN': 'Asia/Ho_Chi_Minh',
"en-US": "America/New_York",
"zh-Hans": "Asia/Shanghai",
"zh-Hant": "Asia/Taipei",
"pt-BR": "America/Sao_Paulo",
"es-ES": "Europe/Madrid",
"fr-FR": "Europe/Paris",
"de-DE": "Europe/Berlin",
"ja-JP": "Asia/Tokyo",
"ko-KR": "Asia/Seoul",
"ru-RU": "Europe/Moscow",
"it-IT": "Europe/Rome",
"uk-UA": "Europe/Kyiv",
"vi-VN": "Asia/Ho_Chi_Minh",
"ro-RO": "Europe/Bucharest",
"pl-PL": "Europe/Warsaw",
"hi-IN": "Asia/Kolkata",
"tr-TR": "Europe/Istanbul",
"fa-IR": "Asia/Tehran",
}
languages = list(language_timezone_mapping.keys())
def supported_language(lang):
if lang in languages:
return lang
error = ('{lang} is not a valid language.'
.format(lang=lang))
error = "{lang} is not a valid language.".format(lang=lang)
raise ValueError(error)

View File

@ -5,82 +5,79 @@ from models.model import AppMode
default_app_templates = {
# workflow default mode
AppMode.WORKFLOW: {
'app': {
'mode': AppMode.WORKFLOW.value,
'enable_site': True,
'enable_api': True
"app": {
"mode": AppMode.WORKFLOW.value,
"enable_site": True,
"enable_api": True,
}
},
# completion default mode
AppMode.COMPLETION: {
'app': {
'mode': AppMode.COMPLETION.value,
'enable_site': True,
'enable_api': True
"app": {
"mode": AppMode.COMPLETION.value,
"enable_site": True,
"enable_api": True,
},
'model_config': {
'model': {
"model_config": {
"model": {
"provider": "openai",
"name": "gpt-4",
"name": "gpt-4o",
"mode": "chat",
"completion_params": {}
"completion_params": {},
},
'user_input_form': json.dumps([
{
"paragraph": {
"label": "Query",
"variable": "query",
"required": True,
"default": ""
}
}
]),
'pre_prompt': '{{query}}'
"user_input_form": json.dumps(
[
{
"paragraph": {
"label": "Query",
"variable": "query",
"required": True,
"default": "",
},
},
]
),
"pre_prompt": "{{query}}",
},
},
# chat default mode
AppMode.CHAT: {
'app': {
'mode': AppMode.CHAT.value,
'enable_site': True,
'enable_api': True
"app": {
"mode": AppMode.CHAT.value,
"enable_site": True,
"enable_api": True,
},
'model_config': {
'model': {
"model_config": {
"model": {
"provider": "openai",
"name": "gpt-4",
"name": "gpt-4o",
"mode": "chat",
"completion_params": {}
}
}
"completion_params": {},
},
},
},
# advanced-chat default mode
AppMode.ADVANCED_CHAT: {
'app': {
'mode': AppMode.ADVANCED_CHAT.value,
'enable_site': True,
'enable_api': True
}
"app": {
"mode": AppMode.ADVANCED_CHAT.value,
"enable_site": True,
"enable_api": True,
},
},
# agent-chat default mode
AppMode.AGENT_CHAT: {
'app': {
'mode': AppMode.AGENT_CHAT.value,
'enable_site': True,
'enable_api': True
"app": {
"mode": AppMode.AGENT_CHAT.value,
"enable_site": True,
"enable_api": True,
},
'model_config': {
'model': {
"model_config": {
"model": {
"provider": "openai",
"name": "gpt-4",
"name": "gpt-4o",
"mode": "chat",
"completion_params": {}
}
}
}
"completion_params": {},
},
},
},
}

File diff suppressed because one or more lines are too long

View File

@ -0,0 +1,4 @@
TTS_AUTO_PLAY_TIMEOUT = 5
# sleep 20 ms ( 40ms => 1280 byte audio file,20ms => 640 byte audio file)
TTS_AUTO_PLAY_YIELD_CPU_TIME = 0.02

api/contexts/__init__.py Normal file
View File

@ -0,0 +1,7 @@
from contextvars import ContextVar
from core.workflow.entities.variable_pool import VariablePool
tenant_id: ContextVar[str] = ContextVar("tenant_id")
workflow_variable_pool: ContextVar[VariablePool] = ContextVar("workflow_variable_pool")

View File

@ -1,3 +1 @@

View File

@ -2,7 +2,7 @@ from flask import Blueprint
from libs.external_api import ExternalApi
bp = Blueprint('console', __name__, url_prefix='/console/api')
bp = Blueprint("console", __name__, url_prefix="/console/api")
api = ExternalApi(bp)
# Import other controllers
@ -17,9 +17,11 @@ from .app import (
audio,
completion,
conversation,
conversation_variables,
generator,
message,
model_config,
ops_trace,
site,
statistic,
workflow,
@ -29,16 +31,13 @@ from .app import (
)
# Import auth controllers
from .auth import activate, data_source_oauth, login, oauth
from .auth import activate, data_source_bearer_auth, data_source_oauth, forgot_password, login, oauth
# Import billing controllers
from .billing import billing
# Import datasets controllers
from .datasets import data_source, datasets, datasets_document, datasets_segments, file, hit_testing
# Import enterprise controllers
from .enterprise import enterprise_sso
from .datasets import data_source, datasets, datasets_document, datasets_segments, external, file, hit_testing, website
# Import explore controllers
from .explore import (
@ -53,5 +52,8 @@ from .explore import (
workflow,
)
# Import tag controllers
from .tag import tags
# Import workspace controllers
from .workspace import account, members, model_providers, models, tool_providers, workspace
from .workspace import account, load_balancing_config, members, model_providers, models, tool_providers, workspace

View File

@ -15,24 +15,24 @@ from models.model import App, InstalledApp, RecommendedApp
def admin_required(view):
@wraps(view)
def decorated(*args, **kwargs):
if not os.getenv('ADMIN_API_KEY'):
raise Unauthorized('API key is invalid.')
if not os.getenv("ADMIN_API_KEY"):
raise Unauthorized("API key is invalid.")
auth_header = request.headers.get('Authorization')
auth_header = request.headers.get("Authorization")
if auth_header is None:
raise Unauthorized('Authorization header is missing.')
raise Unauthorized("Authorization header is missing.")
if ' ' not in auth_header:
raise Unauthorized('Invalid Authorization header format. Expected \'Bearer <api-key>\' format.')
if " " not in auth_header:
raise Unauthorized("Invalid Authorization header format. Expected 'Bearer <api-key>' format.")
auth_scheme, auth_token = auth_header.split(None, 1)
auth_scheme = auth_scheme.lower()
if auth_scheme != 'bearer':
raise Unauthorized('Invalid Authorization header format. Expected \'Bearer <api-key>\' format.')
if auth_scheme != "bearer":
raise Unauthorized("Invalid Authorization header format. Expected 'Bearer <api-key>' format.")
if os.getenv('ADMIN_API_KEY') != auth_token:
raise Unauthorized('API key is invalid.')
if os.getenv("ADMIN_API_KEY") != auth_token:
raise Unauthorized("API key is invalid.")
return view(*args, **kwargs)
@ -44,33 +44,33 @@ class InsertExploreAppListApi(Resource):
@admin_required
def post(self):
parser = reqparse.RequestParser()
parser.add_argument('app_id', type=str, required=True, nullable=False, location='json')
parser.add_argument('desc', type=str, location='json')
parser.add_argument('copyright', type=str, location='json')
parser.add_argument('privacy_policy', type=str, location='json')
parser.add_argument('language', type=supported_language, required=True, nullable=False, location='json')
parser.add_argument('category', type=str, required=True, nullable=False, location='json')
parser.add_argument('position', type=int, required=True, nullable=False, location='json')
parser.add_argument("app_id", type=str, required=True, nullable=False, location="json")
parser.add_argument("desc", type=str, location="json")
parser.add_argument("copyright", type=str, location="json")
parser.add_argument("privacy_policy", type=str, location="json")
parser.add_argument("custom_disclaimer", type=str, location="json")
parser.add_argument("language", type=supported_language, required=True, nullable=False, location="json")
parser.add_argument("category", type=str, required=True, nullable=False, location="json")
parser.add_argument("position", type=int, required=True, nullable=False, location="json")
args = parser.parse_args()
app = App.query.filter(App.id == args['app_id']).first()
app = App.query.filter(App.id == args["app_id"]).first()
if not app:
raise NotFound(f'App \'{args["app_id"]}\' is not found')
site = app.site
if not site:
desc = args['desc'] if args['desc'] else ''
copy_right = args['copyright'] if args['copyright'] else ''
privacy_policy = args['privacy_policy'] if args['privacy_policy'] else ''
desc = args["desc"] or ""
copy_right = args["copyright"] or ""
privacy_policy = args["privacy_policy"] or ""
custom_disclaimer = args["custom_disclaimer"] or ""
else:
desc = site.description if site.description else \
args['desc'] if args['desc'] else ''
copy_right = site.copyright if site.copyright else \
args['copyright'] if args['copyright'] else ''
privacy_policy = site.privacy_policy if site.privacy_policy else \
args['privacy_policy'] if args['privacy_policy'] else ''
desc = site.description or args["desc"] or ""
copy_right = site.copyright or args["copyright"] or ""
privacy_policy = site.privacy_policy or args["privacy_policy"] or ""
custom_disclaimer = site.custom_disclaimer or args["custom_disclaimer"] or ""
recommended_app = RecommendedApp.query.filter(RecommendedApp.app_id == args['app_id']).first()
recommended_app = RecommendedApp.query.filter(RecommendedApp.app_id == args["app_id"]).first()
if not recommended_app:
recommended_app = RecommendedApp(
@ -78,9 +78,10 @@ class InsertExploreAppListApi(Resource):
description=desc,
copyright=copy_right,
privacy_policy=privacy_policy,
language=args['language'],
category=args['category'],
position=args['position']
custom_disclaimer=custom_disclaimer,
language=args["language"],
category=args["category"],
position=args["position"],
)
db.session.add(recommended_app)
@ -88,20 +89,21 @@ class InsertExploreAppListApi(Resource):
app.is_public = True
db.session.commit()
return {'result': 'success'}, 201
return {"result": "success"}, 201
else:
recommended_app.description = desc
recommended_app.copyright = copy_right
recommended_app.privacy_policy = privacy_policy
recommended_app.language = args['language']
recommended_app.category = args['category']
recommended_app.position = args['position']
recommended_app.custom_disclaimer = custom_disclaimer
recommended_app.language = args["language"]
recommended_app.category = args["category"]
recommended_app.position = args["position"]
app.is_public = True
db.session.commit()
return {'result': 'success'}, 200
return {"result": "success"}, 200
class InsertExploreAppApi(Resource):
@ -110,15 +112,14 @@ class InsertExploreAppApi(Resource):
def delete(self, app_id):
recommended_app = RecommendedApp.query.filter(RecommendedApp.app_id == str(app_id)).first()
if not recommended_app:
return {'result': 'success'}, 204
return {"result": "success"}, 204
app = App.query.filter(App.id == recommended_app.app_id).first()
if app:
app.is_public = False
installed_apps = InstalledApp.query.filter(
InstalledApp.app_id == recommended_app.app_id,
InstalledApp.tenant_id != InstalledApp.app_owner_tenant_id
InstalledApp.app_id == recommended_app.app_id, InstalledApp.tenant_id != InstalledApp.app_owner_tenant_id
).all()
for installed_app in installed_apps:
@ -127,8 +128,8 @@ class InsertExploreAppApi(Resource):
db.session.delete(recommended_app)
db.session.commit()
return {'result': 'success'}, 204
return {"result": "success"}, 204
api.add_resource(InsertExploreAppListApi, '/admin/insert-explore-apps')
api.add_resource(InsertExploreAppApi, '/admin/insert-explore-apps/<uuid:app_id>')
api.add_resource(InsertExploreAppListApi, "/admin/insert-explore-apps")
api.add_resource(InsertExploreAppApi, "/admin/insert-explore-apps/<uuid:app_id>")

View File

@ -14,26 +14,21 @@ from .setup import setup_required
from .wraps import account_initialization_required
api_key_fields = {
'id': fields.String,
'type': fields.String,
'token': fields.String,
'last_used_at': TimestampField,
'created_at': TimestampField
"id": fields.String,
"type": fields.String,
"token": fields.String,
"last_used_at": TimestampField,
"created_at": TimestampField,
}
api_key_list = {
'data': fields.List(fields.Nested(api_key_fields), attribute="items")
}
api_key_list = {"data": fields.List(fields.Nested(api_key_fields), attribute="items")}
def _get_resource(resource_id, tenant_id, resource_model):
resource = resource_model.query.filter_by(
id=resource_id, tenant_id=tenant_id
).first()
resource = resource_model.query.filter_by(id=resource_id, tenant_id=tenant_id).first()
if resource is None:
flask_restful.abort(
404, message=f"{resource_model.__name__} not found.")
flask_restful.abort(404, message=f"{resource_model.__name__} not found.")
return resource
@ -50,30 +45,32 @@ class BaseApiKeyListResource(Resource):
@marshal_with(api_key_list)
def get(self, resource_id):
resource_id = str(resource_id)
_get_resource(resource_id, current_user.current_tenant_id,
self.resource_model)
keys = db.session.query(ApiToken). \
filter(ApiToken.type == self.resource_type, getattr(ApiToken, self.resource_id_field) == resource_id). \
all()
_get_resource(resource_id, current_user.current_tenant_id, self.resource_model)
keys = (
db.session.query(ApiToken)
.filter(ApiToken.type == self.resource_type, getattr(ApiToken, self.resource_id_field) == resource_id)
.all()
)
return {"items": keys}
@marshal_with(api_key_fields)
def post(self, resource_id):
resource_id = str(resource_id)
_get_resource(resource_id, current_user.current_tenant_id,
self.resource_model)
if not current_user.is_admin_or_owner:
_get_resource(resource_id, current_user.current_tenant_id, self.resource_model)
if not current_user.is_editor:
raise Forbidden()
current_key_count = db.session.query(ApiToken). \
filter(ApiToken.type == self.resource_type, getattr(ApiToken, self.resource_id_field) == resource_id). \
count()
current_key_count = (
db.session.query(ApiToken)
.filter(ApiToken.type == self.resource_type, getattr(ApiToken, self.resource_id_field) == resource_id)
.count()
)
if current_key_count >= self.max_keys:
flask_restful.abort(
400,
message=f"Cannot create more than {self.max_keys} API keys for this resource type.",
code='max_keys_exceeded'
code="max_keys_exceeded",
)
key = ApiToken.generate_api_key(self.token_prefix, 24)
@ -97,79 +94,78 @@ class BaseApiKeyResource(Resource):
def delete(self, resource_id, api_key_id):
resource_id = str(resource_id)
api_key_id = str(api_key_id)
_get_resource(resource_id, current_user.current_tenant_id,
self.resource_model)
_get_resource(resource_id, current_user.current_tenant_id, self.resource_model)
# The role of the current user in the ta table must be admin or owner
if not current_user.is_admin_or_owner:
raise Forbidden()
key = db.session.query(ApiToken). \
filter(getattr(ApiToken, self.resource_id_field) == resource_id, ApiToken.type == self.resource_type, ApiToken.id == api_key_id). \
first()
key = (
db.session.query(ApiToken)
.filter(
getattr(ApiToken, self.resource_id_field) == resource_id,
ApiToken.type == self.resource_type,
ApiToken.id == api_key_id,
)
.first()
)
if key is None:
flask_restful.abort(404, message='API key not found')
flask_restful.abort(404, message="API key not found")
db.session.query(ApiToken).filter(ApiToken.id == api_key_id).delete()
db.session.commit()
return {'result': 'success'}, 204
return {"result": "success"}, 204
class AppApiKeyListResource(BaseApiKeyListResource):
def after_request(self, resp):
resp.headers['Access-Control-Allow-Origin'] = '*'
resp.headers['Access-Control-Allow-Credentials'] = 'true'
resp.headers["Access-Control-Allow-Origin"] = "*"
resp.headers["Access-Control-Allow-Credentials"] = "true"
return resp
resource_type = 'app'
resource_type = "app"
resource_model = App
resource_id_field = 'app_id'
token_prefix = 'app-'
resource_id_field = "app_id"
token_prefix = "app-"
class AppApiKeyResource(BaseApiKeyResource):
def after_request(self, resp):
resp.headers['Access-Control-Allow-Origin'] = '*'
resp.headers['Access-Control-Allow-Credentials'] = 'true'
resp.headers["Access-Control-Allow-Origin"] = "*"
resp.headers["Access-Control-Allow-Credentials"] = "true"
return resp
resource_type = 'app'
resource_type = "app"
resource_model = App
resource_id_field = 'app_id'
resource_id_field = "app_id"
class DatasetApiKeyListResource(BaseApiKeyListResource):
def after_request(self, resp):
resp.headers['Access-Control-Allow-Origin'] = '*'
resp.headers['Access-Control-Allow-Credentials'] = 'true'
resp.headers["Access-Control-Allow-Origin"] = "*"
resp.headers["Access-Control-Allow-Credentials"] = "true"
return resp
resource_type = 'dataset'
resource_type = "dataset"
resource_model = Dataset
resource_id_field = 'dataset_id'
token_prefix = 'ds-'
resource_id_field = "dataset_id"
token_prefix = "ds-"
class DatasetApiKeyResource(BaseApiKeyResource):
def after_request(self, resp):
resp.headers['Access-Control-Allow-Origin'] = '*'
resp.headers['Access-Control-Allow-Credentials'] = 'true'
resp.headers["Access-Control-Allow-Origin"] = "*"
resp.headers["Access-Control-Allow-Credentials"] = "true"
return resp
resource_type = 'dataset'
resource_type = "dataset"
resource_model = Dataset
resource_id_field = 'dataset_id'
resource_id_field = "dataset_id"
api.add_resource(AppApiKeyListResource, '/apps/<uuid:resource_id>/api-keys')
api.add_resource(AppApiKeyResource,
'/apps/<uuid:resource_id>/api-keys/<uuid:api_key_id>')
api.add_resource(DatasetApiKeyListResource,
'/datasets/<uuid:resource_id>/api-keys')
api.add_resource(DatasetApiKeyResource,
'/datasets/<uuid:resource_id>/api-keys/<uuid:api_key_id>')
api.add_resource(AppApiKeyListResource, "/apps/<uuid:resource_id>/api-keys")
api.add_resource(AppApiKeyResource, "/apps/<uuid:resource_id>/api-keys/<uuid:api_key_id>")
api.add_resource(DatasetApiKeyListResource, "/datasets/<uuid:resource_id>/api-keys")
api.add_resource(DatasetApiKeyResource, "/datasets/<uuid:resource_id>/api-keys/<uuid:api_key_id>")

View File

@ -8,19 +8,18 @@ from services.advanced_prompt_template_service import AdvancedPromptTemplateServ
class AdvancedPromptTemplateList(Resource):
@setup_required
@login_required
@account_initialization_required
def get(self):
parser = reqparse.RequestParser()
parser.add_argument('app_mode', type=str, required=True, location='args')
parser.add_argument('model_mode', type=str, required=True, location='args')
parser.add_argument('has_context', type=str, required=False, default='true', location='args')
parser.add_argument('model_name', type=str, required=True, location='args')
parser.add_argument("app_mode", type=str, required=True, location="args")
parser.add_argument("model_mode", type=str, required=True, location="args")
parser.add_argument("has_context", type=str, required=False, default="true", location="args")
parser.add_argument("model_name", type=str, required=True, location="args")
args = parser.parse_args()
return AdvancedPromptTemplateService.get_prompt(args)
api.add_resource(AdvancedPromptTemplateList, '/app/prompt-templates')
api.add_resource(AdvancedPromptTemplateList, "/app/prompt-templates")

View File

@ -18,15 +18,12 @@ class AgentLogApi(Resource):
def get(self, app_model):
"""Get agent logs"""
parser = reqparse.RequestParser()
parser.add_argument('message_id', type=uuid_value, required=True, location='args')
parser.add_argument('conversation_id', type=uuid_value, required=True, location='args')
parser.add_argument("message_id", type=uuid_value, required=True, location="args")
parser.add_argument("conversation_id", type=uuid_value, required=True, location="args")
args = parser.parse_args()
return AgentService.get_agent_logs(
app_model,
args['conversation_id'],
args['message_id']
)
api.add_resource(AgentLogApi, '/apps/<uuid:app_id>/agent/logs')
return AgentService.get_agent_logs(app_model, args["conversation_id"], args["message_id"])
api.add_resource(AgentLogApi, "/apps/<uuid:app_id>/agent/logs")

View File

@ -21,24 +21,23 @@ class AnnotationReplyActionApi(Resource):
@setup_required
@login_required
@account_initialization_required
@cloud_edition_billing_resource_check('annotation')
@cloud_edition_billing_resource_check("annotation")
def post(self, app_id, action):
# The role of the current user in the ta table must be admin or owner
if not current_user.is_admin_or_owner:
if not current_user.is_editor:
raise Forbidden()
app_id = str(app_id)
parser = reqparse.RequestParser()
parser.add_argument('score_threshold', required=True, type=float, location='json')
parser.add_argument('embedding_provider_name', required=True, type=str, location='json')
parser.add_argument('embedding_model_name', required=True, type=str, location='json')
parser.add_argument("score_threshold", required=True, type=float, location="json")
parser.add_argument("embedding_provider_name", required=True, type=str, location="json")
parser.add_argument("embedding_model_name", required=True, type=str, location="json")
args = parser.parse_args()
if action == 'enable':
if action == "enable":
result = AppAnnotationService.enable_app_annotation(args, app_id)
elif action == 'disable':
elif action == "disable":
result = AppAnnotationService.disable_app_annotation(app_id)
else:
raise ValueError('Unsupported annotation reply action')
raise ValueError("Unsupported annotation reply action")
return result, 200
@ -47,8 +46,7 @@ class AppAnnotationSettingDetailApi(Resource):
@login_required
@account_initialization_required
def get(self, app_id):
# The role of the current user in the ta table must be admin or owner
if not current_user.is_admin_or_owner:
if not current_user.is_editor:
raise Forbidden()
app_id = str(app_id)
@ -61,15 +59,14 @@ class AppAnnotationSettingUpdateApi(Resource):
@login_required
@account_initialization_required
def post(self, app_id, annotation_setting_id):
# The role of the current user in the ta table must be admin or owner
if not current_user.is_admin_or_owner:
if not current_user.is_editor:
raise Forbidden()
app_id = str(app_id)
annotation_setting_id = str(annotation_setting_id)
parser = reqparse.RequestParser()
parser.add_argument('score_threshold', required=True, type=float, location='json')
parser.add_argument("score_threshold", required=True, type=float, location="json")
args = parser.parse_args()
result = AppAnnotationService.update_app_annotation_setting(app_id, annotation_setting_id, args)
@ -80,29 +77,24 @@ class AnnotationReplyActionStatusApi(Resource):
@setup_required
@login_required
@account_initialization_required
@cloud_edition_billing_resource_check('annotation')
@cloud_edition_billing_resource_check("annotation")
def get(self, app_id, job_id, action):
# The role of the current user in the ta table must be admin or owner
if not current_user.is_admin_or_owner:
if not current_user.is_editor:
raise Forbidden()
job_id = str(job_id)
app_annotation_job_key = '{}_app_annotation_job_{}'.format(action, str(job_id))
app_annotation_job_key = "{}_app_annotation_job_{}".format(action, str(job_id))
cache_result = redis_client.get(app_annotation_job_key)
if cache_result is None:
raise ValueError("The job is not exist.")
job_status = cache_result.decode()
error_msg = ''
if job_status == 'error':
app_annotation_error_key = '{}_app_annotation_error_{}'.format(action, str(job_id))
error_msg = ""
if job_status == "error":
app_annotation_error_key = "{}_app_annotation_error_{}".format(action, str(job_id))
error_msg = redis_client.get(app_annotation_error_key).decode()
return {
'job_id': job_id,
'job_status': job_status,
'error_msg': error_msg
}, 200
return {"job_id": job_id, "job_status": job_status, "error_msg": error_msg}, 200
class AnnotationListApi(Resource):
@ -110,22 +102,21 @@ class AnnotationListApi(Resource):
@login_required
@account_initialization_required
def get(self, app_id):
# The role of the current user in the ta table must be admin or owner
if not current_user.is_admin_or_owner:
if not current_user.is_editor:
raise Forbidden()
page = request.args.get('page', default=1, type=int)
limit = request.args.get('limit', default=20, type=int)
keyword = request.args.get('keyword', default=None, type=str)
page = request.args.get("page", default=1, type=int)
limit = request.args.get("limit", default=20, type=int)
keyword = request.args.get("keyword", default=None, type=str)
app_id = str(app_id)
annotation_list, total = AppAnnotationService.get_annotation_list_by_app_id(app_id, page, limit, keyword)
response = {
'data': marshal(annotation_list, annotation_fields),
'has_more': len(annotation_list) == limit,
'limit': limit,
'total': total,
'page': page
"data": marshal(annotation_list, annotation_fields),
"has_more": len(annotation_list) == limit,
"limit": limit,
"total": total,
"page": page,
}
return response, 200
@ -135,15 +126,12 @@ class AnnotationExportApi(Resource):
@login_required
@account_initialization_required
def get(self, app_id):
# The role of the current user in the ta table must be admin or owner
if not current_user.is_admin_or_owner:
if not current_user.is_editor:
raise Forbidden()
app_id = str(app_id)
annotation_list = AppAnnotationService.export_annotation_list_by_app_id(app_id)
response = {
'data': marshal(annotation_list, annotation_fields)
}
response = {"data": marshal(annotation_list, annotation_fields)}
return response, 200
@ -151,17 +139,16 @@ class AnnotationCreateApi(Resource):
@setup_required
@login_required
@account_initialization_required
@cloud_edition_billing_resource_check('annotation')
@cloud_edition_billing_resource_check("annotation")
@marshal_with(annotation_fields)
def post(self, app_id):
# The role of the current user in the ta table must be admin or owner
if not current_user.is_admin_or_owner:
if not current_user.is_editor:
raise Forbidden()
app_id = str(app_id)
parser = reqparse.RequestParser()
parser.add_argument('question', required=True, type=str, location='json')
parser.add_argument('answer', required=True, type=str, location='json')
parser.add_argument("question", required=True, type=str, location="json")
parser.add_argument("answer", required=True, type=str, location="json")
args = parser.parse_args()
annotation = AppAnnotationService.insert_app_annotation_directly(args, app_id)
return annotation
@ -171,18 +158,17 @@ class AnnotationUpdateDeleteApi(Resource):
@setup_required
@login_required
@account_initialization_required
@cloud_edition_billing_resource_check('annotation')
@cloud_edition_billing_resource_check("annotation")
@marshal_with(annotation_fields)
def post(self, app_id, annotation_id):
# The role of the current user in the ta table must be admin or owner
if not current_user.is_admin_or_owner:
if not current_user.is_editor:
raise Forbidden()
app_id = str(app_id)
annotation_id = str(annotation_id)
parser = reqparse.RequestParser()
parser.add_argument('question', required=True, type=str, location='json')
parser.add_argument('answer', required=True, type=str, location='json')
parser.add_argument("question", required=True, type=str, location="json")
parser.add_argument("answer", required=True, type=str, location="json")
args = parser.parse_args()
annotation = AppAnnotationService.update_app_annotation_directly(args, app_id, annotation_id)
return annotation
@ -191,37 +177,35 @@ class AnnotationUpdateDeleteApi(Resource):
@login_required
@account_initialization_required
def delete(self, app_id, annotation_id):
# The role of the current user in the ta table must be admin or owner
if not current_user.is_admin_or_owner:
if not current_user.is_editor:
raise Forbidden()
app_id = str(app_id)
annotation_id = str(annotation_id)
AppAnnotationService.delete_app_annotation(app_id, annotation_id)
return {'result': 'success'}, 200
return {"result": "success"}, 200
class AnnotationBatchImportApi(Resource):
@setup_required
@login_required
@account_initialization_required
@cloud_edition_billing_resource_check('annotation')
@cloud_edition_billing_resource_check("annotation")
def post(self, app_id):
# The role of the current user in the ta table must be admin or owner
if not current_user.is_admin_or_owner:
if not current_user.is_editor:
raise Forbidden()
app_id = str(app_id)
# get file from request
file = request.files['file']
file = request.files["file"]
# check file
if 'file' not in request.files:
if "file" not in request.files:
raise NoFileUploadedError()
if len(request.files) > 1:
raise TooManyFilesError()
# check file type
if not file.filename.endswith('.csv'):
if not file.filename.endswith(".csv"):
raise ValueError("Invalid file type. Only CSV files are allowed")
return AppAnnotationService.batch_import_app_annotations(app_id, file)
@ -230,28 +214,23 @@ class AnnotationBatchImportStatusApi(Resource):
@setup_required
@login_required
@account_initialization_required
@cloud_edition_billing_resource_check('annotation')
@cloud_edition_billing_resource_check("annotation")
def get(self, app_id, job_id):
# The role of the current user in the ta table must be admin or owner
if not current_user.is_admin_or_owner:
if not current_user.is_editor:
raise Forbidden()
job_id = str(job_id)
indexing_cache_key = 'app_annotation_batch_import_{}'.format(str(job_id))
indexing_cache_key = "app_annotation_batch_import_{}".format(str(job_id))
cache_result = redis_client.get(indexing_cache_key)
if cache_result is None:
raise ValueError("The job is not exist.")
job_status = cache_result.decode()
error_msg = ''
if job_status == 'error':
indexing_error_msg_key = 'app_annotation_batch_import_error_msg_{}'.format(str(job_id))
error_msg = ""
if job_status == "error":
indexing_error_msg_key = "app_annotation_batch_import_error_msg_{}".format(str(job_id))
error_msg = redis_client.get(indexing_error_msg_key).decode()
return {
'job_id': job_id,
'job_status': job_status,
'error_msg': error_msg
}, 200
return {"job_id": job_id, "job_status": job_status, "error_msg": error_msg}, 200
class AnnotationHitHistoryListApi(Resource):
@ -259,34 +238,35 @@ class AnnotationHitHistoryListApi(Resource):
@login_required
@account_initialization_required
def get(self, app_id, annotation_id):
# The role of the current user in the table must be admin or owner
if not current_user.is_admin_or_owner:
if not current_user.is_editor:
raise Forbidden()
page = request.args.get('page', default=1, type=int)
limit = request.args.get('limit', default=20, type=int)
page = request.args.get("page", default=1, type=int)
limit = request.args.get("limit", default=20, type=int)
app_id = str(app_id)
annotation_id = str(annotation_id)
annotation_hit_history_list, total = AppAnnotationService.get_annotation_hit_histories(app_id, annotation_id,
page, limit)
annotation_hit_history_list, total = AppAnnotationService.get_annotation_hit_histories(
app_id, annotation_id, page, limit
)
response = {
'data': marshal(annotation_hit_history_list, annotation_hit_history_fields),
'has_more': len(annotation_hit_history_list) == limit,
'limit': limit,
'total': total,
'page': page
"data": marshal(annotation_hit_history_list, annotation_hit_history_fields),
"has_more": len(annotation_hit_history_list) == limit,
"limit": limit,
"total": total,
"page": page,
}
return response
api.add_resource(AnnotationReplyActionApi, '/apps/<uuid:app_id>/annotation-reply/<string:action>')
api.add_resource(AnnotationReplyActionStatusApi,
'/apps/<uuid:app_id>/annotation-reply/<string:action>/status/<uuid:job_id>')
api.add_resource(AnnotationListApi, '/apps/<uuid:app_id>/annotations')
api.add_resource(AnnotationExportApi, '/apps/<uuid:app_id>/annotations/export')
api.add_resource(AnnotationUpdateDeleteApi, '/apps/<uuid:app_id>/annotations/<uuid:annotation_id>')
api.add_resource(AnnotationBatchImportApi, '/apps/<uuid:app_id>/annotations/batch-import')
api.add_resource(AnnotationBatchImportStatusApi, '/apps/<uuid:app_id>/annotations/batch-import-status/<uuid:job_id>')
api.add_resource(AnnotationHitHistoryListApi, '/apps/<uuid:app_id>/annotations/<uuid:annotation_id>/hit-histories')
api.add_resource(AppAnnotationSettingDetailApi, '/apps/<uuid:app_id>/annotation-setting')
api.add_resource(AppAnnotationSettingUpdateApi, '/apps/<uuid:app_id>/annotation-settings/<uuid:annotation_setting_id>')
api.add_resource(AnnotationReplyActionApi, "/apps/<uuid:app_id>/annotation-reply/<string:action>")
api.add_resource(
AnnotationReplyActionStatusApi, "/apps/<uuid:app_id>/annotation-reply/<string:action>/status/<uuid:job_id>"
)
api.add_resource(AnnotationListApi, "/apps/<uuid:app_id>/annotations")
api.add_resource(AnnotationExportApi, "/apps/<uuid:app_id>/annotations/export")
api.add_resource(AnnotationUpdateDeleteApi, "/apps/<uuid:app_id>/annotations/<uuid:annotation_id>")
api.add_resource(AnnotationBatchImportApi, "/apps/<uuid:app_id>/annotations/batch-import")
api.add_resource(AnnotationBatchImportStatusApi, "/apps/<uuid:app_id>/annotations/batch-import-status/<uuid:job_id>")
api.add_resource(AnnotationHitHistoryListApi, "/apps/<uuid:app_id>/annotations/<uuid:annotation_id>/hit-histories")
api.add_resource(AppAnnotationSettingDetailApi, "/apps/<uuid:app_id>/annotation-setting")
api.add_resource(AppAnnotationSettingUpdateApi, "/apps/<uuid:app_id>/annotation-settings/<uuid:annotation_setting_id>")
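These annotation endpoints all follow the same flask-restful pattern: a role check, a reqparse.RequestParser over the JSON body, then a service call. A minimal, self-contained sketch of that pattern (not Dify code; the route and field names below are only illustrative) is:

from flask import Flask
from flask_restful import Api, Resource, reqparse

app = Flask(__name__)
api = Api(app)

class AnnotationSketch(Resource):
    def post(self):
        # Required JSON fields are validated before any service call;
        # parse_args() aborts with HTTP 400 if a required field is missing.
        parser = reqparse.RequestParser()
        parser.add_argument("question", required=True, type=str, location="json")
        parser.add_argument("answer", required=True, type=str, location="json")
        args = parser.parse_args()
        return {"question": args["question"], "answer": args["answer"]}, 201

api.add_resource(AnnotationSketch, "/sketch/annotations")

if __name__ == "__main__":
    app.run()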

View File

@ -1,70 +1,84 @@
import json
import uuid
from flask_login import current_user
from flask_restful import Resource, inputs, marshal_with, reqparse
from werkzeug.exceptions import BadRequest, Forbidden
from flask_restful import Resource, inputs, marshal, marshal_with, reqparse
from werkzeug.exceptions import BadRequest, Forbidden, abort
from controllers.console import api
from controllers.console.app.wraps import get_app_model
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required, cloud_edition_billing_resource_check
from core.agent.entities import AgentToolEntity
from core.tools.tool_manager import ToolManager
from core.tools.utils.configuration import ToolParameterConfigurationManager
from extensions.ext_database import db
from core.ops.ops_trace_manager import OpsTraceManager
from fields.app_fields import (
app_detail_fields,
app_detail_fields_with_site,
app_pagination_fields,
)
from libs.login import login_required
from models.model import App, AppMode, AppModelConfig
from services.app_dsl_service import AppDslService
from services.app_service import AppService
ALLOW_CREATE_APP_MODES = ['chat', 'agent-chat', 'advanced-chat', 'workflow', 'completion']
ALLOW_CREATE_APP_MODES = ["chat", "agent-chat", "advanced-chat", "workflow", "completion"]
class AppListApi(Resource):
@setup_required
@login_required
@account_initialization_required
@marshal_with(app_pagination_fields)
def get(self):
"""Get app list"""
def uuid_list(value):
try:
return [str(uuid.UUID(v)) for v in value.split(",")]
except ValueError:
abort(400, message="Invalid UUID format in tag_ids.")
parser = reqparse.RequestParser()
parser.add_argument('page', type=inputs.int_range(1, 99999), required=False, default=1, location='args')
parser.add_argument('limit', type=inputs.int_range(1, 100), required=False, default=20, location='args')
parser.add_argument('mode', type=str, choices=['chat', 'workflow', 'agent-chat', 'channel', 'all'], default='all', location='args', required=False)
parser.add_argument('name', type=str, location='args', required=False)
parser.add_argument("page", type=inputs.int_range(1, 99999), required=False, default=1, location="args")
parser.add_argument("limit", type=inputs.int_range(1, 100), required=False, default=20, location="args")
parser.add_argument(
"mode",
type=str,
choices=["chat", "workflow", "agent-chat", "channel", "all"],
default="all",
location="args",
required=False,
)
parser.add_argument("name", type=str, location="args", required=False)
parser.add_argument("tag_ids", type=uuid_list, location="args", required=False)
args = parser.parse_args()
# get app list
app_service = AppService()
app_pagination = app_service.get_paginate_apps(current_user.current_tenant_id, args)
if not app_pagination:
return {"data": [], "total": 0, "page": 1, "limit": 20, "has_more": False}
return app_pagination
return marshal(app_pagination, app_pagination_fields)
@setup_required
@login_required
@account_initialization_required
@marshal_with(app_detail_fields)
@cloud_edition_billing_resource_check('apps')
@cloud_edition_billing_resource_check("apps")
def post(self):
"""Create app"""
parser = reqparse.RequestParser()
parser.add_argument('name', type=str, required=True, location='json')
parser.add_argument('description', type=str, location='json')
parser.add_argument('mode', type=str, choices=ALLOW_CREATE_APP_MODES, location='json')
parser.add_argument('icon', type=str, location='json')
parser.add_argument('icon_background', type=str, location='json')
parser.add_argument("name", type=str, required=True, location="json")
parser.add_argument("description", type=str, location="json")
parser.add_argument("mode", type=str, choices=ALLOW_CREATE_APP_MODES, location="json")
parser.add_argument("icon_type", type=str, location="json")
parser.add_argument("icon", type=str, location="json")
parser.add_argument("icon_background", type=str, location="json")
args = parser.parse_args()
# The role of the current user in the ta table must be admin or owner
if not current_user.is_admin_or_owner:
# The role of the current user in the ta table must be admin, owner, or editor
if not current_user.is_editor:
raise Forbidden()
if 'mode' not in args or args['mode'] is None:
if "mode" not in args or args["mode"] is None:
raise BadRequest("mode is required")
app_service = AppService()
@ -78,29 +92,57 @@ class AppImportApi(Resource):
@login_required
@account_initialization_required
@marshal_with(app_detail_fields_with_site)
@cloud_edition_billing_resource_check('apps')
@cloud_edition_billing_resource_check("apps")
def post(self):
"""Import app"""
# The role of the current user in the ta table must be admin or owner
if not current_user.is_admin_or_owner:
# The role of the current user in the ta table must be admin, owner, or editor
if not current_user.is_editor:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument('data', type=str, required=True, nullable=False, location='json')
parser.add_argument('name', type=str, location='json')
parser.add_argument('description', type=str, location='json')
parser.add_argument('icon', type=str, location='json')
parser.add_argument('icon_background', type=str, location='json')
parser.add_argument("data", type=str, required=True, nullable=False, location="json")
parser.add_argument("name", type=str, location="json")
parser.add_argument("description", type=str, location="json")
parser.add_argument("icon_type", type=str, location="json")
parser.add_argument("icon", type=str, location="json")
parser.add_argument("icon_background", type=str, location="json")
args = parser.parse_args()
app_service = AppService()
app = app_service.import_app(current_user.current_tenant_id, args['data'], args, current_user)
app = AppDslService.import_and_create_new_app(
tenant_id=current_user.current_tenant_id, data=args["data"], args=args, account=current_user
)
return app, 201
class AppImportFromUrlApi(Resource):
@setup_required
@login_required
@account_initialization_required
@marshal_with(app_detail_fields_with_site)
@cloud_edition_billing_resource_check("apps")
def post(self):
"""Import app from url"""
# The role of the current user in the ta table must be admin, owner, or editor
if not current_user.is_editor:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("url", type=str, required=True, nullable=False, location="json")
parser.add_argument("name", type=str, location="json")
parser.add_argument("description", type=str, location="json")
parser.add_argument("icon", type=str, location="json")
parser.add_argument("icon_background", type=str, location="json")
args = parser.parse_args()
app = AppDslService.import_and_create_new_app_from_url(
tenant_id=current_user.current_tenant_id, url=args["url"], args=args, account=current_user
)
return app, 201
class AppApi(Resource):
@setup_required
@login_required
@account_initialization_required
@ -108,43 +150,9 @@ class AppApi(Resource):
@marshal_with(app_detail_fields_with_site)
def get(self, app_model):
"""Get app detail"""
# get original app model config
if app_model.mode == AppMode.AGENT_CHAT.value or app_model.is_agent:
model_config: AppModelConfig = app_model.app_model_config
agent_mode = model_config.agent_mode_dict
# decrypt agent tool parameters if it's secret-input
for tool in agent_mode.get('tools') or []:
if not isinstance(tool, dict) or len(tool.keys()) <= 3:
continue
agent_tool_entity = AgentToolEntity(**tool)
# get tool
try:
tool_runtime = ToolManager.get_agent_tool_runtime(
tenant_id=current_user.current_tenant_id,
agent_tool=agent_tool_entity,
)
manager = ToolParameterConfigurationManager(
tenant_id=current_user.current_tenant_id,
tool_runtime=tool_runtime,
provider_name=agent_tool_entity.provider_id,
provider_type=agent_tool_entity.provider_type,
)
app_service = AppService()
# get decrypted parameters
if agent_tool_entity.tool_parameters:
parameters = manager.decrypt_tool_parameters(agent_tool_entity.tool_parameters or {})
masked_parameter = manager.mask_tool_parameters(parameters or {})
else:
masked_parameter = {}
# override tool parameters
tool['tool_parameters'] = masked_parameter
except Exception as e:
pass
# override agent mode
model_config.agent_mode = json.dumps(agent_mode)
db.session.commit()
app_model = app_service.get_app(app_model)
return app_model
@ -155,11 +163,18 @@ class AppApi(Resource):
@marshal_with(app_detail_fields_with_site)
def put(self, app_model):
"""Update app"""
# The role of the current user in the ta table must be admin, owner, or editor
if not current_user.is_editor:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument('name', type=str, required=True, nullable=False, location='json')
parser.add_argument('description', type=str, location='json')
parser.add_argument('icon', type=str, location='json')
parser.add_argument('icon_background', type=str, location='json')
parser.add_argument("name", type=str, required=True, nullable=False, location="json")
parser.add_argument("description", type=str, location="json")
parser.add_argument("icon_type", type=str, location="json")
parser.add_argument("icon", type=str, location="json")
parser.add_argument("icon_background", type=str, location="json")
parser.add_argument("max_active_requests", type=int, location="json")
parser.add_argument("use_icon_as_answer_icon", type=bool, location="json")
args = parser.parse_args()
app_service = AppService()
@ -173,13 +188,14 @@ class AppApi(Resource):
@get_app_model
def delete(self, app_model):
"""Delete app"""
if not current_user.is_admin_or_owner:
# The role of the current user in the ta table must be admin, owner, or editor
if not current_user.is_editor:
raise Forbidden()
app_service = AppService()
app_service.delete_app(app_model)
return {'result': 'success'}, 204
return {"result": "success"}, 204
class AppCopyApi(Resource):
@ -190,20 +206,22 @@ class AppCopyApi(Resource):
@marshal_with(app_detail_fields_with_site)
def post(self, app_model):
"""Copy app"""
# The role of the current user in the ta table must be admin or owner
if not current_user.is_admin_or_owner:
# The role of the current user in the ta table must be admin, owner, or editor
if not current_user.is_editor:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument('name', type=str, location='json')
parser.add_argument('description', type=str, location='json')
parser.add_argument('icon', type=str, location='json')
parser.add_argument('icon_background', type=str, location='json')
parser.add_argument("name", type=str, location="json")
parser.add_argument("description", type=str, location="json")
parser.add_argument("icon_type", type=str, location="json")
parser.add_argument("icon", type=str, location="json")
parser.add_argument("icon_background", type=str, location="json")
args = parser.parse_args()
app_service = AppService()
data = app_service.export_app(app_model)
app = app_service.import_app(current_user.current_tenant_id, data, args, current_user)
data = AppDslService.export_dsl(app_model=app_model, include_secret=True)
app = AppDslService.import_and_create_new_app(
tenant_id=current_user.current_tenant_id, data=data, args=args, account=current_user
)
return app, 201
@ -215,11 +233,16 @@ class AppExportApi(Resource):
@get_app_model
def get(self, app_model):
"""Export app"""
app_service = AppService()
# The role of the current user in the ta table must be admin, owner, or editor
if not current_user.is_editor:
raise Forbidden()
return {
"data": app_service.export_app(app_model)
}
# Add include_secret params
parser = reqparse.RequestParser()
parser.add_argument("include_secret", type=inputs.boolean, default=False, location="args")
args = parser.parse_args()
return {"data": AppDslService.export_dsl(app_model=app_model, include_secret=args["include_secret"])}
class AppNameApi(Resource):
@ -229,12 +252,16 @@ class AppNameApi(Resource):
@get_app_model
@marshal_with(app_detail_fields)
def post(self, app_model):
# The role of the current user in the ta table must be admin, owner, or editor
if not current_user.is_editor:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument('name', type=str, required=True, location='json')
parser.add_argument("name", type=str, required=True, location="json")
args = parser.parse_args()
app_service = AppService()
app_model = app_service.update_app_name(app_model, args.get('name'))
app_model = app_service.update_app_name(app_model, args.get("name"))
return app_model
@ -246,13 +273,17 @@ class AppIconApi(Resource):
@get_app_model
@marshal_with(app_detail_fields)
def post(self, app_model):
# The role of the current user in the ta table must be admin, owner, or editor
if not current_user.is_editor:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument('icon', type=str, location='json')
parser.add_argument('icon_background', type=str, location='json')
parser.add_argument("icon", type=str, location="json")
parser.add_argument("icon_background", type=str, location="json")
args = parser.parse_args()
app_service = AppService()
app_model = app_service.update_app_icon(app_model, args.get('icon'), args.get('icon_background'))
app_model = app_service.update_app_icon(app_model, args.get("icon"), args.get("icon_background"))
return app_model
@ -264,12 +295,16 @@ class AppSiteStatus(Resource):
@get_app_model
@marshal_with(app_detail_fields)
def post(self, app_model):
# The role of the current user in the ta table must be admin, owner, or editor
if not current_user.is_editor:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument('enable_site', type=bool, required=True, location='json')
parser.add_argument("enable_site", type=bool, required=True, location="json")
args = parser.parse_args()
app_service = AppService()
app_model = app_service.update_app_site_status(app_model, args.get('enable_site'))
app_model = app_service.update_app_site_status(app_model, args.get("enable_site"))
return app_model
@ -281,22 +316,59 @@ class AppApiStatus(Resource):
@get_app_model
@marshal_with(app_detail_fields)
def post(self, app_model):
# The role of the current user in the ta table must be admin or owner
if not current_user.is_admin_or_owner:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument('enable_api', type=bool, required=True, location='json')
parser.add_argument("enable_api", type=bool, required=True, location="json")
args = parser.parse_args()
app_service = AppService()
app_model = app_service.update_app_api_status(app_model, args.get('enable_api'))
app_model = app_service.update_app_api_status(app_model, args.get("enable_api"))
return app_model
api.add_resource(AppListApi, '/apps')
api.add_resource(AppImportApi, '/apps/import')
api.add_resource(AppApi, '/apps/<uuid:app_id>')
api.add_resource(AppCopyApi, '/apps/<uuid:app_id>/copy')
api.add_resource(AppExportApi, '/apps/<uuid:app_id>/export')
api.add_resource(AppNameApi, '/apps/<uuid:app_id>/name')
api.add_resource(AppIconApi, '/apps/<uuid:app_id>/icon')
api.add_resource(AppSiteStatus, '/apps/<uuid:app_id>/site-enable')
api.add_resource(AppApiStatus, '/apps/<uuid:app_id>/api-enable')
class AppTraceApi(Resource):
@setup_required
@login_required
@account_initialization_required
def get(self, app_id):
"""Get app trace"""
app_trace_config = OpsTraceManager.get_app_tracing_config(app_id=app_id)
return app_trace_config
@setup_required
@login_required
@account_initialization_required
def post(self, app_id):
# add app trace
if not current_user.is_admin_or_owner:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("enabled", type=bool, required=True, location="json")
parser.add_argument("tracing_provider", type=str, required=True, location="json")
args = parser.parse_args()
OpsTraceManager.update_app_tracing_config(
app_id=app_id,
enabled=args["enabled"],
tracing_provider=args["tracing_provider"],
)
return {"result": "success"}
api.add_resource(AppListApi, "/apps")
api.add_resource(AppImportApi, "/apps/import")
api.add_resource(AppImportFromUrlApi, "/apps/import/url")
api.add_resource(AppApi, "/apps/<uuid:app_id>")
api.add_resource(AppCopyApi, "/apps/<uuid:app_id>/copy")
api.add_resource(AppExportApi, "/apps/<uuid:app_id>/export")
api.add_resource(AppNameApi, "/apps/<uuid:app_id>/name")
api.add_resource(AppIconApi, "/apps/<uuid:app_id>/icon")
api.add_resource(AppSiteStatus, "/apps/<uuid:app_id>/site-enable")
api.add_resource(AppApiStatus, "/apps/<uuid:app_id>/api-enable")
api.add_resource(AppTraceApi, "/apps/<uuid:app_id>/trace")
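The new tag_ids filter on AppListApi is parsed by the small uuid_list converter defined inside get(); a standalone sketch of that parsing, with made-up sample IDs, looks like:

import uuid

def uuid_list(value: str) -> list[str]:
    # Split the comma-separated query value and normalize each item to a
    # canonical UUID string; a bad item raises ValueError, which the real
    # converter turns into an HTTP 400 via abort().
    return [str(uuid.UUID(v)) for v in value.split(",")]

# Sample IDs below are invented purely for illustration.
print(uuid_list("5c0338a1-9d4c-4a29-9f51-7b1a7e3f2d10,9b0e2d3c-1f4a-4b6e-8c2d-0a1b2c3d4e5f"))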

View File

@ -39,7 +39,7 @@ class ChatMessageAudioApi(Resource):
@account_initialization_required
@get_app_model(mode=[AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT])
def post(self, app_model):
file = request.files['file']
file = request.files["file"]
try:
response = AudioService.transcript_asr(
@ -81,15 +81,32 @@ class ChatMessageTextApi(Resource):
@account_initialization_required
@get_app_model
def post(self, app_model):
try:
response = AudioService.transcript_tts(
app_model=app_model,
text=request.form['text'],
voice=request.form.get('voice'),
streaming=False
)
from werkzeug.exceptions import InternalServerError
return {'data': response.data.decode('latin1')}
try:
parser = reqparse.RequestParser()
parser.add_argument("message_id", type=str, location="json")
parser.add_argument("text", type=str, location="json")
parser.add_argument("voice", type=str, location="json")
parser.add_argument("streaming", type=bool, location="json")
args = parser.parse_args()
message_id = args.get("message_id", None)
text = args.get("text", None)
if (
app_model.mode in {AppMode.ADVANCED_CHAT.value, AppMode.WORKFLOW.value}
and app_model.workflow
and app_model.workflow.features_dict
):
text_to_speech = app_model.workflow.features_dict.get("text_to_speech")
voice = args.get("voice") or text_to_speech.get("voice")
else:
try:
voice = args.get("voice") or app_model.app_model_config.text_to_speech_dict.get("voice")
except Exception:
voice = None
response = AudioService.transcript_tts(app_model=app_model, text=text, message_id=message_id, voice=voice)
return response
except services.errors.app_model_config.AppModelConfigBrokenError:
logging.exception("App model config broken.")
raise AppUnavailableError()
@ -124,12 +141,12 @@ class TextModesApi(Resource):
def get(self, app_model):
try:
parser = reqparse.RequestParser()
parser.add_argument('language', type=str, required=True, location='args')
parser.add_argument("language", type=str, required=True, location="args")
args = parser.parse_args()
response = AudioService.transcript_tts_voices(
tenant_id=app_model.tenant_id,
language=args['language'],
language=args["language"],
)
return response
@ -158,6 +175,6 @@ class TextModesApi(Resource):
raise InternalServerError()
api.add_resource(ChatMessageAudioApi, '/apps/<uuid:app_id>/audio-to-text')
api.add_resource(ChatMessageTextApi, '/apps/<uuid:app_id>/text-to-audio')
api.add_resource(TextModesApi, '/apps/<uuid:app_id>/text-to-audio/voices')
api.add_resource(ChatMessageAudioApi, "/apps/<uuid:app_id>/audio-to-text")
api.add_resource(ChatMessageTextApi, "/apps/<uuid:app_id>/text-to-audio")
api.add_resource(TextModesApi, "/apps/<uuid:app_id>/text-to-audio/voices")
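The reworked ChatMessageTextApi.post resolves the TTS voice from the request first, then from the workflow's text_to_speech feature (or the legacy app model config). A simplified restatement of that fallback, outside Flask and with hypothetical voice names, is:

def resolve_voice(requested: str | None, features: dict | None) -> str | None:
    # Prefer the caller-supplied voice, then the app's configured
    # text_to_speech voice, otherwise None.
    tts = (features or {}).get("text_to_speech") or {}
    return requested or tts.get("voice")

print(resolve_voice(None, {"text_to_speech": {"voice": "alloy"}}))   # -> alloy
print(resolve_voice("nova", {"text_to_speech": {"voice": "alloy"}}))  # -> nova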

View File

@ -17,46 +17,48 @@ from controllers.console.app.error import (
from controllers.console.app.wraps import get_app_model
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from controllers.web.error import InvokeRateLimitError as InvokeRateLimitHttpError
from core.app.apps.base_app_queue_manager import AppQueueManager
from core.app.entities.app_invoke_entities import InvokeFrom
from core.errors.error import ModelCurrentlyNotSupportError, ProviderTokenNotInitError, QuotaExceededError
from core.errors.error import (
AppInvokeQuotaExceededError,
ModelCurrentlyNotSupportError,
ProviderTokenNotInitError,
QuotaExceededError,
)
from core.model_runtime.errors.invoke import InvokeError
from libs import helper
from libs.helper import uuid_value
from libs.login import login_required
from models.model import AppMode
from services.app_generate_service import AppGenerateService
from services.errors.llm import InvokeRateLimitError
# define completion message api for user
class CompletionMessageApi(Resource):
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=AppMode.COMPLETION)
def post(self, app_model):
parser = reqparse.RequestParser()
parser.add_argument('inputs', type=dict, required=True, location='json')
parser.add_argument('query', type=str, location='json', default='')
parser.add_argument('files', type=list, required=False, location='json')
parser.add_argument('model_config', type=dict, required=True, location='json')
parser.add_argument('response_mode', type=str, choices=['blocking', 'streaming'], location='json')
parser.add_argument('retriever_from', type=str, required=False, default='dev', location='json')
parser.add_argument("inputs", type=dict, required=True, location="json")
parser.add_argument("query", type=str, location="json", default="")
parser.add_argument("files", type=list, required=False, location="json")
parser.add_argument("model_config", type=dict, required=True, location="json")
parser.add_argument("response_mode", type=str, choices=["blocking", "streaming"], location="json")
parser.add_argument("retriever_from", type=str, required=False, default="dev", location="json")
args = parser.parse_args()
streaming = args['response_mode'] != 'blocking'
args['auto_generate_name'] = False
streaming = args["response_mode"] != "blocking"
args["auto_generate_name"] = False
account = flask_login.current_user
try:
response = AppGenerateService.generate(
app_model=app_model,
user=account,
args=args,
invoke_from=InvokeFrom.DEBUGGER,
streaming=streaming
app_model=app_model, user=account, args=args, invoke_from=InvokeFrom.DEBUGGER, streaming=streaming
)
return helper.compact_generate_response(response)
@ -75,7 +77,7 @@ class CompletionMessageApi(Resource):
raise ProviderModelCurrentlyNotSupportError()
except InvokeError as e:
raise CompletionRequestError(e.description)
except ValueError as e:
except (ValueError, AppInvokeQuotaExceededError) as e:
raise e
except Exception as e:
logging.exception("internal server error.")
@ -92,7 +94,7 @@ class CompletionMessageStopApi(Resource):
AppQueueManager.set_stop_flag(task_id, InvokeFrom.DEBUGGER, account.id)
return {'result': 'success'}, 200
return {"result": "success"}, 200
class ChatMessageApi(Resource):
@ -102,27 +104,23 @@ class ChatMessageApi(Resource):
@get_app_model(mode=[AppMode.CHAT, AppMode.AGENT_CHAT])
def post(self, app_model):
parser = reqparse.RequestParser()
parser.add_argument('inputs', type=dict, required=True, location='json')
parser.add_argument('query', type=str, required=True, location='json')
parser.add_argument('files', type=list, required=False, location='json')
parser.add_argument('model_config', type=dict, required=True, location='json')
parser.add_argument('conversation_id', type=uuid_value, location='json')
parser.add_argument('response_mode', type=str, choices=['blocking', 'streaming'], location='json')
parser.add_argument('retriever_from', type=str, required=False, default='dev', location='json')
parser.add_argument("inputs", type=dict, required=True, location="json")
parser.add_argument("query", type=str, required=True, location="json")
parser.add_argument("files", type=list, required=False, location="json")
parser.add_argument("model_config", type=dict, required=True, location="json")
parser.add_argument("conversation_id", type=uuid_value, location="json")
parser.add_argument("response_mode", type=str, choices=["blocking", "streaming"], location="json")
parser.add_argument("retriever_from", type=str, required=False, default="dev", location="json")
args = parser.parse_args()
streaming = args['response_mode'] != 'blocking'
args['auto_generate_name'] = False
streaming = args["response_mode"] != "blocking"
args["auto_generate_name"] = False
account = flask_login.current_user
try:
response = AppGenerateService.generate(
app_model=app_model,
user=account,
args=args,
invoke_from=InvokeFrom.DEBUGGER,
streaming=streaming
app_model=app_model, user=account, args=args, invoke_from=InvokeFrom.DEBUGGER, streaming=streaming
)
return helper.compact_generate_response(response)
@ -139,9 +137,11 @@ class ChatMessageApi(Resource):
raise ProviderQuotaExceededError()
except ModelCurrentlyNotSupportError:
raise ProviderModelCurrentlyNotSupportError()
except InvokeRateLimitError as ex:
raise InvokeRateLimitHttpError(ex.description)
except InvokeError as e:
raise CompletionRequestError(e.description)
except ValueError as e:
except (ValueError, AppInvokeQuotaExceededError) as e:
raise e
except Exception as e:
logging.exception("internal server error.")
@ -158,10 +158,10 @@ class ChatMessageStopApi(Resource):
AppQueueManager.set_stop_flag(task_id, InvokeFrom.DEBUGGER, account.id)
return {'result': 'success'}, 200
return {"result": "success"}, 200
api.add_resource(CompletionMessageApi, '/apps/<uuid:app_id>/completion-messages')
api.add_resource(CompletionMessageStopApi, '/apps/<uuid:app_id>/completion-messages/<string:task_id>/stop')
api.add_resource(ChatMessageApi, '/apps/<uuid:app_id>/chat-messages')
api.add_resource(ChatMessageStopApi, '/apps/<uuid:app_id>/chat-messages/<string:task_id>/stop')
api.add_resource(CompletionMessageApi, "/apps/<uuid:app_id>/completion-messages")
api.add_resource(CompletionMessageStopApi, "/apps/<uuid:app_id>/completion-messages/<string:task_id>/stop")
api.add_resource(ChatMessageApi, "/apps/<uuid:app_id>/chat-messages")
api.add_resource(ChatMessageStopApi, "/apps/<uuid:app_id>/chat-messages/<string:task_id>/stop")
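For reference, a debugger call against the ChatMessageApi route might look like the sketch below; the base URL, the auth header, and the model_config payload shape are assumptions rather than anything taken from this diff:

import requests

APP_ID = "00000000-0000-0000-0000-000000000000"  # placeholder app id
resp = requests.post(
    f"http://localhost:5001/console/api/apps/{APP_ID}/chat-messages",  # assumed base URL/prefix
    headers={"Authorization": "Bearer <console-session-token>"},       # assumed auth scheme
    json={
        "inputs": {},
        "query": "Hello",
        "model_config": {},           # real shape depends on the app configuration
        "response_mode": "blocking",  # any other value is treated as streaming
    },
    timeout=30,
)
print(resp.status_code)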

View File

@ -6,7 +6,7 @@ from flask_restful import Resource, marshal_with, reqparse
from flask_restful.inputs import int_range
from sqlalchemy import func, or_
from sqlalchemy.orm import joinedload
from werkzeug.exceptions import NotFound
from werkzeug.exceptions import Forbidden, NotFound
from controllers.console import api
from controllers.console.app.wraps import get_app_model
@ -20,38 +20,38 @@ from fields.conversation_fields import (
conversation_pagination_fields,
conversation_with_summary_pagination_fields,
)
from libs.helper import datetime_string
from libs.helper import DatetimeString
from libs.login import login_required
from models.model import AppMode, Conversation, Message, MessageAnnotation
from models.model import AppMode, Conversation, EndUser, Message, MessageAnnotation
class CompletionConversationApi(Resource):
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=AppMode.COMPLETION)
@marshal_with(conversation_pagination_fields)
def get(self, app_model):
if not current_user.is_editor:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument('keyword', type=str, location='args')
parser.add_argument('start', type=datetime_string('%Y-%m-%d %H:%M'), location='args')
parser.add_argument('end', type=datetime_string('%Y-%m-%d %H:%M'), location='args')
parser.add_argument('annotation_status', type=str,
choices=['annotated', 'not_annotated', 'all'], default='all', location='args')
parser.add_argument('page', type=int_range(1, 99999), default=1, location='args')
parser.add_argument('limit', type=int_range(1, 100), default=20, location='args')
parser.add_argument("keyword", type=str, location="args")
parser.add_argument("start", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
parser.add_argument(
"annotation_status", type=str, choices=["annotated", "not_annotated", "all"], default="all", location="args"
)
parser.add_argument("page", type=int_range(1, 99999), default=1, location="args")
parser.add_argument("limit", type=int_range(1, 100), default=20, location="args")
args = parser.parse_args()
query = db.select(Conversation).where(Conversation.app_id == app_model.id, Conversation.mode == 'completion')
query = db.select(Conversation).where(Conversation.app_id == app_model.id, Conversation.mode == "completion")
if args['keyword']:
query = query.join(
Message, Message.conversation_id == Conversation.id
).filter(
if args["keyword"]:
query = query.join(Message, Message.conversation_id == Conversation.id).filter(
or_(
Message.query.ilike('%{}%'.format(args['keyword'])),
Message.answer.ilike('%{}%'.format(args['keyword']))
Message.query.ilike("%{}%".format(args["keyword"])),
Message.answer.ilike("%{}%".format(args["keyword"])),
)
)
@ -59,8 +59,8 @@ class CompletionConversationApi(Resource):
timezone = pytz.timezone(account.timezone)
utc_timezone = pytz.utc
if args['start']:
start_datetime = datetime.strptime(args['start'], '%Y-%m-%d %H:%M')
if args["start"]:
start_datetime = datetime.strptime(args["start"], "%Y-%m-%d %H:%M")
start_datetime = start_datetime.replace(second=0)
start_datetime_timezone = timezone.localize(start_datetime)
@ -68,8 +68,8 @@ class CompletionConversationApi(Resource):
query = query.where(Conversation.created_at >= start_datetime_utc)
if args['end']:
end_datetime = datetime.strptime(args['end'], '%Y-%m-%d %H:%M')
if args["end"]:
end_datetime = datetime.strptime(args["end"], "%Y-%m-%d %H:%M")
end_datetime = end_datetime.replace(second=59)
end_datetime_timezone = timezone.localize(end_datetime)
@ -77,35 +77,33 @@ class CompletionConversationApi(Resource):
query = query.where(Conversation.created_at < end_datetime_utc)
if args['annotation_status'] == "annotated":
if args["annotation_status"] == "annotated":
query = query.options(joinedload(Conversation.message_annotations)).join(
MessageAnnotation, MessageAnnotation.conversation_id == Conversation.id
)
elif args['annotation_status'] == "not_annotated":
query = query.outerjoin(
MessageAnnotation, MessageAnnotation.conversation_id == Conversation.id
).group_by(Conversation.id).having(func.count(MessageAnnotation.id) == 0)
elif args["annotation_status"] == "not_annotated":
query = (
query.outerjoin(MessageAnnotation, MessageAnnotation.conversation_id == Conversation.id)
.group_by(Conversation.id)
.having(func.count(MessageAnnotation.id) == 0)
)
query = query.order_by(Conversation.created_at.desc())
conversations = db.paginate(
query,
page=args['page'],
per_page=args['limit'],
error_out=False
)
conversations = db.paginate(query, page=args["page"], per_page=args["limit"], error_out=False)
return conversations
class CompletionConversationDetailApi(Resource):
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=AppMode.COMPLETION)
@marshal_with(conversation_message_detail_fields)
def get(self, app_model, conversation_id):
if not current_user.is_editor:
raise Forbidden()
conversation_id = str(conversation_id)
return _get_conversation(app_model, conversation_id)
@ -115,10 +113,15 @@ class CompletionConversationDetailApi(Resource):
@account_initialization_required
@get_app_model(mode=[AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT])
def delete(self, app_model, conversation_id):
if not current_user.is_editor:
raise Forbidden()
conversation_id = str(conversation_id)
conversation = db.session.query(Conversation) \
.filter(Conversation.id == conversation_id, Conversation.app_id == app_model.id).first()
conversation = (
db.session.query(Conversation)
.filter(Conversation.id == conversation_id, Conversation.app_id == app_model.id)
.first()
)
if not conversation:
raise NotFound("Conversation Not Exists.")
@ -126,105 +129,145 @@ class CompletionConversationDetailApi(Resource):
conversation.is_deleted = True
db.session.commit()
return {'result': 'success'}, 204
return {"result": "success"}, 204
class ChatConversationApi(Resource):
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=[AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT])
@marshal_with(conversation_with_summary_pagination_fields)
def get(self, app_model):
if not current_user.is_editor:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument('keyword', type=str, location='args')
parser.add_argument('start', type=datetime_string('%Y-%m-%d %H:%M'), location='args')
parser.add_argument('end', type=datetime_string('%Y-%m-%d %H:%M'), location='args')
parser.add_argument('annotation_status', type=str,
choices=['annotated', 'not_annotated', 'all'], default='all', location='args')
parser.add_argument('message_count_gte', type=int_range(1, 99999), required=False, location='args')
parser.add_argument('page', type=int_range(1, 99999), required=False, default=1, location='args')
parser.add_argument('limit', type=int_range(1, 100), required=False, default=20, location='args')
parser.add_argument("keyword", type=str, location="args")
parser.add_argument("start", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
parser.add_argument(
"annotation_status", type=str, choices=["annotated", "not_annotated", "all"], default="all", location="args"
)
parser.add_argument("message_count_gte", type=int_range(1, 99999), required=False, location="args")
parser.add_argument("page", type=int_range(1, 99999), required=False, default=1, location="args")
parser.add_argument("limit", type=int_range(1, 100), required=False, default=20, location="args")
parser.add_argument(
"sort_by",
type=str,
choices=["created_at", "-created_at", "updated_at", "-updated_at"],
required=False,
default="-updated_at",
location="args",
)
args = parser.parse_args()
subquery = (
db.session.query(
Conversation.id.label("conversation_id"), EndUser.session_id.label("from_end_user_session_id")
)
.outerjoin(EndUser, Conversation.from_end_user_id == EndUser.id)
.subquery()
)
query = db.select(Conversation).where(Conversation.app_id == app_model.id)
if args['keyword']:
query = query.join(
Message, Message.conversation_id == Conversation.id
).filter(
or_(
Message.query.ilike('%{}%'.format(args['keyword'])),
Message.answer.ilike('%{}%'.format(args['keyword'])),
Conversation.name.ilike('%{}%'.format(args['keyword'])),
Conversation.introduction.ilike('%{}%'.format(args['keyword'])),
),
if args["keyword"]:
keyword_filter = "%{}%".format(args["keyword"])
query = (
query.join(
Message,
Message.conversation_id == Conversation.id,
)
.join(subquery, subquery.c.conversation_id == Conversation.id)
.filter(
or_(
Message.query.ilike(keyword_filter),
Message.answer.ilike(keyword_filter),
Conversation.name.ilike(keyword_filter),
Conversation.introduction.ilike(keyword_filter),
subquery.c.from_end_user_session_id.ilike(keyword_filter),
),
)
)
account = current_user
timezone = pytz.timezone(account.timezone)
utc_timezone = pytz.utc
if args['start']:
start_datetime = datetime.strptime(args['start'], '%Y-%m-%d %H:%M')
if args["start"]:
start_datetime = datetime.strptime(args["start"], "%Y-%m-%d %H:%M")
start_datetime = start_datetime.replace(second=0)
start_datetime_timezone = timezone.localize(start_datetime)
start_datetime_utc = start_datetime_timezone.astimezone(utc_timezone)
query = query.where(Conversation.created_at >= start_datetime_utc)
match args["sort_by"]:
case "updated_at" | "-updated_at":
query = query.where(Conversation.updated_at >= start_datetime_utc)
case "created_at" | "-created_at" | _:
query = query.where(Conversation.created_at >= start_datetime_utc)
if args['end']:
end_datetime = datetime.strptime(args['end'], '%Y-%m-%d %H:%M')
if args["end"]:
end_datetime = datetime.strptime(args["end"], "%Y-%m-%d %H:%M")
end_datetime = end_datetime.replace(second=59)
end_datetime_timezone = timezone.localize(end_datetime)
end_datetime_utc = end_datetime_timezone.astimezone(utc_timezone)
query = query.where(Conversation.created_at < end_datetime_utc)
match args["sort_by"]:
case "updated_at" | "-updated_at":
query = query.where(Conversation.updated_at <= end_datetime_utc)
case "created_at" | "-created_at" | _:
query = query.where(Conversation.created_at <= end_datetime_utc)
if args['annotation_status'] == "annotated":
if args["annotation_status"] == "annotated":
query = query.options(joinedload(Conversation.message_annotations)).join(
MessageAnnotation, MessageAnnotation.conversation_id == Conversation.id
)
elif args['annotation_status'] == "not_annotated":
query = query.outerjoin(
MessageAnnotation, MessageAnnotation.conversation_id == Conversation.id
).group_by(Conversation.id).having(func.count(MessageAnnotation.id) == 0)
elif args["annotation_status"] == "not_annotated":
query = (
query.outerjoin(MessageAnnotation, MessageAnnotation.conversation_id == Conversation.id)
.group_by(Conversation.id)
.having(func.count(MessageAnnotation.id) == 0)
)
if args['message_count_gte'] and args['message_count_gte'] >= 1:
if args["message_count_gte"] and args["message_count_gte"] >= 1:
query = (
query.options(joinedload(Conversation.messages))
.join(Message, Message.conversation_id == Conversation.id)
.group_by(Conversation.id)
.having(func.count(Message.id) >= args['message_count_gte'])
.having(func.count(Message.id) >= args["message_count_gte"])
)
if app_model.mode == AppMode.ADVANCED_CHAT.value:
query = query.where(Conversation.invoke_from != InvokeFrom.DEBUGGER.value)
query = query.order_by(Conversation.created_at.desc())
match args["sort_by"]:
case "created_at":
query = query.order_by(Conversation.created_at.asc())
case "-created_at":
query = query.order_by(Conversation.created_at.desc())
case "updated_at":
query = query.order_by(Conversation.updated_at.asc())
case "-updated_at":
query = query.order_by(Conversation.updated_at.desc())
case _:
query = query.order_by(Conversation.created_at.desc())
conversations = db.paginate(
query,
page=args['page'],
per_page=args['limit'],
error_out=False
)
conversations = db.paginate(query, page=args["page"], per_page=args["limit"], error_out=False)
return conversations
class ChatConversationDetailApi(Resource):
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=[AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT])
@marshal_with(conversation_detail_fields)
def get(self, app_model, conversation_id):
if not current_user.is_editor:
raise Forbidden()
conversation_id = str(conversation_id)
return _get_conversation(app_model, conversation_id)
@ -234,10 +277,15 @@ class ChatConversationDetailApi(Resource):
@get_app_model(mode=[AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT])
@account_initialization_required
def delete(self, app_model, conversation_id):
if not current_user.is_editor:
raise Forbidden()
conversation_id = str(conversation_id)
conversation = db.session.query(Conversation) \
.filter(Conversation.id == conversation_id, Conversation.app_id == app_model.id).first()
conversation = (
db.session.query(Conversation)
.filter(Conversation.id == conversation_id, Conversation.app_id == app_model.id)
.first()
)
if not conversation:
raise NotFound("Conversation Not Exists.")
@ -245,18 +293,21 @@ class ChatConversationDetailApi(Resource):
conversation.is_deleted = True
db.session.commit()
return {'result': 'success'}, 204
return {"result": "success"}, 204
api.add_resource(CompletionConversationApi, '/apps/<uuid:app_id>/completion-conversations')
api.add_resource(CompletionConversationDetailApi, '/apps/<uuid:app_id>/completion-conversations/<uuid:conversation_id>')
api.add_resource(ChatConversationApi, '/apps/<uuid:app_id>/chat-conversations')
api.add_resource(ChatConversationDetailApi, '/apps/<uuid:app_id>/chat-conversations/<uuid:conversation_id>')
api.add_resource(CompletionConversationApi, "/apps/<uuid:app_id>/completion-conversations")
api.add_resource(CompletionConversationDetailApi, "/apps/<uuid:app_id>/completion-conversations/<uuid:conversation_id>")
api.add_resource(ChatConversationApi, "/apps/<uuid:app_id>/chat-conversations")
api.add_resource(ChatConversationDetailApi, "/apps/<uuid:app_id>/chat-conversations/<uuid:conversation_id>")
def _get_conversation(app_model, conversation_id):
conversation = db.session.query(Conversation) \
.filter(Conversation.id == conversation_id, Conversation.app_id == app_model.id).first()
conversation = (
db.session.query(Conversation)
.filter(Conversation.id == conversation_id, Conversation.app_id == app_model.id)
.first()
)
if not conversation:
raise NotFound("Conversation Not Exists.")
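ChatConversationApi now accepts a sort_by argument (created_at, -created_at, updated_at, -updated_at) that picks both the filter column for the start/end window and the ordering. A compact restatement of that mapping, separate from the SQLAlchemy query, is:

def parse_sort_by(sort_by: str) -> tuple[str, bool]:
    # Returns (column, descending); unrecognized values fall back to
    # newest-first by created_at, like the default match branch above.
    descending = sort_by.startswith("-")
    column = sort_by.lstrip("-")
    if column not in {"created_at", "updated_at"}:
        return "created_at", True
    return column, descending

print(parse_sort_by("-updated_at"))  # ('updated_at', True)
print(parse_sort_by("bogus"))        # ('created_at', True)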

View File

@ -0,0 +1,61 @@
from flask_restful import Resource, marshal_with, reqparse
from sqlalchemy import select
from sqlalchemy.orm import Session
from controllers.console import api
from controllers.console.app.wraps import get_app_model
from controllers.console.setup import setup_required
from controllers.console.wraps import account_initialization_required
from extensions.ext_database import db
from fields.conversation_variable_fields import paginated_conversation_variable_fields
from libs.login import login_required
from models import ConversationVariable
from models.model import AppMode
class ConversationVariablesApi(Resource):
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=AppMode.ADVANCED_CHAT)
@marshal_with(paginated_conversation_variable_fields)
def get(self, app_model):
parser = reqparse.RequestParser()
parser.add_argument("conversation_id", type=str, location="args")
args = parser.parse_args()
stmt = (
select(ConversationVariable)
.where(ConversationVariable.app_id == app_model.id)
.order_by(ConversationVariable.created_at)
)
if args["conversation_id"]:
stmt = stmt.where(ConversationVariable.conversation_id == args["conversation_id"])
else:
raise ValueError("conversation_id is required")
# NOTE: This is a temporary solution to avoid performance issues.
page = 1
page_size = 100
stmt = stmt.limit(page_size).offset((page - 1) * page_size)
with Session(db.engine) as session:
rows = session.scalars(stmt).all()
return {
"page": page,
"limit": page_size,
"total": len(rows),
"has_more": False,
"data": [
{
"created_at": row.created_at,
"updated_at": row.updated_at,
**row.to_variable().model_dump(),
}
for row in rows
],
}
api.add_resource(ConversationVariablesApi, "/apps/<uuid:app_id>/conversation-variables")
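The new ConversationVariablesApi builds a select() filtered by app and conversation, then applies a fixed first page of 100 rows via limit/offset inside a short-lived Session. A self-contained sketch of that query pattern against an in-memory SQLite table (a hypothetical model, not Dify's ConversationVariable) is:

from sqlalchemy import Column, Integer, String, create_engine, select
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Variable(Base):
    __tablename__ = "variables"
    id = Column(Integer, primary_key=True)
    conversation_id = Column(String)
    name = Column(String)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

# Fixed first page of 100 rows, mirroring the temporary pagination in the endpoint.
page, page_size = 1, 100
stmt = (
    select(Variable)
    .where(Variable.conversation_id == "c1")
    .order_by(Variable.id)
    .limit(page_size)
    .offset((page - 1) * page_size)
)

with Session(engine) as session:
    session.add_all([Variable(conversation_id="c1", name=f"var_{i}") for i in range(3)])
    session.commit()
    rows = session.scalars(stmt).all()

print({"page": page, "limit": page_size, "total": len(rows), "has_more": False})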

View File

@ -2,92 +2,128 @@ from libs.exception import BaseHTTPException
class AppNotFoundError(BaseHTTPException):
error_code = 'app_not_found'
error_code = "app_not_found"
description = "App not found."
code = 404
class ProviderNotInitializeError(BaseHTTPException):
error_code = 'provider_not_initialize'
description = "No valid model provider credentials found. " \
"Please go to Settings -> Model Provider to complete your provider credentials."
error_code = "provider_not_initialize"
description = (
"No valid model provider credentials found. "
"Please go to Settings -> Model Provider to complete your provider credentials."
)
code = 400
class ProviderQuotaExceededError(BaseHTTPException):
error_code = 'provider_quota_exceeded'
description = "Your quota for Dify Hosted Model Provider has been exhausted. " \
"Please go to Settings -> Model Provider to complete your own provider credentials."
error_code = "provider_quota_exceeded"
description = (
"Your quota for Dify Hosted Model Provider has been exhausted. "
"Please go to Settings -> Model Provider to complete your own provider credentials."
)
code = 400
class ProviderModelCurrentlyNotSupportError(BaseHTTPException):
error_code = 'model_currently_not_support'
error_code = "model_currently_not_support"
description = "Dify Hosted OpenAI trial currently not support the GPT-4 model."
code = 400
class ConversationCompletedError(BaseHTTPException):
error_code = 'conversation_completed'
error_code = "conversation_completed"
description = "The conversation has ended. Please start a new conversation."
code = 400
class AppUnavailableError(BaseHTTPException):
error_code = 'app_unavailable'
error_code = "app_unavailable"
description = "App unavailable, please check your app configurations."
code = 400
class CompletionRequestError(BaseHTTPException):
error_code = 'completion_request_error'
error_code = "completion_request_error"
description = "Completion request failed."
code = 400
class AppMoreLikeThisDisabledError(BaseHTTPException):
error_code = 'app_more_like_this_disabled'
error_code = "app_more_like_this_disabled"
description = "The 'More like this' feature is disabled. Please refresh your page."
code = 403
class NoAudioUploadedError(BaseHTTPException):
error_code = 'no_audio_uploaded'
error_code = "no_audio_uploaded"
description = "Please upload your audio."
code = 400
class AudioTooLargeError(BaseHTTPException):
error_code = 'audio_too_large'
error_code = "audio_too_large"
description = "Audio size exceeded. {message}"
code = 413
class UnsupportedAudioTypeError(BaseHTTPException):
error_code = 'unsupported_audio_type'
error_code = "unsupported_audio_type"
description = "Audio type not allowed."
code = 415
class ProviderNotSupportSpeechToTextError(BaseHTTPException):
error_code = 'provider_not_support_speech_to_text'
error_code = "provider_not_support_speech_to_text"
description = "Provider not support speech to text."
code = 400
class NoFileUploadedError(BaseHTTPException):
error_code = 'no_file_uploaded'
error_code = "no_file_uploaded"
description = "Please upload your file."
code = 400
class TooManyFilesError(BaseHTTPException):
error_code = 'too_many_files'
error_code = "too_many_files"
description = "Only one file is allowed."
code = 400
class DraftWorkflowNotExist(BaseHTTPException):
error_code = 'draft_workflow_not_exist'
error_code = "draft_workflow_not_exist"
description = "Draft workflow need to be initialized."
code = 400
class DraftWorkflowNotSync(BaseHTTPException):
error_code = "draft_workflow_not_sync"
description = "Workflow graph might have been modified, please refresh and resubmit."
code = 400
class TracingConfigNotExist(BaseHTTPException):
error_code = "trace_config_not_exist"
description = "Trace config not exist."
code = 400
class TracingConfigIsExist(BaseHTTPException):
error_code = "trace_config_is_exist"
description = "Trace config is exist."
code = 400
class TracingConfigCheckError(BaseHTTPException):
error_code = "trace_config_check_error"
description = "Invalid Credentials."
code = 400
class InvokeRateLimitError(BaseHTTPException):
"""Raised when the Invoke returns rate limit error."""
error_code = "rate_limit_error"
description = "Rate Limit Error"
code = 429
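The new error classes (DraftWorkflowNotSync, the tracing-config errors, InvokeRateLimitError) follow the same declarative pattern as the existing ones: an error_code, a human-readable description, and an HTTP code on a BaseHTTPException subclass. A generic sketch of that pattern, written here against plain werkzeug since Dify's BaseHTTPException internals are not shown in this diff, is:

from werkzeug.exceptions import HTTPException

class RateLimitSketchError(HTTPException):
    # Declarative HTTP error: the subclass only sets the status code and message.
    code = 429
    description = "Rate Limit Error"

try:
    raise RateLimitSketchError()
except HTTPException as e:
    print(e.code, e.description)  # 429 Rate Limit Error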

Some files were not shown because too many files have changed in this diff.