Compare commits


211 Commits

Author SHA1 Message Date
4dc06ee43f revert: https://github.com/langgenius/dify/pull/18554 (#19787) 2025-05-15 22:39:35 +08:00
8081aec730 Fix TiDB vector configuration comment to correctly use 'tidb_vector' (#19767) 2025-05-15 21:32:45 +08:00
aae80681f2 Fix: style of dataset item in chatbot configure with theme dark (#19761) 2025-05-15 17:36:05 +08:00
b862a0cac6 update readme (#19757) 2025-05-15 17:03:45 +08:00
7e618779bc chore: upload describe image. (#19756)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-05-15 17:03:10 +08:00
71704a713b Fix: style of check list in dark mode (#19744) 2025-05-15 16:07:52 +08:00
937d151187 chore: Updates version numbers to 1.4.0 and related services (#19731)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-05-15 16:03:16 +08:00
4ac2f02775 add: new brand image (#19676) 2025-05-15 16:02:58 +08:00
f8a9c363ba fix: update background gradient and line positioning in Empty component (#19745) 2025-05-15 15:57:13 +08:00
6d7f43b901 feat: update email templates (#19739) 2025-05-15 14:57:19 +08:00
95467a3f0b fix: broken behavior of rendering (#19732) 2025-05-15 14:37:00 +08:00
bd7094b9f5 chore: image allow "data:" in csp (#19728) 2025-05-15 14:13:49 +08:00
ff0feaf34e fix: handle EndpointSetupFailedError in BasePluginClient (#19613) 2025-05-15 13:59:43 +08:00
dc75a10989 feat: update branding (#19719)
Co-authored-by: twwu <twwu@dify.ai>
2025-05-15 12:38:20 +08:00
b292990075 Fix: Ensure unique index names for pgvector knowledge tables (#19672)
Co-authored-by: crazywoola <427733928@qq.com>
2025-05-15 11:43:44 +08:00
486a66be54 fix: item data type wrong in iteration (#19709) 2025-05-15 10:54:35 +08:00
dd4419fd5e Revert "Support for copying nodes between workflows (This feature is unrelated to remove functions. When using the copy function, the browser will permanently retain the last copied node)."" (#19708) 2025-05-15 10:26:23 +08:00
303c6ecc1d fix: The init_azure_openai() method in the core/hosting_configuration.py file doesn't work (#19704) (#19705)
Co-authored-by: crazywoola <427733928@qq.com>
2025-05-15 10:02:59 +08:00
85eb55de37 feat(extension): support otel grpc exporter (#19686) 2025-05-14 22:37:27 +08:00
e040f8069b Support for copying nodes between workflows (This feature is unrelated to remove functions. When using the copy function, the browser will permanently retain the last copied node)." (#19687)
Co-authored-by: crazywoola <427733928@qq.com>
2025-05-14 19:03:11 +08:00
17b929124f refactor: simplify success response in dataset API endpoints by returning status code 204 directly (#19685) 2025-05-14 18:44:35 +08:00
79015bf8d9 fix: use different local may not load image (#19693) 2025-05-14 18:41:03 +08:00
1c91736a6d fix: auto translate failed when there is a new file in english (#19671) 2025-05-14 16:01:35 +08:00
1b4fea1794 fix: Referencing Metadata in the response of the External Knowledge A… (#19637) (#19644)
Co-authored-by: satou.kazuhiro <satou.kazuhiro@fanuc.co.jp>
2025-05-14 15:05:28 +08:00
85a44b7349 fix: close browser would reset to browser default language (#19665) 2025-05-14 15:00:28 +08:00
5360180a2a feat: add index for workflow_conversation_variables.conversation_id (#19657)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-05-14 14:16:15 +08:00
2c5f5b0c67 Feat: add search params with theme in links of marketplace (#19648) 2025-05-14 13:46:03 +08:00
9dce0e40b5 fix:fix log formatting field not found in record: 'req_id' (#19575)
Co-authored-by: 刘敏 <min.liu@tongdun.net>
2025-05-14 12:17:35 +08:00
ff20b56074 Add /site API (#19631) 2025-05-14 10:43:36 +08:00
3c953cb0ef fix:#18447:When variables in the workflow are deleted or modified, it is impossible to visually identify subsequent node errors (#18554)
Co-authored-by: crazywoola <427733928@qq.com>
2025-05-14 10:17:15 +08:00
9e795374bc chore: add dev scripts to start api and worker service with uv run (#19633) 2025-05-14 09:54:47 +08:00
3548c133e3 Feat: add theme switcher (#18093) 2025-05-14 09:06:14 +08:00
297d35364e fix(web): optimize action buttons style in the question component (#19626) 2025-05-13 21:31:01 +08:00
b8e305f183 fix: fix can't config Nth item in list Node (#19618) 2025-05-13 19:46:26 +08:00
be51384549 fix: premium badge styling (#19609) 2025-05-13 17:04:26 +08:00
f005434769 fix: not set web prefix use default (#19604) 2025-05-13 16:45:41 +08:00
fabfc7d4d8 fix: remove error message $ symbol (#19587)
Co-authored-by: 刘江波 <jiangbo721@163.com>
Co-authored-by: crazywoola <427733928@qq.com>
2025-05-13 16:20:29 +08:00
10a724cc62 chore: Update dependencies in pyproject.toml (#19598)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-05-13 16:12:17 +08:00
ccc3eeab10 fix(web): Fix metadata modal component (#19573) (#19592) 2025-05-13 15:40:34 +08:00
b0166dbe27 chore: upgrade package version to fix security issue (#19594) 2025-05-13 15:38:50 +08:00
65e9f6651c fix: image use not config host caused page crash (#19590) 2025-05-13 14:38:13 +08:00
934f724130 fix: invitations get suspended when an existing member appears (#19584) 2025-05-13 13:53:52 +08:00
57b3912227 fix: common prerequisite node workflow remove reachable node that failed to streaming llm… (#19552)
Co-authored-by: zhangshibo <zhangshibo@didiglobal.com>
2025-05-13 13:47:29 +08:00
692f922fa4 fix(web): fix the issue where the detail drawer content does not te after editing custom tools (#19460) 2025-05-13 13:47:15 +08:00
33d3bc276e Fix: typo ja translation (#19583) 2025-05-13 13:39:17 +08:00
f0137be719 fix(vector_service): Fixes type hinting and removes unnecessary ignores (#19574)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-05-13 11:29:02 +08:00
c76d763639 add endpoint of get feedbacks (#18697)
Co-authored-by: lizb <lizb@sugon.com>
2025-05-13 10:35:14 +08:00
0fed5c1193 fix(config): Allow DB_EXTRAS to set search_path via options (#19560)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-05-13 10:10:18 +08:00
a8b82d2b67 Fix btns (#19564) 2025-05-13 09:14:08 +08:00
0b22e8b544 chore: speed up api service startup time by defering the imports for trace services (#19504) 2025-05-13 09:13:25 +08:00
085bd1aa93 chore: model.query change to db.session.query (#19551)
Co-authored-by: QuantumGhost <obelisk.reg+git@gmail.com>
2025-05-13 09:13:12 +08:00
f1e7099541 chore(pyproject.toml): Upgrade transformers and resend (#19562)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-05-13 09:12:59 +08:00
31d143de11 chore: disable redis client-side caching by default (#19524) 2025-05-12 21:42:07 +08:00
f3522a282c correct key to 'embedding_model_provider' in docs (#19541) 2025-05-12 21:39:41 +08:00
24f56df8c3 fix(web): able to enter Chinese characters in the view-form-dropdown (#19555)
Co-authored-by: fadeaway <chaofanchi@gmail.com>
2025-05-12 21:38:47 +08:00
14cd71ed0a chore: all model.query replace to db.session.query (#19521) 2025-05-12 15:19:41 +08:00
b00f94df64 fix: replace all dataset.Model.query to db.session.query(Model) (#19509) 2025-05-12 13:52:33 +08:00
49af07f444 fix: use NextJS basePath and WEB_PREFIX to support custom prefix (#19497)
Co-authored-by: johnny0120 <15564476+johnny0120@users.noreply.github.com>
2025-05-12 13:44:41 +08:00
d1c55cb901 chore: remove image csp (#19511) 2025-05-12 10:46:47 +08:00
855e850ef3 feat: enable Redis client-side caching (#19493) 2025-05-12 09:34:25 +08:00
c720e0dd04 refactor(workflow): revamp logging module for loop & iteration nodes (#19484) 2025-05-12 09:32:41 +08:00
87da155477 fix: agent log modal fails to open and the timer is not cleared (#18900) (#19471) 2025-05-12 09:00:57 +08:00
505d4cce78 Revert "sort extensions for review," (#19496) 2025-05-11 16:57:13 +08:00
c431da9571 sort extensions for review, (#19470)
Signed-off-by: zhanluxianshen <zhanluxianshen@163.com>
2025-05-10 20:01:31 +08:00
75259c1ea1 chore: update edu version text (#19485) 2025-05-10 20:00:43 +08:00
b29087b680 fix: db.session.query(TenantAccountJoin) (#19482) 2025-05-10 04:43:56 -07:00
af12cf1bf6 fix_invitation-link.tsx_url_more_basepath_bug (#19453)
Co-authored-by: qingguo <qingguo@lexin.com>
2025-05-10 18:17:16 +08:00
abc61f680a fix delete api response (#19480)
Signed-off-by: kenwoodjw <blackxin55+@gmail.com>
2025-05-10 18:17:05 +08:00
1119790b02 clean rag word_extractor. (#19397)
Signed-off-by: zhanluxianshen <zhanluxianshen@163.com>
2025-05-09 16:39:16 +08:00
56cff485d0 test(vdb/huaweicloudvectordb): Fix the wrong import path (#19413)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-05-09 16:37:59 +08:00
3a85f218ed refactor(workflow): Improve layout structure in VersionHistoryPanel (#19450) 2025-05-09 16:37:32 +08:00
c7a8885d9d fix: fix pypdfium2 version to 4.30.0 (#19443)
Signed-off-by: Yuichiro Utsumi <utsumi.yuichiro@fujitsu.com>
2025-05-09 16:23:20 +08:00
220db55e71 fix: TenantAccountJoin has no attribute 'query' (#19445) 2025-05-09 16:20:32 +08:00
792b321a81 refactor(models): Use the SQLAlchemy base model. (#19435)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-05-09 13:52:05 +08:00
2ad7305349 Revert "perf: optimizing db WorkflowAppLog index" (#19432) 2025-05-09 13:51:57 +08:00
198fbb9b3d fix: support echart function option (#19424) 2025-05-09 13:49:40 +08:00
b4064fa092 test(test_dify_config): Update test to use example environment file (#19427)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-05-09 11:42:51 +08:00
ee3b66bdcd fix: chatbot reopen behabior on iOS (#19406) 2025-05-08 22:52:53 +08:00
a24c20a731 fix: support text wrapping in buttons for long content (#19390) 2025-05-08 22:50:55 +08:00
135b8bd4f5 fix(workflow): Fix the expand/collapse animation effect (#19398) 2025-05-08 22:49:40 +08:00
cbc8ebd8f5 chore: bump pydantic to 2.11 and pydantic-settings to 2.9 (#15049) 2025-05-08 17:39:51 +08:00
58d9d35515 fix: inconsistent metadata definitions (#19343) 2025-05-08 16:33:28 +08:00
736a064bac fix(web): Add unique instanceId & key for AgentStrategy component (#18053) (#19386) 2025-05-08 16:20:51 +08:00
163a76eb6e Bug fix: Invalid edge connection data causes the page to crash. (#19369)
Co-authored-by: hzhufa <hzhufa@linewell.com>
2025-05-08 12:57:10 +08:00
3258a91d5d Feat/add repo to plugin manifest (#19337) 2025-05-07 17:28:38 +08:00
623ac7ea6d feat: add optional hidden property to endpoint items and filter hidden endpoints in endpoint card (#19163) 2025-05-07 16:46:02 +08:00
0358859467 fix: llm_usage.total_tokens stat (#19177) 2025-05-07 16:42:49 +08:00
838812640e fix: reopen switch to 'workflow orchestrate' menu in app detail page (#19274) 2025-05-07 14:58:45 +08:00
bfa652f2d0 fix: metadata filtering condition variable unassigned; fix External K… (#19208) 2025-05-07 14:52:09 +08:00
d1c08a810b feat: store mcp_config when switch agent strategy (#19291) 2025-05-07 14:49:28 +08:00
c457e2b67a clean docker compose env. (#19301)
Signed-off-by: zhanluxianshen <zhanluxianshen@163.com>
2025-05-07 09:25:35 +08:00
c4c20f6ed5 [Observability] Update counter to include http method and target (#19297)
Co-authored-by: QuantumGhost <obelisk.reg+git@gmail.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-05-07 09:17:26 +08:00
f23cf98317 refactor: Remove RepositoryFactory (#19176)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-05-06 21:14:51 +08:00
a6827493f0 chore: slice workflow refresh draft hook (#19292) 2025-05-06 18:24:10 +08:00
9565fe9b1b fix(api): fix alembic offline mode (#19285)
Alembic's offline mode generates SQL from SQLAlchemy migration operations,
providing developers with a clear view of database schema changes without
requiring an active database connection.

However, some migration versions (specifically bbadea11becb and d7999dfa4aae)
were performing database schema introspection, which fails in offline mode
since it requires an actual database connection.

This commit:
- Adds offline mode support by detecting context.is_offline_mode()
- Skips introspection steps when in offline mode
- Adds warning messages in SQL output to inform users that assumptions were made
- Prompts users to review the generated SQL for accuracy

These changes ensure migrations work consistently in both online and offline modes.

Close #19284.
2025-05-06 18:05:19 +08:00
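The pattern this commit describes can be sketched roughly as follows; the table and index names below are placeholders, not the contents of migrations bbadea11becb or d7999dfa4aae:

```python
# Hypothetical illustration of the offline-mode guard described above.
import sqlalchemy as sa
from alembic import context, op


def upgrade():
    if context.is_offline_mode():
        # Offline mode emits SQL without a live connection, so schema
        # introspection is impossible; assume the object is missing and
        # leave a warning in the generated SQL output for review.
        print("-- WARNING: offline mode assumed 'example_idx' does not exist; "
              "review this SQL before applying.")
        index_exists = False
    else:
        inspector = sa.inspect(op.get_bind())
        index_exists = any(
            ix["name"] == "example_idx"
            for ix in inspector.get_indexes("example_table")
        )

    if not index_exists:
        op.create_index("example_idx", "example_table", ["created_at"])
```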
8de24bc16e chore: enhance dev script robustness by determining the script directory (#19209) 2025-05-06 17:02:40 +08:00
3ecc1e0228 Fix: update docs link (#19278) 2025-05-06 17:02:01 +08:00
1abf00e443 Fix doc bug workflow (#19269)
Co-authored-by: liuwangwang <liuwangwang@hikvision.com.cn>
2025-05-06 15:11:08 +08:00
6c515ef47f fix(web): add workspace selector overflow auto (#19265)
Co-authored-by: JMY <jiangmingyao@gf.com.cn>
2025-05-06 13:30:25 +08:00
0b44791eae feat: add mode for /info api (#19264) 2025-05-06 13:24:53 +08:00
0cfc82d731 fix(structured-output): reasoning model's json format parsing (#19261) 2025-05-06 13:16:08 +08:00
b78846078c Fix: hide view chat setting button when no inputs form (#19263) 2025-05-06 13:15:23 +08:00
8537abfff8 chore: avoid repeated type ignore noqa by adding flask_restful and flask_login in mypy import exclusions (#19224) 2025-05-06 11:58:49 +08:00
4b77c9df9d Fix: optional input in batch run (#19257) 2025-05-06 11:33:15 +08:00
b979a8a707 feat: sort variables in the selector by x axis for most recent order (#19243) 2025-05-06 10:59:02 +08:00
9231c197a5 fix: s.filter is not a function (#19250) 2025-05-06 10:26:44 +08:00
8ac3a223a8 fix(api): add missing INNER_API_KEY to InnerAPIConfig (#19166) 2025-05-06 10:02:14 +08:00
5a6f20d575 Optimize the event handling and rendering logic of the component picker (#19232) 2025-05-06 09:48:53 +08:00
c5568f756f fix basic auth if not base64 encode (#19242)
Signed-off-by: kenwoodjw <blackxin55+@gmail.com>
2025-05-06 09:18:37 +08:00
22f5af9987 fix: support non-ascii charactors in filename of the tool files (#19228) 2025-05-06 09:18:11 +08:00
e352ab2bdd chore: required pip and performance improvment in mypy checks (#19225) 2025-05-06 09:16:43 +08:00
bbf513a2cd Fix appURL construction when basePath is empty (#19234) 2025-05-05 22:35:41 +08:00
9bcf837f17 fix: use only supported operators in metadata filter system prompts (#19195) 2025-05-03 20:08:08 +08:00
a212a63e6a fix: time type metadata filtering error (#19192) 2025-05-03 20:07:37 +08:00
e2cae42115 chore: bump celery from 5.4 to 5.5 (#19190) 2025-05-03 20:07:04 +08:00
50fa0d1512 feat: Plugin page related document supports multiple languages (#19197) 2025-05-03 20:03:56 +08:00
bb1d1dc263 fix: fix API tool integration test (#19187) 2025-05-01 14:49:43 +08:00
1ca6dbcdc8 fix: file name incorrect when download file (#19183) 2025-04-30 22:47:59 +08:00
349c3cf7b8 feat(api): Add image multimodal support for LLMNode (#17372)
Enhance `LLMNode` with multimodal capability, introducing support for
image outputs.

This implementation extracts base64-encoded images from LLM responses,
saves them to the storage service, and records the file metadata in the
`ToolFile` table. In conversations, these images are rendered as
markdown-based inline images.
Additionally, the images are included in the LLMNode's output as
file variables, enabling subsequent nodes in the workflow to utilize them.

To integrate file outputs into workflows, adjustments to the frontend code
are necessary.

For multimodal output functionality, updates to related model configurations
are required. Currently, this capability has been applied exclusively to
Google's Gemini models.

Close #15814.

Signed-off-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: -LAN- <laipz8200@outlook.com>
2025-04-30 17:28:02 +08:00
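As a rough illustration of the flow described in this commit (helper names such as save_to_storage and record_tool_file are placeholders, not Dify's actual storage or model APIs):

```python
# Hedged sketch: pull base64 images out of an LLM response, persist them,
# and hand back markdown references plus file variables for later nodes.
import base64
import re
import uuid

IMAGE_RE = re.compile(r"data:image/(png|jpeg);base64,([A-Za-z0-9+/=]+)")


def extract_images(llm_text, save_to_storage, record_tool_file):
    markdown_parts = []
    file_vars = []
    for fmt, payload in IMAGE_RE.findall(llm_text):
        data = base64.b64decode(payload)
        key = f"generated/{uuid.uuid4()}.{fmt}"
        url = save_to_storage(key, data)                          # upload to the storage service
        file_vars.append(record_tool_file(key, f"image/{fmt}"))   # ToolFile-style metadata row
        markdown_parts.append(f"![generated image]({url})")       # inline markdown rendering
    return markdown_parts, file_vars
```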
6c9a9d344a fix: mouse scrolling zooming can not function anymore (#19160) 2025-04-30 16:57:48 +08:00
f8e5341ac0 ci: add diff to lint ci (#17874)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
Co-authored-by: -LAN- <laipz8200@outlook.com>
2025-04-30 16:27:25 +08:00
12c96b93d9 immediately return initialed tiktokenizer instance and remove dead code in usage of tiktokenizer (#17957) 2025-04-30 16:07:20 +08:00
bcc95e520b feat: support remove first and remove last in variable assigner (#19144)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-04-30 15:50:00 +08:00
69b43a955f fix: inconsistent case expression in _process_metadata_filter_func (#19146) 2025-04-30 15:14:01 +08:00
3dff21e0be fix(i18n): add functions to retrieve document and pricing page languages (#19142) 2025-04-30 14:58:49 +08:00
d5ee465bf9 fix: fix render undefined when text children is empty (#19135) 2025-04-30 14:17:41 +08:00
65b7a783fe fix: metadata filter not work (#19020)
Co-authored-by: 金鹏程 <jinpengcheng01@corp.netease.com>
Co-authored-by: crazywoola <427733928@qq.com>
2025-04-30 11:06:03 +08:00
1bc94b92ac fix: fix import LexicalErrorBoundary error (#19124) 2025-04-30 11:05:47 +08:00
5088ab5061 feat(logAndAnn): add control option for question editing feature (#19117) 2025-04-30 10:57:23 +08:00
d70fa2847b add Accept-Ranges header for audio/video files (#19119) 2025-04-30 10:51:27 +08:00
8bf3f5ea78 fix(api): resolve external knowledge API error due to excessive URL validation (#19003)
The `validators.url` method from the `validators==0.21.0` library enforces a
URL length limit of less than 90 characters, which led to failures in external
knowledge API requests for long URLs.

This PR addresses the issue by replacing `validators.url` with 
`urllib.parse.urlparse`, effectively removing the restrictive URL length check.

Additionally, the unused `validators` dependency has been removed.

Fixes #18981.

Signed-off-by: kenwoodjw <blackxin55+@gmail.com>
2025-04-29 22:32:38 +08:00
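A minimal sketch of the swap described here, assuming a hypothetical helper name (validate_endpoint_url is illustrative, not the actual Dify function):

```python
from urllib.parse import urlparse


def validate_endpoint_url(url: str) -> bool:
    # urlparse imposes no length limit, unlike validators.url in
    # validators==0.21.0; only an http(s) scheme and a host are required.
    parsed = urlparse(url)
    return parsed.scheme in ("http", "https") and bool(parsed.netloc)
```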
a9f7bcb12f fix: Chinese input deletes extra character in Safari within Workspaces (#18193) (#19088) 2025-04-29 18:47:35 +08:00
bd1bbfee4b Enhance Code Consistency Across Repository with .editorconfig (#19023) 2025-04-29 18:04:33 +08:00
226afd4550 Fix: the issue of getting empty environment variables. (#19085) 2025-04-29 18:01:11 +08:00
ce9737c6a0 fix: change provider should change the content (#19075) 2025-04-29 15:49:31 +08:00
23f6914b9b fix: image preview triggers binary download (#19070) 2025-04-29 15:38:33 +08:00
2a3cc43b62 fix: remove redundant Mermaid graph direction enforcement (#19024) 2025-04-29 15:10:38 +08:00
8266815cda feat: add AWS Managed IAM auth for OpenSearch vector DB (#18963) 2025-04-29 15:10:08 +08:00
8b4ea01810 feat: support access milvus with token (#19034) 2025-04-29 14:52:13 +08:00
83187b30c0 fix: fix rerank model runner usage (#19008) 2025-04-29 14:51:21 +08:00
3c09b57059 fix(web): fix the wrong components usage(#18995) (#19065) 2025-04-29 14:39:59 +08:00
a147d2a200 feat(api): use json_repair to fix invalid json while generating structured output (#18977)
When generating JSON schema using an LLM in the structured output feature,
models may occasionally return invalid JSON, which prevents clients from correctly 
parsing the response and can lead to UI breakage.

This commit addresses the issue by introducing `json_repair` to automatically 
fix invalid JSON strings returned by the LLM, ensuring smoother functionality 
and better client-side handling of structured outputs.


Co-authored-by: lizb <lizb@sugon.com>
2025-04-29 12:39:13 +08:00
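A small sketch of the repair step this commit introduces (parse_structured_output is an illustrative wrapper, not the actual Dify code path):

```python
import json

from json_repair import repair_json


def parse_structured_output(llm_text: str) -> dict:
    try:
        return json.loads(llm_text)
    except json.JSONDecodeError:
        # Fall back to json_repair when the model returns almost-valid JSON,
        # e.g. trailing commas or unquoted keys.
        return json.loads(repair_json(llm_text))
```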
28a59ba344 chore: improve diagram picture of docker compose (#19054) 2025-04-29 11:43:50 +08:00
94cc0b7a12 fix(workflow_cycle_manage): failed nodes were not updated in workflow_node_executions (#18994) 2025-04-29 10:31:08 +08:00
e36b1a7016 test(graph-engine-test): modify the assert condition (#19041) 2025-04-29 09:51:42 +08:00
bf46b894e9 chore: Improve FILES_URL Configuration Comments (#19040) 2025-04-29 09:05:04 +08:00
f0ca828544 fix: fix embedded chatbot styles on a relatively wide screen (#19030) 2025-04-29 08:58:10 +08:00
36521e4275 fixes: fix entrypoint script with missing environment variables (#19039) 2025-04-29 08:57:58 +08:00
a93a09e0f7 feat: clean up message_files table first before proceeding to find orphaned files (#19035) 2025-04-29 08:57:42 +08:00
a54773fbff refactor: switch to dynamic versioning in package configuration (#19019)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-04-28 19:51:50 +08:00
b8bb45b106 remove unstructured api key check (#18989) 2025-04-28 17:26:30 +08:00
ecade13455 add MAX_TASK_PRE_CHILD for celery (#18985) 2025-04-28 17:06:00 +08:00
49678e4b48 chore: Bump version to 1.3.1 (#18962)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-04-28 16:45:17 +08:00
ea150dc336 feat: Upload a folder to knowledgebase (#17026)
Co-authored-by: Silow9 <soulskytony@gmail.com>
2025-04-28 15:31:22 +08:00
5de01c1444 feat (document_extractor): support .properties file (#18969) 2025-04-28 15:28:11 +08:00
f86e2edc54 refactor(plugin/backwards_invocation/app): Remove unnecessary .value from StrEnum (#18896)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-04-28 14:50:59 +08:00
bf01e41fd9 feat: improve embedded chatbot styles (#18692) 2025-04-28 14:38:59 +08:00
edcfd7761b if as_attachment is in the url, add it to the sign_url (#18930) 2025-04-28 14:25:59 +08:00
b8daf944f1 fix: header padding (#18957) 2025-04-28 13:57:44 +08:00
315436e43b fix: classify remove always remove the last one (#18959) 2025-04-28 13:56:43 +08:00
2c2af1d117 feat: add VTT data transform to Document extractor (#18936) 2025-04-28 13:45:15 +08:00
03ac2d0f17 fix: i.find is not a function (#18945) 2025-04-28 11:09:54 +08:00
8299614e60 [Observability][Bugfix] Fix expected an instance of Token, got None error in OpenTelemetry (#18934) 2025-04-28 10:31:13 +08:00
7a62202392 fix: when cot_agent call tool like searxng lost some response content (#16781) 2025-04-28 09:27:46 +08:00
eb92dd59f9 chore: disabled struct output not show model not support warning (#18909) 2025-04-27 18:30:45 +08:00
d9aa2b155a refactor: Refactors repository imports structure (#18901)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-04-27 17:29:03 +08:00
e5bdc1438a fix: annotation update need use http put method and annotation-reply api doc parms wrong (#18891) 2025-04-27 16:13:36 +08:00
e3daef19e7 chatflow/workflow add field required (#18892) 2025-04-27 16:12:02 +08:00
0e0ec4691a feat: add interfaces of OAuth handler methods for authorization (#18889) 2025-04-27 16:00:37 +08:00
7ccec5cd95 refactor: remove external link for dataset description guidance (#18884) 2025-04-27 14:47:00 +08:00
7613d9dc33 [Observability] Convert exception logging into span in OpenTelemetry (#18821) 2025-04-27 14:39:47 +08:00
77ad600a33 fix: Text Generation App should not should not show embedded in website (#18880) 2025-04-27 14:33:50 +08:00
abafa68647 refactor: rename plugin manager to plugin client and rename path from manager to impl (#18876) 2025-04-27 14:22:25 +08:00
d91828dd90 chore: support other webapps embedded in iframe (#18877) 2025-04-27 14:21:27 +08:00
19f2a74ba8 fix: check dsl version when create app from explore template (#18872) 2025-04-27 14:00:45 +08:00
58a929edd5 fix: install plugins permissions (#18870) 2025-04-27 14:00:35 +08:00
bed47dffb9 fix: update notice for users for clear-orphaned-file-records and remove-orphaned-files-on-storage commands (#18864) 2025-04-27 13:01:02 +08:00
136995d2a1 fix: change delete app status code from 204 to 200 (#18398)
Co-authored-by: devxing <devxing@gmail.com>
Co-authored-by: crazywoola <427733928@qq.com>
2025-04-27 12:12:46 +08:00
c1559a7c8e fix: LLMResultChunk cause concatenate str and list exception (#18852) 2025-04-27 11:32:14 +08:00
993ef87dca feat: add administrative commands to free up storage space by removing unused files (#18835) 2025-04-27 11:11:04 +08:00
b62eb61400 fix depth param issue for WaterCrawl (#18839) 2025-04-27 11:04:56 +08:00
0a20210a59 feat: Add W&B Weave Tracing Integration (#14262)
Signed-off-by: Yuichiro Utsumi <utsumi.yuichiro@fujitsu.com>
Signed-off-by: -LAN- <laipz8200@outlook.com>
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
Signed-off-by: kenwoodjw <blackxin55+@gmail.com>
Signed-off-by: ChengZi <chen.zhang@zilliz.com>
Signed-off-by: cl <cailue@apache.org>
Co-authored-by: Yu Chun Chang <changyuchun159630@gmail.com>
Co-authored-by: Kyle Chang <kylechang@91app.com>
Co-authored-by: Lick-liu <51771897+Lick-liu@users.noreply.github.com>
Co-authored-by: crazywoola <427733928@qq.com>
Co-authored-by: Yuichiro Utsumi <81412151+utsumi-fj@users.noreply.github.com>
Co-authored-by: NFish <douxc512@gmail.com>
Co-authored-by: Yeuoly <45712896+Yeuoly@users.noreply.github.com>
Co-authored-by: Wu Tianwei <30284043+WTW0313@users.noreply.github.com>
Co-authored-by: DDDDD12138 <43703884+DDDDD12138@users.noreply.github.com>
Co-authored-by: Jyong <76649700+JohnJyong@users.noreply.github.com>
Co-authored-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: Novice <857526207@qq.com>
Co-authored-by: yihong <zouzou0208@gmail.com>
Co-authored-by: Kalo Chin <91766386+fdb02983rhy@users.noreply.github.com>
Co-authored-by: zxhlyh <jasonapring2015@outlook.com>
Co-authored-by: jiangbo721 <365065261@qq.com>
Co-authored-by: 刘江波 <jiangbo721@163.com>
Co-authored-by: Lam <scau_ljw@126.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: Mars <524574386@qq.com>
Co-authored-by: mars <linjx2@by-health.com>
Co-authored-by: Joe <79627742+ZhouhaoJiang@users.noreply.github.com>
Co-authored-by: Rafael Carvalho <r.carvalho@me.com>
Co-authored-by: Joel <iamjoel007@gmail.com>
Co-authored-by: 非法操作 <hjlarry@163.com>
Co-authored-by: kenwoodjw <blackxin55+@gmail.com>
Co-authored-by: codingjaguar <codingjaguar@gmail.com>
Co-authored-by: ChengZi <chen.zhang@zilliz.com>
Co-authored-by: Fei He <droxer.he@gmail.com>
Co-authored-by: Arcaner <52057416+lrhan321@users.noreply.github.com>
Co-authored-by: Xiyuan Chen <52963600+GareArc@users.noreply.github.com>
Co-authored-by: KVOJJJin <jzongcode@gmail.com>
Co-authored-by: XiaoBa <94062266+XiaoBa-Yu@users.noreply.github.com>
Co-authored-by: Xiaoba Yu <xb1823725853@gmail.com>
Co-authored-by: zhangyuhang <2827528315@qq.com>
Co-authored-by: yuhang2.zhang <yuhang2.zhang@ly.com>
Co-authored-by: 诗浓 <nyaashino@gmail.com>
Co-authored-by: RookieAgent <42060616+Sakura4036@users.noreply.github.com>
Co-authored-by: sho-takano-dev <shota.takano.dev@gmail.com>
Co-authored-by: 過世秋風 <1040926235@qq.com>
Co-authored-by: Yi Feng <66539215+bigyifeng@users.noreply.github.com>
Co-authored-by: QuantumGhost <obelisk.reg+git@gmail.com>
Co-authored-by: Yongtao Huang <99629139+hyongtao-db@users.noreply.github.com>
Co-authored-by: ShadowJobs <794878115@qq.com>
Co-authored-by: LinYing <linying@momenta.ai>
Co-authored-by: Benjamin <benjaminx@gmail.com>
Co-authored-by: LiuBodong <liubodong2010@126.com>
Co-authored-by: huangzhuo1949 <167434202+huangzhuo1949@users.noreply.github.com>
Co-authored-by: huangzhuo <huangzhuo1@xiaomi.com>
Co-authored-by: csurong <csurong1@gmail.com>
Co-authored-by: 傻笑zz <43721571+shaxiaozz@users.noreply.github.com>
Co-authored-by: L8ng <straydragonl@foxmail.com>
Co-authored-by: Bowen Liang <liangbowen@gf.com.cn>
Co-authored-by: Novice Lee <novicelee@NoviPro.local>
Co-authored-by: GuanMu <ballmanjq@gmail.com>
Co-authored-by: LittleFish-15 <58618983+LittleFish-15@users.noreply.github.com>
Co-authored-by: 诗浓 <844670992@qq.com>
Co-authored-by: luckylhb90 <luckylhb90@gmail.com>
Co-authored-by: hobo.l <hobo.l@binance.com>
Co-authored-by: Gen Sato <52241300+halogen22@users.noreply.github.com>
Co-authored-by: twwu <twwu@dify.ai>
Co-authored-by: StoneFancyX <53338920+StoneFancyX@users.noreply.github.com>
Co-authored-by: StoneFancyX <kindbin@qq.com>
Co-authored-by: Naoki KOBAYASHI <naotama@gmail.com>
Co-authored-by: kurokobo <kuro664@gmail.com>
Co-authored-by: cyflhn <cyflhn@163.com>
Co-authored-by: Yingchun Lai <laiyingchun@apache.org>
Co-authored-by: jimmyfen <757343258@qq.com>
Co-authored-by: Xuetao Song <xuetaomagicsong@gmail.com>
Co-authored-by: Panpan <wurui.dev@gmail.com>
Co-authored-by: wyy-holding <59436937+wyy-holding@users.noreply.github.com>
Co-authored-by: リイノ Lin <sorphwer@gmail.com>
Co-authored-by: Ning <accelerator314@gmail.com>
Co-authored-by: Linh Nguyen <55907715+batman0911@users.noreply.github.com>
Co-authored-by: Junjie.M <118170653@qq.com>
Co-authored-by: Ron <svcvit@gmail.com>
Co-authored-by: Novice <novice12185727@gmail.com>
Co-authored-by: NanoNova <kid1412621@gmail.com>
Co-authored-by: JaydenZhou <380774082@qq.com>
Co-authored-by: dotdotdot <823150982@qq.com>
Co-authored-by: Good Wood <slm_1990@126.com>
Co-authored-by: Ryosei Karaki <38310693+karamaru-alpha@users.noreply.github.com>
Co-authored-by: chenhuan0728 <54611342+chenhuan0728@users.noreply.github.com>
Co-authored-by: chenhuan <huan.chen0728@foxmail>
Co-authored-by: lenbo <islenbo@qq.com>
Co-authored-by: Jiang <65766008+AlwaysBluer@users.noreply.github.com>
Co-authored-by: jiangzhijie <jiangzhijie.jzj@alibaba-inc.com>
Co-authored-by: Yongtao Huang <yongtaoh2022@gmail.com>
Co-authored-by: zhangkun-21 <sephiroth0932@gmail.com>
Co-authored-by: hsiong <37357447+hsiong@users.noreply.github.com>
Co-authored-by: 李远军 <4842@9ji.com>
Co-authored-by: yourchanges <yourchanges@gmail.com>
Co-authored-by: David <guyuezhuying@126.com>
Co-authored-by: liuzhenghua <1090179900@qq.com>
Co-authored-by: taokuizu <taokuizu@qq.com>
Co-authored-by: Hanqing Zhao <sherry9277@gmail.com>
Co-authored-by: JimintheBox <gjwlals111@gmail.com>
Co-authored-by: wlleiiwang <1025164922@qq.com>
Co-authored-by: wlleiiwang <wlleiiwang@tencent.com>
Co-authored-by: Alex <32982705+AlexYuan997@users.noreply.github.com>
Co-authored-by: yuanlong <yuanlong@boco.com.cn>
Co-authored-by: wanttobeamaster <45583625+wanttobeamaster@users.noreply.github.com>
Co-authored-by: xiaozhiqing.xzq <xiaozhiqing.xzq@alibaba-inc.com>
Co-authored-by: Chenhe Gu <guchenhe@gmail.com>
Co-authored-by: tyounami <vkbo@qq.com>
Co-authored-by: bo.zhao <bo.zhao@iglooinsure.com>
Co-authored-by: ClSlaid <cailue@apache.org>
Co-authored-by: adru <106513264+adpanru@users.noreply.github.com>
Co-authored-by: horochx <32632779+horochx@users.noreply.github.com>
2025-04-26 04:28:30 -07:00
f6305858a5 fix(plugin_service): Add marketplace enabled check before plugin operations (#18806) 2025-04-26 08:02:53 +08:00
db7af52fcc Hotfix/create from template category (#18807) 2025-04-25 22:14:19 +08:00
09a5f8da1d feat(app_dsl_service): Refines version compatibility logic (#18798)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-04-25 18:42:39 +08:00
c104febf63 refactor: Apply DI to WorkflowNodeExecutionRepository. (#18794)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-04-25 18:05:36 +08:00
0e68e8b40a fix: embed chatbot can't drag when use mouse (#18781) 2025-04-25 17:26:16 +08:00
9a3ecc1ac8 fix: Allow advanced chat app to get workflow run detail (#18753) (#18758) 2025-04-25 16:48:38 +08:00
ec82534a1e optimize account status field hard coded (#18771)
Co-authored-by: crazywoola <427733928@qq.com>
2025-04-25 16:47:03 +08:00
9bcc8041e9 fix: #18744 The model order defined in position.yaml in the Model Plugin is not taking effect. (#18756) 2025-04-25 16:45:48 +08:00
a944542858 fix(web): add missing 'clsx' dependency for pagination component (#18769) 2025-04-25 16:43:44 +08:00
a5e6a0dc0c enable pan by fingers (#18775) 2025-04-25 16:36:54 +08:00
a575fbca94 fix: update api rate limit on Pricing page (#18755) 2025-04-25 14:37:04 +08:00
fc4e11d127 fix: wording in README.md (#18751) 2025-04-25 13:42:10 +08:00
9988a506cd fix: enable Milvus database configuration via .env(#18707) (#18714)
Co-authored-by: jiawei.wang <jml@0603>
Co-authored-by: crazywoola <427733928@qq.com>
2025-04-25 12:12:30 +08:00
12836f9db9 Resolves #18536 Retreive conversation variables (#18581) 2025-04-25 11:52:25 +08:00
2627e221f2 fix: buildin tool provider credentials_for_provider (#18725)
Co-authored-by: hobo.l <hobo.l@binance.com>
2025-04-25 10:08:16 +08:00
5e2b3b34e5 issue: #17056 : Add a reason field to the message_replace event (#17195)
Co-authored-by: 聂政 <niezheng@pjlab.org.cn>
2025-04-25 10:08:06 +08:00
37e2f73909 [Lindorm VDB] Add the QUERY_TIMEOUT parameter to force the search query to fail. (#18613)
Co-authored-by: jiangzhijie <jiangzhijie.jzj@alibaba-inc.com>
2025-04-25 09:42:58 +08:00
759584f8c5 fix: not render conversation var in prompt editor (#18728) 2025-04-25 09:06:07 +08:00
0babdffe3e feat: support vastbase vector database (#16308) 2025-04-24 18:04:57 +08:00
cd9e6609ad fix: project version to 1.3.0 in package.json and uv.lock (#18684) 2025-04-24 17:16:41 +08:00
9982445dad Added the missing path of the webpath prefix and the prefix basepath + of static resources to remove the bug of adding more basepath. (#18658)
Co-authored-by: qingguo <qingguo@lexin.com>
2025-04-24 17:14:26 +08:00
13f647feaa fix: remove chat variable in workflow mode (#18696) 2025-04-24 16:51:19 +08:00
7b00f35a0d fix: link address error in the embedding in websites first example (#18677) 2025-04-24 14:50:12 +08:00
f6b3724268 chore(docker): bump dify-plugin-daemon to 0.0.9 (#18672)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-04-24 14:08:37 +08:00
212521c92b fix: sometimes error message not display complete (#18663) 2025-04-24 11:58:44 +08:00
69d3853111 fix some browser autofill password when authorization plugin (#18661) 2025-04-24 11:55:42 +08:00
d242e4b95b fix agentflow error if first variable is num (#18660)
Co-authored-by: lizb <lizb@sugon.com>
2025-04-24 11:55:29 +08:00
2266001d19 Fix some prompt typos (#18645) 2025-04-24 10:36:45 +08:00
2f141aa483 Add jp translation (#18628) 2025-04-24 10:32:21 +08:00
1117 changed files with 20876 additions and 5298 deletions

View File

@ -34,4 +34,4 @@ if you see such error message when you open this project in codespaces:
![Alt text](troubleshooting.png)
a simple workaround is change `/signin` endpoint into another one, then login with GitHub account and close the tab, then change it back to `/signin` endpoint. Then all things will be fine.
The reason is `signin` endpoint is not allowed in codespaces, details can be found [here](https://github.com/orgs/community/discussions/5204)

View File

@ -2,7 +2,7 @@
// README at: https://github.com/devcontainers/templates/tree/main/src/anaconda
{
"name": "Python 3.12",
"build": {
"build": {
"context": "..",
"dockerfile": "Dockerfile"
},

View File

@ -1,3 +1,3 @@
This file copied into the container along with environment.yml* from the parent
folder. This file is included to prevents the Dockerfile COPY instruction from
failing if no environment.yml is found.

View File

@ -5,18 +5,35 @@ root = true
# Unix-style newlines with a newline ending every file
[*]
charset = utf-8
end_of_line = lf
insert_final_newline = true
trim_trailing_whitespace = true
[*.py]
indent_size = 4
indent_style = space
[*.{yml,yaml}]
indent_style = space
indent_size = 2
[*.toml]
indent_size = 4
indent_style = space
# Markdown and MDX are whitespace sensitive languages.
# Do not remove trailing spaces.
[*.{md,mdx}]
trim_trailing_whitespace = false
# Matches multiple files with brace expansion notation
# Set default charset
[*.{js,tsx}]
charset = utf-8
indent_style = space
indent_size = 2
# Matches the exact files either package.json or .travis.yml
[{package.json,.travis.yml}]
# Matches the exact files package.json
[package.json]
indent_style = space
indent_size = 2

.gitattributes
View File

@ -1,5 +1,5 @@
# Ensure that .sh scripts use LF as line separator, even if they are checked out
# to Windows(NTFS) file-system, by a user of Docker for Windows.
# These .sh scripts will be run from the Container after `docker compose up -d`.
# If they appear to be CRLF style, Dash from the Container will fail to execute
# them.

View File

@ -0,0 +1,22 @@
{
"Verbose": false,
"Debug": false,
"IgnoreDefaults": false,
"SpacesAfterTabs": false,
"NoColor": false,
"Exclude": [
"^web/public/vs/",
"^web/public/pdf.worker.min.mjs$",
"web/app/components/base/icons/src/vender/"
],
"AllowedContentTypes": [],
"PassedFiles": [],
"Disable": {
"EndOfLine": false,
"Indentation": false,
"IndentSize": true,
"InsertFinalNewline": false,
"TrimTrailingWhitespace": false,
"MaxLineLength": false
}
}

View File

@ -88,3 +88,6 @@ jobs:
- name: Run Workflow
run: uv run --project api bash dev/pytest/pytest_workflow.sh
- name: Run Tool
run: uv run --project api bash dev/pytest/pytest_tools.sh

View File

@ -9,6 +9,12 @@ concurrency:
group: style-${{ github.head_ref || github.run_id }}
cancel-in-progress: true
permissions:
checks: write
statuses: write
contents: read
jobs:
python-style:
name: Python Style
@ -43,8 +49,8 @@ jobs:
if: steps.changed-files.outputs.any_changed == 'true'
run: |
uv run --directory api ruff --version
uv run --directory api ruff check ./
uv run --directory api ruff format --check ./
uv run --directory api ruff check --diff ./
uv run --directory api ruff format --check --diff ./
- name: Dotenv check
if: steps.changed-files.outputs.any_changed == 'true'
@ -163,3 +169,14 @@ jobs:
VALIDATE_DOCKERFILE_HADOLINT: true
VALIDATE_XML: true
VALIDATE_YAML: true
- name: EditorConfig checks
uses: super-linter/super-linter/slim@v7
env:
DEFAULT_BRANCH: main
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
IGNORE_GENERATED_FILES: true
IGNORE_GITIGNORED_FILES: true
# EditorConfig validation
VALIDATE_EDITORCONFIG: true
EDITORCONFIG_FILE_NAME: editorconfig-checker.json

View File

@ -90,4 +90,4 @@ Recomendamos revisar este documento cuidadosamente antes de proceder con la conf
No dudes en contactarnos si encuentras algún problema durante el proceso de configuración.
## Obteniendo Ayuda
Si alguna vez te quedas atascado o tienes una pregunta urgente mientras contribuyes, simplemente envíanos tus consultas a través del issue relacionado de GitHub, o únete a nuestro [Discord](https://discord.gg/8Tpq4AcN9c) para una charla rápida.

View File

@ -90,4 +90,4 @@ Nous recommandons de revoir attentivement ce document avant de procéder à la c
N'hésitez pas à nous contacter si vous rencontrez des problèmes pendant le processus de configuration.
## Obtenir de l'aide
Si jamais vous êtes bloqué ou avez une question urgente en contribuant, envoyez-nous simplement vos questions via le problème GitHub concerné, ou rejoignez notre [Discord](https://discord.gg/8Tpq4AcN9c) pour une discussion rapide.

View File

@ -90,4 +90,4 @@ PR 설명에 기존 이슈를 연결하거나 새 이슈를 여는 것을 잊지
설정 과정에서 문제가 발생하면 언제든지 연락해 주세요.
## 도움 받기
기여하는 동안 막히거나 긴급한 질문이 있으면, 관련 GitHub 이슈를 통해 질문을 보내거나, 빠른 대화를 위해 우리의 [Discord](https://discord.gg/8Tpq4AcN9c)에 참여하세요.

View File

@ -90,4 +90,4 @@ Recomendamos revisar este documento cuidadosamente antes de prosseguir com a con
Sinta-se à vontade para entrar em contato se encontrar quaisquer problemas durante o processo de configuração.
## Obtendo Ajuda
Se você ficar preso ou tiver uma dúvida urgente enquanto contribui, simplesmente envie suas perguntas através do problema relacionado no GitHub, ou entre no nosso [Discord](https://discord.gg/8Tpq4AcN9c) para uma conversa rápida.

View File

@ -90,4 +90,4 @@ Kuruluma geçmeden önce bu belgeyi dikkatlice incelemenizi öneririz, çünkü
Kurulum süreci sırasında herhangi bir sorunla karşılaşırsanız bizimle iletişime geçmekten çekinmeyin.
## Yardım Almak
Katkıda bulunurken takılırsanız veya yanıcı bir sorunuz olursa, sorularınızı ilgili GitHub sorunu aracılığıyla bize gönderin veya hızlı bir sohbet için [Discord'umuza](https://discord.gg/8Tpq4AcN9c) katılın.

View File

@ -1,4 +1,4 @@
![cover-v5-optimized](https://github.com/langgenius/dify/assets/13230914/f9e19af5-61ba-4119-b926-d10c4c06ebab)
![cover-v5-optimized](./images/GitHub_README_if.png)
<p align="center">
📌 <a href="https://dify.ai/blog/introducing-dify-workflow-file-upload-a-demo-on-ai-podcast">Introducing Dify Workflow File Upload: Recreate Google NotebookLM Podcast</a>
@ -54,7 +54,7 @@
<a href="./README_BN.md"><img alt="README in বাংলা" src="https://img.shields.io/badge/বাংলা-d9d9d9"></a>
</p>
Dify is an open-source LLM app development platform. Its intuitive interface combines agentic AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production.
Dify is an open-source LLM app development platform. Its intuitive interface combines agentic AI workflow, RAG pipeline, agent capabilities, model management, observability features, and more, allowing you to quickly move from prototype to production.
## Quick start
@ -87,8 +87,6 @@ Please refer to our [FAQ](https://docs.dify.ai/getting-started/install-self-host
**1. Workflow**:
Build and test powerful AI workflows on a visual canvas, leveraging all the following features and beyond.
https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa
**2. Comprehensive model support**:
Seamless integration with hundreds of proprietary / open-source LLMs from dozens of inference providers and self-hosted solutions, covering GPT, Mistral, Llama3, and any OpenAI API-compatible models. A full list of supported model providers can be found [here](https://docs.dify.ai/getting-started/readme/model-providers).
@ -188,7 +186,7 @@ All of Dify's offerings come with corresponding APIs, so you could effortlessly
- **Dify for enterprise / organizations</br>**
We provide additional enterprise-centric features. [Log your questions for us through this chatbot](https://udify.app/chat/22L1zSxg6yW1cWQg) or [send us an email](mailto:business@dify.ai?subject=[GitHub]Business%20License%20Inquiry) to discuss enterprise needs. </br>
> For startups and small businesses using AWS, check out [Dify Premium on AWS Marketplace](https://aws.amazon.com/marketplace/pp/prodview-t22mebxzwjhu6) and deploy it to your own AWS VPC with one-click. It's an affordable AMI offering with the option to create apps with custom logo and branding.
> For startups and small businesses using AWS, check out [Dify Premium on AWS Marketplace](https://aws.amazon.com/marketplace/pp/prodview-t22mebxzwjhu6) and deploy it to your own AWS VPC with one click. It's an affordable AMI offering with the option to create apps with custom logo and branding.
## Staying ahead
@ -233,7 +231,7 @@ Deploy Dify to AWS with [CDK](https://aws.amazon.com/cdk/)
For those who'd like to contribute code, see our [Contribution Guide](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
At the same time, please consider supporting Dify by sharing it on social media and at events and conferences.
> We are looking for contributors to help with translating Dify to languages other than Mandarin or English. If you are interested in helping, please see the [i18n README](https://github.com/langgenius/dify/blob/main/web/i18n/README.md) for more information, and leave us a comment in the `global-users` channel of our [Discord Community Server](https://discord.gg/8Tpq4AcN9c).
> We are looking for contributors to help translate Dify into languages other than Mandarin or English. If you are interested in helping, please see the [i18n README](https://github.com/langgenius/dify/blob/main/web/i18n/README.md) for more information, and leave us a comment in the `global-users` channel of our [Discord Community Server](https://discord.gg/8Tpq4AcN9c).
## Community & contact

View File

@ -1,4 +1,4 @@
![cover-v5-optimized](https://github.com/langgenius/dify/assets/13230914/f9e19af5-61ba-4119-b926-d10c4c06ebab)
![cover-v5-optimized](./images/GitHub_README_if.png)
<p align="center">
<a href="https://cloud.dify.ai">Dify Cloud</a> ·
@ -54,8 +54,6 @@
**1. سير العمل**: قم ببناء واختبار سير عمل الذكاء الاصطناعي القوي على قماش بصري، مستفيدًا من جميع الميزات التالية وأكثر.
<https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa>
**2. الدعم الشامل للنماذج**: تكامل سلس مع مئات من LLMs الخاصة / مفتوحة المصدر من عشرات من موفري التحليل والحلول المستضافة ذاتيًا، مما يغطي GPT و Mistral و Llama3 وأي نماذج متوافقة مع واجهة OpenAI API. يمكن العثور على قائمة كاملة بمزودي النموذج المدعومين [هنا](https://docs.dify.ai/getting-started/readme/model-providers).
![providers-v5](https://github.com/langgenius/dify/assets/13230914/5a17bdbe-097a-4100-8363-40255b70f6e3)

View File

@ -1,4 +1,4 @@
![cover-v5-optimized](https://github.com/langgenius/dify/assets/13230914/f9e19af5-61ba-4119-b926-d10c4c06ebab)
![cover-v5-optimized](./images/GitHub_README_if.png)
<p align="center">
📌 <a href="https://dify.ai/blog/introducing-dify-workflow-file-upload-a-demo-on-ai-podcast">ডিফাই ওয়ার্কফ্লো ফাইল আপলোড পরিচিতি: গুগল নোটবুক-এলএম পডকাস্ট পুনর্নির্মাণ</a>
@ -84,8 +84,6 @@ docker compose up -d
**১. ওয়ার্কফ্লো**:
ভিজ্যুয়াল ক্যানভাসে AI ওয়ার্কফ্লো তৈরি এবং পরীক্ষা করুন, নিম্নলিখিত সব ফিচার এবং তার বাইরেও আরও অনেক কিছু ব্যবহার করে।
<https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa>
**২. মডেল সাপোর্ট**:
GPT, Mistral, Llama3, এবং যেকোনো OpenAI API-সামঞ্জস্যপূর্ণ মডেলসহ, কয়েক ডজন ইনফারেন্স প্রদানকারী এবং সেল্ফ-হোস্টেড সমাধান থেকে শুরু করে প্রোপ্রাইটরি/ওপেন-সোর্স LLM-এর সাথে সহজে ইন্টিগ্রেশন। সমর্থিত মডেল প্রদানকারীদের একটি সম্পূর্ণ তালিকা পাওয়া যাবে [এখানে](https://docs.dify.ai/getting-started/readme/model-providers)।

View File

@ -1,4 +1,4 @@
![cover-v5-optimized](https://github.com/langgenius/dify/assets/13230914/f9e19af5-61ba-4119-b926-d10c4c06ebab)
![cover-v5-optimized](./images/GitHub_README_if.png)
<div align="center">
<a href="https://cloud.dify.ai">Dify 云服务</a> ·
@ -61,11 +61,6 @@ Dify 是一个开源的 LLM 应用开发平台。其直观的界面结合了 AI
**1. 工作流**:
在画布上构建和测试功能强大的 AI 工作流程,利用以下所有功能以及更多功能。
https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa
**2. 全面的模型支持**:
与数百种专有/开源 LLMs 以及数十种推理提供商和自托管解决方案无缝集成,涵盖 GPT、Mistral、Llama3 以及任何与 OpenAI API 兼容的模型。完整的支持模型提供商列表可在[此处](https://docs.dify.ai/getting-started/readme/model-providers)找到。

View File

@ -1,4 +1,4 @@
![cover-v5-optimized](https://github.com/langgenius/dify/assets/13230914/f9e19af5-61ba-4119-b926-d10c4c06ebab)
![cover-v5-optimized](./images/GitHub_README_if.png)
<p align="center">
📌 <a href="https://dify.ai/blog/introducing-dify-workflow-file-upload-a-demo-on-ai-podcast">Einführung in Dify Workflow File Upload: Google NotebookLM Podcast nachbilden</a>
@ -83,11 +83,6 @@ Bitte beachten Sie unsere [FAQ](https://docs.dify.ai/getting-started/install-sel
**1. Workflow**:
Erstellen und testen Sie leistungsstarke KI-Workflows auf einer visuellen Oberfläche, wobei Sie alle der folgenden Funktionen und darüber hinaus nutzen können.
https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa
**2. Umfassende Modellunterstützung**:
Nahtlose Integration mit Hunderten von proprietären und Open-Source-LLMs von Dutzenden Inferenzanbietern und selbstgehosteten Lösungen, die GPT, Mistral, Llama3 und alle mit der OpenAI API kompatiblen Modelle abdecken. Eine vollständige Liste der unterstützten Modellanbieter finden Sie [hier](https://docs.dify.ai/getting-started/readme/model-providers).

View File

@ -1,4 +1,4 @@
![cover-v5-optimized](https://github.com/langgenius/dify/assets/13230914/f9e19af5-61ba-4119-b926-d10c4c06ebab)
![cover-v5-optimized](./images/GitHub_README_if.png)
<p align="center">
<a href="https://cloud.dify.ai">Dify Cloud</a> ·
@ -59,11 +59,6 @@ Dify es una plataforma de desarrollo de aplicaciones de LLM de código abierto.
**1. Flujo de trabajo**:
Construye y prueba potentes flujos de trabajo de IA en un lienzo visual, aprovechando todas las siguientes características y más.
https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa
**2. Soporte de modelos completo**:
Integración perfecta con cientos de LLMs propietarios / de código abierto de docenas de proveedores de inferencia y soluciones auto-alojadas, que cubren GPT, Mistral, Llama3 y cualquier modelo compatible con la API de OpenAI. Se puede encontrar una lista completa de proveedores de modelos admitidos [aquí](https://docs.dify.ai/getting-started/readme/model-providers).

View File

@ -1,4 +1,4 @@
![cover-v5-optimized](https://github.com/langgenius/dify/assets/13230914/f9e19af5-61ba-4119-b926-d10c4c06ebab)
![cover-v5-optimized](./images/GitHub_README_if.png)
<p align="center">
<a href="https://cloud.dify.ai">Dify Cloud</a> ·
@ -59,11 +59,6 @@ Dify est une plateforme de développement d'applications LLM open source. Son in
**1. Flux de travail** :
Construisez et testez des flux de travail d'IA puissants sur un canevas visuel, en utilisant toutes les fonctionnalités suivantes et plus encore.
https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa
**2. Prise en charge complète des modèles** :
Intégration transparente avec des centaines de LLM propriétaires / open source provenant de dizaines de fournisseurs d'inférence et de solutions auto-hébergées, couvrant GPT, Mistral, Llama3, et tous les modèles compatibles avec l'API OpenAI. Une liste complète des fournisseurs de modèles pris en charge se trouve [ici](https://docs.dify.ai/getting-started/readme/model-providers).

View File

@ -1,4 +1,4 @@
![cover-v5-optimized](https://github.com/langgenius/dify/assets/13230914/f9e19af5-61ba-4119-b926-d10c4c06ebab)
![cover-v5-optimized](./images/GitHub_README_if.png)
<p align="center">
<a href="https://cloud.dify.ai">Dify Cloud</a> ·
@ -60,11 +60,6 @@ DifyはオープンソースのLLMアプリケーション開発プラットフ
**1. ワークフロー**:
強力なAIワークフローをビジュアルキャンバス上で構築し、テストできます。すべての機能、および以下の機能を使用できます。
https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa
**2. 総合的なモデルサポート**:
数百ものプロプライエタリ/オープンソースのLLMと、数十もの推論プロバイダーおよびセルフホスティングソリューションとのシームレスな統合を提供します。GPT、Mistral、Llama3、OpenAI APIと互換性のあるすべてのモデルを統合されています。サポートされているモデルプロバイダーの完全なリストは[こちら](https://docs.dify.ai/getting-started/readme/model-providers)をご覧ください。

View File

@ -1,4 +1,4 @@
![cover-v5-optimized](https://github.com/langgenius/dify/assets/13230914/f9e19af5-61ba-4119-b926-d10c4c06ebab)
![cover-v5-optimized](./images/GitHub_README_if.png)
<p align="center">
<a href="https://cloud.dify.ai">Dify Cloud</a> ·
@ -59,11 +59,6 @@ Dify is an open-source LLM app development platform. Its intuitive interface com
**1. Workflow**:
Build and test powerful AI workflows on a visual canvas, leveraging all the following features and beyond.
https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa
**2. Comprehensive model support**:
Seamless integration with hundreds of proprietary / open-source LLMs from dozens of inference providers and self-hosted solutions, covering GPT, Mistral, Llama3, and any OpenAI API-compatible models. A full list of supported model providers can be found [here](https://docs.dify.ai/getting-started/readme/model-providers).

View File

@ -1,4 +1,4 @@
![cover-v5-optimized](https://github.com/langgenius/dify/assets/13230914/f9e19af5-61ba-4119-b926-d10c4c06ebab)
![cover-v5-optimized](./images/GitHub_README_if.png)
<p align="center">
<a href="https://cloud.dify.ai">Dify 클라우드</a> ·
@ -54,11 +54,6 @@
**1. 워크플로우**:
다음 기능들을 비롯한 다양한 기능을 활용하여 시각적 캔버스에서 강력한 AI 워크플로우를 구축하고 테스트하세요.
https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa
**2. 포괄적인 모델 지원:**:
수십 개의 추론 제공업체와 자체 호스팅 솔루션에서 제공하는 수백 개의 독점 및 오픈 소스 LLM과 원활하게 통합되며, GPT, Mistral, Llama3 및 모든 OpenAI API 호환 모델을 포함합니다. 지원되는 모델 제공업체의 전체 목록은 [여기](https://docs.dify.ai/getting-started/readme/model-providers)에서 확인할 수 있습니다.

View File

@ -1,5 +1,4 @@
![cover-v5-optimized](https://github.com/langgenius/dify/assets/13230914/f9e19af5-61ba-4119-b926-d10c4c06ebab)
![cover-v5-optimized](./images/GitHub_README_if.png)
<p align="center">
📌 <a href="https://dify.ai/blog/introducing-dify-workflow-file-upload-a-demo-on-ai-podcast">Introduzindo o Dify Workflow com Upload de Arquivo: Recrie o Podcast Google NotebookLM</a>
</p>
@ -59,11 +58,6 @@ Dify é uma plataforma de desenvolvimento de aplicativos LLM de código aberto.
**1. Workflow**:
Construa e teste workflows poderosos de IA em uma interface visual, aproveitando todos os recursos a seguir e muito mais.
https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa
**2. Suporte abrangente a modelos**:
Integração perfeita com centenas de LLMs proprietários e de código aberto de diversas provedoras e soluções auto-hospedadas, abrangendo GPT, Mistral, Llama3 e qualquer modelo compatível com a API da OpenAI. A lista completa de provedores suportados pode ser encontrada [aqui](https://docs.dify.ai/getting-started/readme/model-providers).

View File

@ -1,259 +1,254 @@
![cover-v5-optimized](https://github.com/langgenius/dify/assets/13230914/f9e19af5-61ba-4119-b926-d10c4c06ebab)
<p align="center">
📌 <a href="https://dify.ai/blog/introducing-dify-workflow-file-upload-a-demo-on-ai-podcast">Predstavljamo nalaganje datotek Dify Workflow: znova ustvarite Google NotebookLM Podcast</a>
</p>
<p align="center">
<a href="https://cloud.dify.ai">Dify Cloud</a> ·
<a href="https://docs.dify.ai/getting-started/install-self-hosted">Samostojno gostovanje</a> ·
<a href="https://docs.dify.ai">Dokumentacija</a> ·
<a href="https://dify.ai/pricing">Pregled ponudb izdelkov Dify</a>
</p>
<p align="center">
<a href="https://dify.ai" target="_blank">
<img alt="Static Badge" src="https://img.shields.io/badge/Product-F04438"></a>
<a href="https://dify.ai/pricing" target="_blank">
<img alt="Static Badge" src="https://img.shields.io/badge/free-pricing?logo=free&color=%20%23155EEF&label=pricing&labelColor=%20%23528bff"></a>
<a href="https://discord.gg/FngNHpbcY7" target="_blank">
<img src="https://img.shields.io/discord/1082486657678311454?logo=discord&labelColor=%20%235462eb&logoColor=%20%23f5f5f5&color=%20%235462eb"
alt="chat on Discord"></a>
<a href="https://twitter.com/intent/follow?screen_name=dify_ai" target="_blank">
<img src="https://img.shields.io/twitter/follow/dify_ai?logo=X&color=%20%23f5f5f5"
alt="follow on X(Twitter)"></a>
<a href="https://www.linkedin.com/company/langgenius/" target="_blank">
<img src="https://custom-icon-badges.demolab.com/badge/LinkedIn-0A66C2?logo=linkedin-white&logoColor=fff"
alt="follow on LinkedIn"></a>
<a href="https://hub.docker.com/u/langgenius" target="_blank">
<img alt="Docker Pulls" src="https://img.shields.io/docker/pulls/langgenius/dify-web?labelColor=%20%23FDB062&color=%20%23f79009"></a>
<a href="https://github.com/langgenius/dify/graphs/commit-activity" target="_blank">
<img alt="Commits last month" src="https://img.shields.io/github/commit-activity/m/langgenius/dify?labelColor=%20%2332b583&color=%20%2312b76a"></a>
<a href="https://github.com/langgenius/dify/" target="_blank">
<img alt="Issues closed" src="https://img.shields.io/github/issues-search?query=repo%3Alanggenius%2Fdify%20is%3Aclosed&label=issues%20closed&labelColor=%20%237d89b0&color=%20%235d6b98"></a>
<a href="https://github.com/langgenius/dify/discussions/" target="_blank">
<img alt="Discussion posts" src="https://img.shields.io/github/discussions/langgenius/dify?labelColor=%20%239b8afb&color=%20%237a5af8"></a>
</p>
<p align="center">
<a href="./README.md"><img alt="README in English" src="https://img.shields.io/badge/English-d9d9d9"></a>
<a href="./README_CN.md"><img alt="简体中文版自述文件" src="https://img.shields.io/badge/简体中文-d9d9d9"></a>
<a href="./README_JA.md"><img alt="日本語のREADME" src="https://img.shields.io/badge/日本語-d9d9d9"></a>
<a href="./README_ES.md"><img alt="README en Español" src="https://img.shields.io/badge/Español-d9d9d9"></a>
<a href="./README_FR.md"><img alt="README en Français" src="https://img.shields.io/badge/Français-d9d9d9"></a>
<a href="./README_KL.md"><img alt="README tlhIngan Hol" src="https://img.shields.io/badge/Klingon-d9d9d9"></a>
<a href="./README_KR.md"><img alt="README in Korean" src="https://img.shields.io/badge/한국어-d9d9d9"></a>
<a href="./README_AR.md"><img alt="README بالعربية" src="https://img.shields.io/badge/العربية-d9d9d9"></a>
<a href="./README_TR.md"><img alt="Türkçe README" src="https://img.shields.io/badge/Türkçe-d9d9d9"></a>
<a href="./README_VI.md"><img alt="README Tiếng Việt" src="https://img.shields.io/badge/Ti%E1%BA%BFng%20Vi%E1%BB%87t-d9d9d9"></a>
<a href="./README_SI.md"><img alt="README Slovenščina" src="https://img.shields.io/badge/Sloven%C5%A1%C4%8Dina-d9d9d9"></a>
<a href="./README_BN.md"><img alt="README in বাংলা" src="https://img.shields.io/badge/বাংলা-d9d9d9"></a>
</p>
Dify is an open-source platform for developing LLM applications. Its intuitive interface combines agentic AI workflows, RAG pipelines, agent capabilities, model management, observability features, and more, letting you quickly move from prototype to production.
## Quick start
> Before installing Dify, make sure your machine meets the following minimum system requirements:
>
>- CPU >= 2 Core
>- RAM >= 4 GiB
</br>
The easiest way to start the Dify server is through Docker Compose. Before running Dify with the following commands, make sure that Docker and Docker Compose are installed on your machine:
```bash
cd dify
cd docker
cp .env.example .env
docker compose up -d
```
After running, you can access the Dify dashboard in your browser at [http://localhost/install](http://localhost/install) and start the initialization process.
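If the dashboard does not come up, a quick way to check the stack is to inspect the Compose services; this is a hedged sketch, and the `api` service name is assumed from the default docker-compose.yaml and may differ in customized setups:
```bash
# List all Dify containers and their health status
docker compose ps
# Tail the API logs if something looks wrong (service name assumed from the default compose file)
docker compose logs -f api
```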
#### Seeking help
Please refer to our [FAQ](https://docs.dify.ai/getting-started/install-self-hosted/faqs) if you encounter problems setting up Dify. If you are still having issues, please reach out to [the community and us](#community--contact).
> If you'd like to contribute to Dify or do additional development, refer to our guide to [deploying from source code](https://docs.dify.ai/getting-started/install-self-hosted/local-source-code)
## Key features
**1. Workflow**:
Build and test powerful AI workflows on a visual canvas, leveraging all of the following features and more.
https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa
**2. Comprehensive model support**:
Seamless integration with hundreds of proprietary / open-source LLMs from dozens of inference providers and self-hosted solutions, covering GPT, Mistral, Llama3, and any OpenAI API-compatible models. A full list of supported model providers can be found [here](https://docs.dify.ai/getting-started/readme/model-providers).
![providers-v5](https://github.com/langgenius/dify/assets/13230914/5a17bdbe-097a-4100-8363-40255b70f6e3)
**3. Prompt IDE**:
Intuitive interface for crafting prompts, comparing model performance, and adding additional features such as text-to-speech to a chat-based app.
**4. RAG Pipeline**:
Extensive RAG capabilities covering everything from document ingestion to retrieval, with support for text extraction from PDF, PPT, and other common document formats.
**5. Agent capabilities**:
You can define agents based on LLM Function Calling or ReAct, and add pre-built or custom tools for the agent. Dify provides 50+ built-in tools for AI agents, such as Google Search, DALL·E, Stable Diffusion and WolframAlpha.
**6. LLMOps**:
Monitor and analyze application logs and performance over time. You can continuously improve prompts, datasets, and models based on production data and annotations.
**7. Backend-as-a-Service**:
All of Dify's offerings come with corresponding APIs, so you can effortlessly integrate Dify into your own business logic.
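As a minimal sketch, calling a self-hosted instance's app API from the command line might look like the following; the endpoint path and payload shape are assumptions based on the standard Dify Service API, and the `DIFY_API_KEY` variable and `http://localhost` base URL are placeholders for your own setup:
```bash
# Hypothetical example: send one chat message to a Dify app through the Service API.
curl -X POST 'http://localhost/v1/chat-messages' \
  -H "Authorization: Bearer $DIFY_API_KEY" \
  -H 'Content-Type: application/json' \
  -d '{
    "inputs": {},
    "query": "What can you do?",
    "response_mode": "blocking",
    "user": "example-user"
  }'
```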
## Feature Comparison
<table style="width: 100%;">
<tr>
<th align="center">Funkcija</th>
<th align="center">Dify.AI</th>
<th align="center">LangChain</th>
<th align="center">Flowise</th>
<th align="center">OpenAI Assistants API</th>
</tr>
<tr>
<td align="center">Programski pristop</td>
<td align="center">API + usmerjeno v aplikacije</td>
<td align="center">Python koda</td>
<td align="center">Usmerjeno v aplikacije</td>
<td align="center">Usmerjeno v API</td>
</tr>
<tr>
<td align="center">Podprti LLM-ji</td>
<td align="center">Bogata izbira</td>
<td align="center">Bogata izbira</td>
<td align="center">Bogata izbira</td>
<td align="center">Samo OpenAI</td>
</tr>
<tr>
<td align="center">RAG pogon</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Agent</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Potek dela</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Spremljanje</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Funkcija za podjetja (SSO/nadzor dostopa)</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Lokalna namestitev</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
</table>
## Using Dify
- **Cloud </br>**
We host a Dify Cloud service for anyone to try with zero setup. It provides all the capabilities of the self-deployed version, and includes 200 free GPT-4 calls in the sandbox plan.
- **Self-hosting Dify Community Edition</br>**
Quickly get Dify running in your environment with this [starter guide](#quick-start). Use our [documentation](https://docs.dify.ai) for further references and more in-depth instructions.
- **Dify for enterprises / organizations</br>**
We provide additional enterprise-centric features. Log your questions for us through this chatbot or send us an email to discuss enterprise needs. </br>
> For startups and small businesses using AWS, check out Dify Premium on AWS Marketplace and deploy it to your own AWS VPC with one click. It's an affordable AMI offering with the option to create apps with custom logo and branding.
## Staying ahead
Star Dify on GitHub and be instantly notified of new releases.
![star-us](https://github.com/langgenius/dify/assets/13230914/b823edc1-6388-4e25-ad45-2f6b187adbb4)
## Advanced setup
If you need to customize the configuration, please refer to the comments in our .env.example file and update the corresponding values in your .env file. Additionally, you may need to adjust the docker-compose.yaml file itself, for example to change image versions, port mappings, or volume mounts, based on your specific deployment environment and requirements. After making any changes, please re-run docker compose up -d. You can find the full list of available environment variables here.
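For example, a minimal sketch of overriding a single value and restarting; `EXPOSE_NGINX_PORT` is assumed to be one of the variables documented in .env.example, so verify the name against your copy, and note the `sed -i` form below is GNU sed syntax:
```bash
cd docker
# Override the published HTTP port as an illustrative change (variable name assumed from .env.example)
sed -i 's/^EXPOSE_NGINX_PORT=.*/EXPOSE_NGINX_PORT=8080/' .env
# Re-create the containers so the new values take effect
docker compose down
docker compose up -d
```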
If you'd like to configure a highly-available setup, there are community-contributed Helm Charts and YAML files that allow Dify to be deployed on Kubernetes.
- [Helm Chart by @LeoQuote](https://github.com/douban/charts/tree/master/charts/dify)
- [Helm Chart by @BorisPolonsky](https://github.com/BorisPolonsky/dify-helm)
- [YAML file by @Winson-030](https://github.com/Winson-030/dify-kubernetes)
- [YAML file by @wyy-holding](https://github.com/wyy-holding/dify-k8s)
#### Using Terraform for Deployment
Deploy Dify to a cloud platform with a single click using [terraform](https://www.terraform.io/); a generic workflow sketch follows the provider links below.
##### Azure Global
- [Azure Terraform by @nikawang](https://github.com/nikawang/dify-azure-terraform)
##### Google Cloud
- [Google Cloud Terraform by @sotazum](https://github.com/DeNA/dify-google-cloud-terraform)
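A hedged sketch of the standard Terraform workflow, using the Azure module linked above as an example; the required input variables depend on that module and must be set as described in its own README:
```bash
# Standard Terraform flow; the repository URL comes from the list above
git clone https://github.com/nikawang/dify-azure-terraform
cd dify-azure-terraform
terraform init    # download providers and modules
terraform plan    # review the resources that would be created
terraform apply   # provision Dify on Azure once the module's variables are configured
```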
#### Using AWS CDK for Deployment
Deploy Dify to AWS using [CDK](https://aws.amazon.com/cdk/)
##### AWS
- [AWS CDK by @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
## Contributing
For those who'd like to contribute code, see our Contribution Guide. At the same time, please consider supporting Dify by sharing it on social media and at events and conferences.
> We are looking for contributors to help translate Dify into languages other than Mandarin or English. If you are interested in helping, please see the i18n README for more information, and leave us a comment in the global-users channel of our Discord community server.
## Community & contact
* [Github Discussion](https://github.com/langgenius/dify/discussions). Best for: sharing feedback and asking questions.
* [GitHub Issues](https://github.com/langgenius/dify/issues). Best for: bugs you encounter using Dify.AI, and feature proposals. See our [Contribution Guide](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
* [Discord](https://discord.gg/FngNHpbcY7). Best for: sharing your applications and hanging out with the community.
* [X(Twitter)](https://twitter.com/dify_ai). Best for: sharing your applications and hanging out with the community.
**Contributors**
<a href="https://github.com/langgenius/dify/graphs/contributors">
<img src="https://contrib.rocks/image?repo=langgenius/dify" />
</a>
## Star history
[![Star History Chart](https://api.star-history.com/svg?repos=langgenius/dify&type=Date)](https://star-history.com/#langgenius/dify&Date)
## Security disclosure
To protect your privacy, please avoid posting security issues on GitHub. Instead, send your questions to security@dify.ai and we will provide you with a more detailed answer.
## License
This repository is available under the [Dify Open Source License](LICENSE), which is essentially Apache 2.0 with a few additional restrictions.

View File

@ -1,4 +1,4 @@
![cover-v5-optimized](https://github.com/langgenius/dify/assets/13230914/f9e19af5-61ba-4119-b926-d10c4c06ebab)
![cover-v5-optimized](./images/GitHub_README_if.png)
<p align="center">
<a href="https://cloud.dify.ai">Dify Bulut</a> ·
@ -55,11 +55,6 @@ Dify, açık kaynaklı bir LLM uygulama geliştirme platformudur. Sezgisel aray
**1. Workflow**:
Görsel bir arayüz üzerinde güçlü AI iş akışları oluşturun ve test edin, aşağıdaki tüm özellikleri ve daha fazlasını kullanarak.
https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa
**2. Kapsamlı model desteği**:
Çok sayıda çıkarım sağlayıcısı ve kendi kendine barındırılan çözümlerden yüzlerce özel / açık kaynaklı LLM ile sorunsuz entegrasyon sağlar. GPT, Mistral, Llama3 ve OpenAI API uyumlu tüm modelleri kapsar. Desteklenen model sağlayıcılarının tam listesine [buradan](https://docs.dify.ai/getting-started/readme/model-providers) ulaşabilirsiniz.

View File

@ -1,4 +1,4 @@
![cover-v5-optimized](https://github.com/langgenius/dify/assets/13230914/f9e19af5-61ba-4119-b926-d10c4c06ebab)
![cover-v5-optimized](./images/GitHub_README_if.png)
<p align="center">
📌 <a href="https://dify.ai/blog/introducing-dify-workflow-file-upload-a-demo-on-ai-podcast">Introducing Dify Workflow File Upload: Recreate Google NotebookLM Podcast</a>
@ -86,8 +86,6 @@ docker compose up -d
**1. Workflow**
Build and test powerful AI workflows on a visual canvas, leveraging all of the following features and more.
https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa
**2. Comprehensive model support**
Seamless integration with hundreds of proprietary / open-source LLMs from dozens of inference providers and self-hosted solutions, covering GPT, Mistral, Llama3, and any OpenAI API-compatible model. The full list of supported model providers can be found [here](https://docs.dify.ai/getting-started/readme/model-providers).

View File

@ -1,4 +1,4 @@
![cover-v5-optimized](https://github.com/langgenius/dify/assets/13230914/f9e19af5-61ba-4119-b926-d10c4c06ebab)
![cover-v5-optimized](./images/GitHub_README_if.png)
<p align="center">
<a href="https://cloud.dify.ai">Dify Cloud</a> ·
@ -55,11 +55,6 @@ Dify is an open-source LLM app development platform. Its intuitive interf
**1. Workflow**:
Build and test powerful AI workflows on a visual canvas, leveraging all of the following features and more.
https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa
**2. Comprehensive model support**:
Seamless integration with hundreds of proprietary / open-source LLMs from dozens of inference providers and self-hosted solutions, covering GPT, Mistral, Llama3, and any OpenAI API-compatible model. The full list of supported model providers can be found [here](https://docs.dify.ai/getting-started/readme/model-providers).

View File

@ -16,4 +16,4 @@ logs
.ruff_cache
# venv
.venv
.venv

View File

@ -297,6 +297,7 @@ LINDORM_URL=http://ld-*******************-proxy-search-pub.lindorm.aliyuncs.com:
LINDORM_USERNAME=admin
LINDORM_PASSWORD=admin
USING_UGC_INDEX=False
LINDORM_QUERY_TIMEOUT=1
# OceanBase Vector configuration
OCEANBASE_VECTOR_HOST=127.0.0.1
@ -475,6 +476,7 @@ LOGIN_LOCKOUT_DURATION=86400
ENABLE_OTEL=false
OTLP_BASE_ENDPOINT=http://localhost:4318
OTLP_API_KEY=
OTEL_EXPORTER_OTLP_PROTOCOL=
OTEL_EXPORTER_TYPE=otlp
OTEL_SAMPLING_RATE=0.1
OTEL_BATCH_EXPORT_SCHEDULE_DELAY=5000

View File

@ -90,3 +90,4 @@
```bash
uv run -P api bash dev/pytest/pytest_all_tests.sh
```

View File

@ -18,7 +18,7 @@ else:
# so we need to disable gevent in debug mode.
# If you are using debugpy and set GEVENT_SUPPORT=True, you can debug with gevent.
if (flask_debug := os.environ.get("FLASK_DEBUG", "0")) and flask_debug.lower() in {"false", "0", "no"}:
from gevent import monkey # type: ignore
from gevent import monkey
# gevent
monkey.patch_all()

View File

@ -52,10 +52,8 @@ def initialize_extensions(app: DifyApp):
ext_mail,
ext_migrate,
ext_otel,
ext_otel_patch,
ext_proxy_fix,
ext_redis,
ext_repositories,
ext_sentry,
ext_set_secretkey,
ext_storage,
@ -76,7 +74,6 @@ def initialize_extensions(app: DifyApp):
ext_migrate,
ext_redis,
ext_storage,
ext_repositories,
ext_celery,
ext_login,
ext_mail,
@ -85,7 +82,6 @@ def initialize_extensions(app: DifyApp):
ext_proxy_fix,
ext_blueprints,
ext_commands,
ext_otel_patch, # Apply patch before initializing OpenTelemetry
ext_otel,
]
for ext in extensions:

View File

@ -6,6 +6,7 @@ from typing import Optional
import click
from flask import current_app
from sqlalchemy import select
from werkzeug.exceptions import NotFound
from configs import dify_config
@ -17,6 +18,7 @@ from core.rag.models.document import Document
from events.app_event import app_was_created
from extensions.ext_database import db
from extensions.ext_redis import redis_client
from extensions.ext_storage import storage
from libs.helper import email as email_validate
from libs.password import hash_password, password_pattern, valid_password
from libs.rsa import generate_key_pair
@ -271,6 +273,7 @@ def migrate_knowledge_vector_database():
upper_collection_vector_types = {
VectorType.MILVUS,
VectorType.PGVECTOR,
VectorType.VASTBASE,
VectorType.RELYT,
VectorType.WEAVIATE,
VectorType.ORACLE,
@ -295,11 +298,11 @@ def migrate_knowledge_vector_database():
page = 1
while True:
try:
datasets = (
Dataset.query.filter(Dataset.indexing_technique == "high_quality")
.order_by(Dataset.created_at.desc())
.paginate(page=page, per_page=50)
stmt = (
select(Dataset).filter(Dataset.indexing_technique == "high_quality").order_by(Dataset.created_at.desc())
)
datasets = db.paginate(select=stmt, page=page, per_page=50, max_per_page=50, error_out=False)
except NotFound:
break
@ -442,13 +445,13 @@ def convert_to_agent_apps():
WHERE a.mode = 'chat'
AND am.agent_mode is not null
AND (
am.agent_mode like '%"strategy": "function_call"%'
am.agent_mode like '%"strategy": "function_call"%'
OR am.agent_mode like '%"strategy": "react"%'
)
)
AND (
am.agent_mode like '{"enabled": true%'
am.agent_mode like '{"enabled": true%'
OR am.agent_mode like '{"max_iteration": %'
) ORDER BY a.created_at DESC LIMIT 1000
) ORDER BY a.created_at DESC LIMIT 1000
"""
with db.engine.begin() as conn:
@ -549,11 +552,12 @@ def old_metadata_migration():
page = 1
while True:
try:
documents = (
DatasetDocument.query.filter(DatasetDocument.doc_metadata is not None)
stmt = (
select(DatasetDocument)
.filter(DatasetDocument.doc_metadata.is_not(None))
.order_by(DatasetDocument.created_at.desc())
.paginate(page=page, per_page=50)
)
documents = db.paginate(select=stmt, page=page, per_page=50, max_per_page=50, error_out=False)
except NotFound:
break
if not documents:
@ -590,11 +594,15 @@ def old_metadata_migration():
)
db.session.add(dataset_metadata_binding)
else:
dataset_metadata_binding = DatasetMetadataBinding.query.filter(
DatasetMetadataBinding.dataset_id == document.dataset_id,
DatasetMetadataBinding.document_id == document.id,
DatasetMetadataBinding.metadata_id == dataset_metadata.id,
).first()
dataset_metadata_binding = (
db.session.query(DatasetMetadataBinding) # type: ignore
.filter(
DatasetMetadataBinding.dataset_id == document.dataset_id,
DatasetMetadataBinding.document_id == document.id,
DatasetMetadataBinding.metadata_id == dataset_metadata.id,
)
.first()
)
if not dataset_metadata_binding:
dataset_metadata_binding = DatasetMetadataBinding(
tenant_id=document.tenant_id,
@ -666,7 +674,7 @@ def upgrade_db():
click.echo(click.style("Starting database migration.", fg="green"))
# run db migration
import flask_migrate # type: ignore
import flask_migrate
flask_migrate.upgrade()
@ -814,3 +822,331 @@ def clear_free_plan_tenant_expired_logs(days: int, batch: int, tenant_ids: list[
ClearFreePlanTenantExpiredLogs.process(days, batch, tenant_ids)
click.echo(click.style("Clear free plan tenant expired logs completed.", fg="green"))
@click.option("-f", "--force", is_flag=True, help="Skip user confirmation and force the command to execute.")
@click.command("clear-orphaned-file-records", help="Clear orphaned file records.")
def clear_orphaned_file_records(force: bool):
"""
Clear orphaned file records in the database.
"""
# define tables and columns to process
files_tables = [
{"table": "upload_files", "id_column": "id", "key_column": "key"},
{"table": "tool_files", "id_column": "id", "key_column": "file_key"},
]
ids_tables = [
{"type": "uuid", "table": "message_files", "column": "upload_file_id"},
{"type": "text", "table": "documents", "column": "data_source_info"},
{"type": "text", "table": "document_segments", "column": "content"},
{"type": "text", "table": "messages", "column": "answer"},
{"type": "text", "table": "workflow_node_executions", "column": "inputs"},
{"type": "text", "table": "workflow_node_executions", "column": "process_data"},
{"type": "text", "table": "workflow_node_executions", "column": "outputs"},
{"type": "text", "table": "conversations", "column": "introduction"},
{"type": "text", "table": "conversations", "column": "system_instruction"},
{"type": "json", "table": "messages", "column": "inputs"},
{"type": "json", "table": "messages", "column": "message"},
]
# notify user and ask for confirmation
click.echo(
click.style(
"This command will first find and delete orphaned file records from the message_files table,", fg="yellow"
)
)
click.echo(
click.style(
"and then it will find and delete orphaned file records in the following tables:",
fg="yellow",
)
)
for files_table in files_tables:
click.echo(click.style(f"- {files_table['table']}", fg="yellow"))
click.echo(
click.style("The following tables and columns will be scanned to find orphaned file records:", fg="yellow")
)
for ids_table in ids_tables:
click.echo(click.style(f"- {ids_table['table']} ({ids_table['column']})", fg="yellow"))
click.echo("")
click.echo(click.style("!!! USE WITH CAUTION !!!", fg="red"))
click.echo(
click.style(
(
"Since not all patterns have been fully tested, "
"please note that this command may delete unintended file records."
),
fg="yellow",
)
)
click.echo(
click.style("This cannot be undone. Please make sure to back up your database before proceeding.", fg="yellow")
)
click.echo(
click.style(
(
"It is also recommended to run this during the maintenance window, "
"as this may cause high load on your instance."
),
fg="yellow",
)
)
if not force:
click.confirm("Do you want to proceed?", abort=True)
# start the cleanup process
click.echo(click.style("Starting orphaned file records cleanup.", fg="white"))
# clean up the orphaned records in the message_files table where message_id doesn't exist in messages table
try:
click.echo(
click.style("- Listing message_files records where message_id doesn't exist in messages table", fg="white")
)
query = (
"SELECT mf.id, mf.message_id "
"FROM message_files mf LEFT JOIN messages m ON mf.message_id = m.id "
"WHERE m.id IS NULL"
)
orphaned_message_files = []
with db.engine.begin() as conn:
rs = conn.execute(db.text(query))
for i in rs:
orphaned_message_files.append({"id": str(i[0]), "message_id": str(i[1])})
if orphaned_message_files:
click.echo(click.style(f"Found {len(orphaned_message_files)} orphaned message_files records:", fg="white"))
for record in orphaned_message_files:
click.echo(click.style(f" - id: {record['id']}, message_id: {record['message_id']}", fg="black"))
if not force:
click.confirm(
(
f"Do you want to proceed "
f"to delete all {len(orphaned_message_files)} orphaned message_files records?"
),
abort=True,
)
click.echo(click.style("- Deleting orphaned message_files records", fg="white"))
query = "DELETE FROM message_files WHERE id IN :ids"
with db.engine.begin() as conn:
conn.execute(db.text(query), {"ids": tuple([record["id"] for record in orphaned_message_files])})
click.echo(
click.style(f"Removed {len(orphaned_message_files)} orphaned message_files records.", fg="green")
)
else:
click.echo(click.style("No orphaned message_files records found. There is nothing to delete.", fg="green"))
except Exception as e:
click.echo(click.style(f"Error deleting orphaned message_files records: {str(e)}", fg="red"))
# clean up the orphaned records in the rest of the *_files tables
try:
# fetch file id and keys from each table
all_files_in_tables = []
for files_table in files_tables:
click.echo(click.style(f"- Listing file records in table {files_table['table']}", fg="white"))
query = f"SELECT {files_table['id_column']}, {files_table['key_column']} FROM {files_table['table']}"
with db.engine.begin() as conn:
rs = conn.execute(db.text(query))
for i in rs:
all_files_in_tables.append({"table": files_table["table"], "id": str(i[0]), "key": i[1]})
click.echo(click.style(f"Found {len(all_files_in_tables)} files in tables.", fg="white"))
# fetch referred table and columns
guid_regexp = "[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}"
all_ids_in_tables = []
for ids_table in ids_tables:
query = ""
if ids_table["type"] == "uuid":
click.echo(
click.style(
f"- Listing file ids in column {ids_table['column']} in table {ids_table['table']}", fg="white"
)
)
query = (
f"SELECT {ids_table['column']} FROM {ids_table['table']} WHERE {ids_table['column']} IS NOT NULL"
)
with db.engine.begin() as conn:
rs = conn.execute(db.text(query))
for i in rs:
all_ids_in_tables.append({"table": ids_table["table"], "id": str(i[0])})
elif ids_table["type"] == "text":
click.echo(
click.style(
f"- Listing file-id-like strings in column {ids_table['column']} in table {ids_table['table']}",
fg="white",
)
)
query = (
f"SELECT regexp_matches({ids_table['column']}, '{guid_regexp}', 'g') AS extracted_id "
f"FROM {ids_table['table']}"
)
with db.engine.begin() as conn:
rs = conn.execute(db.text(query))
for i in rs:
for j in i[0]:
all_ids_in_tables.append({"table": ids_table["table"], "id": j})
elif ids_table["type"] == "json":
click.echo(
click.style(
(
f"- Listing file-id-like JSON string in column {ids_table['column']} "
f"in table {ids_table['table']}"
),
fg="white",
)
)
query = (
f"SELECT regexp_matches({ids_table['column']}::text, '{guid_regexp}', 'g') AS extracted_id "
f"FROM {ids_table['table']}"
)
with db.engine.begin() as conn:
rs = conn.execute(db.text(query))
for i in rs:
for j in i[0]:
all_ids_in_tables.append({"table": ids_table["table"], "id": j})
click.echo(click.style(f"Found {len(all_ids_in_tables)} file ids in tables.", fg="white"))
except Exception as e:
click.echo(click.style(f"Error fetching keys: {str(e)}", fg="red"))
return
# find orphaned files
all_files = [file["id"] for file in all_files_in_tables]
all_ids = [file["id"] for file in all_ids_in_tables]
orphaned_files = list(set(all_files) - set(all_ids))
if not orphaned_files:
click.echo(click.style("No orphaned file records found. There is nothing to delete.", fg="green"))
return
click.echo(click.style(f"Found {len(orphaned_files)} orphaned file records.", fg="white"))
for file in orphaned_files:
click.echo(click.style(f"- orphaned file id: {file}", fg="black"))
if not force:
click.confirm(f"Do you want to proceed to delete all {len(orphaned_files)} orphaned file records?", abort=True)
# delete orphaned records for each file
try:
for files_table in files_tables:
click.echo(click.style(f"- Deleting orphaned file records in table {files_table['table']}", fg="white"))
query = f"DELETE FROM {files_table['table']} WHERE {files_table['id_column']} IN :ids"
with db.engine.begin() as conn:
conn.execute(db.text(query), {"ids": tuple(orphaned_files)})
except Exception as e:
click.echo(click.style(f"Error deleting orphaned file records: {str(e)}", fg="red"))
return
click.echo(click.style(f"Removed {len(orphaned_files)} orphaned file records.", fg="green"))
@click.option("-f", "--force", is_flag=True, help="Skip user confirmation and force the command to execute.")
@click.command("remove-orphaned-files-on-storage", help="Remove orphaned files on the storage.")
def remove_orphaned_files_on_storage(force: bool):
"""
Remove orphaned files on the storage.
"""
# define tables and columns to process
files_tables = [
{"table": "upload_files", "key_column": "key"},
{"table": "tool_files", "key_column": "file_key"},
]
storage_paths = ["image_files", "tools", "upload_files"]
# notify user and ask for confirmation
click.echo(click.style("This command will find and remove orphaned files on the storage,", fg="yellow"))
click.echo(
click.style("by comparing the files on the storage with the records in the following tables:", fg="yellow")
)
for files_table in files_tables:
click.echo(click.style(f"- {files_table['table']}", fg="yellow"))
click.echo(click.style("The following paths on the storage will be scanned to find orphaned files:", fg="yellow"))
for storage_path in storage_paths:
click.echo(click.style(f"- {storage_path}", fg="yellow"))
click.echo("")
click.echo(click.style("!!! USE WITH CAUTION !!!", fg="red"))
click.echo(
click.style(
"Currently, this command will work only for opendal based storage (STORAGE_TYPE=opendal).", fg="yellow"
)
)
click.echo(
click.style(
"Since not all patterns have been fully tested, please note that this command may delete unintended files.",
fg="yellow",
)
)
click.echo(
click.style("This cannot be undone. Please make sure to back up your storage before proceeding.", fg="yellow")
)
click.echo(
click.style(
(
"It is also recommended to run this during the maintenance window, "
"as this may cause high load on your instance."
),
fg="yellow",
)
)
if not force:
click.confirm("Do you want to proceed?", abort=True)
# start the cleanup process
click.echo(click.style("Starting orphaned files cleanup.", fg="white"))
# fetch file id and keys from each table
all_files_in_tables = []
try:
for files_table in files_tables:
click.echo(click.style(f"- Listing files from table {files_table['table']}", fg="white"))
query = f"SELECT {files_table['key_column']} FROM {files_table['table']}"
with db.engine.begin() as conn:
rs = conn.execute(db.text(query))
for i in rs:
all_files_in_tables.append(str(i[0]))
click.echo(click.style(f"Found {len(all_files_in_tables)} files in tables.", fg="white"))
except Exception as e:
click.echo(click.style(f"Error fetching keys: {str(e)}", fg="red"))
all_files_on_storage = []
for storage_path in storage_paths:
try:
click.echo(click.style(f"- Scanning files on storage path {storage_path}", fg="white"))
files = storage.scan(path=storage_path, files=True, directories=False)
all_files_on_storage.extend(files)
except FileNotFoundError as e:
click.echo(click.style(f" -> Skipping path {storage_path} as it does not exist.", fg="yellow"))
continue
except Exception as e:
click.echo(click.style(f" -> Error scanning files on storage path {storage_path}: {str(e)}", fg="red"))
continue
click.echo(click.style(f"Found {len(all_files_on_storage)} files on storage.", fg="white"))
# find orphaned files
orphaned_files = list(set(all_files_on_storage) - set(all_files_in_tables))
if not orphaned_files:
click.echo(click.style("No orphaned files found. There is nothing to remove.", fg="green"))
return
click.echo(click.style(f"Found {len(orphaned_files)} orphaned files.", fg="white"))
for file in orphaned_files:
click.echo(click.style(f"- orphaned file: {file}", fg="black"))
if not force:
click.confirm(f"Do you want to proceed to remove all {len(orphaned_files)} orphaned files?", abort=True)
# delete orphaned files
removed_files = 0
error_files = 0
for file in orphaned_files:
try:
storage.delete(file)
removed_files += 1
click.echo(click.style(f"- Removing orphaned file: {file}", fg="white"))
except Exception as e:
error_files += 1
click.echo(click.style(f"- Error deleting orphaned file {file}: {str(e)}", fg="red"))
continue
if error_files == 0:
click.echo(click.style(f"Removed {removed_files} orphaned files without errors.", fg="green"))
else:
click.echo(click.style(f"Removed {removed_files} orphaned files, with {error_files} errors.", fg="yellow"))

View File

@ -74,7 +74,7 @@ class CodeExecutionSandboxConfig(BaseSettings):
CODE_EXECUTION_ENDPOINT: HttpUrl = Field(
description="URL endpoint for the code execution service",
default="http://sandbox:8194",
default=HttpUrl("http://sandbox:8194"),
)
CODE_EXECUTION_API_KEY: str = Field(
@ -145,7 +145,7 @@ class PluginConfig(BaseSettings):
PLUGIN_DAEMON_URL: HttpUrl = Field(
description="Plugin API URL",
default="http://localhost:5002",
default=HttpUrl("http://localhost:5002"),
)
PLUGIN_DAEMON_KEY: str = Field(
@ -188,7 +188,7 @@ class MarketplaceConfig(BaseSettings):
MARKETPLACE_API_URL: HttpUrl = Field(
description="Marketplace API URL",
default="https://marketplace.dify.ai",
default=HttpUrl("https://marketplace.dify.ai"),
)
@ -398,6 +398,11 @@ class InnerAPIConfig(BaseSettings):
default=False,
)
INNER_API_KEY: Optional[str] = Field(
description="API key for accessing the internal API",
default=None,
)
class LoggingConfig(BaseSettings):
"""

View File

@ -1,6 +1,6 @@
import os
from typing import Any, Literal, Optional
from urllib.parse import quote_plus
from urllib.parse import parse_qsl, quote_plus
from pydantic import Field, NonNegativeInt, PositiveFloat, PositiveInt, computed_field
from pydantic_settings import BaseSettings
@ -39,6 +39,7 @@ from .vdb.tencent_vector_config import TencentVectorDBConfig
from .vdb.tidb_on_qdrant_config import TidbOnQdrantConfig
from .vdb.tidb_vector_config import TiDBVectorConfig
from .vdb.upstash_config import UpstashConfig
from .vdb.vastbase_vector_config import VastbaseVectorConfig
from .vdb.vikingdb_config import VikingDBConfig
from .vdb.weaviate_config import WeaviateConfig
@ -172,17 +173,31 @@ class DatabaseConfig(BaseSettings):
RETRIEVAL_SERVICE_EXECUTORS: NonNegativeInt = Field(
description="Number of processes for the retrieval service, default to CPU cores.",
default=os.cpu_count(),
default=os.cpu_count() or 1,
)
@computed_field
@computed_field # type: ignore[misc]
@property
def SQLALCHEMY_ENGINE_OPTIONS(self) -> dict[str, Any]:
# Parse DB_EXTRAS for 'options'
db_extras_dict = dict(parse_qsl(self.DB_EXTRAS))
options = db_extras_dict.get("options", "")
# Always include timezone
timezone_opt = "-c timezone=UTC"
if options:
# Merge user options and timezone
merged_options = f"{options} {timezone_opt}"
else:
merged_options = timezone_opt
connect_args = {"options": merged_options}
return {
"pool_size": self.SQLALCHEMY_POOL_SIZE,
"max_overflow": self.SQLALCHEMY_MAX_OVERFLOW,
"pool_recycle": self.SQLALCHEMY_POOL_RECYCLE,
"pool_pre_ping": self.SQLALCHEMY_POOL_PRE_PING,
"connect_args": {"options": "-c timezone=UTC"},
"connect_args": connect_args,
}
@ -270,6 +285,7 @@ class MiddlewareConfig(
OpenSearchConfig,
OracleConfig,
PGVectorConfig,
VastbaseVectorConfig,
PGVectoRSConfig,
QdrantConfig,
RelytConfig,

View File

@ -83,3 +83,13 @@ class RedisConfig(BaseSettings):
description="Password for Redis Clusters authentication (if required)",
default=None,
)
REDIS_SERIALIZATION_PROTOCOL: int = Field(
description="Redis serialization protocol (RESP) version",
default=3,
)
REDIS_ENABLE_CLIENT_SIDE_CACHE: bool = Field(
description="Enable client side cache in redis",
default=False,
)

View File

@ -32,3 +32,4 @@ class LindormConfig(BaseSettings):
description="Using UGC index will store the same type of Index in a single index but can retrieve separately.",
default=False,
)
LINDORM_QUERY_TIMEOUT: Optional[float] = Field(description="The lindorm search request timeout (s)", default=2.0)

View File

@ -1,4 +1,5 @@
from typing import Optional
import enum
from typing import Literal, Optional
from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings
@ -9,6 +10,14 @@ class OpenSearchConfig(BaseSettings):
Configuration settings for OpenSearch
"""
class AuthMethod(enum.StrEnum):
"""
Authentication method for OpenSearch
"""
BASIC = "basic"
AWS_MANAGED_IAM = "aws_managed_iam"
OPENSEARCH_HOST: Optional[str] = Field(
description="Hostname or IP address of the OpenSearch server (e.g., 'localhost' or 'opensearch.example.com')",
default=None,
@ -19,6 +28,16 @@ class OpenSearchConfig(BaseSettings):
default=9200,
)
OPENSEARCH_SECURE: bool = Field(
description="Whether to use SSL/TLS encrypted connection for OpenSearch (True for HTTPS, False for HTTP)",
default=False,
)
OPENSEARCH_AUTH_METHOD: AuthMethod = Field(
description="Authentication method for OpenSearch connection (default is 'basic')",
default=AuthMethod.BASIC,
)
OPENSEARCH_USER: Optional[str] = Field(
description="Username for authenticating with OpenSearch",
default=None,
@ -29,7 +48,11 @@ class OpenSearchConfig(BaseSettings):
default=None,
)
OPENSEARCH_SECURE: bool = Field(
description="Whether to use SSL/TLS encrypted connection for OpenSearch (True for HTTPS, False for HTTP)",
default=False,
OPENSEARCH_AWS_REGION: Optional[str] = Field(
description="AWS region for OpenSearch (e.g. 'us-west-2')",
default=None,
)
OPENSEARCH_AWS_SERVICE: Optional[Literal["es", "aoss"]] = Field(
description="AWS service for OpenSearch (e.g. 'aoss' for OpenSearch Serverless)", default=None
)

View File

@ -0,0 +1,45 @@
from typing import Optional
from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings
class VastbaseVectorConfig(BaseSettings):
"""
Configuration settings for Vector (Vastbase with vector extension)
"""
VASTBASE_HOST: Optional[str] = Field(
description="Hostname or IP address of the Vastbase server with Vector extension (e.g., 'localhost')",
default=None,
)
VASTBASE_PORT: PositiveInt = Field(
description="Port number on which the Vastbase server is listening (default is 5432)",
default=5432,
)
VASTBASE_USER: Optional[str] = Field(
description="Username for authenticating with the Vastbase database",
default=None,
)
VASTBASE_PASSWORD: Optional[str] = Field(
description="Password for authenticating with the Vastbase database",
default=None,
)
VASTBASE_DATABASE: Optional[str] = Field(
description="Name of the Vastbase database to connect to",
default=None,
)
VASTBASE_MIN_CONNECTION: PositiveInt = Field(
description="Min connection of the Vastbase database",
default=1,
)
VASTBASE_MAX_CONNECTION: PositiveInt = Field(
description="Max connection of the Vastbase database",
default=5,
)

View File

@ -27,6 +27,11 @@ class OTelConfig(BaseSettings):
default="otlp",
)
OTEL_EXPORTER_OTLP_PROTOCOL: str = Field(
description="OTLP exporter protocol ('grpc' or 'http')",
default="http",
)
OTEL_SAMPLING_RATE: float = Field(default=0.1, description="Sampling rate for traces (0.0 to 1.0)")
OTEL_BATCH_EXPORT_SCHEDULE_DELAY: int = Field(

View File

@ -9,7 +9,7 @@ class PackagingInfo(BaseSettings):
CURRENT_VERSION: str = Field(
description="Dify version",
default="1.3.0",
default="1.4.0",
)
COMMIT_SHA: str = Field(

View File

@ -16,11 +16,25 @@ AUDIO_EXTENSIONS.extend([ext.upper() for ext in AUDIO_EXTENSIONS])
if dify_config.ETL_TYPE == "Unstructured":
DOCUMENT_EXTENSIONS = ["txt", "markdown", "md", "mdx", "pdf", "html", "htm", "xlsx", "xls"]
DOCUMENT_EXTENSIONS = ["txt", "markdown", "md", "mdx", "pdf", "html", "htm", "xlsx", "xls", "vtt", "properties"]
DOCUMENT_EXTENSIONS.extend(("doc", "docx", "csv", "eml", "msg", "pptx", "xml", "epub"))
if dify_config.UNSTRUCTURED_API_URL:
DOCUMENT_EXTENSIONS.append("ppt")
DOCUMENT_EXTENSIONS.extend([ext.upper() for ext in DOCUMENT_EXTENSIONS])
else:
DOCUMENT_EXTENSIONS = ["txt", "markdown", "md", "mdx", "pdf", "html", "htm", "xlsx", "xls", "docx", "csv"]
DOCUMENT_EXTENSIONS = [
"txt",
"markdown",
"md",
"mdx",
"pdf",
"html",
"htm",
"xlsx",
"xls",
"docx",
"csv",
"vtt",
"properties",
]
DOCUMENT_EXTENSIONS.extend([ext.upper() for ext in DOCUMENT_EXTENSIONS])

View File

@ -0,0 +1,7 @@
# The two constants below should keep in sync.
# Default content type for files which have no explicit content type.
DEFAULT_MIME_TYPE = "application/octet-stream"
# Default file extension for files which have no explicit content type, should
# correspond to the `DEFAULT_MIME_TYPE` above.
DEFAULT_EXTENSION = ".bin"

View File

@ -1,4 +1,6 @@
from flask_restful import fields # type: ignore
from flask_restful import fields
from libs.helper import AppIconUrlField
parameters__system_parameters = {
"image_file_size_limit": fields.Integer,
@ -22,3 +24,20 @@ parameters_fields = {
"file_upload": fields.Raw,
"system_parameters": fields.Nested(parameters__system_parameters),
}
site_fields = {
"title": fields.String,
"chat_color_theme": fields.String,
"chat_color_theme_inverted": fields.Boolean,
"icon_type": fields.String,
"icon": fields.String,
"icon_background": fields.String,
"icon_url": AppIconUrlField,
"description": fields.String,
"copyright": fields.String,
"privacy_policy": fields.String,
"custom_disclaimer": fields.String,
"default_language": fields.String,
"show_workflow_steps": fields.Boolean,
"use_icon_as_answer_icon": fields.Boolean,
}

View File

@ -1,7 +1,7 @@
from functools import wraps
from flask import request
from flask_restful import Resource, reqparse # type: ignore
from flask_restful import Resource, reqparse
from sqlalchemy import select
from sqlalchemy.orm import Session
from werkzeug.exceptions import NotFound, Unauthorized

View File

@ -1,7 +1,7 @@
from typing import Any
import flask_restful # type: ignore
from flask_login import current_user # type: ignore
import flask_restful
from flask_login import current_user
from flask_restful import Resource, fields, marshal_with
from sqlalchemy import select
from sqlalchemy.orm import Session

View File

@ -1,4 +1,4 @@
from flask_restful import Resource, reqparse # type: ignore
from flask_restful import Resource, reqparse
from controllers.console import api
from controllers.console.wraps import account_initialization_required, setup_required

View File

@ -1,4 +1,4 @@
from flask_restful import Resource, reqparse # type: ignore
from flask_restful import Resource, reqparse
from controllers.console import api
from controllers.console.app.wraps import get_app_model

View File

@ -1,6 +1,6 @@
from flask import request
from flask_login import current_user # type: ignore
from flask_restful import Resource, marshal, marshal_with, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, marshal, marshal_with, reqparse
from werkzeug.exceptions import Forbidden
from controllers.console import api
@ -186,7 +186,7 @@ class AnnotationUpdateDeleteApi(Resource):
app_id = str(app_id)
annotation_id = str(annotation_id)
AppAnnotationService.delete_app_annotation(app_id, annotation_id)
return {"result": "success"}, 200
return {"result": "success"}, 204
class AnnotationBatchImportApi(Resource):

View File

@ -1,8 +1,8 @@
import uuid
from typing import cast
from flask_login import current_user # type: ignore
from flask_restful import Resource, inputs, marshal, marshal_with, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, inputs, marshal, marshal_with, reqparse
from sqlalchemy import select
from sqlalchemy.orm import Session
from werkzeug.exceptions import BadRequest, Forbidden, abort

View File

@ -1,7 +1,7 @@
from typing import cast
from flask_login import current_user # type: ignore
from flask_restful import Resource, marshal_with, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, marshal_with, reqparse
from sqlalchemy.orm import Session
from werkzeug.exceptions import Forbidden

View File

@ -1,7 +1,7 @@
import logging
from flask import request
from flask_restful import Resource, reqparse # type: ignore
from flask_restful import Resource, reqparse
from werkzeug.exceptions import InternalServerError
import services

View File

@ -1,7 +1,7 @@
import logging
import flask_login # type: ignore
from flask_restful import Resource, reqparse # type: ignore
import flask_login
from flask_restful import Resource, reqparse
from werkzeug.exceptions import InternalServerError, NotFound
import services

View File

@ -1,9 +1,9 @@
from datetime import UTC, datetime
import pytz # pip install pytz
from flask_login import current_user # type: ignore
from flask_restful import Resource, marshal_with, reqparse # type: ignore
from flask_restful.inputs import int_range # type: ignore
from flask_login import current_user
from flask_restful import Resource, marshal_with, reqparse
from flask_restful.inputs import int_range
from sqlalchemy import func, or_
from sqlalchemy.orm import joinedload
from werkzeug.exceptions import Forbidden, NotFound

View File

@ -1,4 +1,4 @@
from flask_restful import Resource, marshal_with, reqparse # type: ignore
from flask_restful import Resource, marshal_with, reqparse
from sqlalchemy import select
from sqlalchemy.orm import Session

View File

@ -1,7 +1,7 @@
import os
from flask_login import current_user # type: ignore
from flask_restful import Resource, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, reqparse
from controllers.console import api
from controllers.console.app.error import (

View File

@ -1,8 +1,8 @@
import logging
from flask_login import current_user # type: ignore
from flask_restful import Resource, fields, marshal_with, reqparse # type: ignore
from flask_restful.inputs import int_range # type: ignore
from flask_login import current_user
from flask_restful import Resource, fields, marshal_with, reqparse
from flask_restful.inputs import int_range
from werkzeug.exceptions import Forbidden, InternalServerError, NotFound
from controllers.console import api

View File

@ -2,8 +2,8 @@ import json
from typing import cast
from flask import request
from flask_login import current_user # type: ignore
from flask_restful import Resource # type: ignore
from flask_login import current_user
from flask_restful import Resource
from controllers.console import api
from controllers.console.app.wraps import get_app_model

View File

@ -1,4 +1,4 @@
from flask_restful import Resource, reqparse # type: ignore
from flask_restful import Resource, reqparse
from werkzeug.exceptions import BadRequest
from controllers.console import api
@ -84,7 +84,7 @@ class TraceAppConfigApi(Resource):
result = OpsService.delete_tracing_app_config(app_id=app_id, tracing_provider=args["tracing_provider"])
if not result:
raise TracingConfigNotExist()
return {"result": "success"}
return {"result": "success"}, 204
except Exception as e:
raise BadRequest(str(e))

View File

@ -1,7 +1,7 @@
from datetime import UTC, datetime
from flask_login import current_user # type: ignore
from flask_restful import Resource, marshal_with, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, marshal_with, reqparse
from werkzeug.exceptions import Forbidden, NotFound
from constants.languages import supported_language

View File

@ -3,8 +3,8 @@ from decimal import Decimal
import pytz
from flask import jsonify
from flask_login import current_user # type: ignore
from flask_restful import Resource, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, reqparse
from controllers.console import api
from controllers.console.app.wraps import get_app_model

View File

@ -3,7 +3,7 @@ import logging
from typing import cast
from flask import abort, request
from flask_restful import Resource, inputs, marshal_with, reqparse # type: ignore
from flask_restful import Resource, inputs, marshal_with, reqparse
from sqlalchemy.orm import Session
from werkzeug.exceptions import Forbidden, InternalServerError, NotFound

View File

@ -1,6 +1,6 @@
from dateutil.parser import isoparse
from flask_restful import Resource, marshal_with, reqparse # type: ignore
from flask_restful.inputs import int_range # type: ignore
from flask_restful import Resource, marshal_with, reqparse
from flask_restful.inputs import int_range
from sqlalchemy.orm import Session
from controllers.console import api

View File

@ -1,5 +1,5 @@
from flask_restful import Resource, marshal_with, reqparse # type: ignore
from flask_restful.inputs import int_range # type: ignore
from flask_restful import Resource, marshal_with, reqparse
from flask_restful.inputs import int_range
from controllers.console import api
from controllers.console.app.wraps import get_app_model

View File

@ -3,8 +3,8 @@ from decimal import Decimal
import pytz
from flask import jsonify
from flask_login import current_user # type: ignore
from flask_restful import Resource, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, reqparse
from controllers.console import api
from controllers.console.app.wraps import get_app_model

View File

@ -1,7 +1,7 @@
import datetime
from flask import request
from flask_restful import Resource, reqparse # type: ignore
from flask_restful import Resource, reqparse
from constants.languages import supported_language
from controllers.console import api

View File

@ -1,5 +1,5 @@
from flask_login import current_user # type: ignore
from flask_restful import Resource, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, reqparse
from werkzeug.exceptions import Forbidden
from controllers.console import api
@ -65,7 +65,7 @@ class ApiKeyAuthDataSourceBindingDelete(Resource):
ApiKeyAuthService.delete_provider_auth(current_user.current_tenant_id, binding_id)
return {"result": "success"}, 200
return {"result": "success"}, 204
api.add_resource(ApiKeyAuthDataSource, "/api-key-auth/data-source")

View File

@ -2,8 +2,8 @@ import logging
import requests
from flask import current_app, redirect, request
from flask_login import current_user # type: ignore
from flask_restful import Resource # type: ignore
from flask_login import current_user
from flask_restful import Resource
from werkzeug.exceptions import Forbidden
from configs import dify_config

View File

@ -2,7 +2,7 @@ import base64
import secrets
from flask import request
from flask_restful import Resource, reqparse # type: ignore
from flask_restful import Resource, reqparse
from sqlalchemy import select
from sqlalchemy.orm import Session

View File

@ -1,8 +1,8 @@
from typing import cast
import flask_login # type: ignore
import flask_login
from flask import request
from flask_restful import Resource, reqparse # type: ignore
from flask_restful import Resource, reqparse
import services
from configs import dify_config

View File

@ -4,7 +4,7 @@ from typing import Optional
import requests
from flask import current_app, redirect, request
from flask_restful import Resource # type: ignore
from flask_restful import Resource
from sqlalchemy import select
from sqlalchemy.orm import Session
from werkzeug.exceptions import Unauthorized

View File

@ -1,5 +1,5 @@
from flask_login import current_user # type: ignore
from flask_restful import Resource, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, reqparse
from controllers.console import api
from controllers.console.wraps import account_initialization_required, only_edition_cloud, setup_required

View File

@ -1,6 +1,6 @@
from flask import request
from flask_login import current_user # type: ignore
from flask_restful import Resource, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, reqparse
from libs.helper import extract_remote_ip
from libs.login import login_required

View File

@ -2,8 +2,8 @@ import datetime
import json
from flask import request
from flask_login import current_user # type: ignore
from flask_restful import Resource, marshal_with, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, marshal_with, reqparse
from sqlalchemy import select
from sqlalchemy.orm import Session
from werkzeug.exceptions import NotFound

View File

@ -1,7 +1,7 @@
import flask_restful # type: ignore
import flask_restful
from flask import request
from flask_login import current_user # type: ignore # type: ignore
from flask_restful import Resource, marshal, marshal_with, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, marshal, marshal_with, reqparse
from werkzeug.exceptions import Forbidden, NotFound
import services
@ -526,14 +526,20 @@ class DatasetIndexingStatusApi(Resource):
)
documents_status = []
for document in documents:
completed_segments = DocumentSegment.query.filter(
DocumentSegment.completed_at.isnot(None),
DocumentSegment.document_id == str(document.id),
DocumentSegment.status != "re_segment",
).count()
total_segments = DocumentSegment.query.filter(
DocumentSegment.document_id == str(document.id), DocumentSegment.status != "re_segment"
).count()
completed_segments = (
db.session.query(DocumentSegment)
.filter(
DocumentSegment.completed_at.isnot(None),
DocumentSegment.document_id == str(document.id),
DocumentSegment.status != "re_segment",
)
.count()
)
total_segments = (
db.session.query(DocumentSegment)
.filter(DocumentSegment.document_id == str(document.id), DocumentSegment.status != "re_segment")
.count()
)
document.completed_segments = completed_segments
document.total_segments = total_segments
documents_status.append(marshal(document, document_status_fields))
@ -657,6 +663,7 @@ class DatasetRetrievalSettingApi(Resource):
| VectorType.ELASTICSEARCH
| VectorType.ELASTICSEARCH_JA
| VectorType.PGVECTOR
| VectorType.VASTBASE
| VectorType.TIDB_ON_QDRANT
| VectorType.LINDORM
| VectorType.COUCHBASE
@ -706,6 +713,7 @@ class DatasetRetrievalSettingMockApi(Resource):
| VectorType.ELASTICSEARCH_JA
| VectorType.COUCHBASE
| VectorType.PGVECTOR
| VectorType.VASTBASE
| VectorType.LINDORM
| VectorType.OPENGAUSS
| VectorType.OCEANBASE
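Both hunks above add VectorType.VASTBASE to the case groups that report which retrieval methods a vector store supports. Structurally this is Python's or-pattern inside a match statement, sketched below with simplified names (the enum members and the returned method list are assumptions for illustration, not copied from the file):

from enum import Enum

class VectorType(str, Enum):
    PGVECTOR = "pgvector"
    VASTBASE = "vastbase"
    CHROMA = "chroma"

def supported_retrieval_methods(vector_type: VectorType) -> list[str]:
    match vector_type:
        # An or-pattern groups several members under one case, so enabling
        # full-text/hybrid retrieval for a new store is a one-line addition.
        case VectorType.PGVECTOR | VectorType.VASTBASE:
            return ["semantic_search", "full_text_search", "hybrid_search"]
        case _:
            return ["semantic_search"]

print(supported_retrieval_methods(VectorType.VASTBASE))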

View File

@ -4,9 +4,9 @@ from datetime import UTC, datetime
from typing import cast
from flask import request
from flask_login import current_user # type: ignore
from flask_restful import Resource, fields, marshal, marshal_with, reqparse # type: ignore
from sqlalchemy import asc, desc
from flask_login import current_user
from flask_restful import Resource, fields, marshal, marshal_with, reqparse
from sqlalchemy import asc, desc, select
from werkzeug.exceptions import Forbidden, NotFound
import services
@ -40,7 +40,7 @@ from core.indexing_runner import IndexingRunner
from core.model_manager import ModelManager
from core.model_runtime.entities.model_entities import ModelType
from core.model_runtime.errors.invoke import InvokeAuthorizationError
from core.plugin.manager.exc import PluginDaemonClientSideError
from core.plugin.impl.exc import PluginDaemonClientSideError
from core.rag.extractor.entity.extract_setting import ExtractSetting
from extensions.ext_database import db
from extensions.ext_redis import redis_client
@ -112,7 +112,7 @@ class GetProcessRuleApi(Resource):
limits = DocumentService.DEFAULT_RULES["limits"]
if document_id:
# get the latest process rule
document = Document.query.get_or_404(document_id)
document = db.get_or_404(Document, document_id)
dataset = DatasetService.get_dataset(document.dataset_id)
@ -175,7 +175,7 @@ class DatasetDocumentListApi(Resource):
except services.errors.account.NoPermissionError as e:
raise Forbidden(str(e))
query = Document.query.filter_by(dataset_id=str(dataset_id), tenant_id=current_user.current_tenant_id)
query = select(Document).filter_by(dataset_id=str(dataset_id), tenant_id=current_user.current_tenant_id)
if search:
search = f"%{search}%"
@ -209,18 +209,24 @@ class DatasetDocumentListApi(Resource):
desc(Document.position),
)
paginated_documents = query.paginate(page=page, per_page=limit, max_per_page=100, error_out=False)
paginated_documents = db.paginate(select=query, page=page, per_page=limit, max_per_page=100, error_out=False)
documents = paginated_documents.items
if fetch:
for document in documents:
completed_segments = DocumentSegment.query.filter(
DocumentSegment.completed_at.isnot(None),
DocumentSegment.document_id == str(document.id),
DocumentSegment.status != "re_segment",
).count()
total_segments = DocumentSegment.query.filter(
DocumentSegment.document_id == str(document.id), DocumentSegment.status != "re_segment"
).count()
completed_segments = (
db.session.query(DocumentSegment)
.filter(
DocumentSegment.completed_at.isnot(None),
DocumentSegment.document_id == str(document.id),
DocumentSegment.status != "re_segment",
)
.count()
)
total_segments = (
db.session.query(DocumentSegment)
.filter(DocumentSegment.document_id == str(document.id), DocumentSegment.status != "re_segment")
.count()
)
document.completed_segments = completed_segments
document.total_segments = total_segments
data = marshal(documents, document_with_segments_fields)
@ -563,14 +569,20 @@ class DocumentBatchIndexingStatusApi(DocumentResource):
documents = self.get_batch_documents(dataset_id, batch)
documents_status = []
for document in documents:
completed_segments = DocumentSegment.query.filter(
DocumentSegment.completed_at.isnot(None),
DocumentSegment.document_id == str(document.id),
DocumentSegment.status != "re_segment",
).count()
total_segments = DocumentSegment.query.filter(
DocumentSegment.document_id == str(document.id), DocumentSegment.status != "re_segment"
).count()
completed_segments = (
db.session.query(DocumentSegment)
.filter(
DocumentSegment.completed_at.isnot(None),
DocumentSegment.document_id == str(document.id),
DocumentSegment.status != "re_segment",
)
.count()
)
total_segments = (
db.session.query(DocumentSegment)
.filter(DocumentSegment.document_id == str(document.id), DocumentSegment.status != "re_segment")
.count()
)
document.completed_segments = completed_segments
document.total_segments = total_segments
if document.is_paused:
@ -589,14 +601,20 @@ class DocumentIndexingStatusApi(DocumentResource):
document_id = str(document_id)
document = self.get_document(dataset_id, document_id)
completed_segments = DocumentSegment.query.filter(
DocumentSegment.completed_at.isnot(None),
DocumentSegment.document_id == str(document_id),
DocumentSegment.status != "re_segment",
).count()
total_segments = DocumentSegment.query.filter(
DocumentSegment.document_id == str(document_id), DocumentSegment.status != "re_segment"
).count()
completed_segments = (
db.session.query(DocumentSegment)
.filter(
DocumentSegment.completed_at.isnot(None),
DocumentSegment.document_id == str(document_id),
DocumentSegment.status != "re_segment",
)
.count()
)
total_segments = (
db.session.query(DocumentSegment)
.filter(DocumentSegment.document_id == str(document_id), DocumentSegment.status != "re_segment")
.count()
)
document.completed_segments = completed_segments
document.total_segments = total_segments
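The query changes in this file all move off the legacy Model.query property: counts go through db.session.query(...), list endpoints build a 2.0-style select() and hand it to db.paginate(...), and the primary-key lookup uses db.get_or_404(...). A condensed, self-contained sketch of the pattern under Flask-SQLAlchemy 3.x, with minimal stand-in models (the real Document and DocumentSegment models carry many more columns):

from flask import Flask
from flask_sqlalchemy import SQLAlchemy
from sqlalchemy import select

app = Flask(__name__)
app.config["SQLALCHEMY_DATABASE_URI"] = "sqlite:///:memory:"
db = SQLAlchemy(app)

class Document(db.Model):  # stand-in for the real model
    id = db.Column(db.String, primary_key=True)
    dataset_id = db.Column(db.String)

class DocumentSegment(db.Model):  # stand-in for the real model
    id = db.Column(db.String, primary_key=True)
    document_id = db.Column(db.String)
    completed_at = db.Column(db.DateTime)
    status = db.Column(db.String)
    position = db.Column(db.Integer)

with app.app_context():
    db.create_all()
    db.session.add(Document(id="doc-1", dataset_id="ds-1"))
    db.session.commit()

    # Session-based count, replacing DocumentSegment.query.filter(...).count():
    completed_segments = (
        db.session.query(DocumentSegment)
        .filter(
            DocumentSegment.completed_at.isnot(None),
            DocumentSegment.document_id == "doc-1",
            DocumentSegment.status != "re_segment",
        )
        .count()
    )

    # select() statement paginated by the extension, replacing query.paginate(...):
    stmt = select(Document).filter_by(dataset_id="ds-1")
    page = db.paginate(select=stmt, page=1, per_page=20, max_per_page=100, error_out=False)

    # Primary-key lookup that aborts with 404 when missing, replacing Model.query.get_or_404(...):
    document = db.get_or_404(Document, "doc-1")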

View File

@ -2,8 +2,9 @@ import uuid
import pandas as pd
from flask import request
from flask_login import current_user # type: ignore
from flask_restful import Resource, marshal, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, marshal, reqparse
from sqlalchemy import select
from werkzeug.exceptions import Forbidden, NotFound
import services
@ -26,6 +27,7 @@ from controllers.console.wraps import (
from core.errors.error import LLMBadRequestError, ProviderTokenNotInitError
from core.model_manager import ModelManager
from core.model_runtime.entities.model_entities import ModelType
from extensions.ext_database import db
from extensions.ext_redis import redis_client
from fields.segment_fields import child_chunk_fields, segment_fields
from libs.login import login_required
@ -74,9 +76,14 @@ class DatasetDocumentSegmentListApi(Resource):
hit_count_gte = args["hit_count_gte"]
keyword = args["keyword"]
query = DocumentSegment.query.filter(
DocumentSegment.document_id == str(document_id), DocumentSegment.tenant_id == current_user.current_tenant_id
).order_by(DocumentSegment.position.asc())
query = (
select(DocumentSegment)
.filter(
DocumentSegment.document_id == str(document_id),
DocumentSegment.tenant_id == current_user.current_tenant_id,
)
.order_by(DocumentSegment.position.asc())
)
if status_list:
query = query.filter(DocumentSegment.status.in_(status_list))
@ -93,7 +100,7 @@ class DatasetDocumentSegmentListApi(Resource):
elif args["enabled"].lower() == "false":
query = query.filter(DocumentSegment.enabled == False)
segments = query.paginate(page=page, per_page=limit, max_per_page=100, error_out=False)
segments = db.paginate(select=query, page=page, per_page=limit, max_per_page=100, error_out=False)
response = {
"data": marshal(segments.items, segment_fields),
@ -131,7 +138,7 @@ class DatasetDocumentSegmentListApi(Resource):
except services.errors.account.NoPermissionError as e:
raise Forbidden(str(e))
SegmentService.delete_segments(segment_ids, document, dataset)
return {"result": "success"}, 200
return {"result": "success"}, 204
class DatasetDocumentSegmentApi(Resource):
@ -276,9 +283,11 @@ class DatasetDocumentSegmentUpdateApi(Resource):
raise ProviderNotInitializeError(ex.description)
# check segment
segment_id = str(segment_id)
segment = DocumentSegment.query.filter(
DocumentSegment.id == str(segment_id), DocumentSegment.tenant_id == current_user.current_tenant_id
).first()
segment = (
db.session.query(DocumentSegment)
.filter(DocumentSegment.id == str(segment_id), DocumentSegment.tenant_id == current_user.current_tenant_id)
.first()
)
if not segment:
raise NotFound("Segment not found.")
# The role of the current user in the ta table must be admin, owner, dataset_operator, or editor
@ -320,9 +329,11 @@ class DatasetDocumentSegmentUpdateApi(Resource):
raise NotFound("Document not found.")
# check segment
segment_id = str(segment_id)
segment = DocumentSegment.query.filter(
DocumentSegment.id == str(segment_id), DocumentSegment.tenant_id == current_user.current_tenant_id
).first()
segment = (
db.session.query(DocumentSegment)
.filter(DocumentSegment.id == str(segment_id), DocumentSegment.tenant_id == current_user.current_tenant_id)
.first()
)
if not segment:
raise NotFound("Segment not found.")
# The role of the current user in the ta table must be admin, owner, dataset_operator, or editor
@ -333,7 +344,7 @@ class DatasetDocumentSegmentUpdateApi(Resource):
except services.errors.account.NoPermissionError as e:
raise Forbidden(str(e))
SegmentService.delete_segment(segment, document, dataset)
return {"result": "success"}, 200
return {"result": "success"}, 204
class DatasetDocumentSegmentBatchImportApi(Resource):
@ -423,9 +434,11 @@ class ChildChunkAddApi(Resource):
raise NotFound("Document not found.")
# check segment
segment_id = str(segment_id)
segment = DocumentSegment.query.filter(
DocumentSegment.id == str(segment_id), DocumentSegment.tenant_id == current_user.current_tenant_id
).first()
segment = (
db.session.query(DocumentSegment)
.filter(DocumentSegment.id == str(segment_id), DocumentSegment.tenant_id == current_user.current_tenant_id)
.first()
)
if not segment:
raise NotFound("Segment not found.")
if not current_user.is_dataset_editor:
@ -478,9 +491,11 @@ class ChildChunkAddApi(Resource):
raise NotFound("Document not found.")
# check segment
segment_id = str(segment_id)
segment = DocumentSegment.query.filter(
DocumentSegment.id == str(segment_id), DocumentSegment.tenant_id == current_user.current_tenant_id
).first()
segment = (
db.session.query(DocumentSegment)
.filter(DocumentSegment.id == str(segment_id), DocumentSegment.tenant_id == current_user.current_tenant_id)
.first()
)
if not segment:
raise NotFound("Segment not found.")
parser = reqparse.RequestParser()
@ -523,9 +538,11 @@ class ChildChunkAddApi(Resource):
raise NotFound("Document not found.")
# check segment
segment_id = str(segment_id)
segment = DocumentSegment.query.filter(
DocumentSegment.id == str(segment_id), DocumentSegment.tenant_id == current_user.current_tenant_id
).first()
segment = (
db.session.query(DocumentSegment)
.filter(DocumentSegment.id == str(segment_id), DocumentSegment.tenant_id == current_user.current_tenant_id)
.first()
)
if not segment:
raise NotFound("Segment not found.")
# The role of the current user in the ta table must be admin, owner, dataset_operator, or editor
@ -567,16 +584,20 @@ class ChildChunkUpdateApi(Resource):
raise NotFound("Document not found.")
# check segment
segment_id = str(segment_id)
segment = DocumentSegment.query.filter(
DocumentSegment.id == str(segment_id), DocumentSegment.tenant_id == current_user.current_tenant_id
).first()
segment = (
db.session.query(DocumentSegment)
.filter(DocumentSegment.id == str(segment_id), DocumentSegment.tenant_id == current_user.current_tenant_id)
.first()
)
if not segment:
raise NotFound("Segment not found.")
# check child chunk
child_chunk_id = str(child_chunk_id)
child_chunk = ChildChunk.query.filter(
ChildChunk.id == str(child_chunk_id), ChildChunk.tenant_id == current_user.current_tenant_id
).first()
child_chunk = (
db.session.query(ChildChunk)
.filter(ChildChunk.id == str(child_chunk_id), ChildChunk.tenant_id == current_user.current_tenant_id)
.first()
)
if not child_chunk:
raise NotFound("Child chunk not found.")
# The role of the current user in the ta table must be admin, owner, dataset_operator, or editor
@ -590,7 +611,7 @@ class ChildChunkUpdateApi(Resource):
SegmentService.delete_child_chunk(child_chunk, dataset)
except ChildChunkDeleteIndexServiceError as e:
raise ChildChunkDeleteIndexError(str(e))
return {"result": "success"}, 200
return {"result": "success"}, 204
@setup_required
@login_required
@ -612,16 +633,20 @@ class ChildChunkUpdateApi(Resource):
raise NotFound("Document not found.")
# check segment
segment_id = str(segment_id)
segment = DocumentSegment.query.filter(
DocumentSegment.id == str(segment_id), DocumentSegment.tenant_id == current_user.current_tenant_id
).first()
segment = (
db.session.query(DocumentSegment)
.filter(DocumentSegment.id == str(segment_id), DocumentSegment.tenant_id == current_user.current_tenant_id)
.first()
)
if not segment:
raise NotFound("Segment not found.")
# check child chunk
child_chunk_id = str(child_chunk_id)
child_chunk = ChildChunk.query.filter(
ChildChunk.id == str(child_chunk_id), ChildChunk.tenant_id == current_user.current_tenant_id
).first()
child_chunk = (
db.session.query(ChildChunk)
.filter(ChildChunk.id == str(child_chunk_id), ChildChunk.tenant_id == current_user.current_tenant_id)
.first()
)
if not child_chunk:
raise NotFound("Child chunk not found.")
# The role of the current user in the ta table must be admin, owner, dataset_operator, or editor
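segments.py follows the same migration: the listing endpoint now builds a select(DocumentSegment) statement, keeps appending .filter(...) clauses to it, and hands the final statement to db.paginate, while single-row lookups switch from DocumentSegment.query...first() to db.session.query(DocumentSegment)...first(). A short sketch of the incremental select() style, reusing the db object and stand-in model from the previous sketch and an assumed status_list filter:

from sqlalchemy import select

stmt = (
    select(DocumentSegment)
    .filter(DocumentSegment.document_id == "doc-1")
    .order_by(DocumentSegment.position.asc())
)
status_list = ["completed"]
if status_list:
    # Select statements are generative: each .filter(...) returns a new statement.
    stmt = stmt.filter(DocumentSegment.status.in_(status_list))

segments = db.paginate(select=stmt, page=1, per_page=20, max_per_page=100, error_out=False)

# Single-row lookup through the session instead of the Model.query property:
segment = (
    db.session.query(DocumentSegment)
    .filter(DocumentSegment.id == "seg-1")
    .first()
)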

View File

@ -1,6 +1,6 @@
from flask import request
from flask_login import current_user # type: ignore
from flask_restful import Resource, marshal, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, marshal, reqparse
from werkzeug.exceptions import Forbidden, InternalServerError, NotFound
import services
@ -135,7 +135,7 @@ class ExternalApiTemplateApi(Resource):
raise Forbidden()
ExternalDatasetService.delete_external_knowledge_api(current_user.current_tenant_id, external_knowledge_api_id)
return {"result": "success"}, 200
return {"result": "success"}, 204
class ExternalApiUseCheckApi(Resource):
@ -209,6 +209,7 @@ class ExternalKnowledgeHitTestingApi(Resource):
parser = reqparse.RequestParser()
parser.add_argument("query", type=str, location="json")
parser.add_argument("external_retrieval_model", type=dict, required=False, location="json")
parser.add_argument("metadata_filtering_conditions", type=dict, required=False, location="json")
args = parser.parse_args()
HitTestingService.hit_testing_args_check(args)
@ -219,6 +220,7 @@ class ExternalKnowledgeHitTestingApi(Resource):
query=args["query"],
account=current_user,
external_retrieval_model=args["external_retrieval_model"],
metadata_filtering_conditions=args["metadata_filtering_conditions"],
)
return response
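The hunk above threads a new optional metadata_filtering_conditions dict from the request body through to the external hit-testing call. On the parsing side this is a plain reqparse argument with type=dict; a minimal sketch (only the argument names are taken from the hunk, the surrounding request handling is assumed):

from flask_restful import reqparse

parser = reqparse.RequestParser()
parser.add_argument("query", type=str, location="json")
parser.add_argument("external_retrieval_model", type=dict, required=False, location="json")
# New optional dict; requests that omit it simply yield None here.
parser.add_argument("metadata_filtering_conditions", type=dict, required=False, location="json")

# Inside a request context:
#     args = parser.parse_args()
#     ... external retrieval call ...
#     metadata_filtering_conditions=args["metadata_filtering_conditions"]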

View File

@ -1,4 +1,4 @@
from flask_restful import Resource # type: ignore
from flask_restful import Resource
from controllers.console import api
from controllers.console.datasets.hit_testing_base import DatasetsHitTestingBase

View File

@ -1,7 +1,7 @@
import logging
from flask_login import current_user # type: ignore
from flask_restful import marshal, reqparse # type: ignore
from flask_login import current_user
from flask_restful import marshal, reqparse
from werkzeug.exceptions import Forbidden, InternalServerError, NotFound
import services.dataset_service

View File

@ -1,5 +1,5 @@
from flask_login import current_user # type: ignore # type: ignore
from flask_restful import Resource, marshal_with, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, marshal_with, reqparse
from werkzeug.exceptions import NotFound
from controllers.console import api
@ -82,7 +82,7 @@ class DatasetMetadataApi(Resource):
DatasetService.check_dataset_permission(dataset, current_user)
MetadataService.delete_metadata(dataset_id_str, metadata_id_str)
return 200
return {"result": "success"}, 204
class DatasetMetadataBuiltInFieldApi(Resource):

View File

@ -1,4 +1,4 @@
from flask_restful import Resource, reqparse # type: ignore
from flask_restful import Resource, reqparse
from controllers.console import api
from controllers.console.datasets.error import WebsiteCrawlError

View File

@ -66,7 +66,7 @@ class ChatAudioApi(InstalledAppResource):
class ChatTextApi(InstalledAppResource):
def post(self, installed_app):
from flask_restful import reqparse # type: ignore
from flask_restful import reqparse
app_model = installed_app.app
try:

View File

@ -1,8 +1,8 @@
import logging
from datetime import UTC, datetime
from flask_login import current_user # type: ignore
from flask_restful import reqparse # type: ignore
from flask_login import current_user
from flask_restful import reqparse
from werkzeug.exceptions import InternalServerError, NotFound
import services

View File

@ -1,6 +1,6 @@
from flask_login import current_user # type: ignore
from flask_restful import marshal_with, reqparse # type: ignore
from flask_restful.inputs import int_range # type: ignore
from flask_login import current_user
from flask_restful import marshal_with, reqparse
from flask_restful.inputs import int_range
from sqlalchemy.orm import Session
from werkzeug.exceptions import NotFound

View File

@ -2,8 +2,8 @@ from datetime import UTC, datetime
from typing import Any
from flask import request
from flask_login import current_user # type: ignore
from flask_restful import Resource, inputs, marshal_with, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, inputs, marshal_with, reqparse
from sqlalchemy import and_
from werkzeug.exceptions import BadRequest, Forbidden, NotFound
@ -66,7 +66,7 @@ class InstalledAppsListApi(Resource):
parser.add_argument("app_id", type=str, required=True, help="Invalid app_id")
args = parser.parse_args()
recommended_app = RecommendedApp.query.filter(RecommendedApp.app_id == args["app_id"]).first()
recommended_app = db.session.query(RecommendedApp).filter(RecommendedApp.app_id == args["app_id"]).first()
if recommended_app is None:
raise NotFound("App not found")
@ -79,9 +79,11 @@ class InstalledAppsListApi(Resource):
if not app.is_public:
raise Forbidden("You can't install a non-public app")
installed_app = InstalledApp.query.filter(
and_(InstalledApp.app_id == args["app_id"], InstalledApp.tenant_id == current_tenant_id)
).first()
installed_app = (
db.session.query(InstalledApp)
.filter(and_(InstalledApp.app_id == args["app_id"], InstalledApp.tenant_id == current_tenant_id))
.first()
)
if installed_app is None:
# todo: position
@ -113,7 +115,7 @@ class InstalledAppApi(InstalledAppResource):
db.session.delete(installed_app)
db.session.commit()
return {"result": "success", "message": "App uninstalled successfully"}
return {"result": "success", "message": "App uninstalled successfully"}, 204
def patch(self, installed_app):
parser = reqparse.RequestParser()

View File

@ -1,8 +1,8 @@
import logging
from flask_login import current_user # type: ignore
from flask_restful import marshal_with, reqparse # type: ignore
from flask_restful.inputs import int_range # type: ignore
from flask_login import current_user
from flask_restful import marshal_with, reqparse
from flask_restful.inputs import int_range
from werkzeug.exceptions import InternalServerError, NotFound
import services

View File

@ -1,4 +1,4 @@
from flask_restful import marshal_with # type: ignore
from flask_restful import marshal_with
from controllers.common import fields
from controllers.console import api

View File

@ -1,5 +1,5 @@
from flask_login import current_user # type: ignore
from flask_restful import Resource, fields, marshal_with, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, fields, marshal_with, reqparse
from constants.languages import languages
from controllers.console import api

View File

@ -1,6 +1,6 @@
from flask_login import current_user # type: ignore
from flask_restful import fields, marshal_with, reqparse # type: ignore
from flask_restful.inputs import int_range # type: ignore
from flask_login import current_user
from flask_restful import fields, marshal_with, reqparse
from flask_restful.inputs import int_range
from werkzeug.exceptions import NotFound
from controllers.console import api
@ -72,7 +72,7 @@ class SavedMessageApi(InstalledAppResource):
SavedMessageService.delete(app_model, current_user, message_id)
return {"result": "success"}
return {"result": "success"}, 204
api.add_resource(

View File

@ -1,6 +1,6 @@
import logging
from flask_restful import reqparse # type: ignore
from flask_restful import reqparse
from werkzeug.exceptions import InternalServerError
from controllers.console.app.error import (

View File

@ -1,7 +1,7 @@
from functools import wraps
from flask_login import current_user # type: ignore
from flask_restful import Resource # type: ignore
from flask_login import current_user
from flask_restful import Resource
from werkzeug.exceptions import NotFound
from controllers.console.wraps import account_initialization_required

View File

@ -1,5 +1,5 @@
from flask_login import current_user # type: ignore
from flask_restful import Resource, marshal_with, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, marshal_with, reqparse
from constants import HIDDEN_VALUE
from controllers.console import api
@ -99,7 +99,7 @@ class APIBasedExtensionDetailAPI(Resource):
APIBasedExtensionService.delete(extension_data_from_db)
return {"result": "success"}
return {"result": "success"}, 204
api.add_resource(CodeBasedExtensionAPI, "/code-based-extension")

View File

@ -1,5 +1,5 @@
from flask_login import current_user # type: ignore
from flask_restful import Resource # type: ignore
from flask_login import current_user
from flask_restful import Resource
from libs.login import login_required
from services.feature_service import FeatureService

View File

@ -1,8 +1,8 @@
from typing import Literal
from flask import request
from flask_login import current_user # type: ignore
from flask_restful import Resource, marshal_with # type: ignore
from flask_login import current_user
from flask_restful import Resource, marshal_with
from werkzeug.exceptions import Forbidden
import services

View File

@ -1,7 +1,7 @@
import os
from flask import session
from flask_restful import Resource, reqparse # type: ignore
from flask_restful import Resource, reqparse
from sqlalchemy import select
from sqlalchemy.orm import Session

View File

@ -1,4 +1,4 @@
from flask_restful import Resource # type: ignore
from flask_restful import Resource
from controllers.console import api

View File

@ -2,8 +2,8 @@ import urllib.parse
from typing import cast
import httpx
from flask_login import current_user # type: ignore
from flask_restful import Resource, marshal_with, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, marshal_with, reqparse
import services
from controllers.common import helpers

View File

@ -1,5 +1,5 @@
from flask import request
from flask_restful import Resource, reqparse # type: ignore
from flask_restful import Resource, reqparse
from configs import dify_config
from libs.helper import StrLen, email, extract_remote_ip

Some files were not shown because too many files have changed in this diff.