Compare commits

...

310 Commits

Author SHA1 Message Date
98e9cc5275 Update api/services/account_service.py
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-05-06 22:42:27 -04:00
1e5bf958ec fix: allow admin to remove and update non-privileged users. 2025-05-06 16:38:25 -04:00
f23cf98317 refactor: Remove RepositoryFactory (#19176)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-05-06 21:14:51 +08:00
a6827493f0 chore: slice workflow refresh draft hook (#19292) 2025-05-06 18:24:10 +08:00
9565fe9b1b fix(api): fix alembic offline mode (#19285)
Alembic's offline mode generates SQL from SQLAlchemy migration operations,
providing developers with a clear view of database schema changes without
requiring an active database connection.

However, some migration versions (specifically bbadea11becb and d7999dfa4aae)
were performing database schema introspection, which fails in offline mode
since it requires an actual database connection.

This commit:
- Adds offline mode support by detecting context.is_offline_mode()
- Skips introspection steps when in offline mode
- Adds warning messages in SQL output to inform users that assumptions were made
- Prompts users to review the generated SQL for accuracy

These changes ensure migrations work consistently in both online and offline modes.

Close #19284.
2025-05-06 18:05:19 +08:00
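A minimal sketch of the offline-mode guard this commit describes, using a hypothetical migration with made-up table and column names (not the actual bbadea11becb or d7999dfa4aae code):

```python
# Hypothetical migration snippet: skip introspection when Alembic runs offline.
import sqlalchemy as sa
from alembic import context, op


def upgrade():
    if context.is_offline_mode():
        # No live connection in offline mode, so introspection is impossible.
        # Emit a warning into the generated SQL and proceed on an assumption.
        op.execute("-- WARNING: offline mode assumes 'created_at' is missing; review this SQL")
        has_column = False
    else:
        inspector = sa.inspect(op.get_bind())
        has_column = "created_at" in {c["name"] for c in inspector.get_columns("messages")}

    if not has_column:
        op.add_column("messages", sa.Column("created_at", sa.DateTime(), nullable=True))
```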
8de24bc16e chore: enhance dev script robustness by determining the script directory (#19209) 2025-05-06 17:02:40 +08:00
3ecc1e0228 Fix: update docs link (#19278) 2025-05-06 17:02:01 +08:00
1abf00e443 Fix doc bug workflow (#19269)
Co-authored-by: liuwangwang <liuwangwang@hikvision.com.cn>
2025-05-06 15:11:08 +08:00
Jim 6c515ef47f fix(web): add workspace selector overflow auto (#19265)
Co-authored-by: JMY <jiangmingyao@gf.com.cn>
2025-05-06 13:30:25 +08:00
0b44791eae feat: add mode for /info api (#19264) 2025-05-06 13:24:53 +08:00
0cfc82d731 fix(structured-output): reasoning model's json format parsing (#19261) 2025-05-06 13:16:08 +08:00
b78846078c Fix: hide view chat setting button when no inputs form (#19263) 2025-05-06 13:15:23 +08:00
8537abfff8 chore: avoid repeated type ignore noqa by adding flask_restful and flask_login in mypy import exclusions (#19224) 2025-05-06 11:58:49 +08:00
4b77c9df9d Fix: optional input in batch run (#19257) 2025-05-06 11:33:15 +08:00
b979a8a707 feat: sort variables in the selector by x axis for most recent order (#19243) 2025-05-06 10:59:02 +08:00
9231c197a5 fix: s.filter is not a function (#19250) 2025-05-06 10:26:44 +08:00
8ac3a223a8 fix(api): add missing INNER_API_KEY to InnerAPIConfig (#19166) 2025-05-06 10:02:14 +08:00
5a6f20d575 Optimize the event handling and rendering logic of the component picker (#19232) 2025-05-06 09:48:53 +08:00
c5568f756f fix basic auth if not base64 encode (#19242)
Signed-off-by: kenwoodjw <blackxin55+@gmail.com>
2025-05-06 09:18:37 +08:00
22f5af9987 fix: support non-ASCII characters in filenames of the tool files (#19228) 2025-05-06 09:18:11 +08:00
e352ab2bdd chore: require pip and improve performance in mypy checks (#19225) 2025-05-06 09:16:43 +08:00
bbf513a2cd Fix appURL construction when basePath is empty (#19234) 2025-05-05 22:35:41 +08:00
9bcf837f17 fix: use only supported operators in metadata filter system prompts (#19195) 2025-05-03 20:08:08 +08:00
a212a63e6a fix: time type metadata filtering error (#19192) 2025-05-03 20:07:37 +08:00
e2cae42115 chore: bump celery from 5.4 to 5.5 (#19190) 2025-05-03 20:07:04 +08:00
50fa0d1512 feat: Plugin page related document supports multiple languages (#19197) 2025-05-03 20:03:56 +08:00
bb1d1dc263 fix: fix API tool integration test (#19187) 2025-05-01 14:49:43 +08:00
1ca6dbcdc8 fix: file name incorrect when download file (#19183) 2025-04-30 22:47:59 +08:00
349c3cf7b8 feat(api): Add image multimodal support for LLMNode (#17372)
Enhance `LLMNode` with multimodal capability, introducing support for
image outputs.

This implementation extracts base64-encoded images from LLM responses,
saves them to the storage service, and records the file metadata in the
`ToolFile` table. In conversations, these images are rendered as
markdown-based inline images.
Additionally, the images are included in the LLMNode's output as
file variables, enabling subsequent nodes in the workflow to utilize them.

To integrate file outputs into workflows, adjustments to the frontend code
are necessary.

For multimodal output functionality, updates to related model configurations
are required. Currently, this capability has been applied exclusively to
Google's Gemini models.

Close #15814.

Signed-off-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: -LAN- <laipz8200@outlook.com>
2025-04-30 17:28:02 +08:00
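A toy, self-contained illustration of the flow described above; local files stand in for Dify's storage service and ToolFile table, and all names are illustrative rather than the LLMNode implementation:

```python
# Minimal sketch: pull base64 data-URI images out of an LLM response,
# persist them, and return markdown references for inline rendering.
import base64
import re
import uuid
from pathlib import Path

DATA_URI = re.compile(r"data:image/(png|jpeg|webp);base64,([A-Za-z0-9+/=]+)")


def extract_images(llm_text: str, out_dir: Path) -> list[str]:
    out_dir.mkdir(parents=True, exist_ok=True)
    markdown_refs = []
    for ext, payload in DATA_URI.findall(llm_text):
        data = base64.b64decode(payload)
        file_path = out_dir / f"{uuid.uuid4()}.{ext}"   # in Dify: storage service + ToolFile row
        file_path.write_bytes(data)
        markdown_refs.append(f"![image]({file_path})")  # rendered inline in the conversation
    return markdown_refs
```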
6c9a9d344a fix: mouse scrolling zooming can not function anymore (#19160) 2025-04-30 16:57:48 +08:00
f8e5341ac0 ci: add diff to lint ci (#17874)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
Co-authored-by: -LAN- <laipz8200@outlook.com>
2025-04-30 16:27:25 +08:00
12c96b93d9 immediately return initialized tiktokenizer instance and remove dead code in usage of tiktokenizer (#17957) 2025-04-30 16:07:20 +08:00
bcc95e520b feat: support remove first and remove last in variable assigner (#19144)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-04-30 15:50:00 +08:00
69b43a955f fix: inconsistent case expression in _process_metadata_filter_func (#19146) 2025-04-30 15:14:01 +08:00
3dff21e0be fix(i18n): add functions to retrieve document and pricing page languages (#19142) 2025-04-30 14:58:49 +08:00
d5ee465bf9 fix: fix render undefined when text children is empty (#19135) 2025-04-30 14:17:41 +08:00
65b7a783fe fix: metadata filter not work (#19020)
Co-authored-by: 金鹏程 <jinpengcheng01@corp.netease.com>
Co-authored-by: crazywoola <427733928@qq.com>
2025-04-30 11:06:03 +08:00
1bc94b92ac fix: fix import LexicalErrorBoundary error (#19124) 2025-04-30 11:05:47 +08:00
5088ab5061 feat(logAndAnn): add control option for question editing feature (#19117) 2025-04-30 10:57:23 +08:00
d70fa2847b add Accept-Ranges header for audio/video files (#19119) 2025-04-30 10:51:27 +08:00
8bf3f5ea78 fix(api): resolve external knowledge API error due to excessive URL validation (#19003)
The `validators.url` method from the `validators==0.21.0` library enforces a
URL length limit of less than 90 characters, which led to failures in external
knowledge API requests for long URLs.

This PR addresses the issue by replacing `validators.url` with 
`urllib.parse.urlparse`, effectively removing the restrictive URL length check.

Additionally, the unused `validators` dependency has been removed.

Fixes #18981.

Signed-off-by: kenwoodjw <blackxin55+@gmail.com>
2025-04-29 22:32:38 +08:00
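The swap reads roughly like the sketch below; the function name and example URL are illustrative, not the actual Dify code:

```python
from urllib.parse import urlparse


def is_valid_endpoint(url: str) -> bool:
    # Accept any http(s) URL with a host, with no length restriction.
    parsed = urlparse(url)
    return parsed.scheme in ("http", "https") and bool(parsed.netloc)


# A 200+ character URL that the old validators.url check (<90 chars) would have rejected:
assert is_valid_endpoint("https://knowledge.example.com/api/retrieval?" + "q" * 200)
```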
a9f7bcb12f fix: Chinese input deletes extra character in Safari within Workspaces (#18193) (#19088) 2025-04-29 18:47:35 +08:00
bd1bbfee4b Enhance Code Consistency Across Repository with .editorconfig (#19023) 2025-04-29 18:04:33 +08:00
226afd4550 Fix: the issue of getting empty environment variables. (#19085) 2025-04-29 18:01:11 +08:00
ce9737c6a0 fix: change provider should change the content (#19075) 2025-04-29 15:49:31 +08:00
23f6914b9b fix: image preview triggers binary download (#19070) 2025-04-29 15:38:33 +08:00
2a3cc43b62 fix: remove redundant Mermaid graph direction enforcement (#19024) 2025-04-29 15:10:38 +08:00
8266815cda feat: add AWS Managed IAM auth for OpenSearch vector DB (#18963) 2025-04-29 15:10:08 +08:00
8b4ea01810 feat: support access milvus with token (#19034) 2025-04-29 14:52:13 +08:00
83187b30c0 fix: fix rerank model runner usage (#19008) 2025-04-29 14:51:21 +08:00
3c09b57059 fix(web): fix the wrong components usage(#18995) (#19065) 2025-04-29 14:39:59 +08:00
a147d2a200 feat(api): use json_repair to fix invalid json while generating structured output (#18977)
When generating JSON schema using an LLM in the structured output feature,
models may occasionally return invalid JSON, which prevents clients from correctly 
parsing the response and can lead to UI breakage.

This commit addresses the issue by introducing `json_repair` to automatically 
fix invalid JSON strings returned by the LLM, ensuring smoother functionality 
and better client-side handling of structured outputs.


Co-authored-by: lizb <lizb@sugon.com>
2025-04-29 12:39:13 +08:00
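A minimal sketch of the repair step, assuming the `json_repair` package and a made-up malformed payload:

```python
import json

from json_repair import repair_json

raw = '{"answer": "42", "sources": ["a", "b",]}'  # LLM output with a trailing comma

repaired = repair_json(raw)   # returns a syntactically valid JSON string
data = json.loads(repaired)   # now parses cleanly on the client side
print(data["sources"])        # ['a', 'b']
```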
28a59ba344 chore: improve diagram picture of docker compose (#19054) 2025-04-29 11:43:50 +08:00
94cc0b7a12 fix(workflow_cycle_manage): failed nodes were not updated in workflow_node_executions (#18994) 2025-04-29 10:31:08 +08:00
e36b1a7016 test(graph-engine-test): modify the assert condition (#19041) 2025-04-29 09:51:42 +08:00
bf46b894e9 chore: Improve FILES_URL Configuration Comments (#19040) 2025-04-29 09:05:04 +08:00
f0ca828544 fix: fix embedded chatbot styles on a relatively wide screen (#19030) 2025-04-29 08:58:10 +08:00
36521e4275 fixes: fix entrypoint script with missing environment variables (#19039) 2025-04-29 08:57:58 +08:00
a93a09e0f7 feat: clean up message_files table first before proceeding to find orphaned files (#19035) 2025-04-29 08:57:42 +08:00
a54773fbff refactor: switch to dynamic versioning in package configuration (#19019)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-04-28 19:51:50 +08:00
b8bb45b106 remove unstructured api key check (#18989) 2025-04-28 17:26:30 +08:00
ecade13455 add MAX_TASK_PRE_CHILD for celery (#18985) 2025-04-28 17:06:00 +08:00
49678e4b48 chore: Bump version to 1.3.1 (#18962)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-04-28 16:45:17 +08:00
ea150dc336 feat: Upload a folder to knowledgebase (#17026)
Co-authored-by: Silow9 <soulskytony@gmail.com>
2025-04-28 15:31:22 +08:00
5de01c1444 feat (document_extractor): support .properties file (#18969) 2025-04-28 15:28:11 +08:00
f86e2edc54 refactor(plugin/backwards_invocation/app): Remove unnecessary .value from StrEnum (#18896)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-04-28 14:50:59 +08:00
bf01e41fd9 feat: improve embedded chatbot styles (#18692) 2025-04-28 14:38:59 +08:00
edcfd7761b if as_attachment is in the url, add it to the sign_url (#18930) 2025-04-28 14:25:59 +08:00
b8daf944f1 fix: header padding (#18957) 2025-04-28 13:57:44 +08:00
315436e43b fix: classify remove always remove the last one (#18959) 2025-04-28 13:56:43 +08:00
2c2af1d117 feat: add VTT data transform to Document extractor (#18936) 2025-04-28 13:45:15 +08:00
03ac2d0f17 fix: i.find is not a function (#18945) 2025-04-28 11:09:54 +08:00
8299614e60 [Observability][Bugfix] Fix expected an instance of Token, got None error in OpenTelemetry (#18934) 2025-04-28 10:31:13 +08:00
7a62202392 fix: when cot_agent call tool like searxng lost some response content (#16781) 2025-04-28 09:27:46 +08:00
eb92dd59f9 chore: do not show the model-not-supported warning when structured output is disabled (#18909) 2025-04-27 18:30:45 +08:00
d9aa2b155a refactor: Refactors repository imports structure (#18901)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-04-27 17:29:03 +08:00
e5bdc1438a fix: annotation update need use http put method and annotation-reply api doc parms wrong (#18891) 2025-04-27 16:13:36 +08:00
e3daef19e7 chatflow/workflow add field required (#18892) 2025-04-27 16:12:02 +08:00
0e0ec4691a feat: add interfaces of OAuth handler methods for authorization (#18889) 2025-04-27 16:00:37 +08:00
7ccec5cd95 refactor: remove external link for dataset description guidance (#18884) 2025-04-27 14:47:00 +08:00
7613d9dc33 [Observability] Convert exception logging into span in OpenTelemetry (#18821) 2025-04-27 14:39:47 +08:00
77ad600a33 fix: Text Generation App should not show embedded in website (#18880) 2025-04-27 14:33:50 +08:00
abafa68647 refactor: rename plugin manager to plugin client and rename path from manager to impl (#18876) 2025-04-27 14:22:25 +08:00
d91828dd90 chore: support other webapps embedded in iframe (#18877) 2025-04-27 14:21:27 +08:00
19f2a74ba8 fix: check dsl version when create app from explore template (#18872) 2025-04-27 14:00:45 +08:00
58a929edd5 fix: install plugins permissions (#18870) 2025-04-27 14:00:35 +08:00
bed47dffb9 fix: update notice for users for clear-orphaned-file-records and remove-orphaned-files-on-storage commands (#18864) 2025-04-27 13:01:02 +08:00
136995d2a1 fix: change delete app status code from 204 to 200 (#18398)
Co-authored-by: devxing <devxing@gmail.com>
Co-authored-by: crazywoola <427733928@qq.com>
2025-04-27 12:12:46 +08:00
c1559a7c8e fix: LLMResultChunk cause concatenate str and list exception (#18852) 2025-04-27 11:32:14 +08:00
993ef87dca feat: add administrative commands to free up storage space by removing unused files (#18835) 2025-04-27 11:11:04 +08:00
b62eb61400 fix depth param issue for WaterCrawl (#18839) 2025-04-27 11:04:56 +08:00
0a20210a59 feat: Add W&B Weave Tracing Integration (#14262)
Signed-off-by: Yuichiro Utsumi <utsumi.yuichiro@fujitsu.com>
Signed-off-by: -LAN- <laipz8200@outlook.com>
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
Signed-off-by: kenwoodjw <blackxin55+@gmail.com>
Signed-off-by: ChengZi <chen.zhang@zilliz.com>
Signed-off-by: cl <cailue@apache.org>
Co-authored-by: Yu Chun Chang <changyuchun159630@gmail.com>
Co-authored-by: Kyle Chang <kylechang@91app.com>
Co-authored-by: Lick-liu <51771897+Lick-liu@users.noreply.github.com>
Co-authored-by: crazywoola <427733928@qq.com>
Co-authored-by: Yuichiro Utsumi <81412151+utsumi-fj@users.noreply.github.com>
Co-authored-by: NFish <douxc512@gmail.com>
Co-authored-by: Yeuoly <45712896+Yeuoly@users.noreply.github.com>
Co-authored-by: Wu Tianwei <30284043+WTW0313@users.noreply.github.com>
Co-authored-by: DDDDD12138 <43703884+DDDDD12138@users.noreply.github.com>
Co-authored-by: Jyong <76649700+JohnJyong@users.noreply.github.com>
Co-authored-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: Novice <857526207@qq.com>
Co-authored-by: yihong <zouzou0208@gmail.com>
Co-authored-by: Kalo Chin <91766386+fdb02983rhy@users.noreply.github.com>
Co-authored-by: zxhlyh <jasonapring2015@outlook.com>
Co-authored-by: jiangbo721 <365065261@qq.com>
Co-authored-by: 刘江波 <jiangbo721@163.com>
Co-authored-by: Lam <scau_ljw@126.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: Mars <524574386@qq.com>
Co-authored-by: mars <linjx2@by-health.com>
Co-authored-by: Joe <79627742+ZhouhaoJiang@users.noreply.github.com>
Co-authored-by: Rafael Carvalho <r.carvalho@me.com>
Co-authored-by: Joel <iamjoel007@gmail.com>
Co-authored-by: 非法操作 <hjlarry@163.com>
Co-authored-by: kenwoodjw <blackxin55+@gmail.com>
Co-authored-by: codingjaguar <codingjaguar@gmail.com>
Co-authored-by: ChengZi <chen.zhang@zilliz.com>
Co-authored-by: Fei He <droxer.he@gmail.com>
Co-authored-by: Arcaner <52057416+lrhan321@users.noreply.github.com>
Co-authored-by: Xiyuan Chen <52963600+GareArc@users.noreply.github.com>
Co-authored-by: KVOJJJin <jzongcode@gmail.com>
Co-authored-by: XiaoBa <94062266+XiaoBa-Yu@users.noreply.github.com>
Co-authored-by: Xiaoba Yu <xb1823725853@gmail.com>
Co-authored-by: zhangyuhang <2827528315@qq.com>
Co-authored-by: yuhang2.zhang <yuhang2.zhang@ly.com>
Co-authored-by: 诗浓 <nyaashino@gmail.com>
Co-authored-by: RookieAgent <42060616+Sakura4036@users.noreply.github.com>
Co-authored-by: sho-takano-dev <shota.takano.dev@gmail.com>
Co-authored-by: 過世秋風 <1040926235@qq.com>
Co-authored-by: Yi Feng <66539215+bigyifeng@users.noreply.github.com>
Co-authored-by: QuantumGhost <obelisk.reg+git@gmail.com>
Co-authored-by: Yongtao Huang <99629139+hyongtao-db@users.noreply.github.com>
Co-authored-by: ShadowJobs <794878115@qq.com>
Co-authored-by: LinYing <linying@momenta.ai>
Co-authored-by: Benjamin <benjaminx@gmail.com>
Co-authored-by: LiuBodong <liubodong2010@126.com>
Co-authored-by: huangzhuo1949 <167434202+huangzhuo1949@users.noreply.github.com>
Co-authored-by: huangzhuo <huangzhuo1@xiaomi.com>
Co-authored-by: csurong <csurong1@gmail.com>
Co-authored-by: 傻笑zz <43721571+shaxiaozz@users.noreply.github.com>
Co-authored-by: L8ng <straydragonl@foxmail.com>
Co-authored-by: Bowen Liang <liangbowen@gf.com.cn>
Co-authored-by: Novice Lee <novicelee@NoviPro.local>
Co-authored-by: GuanMu <ballmanjq@gmail.com>
Co-authored-by: LittleFish-15 <58618983+LittleFish-15@users.noreply.github.com>
Co-authored-by: 诗浓 <844670992@qq.com>
Co-authored-by: luckylhb90 <luckylhb90@gmail.com>
Co-authored-by: hobo.l <hobo.l@binance.com>
Co-authored-by: Gen Sato <52241300+halogen22@users.noreply.github.com>
Co-authored-by: twwu <twwu@dify.ai>
Co-authored-by: StoneFancyX <53338920+StoneFancyX@users.noreply.github.com>
Co-authored-by: StoneFancyX <kindbin@qq.com>
Co-authored-by: Naoki KOBAYASHI <naotama@gmail.com>
Co-authored-by: kurokobo <kuro664@gmail.com>
Co-authored-by: cyflhn <cyflhn@163.com>
Co-authored-by: Yingchun Lai <laiyingchun@apache.org>
Co-authored-by: jimmyfen <757343258@qq.com>
Co-authored-by: Xuetao Song <xuetaomagicsong@gmail.com>
Co-authored-by: Panpan <wurui.dev@gmail.com>
Co-authored-by: wyy-holding <59436937+wyy-holding@users.noreply.github.com>
Co-authored-by: リイノ Lin <sorphwer@gmail.com>
Co-authored-by: Ning <accelerator314@gmail.com>
Co-authored-by: Linh Nguyen <55907715+batman0911@users.noreply.github.com>
Co-authored-by: Junjie.M <118170653@qq.com>
Co-authored-by: Ron <svcvit@gmail.com>
Co-authored-by: Novice <novice12185727@gmail.com>
Co-authored-by: NanoNova <kid1412621@gmail.com>
Co-authored-by: JaydenZhou <380774082@qq.com>
Co-authored-by: dotdotdot <823150982@qq.com>
Co-authored-by: Good Wood <slm_1990@126.com>
Co-authored-by: Ryosei Karaki <38310693+karamaru-alpha@users.noreply.github.com>
Co-authored-by: chenhuan0728 <54611342+chenhuan0728@users.noreply.github.com>
Co-authored-by: chenhuan <huan.chen0728@foxmail>
Co-authored-by: lenbo <islenbo@qq.com>
Co-authored-by: Jiang <65766008+AlwaysBluer@users.noreply.github.com>
Co-authored-by: jiangzhijie <jiangzhijie.jzj@alibaba-inc.com>
Co-authored-by: Yongtao Huang <yongtaoh2022@gmail.com>
Co-authored-by: zhangkun-21 <sephiroth0932@gmail.com>
Co-authored-by: hsiong <37357447+hsiong@users.noreply.github.com>
Co-authored-by: 李远军 <4842@9ji.com>
Co-authored-by: yourchanges <yourchanges@gmail.com>
Co-authored-by: David <guyuezhuying@126.com>
Co-authored-by: liuzhenghua <1090179900@qq.com>
Co-authored-by: taokuizu <taokuizu@qq.com>
Co-authored-by: Hanqing Zhao <sherry9277@gmail.com>
Co-authored-by: JimintheBox <gjwlals111@gmail.com>
Co-authored-by: wlleiiwang <1025164922@qq.com>
Co-authored-by: wlleiiwang <wlleiiwang@tencent.com>
Co-authored-by: Alex <32982705+AlexYuan997@users.noreply.github.com>
Co-authored-by: yuanlong <yuanlong@boco.com.cn>
Co-authored-by: wanttobeamaster <45583625+wanttobeamaster@users.noreply.github.com>
Co-authored-by: xiaozhiqing.xzq <xiaozhiqing.xzq@alibaba-inc.com>
Co-authored-by: Chenhe Gu <guchenhe@gmail.com>
Co-authored-by: tyounami <vkbo@qq.com>
Co-authored-by: bo.zhao <bo.zhao@iglooinsure.com>
Co-authored-by: ClSlaid <cailue@apache.org>
Co-authored-by: adru <106513264+adpanru@users.noreply.github.com>
Co-authored-by: horochx <32632779+horochx@users.noreply.github.com>
2025-04-26 04:28:30 -07:00
f6305858a5 fix(plugin_service): Add marketplace enabled check before plugin operations (#18806) 2025-04-26 08:02:53 +08:00
db7af52fcc Hotfix/create from template category (#18807) 2025-04-25 22:14:19 +08:00
09a5f8da1d feat(app_dsl_service): Refines version compatibility logic (#18798)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-04-25 18:42:39 +08:00
c104febf63 refactor: Apply DI to WorkflowNodeExecutionRepository. (#18794)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-04-25 18:05:36 +08:00
0e68e8b40a fix: embed chatbot can't drag when use mouse (#18781) 2025-04-25 17:26:16 +08:00
9a3ecc1ac8 fix: Allow advanced chat app to get workflow run detail (#18753) (#18758) 2025-04-25 16:48:38 +08:00
ec82534a1e optimize hard-coded account status field (#18771)
Co-authored-by: crazywoola <427733928@qq.com>
2025-04-25 16:47:03 +08:00
9bcc8041e9 fix: #18744 The model order defined in position.yaml in the Model Plugin is not taking effect. (#18756) 2025-04-25 16:45:48 +08:00
a944542858 fix(web): add missing 'clsx' dependency for pagination component (#18769) 2025-04-25 16:43:44 +08:00
a5e6a0dc0c enable pan by fingers (#18775) 2025-04-25 16:36:54 +08:00
a575fbca94 fix: update api rate limit on Pricing page (#18755) 2025-04-25 14:37:04 +08:00
fc4e11d127 fix: wording in README.md (#18751) 2025-04-25 13:42:10 +08:00
9988a506cd fix: enable Milvus database configuration via .env(#18707) (#18714)
Co-authored-by: jiawei.wang <jml@0603>
Co-authored-by: crazywoola <427733928@qq.com>
2025-04-25 12:12:30 +08:00
12836f9db9 Resolves #18536 Retrieve conversation variables (#18581) 2025-04-25 11:52:25 +08:00
2627e221f2 fix: builtin tool provider credentials_for_provider (#18725)
Co-authored-by: hobo.l <hobo.l@binance.com>
2025-04-25 10:08:16 +08:00
5e2b3b34e5 issue: #17056 : Add a reason field to the message_replace event (#17195)
Co-authored-by: 聂政 <niezheng@pjlab.org.cn>
2025-04-25 10:08:06 +08:00
37e2f73909 [Lindorm VDB] Add the QUERY_TIMEOUT parameter to force the search query to fail. (#18613)
Co-authored-by: jiangzhijie <jiangzhijie.jzj@alibaba-inc.com>
2025-04-25 09:42:58 +08:00
759584f8c5 fix: not render conversation var in prompt editor (#18728) 2025-04-25 09:06:07 +08:00
0babdffe3e feat: support vastbase vector database (#16308) 2025-04-24 18:04:57 +08:00
cd9e6609ad fix: project version to 1.3.0 in package.json and uv.lock (#18684) 2025-04-24 17:16:41 +08:00
9982445dad Added the missing webpath prefix and the basepath prefix for static resources to fix the bug of the basepath being added repeatedly. (#18658)
Co-authored-by: qingguo <qingguo@lexin.com>
2025-04-24 17:14:26 +08:00
13f647feaa fix: remove chat variable in workflow mode (#18696) 2025-04-24 16:51:19 +08:00
7b00f35a0d fix: link address error in the embedding in websites first example (#18677) 2025-04-24 14:50:12 +08:00
f6b3724268 chore(docker): bump dify-plugin-daemon to 0.0.9 (#18672)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-04-24 14:08:37 +08:00
212521c92b fix: sometimes error message not display complete (#18663) 2025-04-24 11:58:44 +08:00
69d3853111 fix browsers autofilling the password when authorizing a plugin (#18661) 2025-04-24 11:55:42 +08:00
d242e4b95b fix agentflow error if first variable is num (#18660)
Co-authored-by: lizb <lizb@sugon.com>
2025-04-24 11:55:29 +08:00
2266001d19 Fix some prompt typos (#18645) 2025-04-24 10:36:45 +08:00
2f141aa483 Add jp translation (#18628) 2025-04-24 10:32:21 +08:00
dd02a9ac9d fix: enhance TOC navigation with scrollable overflow for better usability (#18636) 2025-04-23 23:17:28 +08:00
b203139356 embedding in websites: setting conversation_id requires hiding the reset conversation button (#18623) 2025-04-23 22:57:42 +08:00
c479fcf251 feat: add missing switches (#18619) 2025-04-23 18:02:18 +08:00
d7c3e54eaa fix: improve translation of community code of conduct sentence (#18607) 2025-04-23 17:06:17 +08:00
d5fe50e471 embedding in websites supports initializing with a specified conversation_id (#18602) 2025-04-23 16:48:45 +08:00
205535c8e9 chore: fix reimported (#18610) 2025-04-23 16:48:00 +08:00
e9aedc701c chore: Updates version numbers for upcoming release (#18550)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-04-23 16:26:55 +08:00
cf464d252d fix #18595: update workflow duplicate env variable name (#18596)
Co-authored-by: tiankuo.zhou <tiankuo.zhou@lofty.com>
2025-04-23 15:55:46 +08:00
5e09ac696c fix: add composer configuration and delete DifyClient->file_client (#18574) 2025-04-23 15:43:19 +08:00
ba9357da96 fix: handle PluginPermissionDeniedError in EndpointCreateApi (#18597) 2025-04-23 15:29:58 +08:00
c6fb879cea fix: select struct output root object show the wrong type (#18582) 2025-04-23 11:06:29 +08:00
e2cb7006c4 check metadata_filtering_conditions could be None in auto mode (#18548) 2025-04-22 17:09:33 +08:00
3737e0b087 fix: clickjacking (#18516)
Signed-off-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: -LAN- <laipz8200@outlook.com>
2025-04-22 16:48:45 +08:00
a1158cc946 fix: Update prompt message content types to use Literal and add union type for content (#17136)
Co-authored-by: 朱庆超 <zhuqingchao@xiaomi.com>
Co-authored-by: crazywoola <427733928@qq.com>
2025-04-22 16:17:55 +08:00
404f8a790c fix conversation log raise 500 (#18534) 2025-04-22 16:08:47 +08:00
35a008af18 fix can't resize workflow run panel (#18538) 2025-04-22 16:07:51 +08:00
79bf590576 docs: update enterprise inquiry links across all README language variants (#18541) 2025-04-22 16:07:26 +08:00
61e39bccdf fix: Patch OpenTelemetry to handle None tokens (#18498)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-04-22 16:04:20 +08:00
6b7dfee88b fix: Validates session factory type in repository (#18497)
Signed-off-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-04-22 16:04:06 +08:00
21412a8c55 docs: replace outdated Enterprise inquiry link with a new one (#18528) 2025-04-22 14:54:21 +08:00
239e40c8d5 chore: remove useless frontend code file (#18532) 2025-04-22 14:46:49 +08:00
1ce2c7f3e8 refactor: improve layout and structure of ProviderDetail component (#18523) 2025-04-22 13:57:45 +08:00
de750a67ec [Observability] feat: add metrics of http response (#18499) 2025-04-22 13:19:22 +08:00
8e6ea4d117 support load .env config from nacos (#18186) 2025-04-22 13:12:36 +08:00
ef188564f3 Mermaid analysis optimization (#18089)
Co-authored-by: luowei <glpat-EjySCyNjWiLqAED-YmwM>
Co-authored-by: crazywoola <427733928@qq.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-04-22 13:06:47 +08:00
413271eaa6 feat[plugin]: The plugin upload file change to be stored as a toolfile… (#18277) 2025-04-22 13:05:42 +08:00
eb1ce3dd6b feat: support huawei cloud vector database (#16141) 2025-04-22 13:03:35 +08:00
18e4f42c3c fix draft run node exception (#18520) 2025-04-22 13:02:38 +08:00
e0e92921b5 fix: external knowledge setting in knowledge selector (#18519) 2025-04-22 11:29:45 +08:00
94e22ba0fd feat: add search input field (#18409) 2025-04-22 11:07:18 +08:00
67eefd0ba1 fix: update search model placeholder and add translations f (#18518) 2025-04-22 11:06:36 +08:00
bf031af7b1 feat(embedded-chatbot): support overriding locale via URL params (#18509) 2025-04-22 11:03:01 +08:00
617611ee22 fix: adjust padding and background for sticky header (#18515) 2025-04-22 11:00:22 +08:00
d43b884c2a fix: filter empty marketplace collection (#18511) 2025-04-22 10:13:22 +08:00
80f5ee1eb2 fix: fix workflow as a tool confirm dialog layout issue (#18494) 2025-04-22 09:59:14 +08:00
ee30497237 fix(markdown): correctly render links with inline code (#18500) 2025-04-22 09:56:53 +08:00
be964c78ec fix: update document link based on client locale (#18493) 2025-04-21 21:00:04 +08:00
2543162dec fix: cannot delete workflow version if other version is published as a tool (#18486)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-04-21 17:58:22 +08:00
3136eb8e4b Fix: json update in conversation variable (#18483) 2025-04-21 17:58:01 +08:00
7b6523e54d Update Oracle db connection library and change connection pool to single connection (#18466) 2025-04-21 17:56:57 +08:00
30c051d485 fix: update Japanese translation for 'switchVersion' in plugin.ts to … (#18469) 2025-04-21 17:56:31 +08:00
f191d372f0 fix(promptMessage): correct field_serializer implementation for content serialization (#18458) 2025-04-21 15:09:49 +08:00
cb69cb2d64 fix weird syntax error (#18454) 2025-04-21 14:18:32 +08:00
5d9c67e97e fix: handle array item type error in struct output (#18452) 2025-04-21 14:15:38 +08:00
0ba37592f7 fix: update Japanese translation for document link in plugin.ts, translation for "endpointsDocLink" label (#18446) 2025-04-21 14:14:09 +08:00
e0e8667a0b fix: translate 'back' to '戻る' in Japanese plugin localization (#18444) 2025-04-21 14:12:44 +08:00
2157d9e17e fix: update Japanese translation for 'from' in plugin.ts to improve c… (#18449) 2025-04-21 14:12:21 +08:00
62e7fa1f63 "fix: Changed the translated text from '障害者' (#18427)" (#18438) 2025-04-21 12:41:31 +08:00
0ac7366cdc fix: correct unsupported German date format on document list page (#18426) 2025-04-21 10:06:21 +08:00
c768d97637 feat: update privacy policy URL and add validation for privacy policy link (#18422) 2025-04-21 10:04:33 +08:00
9bd8e62702 fix: bump the minimal node requirement to fix eslint fail (#17938) 2025-04-21 09:51:39 +08:00
9a3acdcff8 Fix: agent app debug re-rendering issue (#18389) 2025-04-19 17:25:52 +08:00
93c1ee225e fix: styles and missing imports (#18396) 2025-04-19 14:46:10 +08:00
1e32175cdc Feat/music annotation (#18391)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-04-19 11:59:00 +08:00
00d9f037b5 fix: correct icons for gpt-4 series from non-openai providers (#18387) 2025-04-19 10:12:12 +08:00
44a2eca449 refactor: Refactors workflow node execution handling (#18382)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-04-18 20:06:24 +08:00
20df6e9c00 Add docker environment variable PIP_MIRROR_URL for sandbox (#18371) 2025-04-18 18:50:03 +08:00
7ba3e599d2 fix: update reset password token when email code verify success (#18364) 2025-04-18 17:14:51 +08:00
4247a6b807 fix: reset_password security issue (#18363) 2025-04-18 05:06:09 -04:00
775dc47abe feat: llm support struct output (#17994)
Co-authored-by: twwu <twwu@dify.ai>
Co-authored-by: zxhlyh <jasonapring2015@outlook.com>
2025-04-18 16:53:43 +08:00
da9269ca97 feat: structured output (#17877) 2025-04-18 16:33:53 +08:00
d2e3744ca3 Switching from CONSOLE_API_URL to FILES_URL in word_extractor.py (#18249) 2025-04-18 16:05:48 +08:00
3914cf07e7 fix: Adjust span height and alignment in WorkplaceSelector component (#18361) 2025-04-18 16:00:12 +08:00
1e7418095f feat/TanStack-Form (#18346) 2025-04-18 15:54:22 +08:00
efe5db38ee Chore/slice workflow (#18351) 2025-04-18 13:59:12 +08:00
523efbfea5 Fix: ValueError: Formatting field not found in record: 'req_id' (#18327) 2025-04-18 09:42:38 +08:00
b96ecd072a fix: can not input R when debug (#18323) 2025-04-18 09:42:08 +08:00
28ffe7e3db fix: missing headers in some cases (#18283)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: crazywoola <427733928@qq.com>
2025-04-17 21:10:58 +08:00
721294948c Disable expire_on_commit in the implementation of the WorkflowNodeExecut… (#18321)
Co-authored-by: lizb <lizb@sugon.com>
2025-04-17 21:09:19 +08:00
b287aaccec fix: Correctly render multiple think blocks in Markdown (#18310)
Co-authored-by: xzj16125 <xuzijie@noahgroup.com>
Co-authored-by: crazywoola <427733928@qq.com>
2025-04-17 19:50:41 +08:00
bbc6efd773 fix: curl request address (#18320)
Co-authored-by: devxing <devxing@gmail.com>
2025-04-17 19:50:20 +08:00
dc9c5a4bc7 make repository type be private (#18304)
Co-authored-by: lizb <lizb@sugon.com>
2025-04-17 18:49:22 +08:00
e90c532c3a fix retrieval resource missing in chatflow (#18307) 2025-04-17 18:05:15 +08:00
397e2a8522 datasets api create-by-file add reranking_mode properties (#18300) 2025-04-17 18:04:43 +08:00
8f547e6340 fix(typing): validate OAuth code before processing access token (#18288) 2025-04-17 16:58:29 +08:00
defd5520ea fix: invalid new tool call creation logic during response handling in OAI-Compat model (#17781) 2025-04-17 16:52:49 +08:00
b6b608219a fix: update retrieval_model documentation (#18289) 2025-04-17 16:18:06 +08:00
22a1bc337f fix: preferred model provider not matching with provider. (#18282)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-04-17 15:44:00 +08:00
caa179a1d3 If the DSL version is less than 0.1.5, it causes errors in an intranet environment. (#18273)
Co-authored-by: warlocgao <warlocgao@tencent.com>
2025-04-17 15:25:31 +08:00
e8e47aee21 fix: Access the text-generation app's API doc error (#18278) 2025-04-17 15:17:22 +08:00
83f1aeec1d Fix ORDER BY (score, id) error in api/core/rag/datasource/vdb/analyticdb/analyticdb_vector_sql.py line 249 (#18252) 2025-04-17 14:15:05 +08:00
6d9dd3109e feat: add an abstract layer for WorkflowNodeExecution (#18026) 2025-04-17 12:48:52 +09:00
77fde04ef7 style: add left padding to editor component and remove unused CSS (#18247) 2025-04-17 11:47:59 +08:00
9d139fa306 fix: Could not load the logo of workflow as Tool in Agent Node (#18243) 2025-04-17 11:22:06 +08:00
6d66e3f680 fix(follow_ups): handle empty LLM responses in context (#18237) 2025-04-17 10:41:56 +08:00
e8d98e3d89 Add analyzer_params config for milvus vectordb (#18180) 2025-04-17 10:38:56 +08:00
a1d20085e6 fix: change the method of update_dataset api in document (#18197) 2025-04-17 10:10:27 +08:00
6da7e6158f Add the parameter appid to apiserver (#18224) 2025-04-16 23:07:05 +08:00
c91045a9d0 fix(fail-branch): prevent streaming output in exception branches (#17153) 2025-04-16 22:34:07 +08:00
44cdb3dcea feat: improve embedding sys.user_id and conversation id info usage (#18035) 2025-04-16 21:08:13 +08:00
358fd28c28 feat: fetch app info in plugins (#18202) 2025-04-16 20:27:29 +08:00
e912928cce fix: create child chunk (#18209)
Co-authored-by: devxing <devxing@gmail.com>
2025-04-16 19:56:21 +08:00
cac0d3c33e fix: implement robust file type checks to align with existing logic (#17557)
Co-authored-by: Bowen Liang <liangbowen@gf.com.cn>
2025-04-16 19:21:50 +08:00
18f98f4fe1 fix: ruff check isoparse (#18033)
Co-authored-by: 刘江波 <jiangbo721@163.com>
2025-04-16 19:21:18 +08:00
4166f73d9d fix: page/limit param not effective (#18196) 2025-04-16 17:26:47 +08:00
bbd9fe9777 Fix: style of opening questions (#18194) 2025-04-16 17:25:25 +08:00
b7e8517b31 feat: agent strategy parameter add help information (#18192) 2025-04-16 17:24:09 +08:00
da7c8621f7 fix: agent strategy string type parameter default value invalid (#18185) 2025-04-16 17:03:18 +08:00
8cc37f3115 fix: the extraction function of the list operation node received 0, which should not be received (#18170) 2025-04-16 16:26:24 +08:00
c6e2970b65 chore: Reorganizes test file structure (#18155)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-04-16 16:09:17 +08:00
b006b9ac0c Http requests node add ssl verify (#18125)
Co-authored-by: lizb <lizb@sugon.com>
2025-04-16 15:59:34 +08:00
e1455cecd8 feat: add switches for jina firecrawl watercrawl (#18153) 2025-04-16 15:50:15 +08:00
b247ef85bf fix dataset api retrieval model null handling (#18151)
Signed-off-by: kenwoodjw <blackxin55+@gmail.com>
2025-04-16 15:50:06 +08:00
fcdf965037 feat: add PATCH method support in Heading component (#18160) 2025-04-16 15:48:09 +08:00
640ee80010 feat: add red corner mark to Badge component for marketplace plugins (#18162) 2025-04-16 15:15:23 +08:00
95283b4dd3 Feat/change split length method (#18097)
Co-authored-by: JzoNg <jzongcode@gmail.com>
2025-04-16 12:28:22 +08:00
2a0d7533d7 [Unit Test] Generate coverage number for UT (#18106) 2025-04-16 11:55:37 +08:00
57b28576f0 chore: remove unused poetry.toml (#18112) 2025-04-16 11:55:19 +08:00
aead48726e fix: cannot regenerate with image(#15060) (#16611)
Co-authored-by: werido <359066432@qq.com>
2025-04-16 09:56:46 +08:00
cd17ce9250 fix: start api and worker after the database has become healthy (#18109) 2025-04-16 09:54:03 +08:00
9d7357058a chore: merge lint dependency group into dev group of python packages (#18088) 2025-04-15 20:50:06 +08:00
9889aa10bd chore: speed up git checkout by removing fetch-depth 0 to avoid pulling all tags and branches (#18103) 2025-04-15 20:21:21 +08:00
d619fa1767 feat: implement blob chunk handling in plugin manager (#18101) 2025-04-15 19:23:03 +08:00
7161d7ad96 feat: add base path to resources (#17655)
Co-authored-by: fhliu4 <fhliu4@iflytek.com>
2025-04-15 17:05:50 +08:00
12de1d175c build: introduce uv as Python package manager (#16317)
Co-authored-by: QuantumGhost <obelisk.reg+git@gmail.com>
2025-04-15 16:16:49 +08:00
f27a956c71 Feat: api page dark mode (#18078) 2025-04-15 16:13:18 +08:00
d119c7d629 ignore errors when creating duplicate indexes (#18069)
Co-authored-by: 璟义 <yangshangpo.ysp@alibaba-inc.com>
2025-04-15 15:48:16 +08:00
0a9031fd42 fix: plugin parameter aws_secret_key parameter not found (#18075) 2025-04-15 15:48:07 +08:00
438463b1c4 feat: edit question in Chat (#17961) 2025-04-15 15:37:08 +08:00
5dd9acbe44 fix: cot agent chinese json bug (#18073)
Co-authored-by: huangzhuo <huangzhuo1@xiaomi.com>
2025-04-15 15:36:44 +08:00
05b8b2a30c fix: plugin parameter type TOOLS_SELECTOR parameter not validation required (#18060) 2025-04-15 13:51:40 +08:00
6c167038af [Observability] Instrument with celery (#18029) 2025-04-15 11:35:34 +08:00
dfc123819e fix basic auth encoding (#18047)
Signed-off-by: kenwoodjw <blackxin55+@gmail.com>
2025-04-15 11:34:50 +08:00
be6a88cb77 fix: Prevents duplicate logs from SQLAlchemy engine (#18024)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-04-14 20:28:31 +08:00
2134a76517 feat: add minimum dify version requirement to plugins (#18022) 2025-04-14 20:09:22 +08:00
9f8947f1dd feat: plugin tool selector add tool default description (#18018) 2025-04-14 19:08:53 +08:00
85004f8510 fix(typo): workflow ops triggered from (#18019) 2025-04-14 19:08:05 +08:00
f40e22dda6 fix(docs): update API documentation to replace 'Params' with 'Path' (#18004) 2025-04-14 17:25:41 +08:00
4c99e9dc73 refactor: type improvements that doesn't modify functionality (#17970) 2025-04-14 16:06:10 +08:00
53efb2bad5 fix chat message type error (#17997)
Signed-off-by: kenwoodjw <blackxin55+@gmail.com>
2025-04-14 16:05:46 +08:00
ed63fbaf9a Feat: dataset dark mode (#17993) 2025-04-14 15:45:23 +08:00
cd7fd100a7 fix(langfuse): question classifier node can't see cost in langfuse (#17982) 2025-04-14 15:28:26 +08:00
d80f4c7d3b chore: eslint add sonar (#17989) 2025-04-14 15:28:20 +08:00
8f9cbe1c49 Chore/cleanup warnings (#17974) 2025-04-14 11:27:14 +08:00
f84832e0c2 feat: added export workflow as img (#17904) 2025-04-14 11:18:18 +08:00
0975c3c399 style(retry-on-node): add margin-bottom to the container (#17972) 2025-04-14 11:05:59 +08:00
1f722cde22 fix(api): Some params were ignored when creating empty Datasets through API (#17932) 2025-04-14 10:24:01 +08:00
4aecc9f090 fix: TypeError: a.variable_selector.join is not a function (#17950) 2025-04-14 09:27:08 +08:00
c9a594100b refactor & perf of files datesets/Doc.tsx and template.xx.mdx (#17951) 2025-04-13 18:12:29 +08:00
7ca497f0d6 refactor & perf: improve type safety of component PluginList (#17498) 2025-04-13 10:52:54 +08:00
cf8d15e8d5 fix: fix wrong layer adding customized tools (#17937) 2025-04-13 10:25:34 +08:00
bc57fa0619 fix: start the plugin daemon after the database has become healthy (#17928) 2025-04-13 10:21:56 +08:00
5d72003ebb Remove dead code (#17899) 2025-04-11 20:33:52 +08:00
08a693a0a0 fix: published workflow(tool) can be deleted. (#17900)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-04-11 19:39:09 +08:00
59b2e1ab82 Chore/add unit test for utils (#17858) 2025-04-11 17:53:18 +08:00
4ef297bf38 refactor(api): Enhance error handling in BasePluginManager (#17887) 2025-04-11 17:32:20 +08:00
8e6f6d64a4 feat: re-add prompt messages to result and chunks in llm (#17883)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-04-11 17:04:49 +08:00
5f8d20b5b2 [Observability] Integrate OpenTelemetry (#17627) 2025-04-11 17:04:06 +08:00
c285441233 fix: refactor SVG icon handling logic and optimize event listener management in embed.js to support mobile browsers #16719 (#16717) 2025-04-11 16:59:12 +08:00
316cb00ada fix: adjust margin in DatasetCard component for better layout (#17879) 2025-04-11 16:44:00 +08:00
Lao 0185f84cc8 Update the model modal: position the scrollbar further inside the modal (#17672) 2025-04-11 16:09:56 +08:00
4b0d3c3688 fix: add annotation ctrl button for annotation add (#17873) 2025-04-11 16:00:51 +08:00
91cfa90503 Fix external knowledge Issues: (#17685) (#17843) 2025-04-11 15:37:27 +08:00
cc08451eb8 fix: fix file number limit error (#17848) 2025-04-11 15:26:26 +08:00
f04d52c044 fix: autocorrect everything in api (#17859)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2025-04-11 15:24:39 +08:00
fe19cc7568 fix: in variable settings, use Textarea to replace Input. (#17864) 2025-04-11 15:24:16 +08:00
b2f5ca356a enhance(plugin): replace json.loads with Pydantic model_validate_json (#17867) 2025-04-11 15:20:03 +08:00
78da4ca024 fix: do not submit value when file input is optional (#17861) 2025-04-11 14:40:57 +08:00
3ece713a05 feat: add optional search parameters to dataset query templates i (#17857) 2025-04-11 14:27:59 +08:00
bf26f1129e fix: run button disappeared when there are no inputs in form (#17854) 2025-04-11 13:52:19 +08:00
7ee5cc80a2 fix: text.split (#17842) 2025-04-11 11:37:47 +08:00
0ccd8bdfa8 chore: Modify watercrawl translation in en-US and zh-Hans (#17828) 2025-04-11 10:14:00 +08:00
0a939feaa3 chore: remove non-existed extra msg for unstructured package (#17670) 2025-04-11 09:29:20 +08:00
1e1d457548 fix: make prompt consistent with few-shot examples (#11538) 2025-04-11 09:16:26 +08:00
5541a1f80e make the JSON parser more robust (#17687) 2025-04-10 22:18:26 +08:00
0e0220bdbf fix: return null url when upload local file (#17752)
Co-authored-by: achmad-kautsar <achmad.kautsar@insignia.co.id>
2025-04-10 18:05:18 +08:00
9d20561af4 create db if not exists (#17796)
Co-authored-by: wlleiiwang <wlleiiwang@tencent.com>
2025-04-10 18:03:22 +08:00
f8145480fc fix: parallel id caused append to wrong branch (#17794) 2025-04-10 17:44:55 +08:00
605ab9e46c hotfix: Workflow page element warning problem #17787 (#17789) 2025-04-10 17:38:50 +08:00
17a26da1e6 Feat: workflow dark mode (#17785) 2025-04-10 17:15:48 +08:00
636a0ba37f chore: skip document segments fetching with non-existed dataset of DatasetDocument in add_document_to_index_task task (#17784) 2025-04-10 17:12:48 +08:00
29720b7360 fix: adjust spacing in ViewHistory and Panel components (#17766) 2025-04-10 15:53:50 +08:00
d0d02be711 feat: add consistent keyboard shortcut support and visual indicators across all app creation dialogs (#17138) 2025-04-10 14:58:39 +08:00
88cb81d3d6 fix: fix inputs lost (#17747) 2025-04-10 13:58:35 +08:00
ef27942b8a Add and modify jp translation (#17748) 2025-04-10 13:56:36 +08:00
63aab5cdd6 feat: add search params to url (#17684)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-04-10 11:18:43 +08:00
Qun 0e136b42a2 enhance guessing mimetype of tool file (#17640) 2025-04-10 11:14:20 +08:00
6df0215246 fix: Enhance error handling and retry logic in Apps component (#17733) 2025-04-10 11:12:34 +08:00
63ba607738 fix: 17712-get-messages-api-encountered-internal-server-error (#17716) 2025-04-10 11:09:38 +08:00
30f7118c7a Chore/slice workflow utils (#17730) 2025-04-10 10:03:19 +08:00
9d5a0fdd8a Fix create blank app (#17724) 2025-04-10 10:01:44 +08:00
0b1f938389 fix: docker compose plugin s3 config default value (#17722) 2025-04-10 09:57:50 +08:00
d3157b46ee feat(large_language_model): Adds plugin-based token counting configuration option (#17706)
Signed-off-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: Yeuoly <admin@srmxy.cn>
2025-04-09 20:52:58 +08:00
8b3be4224d revert batch query (#17707) 2025-04-09 20:25:36 +08:00
1d5c07dedb fix: PLUGIN_S3_USE_AWS_MANAGED_IAM AND PLUGIN_S3_USE_PATH_STYLE … (#17702) 2025-04-09 19:16:01 +08:00
f148f1efa2 fix: Check collection exists before drop it. (#17692)
Co-authored-by: wlleiiwang <wlleiiwang@tencent.com>
2025-04-09 19:14:32 +08:00
abfcd9d3b6 fix segment query index not effect (#17704) 2025-04-09 19:09:08 +08:00
6cf1ada36e chore: hide eslint complexity warning (#17698) 2025-04-09 18:31:31 +08:00
33324ee23d refactor: add API endpoint to list latest plugin versions and query it in an asynchronous way (#17695) 2025-04-09 17:49:27 +08:00
1363 changed files with 36922 additions and 23055 deletions

View File

@ -34,4 +34,4 @@ if you see such error message when you open this project in codespaces:
![Alt text](troubleshooting.png)
a simple workaround is to change the `/signin` endpoint to another one, then log in with a GitHub account and close the tab, then change it back to the `/signin` endpoint. After that, everything will work.
The reason is that the `/signin` endpoint is not allowed in Codespaces; details can be found [here](https://github.com/orgs/community/discussions/5204)
The reason is that the `/signin` endpoint is not allowed in Codespaces; details can be found [here](https://github.com/orgs/community/discussions/5204)

View File

@ -2,7 +2,7 @@
// README at: https://github.com/devcontainers/templates/tree/main/src/anaconda
{
"name": "Python 3.12",
"build": {
"build": {
"context": "..",
"dockerfile": "Dockerfile"
},

View File

@ -1,3 +1,3 @@
This file is copied into the container along with environment.yml* from the parent
folder. This file is included to prevent the Dockerfile COPY instruction from
failing if no environment.yml is found.
folder. This file is included to prevent the Dockerfile COPY instruction from
failing if no environment.yml is found.

View File

@ -2,10 +2,10 @@
npm add -g pnpm@10.8.0
cd web && pnpm install
pipx install poetry
pipx install uv
echo 'alias start-api="cd /workspaces/dify/api && poetry run python -m flask run --host 0.0.0.0 --port=5001 --debug"' >> ~/.bashrc
echo 'alias start-worker="cd /workspaces/dify/api && poetry run python -m celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail,ops_trace,app_deletion"' >> ~/.bashrc
echo 'alias start-api="cd /workspaces/dify/api && uv run python -m flask run --host 0.0.0.0 --port=5001 --debug"' >> ~/.bashrc
echo 'alias start-worker="cd /workspaces/dify/api && uv run python -m celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail,ops_trace,app_deletion"' >> ~/.bashrc
echo 'alias start-web="cd /workspaces/dify/web && pnpm dev"' >> ~/.bashrc
echo 'alias start-containers="cd /workspaces/dify/docker && docker-compose -f docker-compose.middleware.yaml -p dify --env-file middleware.env up -d"' >> ~/.bashrc
echo 'alias stop-containers="cd /workspaces/dify/docker && docker-compose -f docker-compose.middleware.yaml -p dify --env-file middleware.env down"' >> ~/.bashrc

View File

@ -1,3 +1,3 @@
#!/bin/bash
cd api && poetry install
cd api && uv sync

View File

@ -5,18 +5,35 @@ root = true
# Unix-style newlines with a newline ending every file
[*]
charset = utf-8
end_of_line = lf
insert_final_newline = true
trim_trailing_whitespace = true
[*.py]
indent_size = 4
indent_style = space
[*.{yml,yaml}]
indent_style = space
indent_size = 2
[*.toml]
indent_size = 4
indent_style = space
# Markdown and MDX are whitespace sensitive languages.
# Do not remove trailing spaces.
[*.{md,mdx}]
trim_trailing_whitespace = false
# Matches multiple files with brace expansion notation
# Set default charset
[*.{js,tsx}]
charset = utf-8
indent_style = space
indent_size = 2
# Matches the exact files either package.json or .travis.yml
[{package.json,.travis.yml}]
# Matches the exact files package.json
[package.json]
indent_style = space
indent_size = 2

2
.gitattributes vendored
View File

@ -1,5 +1,5 @@
# Ensure that .sh scripts use LF as line separator, even if they are checked out
# to Windows(NTFS) file-system, by a user of Docker for Windows.
# to Windows(NTFS) file-system, by a user of Docker for Windows.
# These .sh scripts will be run from the Container after `docker compose up -d`.
# If they appear to be CRLF style, Dash from the Container will fail to execute
# them.

View File

@ -1,36 +0,0 @@
name: Setup Poetry and Python
inputs:
python-version:
description: Python version to use and the Poetry installed with
required: true
default: '3.11'
poetry-version:
description: Poetry version to set up
required: true
default: '2.0.1'
poetry-lockfile:
description: Path to the Poetry lockfile to restore cache from
required: true
default: ''
runs:
using: composite
steps:
- name: Set up Python ${{ inputs.python-version }}
uses: actions/setup-python@v5
with:
python-version: ${{ inputs.python-version }}
cache: pip
- name: Install Poetry
shell: bash
run: pip install poetry==${{ inputs.poetry-version }}
- name: Restore Poetry cache
if: ${{ inputs.poetry-lockfile != '' }}
uses: actions/setup-python@v5
with:
python-version: ${{ inputs.python-version }}
cache: poetry
cache-dependency-path: ${{ inputs.poetry-lockfile }}

34
.github/actions/setup-uv/action.yml vendored Normal file
View File

@ -0,0 +1,34 @@
name: Setup UV and Python
inputs:
python-version:
description: Python version to use and the UV installed with
required: true
default: '3.12'
uv-version:
description: UV version to set up
required: true
default: '0.6.14'
uv-lockfile:
description: Path to the UV lockfile to restore cache from
required: true
default: ''
enable-cache:
required: true
default: true
runs:
using: composite
steps:
- name: Set up Python ${{ inputs.python-version }}
uses: actions/setup-python@v5
with:
python-version: ${{ inputs.python-version }}
- name: Install uv
uses: astral-sh/setup-uv@v5
with:
version: ${{ inputs.uv-version }}
python-version: ${{ inputs.python-version }}
enable-cache: ${{ inputs.enable-cache }}
cache-dependency-glob: ${{ inputs.uv-lockfile }}

View File

@ -0,0 +1,22 @@
{
"Verbose": false,
"Debug": false,
"IgnoreDefaults": false,
"SpacesAfterTabs": false,
"NoColor": false,
"Exclude": [
"^web/public/vs/",
"^web/public/pdf.worker.min.mjs$",
"web/app/components/base/icons/src/vender/"
],
"AllowedContentTypes": [],
"PassedFiles": [],
"Disable": {
"EndOfLine": false,
"Indentation": false,
"IndentSize": true,
"InsertFinalNewline": false,
"TrimTrailingWhitespace": false,
"MaxLineLength": false
}
}

View File

@ -17,6 +17,9 @@ jobs:
test:
name: API Tests
runs-on: ubuntu-latest
defaults:
run:
shell: bash
strategy:
matrix:
python-version:
@ -27,40 +30,44 @@ jobs:
- name: Checkout code
uses: actions/checkout@v4
with:
fetch-depth: 0
persist-credentials: false
- name: Setup Poetry and Python ${{ matrix.python-version }}
uses: ./.github/actions/setup-poetry
- name: Setup UV and Python
uses: ./.github/actions/setup-uv
with:
python-version: ${{ matrix.python-version }}
poetry-lockfile: api/poetry.lock
uv-lockfile: api/uv.lock
- name: Check Poetry lockfile
run: |
poetry check -C api --lock
poetry show -C api
- name: Check UV lockfile
run: uv lock --project api --check
- name: Install dependencies
run: poetry install -C api --with dev
- name: Check dependencies in pyproject.toml
run: poetry run -P api bash dev/pytest/pytest_artifacts.sh
run: uv sync --project api --dev
- name: Run Unit tests
run: poetry run -P api bash dev/pytest/pytest_unit_tests.sh
run: |
uv run --project api bash dev/pytest/pytest_unit_tests.sh
# Extract coverage percentage and create a summary
TOTAL_COVERAGE=$(python -c 'import json; print(json.load(open("coverage.json"))["totals"]["percent_covered_display"])')
# Create a detailed coverage summary
echo "### Test Coverage Summary :test_tube:" >> $GITHUB_STEP_SUMMARY
echo "Total Coverage: ${TOTAL_COVERAGE}%" >> $GITHUB_STEP_SUMMARY
echo "\`\`\`" >> $GITHUB_STEP_SUMMARY
uv run --project api coverage report >> $GITHUB_STEP_SUMMARY
echo "\`\`\`" >> $GITHUB_STEP_SUMMARY
- name: Run dify config tests
run: poetry run -P api python dev/pytest/pytest_config_tests.py
run: uv run --project api dev/pytest/pytest_config_tests.py
- name: Cache MyPy
- name: MyPy Cache
uses: actions/cache@v4
with:
path: api/.mypy_cache
key: mypy-${{ matrix.python-version }}-${{ runner.os }}-${{ hashFiles('api/poetry.lock') }}
key: mypy-${{ matrix.python-version }}-${{ runner.os }}-${{ hashFiles('api/uv.lock') }}
- name: Run mypy
run: dev/run-mypy
- name: Run MyPy Checks
run: dev/mypy-check
- name: Set up dotenvs
run: |
@ -80,4 +87,7 @@ jobs:
ssrf_proxy
- name: Run Workflow
run: poetry run -P api bash dev/pytest/pytest_workflow.sh
run: uv run --project api bash dev/pytest/pytest_workflow.sh
- name: Run Tool
run: uv run --project api bash dev/pytest/pytest_tools.sh

View File

@ -24,13 +24,13 @@ jobs:
fetch-depth: 0
persist-credentials: false
- name: Setup Poetry and Python
uses: ./.github/actions/setup-poetry
- name: Setup UV and Python
uses: ./.github/actions/setup-uv
with:
poetry-lockfile: api/poetry.lock
uv-lockfile: api/uv.lock
- name: Install dependencies
run: poetry install -C api
run: uv sync --project api
- name: Prepare middleware env
run: |
@ -54,6 +54,4 @@ jobs:
- name: Run DB Migration
env:
DEBUG: true
run: |
cd api
poetry run python -m flask upgrade-db
run: uv run --directory api flask upgrade-db

View File

@ -42,6 +42,7 @@ jobs:
with:
push: false
context: "{{defaultContext}}:${{ matrix.context }}"
file: "${{ matrix.file }}"
platforms: ${{ matrix.platform }}
cache-from: type=gha
cache-to: type=gha,mode=max

View File

@ -9,6 +9,12 @@ concurrency:
group: style-${{ github.head_ref || github.run_id }}
cancel-in-progress: true
permissions:
checks: write
statuses: write
contents: read
jobs:
python-style:
name: Python Style
@ -18,7 +24,6 @@ jobs:
- name: Checkout code
uses: actions/checkout@v4
with:
fetch-depth: 0
persist-credentials: false
- name: Check changed files
@ -29,24 +34,27 @@ jobs:
api/**
.github/workflows/style.yml
- name: Setup Poetry and Python
- name: Setup UV and Python
if: steps.changed-files.outputs.any_changed == 'true'
uses: ./.github/actions/setup-poetry
uses: ./.github/actions/setup-uv
with:
uv-lockfile: api/uv.lock
enable-cache: false
- name: Install dependencies
if: steps.changed-files.outputs.any_changed == 'true'
run: poetry install -C api --only lint
run: uv sync --project api --dev
- name: Ruff check
if: steps.changed-files.outputs.any_changed == 'true'
run: |
poetry run -C api ruff --version
poetry run -C api ruff check ./
poetry run -C api ruff format --check ./
uv run --directory api ruff --version
uv run --directory api ruff check --diff ./
uv run --directory api ruff format --check --diff ./
- name: Dotenv check
if: steps.changed-files.outputs.any_changed == 'true'
run: poetry run -P api dotenv-linter ./api/.env.example ./web/.env.example
run: uv run --project api dotenv-linter ./api/.env.example ./web/.env.example
- name: Lint hints
if: failure()
@ -63,7 +71,6 @@ jobs:
- name: Checkout code
uses: actions/checkout@v4
with:
fetch-depth: 0
persist-credentials: false
- name: Check changed files
@ -102,7 +109,6 @@ jobs:
- name: Checkout code
uses: actions/checkout@v4
with:
fetch-depth: 0
persist-credentials: false
- name: Check changed files
@ -133,7 +139,6 @@ jobs:
- name: Checkout code
uses: actions/checkout@v4
with:
fetch-depth: 0
persist-credentials: false
- name: Check changed files
@ -164,3 +169,14 @@ jobs:
VALIDATE_DOCKERFILE_HADOLINT: true
VALIDATE_XML: true
VALIDATE_YAML: true
- name: EditorConfig checks
uses: super-linter/super-linter/slim@v7
env:
DEFAULT_BRANCH: main
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
IGNORE_GENERATED_FILES: true
IGNORE_GITIGNORED_FILES: true
# EditorConfig validation
VALIDATE_EDITORCONFIG: true
EDITORCONFIG_FILE_NAME: editorconfig-checker.json

View File

@ -27,7 +27,6 @@ jobs:
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
persist-credentials: false
- name: Use Node.js ${{ matrix.node-version }}

View File

@ -8,7 +8,7 @@ on:
- api/core/rag/datasource/**
- docker/**
- .github/workflows/vdb-tests.yml
- api/poetry.lock
- api/uv.lock
- api/pyproject.toml
concurrency:
@ -29,22 +29,19 @@ jobs:
- name: Checkout code
uses: actions/checkout@v4
with:
fetch-depth: 0
persist-credentials: false
- name: Setup Poetry and Python ${{ matrix.python-version }}
uses: ./.github/actions/setup-poetry
- name: Setup UV and Python
uses: ./.github/actions/setup-uv
with:
python-version: ${{ matrix.python-version }}
poetry-lockfile: api/poetry.lock
uv-lockfile: api/uv.lock
- name: Check Poetry lockfile
run: |
poetry check -C api --lock
poetry show -C api
- name: Check UV lockfile
run: uv lock --project api --check
- name: Install dependencies
run: poetry install -C api --with dev
run: uv sync --project api --dev
- name: Set up dotenvs
run: |
@ -80,7 +77,7 @@ jobs:
elasticsearch
- name: Check TiDB Ready
run: poetry run -P api python api/tests/integration_tests/vdb/tidb_vector/check_tiflash_ready.py
run: uv run --project api python api/tests/integration_tests/vdb/tidb_vector/check_tiflash_ready.py
- name: Test Vector Stores
run: poetry run -P api bash dev/pytest/pytest_vdb.sh
run: uv run --project api bash dev/pytest/pytest_vdb.sh

View File

@ -23,7 +23,6 @@ jobs:
- name: Checkout code
uses: actions/checkout@v4
with:
fetch-depth: 0
persist-credentials: false
- name: Check changed files

1
.gitignore vendored
View File

@ -46,6 +46,7 @@ htmlcov/
.cache
nosetests.xml
coverage.xml
coverage.json
*.cover
*.py,cover
.hypothesis/

View File

@ -6,7 +6,7 @@
本指南和 Dify 一样在不断完善中。如果有任何滞后于项目实际情况的地方,恳请谅解,我们也欢迎任何改进建议。
关于许可证,请花一分钟阅读我们简短的[许可和贡献者协议](./LICENSE)。社区同时也遵循[行为准则](https://github.com/langgenius/.github/blob/main/CODE_OF_CONDUCT.md)。
关于许可证,请花一分钟阅读我们简短的[许可和贡献者协议](./LICENSE)。同时也遵循社区[行为准则](https://github.com/langgenius/.github/blob/main/CODE_OF_CONDUCT.md)。
## 开始之前

View File

@ -90,4 +90,4 @@ Recomendamos revisar este documento cuidadosamente antes de proceder con la conf
No dudes en contactarnos si encuentras algún problema durante el proceso de configuración.
## Obteniendo Ayuda
Si alguna vez te quedas atascado o tienes una pregunta urgente mientras contribuyes, simplemente envíanos tus consultas a través del issue relacionado de GitHub, o únete a nuestro [Discord](https://discord.gg/8Tpq4AcN9c) para una charla rápida.

View File

@ -90,4 +90,4 @@ Nous recommandons de revoir attentivement ce document avant de procéder à la c
N'hésitez pas à nous contacter si vous rencontrez des problèmes pendant le processus de configuration.
## Obtenir de l'aide
Si jamais vous êtes bloqué ou avez une question urgente en contribuant, envoyez-nous simplement vos questions via le problème GitHub concerné, ou rejoignez notre [Discord](https://discord.gg/8Tpq4AcN9c) pour une discussion rapide.

View File

@ -90,4 +90,4 @@ PR 설명에 기존 이슈를 연결하거나 새 이슈를 여는 것을 잊지
설정 과정에서 문제가 발생하면 언제든지 연락해 주세요.
## 도움 받기
기여하는 동안 막히거나 긴급한 질문이 있으면, 관련 GitHub 이슈를 통해 질문을 보내거나, 빠른 대화를 위해 우리의 [Discord](https://discord.gg/8Tpq4AcN9c)에 참여하세요.

View File

@ -90,4 +90,4 @@ Recomendamos revisar este documento cuidadosamente antes de prosseguir com a con
Sinta-se à vontade para entrar em contato se encontrar quaisquer problemas durante o processo de configuração.
## Obtendo Ajuda
Se você ficar preso ou tiver uma dúvida urgente enquanto contribui, simplesmente envie suas perguntas através do problema relacionado no GitHub, ou entre no nosso [Discord](https://discord.gg/8Tpq4AcN9c) para uma conversa rápida.

View File

@ -90,4 +90,4 @@ Kuruluma geçmeden önce bu belgeyi dikkatlice incelemenizi öneririz, çünkü
Kurulum süreci sırasında herhangi bir sorunla karşılaşırsanız bizimle iletişime geçmekten çekinmeyin.
## Yardım Almak
Katkıda bulunurken takılırsanız veya yanıcı bir sorunuz olursa, sorularınızı ilgili GitHub sorunu aracılığıyla bize gönderin veya hızlı bir sohbet için [Discord'umuza](https://discord.gg/8Tpq4AcN9c) katılın.

View File

@ -8,7 +8,7 @@
<a href="https://cloud.dify.ai">Dify Cloud</a> ·
<a href="https://docs.dify.ai/getting-started/install-self-hosted">Self-hosting</a> ·
<a href="https://docs.dify.ai">Documentation</a> ·
<a href="https://udify.app/chat/22L1zSxg6yW1cWQg">Enterprise inquiry</a>
<a href="https://dify.ai/pricing">Dify edition overview</a>
</p>
<p align="center">
@ -54,7 +54,7 @@
<a href="./README_BN.md"><img alt="README in বাংলা" src="https://img.shields.io/badge/বাংলা-d9d9d9"></a>
</p>
Dify is an open-source LLM app development platform. Its intuitive interface combines agentic AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production.
Dify is an open-source LLM app development platform. Its intuitive interface combines agentic AI workflow, RAG pipeline, agent capabilities, model management, observability features, and more, allowing you to quickly move from prototype to production.
## Quick start
@ -188,7 +188,7 @@ All of Dify's offerings come with corresponding APIs, so you could effortlessly
- **Dify for enterprise / organizations</br>**
We provide additional enterprise-centric features. [Log your questions for us through this chatbot](https://udify.app/chat/22L1zSxg6yW1cWQg) or [send us an email](mailto:business@dify.ai?subject=[GitHub]Business%20License%20Inquiry) to discuss enterprise needs. </br>
> For startups and small businesses using AWS, check out [Dify Premium on AWS Marketplace](https://aws.amazon.com/marketplace/pp/prodview-t22mebxzwjhu6) and deploy it to your own AWS VPC with one-click. It's an affordable AMI offering with the option to create apps with custom logo and branding.
> For startups and small businesses using AWS, check out [Dify Premium on AWS Marketplace](https://aws.amazon.com/marketplace/pp/prodview-t22mebxzwjhu6) and deploy it to your own AWS VPC with one click. It's an affordable AMI offering with the option to create apps with custom logo and branding.
## Staying ahead
@ -233,7 +233,7 @@ Deploy Dify to AWS with [CDK](https://aws.amazon.com/cdk/)
For those who'd like to contribute code, see our [Contribution Guide](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
At the same time, please consider supporting Dify by sharing it on social media and at events and conferences.
> We are looking for contributors to help with translating Dify to languages other than Mandarin or English. If you are interested in helping, please see the [i18n README](https://github.com/langgenius/dify/blob/main/web/i18n/README.md) for more information, and leave us a comment in the `global-users` channel of our [Discord Community Server](https://discord.gg/8Tpq4AcN9c).
> We are looking for contributors to help translate Dify into languages other than Mandarin or English. If you are interested in helping, please see the [i18n README](https://github.com/langgenius/dify/blob/main/web/i18n/README.md) for more information, and leave us a comment in the `global-users` channel of our [Discord Community Server](https://discord.gg/8Tpq4AcN9c).
## Community & contact

View File

@ -4,7 +4,7 @@
<a href="https://cloud.dify.ai">Dify Cloud</a> ·
<a href="https://docs.dify.ai/getting-started/install-self-hosted">الاستضافة الذاتية</a> ·
<a href="https://docs.dify.ai">التوثيق</a> ·
<a href="https://udify.app/chat/22L1zSxg6yW1cWQg">استفسار الشركات (للإنجليزية فقط)</a>
<a href="https://dify.ai/pricing">نظرة عامة على منتجات Dify</a>
</p>
<p align="center">

View File

@ -8,7 +8,7 @@
<a href="https://cloud.dify.ai">ডিফাই ক্লাউড</a> ·
<a href="https://docs.dify.ai/getting-started/install-self-hosted">সেল্ফ-হোস্টিং</a> ·
<a href="https://docs.dify.ai">ডকুমেন্টেশন</a> ·
<a href="https://udify.app/chat/22L1zSxg6yW1cWQg">ব্যাবসায়িক অনুসন্ধান</a>
<a href="https://dify.ai/pricing">Dify পণ্যের রূপভেদ</a>
</p>
<p align="center">

View File

@ -4,7 +4,7 @@
<a href="https://cloud.dify.ai">Dify 云服务</a> ·
<a href="https://docs.dify.ai/getting-started/install-self-hosted">自托管</a> ·
<a href="https://docs.dify.ai">文档</a> ·
<a href="https://udify.app/chat/22L1zSxg6yW1cWQg">(需用英文)常见问题解答 / 联系团队</a>
<a href="https://dify.ai/pricing">Dify 产品形态总览</a>
</div>
<p align="center">

View File

@ -8,7 +8,7 @@
<a href="https://cloud.dify.ai">Dify Cloud</a> ·
<a href="https://docs.dify.ai/getting-started/install-self-hosted">Selbstgehostetes</a> ·
<a href="https://docs.dify.ai">Dokumentation</a> ·
<a href="https://udify.app/chat/22L1zSxg6yW1cWQg">Anfrage an Unternehmen</a>
<a href="https://dify.ai/pricing">Überblick über die Dify-Produkte</a>
</p>
<p align="center">

View File

@ -4,7 +4,7 @@
<a href="https://cloud.dify.ai">Dify Cloud</a> ·
<a href="https://docs.dify.ai/getting-started/install-self-hosted">Auto-alojamiento</a> ·
<a href="https://docs.dify.ai">Documentación</a> ·
<a href="https://udify.app/chat/22L1zSxg6yW1cWQg">Consultas empresariales (en inglés)</a>
<a href="https://dify.ai/pricing">Resumen de las ediciones de Dify</a>
</p>
<p align="center">

View File

@ -4,7 +4,7 @@
<a href="https://cloud.dify.ai">Dify Cloud</a> ·
<a href="https://docs.dify.ai/getting-started/install-self-hosted">Auto-hébergement</a> ·
<a href="https://docs.dify.ai">Documentation</a> ·
<a href="https://udify.app/chat/22L1zSxg6yW1cWQg">Demande dentreprise (en anglais seulement)</a>
<a href="https://dify.ai/pricing">Présentation des différentes offres Dify</a>
</p>
<p align="center">

View File

@ -4,7 +4,7 @@
<a href="https://cloud.dify.ai">Dify Cloud</a> ·
<a href="https://docs.dify.ai/getting-started/install-self-hosted">セルフホスティング</a> ·
<a href="https://docs.dify.ai">ドキュメント</a> ·
<a href="https://udify.app/chat/22L1zSxg6yW1cWQg">企業のお問い合わせ(英語のみ)</a>
<a href="https://dify.ai/pricing">Difyの各種エディションについて</a>
</p>
<p align="center">

View File

@ -4,7 +4,7 @@
<a href="https://cloud.dify.ai">Dify Cloud</a> ·
<a href="https://docs.dify.ai/getting-started/install-self-hosted">Self-hosting</a> ·
<a href="https://docs.dify.ai">Documentation</a> ·
<a href="https://udify.app/chat/22L1zSxg6yW1cWQg">Commercial enquiries</a>
<a href="https://dify.ai/pricing">Dify product editions</a>
</p>
<p align="center">

View File

@ -4,7 +4,7 @@
<a href="https://cloud.dify.ai">Dify 클라우드</a> ·
<a href="https://docs.dify.ai/getting-started/install-self-hosted">셀프-호스팅</a> ·
<a href="https://docs.dify.ai">문서</a> ·
<a href="https://udify.app/chat/22L1zSxg6yW1cWQg">기업 문의 (영어만 가능)</a>
<a href="https://dify.ai/pricing">Dify 제품 에디션 안내</a>
</p>
<p align="center">

View File

@ -8,7 +8,7 @@
<a href="https://cloud.dify.ai">Dify Cloud</a> ·
<a href="https://docs.dify.ai/getting-started/install-self-hosted">Auto-hospedagem</a> ·
<a href="https://docs.dify.ai">Documentação</a> ·
<a href="https://udify.app/chat/22L1zSxg6yW1cWQg">Consultas empresariais</a>
<a href="https://dify.ai/pricing">Visão geral das edições do Dify</a>
</p>
<p align="center">

View File

@ -1,259 +1,259 @@
![cover-v5-optimized](https://github.com/langgenius/dify/assets/13230914/f9e19af5-61ba-4119-b926-d10c4c06ebab)
<p align="center">
📌 <a href="https://dify.ai/blog/introducing-dify-workflow-file-upload-a-demo-on-ai-podcast">Predstavljamo nalaganje datotek Dify Workflow: znova ustvarite Google NotebookLM Podcast</a>
</p>
<p align="center">
<a href="https://cloud.dify.ai">Dify Cloud</a> ·
<a href="https://docs.dify.ai/getting-started/install-self-hosted">Samostojno gostovanje</a> ·
<a href="https://docs.dify.ai">Dokumentacija</a> ·
<a href="https://udify.app/chat/22L1zSxg6yW1cWQg">Povpraševanje za podjetja</a>
<a href="https://dify.ai/pricing">Pregled ponudb izdelkov Dify</a>
</p>
<p align="center">
<a href="https://dify.ai" target="_blank">
<img alt="Static Badge" src="https://img.shields.io/badge/Product-F04438"></a>
<a href="https://dify.ai/pricing" target="_blank">
<img alt="Static Badge" src="https://img.shields.io/badge/free-pricing?logo=free&color=%20%23155EEF&label=pricing&labelColor=%20%23528bff"></a>
<a href="https://discord.gg/FngNHpbcY7" target="_blank">
<img src="https://img.shields.io/discord/1082486657678311454?logo=discord&labelColor=%20%235462eb&logoColor=%20%23f5f5f5&color=%20%235462eb"
alt="chat on Discord"></a>
<a href="https://twitter.com/intent/follow?screen_name=dify_ai" target="_blank">
<img src="https://img.shields.io/twitter/follow/dify_ai?logo=X&color=%20%23f5f5f5"
alt="follow on X(Twitter)"></a>
<a href="https://www.linkedin.com/company/langgenius/" target="_blank">
<img src="https://custom-icon-badges.demolab.com/badge/LinkedIn-0A66C2?logo=linkedin-white&logoColor=fff"
alt="follow on LinkedIn"></a>
<a href="https://hub.docker.com/u/langgenius" target="_blank">
<img alt="Docker Pulls" src="https://img.shields.io/docker/pulls/langgenius/dify-web?labelColor=%20%23FDB062&color=%20%23f79009"></a>
<a href="https://github.com/langgenius/dify/graphs/commit-activity" target="_blank">
<img alt="Commits last month" src="https://img.shields.io/github/commit-activity/m/langgenius/dify?labelColor=%20%2332b583&color=%20%2312b76a"></a>
<a href="https://github.com/langgenius/dify/" target="_blank">
<img alt="Issues closed" src="https://img.shields.io/github/issues-search?query=repo%3Alanggenius%2Fdify%20is%3Aclosed&label=issues%20closed&labelColor=%20%237d89b0&color=%20%235d6b98"></a>
<a href="https://github.com/langgenius/dify/discussions/" target="_blank">
<img alt="Discussion posts" src="https://img.shields.io/github/discussions/langgenius/dify?labelColor=%20%239b8afb&color=%20%237a5af8"></a>
</p>
<p align="center">
<a href="./README.md"><img alt="README in English" src="https://img.shields.io/badge/English-d9d9d9"></a>
<a href="./README_CN.md"><img alt="简体中文版自述文件" src="https://img.shields.io/badge/简体中文-d9d9d9"></a>
<a href="./README_JA.md"><img alt="日本語のREADME" src="https://img.shields.io/badge/日本語-d9d9d9"></a>
<a href="./README_ES.md"><img alt="README en Español" src="https://img.shields.io/badge/Español-d9d9d9"></a>
<a href="./README_FR.md"><img alt="README en Français" src="https://img.shields.io/badge/Français-d9d9d9"></a>
<a href="./README_KL.md"><img alt="README tlhIngan Hol" src="https://img.shields.io/badge/Klingon-d9d9d9"></a>
<a href="./README_KR.md"><img alt="README in Korean" src="https://img.shields.io/badge/한국어-d9d9d9"></a>
<a href="./README_AR.md"><img alt="README بالعربية" src="https://img.shields.io/badge/العربية-d9d9d9"></a>
<a href="./README_TR.md"><img alt="Türkçe README" src="https://img.shields.io/badge/Türkçe-d9d9d9"></a>
<a href="./README_VI.md"><img alt="README Tiếng Việt" src="https://img.shields.io/badge/Ti%E1%BA%BFng%20Vi%E1%BB%87t-d9d9d9"></a>
<a href="./README_SI.md"><img alt="README Slovenščina" src="https://img.shields.io/badge/Sloven%C5%A1%C4%8Dina-d9d9d9"></a>
<a href="./README_BN.md"><img alt="README in বাংলা" src="https://img.shields.io/badge/বাংলা-d9d9d9"></a>
</p>
Dify je odprtokodna platforma za razvoj aplikacij LLM. Njegov intuitivni vmesnik združuje agentski potek dela z umetno inteligenco, cevovod RAG, zmogljivosti agentov, upravljanje modelov, funkcije opazovanja in več, kar vam omogoča hiter prehod od prototipa do proizvodnje.
## Hitri začetek
> Preden namestite Dify, se prepričajte, da vaša naprava izpolnjuje naslednje minimalne sistemske zahteve:
>
>- CPU >= 2 Core
>- RAM >= 4 GiB
</br>
Najlažji način za zagon strežnika Dify je prek docker compose . Preden zaženete Dify z naslednjimi ukazi, se prepričajte, da sta Docker in Docker Compose nameščena na vašem računalniku:
```bash
cd dify
cd docker
cp .env.example .env
docker compose up -d
```
Po zagonu lahko dostopate do nadzorne plošče Dify v brskalniku na [http://localhost/install](http://localhost/install) in začnete postopek inicializacije.
#### Iskanje pomoči
Prosimo, glejte naša pogosta vprašanja [FAQ](https://docs.dify.ai/getting-started/install-self-hosted/faqs) če naletite na težave pri nastavitvi Dify. Če imate še vedno težave, se obrnite na [skupnost ali nas](#community--contact).
> Če želite prispevati k Difyju ali narediti dodaten razvoj, glejte naš vodnik za [uvajanje iz izvorne kode](https://docs.dify.ai/getting-started/install-self-hosted/local-source-code)
## Ključne značilnosti
**1. Potek dela**:
Zgradite in preizkusite zmogljive poteke dela AI na vizualnem platnu, pri čemer izkoristite vse naslednje funkcije in več.
https://github.com/langgenius/dify/assets/13230914/356df23e-1604-483d-80a6-9517ece318aa
**2. Celovita podpora za modele**:
Brezhibna integracija s stotinami lastniških/odprtokodnih LLM-jev ducatov ponudnikov sklepanja in samostojnih rešitev, ki pokrivajo GPT, Mistral, Llama3 in vse modele, združljive z API-jem OpenAI. Celoten seznam podprtih ponudnikov modelov najdete [tukaj](https://docs.dify.ai/getting-started/readme/model-providers).
![providers-v5](https://github.com/langgenius/dify/assets/13230914/5a17bdbe-097a-4100-8363-40255b70f6e3)
**3. Prompt IDE**:
intuitivni vmesnik za ustvarjanje pozivov, primerjavo zmogljivosti modela in dodajanje dodatnih funkcij, kot je pretvorba besedila v govor, aplikaciji, ki temelji na klepetu.
**4. RAG Pipeline**:
Obsežne zmogljivosti RAG, ki pokrivajo vse od vnosa dokumenta do priklica, s podporo za ekstrakcijo besedila iz datotek PDF, PPT in drugih običajnih formatov dokumentov.
**5. Agent capabilities**:
definirate lahko agente, ki temeljijo na klicanju funkcij LLM ali ReAct, in dodate vnaprej izdelana orodja ali orodja po meri za agenta. Dify ponuja več kot 50 vgrajenih orodij za agente AI, kot so Google Search, DALL·E, Stable Diffusion in WolframAlpha.
**6. LLMOps**:
Spremljajte in analizirajte dnevnike aplikacij in učinkovitost skozi čas. Pozive, nabore podatkov in modele lahko nenehno izboljšujete na podlagi proizvodnih podatkov in opomb.
**7. Backend-as-a-Service**:
Vse ponudbe Difyja so opremljene z ustreznimi API-ji, tako da lahko Dify brez težav integrirate v svojo poslovno logiko.
## Primerjava Funkcij
<table style="width: 100%;">
<tr>
<th align="center">Funkcija</th>
<th align="center">Dify.AI</th>
<th align="center">LangChain</th>
<th align="center">Flowise</th>
<th align="center">OpenAI Assistants API</th>
</tr>
<tr>
<td align="center">Programski pristop</td>
<td align="center">API + usmerjeno v aplikacije</td>
<td align="center">Python koda</td>
<td align="center">Usmerjeno v aplikacije</td>
<td align="center">Usmerjeno v API</td>
</tr>
<tr>
<td align="center">Podprti LLM-ji</td>
<td align="center">Bogata izbira</td>
<td align="center">Bogata izbira</td>
<td align="center">Bogata izbira</td>
<td align="center">Samo OpenAI</td>
</tr>
<tr>
<td align="center">RAG pogon</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Agent</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Potek dela</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Spremljanje</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Funkcija za podjetja (SSO/nadzor dostopa)</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Lokalna namestitev</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
</table>
## Uporaba Dify
- **Cloud </br>**
Gostimo storitev Dify Cloud za vsakogar, ki jo lahko preizkusite brez nastavitev. Zagotavlja vse zmožnosti različice za samostojno namestitev in vključuje 200 brezplačnih klicev GPT-4 v načrtu peskovnika.
- **Self-hosting Dify Community Edition</br>**
Hitro zaženite Dify v svojem okolju s tem [začetnim vodnikom](#quick-start) . Za dodatne reference in podrobnejša navodila uporabite našo [dokumentacijo](https://docs.dify.ai) .
- **Dify za podjetja/organizacije</br>**
Ponujamo dodatne funkcije, osredotočene na podjetja. Zabeležite svoja vprašanja prek tega klepetalnega robota ali nam pošljite e-pošto, da se pogovorimo o potrebah podjetja. </br>
> Za novoustanovljena podjetja in mala podjetja, ki uporabljajo AWS, si oglejte Dify Premium na AWS Marketplace in ga z enim klikom uvedite v svoj AWS VPC. To je cenovno ugodna ponudba AMI z možnostjo ustvarjanja aplikacij z logotipom in blagovno znamko po meri.
## Staying ahead
Star Dify on GitHub and be instantly notified of new releases.
![star-us](https://github.com/langgenius/dify/assets/13230914/b823edc1-6388-4e25-ad45-2f6b187adbb4)
## Napredne nastavitve
Če morate prilagoditi konfiguracijo, si oglejte komentarje v naši datoteki .env.example in posodobite ustrezne vrednosti v svoji .env datoteki. Poleg tega boste morda morali prilagoditi docker-compose.yamlsamo datoteko, na primer spremeniti različice slike, preslikave vrat ali namestitve nosilca, glede na vaše specifično okolje in zahteve za uvajanje. Po kakršnih koli spremembah ponovno zaženite docker-compose up -d. Celoten seznam razpoložljivih spremenljivk okolja najdete tukaj .
Če želite konfigurirati visoko razpoložljivo nastavitev, so na voljo Helm Charts in datoteke YAML, ki jih prispeva skupnost, ki omogočajo uvedbo Difyja v Kubernetes.
- [Helm Chart by @LeoQuote](https://github.com/douban/charts/tree/master/charts/dify)
- [Helm Chart by @BorisPolonsky](https://github.com/BorisPolonsky/dify-helm)
- [YAML file by @Winson-030](https://github.com/Winson-030/dify-kubernetes)
- [YAML file by @wyy-holding](https://github.com/wyy-holding/dify-k8s)
#### Uporaba Terraform za uvajanje
namestite Dify v Cloud Platform z enim klikom z uporabo [terraform](https://www.terraform.io/)
##### Azure Global
- [Azure Terraform by @nikawang](https://github.com/nikawang/dify-azure-terraform)
##### Google Cloud
- [Google Cloud Terraform by @sotazum](https://github.com/DeNA/dify-google-cloud-terraform)
#### Uporaba AWS CDK za uvajanje
Uvedite Dify v AWS z uporabo [CDK](https://aws.amazon.com/cdk/)
##### AWS
- [AWS CDK by @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
## Prispevam
Za tiste, ki bi radi prispevali kodo, si oglejte naš vodnik za prispevke . Hkrati vas prosimo, da podprete Dify tako, da ga delite na družbenih medijih ter na dogodkih in konferencah.
> Iščemo sodelavce za pomoč pri prevajanju Difyja v jezike, ki niso mandarinščina ali angleščina. Če želite pomagati, si oglejte i18n README za več informacij in nam pustite komentar v global-userskanalu našega strežnika skupnosti Discord .
## Skupnost in stik
* [Github Discussion](https://github.com/langgenius/dify/discussions). Najboljše za: izmenjavo povratnih informacij in postavljanje vprašanj.
* [GitHub Issues](https://github.com/langgenius/dify/issues). Najboljše za: hrošče, na katere naletite pri uporabi Dify.AI, in predloge funkcij. Oglejte si naš [vodnik za prispevke](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
* [Discord](https://discord.gg/FngNHpbcY7). Najboljše za: deljenje vaših aplikacij in druženje s skupnostjo.
* [X(Twitter)](https://twitter.com/dify_ai). Najboljše za: deljenje vaših aplikacij in druženje s skupnostjo.
**Contributors**
<a href="https://github.com/langgenius/dify/graphs/contributors">
<img src="https://contrib.rocks/image?repo=langgenius/dify" />
</a>
## Star history
[![Star History Chart](https://api.star-history.com/svg?repos=langgenius/dify&type=Date)](https://star-history.com/#langgenius/dify&Date)
## Varnostno razkritje
Zaradi zaščite vaše zasebnosti se izogibajte objavljanju varnostnih vprašanj na GitHub. Namesto tega pošljite vprašanja na security@dify.ai in zagotovili vam bomo podrobnejši odgovor.
## Licenca
To skladišče je na voljo pod [odprtokodno licenco Dify](LICENSE) , ki je v bistvu Apache 2.0 z nekaj dodatnimi omejitvami.

View File

@ -4,7 +4,7 @@
<a href="https://cloud.dify.ai">Dify Bulut</a> ·
<a href="https://docs.dify.ai/getting-started/install-self-hosted">Kendi Sunucunuzda Barındırma</a> ·
<a href="https://docs.dify.ai">Dokümantasyon</a> ·
<a href="https://udify.app/chat/22L1zSxg6yW1cWQg">Yalnızca İngilizce: Kurumsal Sorgulama</a>
<a href="https://dify.ai/pricing">Dify ürün seçeneklerine genel bakış</a>
</p>
<p align="center">

View File

@ -8,7 +8,7 @@
<a href="https://cloud.dify.ai">Dify 雲端服務</a> ·
<a href="https://docs.dify.ai/getting-started/install-self-hosted">自行託管</a> ·
<a href="https://docs.dify.ai">說明文件</a> ·
<a href="https://udify.app/chat/22L1zSxg6yW1cWQg">企業諮詢</a>
<a href="https://dify.ai/pricing">產品方案概覽</a>
</p>
<p align="center">

View File

@ -4,7 +4,7 @@
<a href="https://cloud.dify.ai">Dify Cloud</a> ·
<a href="https://docs.dify.ai/getting-started/install-self-hosted">Tự triển khai</a> ·
<a href="https://docs.dify.ai">Tài liệu</a> ·
<a href="https://udify.app/chat/22L1zSxg6yW1cWQg">Yêu cầu doanh nghiệp</a>
<a href="https://dify.ai/pricing">Tổng quan các lựa chọn sản phẩm Dify</a>
</p>
<p align="center">

View File

@ -16,4 +16,4 @@ logs
.ruff_cache
# venv
.venv

View File

@ -165,6 +165,7 @@ MILVUS_URI=http://127.0.0.1:19530
MILVUS_TOKEN=
MILVUS_USER=root
MILVUS_PASSWORD=Milvus
MILVUS_ANALYZER_PARAMS=
# MyScale configuration
MYSCALE_HOST=127.0.0.1
@ -296,6 +297,7 @@ LINDORM_URL=http://ld-*******************-proxy-search-pub.lindorm.aliyuncs.com:
LINDORM_USERNAME=admin
LINDORM_PASSWORD=admin
USING_UGC_INDEX=False
LINDORM_QUERY_TIMEOUT=1
# OceanBase Vector configuration
OCEANBASE_VECTOR_HOST=127.0.0.1
@ -326,6 +328,7 @@ UPLOAD_AUDIO_FILE_SIZE_LIMIT=50
MULTIMODAL_SEND_FORMAT=base64
PROMPT_GENERATION_MAX_TOKENS=512
CODE_GENERATION_MAX_TOKENS=1024
PLUGIN_BASED_TOKEN_COUNTING_ENABLED=false
# Mail configuration, support: resend, smtp
MAIL_TYPE=
@ -422,6 +425,12 @@ WORKFLOW_CALL_MAX_DEPTH=5
WORKFLOW_PARALLEL_DEPTH_LIMIT=3
MAX_VARIABLE_SIZE=204800
# Workflow storage configuration
# Options: rdbms, hybrid
# rdbms: Use only the relational database (default)
# hybrid: Save new data to object storage, read from both object storage and RDBMS
WORKFLOW_NODE_EXECUTION_STORAGE=rdbms
# App configuration
APP_MAX_EXECUTION_TIME=1200
APP_MAX_ACTIVE_REQUESTS=0
@ -462,3 +471,19 @@ CREATE_TIDB_SERVICE_JOB_ENABLED=false
MAX_SUBMIT_COUNT=100
# Lockout duration in seconds
LOGIN_LOCKOUT_DURATION=86400
# Enable OpenTelemetry
ENABLE_OTEL=false
OTLP_BASE_ENDPOINT=http://localhost:4318
OTLP_API_KEY=
OTEL_EXPORTER_TYPE=otlp
OTEL_SAMPLING_RATE=0.1
OTEL_BATCH_EXPORT_SCHEDULE_DELAY=5000
OTEL_MAX_QUEUE_SIZE=2048
OTEL_MAX_EXPORT_BATCH_SIZE=512
OTEL_METRIC_EXPORT_INTERVAL=60000
OTEL_BATCH_EXPORT_TIMEOUT=10000
OTEL_METRIC_EXPORT_TIMEOUT=30000
# Prevent Clickjacking
ALLOW_EMBED=false
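As an illustration only, turning on the new OpenTelemetry switches against a local collector could look like the following; the endpoint and sampling rate are example values, not recommendations from this change.
```bash
# Example .env overrides for local tracing (illustrative values)
ENABLE_OTEL=true
OTLP_BASE_ENDPOINT=http://localhost:4318
OTEL_EXPORTER_TYPE=otlp
OTEL_SAMPLING_RATE=1.0
```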

View File

@ -3,20 +3,11 @@ FROM python:3.12-slim-bookworm AS base
WORKDIR /app/api
# Install Poetry
ENV POETRY_VERSION=2.0.1
# Install uv
ENV UV_VERSION=0.6.14
# if you are located in China, you can use the Aliyun mirror to speed up
# RUN pip install --no-cache-dir poetry==${POETRY_VERSION} -i https://mirrors.aliyun.com/pypi/simple/
RUN pip install --no-cache-dir uv==${UV_VERSION}
RUN pip install --no-cache-dir poetry==${POETRY_VERSION}
# Configure Poetry
ENV POETRY_CACHE_DIR=/tmp/poetry_cache
ENV POETRY_NO_INTERACTION=1
ENV POETRY_VIRTUALENVS_IN_PROJECT=true
ENV POETRY_VIRTUALENVS_CREATE=true
ENV POETRY_REQUESTS_TIMEOUT=15
FROM base AS packages
@ -27,8 +18,8 @@ RUN apt-get update \
&& apt-get install -y --no-install-recommends gcc g++ libc-dev libffi-dev libgmp-dev libmpfr-dev libmpc-dev
# Install Python dependencies
COPY pyproject.toml poetry.lock ./
RUN poetry install --sync --no-cache --no-root
COPY pyproject.toml uv.lock ./
RUN uv sync --locked
# production stage
FROM base AS production

View File

@ -3,7 +3,10 @@
## Usage
> [!IMPORTANT]
> In the v0.6.12 release, we deprecated `pip` as the package management tool for the Dify API backend service and replaced it with `poetry`.
>
> In the v1.3.0 release, `poetry` has been replaced with [`uv`](https://docs.astral.sh/uv/) as the package manager for the Dify API backend service.
1. Start the docker-compose stack
@ -37,19 +40,19 @@
4. Create environment.
Dify API service uses [Poetry](https://python-poetry.org/docs/) to manage dependencies. First, you need to add the poetry shell plugin, if you don't have it already, in order to run in a virtual environment. [Note: Poetry shell is no longer a native command so you need to install the poetry plugin beforehand]
The Dify API service uses [uv](https://docs.astral.sh/uv/) to manage dependencies.
First, install uv if you don't have it already.
```bash
poetry self add poetry-plugin-shell
pip install uv
# Or on macOS
brew install uv
```
Then, You can execute `poetry shell` to activate the environment.
5. Install dependencies
```bash
poetry env use 3.12
poetry install
uv sync --dev
```
6. Run migrate
@ -57,21 +60,21 @@
Before the first launch, migrate the database to the latest version.
```bash
poetry run python -m flask db upgrade
uv run flask db upgrade
```
7. Start backend
```bash
poetry run python -m flask run --host 0.0.0.0 --port=5001 --debug
uv run flask run --host 0.0.0.0 --port=5001 --debug
```
8. Start Dify [web](../web) service.
9. Setup your application by visiting `http://localhost:3000`...
9. Set up your application by visiting `http://localhost:3000`.
10. If you need to handle and debug the async tasks (e.g. dataset importing and document indexing), please start the worker service.
```bash
poetry run python -m celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail,ops_trace,app_deletion
uv run celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail,ops_trace,app_deletion
```
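Taken together, a fresh environment can be brought up with roughly the following condensed sequence; it restates steps 4-7 above and assumes the commands are run from the `api` directory with the docker-compose middleware already started.
```bash
pip install uv                                        # or: brew install uv
uv sync --dev                                         # install runtime and dev dependencies
uv run flask db upgrade                               # apply database migrations
uv run flask run --host 0.0.0.0 --port=5001 --debug   # start the API server
```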
## Testing
@ -79,11 +82,12 @@
1. Install dependencies for both the backend and the test environment
```bash
poetry install -C api --with dev
uv sync --dev
```
2. Run the tests locally with mocked system environment variables from the `tool.pytest_env` section of `pyproject.toml`
```bash
poetry run -P api bash dev/pytest/pytest_all_tests.sh
uv run --project api bash dev/pytest/pytest_all_tests.sh
```

View File

@ -18,7 +18,7 @@ else:
# so we need to disable gevent in debug mode.
# If you are using debugpy and set GEVENT_SUPPORT=True, you can debug with gevent.
if (flask_debug := os.environ.get("FLASK_DEBUG", "0")) and flask_debug.lower() in {"false", "0", "no"}:
from gevent import monkey # type: ignore
from gevent import monkey
# gevent
monkey.patch_all()

View File

@ -51,6 +51,7 @@ def initialize_extensions(app: DifyApp):
ext_login,
ext_mail,
ext_migrate,
ext_otel,
ext_proxy_fix,
ext_redis,
ext_sentry,
@ -81,6 +82,7 @@ def initialize_extensions(app: DifyApp):
ext_proxy_fix,
ext_blueprints,
ext_commands,
ext_otel,
]
for ext in extensions:
short_name = ext.__name__.split(".")[-1]

View File

@ -17,6 +17,7 @@ from core.rag.models.document import Document
from events.app_event import app_was_created
from extensions.ext_database import db
from extensions.ext_redis import redis_client
from extensions.ext_storage import storage
from libs.helper import email as email_validate
from libs.password import hash_password, password_pattern, valid_password
from libs.rsa import generate_key_pair
@ -271,6 +272,7 @@ def migrate_knowledge_vector_database():
upper_collection_vector_types = {
VectorType.MILVUS,
VectorType.PGVECTOR,
VectorType.VASTBASE,
VectorType.RELYT,
VectorType.WEAVIATE,
VectorType.ORACLE,
@ -442,13 +444,13 @@ def convert_to_agent_apps():
WHERE a.mode = 'chat'
AND am.agent_mode is not null
AND (
am.agent_mode like '%"strategy": "function_call"%'
OR am.agent_mode like '%"strategy": "react"%'
)
)
AND (
am.agent_mode like '{"enabled": true%'
OR am.agent_mode like '{"max_iteration": %'
) ORDER BY a.created_at DESC LIMIT 1000
"""
with db.engine.begin() as conn:
@ -666,7 +668,7 @@ def upgrade_db():
click.echo(click.style("Starting database migration.", fg="green"))
# run db migration
import flask_migrate # type: ignore
import flask_migrate
flask_migrate.upgrade()
@ -814,3 +816,331 @@ def clear_free_plan_tenant_expired_logs(days: int, batch: int, tenant_ids: list[
ClearFreePlanTenantExpiredLogs.process(days, batch, tenant_ids)
click.echo(click.style("Clear free plan tenant expired logs completed.", fg="green"))
@click.option("-f", "--force", is_flag=True, help="Skip user confirmation and force the command to execute.")
@click.command("clear-orphaned-file-records", help="Clear orphaned file records.")
def clear_orphaned_file_records(force: bool):
"""
Clear orphaned file records in the database.
"""
# define tables and columns to process
files_tables = [
{"table": "upload_files", "id_column": "id", "key_column": "key"},
{"table": "tool_files", "id_column": "id", "key_column": "file_key"},
]
ids_tables = [
{"type": "uuid", "table": "message_files", "column": "upload_file_id"},
{"type": "text", "table": "documents", "column": "data_source_info"},
{"type": "text", "table": "document_segments", "column": "content"},
{"type": "text", "table": "messages", "column": "answer"},
{"type": "text", "table": "workflow_node_executions", "column": "inputs"},
{"type": "text", "table": "workflow_node_executions", "column": "process_data"},
{"type": "text", "table": "workflow_node_executions", "column": "outputs"},
{"type": "text", "table": "conversations", "column": "introduction"},
{"type": "text", "table": "conversations", "column": "system_instruction"},
{"type": "json", "table": "messages", "column": "inputs"},
{"type": "json", "table": "messages", "column": "message"},
]
# notify user and ask for confirmation
click.echo(
click.style(
"This command will first find and delete orphaned file records from the message_files table,", fg="yellow"
)
)
click.echo(
click.style(
"and then it will find and delete orphaned file records in the following tables:",
fg="yellow",
)
)
for files_table in files_tables:
click.echo(click.style(f"- {files_table['table']}", fg="yellow"))
click.echo(
click.style("The following tables and columns will be scanned to find orphaned file records:", fg="yellow")
)
for ids_table in ids_tables:
click.echo(click.style(f"- {ids_table['table']} ({ids_table['column']})", fg="yellow"))
click.echo("")
click.echo(click.style("!!! USE WITH CAUTION !!!", fg="red"))
click.echo(
click.style(
(
"Since not all patterns have been fully tested, "
"please note that this command may delete unintended file records."
),
fg="yellow",
)
)
click.echo(
click.style("This cannot be undone. Please make sure to back up your database before proceeding.", fg="yellow")
)
click.echo(
click.style(
(
"It is also recommended to run this during the maintenance window, "
"as this may cause high load on your instance."
),
fg="yellow",
)
)
if not force:
click.confirm("Do you want to proceed?", abort=True)
# start the cleanup process
click.echo(click.style("Starting orphaned file records cleanup.", fg="white"))
# clean up the orphaned records in the message_files table where message_id doesn't exist in messages table
try:
click.echo(
click.style("- Listing message_files records where message_id doesn't exist in messages table", fg="white")
)
query = (
"SELECT mf.id, mf.message_id "
"FROM message_files mf LEFT JOIN messages m ON mf.message_id = m.id "
"WHERE m.id IS NULL"
)
orphaned_message_files = []
with db.engine.begin() as conn:
rs = conn.execute(db.text(query))
for i in rs:
orphaned_message_files.append({"id": str(i[0]), "message_id": str(i[1])})
if orphaned_message_files:
click.echo(click.style(f"Found {len(orphaned_message_files)} orphaned message_files records:", fg="white"))
for record in orphaned_message_files:
click.echo(click.style(f" - id: {record['id']}, message_id: {record['message_id']}", fg="black"))
if not force:
click.confirm(
(
f"Do you want to proceed "
f"to delete all {len(orphaned_message_files)} orphaned message_files records?"
),
abort=True,
)
click.echo(click.style("- Deleting orphaned message_files records", fg="white"))
query = "DELETE FROM message_files WHERE id IN :ids"
with db.engine.begin() as conn:
conn.execute(db.text(query), {"ids": tuple([record["id"] for record in orphaned_message_files])})
click.echo(
click.style(f"Removed {len(orphaned_message_files)} orphaned message_files records.", fg="green")
)
else:
click.echo(click.style("No orphaned message_files records found. There is nothing to delete.", fg="green"))
except Exception as e:
click.echo(click.style(f"Error deleting orphaned message_files records: {str(e)}", fg="red"))
# clean up the orphaned records in the rest of the *_files tables
try:
# fetch file id and keys from each table
all_files_in_tables = []
for files_table in files_tables:
click.echo(click.style(f"- Listing file records in table {files_table['table']}", fg="white"))
query = f"SELECT {files_table['id_column']}, {files_table['key_column']} FROM {files_table['table']}"
with db.engine.begin() as conn:
rs = conn.execute(db.text(query))
for i in rs:
all_files_in_tables.append({"table": files_table["table"], "id": str(i[0]), "key": i[1]})
click.echo(click.style(f"Found {len(all_files_in_tables)} files in tables.", fg="white"))
# fetch referred table and columns
guid_regexp = "[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}"
all_ids_in_tables = []
for ids_table in ids_tables:
query = ""
if ids_table["type"] == "uuid":
click.echo(
click.style(
f"- Listing file ids in column {ids_table['column']} in table {ids_table['table']}", fg="white"
)
)
query = (
f"SELECT {ids_table['column']} FROM {ids_table['table']} WHERE {ids_table['column']} IS NOT NULL"
)
with db.engine.begin() as conn:
rs = conn.execute(db.text(query))
for i in rs:
all_ids_in_tables.append({"table": ids_table["table"], "id": str(i[0])})
elif ids_table["type"] == "text":
click.echo(
click.style(
f"- Listing file-id-like strings in column {ids_table['column']} in table {ids_table['table']}",
fg="white",
)
)
query = (
f"SELECT regexp_matches({ids_table['column']}, '{guid_regexp}', 'g') AS extracted_id "
f"FROM {ids_table['table']}"
)
with db.engine.begin() as conn:
rs = conn.execute(db.text(query))
for i in rs:
for j in i[0]:
all_ids_in_tables.append({"table": ids_table["table"], "id": j})
elif ids_table["type"] == "json":
click.echo(
click.style(
(
f"- Listing file-id-like JSON string in column {ids_table['column']} "
f"in table {ids_table['table']}"
),
fg="white",
)
)
query = (
f"SELECT regexp_matches({ids_table['column']}::text, '{guid_regexp}', 'g') AS extracted_id "
f"FROM {ids_table['table']}"
)
with db.engine.begin() as conn:
rs = conn.execute(db.text(query))
for i in rs:
for j in i[0]:
all_ids_in_tables.append({"table": ids_table["table"], "id": j})
click.echo(click.style(f"Found {len(all_ids_in_tables)} file ids in tables.", fg="white"))
except Exception as e:
click.echo(click.style(f"Error fetching keys: {str(e)}", fg="red"))
return
# find orphaned files
all_files = [file["id"] for file in all_files_in_tables]
all_ids = [file["id"] for file in all_ids_in_tables]
orphaned_files = list(set(all_files) - set(all_ids))
if not orphaned_files:
click.echo(click.style("No orphaned file records found. There is nothing to delete.", fg="green"))
return
click.echo(click.style(f"Found {len(orphaned_files)} orphaned file records.", fg="white"))
for file in orphaned_files:
click.echo(click.style(f"- orphaned file id: {file}", fg="black"))
if not force:
click.confirm(f"Do you want to proceed to delete all {len(orphaned_files)} orphaned file records?", abort=True)
# delete orphaned records for each file
try:
for files_table in files_tables:
click.echo(click.style(f"- Deleting orphaned file records in table {files_table['table']}", fg="white"))
query = f"DELETE FROM {files_table['table']} WHERE {files_table['id_column']} IN :ids"
with db.engine.begin() as conn:
conn.execute(db.text(query), {"ids": tuple(orphaned_files)})
except Exception as e:
click.echo(click.style(f"Error deleting orphaned file records: {str(e)}", fg="red"))
return
click.echo(click.style(f"Removed {len(orphaned_files)} orphaned file records.", fg="green"))
@click.option("-f", "--force", is_flag=True, help="Skip user confirmation and force the command to execute.")
@click.command("remove-orphaned-files-on-storage", help="Remove orphaned files on the storage.")
def remove_orphaned_files_on_storage(force: bool):
"""
Remove orphaned files on the storage.
"""
# define tables and columns to process
files_tables = [
{"table": "upload_files", "key_column": "key"},
{"table": "tool_files", "key_column": "file_key"},
]
storage_paths = ["image_files", "tools", "upload_files"]
# notify user and ask for confirmation
click.echo(click.style("This command will find and remove orphaned files on the storage,", fg="yellow"))
click.echo(
click.style("by comparing the files on the storage with the records in the following tables:", fg="yellow")
)
for files_table in files_tables:
click.echo(click.style(f"- {files_table['table']}", fg="yellow"))
click.echo(click.style("The following paths on the storage will be scanned to find orphaned files:", fg="yellow"))
for storage_path in storage_paths:
click.echo(click.style(f"- {storage_path}", fg="yellow"))
click.echo("")
click.echo(click.style("!!! USE WITH CAUTION !!!", fg="red"))
click.echo(
click.style(
"Currently, this command will work only for opendal based storage (STORAGE_TYPE=opendal).", fg="yellow"
)
)
click.echo(
click.style(
"Since not all patterns have been fully tested, please note that this command may delete unintended files.",
fg="yellow",
)
)
click.echo(
click.style("This cannot be undone. Please make sure to back up your storage before proceeding.", fg="yellow")
)
click.echo(
click.style(
(
"It is also recommended to run this during the maintenance window, "
"as this may cause high load on your instance."
),
fg="yellow",
)
)
if not force:
click.confirm("Do you want to proceed?", abort=True)
# start the cleanup process
click.echo(click.style("Starting orphaned files cleanup.", fg="white"))
# fetch file id and keys from each table
all_files_in_tables = []
try:
for files_table in files_tables:
click.echo(click.style(f"- Listing files from table {files_table['table']}", fg="white"))
query = f"SELECT {files_table['key_column']} FROM {files_table['table']}"
with db.engine.begin() as conn:
rs = conn.execute(db.text(query))
for i in rs:
all_files_in_tables.append(str(i[0]))
click.echo(click.style(f"Found {len(all_files_in_tables)} files in tables.", fg="white"))
except Exception as e:
click.echo(click.style(f"Error fetching keys: {str(e)}", fg="red"))
all_files_on_storage = []
for storage_path in storage_paths:
try:
click.echo(click.style(f"- Scanning files on storage path {storage_path}", fg="white"))
files = storage.scan(path=storage_path, files=True, directories=False)
all_files_on_storage.extend(files)
except FileNotFoundError as e:
click.echo(click.style(f" -> Skipping path {storage_path} as it does not exist.", fg="yellow"))
continue
except Exception as e:
click.echo(click.style(f" -> Error scanning files on storage path {storage_path}: {str(e)}", fg="red"))
continue
click.echo(click.style(f"Found {len(all_files_on_storage)} files on storage.", fg="white"))
# find orphaned files
orphaned_files = list(set(all_files_on_storage) - set(all_files_in_tables))
if not orphaned_files:
click.echo(click.style("No orphaned files found. There is nothing to remove.", fg="green"))
return
click.echo(click.style(f"Found {len(orphaned_files)} orphaned files.", fg="white"))
for file in orphaned_files:
click.echo(click.style(f"- orphaned file: {file}", fg="black"))
if not force:
click.confirm(f"Do you want to proceed to remove all {len(orphaned_files)} orphaned files?", abort=True)
# delete orphaned files
removed_files = 0
error_files = 0
for file in orphaned_files:
try:
storage.delete(file)
removed_files += 1
click.echo(click.style(f"- Removing orphaned file: {file}", fg="white"))
except Exception as e:
error_files += 1
click.echo(click.style(f"- Error deleting orphaned file {file}: {str(e)}", fg="red"))
continue
if error_files == 0:
click.echo(click.style(f"Removed {removed_files} orphaned files without errors.", fg="green"))
else:
click.echo(click.style(f"Removed {removed_files} orphaned files, with {error_files} errors.", fg="yellow"))

View File

@ -9,9 +9,11 @@ from .enterprise import EnterpriseFeatureConfig
from .extra import ExtraServiceConfig
from .feature import FeatureConfig
from .middleware import MiddlewareConfig
from .observability import ObservabilityConfig
from .packaging import PackagingInfo
from .remote_settings_sources import RemoteSettingsSource, RemoteSettingsSourceConfig, RemoteSettingsSourceName
from .remote_settings_sources.apollo import ApolloSettingsSource
from .remote_settings_sources.nacos import NacosSettingsSource
logger = logging.getLogger(__name__)
@ -33,6 +35,8 @@ class RemoteSettingsSourceFactory(PydanticBaseSettingsSource):
match remote_source_name:
case RemoteSettingsSourceName.APOLLO:
remote_source = ApolloSettingsSource(current_state)
case RemoteSettingsSourceName.NACOS:
remote_source = NacosSettingsSource(current_state)
case _:
logger.warning(f"Unsupported remote source: {remote_source_name}")
return {}
@ -59,6 +63,8 @@ class DifyConfig(
MiddlewareConfig,
# Extra service configs
ExtraServiceConfig,
# Observability configs
ObservabilityConfig,
# Remote source configs
RemoteSettingsSourceConfig,
# Enterprise feature configs
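
A compact sketch of the source-name dispatch added above, using the same StrEnum values; the returned strings are stand-ins for the real ApolloSettingsSource/NacosSettingsSource classes, which need a live config server to initialize:

from enum import StrEnum


class RemoteSettingsSourceName(StrEnum):
    APOLLO = "apollo"
    NACOS = "nacos"


def pick_remote_source(name: str) -> str | None:
    # Mirrors the match statement in RemoteSettingsSourceFactory: unknown names
    # are ignored (a warning is logged) so local settings still load.
    match name:
        case RemoteSettingsSourceName.APOLLO:
            return "ApolloSettingsSource"
        case RemoteSettingsSourceName.NACOS:
            return "NacosSettingsSource"
        case _:
            return None


print(pick_remote_source("nacos"))  # NacosSettingsSource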

View File

@ -12,7 +12,7 @@ from pydantic import (
)
from pydantic_settings import BaseSettings
from configs.feature.hosted_service import HostedServiceConfig
from .hosted_service import HostedServiceConfig
class SecurityConfig(BaseSettings):
@ -398,6 +398,11 @@ class InnerAPIConfig(BaseSettings):
default=False,
)
INNER_API_KEY: Optional[str] = Field(
description="API key for accessing the internal API",
default=None,
)
class LoggingConfig(BaseSettings):
"""
@ -442,7 +447,7 @@ class LoggingConfig(BaseSettings):
class ModelLoadBalanceConfig(BaseSettings):
"""
Configuration for model load balancing
Configuration for model load balancing and token counting
"""
MODEL_LB_ENABLED: bool = Field(
@ -450,6 +455,11 @@ class ModelLoadBalanceConfig(BaseSettings):
default=False,
)
PLUGIN_BASED_TOKEN_COUNTING_ENABLED: bool = Field(
description="Enable or disable plugin based token counting. If disabled, token counting will return 0.",
default=False,
)
class BillingConfig(BaseSettings):
"""
@ -514,6 +524,11 @@ class WorkflowNodeExecutionConfig(BaseSettings):
default=100,
)
WORKFLOW_NODE_EXECUTION_STORAGE: str = Field(
default="rdbms",
description="Storage backend for WorkflowNodeExecution. Options: 'rdbms', 'hybrid'",
)
class AuthConfig(BaseSettings):
"""

View File

@ -22,6 +22,7 @@ from .vdb.baidu_vector_config import BaiduVectorDBConfig
from .vdb.chroma_config import ChromaConfig
from .vdb.couchbase_config import CouchbaseConfig
from .vdb.elasticsearch_config import ElasticsearchConfig
from .vdb.huawei_cloud_config import HuaweiCloudConfig
from .vdb.lindorm_config import LindormConfig
from .vdb.milvus_config import MilvusConfig
from .vdb.myscale_config import MyScaleConfig
@ -38,6 +39,7 @@ from .vdb.tencent_vector_config import TencentVectorDBConfig
from .vdb.tidb_on_qdrant_config import TidbOnQdrantConfig
from .vdb.tidb_vector_config import TiDBVectorConfig
from .vdb.upstash_config import UpstashConfig
from .vdb.vastbase_vector_config import VastbaseVectorConfig
from .vdb.vikingdb_config import VikingDBConfig
from .vdb.weaviate_config import WeaviateConfig
@ -263,11 +265,13 @@ class MiddlewareConfig(
VectorStoreConfig,
AnalyticdbConfig,
ChromaConfig,
HuaweiCloudConfig,
MilvusConfig,
MyScaleConfig,
OpenSearchConfig,
OracleConfig,
PGVectorConfig,
VastbaseVectorConfig,
PGVectoRSConfig,
QdrantConfig,
RelytConfig,

View File

@ -0,0 +1,25 @@
from typing import Optional
from pydantic import Field
from pydantic_settings import BaseSettings
class HuaweiCloudConfig(BaseSettings):
"""
Configuration settings for Huawei cloud search service
"""
HUAWEI_CLOUD_HOSTS: Optional[str] = Field(
description="Hostname or IP address of the Huawei cloud search service instance",
default=None,
)
HUAWEI_CLOUD_USER: Optional[str] = Field(
description="Username for authenticating with Huawei cloud search service",
default=None,
)
HUAWEI_CLOUD_PASSWORD: Optional[str] = Field(
description="Password for authenticating with Huawei cloud search service",
default=None,
)

View File

@ -32,3 +32,4 @@ class LindormConfig(BaseSettings):
description="Using UGC index will store the same type of Index in a single index but can retrieve separately.",
default=False,
)
LINDORM_QUERY_TIMEOUT: Optional[float] = Field(description="The lindorm search request timeout (s)", default=2.0)

View File

@ -39,3 +39,8 @@ class MilvusConfig(BaseSettings):
"older versions",
default=True,
)
MILVUS_ANALYZER_PARAMS: Optional[str] = Field(
description='Milvus text analyzer parameters, e.g., {"type": "chinese"} for Chinese segmentation support.',
default=None,
)

View File

@ -1,4 +1,5 @@
from typing import Optional
import enum
from typing import Literal, Optional
from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings
@ -9,6 +10,14 @@ class OpenSearchConfig(BaseSettings):
Configuration settings for OpenSearch
"""
class AuthMethod(enum.StrEnum):
"""
Authentication method for OpenSearch
"""
BASIC = "basic"
AWS_MANAGED_IAM = "aws_managed_iam"
OPENSEARCH_HOST: Optional[str] = Field(
description="Hostname or IP address of the OpenSearch server (e.g., 'localhost' or 'opensearch.example.com')",
default=None,
@ -19,6 +28,16 @@ class OpenSearchConfig(BaseSettings):
default=9200,
)
OPENSEARCH_SECURE: bool = Field(
description="Whether to use SSL/TLS encrypted connection for OpenSearch (True for HTTPS, False for HTTP)",
default=False,
)
OPENSEARCH_AUTH_METHOD: AuthMethod = Field(
description="Authentication method for OpenSearch connection (default is 'basic')",
default=AuthMethod.BASIC,
)
OPENSEARCH_USER: Optional[str] = Field(
description="Username for authenticating with OpenSearch",
default=None,
@ -29,7 +48,11 @@ class OpenSearchConfig(BaseSettings):
default=None,
)
OPENSEARCH_SECURE: bool = Field(
description="Whether to use SSL/TLS encrypted connection for OpenSearch (True for HTTPS, False for HTTP)",
default=False,
OPENSEARCH_AWS_REGION: Optional[str] = Field(
description="AWS region for OpenSearch (e.g. 'us-west-2')",
default=None,
)
OPENSEARCH_AWS_SERVICE: Optional[Literal["es", "aoss"]] = Field(
description="AWS service for OpenSearch (e.g. 'aoss' for OpenSearch Serverless)", default=None
)
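
A hedged sketch of how the new OPENSEARCH_AUTH_METHOD setting could drive client construction. build_client_kwargs is a hypothetical helper, and the AWS branch only shows which fields it would rely on; no opensearch-py or boto3 calls are made here:

import enum
from typing import Any


class AuthMethod(enum.StrEnum):
    BASIC = "basic"
    AWS_MANAGED_IAM = "aws_managed_iam"


def build_client_kwargs(config: dict[str, Any]) -> dict[str, Any]:
    # Hypothetical helper: translate the settings above into connection kwargs.
    kwargs: dict[str, Any] = {
        "hosts": [{"host": config["OPENSEARCH_HOST"], "port": config["OPENSEARCH_PORT"]}],
        "use_ssl": config["OPENSEARCH_SECURE"],
    }
    if config["OPENSEARCH_AUTH_METHOD"] == AuthMethod.BASIC:
        kwargs["http_auth"] = (config["OPENSEARCH_USER"], config["OPENSEARCH_PASSWORD"])
    else:
        # aws_managed_iam: a SigV4 signer built from OPENSEARCH_AWS_REGION and
        # OPENSEARCH_AWS_SERVICE ('es' or 'aoss') would go here.
        kwargs["aws_region"] = config["OPENSEARCH_AWS_REGION"]
        kwargs["aws_service"] = config["OPENSEARCH_AWS_SERVICE"]
    return kwargs


print(build_client_kwargs({
    "OPENSEARCH_HOST": "localhost",
    "OPENSEARCH_PORT": 9200,
    "OPENSEARCH_SECURE": False,
    "OPENSEARCH_AUTH_METHOD": AuthMethod.BASIC,
    "OPENSEARCH_USER": "admin",
    "OPENSEARCH_PASSWORD": "admin",
    "OPENSEARCH_AWS_REGION": None,
    "OPENSEARCH_AWS_SERVICE": None,
}))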

View File

@ -0,0 +1,45 @@
from typing import Optional
from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings
class VastbaseVectorConfig(BaseSettings):
"""
Configuration settings for the Vastbase vector store (Vastbase with vector extension)
"""
VASTBASE_HOST: Optional[str] = Field(
description="Hostname or IP address of the Vastbase server with Vector extension (e.g., 'localhost')",
default=None,
)
VASTBASE_PORT: PositiveInt = Field(
description="Port number on which the Vastbase server is listening (default is 5432)",
default=5432,
)
VASTBASE_USER: Optional[str] = Field(
description="Username for authenticating with the Vastbase database",
default=None,
)
VASTBASE_PASSWORD: Optional[str] = Field(
description="Password for authenticating with the Vastbase database",
default=None,
)
VASTBASE_DATABASE: Optional[str] = Field(
description="Name of the Vastbase database to connect to",
default=None,
)
VASTBASE_MIN_CONNECTION: PositiveInt = Field(
description="Min connection of the Vastbase database",
default=1,
)
VASTBASE_MAX_CONNECTION: PositiveInt = Field(
description="Max connection of the Vastbase database",
default=5,
)

View File

@ -0,0 +1,9 @@
from configs.observability.otel.otel_config import OTelConfig
class ObservabilityConfig(OTelConfig):
"""
Observability configuration settings
"""
pass

View File

@ -0,0 +1,44 @@
from pydantic import Field
from pydantic_settings import BaseSettings
class OTelConfig(BaseSettings):
"""
OpenTelemetry configuration settings
"""
ENABLE_OTEL: bool = Field(
description="Whether to enable OpenTelemetry",
default=False,
)
OTLP_BASE_ENDPOINT: str = Field(
description="OTLP base endpoint",
default="http://localhost:4318",
)
OTLP_API_KEY: str = Field(
description="OTLP API key",
default="",
)
OTEL_EXPORTER_TYPE: str = Field(
description="OTEL exporter type",
default="otlp",
)
OTEL_SAMPLING_RATE: float = Field(default=0.1, description="Sampling rate for traces (0.0 to 1.0)")
OTEL_BATCH_EXPORT_SCHEDULE_DELAY: int = Field(
default=5000, description="Batch export schedule delay in milliseconds"
)
OTEL_MAX_QUEUE_SIZE: int = Field(default=2048, description="Maximum queue size for the batch span processor")
OTEL_MAX_EXPORT_BATCH_SIZE: int = Field(default=512, description="Maximum export batch size")
OTEL_METRIC_EXPORT_INTERVAL: int = Field(default=60000, description="Metric export interval in milliseconds")
OTEL_BATCH_EXPORT_TIMEOUT: int = Field(default=10000, description="Batch export timeout in milliseconds")
OTEL_METRIC_EXPORT_TIMEOUT: int = Field(default=30000, description="Metric export timeout in milliseconds")
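
One plausible way the OTLP trace settings above could be wired into the OpenTelemetry SDK. This is a sketch only: it assumes the opentelemetry-sdk and opentelemetry-exporter-otlp-proto-http packages, a /v1/traces path appended to OTLP_BASE_ENDPOINT, and a bearer-token header for OTLP_API_KEY, none of which this diff confirms:

from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.sdk.trace.sampling import TraceIdRatioBased

# Values mirror the defaults declared in OTelConfig above.
provider = TracerProvider(sampler=TraceIdRatioBased(0.1))
exporter = OTLPSpanExporter(
    endpoint="http://localhost:4318/v1/traces",    # OTLP_BASE_ENDPOINT + assumed path
    headers={"Authorization": "Bearer changeme"},  # assumed use of OTLP_API_KEY
)
provider.add_span_processor(
    BatchSpanProcessor(
        exporter,
        max_queue_size=2048,          # OTEL_MAX_QUEUE_SIZE
        schedule_delay_millis=5000,   # OTEL_BATCH_EXPORT_SCHEDULE_DELAY
        max_export_batch_size=512,    # OTEL_MAX_EXPORT_BATCH_SIZE
        export_timeout_millis=10000,  # OTEL_BATCH_EXPORT_TIMEOUT
    )
)
trace.set_tracer_provider(provider)

with trace.get_tracer(__name__).start_as_current_span("demo-span"):
    pass  # spans are batched and exported to the OTLP endpoint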

View File

@ -9,7 +9,7 @@ class PackagingInfo(BaseSettings):
CURRENT_VERSION: str = Field(
description="Dify version",
default="1.2.0",
default="1.3.1",
)
COMMIT_SHA: str = Field(

View File

@ -270,7 +270,7 @@ class ApolloClient:
while not self._stopping:
for namespace in self._notification_map:
self._do_heart_beat(namespace)
time.sleep(60 * 10) # 10分钟
time.sleep(60 * 10) # 10 minutes
def _do_heart_beat(self, namespace):
url = "{}/configs/{}/{}/{}?ip={}".format(self.config_url, self.app_id, self.cluster, namespace, self.ip)

View File

@ -3,3 +3,4 @@ from enum import StrEnum
class RemoteSettingsSourceName(StrEnum):
APOLLO = "apollo"
NACOS = "nacos"

View File

@ -0,0 +1,52 @@
import logging
import os
from collections.abc import Mapping
from typing import Any
from pydantic.fields import FieldInfo
from .http_request import NacosHttpClient
logger = logging.getLogger(__name__)
from configs.remote_settings_sources.base import RemoteSettingsSource
from .utils import _parse_config
class NacosSettingsSource(RemoteSettingsSource):
def __init__(self, configs: Mapping[str, Any]):
self.configs = configs
self.remote_configs: dict[str, Any] = {}
self.async_init()
def async_init(self):
data_id = os.getenv("DIFY_ENV_NACOS_DATA_ID", "dify-api-env.properties")
group = os.getenv("DIFY_ENV_NACOS_GROUP", "nacos-dify")
tenant = os.getenv("DIFY_ENV_NACOS_NAMESPACE", "")
params = {"dataId": data_id, "group": group, "tenant": tenant}
try:
content = NacosHttpClient().http_request("/nacos/v1/cs/configs", method="GET", headers={}, params=params)
self.remote_configs = self._parse_config(content)
except Exception as e:
logger.exception("[get-access-token] exception occurred")
raise
def _parse_config(self, content: str) -> dict:
if not content:
return {}
try:
return _parse_config(self, content)
except Exception as e:
raise RuntimeError(f"Failed to parse config: {e}")
def get_field_value(self, field: FieldInfo, field_name: str) -> tuple[Any, str, bool]:
if not isinstance(self.remote_configs, dict):
raise ValueError(f"remote configs is not dict, but {type(self.remote_configs)}")
field_value = self.remote_configs.get(field_name)
if field_value is None:
return None, field_name, False
return field_value, field_name, False

View File

@ -0,0 +1,83 @@
import base64
import hashlib
import hmac
import logging
import os
import time
import requests
logger = logging.getLogger(__name__)
class NacosHttpClient:
def __init__(self):
self.username = os.getenv("DIFY_ENV_NACOS_USERNAME")
self.password = os.getenv("DIFY_ENV_NACOS_PASSWORD")
self.ak = os.getenv("DIFY_ENV_NACOS_ACCESS_KEY")
self.sk = os.getenv("DIFY_ENV_NACOS_SECRET_KEY")
self.server = os.getenv("DIFY_ENV_NACOS_SERVER_ADDR", "localhost:8848")
self.token = None
self.token_ttl = 18000
self.token_expire_time: float = 0
def http_request(self, url, method="GET", headers=None, params=None):
try:
self._inject_auth_info(headers, params)
response = requests.request(method, url="http://" + self.server + url, headers=headers, params=params)
response.raise_for_status()
return response.text
except requests.exceptions.RequestException as e:
return f"Request to Nacos failed: {e}"
def _inject_auth_info(self, headers, params, module="config"):
headers.update({"User-Agent": "Nacos-Http-Client-In-Dify:v0.0.1"})
if module == "login":
return
ts = str(int(time.time() * 1000))
if self.ak and self.sk:
sign_str = self.get_sign_str(params["group"], params["tenant"], ts)
headers["Spas-AccessKey"] = self.ak
headers["Spas-Signature"] = self.__do_sign(sign_str, self.sk)
headers["timeStamp"] = ts
if self.username and self.password:
self.get_access_token(force_refresh=False)
params["accessToken"] = self.token
def __do_sign(self, sign_str, sk):
return (
base64.encodebytes(hmac.new(sk.encode(), sign_str.encode(), digestmod=hashlib.sha1).digest())
.decode()
.strip()
)
def get_sign_str(self, group, tenant, ts):
sign_str = ""
if tenant:
sign_str = tenant + "+"
if group:
sign_str = sign_str + group + "+"
if sign_str:
sign_str += ts
return sign_str
def get_access_token(self, force_refresh=False):
current_time = time.time()
if self.token and not force_refresh and self.token_expire_time > current_time:
return self.token
params = {"username": self.username, "password": self.password}
url = "http://" + self.server + "/nacos/v1/auth/login"
try:
resp = requests.request("POST", url, headers=None, params=params)
resp.raise_for_status()
response_data = resp.json()
self.token = response_data.get("accessToken")
self.token_ttl = response_data.get("tokenTtl", 18000)
self.token_expire_time = current_time + self.token_ttl - 10
except Exception as e:
logger.exception("[get-access-token] exception occur")
raise
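
The Spas-Signature computed in _inject_auth_info is an HMAC-SHA1 over "tenant+group+timestamp", base64-encoded. A standalone reproduction with made-up credentials:

import base64
import hashlib
import hmac
import time

# Made-up access/secret key pair for illustration only.
access_key, secret_key = "demo-ak", "demo-sk"
group, tenant = "nacos-dify", "public"

ts = str(int(time.time() * 1000))
sign_str = f"{tenant}+{group}+{ts}"  # same concatenation as get_sign_str builds

signature = (
    base64.encodebytes(hmac.new(secret_key.encode(), sign_str.encode(), digestmod=hashlib.sha1).digest())
    .decode()
    .strip()
)
headers = {"Spas-AccessKey": access_key, "Spas-Signature": signature, "timeStamp": ts}
print(headers)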

View File

@ -0,0 +1,31 @@
def _parse_config(self, content: str) -> dict[str, str]:
config: dict[str, str] = {}
if not content:
return config
for line in content.splitlines():
cleaned_line = line.strip()
if not cleaned_line or cleaned_line.startswith(("#", "!")):
continue
separator_index = -1
for i, c in enumerate(cleaned_line):
if c in ("=", ":") and (i == 0 or cleaned_line[i - 1] != "\\"):
separator_index = i
break
if separator_index == -1:
continue
key = cleaned_line[:separator_index].strip()
raw_value = cleaned_line[separator_index + 1 :].strip()
try:
decoded_value = bytes(raw_value, "utf-8").decode("unicode_escape")
decoded_value = decoded_value.replace(r"\=", "=").replace(r"\:", ":")
except UnicodeDecodeError:
decoded_value = raw_value
config[key] = decoded_value
return config
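
The parser above follows Java .properties conventions: '#' or '!' comments, '=' or ':' separators, and backslash escapes. It is written as a free function whose first parameter happens to be named self, so a call simply passes a placeholder for it (this sketch assumes the _parse_config helper shown above is in scope):

sample = """
# comment line
! another comment
CONSOLE_WEB_URL=http\\://localhost\\:3000
LOG_LEVEL: INFO
"""

# The first argument is unused by the function body; None stands in for it.
print(_parse_config(None, sample))
# {'CONSOLE_WEB_URL': 'http://localhost:3000', 'LOG_LEVEL': 'INFO'}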

View File

@ -3,6 +3,8 @@ from configs import dify_config
HIDDEN_VALUE = "[__HIDDEN__]"
UUID_NIL = "00000000-0000-0000-0000-000000000000"
DEFAULT_FILE_NUMBER_LIMITS = 3
IMAGE_EXTENSIONS = ["jpg", "jpeg", "png", "webp", "gif", "svg"]
IMAGE_EXTENSIONS.extend([ext.upper() for ext in IMAGE_EXTENSIONS])
@ -14,11 +16,25 @@ AUDIO_EXTENSIONS.extend([ext.upper() for ext in AUDIO_EXTENSIONS])
if dify_config.ETL_TYPE == "Unstructured":
DOCUMENT_EXTENSIONS = ["txt", "markdown", "md", "mdx", "pdf", "html", "htm", "xlsx", "xls"]
DOCUMENT_EXTENSIONS = ["txt", "markdown", "md", "mdx", "pdf", "html", "htm", "xlsx", "xls", "vtt", "properties"]
DOCUMENT_EXTENSIONS.extend(("doc", "docx", "csv", "eml", "msg", "pptx", "xml", "epub"))
if dify_config.UNSTRUCTURED_API_URL:
DOCUMENT_EXTENSIONS.append("ppt")
DOCUMENT_EXTENSIONS.extend([ext.upper() for ext in DOCUMENT_EXTENSIONS])
else:
DOCUMENT_EXTENSIONS = ["txt", "markdown", "md", "mdx", "pdf", "html", "htm", "xlsx", "xls", "docx", "csv"]
DOCUMENT_EXTENSIONS = [
"txt",
"markdown",
"md",
"mdx",
"pdf",
"html",
"htm",
"xlsx",
"xls",
"docx",
"csv",
"vtt",
"properties",
]
DOCUMENT_EXTENSIONS.extend([ext.upper() for ext in DOCUMENT_EXTENSIONS])

View File

@ -0,0 +1,7 @@
# The two constants below should be kept in sync.
# Default content type for files which have no explicit content type.
DEFAULT_MIME_TYPE = "application/octet-stream"
# Default file extension for files which have no explicit content type, should
# correspond to the `DEFAULT_MIME_TYPE` above.
DEFAULT_EXTENSION = ".bin"

View File

@ -1,4 +1,4 @@
from flask_restful import fields # type: ignore
from flask_restful import fields
parameters__system_parameters = {
"image_file_size_limit": fields.Integer,

View File

@ -4,8 +4,6 @@ import platform
import re
import urllib.parse
import warnings
from collections.abc import Mapping
from typing import Any
from uuid import uuid4
import httpx
@ -29,8 +27,6 @@ except ImportError:
from pydantic import BaseModel
from configs import dify_config
class FileInfo(BaseModel):
filename: str
@ -87,38 +83,3 @@ def guess_file_info_from_response(response: httpx.Response):
mimetype=mimetype,
size=int(response.headers.get("Content-Length", -1)),
)
def get_parameters_from_feature_dict(*, features_dict: Mapping[str, Any], user_input_form: list[dict[str, Any]]):
return {
"opening_statement": features_dict.get("opening_statement"),
"suggested_questions": features_dict.get("suggested_questions", []),
"suggested_questions_after_answer": features_dict.get("suggested_questions_after_answer", {"enabled": False}),
"speech_to_text": features_dict.get("speech_to_text", {"enabled": False}),
"text_to_speech": features_dict.get("text_to_speech", {"enabled": False}),
"retriever_resource": features_dict.get("retriever_resource", {"enabled": False}),
"annotation_reply": features_dict.get("annotation_reply", {"enabled": False}),
"more_like_this": features_dict.get("more_like_this", {"enabled": False}),
"user_input_form": user_input_form,
"sensitive_word_avoidance": features_dict.get(
"sensitive_word_avoidance", {"enabled": False, "type": "", "configs": []}
),
"file_upload": features_dict.get(
"file_upload",
{
"image": {
"enabled": False,
"number_limits": 3,
"detail": "high",
"transfer_methods": ["remote_url", "local_file"],
}
},
),
"system_parameters": {
"image_file_size_limit": dify_config.UPLOAD_IMAGE_FILE_SIZE_LIMIT,
"video_file_size_limit": dify_config.UPLOAD_VIDEO_FILE_SIZE_LIMIT,
"audio_file_size_limit": dify_config.UPLOAD_AUDIO_FILE_SIZE_LIMIT,
"file_size_limit": dify_config.UPLOAD_FILE_SIZE_LIMIT,
"workflow_file_upload_limit": dify_config.WORKFLOW_FILE_UPLOAD_LIMIT,
},
}

View File

@ -1,7 +1,7 @@
from functools import wraps
from flask import request
from flask_restful import Resource, reqparse # type: ignore
from flask_restful import Resource, reqparse
from sqlalchemy import select
from sqlalchemy.orm import Session
from werkzeug.exceptions import NotFound, Unauthorized

View File

@ -1,7 +1,7 @@
from typing import Any
import flask_restful # type: ignore
from flask_login import current_user # type: ignore
import flask_restful
from flask_login import current_user
from flask_restful import Resource, fields, marshal_with
from sqlalchemy import select
from sqlalchemy.orm import Session

View File

@ -1,4 +1,4 @@
from flask_restful import Resource, reqparse # type: ignore
from flask_restful import Resource, reqparse
from controllers.console import api
from controllers.console.wraps import account_initialization_required, setup_required

View File

@ -1,4 +1,4 @@
from flask_restful import Resource, reqparse # type: ignore
from flask_restful import Resource, reqparse
from controllers.console import api
from controllers.console.app.wraps import get_app_model

View File

@ -1,6 +1,6 @@
from flask import request
from flask_login import current_user # type: ignore
from flask_restful import Resource, marshal, marshal_with, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, marshal, marshal_with, reqparse
from werkzeug.exceptions import Forbidden
from controllers.console import api
@ -89,7 +89,7 @@ class AnnotationReplyActionStatusApi(Resource):
app_annotation_job_key = "{}_app_annotation_job_{}".format(action, str(job_id))
cache_result = redis_client.get(app_annotation_job_key)
if cache_result is None:
raise ValueError("The job is not exist.")
raise ValueError("The job does not exist.")
job_status = cache_result.decode()
error_msg = ""
@ -186,7 +186,7 @@ class AnnotationUpdateDeleteApi(Resource):
app_id = str(app_id)
annotation_id = str(annotation_id)
AppAnnotationService.delete_app_annotation(app_id, annotation_id)
return {"result": "success"}, 200
return {"result": "success"}, 204
class AnnotationBatchImportApi(Resource):
@ -226,7 +226,7 @@ class AnnotationBatchImportStatusApi(Resource):
indexing_cache_key = "app_annotation_batch_import_{}".format(str(job_id))
cache_result = redis_client.get(indexing_cache_key)
if cache_result is None:
raise ValueError("The job is not exist.")
raise ValueError("The job does not exist.")
job_status = cache_result.decode()
error_msg = ""
if job_status == "error":

View File

@ -1,8 +1,8 @@
import uuid
from typing import cast
from flask_login import current_user # type: ignore
from flask_restful import Resource, inputs, marshal, marshal_with, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, inputs, marshal, marshal_with, reqparse
from sqlalchemy import select
from sqlalchemy.orm import Session
from werkzeug.exceptions import BadRequest, Forbidden, abort

View File

@ -1,7 +1,7 @@
from typing import cast
from flask_login import current_user # type: ignore
from flask_restful import Resource, marshal_with, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, marshal_with, reqparse
from sqlalchemy.orm import Session
from werkzeug.exceptions import Forbidden

View File

@ -1,7 +1,7 @@
import logging
from flask import request
from flask_restful import Resource, reqparse # type: ignore
from flask_restful import Resource, reqparse
from werkzeug.exceptions import InternalServerError
import services
@ -80,8 +80,6 @@ class ChatMessageTextApi(Resource):
@account_initialization_required
@get_app_model
def post(self, app_model: App):
from werkzeug.exceptions import InternalServerError
try:
parser = reqparse.RequestParser()
parser.add_argument("message_id", type=str, location="json")

View File

@ -1,7 +1,7 @@
import logging
import flask_login # type: ignore
from flask_restful import Resource, reqparse # type: ignore
import flask_login
from flask_restful import Resource, reqparse
from werkzeug.exceptions import InternalServerError, NotFound
import services

View File

@ -1,9 +1,9 @@
from datetime import UTC, datetime
import pytz # pip install pytz
from flask_login import current_user # type: ignore
from flask_restful import Resource, marshal_with, reqparse # type: ignore
from flask_restful.inputs import int_range # type: ignore
from flask_login import current_user
from flask_restful import Resource, marshal_with, reqparse
from flask_restful.inputs import int_range
from sqlalchemy import func, or_
from sqlalchemy.orm import joinedload
from werkzeug.exceptions import Forbidden, NotFound

View File

@ -1,4 +1,4 @@
from flask_restful import Resource, marshal_with, reqparse # type: ignore
from flask_restful import Resource, marshal_with, reqparse
from sqlalchemy import select
from sqlalchemy.orm import Session

View File

@ -1,7 +1,7 @@
import os
from flask_login import current_user # type: ignore
from flask_restful import Resource, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, reqparse
from controllers.console import api
from controllers.console.app.error import (
@ -85,5 +85,35 @@ class RuleCodeGenerateApi(Resource):
return code_result
class RuleStructuredOutputGenerateApi(Resource):
@setup_required
@login_required
@account_initialization_required
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("instruction", type=str, required=True, nullable=False, location="json")
parser.add_argument("model_config", type=dict, required=True, nullable=False, location="json")
args = parser.parse_args()
account = current_user
try:
structured_output = LLMGenerator.generate_structured_output(
tenant_id=account.current_tenant_id,
instruction=args["instruction"],
model_config=args["model_config"],
)
except ProviderTokenNotInitError as ex:
raise ProviderNotInitializeError(ex.description)
except QuotaExceededError:
raise ProviderQuotaExceededError()
except ModelCurrentlyNotSupportError:
raise ProviderModelCurrentlyNotSupportError()
except InvokeError as e:
raise CompletionRequestError(e.description)
return structured_output
api.add_resource(RuleGenerateApi, "/rule-generate")
api.add_resource(RuleCodeGenerateApi, "/rule-code-generate")
api.add_resource(RuleStructuredOutputGenerateApi, "/rule-structured-output-generate")
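
Shape of a request the new endpoint expects, per the reqparse arguments above (instruction and model_config, both required, JSON body). The base URL, console API prefix, bearer header, and the model_config layout are assumptions for illustration; the real console API uses an authenticated session:

import requests

resp = requests.post(
    "http://localhost:5001/console/api/rule-structured-output-generate",  # assumed prefix
    headers={"Authorization": "Bearer <console-token>"},                  # assumed auth
    json={
        "instruction": "Extract the invoice number and total amount.",
        "model_config": {"provider": "openai", "name": "gpt-4o", "mode": "chat", "completion_params": {}},
    },
    timeout=30,
)
print(resp.status_code, resp.json())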

View File

@ -1,8 +1,8 @@
import logging
from flask_login import current_user # type: ignore
from flask_restful import Resource, fields, marshal_with, reqparse # type: ignore
from flask_restful.inputs import int_range # type: ignore
from flask_login import current_user
from flask_restful import Resource, fields, marshal_with, reqparse
from flask_restful.inputs import int_range
from werkzeug.exceptions import Forbidden, InternalServerError, NotFound
from controllers.console import api

View File

@ -2,8 +2,8 @@ import json
from typing import cast
from flask import request
from flask_login import current_user # type: ignore
from flask_restful import Resource # type: ignore
from flask_login import current_user
from flask_restful import Resource
from controllers.console import api
from controllers.console.app.wraps import get_app_model

View File

@ -1,4 +1,4 @@
from flask_restful import Resource, reqparse # type: ignore
from flask_restful import Resource, reqparse
from werkzeug.exceptions import BadRequest
from controllers.console import api
@ -84,7 +84,7 @@ class TraceAppConfigApi(Resource):
result = OpsService.delete_tracing_app_config(app_id=app_id, tracing_provider=args["tracing_provider"])
if not result:
raise TracingConfigNotExist()
return {"result": "success"}
return {"result": "success"}, 204
except Exception as e:
raise BadRequest(str(e))

View File

@ -1,7 +1,7 @@
from datetime import UTC, datetime
from flask_login import current_user # type: ignore
from flask_restful import Resource, marshal_with, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, marshal_with, reqparse
from werkzeug.exceptions import Forbidden, NotFound
from constants.languages import supported_language

View File

@ -3,8 +3,8 @@ from decimal import Decimal
import pytz
from flask import jsonify
from flask_login import current_user # type: ignore
from flask_restful import Resource, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, reqparse
from controllers.console import api
from controllers.console.app.wraps import get_app_model

View File

@ -3,7 +3,7 @@ import logging
from typing import cast
from flask import abort, request
from flask_restful import Resource, inputs, marshal_with, reqparse # type: ignore
from flask_restful import Resource, inputs, marshal_with, reqparse
from sqlalchemy.orm import Session
from werkzeug.exceptions import Forbidden, InternalServerError, NotFound

View File

@ -1,7 +1,6 @@
from datetime import datetime
from flask_restful import Resource, marshal_with, reqparse # type: ignore
from flask_restful.inputs import int_range # type: ignore
from dateutil.parser import isoparse
from flask_restful import Resource, marshal_with, reqparse
from flask_restful.inputs import int_range
from sqlalchemy.orm import Session
from controllers.console import api
@ -41,10 +40,10 @@ class WorkflowAppLogApi(Resource):
args.status = WorkflowRunStatus(args.status) if args.status else None
if args.created_at__before:
args.created_at__before = datetime.fromisoformat(args.created_at__before.replace("Z", "+00:00"))
args.created_at__before = isoparse(args.created_at__before)
if args.created_at__after:
args.created_at__after = datetime.fromisoformat(args.created_at__after.replace("Z", "+00:00"))
args.created_at__after = isoparse(args.created_at__after)
# get paginate workflow app logs
workflow_app_service = WorkflowAppService()
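
The switch to dateutil's isoparse removes the manual "Z" to "+00:00" replacement: isoparse accepts the trailing "Z" directly and returns a timezone-aware datetime, whereas datetime.fromisoformat only accepts "Z" from Python 3.11 onward. For example:

from dateutil.parser import isoparse

dt = isoparse("2025-05-06T13:24:53Z")
print(dt)         # 2025-05-06 13:24:53+00:00
print(dt.tzinfo)  # tzutc()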

View File

@ -1,5 +1,5 @@
from flask_restful import Resource, marshal_with, reqparse # type: ignore
from flask_restful.inputs import int_range # type: ignore
from flask_restful import Resource, marshal_with, reqparse
from flask_restful.inputs import int_range
from controllers.console import api
from controllers.console.app.wraps import get_app_model

View File

@ -3,8 +3,8 @@ from decimal import Decimal
import pytz
from flask import jsonify
from flask_login import current_user # type: ignore
from flask_restful import Resource, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, reqparse
from controllers.console import api
from controllers.console.app.wraps import get_app_model

View File

@ -1,7 +1,7 @@
import datetime
from flask import request
from flask_restful import Resource, reqparse # type: ignore
from flask_restful import Resource, reqparse
from constants.languages import supported_language
from controllers.console import api

View File

@ -1,5 +1,5 @@
from flask_login import current_user # type: ignore
from flask_restful import Resource, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, reqparse
from werkzeug.exceptions import Forbidden
from controllers.console import api
@ -65,7 +65,7 @@ class ApiKeyAuthDataSourceBindingDelete(Resource):
ApiKeyAuthService.delete_provider_auth(current_user.current_tenant_id, binding_id)
return {"result": "success"}, 200
return {"result": "success"}, 204
api.add_resource(ApiKeyAuthDataSource, "/api-key-auth/data-source")

View File

@ -2,8 +2,8 @@ import logging
import requests
from flask import current_app, redirect, request
from flask_login import current_user # type: ignore
from flask_restful import Resource # type: ignore
from flask_login import current_user
from flask_restful import Resource
from werkzeug.exceptions import Forbidden
from configs import dify_config
@ -74,7 +74,9 @@ class OAuthDataSourceBinding(Resource):
if not oauth_provider:
return {"error": "Invalid provider"}, 400
if "code" in request.args:
code = request.args.get("code")
code = request.args.get("code", "")
if not code:
return {"error": "Invalid code"}, 400
try:
oauth_provider.get_access_token(code)
except requests.exceptions.HTTPError as e:

View File

@ -2,7 +2,7 @@ import base64
import secrets
from flask import request
from flask_restful import Resource, reqparse # type: ignore
from flask_restful import Resource, reqparse
from sqlalchemy import select
from sqlalchemy.orm import Session
@ -16,7 +16,7 @@ from controllers.console.auth.error import (
PasswordMismatchError,
)
from controllers.console.error import AccountInFreezeError, AccountNotFound, EmailSendIpLimitError
from controllers.console.wraps import setup_required
from controllers.console.wraps import email_password_login_enabled, setup_required
from events.tenant_event import tenant_was_created
from extensions.ext_database import db
from libs.helper import email, extract_remote_ip
@ -30,6 +30,7 @@ from services.feature_service import FeatureService
class ForgotPasswordSendEmailApi(Resource):
@setup_required
@email_password_login_enabled
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("email", type=email, required=True, location="json")
@ -62,6 +63,7 @@ class ForgotPasswordSendEmailApi(Resource):
class ForgotPasswordCheckApi(Resource):
@setup_required
@email_password_login_enabled
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("email", type=str, required=True, location="json")
@ -86,12 +88,21 @@ class ForgotPasswordCheckApi(Resource):
AccountService.add_forgot_password_error_rate_limit(args["email"])
raise EmailCodeError()
# Verified, revoke the first token
AccountService.revoke_reset_password_token(args["token"])
# Refresh token data by generating a new token
_, new_token = AccountService.generate_reset_password_token(
user_email, code=args["code"], additional_data={"phase": "reset"}
)
AccountService.reset_forgot_password_error_rate_limit(args["email"])
return {"is_valid": True, "email": token_data.get("email")}
return {"is_valid": True, "email": token_data.get("email"), "token": new_token}
class ForgotPasswordResetApi(Resource):
@setup_required
@email_password_login_enabled
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("token", type=str, required=True, nullable=False, location="json")
@ -107,6 +118,9 @@ class ForgotPasswordResetApi(Resource):
reset_data = AccountService.get_reset_password_data(args["token"])
if not reset_data:
raise InvalidTokenError()
# Must use token in reset phase
if reset_data.get("phase", "") != "reset":
raise InvalidTokenError()
# Revoke token to prevent reuse
AccountService.revoke_reset_password_token(args["token"])

View File

@ -1,8 +1,8 @@
from typing import cast
import flask_login # type: ignore
import flask_login
from flask import request
from flask_restful import Resource, reqparse # type: ignore
from flask_restful import Resource, reqparse
import services
from configs import dify_config
@ -22,7 +22,7 @@ from controllers.console.error import (
EmailSendIpLimitError,
NotAllowedCreateWorkspace,
)
from controllers.console.wraps import setup_required
from controllers.console.wraps import email_password_login_enabled, setup_required
from events.tenant_event import tenant_was_created
from libs.helper import email, extract_remote_ip
from libs.password import valid_password
@ -38,6 +38,7 @@ class LoginApi(Resource):
"""Resource for user login."""
@setup_required
@email_password_login_enabled
def post(self):
"""Authenticate user and login."""
parser = reqparse.RequestParser()
@ -110,6 +111,7 @@ class LogoutApi(Resource):
class ResetPasswordSendEmailApi(Resource):
@setup_required
@email_password_login_enabled
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("email", type=email, required=True, location="json")

View File

@ -4,7 +4,7 @@ from typing import Optional
import requests
from flask import current_app, redirect, request
from flask_restful import Resource # type: ignore
from flask_restful import Resource
from sqlalchemy import select
from sqlalchemy.orm import Session
from werkzeug.exceptions import Unauthorized

View File

@ -1,5 +1,5 @@
from flask_login import current_user # type: ignore
from flask_restful import Resource, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, reqparse
from controllers.console import api
from controllers.console.wraps import account_initialization_required, only_edition_cloud, setup_required

View File

@ -1,6 +1,6 @@
from flask import request
from flask_login import current_user # type: ignore
from flask_restful import Resource, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, reqparse
from libs.helper import extract_remote_ip
from libs.login import login_required

View File

@ -2,8 +2,8 @@ import datetime
import json
from flask import request
from flask_login import current_user # type: ignore
from flask_restful import Resource, marshal_with, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, marshal_with, reqparse
from sqlalchemy import select
from sqlalchemy.orm import Session
from werkzeug.exceptions import NotFound

View File

@ -1,7 +1,7 @@
import flask_restful # type: ignore
import flask_restful
from flask import request
from flask_login import current_user # type: ignore # type: ignore
from flask_restful import Resource, marshal, marshal_with, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, marshal, marshal_with, reqparse
from werkzeug.exceptions import Forbidden, NotFound
import services
@ -657,6 +657,7 @@ class DatasetRetrievalSettingApi(Resource):
| VectorType.ELASTICSEARCH
| VectorType.ELASTICSEARCH_JA
| VectorType.PGVECTOR
| VectorType.VASTBASE
| VectorType.TIDB_ON_QDRANT
| VectorType.LINDORM
| VectorType.COUCHBASE
@ -664,6 +665,7 @@ class DatasetRetrievalSettingApi(Resource):
| VectorType.OPENGAUSS
| VectorType.OCEANBASE
| VectorType.TABLESTORE
| VectorType.HUAWEI_CLOUD
| VectorType.TENCENT
):
return {
@ -705,11 +707,13 @@ class DatasetRetrievalSettingMockApi(Resource):
| VectorType.ELASTICSEARCH_JA
| VectorType.COUCHBASE
| VectorType.PGVECTOR
| VectorType.VASTBASE
| VectorType.LINDORM
| VectorType.OPENGAUSS
| VectorType.OCEANBASE
| VectorType.TABLESTORE
| VectorType.TENCENT
| VectorType.HUAWEI_CLOUD
):
return {
"retrieval_method": [

View File

@ -4,8 +4,8 @@ from datetime import UTC, datetime
from typing import cast
from flask import request
from flask_login import current_user # type: ignore
from flask_restful import Resource, fields, marshal, marshal_with, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, fields, marshal, marshal_with, reqparse
from sqlalchemy import asc, desc
from werkzeug.exceptions import Forbidden, NotFound
@ -40,7 +40,7 @@ from core.indexing_runner import IndexingRunner
from core.model_manager import ModelManager
from core.model_runtime.entities.model_entities import ModelType
from core.model_runtime.errors.invoke import InvokeAuthorizationError
from core.plugin.manager.exc import PluginDaemonClientSideError
from core.plugin.impl.exc import PluginDaemonClientSideError
from core.rag.extractor.entity.extract_setting import ExtractSetting
from extensions.ext_database import db
from extensions.ext_redis import redis_client

View File

@ -2,8 +2,8 @@ import uuid
import pandas as pd
from flask import request
from flask_login import current_user # type: ignore
from flask_restful import Resource, marshal, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, marshal, reqparse
from werkzeug.exceptions import Forbidden, NotFound
import services
@ -131,7 +131,7 @@ class DatasetDocumentSegmentListApi(Resource):
except services.errors.account.NoPermissionError as e:
raise Forbidden(str(e))
SegmentService.delete_segments(segment_ids, document, dataset)
return {"result": "success"}, 200
return {"result": "success"}, 204
class DatasetDocumentSegmentApi(Resource):
@ -333,7 +333,7 @@ class DatasetDocumentSegmentUpdateApi(Resource):
except services.errors.account.NoPermissionError as e:
raise Forbidden(str(e))
SegmentService.delete_segment(segment, document, dataset)
return {"result": "success"}, 200
return {"result": "success"}, 204
class DatasetDocumentSegmentBatchImportApi(Resource):
@ -398,7 +398,7 @@ class DatasetDocumentSegmentBatchImportApi(Resource):
indexing_cache_key = "segment_batch_import_{}".format(job_id)
cache_result = redis_client.get(indexing_cache_key)
if cache_result is None:
raise ValueError("The job is not exist.")
raise ValueError("The job does not exist.")
return {"job_id": job_id, "job_status": cache_result.decode()}, 200
@ -590,7 +590,7 @@ class ChildChunkUpdateApi(Resource):
SegmentService.delete_child_chunk(child_chunk, dataset)
except ChildChunkDeleteIndexServiceError as e:
raise ChildChunkDeleteIndexError(str(e))
return {"result": "success"}, 200
return {"result": "success"}, 204
@setup_required
@login_required

View File

@ -1,6 +1,6 @@
from flask import request
from flask_login import current_user # type: ignore
from flask_restful import Resource, marshal, reqparse # type: ignore
from flask_login import current_user
from flask_restful import Resource, marshal, reqparse
from werkzeug.exceptions import Forbidden, InternalServerError, NotFound
import services
@ -21,12 +21,6 @@ def _validate_name(name):
return name
def _validate_description_length(description):
if description and len(description) > 400:
raise ValueError("Description cannot exceed 400 characters.")
return description
class ExternalApiTemplateListApi(Resource):
@setup_required
@login_required
@ -141,7 +135,7 @@ class ExternalApiTemplateApi(Resource):
raise Forbidden()
ExternalDatasetService.delete_external_knowledge_api(current_user.current_tenant_id, external_knowledge_api_id)
return {"result": "success"}, 200
return {"result": "success"}, 204
class ExternalApiUseCheckApi(Resource):

Some files were not shown because too many files have changed in this diff Show More