gcgj-dify-1.7.0/api/core/model_providers/models/llm

Latest commit: 42a5b3ec17 by Garfield Dai, 2 years ago
feat: advanced prompt backend (#1301)
Co-authored-by: takatost <takatost@gmail.com>
File                      Last commit                                                  Age
__init__.py               feat: server multi models support (#799)                     3 years ago
anthropic_model.py        feat: optimize anthropic connection pool (#1066)             3 years ago
azure_openai_model.py     feat: remove llm client use (#1316)                          2 years ago
baichuan_model.py         fix: prompt for baichuan text generation models (#1299)      2 years ago
base.py                   feat: advanced prompt backend (#1301)                        2 years ago
chatglm_model.py          feat: hf inference endpoint stream support (#1028)           3 years ago
huggingface_hub_model.py  fix: hf hosted inference check (#1128)                       3 years ago
localai_model.py          feat: add LocalAI local embedding model support (#1021)      3 years ago
minimax_model.py          feat: optimize minimax llm call (#1312)                      2 years ago
openai_model.py           feat: remove llm client use (#1316)                          2 years ago
openllm_model.py          feat: hf inference endpoint stream support (#1028)           3 years ago
replicate_model.py        feat: hf inference endpoint stream support (#1028)           3 years ago
spark_model.py            feat: hf inference endpoint stream support (#1028)           3 years ago
tongyi_model.py           fix: compatibility issues with the tongyi model (#1310)      2 years ago
wenxin_model.py           fix: wenxin model name invalid when llm call (#1248)         3 years ago
xinference_model.py       feat: hf inference endpoint stream support (#1028)           3 years ago
zhipuai_model.py          feat: add zhipuai (#1188)                                    3 years ago