gcgj-dify-1.7.0/api/core/model_providers/models/llm

Latest commit: 2d5ad0d208 "feat: support optional query content (#1097)" by Joel, co-authored by Garfield Dai <dai.hai@foxmail.com>, 3 years ago
File                      Last commit                                              Age
__init__.py               feat: server multi models support (#799)                 3 years ago
anthropic_model.py        feat: optimize anthropic connection pool (#1066)         3 years ago
azure_openai_model.py     feat: hf inference endpoint stream support (#1028)       3 years ago
base.py                   feat: support optional query content (#1097)             3 years ago
chatglm_model.py          feat: hf inference endpoint stream support (#1028)       3 years ago
huggingface_hub_model.py  fix: hf hosted inference check (#1128)                   3 years ago
localai_model.py          feat: add LocalAI local embedding model support (#1021)  3 years ago
minimax_model.py          Fix/price calc (#862)                                    3 years ago
openai_model.py           feat: hf inference endpoint stream support (#1028)       3 years ago
openllm_model.py          feat: hf inference endpoint stream support (#1028)       3 years ago
replicate_model.py        feat: hf inference endpoint stream support (#1028)       3 years ago
spark_model.py            feat: hf inference endpoint stream support (#1028)       3 years ago
tongyi_model.py           feat: hf inference endpoint stream support (#1028)       3 years ago
wenxin_model.py           feat: hf inference endpoint stream support (#1028)       3 years ago
xinference_model.py       feat: hf inference endpoint stream support (#1028)       3 years ago