binary-husky | 9a21e13d33 | Support gpt-4-vision-preview | 2023-11-13 13:10:59 +08:00
binary-husky | 69f37df356 | Hotfix for the endpoint being incorrectly overridden | 2023-11-12 22:15:54 +08:00
qingxu fu | f7f6db831b | Handle some model-compatibility details | 2023-11-11 22:35:06 +08:00
qingxu fu | a655ce1f00 | Merge branch 'frontier' into master_autogen | 2023-11-11 22:03:20 +08:00
qingxu fu | 28119e343c | Hook the underlying LLM calls made by autogen | 2023-11-11 22:01:19 +08:00
qingxu fu | 107ea868e1 | Automatic alignment for API2D | 2023-11-10 23:08:56 +08:00
binary-husky | caf45ef740 | Merge pull request #1244 from awwaawwa/fix_gpt_35_16k_maxtoken: set max_token of the gpt-3.5-turbo-16k model series to 16385 | 2023-11-10 12:55:02 +08:00
qingxu fu | 0ff750b60a | Fix indentation | 2023-11-10 12:40:25 +08:00
qingxu fu | 8ad2a2bb86 | Merge branch 'master' of https://github.com/samxiaowastaken/gpt_academic into samxiaowastaken-master | 2023-11-10 12:37:30 +08:00
awwaawwa | 8d94564e67 | Set max_token of the gpt-3.5-turbo-16k model series to 16385 (per https://platform.openai.com/docs/models/gpt-3-5 , the 16k 3.5 context window is actually 16385) | 2023-11-07 15:59:07 +08:00
binary-husky | 08f036aafd | Support chatglm3 | 2023-10-31 03:08:50 +08:00
binary-husky | 527f9d28ad | change get_conf | 2023-10-29 00:34:40 +08:00
binary-husky | 40a065ce04 | Merge branch 'master' into frontier | 2023-10-28 20:09:49 +08:00
binary-husky | 127385b846 | Integrate new models | 2023-10-28 19:23:43 +08:00
binary-husky | cf085565a7 | rename folder | 2023-10-28 17:44:17 +08:00