Tags
Tags give the ability to mark specific points in history as being important.
b1621 · e18f7345 · grammar : revert the replacement of llama_token_to_piece with id_to_token (#4396) · Dec 09, 2023
b1620 · fe680e3d · sync : ggml (new ops, tests, backend, etc.) (#4359) · Dec 07, 2023
b1619 · bcc0eb45 · llama : per-layer KV cache + quantum K cache (#4309) · Dec 07, 2023
b1618 · 81bc9214 · train : fix #4227 (double free in... · Dec 07, 2023
b1617 · 05cd6e50 · server : recognize cache_prompt parameter in OAI API (#4347) · Dec 06, 2023
b1616 · caa92492 · common : fix compile warning · Dec 06, 2023
b1615 · da5eaef1 · speculative : support `--color` (#4343) · Dec 06, 2023
b1614 · 5f6e0c0d · grammar : pre-computed pieces + reserve mem + less string copies (#4330) · Dec 05, 2023
b1613 · 5aa365d8 · llama : allow overriding GGUF metadata when loading model (#4092) · Dec 05, 2023
b1612 · 52c8bc3c · sampling : custom samplers order (#4285) · Dec 05, 2023
b1611 · e4b76bbe · swift : revert compiler checks for swift package (#4332) · Dec 05, 2023
b1610 · 23b5e12e · simple : update error message for KV cache check (#4324) · Dec 04, 2023
b1609 · d208995c · swift : fix concatenation method to avoid invalid UTF8 stringfication (#4325) · Dec 04, 2023
b1608 · 5c9f90cb · swift : fix prompt tokenization logic (#4321) · Dec 04, 2023
b1607 · 4fa44e84 · grammar-parser : fix typo (#4318) · Dec 04, 2023
b1606 · fbbc4282 · ggml : reuse ggml_get_n_tasks() in ggml_graph_plan() (#4308) · Dec 03, 2023
b1605 · adf3de4f · ggml : fix soft max out-of-bounds access (#4307) · Dec 03, 2023
b1604 · 33e171d1 · server : fix OpenAI API `stop` field to be optional (#4299) · Dec 03, 2023
b1602 · d7b800b8 · llama : pad KV cache size (#4280) · Dec 03, 2023
b1601 · 5a7d3125 · llama : avoid using "optional" keyword (#4283) · Dec 01, 2023