Tags
Tags give the ability to mark specific points in the repository's history as being important.
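The release tags listed below each point at a single commit. As a minimal sketch of how such a tag is created and listed (the directory and tag name here are illustrative, not taken from the repository above):

```shell
# Minimal sketch: create a throwaway repo, make one commit, and tag it.
# "tag-demo" and "b5018" are illustrative names for this example only.
git init -q tag-demo
git -C tag-demo -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "initial commit"
git -C tag-demo tag -a b5018 -m "release b5018"   # annotated tag marking this commit
git -C tag-demo tag --list                        # prints: b5018
```

An annotated tag (`-a`) stores a tagger, date, and message alongside the commit reference, which is why each entry below can pair a tag name with a commit, subject line, and date.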
b5018 · a6f32f0b · Fix clang warning in gguf_check_reserved_keys (#12686) · Apr 01, 2025
b5017 · 2bb3597e · vulkan: fix build when glslc doesn't support coopmat (#12683) · Apr 01, 2025
b5016 · 82939705 · SYCL: Rename oneMKL to oneMath (#12192) · Apr 01, 2025
b5015 · 8bbf2608 · SYCL: switch to SYCL namespace (#12674) · Apr 01, 2025
b5013 · c80a7759 · vocab : add special infill tokens for CodeLlama (#11850) · Mar 31, 2025
b5012 · 250d7953 · ggml : faster ssm scan (#10558) · Mar 31, 2025
b5010 · a8a1f335 · Vulkan: Add DP4A MMQ and Q8_1 quantization shader (#12135) · Mar 31, 2025
b5009 · 1790e731 · cmake : fix whitespace (#0) · Mar 31, 2025
b5006 · 1a859490 · llava : proper description fix (#12668) · Mar 31, 2025
b5005 · 6c02a032 · SYCL: Remove misleading ggml_sycl_op_flatten function (#12387) · Mar 31, 2025
b5004 · f52d59d7 · llava : fix clip loading GGUFs with missing description (#12660) · Mar 31, 2025
b5003 · 52de2e59 · tts : remove printfs (#12640) · Mar 31, 2025
b5002 · 2c3f8b85 · llama : support BailingMoE (Ling) (#12634) · Mar 30, 2025
b5001 · 4663bd35 · metal : use constexpr in FA kernels + fix typedef (#12659) · Mar 30, 2025
b4999 · 7242dd96 · llama-chat : Add Yandex instruct model template support (#12621) · Mar 30, 2025
b4998 · 492d7f1f · musa: fix all warnings, re-enable `-DLLAMA_FATAL_WARNINGS=ON` in ci and update doc (#12611) · Mar 30, 2025
b4997 · d3f1f0ac · sync : ggml · Mar 30, 2025
b4992 · af6ae1ef · llama : fix non-causal mask for gemma 3 (#12615) · Mar 30, 2025
b4991 · 0bb29193 · llama : change cpu_buft_list order: ACCEL -> GPU host -> CPU extra -> CPU (#12632) · Mar 29, 2025
b4990 · a69f8463 · cmake : fix ccache conflict (#12522) · Mar 29, 2025