
DeepSeek has unveiled its latest model, DeepSeek V4, with 1.6 trillion parameters and a context length of 1 million tokens. The upgrade positions DeepSeek V4 as a serious contender among frontier AI models, targeting improved performance on complex, long-context tasks. The launch reflects the ongoing trend of scaling up model size and context window to improve AI capabilities.
The latest version b8991 of llama.cpp has been released, featuring updates for various operating systems.
The latest update to llama-mmap, llama.cpp's memory-mapped model loading code, improves compatibility across platforms and model sizes. Key enhancements include support for 32-bit wasm builds and code-style updates aligning with gguf.cpp.
