The vLLM v0.20.0 release includes 752 commits and introduces support for DeepSeek V4, upgrades to CUDA 13.0, and compatibility with PyTorch 2.11. It also adds support for Python 3.14 and Hugging Face Transformers v5.
The latest version b8991 of llama.cpp has been released, featuring updates for various operating systems.
The latest update to llama-mmap improves compatibility across platforms and model sizes. Key enhancements include support for 32-bit WebAssembly (wasm) builds and code-style updates aligned with gguf.cpp.

The v0.19.0rc0 release introduces CPU key-value (KV) cache offloading, which frees GPU memory by spilling KV cache blocks to host RAM instead of discarding them. The update was signed off by Yifan Qiao.
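To make the idea concrete, here is a minimal two-tier cache sketch of KV-cache offloading: a bounded "GPU" tier evicts least-recently-used blocks to an unbounded "CPU" tier, and CPU hits are promoted back. This is an illustrative toy, not vLLM's actual implementation; the class and method names are invented for this example.

```python
from collections import OrderedDict

class KVCacheOffloader:
    """Toy two-tier KV cache: a bounded 'GPU' tier that evicts
    least-recently-used blocks to a 'CPU' tier instead of dropping them.
    Illustrative sketch only -- not vLLM's real offloading code."""

    def __init__(self, gpu_capacity: int):
        self.gpu_capacity = gpu_capacity
        self.gpu = OrderedDict()   # block_id -> KV data (stand-in for GPU tensors)
        self.cpu = {}              # offloaded blocks living in host RAM

    def put(self, block_id, data):
        self.gpu[block_id] = data
        self.gpu.move_to_end(block_id)      # mark as most recently used
        # Spill LRU blocks to the CPU tier when over GPU capacity.
        while len(self.gpu) > self.gpu_capacity:
            victim, vdata = self.gpu.popitem(last=False)
            self.cpu[victim] = vdata

    def get(self, block_id):
        if block_id in self.gpu:
            self.gpu.move_to_end(block_id)
            return self.gpu[block_id]
        if block_id in self.cpu:
            # CPU-tier hit: promote the block back to the GPU tier,
            # which is cheaper than recomputing the KV entries.
            data = self.cpu.pop(block_id)
            self.put(block_id, data)
            return data
        return None  # true miss: KV must be recomputed from scratch
```

A quick walk-through: with `gpu_capacity=2`, inserting blocks "a", "b", "c" spills "a" to the CPU tier; a later `get("a")` promotes it back to GPU and spills the new LRU block ("b") instead.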