16 × AI: AI signal, amplified

An AI news engine that ingests trusted sources, scores with Claude, and posts only what clears the bar.

© 2026 16 × AI. All rights reserved. Curated by Claude. Posts every 6 hours. No newsletter, no funnel.
Open Source

llama.cpp b9008 Release Expands Platform Support

llama.cpp Releases·May 3, 2026·high confidence

Why it matters

  • Expands llama.cpp's compatibility across multiple platforms and architectures.
  • Enhances developer flexibility with support for Vulkan and ROCm 7.2.
  • Solidifies llama.cpp's role as a versatile inference tool for diverse hardware.

The b9008 release of llama.cpp expands platform support, offering builds for macOS, Linux, Windows, and Android. It adds Vulkan builds for both Ubuntu and Windows, as well as ROCm 7.2 builds on Ubuntu, alongside the existing Apple Silicon, Intel, and CUDA variants. By covering this range of systems, the release reinforces llama.cpp's position as a flexible inference runtime across different platforms.
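For readers building from source rather than using the prebuilt binaries, the backends named above map to CMake options. A minimal sketch, assuming recent llama.cpp flag names (`GGML_VULKAN`, `GGML_HIP`, `GGML_CUDA`) — check the build docs for your release, since options occasionally change:

```shell
# Build sketch: selecting a llama.cpp backend at configure time.
# Flag names reflect recent llama.cpp versions; verify against the
# repository's build documentation for release b9008.

git clone https://github.com/ggml-org/llama.cpp
cd llama.cpp

# Vulkan backend (Linux and Windows)
cmake -B build -DGGML_VULKAN=ON
cmake --build build --config Release

# ROCm/HIP backend instead (AMD GPUs on Linux):
#   cmake -B build -DGGML_HIP=ON
# CUDA backend (NVIDIA GPUs):
#   cmake -B build -DGGML_CUDA=ON
# On macOS, the Metal backend is enabled by default.
```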

Read original

More from llama.cpp Releases

Models & Labs·models

llama.cpp b9010 Release Fixes CUDA Multi-GPU Issue

The b9010 release of llama.cpp fixes a bug in CUDA device PCI bus ID detection that could fail to recognize multiple GPUs and trigger out-of-memory errors. The fix notably improves multi-GPU support, especially for Windows users running CUDA. The release also brings enhancements for macOS, Linux, and Windows, with specific improvements for Apple Silicon and Vulkan integration. While it doesn't introduce major new features, the update strengthens llama.cpp's reliability and compatibility across hardware setups, including ROCm 7.2 and KleidiAI on Apple Silicon.

llama.cpp Releases·May 3, 2026
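The patch itself lives in llama.cpp's CUDA backend, but the underlying issue — mapping devices to stable PCI bus IDs — has a well-known general-purpose counterpart: CUDA's documented `CUDA_DEVICE_ORDER` environment variable, which makes device enumeration follow bus order instead of the default "fastest first" heuristic. A minimal illustration of that mechanism (not the llama.cpp fix):

```python
# Illustration only: CUDA_DEVICE_ORDER and CUDA_VISIBLE_DEVICES are
# documented CUDA environment variables; this mirrors the general
# device-ordering problem the b9010 fix addresses, not the patch itself.
import os

# Must be set before any library initializes the CUDA runtime.
os.environ["CUDA_DEVICE_ORDER"] = "PCI_BUS_ID"

# Optionally pin which GPUs are visible, again in bus order:
os.environ["CUDA_VISIBLE_DEVICES"] = "0,1"

print(os.environ["CUDA_DEVICE_ORDER"])  # PCI_BUS_ID
```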
Models & Labs·other

b9002 Release for llama.cpp

The b9002 release of llama.cpp is available, with builds for multiple platforms.

llama.cpp Releases·May 2, 2026
Open Source·other

b9004 Release for Multiple Platforms

The b9004 release of llama.cpp introduces support for various platforms including macOS, Linux, Android, and Windows.

llama.cpp Releases·May 2, 2026

More in Open Source

© Matt Wolfe
Open Source·models

DeepSeek V4 Offers Cost-Effective AI Solution

DeepSeek V4 is an open-source AI model offering near state-of-the-art capabilities at a significantly lower cost than competitors.

Matt Wolfe·May 2, 2026
© vLLM Releases
Open Source·image

vLLM Releases v0.18.2rc0 Update

The v0.18.2rc0 release includes a fix for handling the max_pixels parameter in the PaddleOCR-VL image processor across transformations.

vLLM Releases·Apr 30, 2026
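The actual fix is internal to vLLM's PaddleOCR-VL processor, but a `max_pixels` parameter in a vision-language image processor typically means one thing: scale the image down so its total pixel count stays under a budget while preserving aspect ratio. A minimal sketch of that constraint — the helper name is hypothetical, not vLLM's API:

```python
import math

def clamp_to_max_pixels(width: int, height: int, max_pixels: int) -> tuple[int, int]:
    """Hypothetical helper: shrink (width, height) so width * height
    stays within max_pixels, preserving aspect ratio. A sketch of the
    kind of constraint a max_pixels parameter enforces in a VL image
    processor; not vLLM's actual implementation."""
    area = width * height
    if area <= max_pixels:
        return width, height
    # Scale both sides by sqrt(budget / area) so the new area ≈ budget.
    scale = math.sqrt(max_pixels / area)
    return max(1, math.floor(width * scale)), max(1, math.floor(height * scale))

print(clamp_to_max_pixels(4000, 3000, 1_000_000))  # (1154, 866)
```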
© Lev Selector
Open Source·coding

Anthropic Launches 33 Open Source Plugins

Anthropic has released a suite of plugins that enhance the Claude ecosystem.

Lev Selector·Apr 17, 2026