
Clawdmeter is an open-source project by Hermann Haraldsson that brings Claude Code usage statistics to a small desktop dashboard. The device uses a Bluetooth-connected display to show pixel-art animations and token-usage data, offering a nostalgic hardware experience. Since its launch, Clawdmeter has gained popularity on GitHub, reflecting the developer community's interest in tokenmaxxing. The project demonstrates how AI tools like Claude can democratize programming, allowing even non-embedded developers to build innovative hardware.
© TechCrunch AI

SpaceXAI, the entity newly rebranded after the merger of Elon Musk's SpaceX and xAI, is experiencing a significant talent drain. More than 50 researchers and engineers have left since February, including key figures in coding and AI model development. The exodus raises questions about the company's commitment to leading AI model development, especially as rivals such as Meta and Thinking Machines Lab attract former staff. The departures are attributed in part to Musk's demanding work culture and unrealistic deadlines, which have reportedly compromised project quality. The situation casts doubt on SpaceXAI's future in AI innovation.
The b9150 release of llama.cpp continues its trend of broadening platform compatibility, now including support for macOS Apple Silicon with KleidiAI enabled and a variety of Linux configurations such as Ubuntu with ROCm 7.2 and Vulkan. This update also adds Windows builds with CUDA 12 and 13 DLLs, making it more versatile for developers working across different environments. While there are no groundbreaking new features, the release solidifies llama.cpp's position as a flexible inference runtime for diverse hardware setups. Developers can now leverage these updates to optimize performance across a wider range of systems.
The subsequent b9159 release of llama.cpp further broadens platform compatibility, making the project accessible to an even wider range of users. With new builds for macOS, Linux, Windows, and Android, the update includes support for Apple Silicon, Vulkan, ROCm 7.2, and CUDA 13. Developers can now deploy llama.cpp across more environments, enhancing its utility for AI inference tasks. While no new model architectures are introduced, the focus on platform diversity keeps llama.cpp a versatile tool for developers working with different hardware configurations.
OpenAI has expanded the reach of its Codex coding tool by integrating it into the ChatGPT app, making it accessible on mobile devices. This move allows developers to manage their coding workflows remotely, offering the ability to review outputs, approve commands, and start new tasks directly from their phones. This development follows recent updates that enable Codex to run autonomously in desktop environments and a Chrome extension for live browser sessions. The mobile integration marks a significant step in making Codex more versatile and accessible, intensifying the competition with Anthropic's similar offerings.