
BitNet AI Model Sets New Benchmark for Efficient, CPU-Friendly AI



Microsoft researchers have developed the BitNet AI model, which they describe as the largest-scale 1-bit AI model to date. Models of this kind are known as "bitnets."

BitNet b1.58 2B4T is openly available under an MIT licence, and the researchers say it can run on CPUs, including Apple's M2.

Essentially, bitnets are compressed models designed to run on lightweight hardware. In standard models, the weights (the values that define a model's internal structure) are often quantized so the models perform well on a wide range of machines. Quantizing lowers the number of bits, the smallest units a computer can process, needed to represent those weights. Consequently, it allows models to run faster on chips with less memory.
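To make the idea concrete, here is a minimal sketch of conventional quantization: mapping 32-bit floating-point weights to 8-bit integers. The example data is hypothetical; the point is simply that each weight shrinks from 32 bits to 8, a fourfold memory saving, at the cost of a small rounding error.

```python
import numpy as np

# Hypothetical example: quantize float32 weights to 8-bit integers.
rng = np.random.default_rng(0)
weights = rng.normal(size=1000).astype(np.float32)

scale = np.abs(weights).max() / 127.0          # map the largest weight to 127
q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
dequantized = q.astype(np.float32) * scale     # approximate reconstruction

print(weights.nbytes, q.nbytes)                # 4000 bytes vs 1000 bytes
```

The reconstructed weights differ from the originals by at most half the scale factor, which is why well-quantized models lose little accuracy while using a quarter of the memory.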

The BitNet AI model quantizes its weights to just three values: -1, 0, and 1. In theory, this makes it far more memory- and compute-efficient than most models today.
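The BitNet b1.58 paper describes an "absmean" scheme for reaching those three values: scale each weight tensor by its mean absolute value, then round and clip. A minimal sketch, with hypothetical example weights:

```python
import numpy as np

# Sketch of "absmean" ternary quantization as described in the
# BitNet b1.58 paper: scale by the mean absolute weight, then
# round and clip so every weight becomes -1, 0, or 1.
def ternary_quantize(w: np.ndarray, eps: float = 1e-8):
    scale = np.abs(w).mean() + eps             # per-tensor scaling factor
    q = np.clip(np.round(w / scale), -1, 1)    # values restricted to {-1, 0, 1}
    return q.astype(np.int8), scale

rng = np.random.default_rng(1)
w = rng.normal(size=8).astype(np.float32)      # toy stand-in for a weight tensor
q, s = ternary_quantize(w)
print(np.unique(q))                            # only values from {-1, 0, 1}
```

With only three possible values, each weight needs log2(3) ≈ 1.58 bits of information, which is where the "b1.58" in the model's name comes from.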

Microsoft researchers say that BitNet b1.58 2B4T is the first bitnet with 2 billion parameters ("parameters" being essentially synonymous with "weights"). It was trained on a dataset of 4 trillion tokens, by one estimate the equivalent of about 33 million books, and the researchers claim it outperforms traditional models of similar size.

According to the researchers’ testing, the model surpasses Meta’s Llama 3.2 1B, Google’s Gemma 3 1B, and Alibaba’s Qwen 2.5 1.5B on benchmarks including GSM8K, a collection of grade-school-level math problems, and PIQA, which tests physical commonsense reasoning skills.

More impressive still, this BitNet AI model is faster than other models of its size, in some cases twice as fast, while using a fraction of the memory.

Achieving that performance requires Microsoft’s custom framework, bitnet.cpp, which at the moment only works with particular hardware. GPUs, which dominate the AI infrastructure landscape, are absent from the list of supported chips.

That’s all to say that the BitNet AI model may hold promise, particularly for resource-constrained devices. But compatibility is and will likely remain a big concern.

Meanwhile, in February 2025, Microsoft announced that its researchers had made a significant advance in quantum computing, a development that could open the door to technology capable of tackling complex scientific and societal problems.

TechPolyp reported that topological qubits, the basic building blocks of quantum computation, power the “Majorana 1 chip” Microsoft unveiled. The development marks an important milestone in the decades-long race among tech companies to shape the future of computing.

Microsoft has built a quantum processor known as Majorana 1, a significant innovation in quantum computing. The company published its findings in the journal Nature; according to Microsoft, the Majorana 1 chip is made from a new class of material, allowing it to perform complex computational operations more quickly and precisely.
