
Olix Computing Raises $220M for Optical AI Chip Development

Olix Computing Ltd., a startup building an artificial intelligence chip that uses optical components, has raised $220 million in new funding.

The funding round was led by Belgian venture capital firm Hummingbird Ventures.

The investment values the U.K.-based company at more than $1 billion. Olix had previously raised funding from Plural, Vertex Ventures, LocalGlobe and Entrepreneurs First, though those amounts were not disclosed.

Olix is developing a chip designed mainly for AI inference, the stage at which trained AI models are put to work in real-world applications. The company has not shared full technical details about its optical components.

However, the chip is said to include a new type of memory and data connection system, which suggests the optical technology is used to improve how data moves inside the processor.

Other startups are also working on similar light-based connections, known as photonic interconnects. One example is Ayar Labs, which has built optical technology that allows the creation of much larger chips.

Optical connections are valuable because light can carry data faster and with less energy than the electrical signals used in today’s chips. This can increase data throughput and reduce power usage.

Olix says its chip is designed to solve a problem known as the “memory wall,” which occurs when a processor cannot run at full speed because it is limited by how quickly data can be moved to and from memory.
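The memory wall can be made concrete with a rough arithmetic check: if moving the data for an operation takes longer than computing on it, the processor is memory bound. The Python sketch below uses assumed, illustrative hardware figures (they are not Olix or any vendor’s specifications) to show why a large batched matrix multiply tends to be compute bound, while token-by-token inference, which reads every weight once per token, tends to hit the memory wall.

```python
# A rough sketch of the "memory wall": a processor stalls when moving data
# takes longer than computing on it. All hardware figures below are assumed
# for illustration only; they are not Olix specifications.

PEAK_FLOPS = 500e12   # assumed: 500 TFLOP/s of peak compute
MEM_BW = 3e12         # assumed: 3 TB/s of memory bandwidth

def limiting_factor(flops, bytes_moved):
    """Compare compute time with data-movement time for one operation."""
    compute_time = flops / PEAK_FLOPS
    memory_time = bytes_moved / MEM_BW
    return "memory-bound" if memory_time > compute_time else "compute-bound"

n = 4096  # hypothetical layer size, FP16 values (2 bytes each)

# Matrix-matrix multiply (large batch): each loaded byte is reused many times.
print("matmul:", limiting_factor(flops=2 * n**3, bytes_moved=3 * n * n * 2))

# Matrix-vector multiply (token-by-token inference): every weight is read once.
print("matvec:", limiting_factor(flops=2 * n**2, bytes_moved=n * n * 2))
```

Under these assumptions the matrix-vector case, which resembles the inference workloads Olix is targeting, spends far more time waiting on memory than computing.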

Most AI chips rely on high-bandwidth memory (HBM), an external memory system used to store large amounts of data.

Typically, a chip loads data from HBM into its internal memory, processes it, and then sends results back to HBM. If this data movement is too slow, performance suffers.
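That load-process-store pattern can be sketched as a simple tiled loop. The buffers and sizes below are hypothetical and merely stand in for HBM and on-chip memory; the point is only that the compute step has to wait on the transfers around it.

```python
import numpy as np

# Hypothetical sketch of the staging pattern described above: data sits in a
# large external memory (standing in for HBM), is copied tile by tile into a
# small on-chip buffer (standing in for SRAM), processed, and written back.

TILE = 1024                                   # rows that fit on-chip at once
hbm_in = np.random.rand(8 * TILE, 512)        # "external" input data
weights = np.random.rand(512, 512)            # assume weights already on-chip
hbm_out = np.empty_like(hbm_in)

for start in range(0, hbm_in.shape[0], TILE):
    tile = hbm_in[start:start + TILE].copy()  # load: HBM -> on-chip buffer
    result = tile @ weights                   # compute on the on-chip copy
    hbm_out[start:start + TILE] = result      # store: on-chip buffer -> HBM

# If the load and store steps take longer than the compute step, the compute
# units sit idle between tiles -- the slowdown described above.
```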

Olix claims its chip avoids this problem by not using HBM at all. Instead, it relies entirely on SRAM, a faster type of memory that is typically built directly into the chip itself. Keeping memory on the chip shortens the distance data has to travel, which is why SRAM can operate faster than HBM.

SRAM is also built differently. While each HBM memory cell uses a single transistor and a capacitor, an SRAM cell uses six transistors, making it more complex but faster.

Another company, Cerebras Systems, has also built its AI chips around large amounts of SRAM. Olix says its optical technology improves performance further still, offering higher speed and lower latency than traditional silicon-only SRAM designs.

The new chip is called the OLIX Optical Tensor Processing Unit, or OTPU. Tensors are mathematical structures used by AI models to store and process information. Many AI chips include special circuits designed specifically for handling tensors.
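As a minimal illustration of what “handling tensors” means in practice, the NumPy snippet below runs the kind of batched tensor contraction that dedicated tensor circuits are built to accelerate; the shapes are arbitrary and chosen only for the example.

```python
import numpy as np

# Tensors are multi-dimensional arrays. A tiny example of the operation tensor
# hardware accelerates: multiplying a batch of input matrices by one shared
# weight matrix. Shapes are arbitrary, chosen only for illustration.

batch = np.random.rand(32, 128, 256)    # 32 inputs, each a 128x256 matrix
weights = np.random.rand(256, 64)       # one shared 256x64 weight matrix

# einsum spells out the contraction: sum over the shared dimension k for
# every batch element b, producing a 128x64 result per input.
outputs = np.einsum("bik,kj->bij", batch, weights)

print(outputs.shape)  # (32, 128, 64)
```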

The OTPU likely also includes other types of processing units. For example, Google’s AI chips combine tensor cores with other circuits that handle tasks such as memory management.

Olix plans to use the newly raised funding to continue developing its chip. A job listing on its website shows that the company is also building software tools to help existing AI models run on its new hardware.