Inference

Distributed GPU cluster for LLM inference

About Inference

Founded

2024

Community

Inference is a distributed GPU cluster for LLM inference built on Solana. Inference.net operates a global network of data centers serving fast, scalable, pay-per-token APIs for models such as DeepSeek V3 and Llama 3.3.
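As a rough illustration of how a pay-per-token inference API is typically consumed, the sketch below sends a single chat request and prints the reply. The base URL, endpoint path, model identifier, and environment variable name are assumptions for illustration only; consult Inference.net's own documentation for the actual values.

```python
import os
import requests

# Assumed OpenAI-style chat endpoint and credentials -- illustrative only.
BASE_URL = "https://api.inference.net/v1"   # assumed base URL
API_KEY = os.environ["INFERENCE_API_KEY"]   # assumed env var holding your key

# Send one chat completion request; you are billed per token processed.
resp = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "llama-3.3-70b-instruct",   # assumed model identifier
        "messages": [{"role": "user", "content": "Hello!"}],
        "max_tokens": 64,
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```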

Team Members

Sam Heutmaker, Founder & CEO
Ibrahim Ahmed, Co-Founder & CTO