Inference

Distributed GPU cluster for LLM inference

About Inference

Founded

2024

Community

Tags

Inference is a distributed GPU cluster for LLM inference built on Solana. Inference.net is a global network of data centers serving fast, scalable, pay-per-token APIs for models such as DeepSeek V3 and Llama 3.3.
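As an illustration of what a pay-per-token inference API of this kind typically looks like, here is a minimal sketch of a chat completion request against an assumed OpenAI-compatible endpoint; the base URL, model identifier, and INFERENCE_API_KEY variable are illustrative assumptions, not documented values.

```python
# Minimal sketch of calling a pay-per-token LLM inference API.
# The base URL, model id, and INFERENCE_API_KEY env var are assumptions
# for illustration, not documented values of Inference.net.
import os
import requests

API_KEY = os.environ["INFERENCE_API_KEY"]      # hypothetical env var
BASE_URL = "https://api.inference.net/v1"      # assumed OpenAI-compatible endpoint

resp = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "meta-llama/llama-3.3-70b-instruct",  # example model id
        "messages": [{"role": "user", "content": "Hello!"}],
    },
    timeout=60,
)
resp.raise_for_status()
# Print the model's reply; billing on such APIs is per input/output token.
print(resp.json()["choices"][0]["message"]["content"])
```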

Team Members

Sam Heutmaker, Founder & CEO
Ibrahim Ahmed, Co-Founder & CTO