Introduction
Inference.net is the world’s largest distributed GPU network for AI inference.
What is Inference Devnet?
Inference Devnet is a globally distributed network of GPUs that are used to run AI inference tasks for large language models like DeepSeek R1, Gemma3, Llama 3.3 70b, and more. Anyone with a GPU that meets the minimum requirements can join the network and start earning $INT points. The more GPUs you connect, the more $INT points you earn.
How can I get started?
Follow our quick start guide to install the node software and get started.
What is a Devnet Operator?
An operator is a user who connects their GPUs to the Inference Devnet. Operators earn $INT points by contributing their compute power to the network, and are responsible for maintaining their GPUs and ensuring they are running optimally.
Is Inference.net Devnet stable?
Inference Devnet is currently under active development. While the network is mostly stable, you may encounter bugs and periods of downtime. When there is downtime, we will notify the community on our Operator Discord.
What should I do if there is downtime?
Monitor the announcements channel in the Discord for downtime updates from the core team.
What should I do if I have a bug report?
Please open a ticket in Discord if you have a bug report. Make sure to include all relevant information to help us diagnose and fix the issue; tickets missing this information will be closed. Please avoid tagging the team directly outside of support channels.