Inference Devnet is a globally distributed network of GPUs that are used to run AI inference tasks for large language models like DeepSeek R1, Gemma3, Llama 3.3 70b, and more. Anyone with a GPU that meets the minimum requirements can join the network and start earning $INT points. The more GPUs you connect, the more $INT points you earn.
An operator is a user who connects their GPUs to the Inference Devnet. Operators earn $INT points by contributing their compute power to the network, and are responsible for maintaining their GPUs and ensuring they are running optimally.
Inference Devnet is currently under active development. While the network is mostly stable, there will be bugs and periods of downtime. When downtime occurs, we will notify the community on our Operator Discord.
Please open a ticket in Discord if you have a bug report, and include all relevant information to help us diagnose the issue; tickets missing this information will be closed. Please avoid tagging the team directly outside of support channels.