About the Airdrop
Inference makes it effortless for developers to access leading open-source AI models with just a few lines of code. The platform’s mission is to become the AI-native foundation for developers who want to build and scale intelligent applications.
Developers can use Serverless LLM Inference to access top-tier models like Llama-3.1-8B and pay only for the tokens they use. Early-access features include LoRA Inference, which allows users to upload and stream adapters, and Image Generation, supporting models such as FLUX[DEV] and Stable Diffusion.
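Many serverless LLM providers expose an OpenAI-compatible chat-completions endpoint, so a pay-per-token request can look roughly like the sketch below. The base URL, auth scheme, model identifier, and response shape here are assumptions for illustration only, not Inference's documented API; check the official docs for the real values.

```python
import os
import requests

# Hypothetical endpoint and model name for illustration only --
# substitute the values from Inference's official documentation.
API_URL = "https://api.example-inference-provider.com/v1/chat/completions"
API_KEY = os.environ["INFERENCE_API_KEY"]  # assumed bearer-token auth

payload = {
    "model": "llama-3.1-8b-instruct",  # assumed model identifier
    "messages": [
        {"role": "user", "content": "Summarize what a devnet is in one sentence."}
    ],
    "max_tokens": 128,
}

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=30,
)
response.raise_for_status()

data = response.json()
# Assumed OpenAI-style response shape: the reply text and a usage block,
# which is what pay-per-token billing is typically based on.
print(data["choices"][0]["message"]["content"])
print(data.get("usage"))
```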
Backed by major investors like Multicoin Capital, a16z CSX, and Mechanism Capital, Inference redefines AI infrastructure. It enables teams to train, deploy, and scale private AI models that are faster, smarter, and more cost-efficient than traditional centralized alternatives.
Inference Devnet Launch – Power the Network and Earn Rewards
Inference has launched its Devnet, rewarding users who contribute computational power to the network. Participants can run worker nodes, earn points, and help strengthen decentralized AI infrastructure. By joining the Inference Devnet Launch, you help shape a future where AI computation is open, verifiable, and secure — powered by community nodes, not corporations.

Step-by-step guide
- Visit the Devnet Page.
- Complete simple tasks and earn points & rewards:
  - Register your account.
  - Launch a Worker Node on your system or a remote server. Follow the setup guide on the dashboard.
  - Copy your Registration Code and paste it into the downloaded app to activate your node.
  - Wait for the LLM model download to complete. Track progress in the Logs section.
  - Monitor your performance, uptime, and earned points in the Dashboard.
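If you prefer to watch a node from a terminal instead of the web dashboard, a small polling script illustrates the idea. Everything in this sketch is hypothetical: the local status URL, port, and JSON field names are assumptions, and the worker app may not expose any such endpoint, so treat it purely as a monitoring sketch rather than a supported interface.

```python
import time
import requests

# Hypothetical local status endpoint -- the actual worker app may not expose one.
STATUS_URL = "http://localhost:8080/status"  # assumed port and path
POLL_INTERVAL_SECONDS = 60

while True:
    try:
        status = requests.get(STATUS_URL, timeout=5).json()
        # Field names below are assumptions for illustration only.
        print(
            f"model_download={status.get('download_progress', '?')}% "
            f"uptime={status.get('uptime_seconds', '?')}s "
            f"points={status.get('points', '?')}"
        )
    except requests.RequestException as err:
        print(f"node unreachable: {err}")
    time.sleep(POLL_INTERVAL_SECONDS)
```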
Note:
- Engage with the platform and complete tasks to earn points.
- The more tasks you complete, the higher your chances of earning rewards.
- While no official token drop has been confirmed by Inference, the points you earn may influence eligibility for future airdrop rewards!

