[BOUNTY: 15,000 WATT] Raspberry Pi inference relay #18

@WattCoin-Org

Description

Run a small LLM on a Raspberry Pi and charge WATT per query via an API.

Requirements

  • Run a lightweight model (TinyLlama, Phi-2, or similar)
  • API endpoint that accepts a prompt and returns the model's response
  • Queue management for requests
  • Usage tracking and stats
  • Optimized for Pi 4/5 (4 GB+ RAM)
  • Configurable model path and parameters
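The queue-management and usage-tracking requirements above could be sketched roughly as below, using only the Python standard library. The `RelayConfig` field names, the single-worker design, and the stubbed `infer_fn` are all assumptions for illustration; a real submission would wrap an actual model backend (e.g. llama.cpp bindings) behind `infer_fn`.

```python
import queue
import threading
import time
from dataclasses import dataclass


@dataclass
class RelayConfig:
    # Hypothetical knobs; actual names/defaults are up to the implementer.
    model_path: str = "/opt/models/tinyllama.gguf"
    max_tokens: int = 128
    n_threads: int = 4  # Pi 4/5 have 4 cores


class InferenceRelay:
    """Single worker thread so the Pi never runs two inferences at once."""

    def __init__(self, config: RelayConfig, infer_fn):
        self.config = config
        self.infer_fn = infer_fn  # real impl would call the loaded model
        self.requests: queue.Queue = queue.Queue()
        self.stats = {"queries": 0, "total_latency_s": 0.0}
        threading.Thread(target=self._run, daemon=True).start()

    def submit(self, prompt: str) -> queue.Queue:
        """Enqueue a prompt; returns a one-slot queue the reply lands in."""
        result: queue.Queue = queue.Queue(maxsize=1)
        self.requests.put((prompt, result))
        return result

    def _run(self):
        while True:
            prompt, result = self.requests.get()
            start = time.monotonic()
            reply = self.infer_fn(prompt, self.config)
            # Track per-query usage so /stats can report counts and latency.
            self.stats["queries"] += 1
            self.stats["total_latency_s"] += time.monotonic() - start
            result.put(reply)
```

Serializing requests through one worker keeps memory flat on a 4 GB Pi; callers block on `submit(...).get()` while the stats dict accumulates everything needed for the usage endpoint.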

Acceptance Criteria

  • Model runs on Pi with acceptable latency (<30s for short prompts)
  • API endpoint functional
  • Usage stats tracked
  • Install script included
  • Tests included and passing
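For the "API endpoint functional" criterion, a stdlib-only sketch of the HTTP surface might look like the following. The `/v1/generate` route name and the `fake_infer` stub are assumptions, not part of the spec; the stub stands in for the real model call so the endpoint shape can be tested without a Pi or a loaded model.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer


def fake_infer(prompt: str) -> str:
    # Placeholder for the real model call (e.g. llama.cpp bindings).
    return "stub reply to: " + prompt


class RelayHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/v1/generate":  # hypothetical route name
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        body = json.loads(self.rfile.read(length) or b"{}")
        reply = fake_infer(body.get("prompt", ""))
        payload = json.dumps({"response": reply}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):  # keep request logging quiet
        pass


def serve(port: int = 0) -> ThreadingHTTPServer:
    """Start the relay server in a background thread; port 0 = auto-pick."""
    server = ThreadingHTTPServer(("127.0.0.1", port), RelayHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

A production version would swap `fake_infer` for the queued relay and add a `/stats` route, but the request/response contract above is enough to write the required tests against.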

Bounty

15,000 WATT

Notes

  • Payment verification integration is optional — focus on the inference relay working correctly.
  • Include your personal Solana wallet in the PR body (not the project wallet). Format: **Payout Wallet**: <your_address>
