Haven-AOL is an ICP-native Always Online (AOL) protocol for conditional, token-gated access using VetKD keys. Haven CLI is the operator interface for those workflows across web3 collaboration patterns such as DAOs, DataDAOs, and agent swarms.
- Python 3.11+
- FFmpeg (for video processing)
- Deno (for JS runtime)
- yt-dlp (for YouTube plugin)
Install from PyPI:

```shell
pip install haven-cli
```

Or install from source:

```shell
git clone https://github.com/haven/haven-cli
cd haven-cli
pip install -e .
```

Then initialize the configuration:

```shell
haven config init
```

This creates `~/.config/haven/config.toml` with default settings.
| Section | Key | Description | Default |
|---|---|---|---|
| pipeline | vlm_enabled | Enable AI analysis | true |
| pipeline | encryption_enabled | Enable Haven-AOL encryption | true |
| pipeline | upload_enabled | Enable Filecoin upload | true |
| pipeline | sync_enabled | Enable Arkiv sync | true |
| scheduler | enabled | Enable job scheduler | true |
| js_runtime | runtime | JS runtime (deno/node/bun) | auto-detect |
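Assembled from the keys and defaults in the table above, a freshly initialized `~/.config/haven/config.toml` might look roughly like this. This is a sketch only; the actual file generated by `haven config init` may use a different layout or include additional keys.

```toml
# Hypothetical sketch of ~/.config/haven/config.toml,
# built from the documented keys and defaults.
[pipeline]
vlm_enabled = true
encryption_enabled = true
upload_enabled = true
sync_enabled = true

[scheduler]
enabled = true

[js_runtime]
# Auto-detected by default; set explicitly to "deno", "node", or "bun".
# runtime = "deno"
```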
All configuration can be overridden via environment variables:
```shell
export HAVEN_VLM_ENABLED=true
export HAVEN_SYNAPSE_API_KEY=your-key
export HAVEN_LOG_LEVEL=DEBUG
```

```shell
# Basic upload
haven upload video.mp4

# With encryption
haven upload video.mp4 --encrypt

# Skip analysis
haven upload video.mp4 --no-vlm

# Skip blockchain sync
haven upload video.mp4 --no-arkiv

# Specify dataset
haven upload video.mp4 --dataset 123
```

```shell
# Download by CID
haven download bafybeig... --output video.mp4

# With decryption
haven download bafybeig... --output video.mp4 --decrypt

# Force overwrite
haven download bafybeig... --output video.mp4 --force
```

```shell
haven download info bafybeig...
haven download info bafybeig... --json
```

```shell
haven download decrypt-file encrypted.mp4 --output decrypted.mp4
haven download decrypt-file encrypted.mp4 --cid bafybeig...
```

The Haven pipeline processes videos through these steps:
```
┌─────────┐   ┌─────────┐   ┌─────────┐   ┌────────┐   ┌──────┐
│ Ingest  │──▶│ Analyze │──▶│ Encrypt │──▶│ Upload │──▶│ Sync │
└─────────┘   └─────────┘   └─────────┘   └────────┘   └──────┘
```
- Ingest: Extract metadata, calculate pHash, check duplicates
- Analyze: Run VLM to generate timestamps and tags (optional)
- Encrypt: Encrypt with Haven-AOL (optional)
- Upload: Store on Filecoin via Synapse (optional)
- Sync: Record metadata on Arkiv blockchain (optional)
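The staged, each-step-optional design above can be sketched in Python. This is an illustrative model only, not Haven's actual implementation; the stage names mirror the list above and the toggles mirror the `pipeline` config keys (`vlm_enabled`, `encryption_enabled`, `upload_enabled`, `sync_enabled`):

```python
# Illustrative sketch: a pipeline where every stage after ingest
# can be toggled off independently. Not the real Haven code.
from dataclasses import dataclass, field

@dataclass
class Context:
    path: str
    log: list = field(default_factory=list)

def ingest(ctx):   ctx.log.append("ingest")    # metadata, pHash, dedup
def analyze(ctx):  ctx.log.append("analyze")   # VLM timestamps/tags
def encrypt(ctx):  ctx.log.append("encrypt")   # Haven-AOL encryption
def upload(ctx):   ctx.log.append("upload")    # Filecoin via Synapse
def sync(ctx):     ctx.log.append("sync")      # Arkiv metadata record

def run_pipeline(path, *, vlm=True, enc=True, up=True, arkiv=True):
    ctx = Context(path)
    ingest(ctx)                     # ingest always runs
    if vlm:   analyze(ctx)
    if enc:   encrypt(ctx)
    if up:    upload(ctx)
    if arkiv: sync(ctx)
    return ctx.log

print(run_pipeline("video.mp4", vlm=False, arkiv=False))
# → ['ingest', 'encrypt', 'upload']
```

Modeling each stage as an independent toggle is what lets flags like `--no-vlm` and `--no-arkiv` skip steps without disturbing the rest of the pipeline.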
Each step can be enabled or disabled:
```shell
# Full pipeline (default)
haven upload video.mp4

# Skip VLM analysis
haven upload video.mp4 --no-vlm

# Skip encryption
haven upload video.mp4   # encryption is controlled by config (encryption_enabled)

# Skip upload (dry run)
# Set upload_enabled = false in config

# Skip blockchain sync
haven upload video.mp4 --no-arkiv
```

Haven CLI uses the standardized Haven Cross-Application Data Format v1.0.0, ensuring full compatibility with haven-player (Gold Standard) and haven-dapp.
Payload Fields (Private):
- `filecoin_root_cid` - CID of the video on Filecoin
- `is_encrypted` - Boolean encryption status
- `cid_hash` - SHA256 hash of the CID
- `vlm_json_cid` - CID of the VLM analysis
- `encryption_metadata` - Encryption metadata
Attributes Fields (Public):
- `title` - Video title
- `is_encrypted` - Integer 0 or 1
- `cid_hash` - SHA256 hash (for duplicate detection)
- `created_at` - ISO8601 timestamp
- `creator_handle` - Content creator
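The private/public split keeps the raw CID out of the public record while still allowing duplicate detection: only a SHA256 digest of the CID is published. A minimal sketch of that idea (the exact encoding Haven uses, UTF-8 input and a hex digest here, is an assumption):

```python
# Sketch: publish a hash of the CID instead of the CID itself,
# so duplicates can be detected publicly without revealing the CID.
# UTF-8 encoding and hex digest are assumptions for illustration.
import hashlib

def cid_hash(cid: str) -> str:
    return hashlib.sha256(cid.encode("utf-8")).hexdigest()

payload = {                      # private fields
    "filecoin_root_cid": "bafybeigexamplecid",
    "is_encrypted": True,
}
attributes = {                   # public fields
    "title": "My video",
    "is_encrypted": 1,
    "cid_hash": cid_hash(payload["filecoin_root_cid"]),
}

# Duplicate check: hash a candidate CID and compare to published records.
assert attributes["cid_hash"] == cid_hash("bafybeigexamplecid")
```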
For complete format details, see Arkiv Data Format and Migration Notes.
```shell
haven plugins list
haven plugins list --all   # include disabled
```

```shell
# Show current config
haven plugins config YouTubePlugin --show

# Set a configuration value
haven plugins config YouTubePlugin --set channel_ids=UCxxx,UCyyy
haven plugins config YouTubePlugin --set api_key=YOUR_API_KEY
```

```shell
haven plugins test YouTubePlugin
```

```shell
haven plugins enable YouTubePlugin
haven plugins disable YouTubePlugin
```

```shell
haven plugins info YouTubePlugin
```

| Plugin | Description |
|---|---|
| YouTubePlugin | Archive videos from YouTube channels/playlists |
| BitTorrentPlugin | Archive torrents from feeds/DHT |
| PumpFunPlugin | Record PumpFun live streams |
| OpenRingPlugin | Capture WebRTC streams |
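Content-source plugins of this kind typically share one shape: discover new items from a source, then hand them to the pipeline. The interface below is entirely hypothetical (this document does not show Haven's real plugin API); it only illustrates that common shape:

```python
# Hypothetical plugin interface sketch -- NOT Haven's actual API.
from abc import ABC, abstractmethod

class SourcePlugin(ABC):
    """Base class for toy content-source plugins."""
    enabled: bool = True

    @abstractmethod
    def discover(self) -> list[str]:
        """Return IDs of items seen since the last check."""

class FeedPlugin(SourcePlugin):
    """Toy plugin that diffs a feed against already-seen item IDs."""
    def __init__(self, feed):
        self.feed = feed              # callable returning current item IDs
        self.seen: set[str] = set()

    def discover(self):
        new = [i for i in self.feed() if i not in self.seen]
        self.seen.update(new)
        return new

p = FeedPlugin(lambda: ["vid1", "vid2"])
print(p.discover())   # → ['vid1', 'vid2']
print(p.discover())   # → []  (nothing new on the second pass)
```

Tracking seen IDs inside the plugin is what makes a scheduled job like `--on-success archive_new` idempotent: re-running it only picks up items that appeared since the previous run.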
```shell
# Create a job with a cron schedule
haven jobs create --plugin YouTubePlugin --schedule "0 * * * *"

# With custom name and action
haven jobs create --plugin YouTubePlugin --schedule "*/30 * * * *" \
  --name "Hourly YouTube Check" --on-success archive_new
```

```shell
haven jobs list
haven jobs list --status active
haven jobs list --status paused
```

```shell
haven jobs run <job-id>
```

```shell
haven jobs pause <job-id>
haven jobs resume <job-id>
```

```shell
haven jobs delete <job-id>
haven jobs delete <job-id> --force   # skip confirmation
```

```shell
haven jobs history
haven jobs history <job-id> --limit 20
```

The scheduler uses standard cron expressions:
| Expression | Description |
|---|---|
| `0 * * * *` | Every hour |
| `*/30 * * * *` | Every 30 minutes |
| `0 */6 * * *` | Every 6 hours |
| `0 0 * * *` | Daily at midnight |
| `@daily` | Daily at midnight |
| `@hourly` | Every hour |
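The five cron fields (minute, hour, day-of-month, month, day-of-week) each accept `*`, `*/n` steps, and plain values. As a sketch, here is how a single field such as the minute column expands into matching values (a simplified parser for illustration; real cron also supports ranges like `1-5` and lists like `1,15,30`):

```python
# Simplified expansion of one cron field ("*", "*/n", or a number)
# into the set of matching values. Ranges and lists omitted.
def expand(field: str, lo: int, hi: int) -> set[int]:
    if field == "*":
        return set(range(lo, hi + 1))
    if field.startswith("*/"):
        step = int(field[2:])
        return set(range(lo, hi + 1, step))
    return {int(field)}

print(sorted(expand("*/30", 0, 59)))   # minute field of "*/30 * * * *" → [0, 30]
print(sorted(expand("0", 0, 59)))      # minute field of "0 * * * *"   → [0]
```

So `*/30 * * * *` fires whenever the minute is 0 or 30, i.e. every 30 minutes, while `0 */6 * * *` fires at minute 0 of hours 0, 6, 12, and 18.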
```shell
# Run in foreground
haven run

# Run in background
haven run --daemon

# With verbose logging
haven run --verbose

# Limit concurrent pipelines
haven run --max-concurrent 8
```

```shell
haven run status
```

```shell
# Graceful shutdown
haven run stop

# Force kill
haven run stop --force
```

```shell
haven run restart
haven run restart --daemon
```

```shell
# Show all config
haven config show

# Show specific section
haven config show pipeline

# Output as YAML
haven config show --format yaml

# Show unmasked secrets (use with caution)
haven config show --unmask
```

```shell
haven config set pipeline.vlm_model zai-org/glm-4.6v-flash
haven config set pipeline.max_concurrent_videos 8
haven config set scheduler.enabled false
```

```shell
haven config validate
```

```shell
haven config edit   # Opens in $EDITOR
```

```shell
haven config env
```

**JS Runtime not starting**
- Ensure Deno is installed: `deno --version`
- Check the JS services path in config
- Try setting `HAVEN_JS_RUNTIME=deno`

**Upload failing**

- Verify the Synapse API key is set
- Check network connectivity
- Check the output of `haven config validate`

**VLM analysis slow**

- Consider using a local model
- Reduce the frame sampling count in config

**Database errors**

- Ensure the data directory is writable
- Check `HAVEN_DATA_DIR` permissions
```shell
# Show help for any command
haven --help
haven upload --help
haven jobs --help
```

```shell
export HAVEN_LOG_LEVEL=DEBUG
haven run --verbose
```

For more troubleshooting tips, see the Troubleshooting Guide.