Shovels

Construction data from your terminal

Your AI agent's gateway to U.S. construction data.


$ curl -LsSf https://shovels.ai/install.sh | sh

Installed shovels to ~/.shovels/bin/shovels

# How many solar permits in 92024 (Encinitas, CA) last year?
$ shovels permits search --geo-id 92024 \
    --permit-from 2024-01-01 --permit-to 2024-12-31 \
    --tags solar --include-count --limit 1 \
    | jq '.meta.total_count.value'
581

Permits, contractors, addresses

Everything in one binary

No SDKs, no dependencies, no configuration files. Download a single binary and start querying construction data in seconds.

Roofing permits in Miami Beach

# geo-id 33139 = Miami Beach, FL
$ shovels permits search \
    --geo-id 33139 \
    --permit-from 2024-01-01 \
    --permit-to 2024-12-31 \
    --tags roofing --include-count \
    --limit 1 | jq '.meta.total_count.value'
490

How many permits has this contractor filed?

# 03xTGkafsf = Cosmic Solar Inc.
$ shovels contractors permits 03xTGkafsf \
    --limit all \
    | jq '.data | length'
2056

Export electrical contractors to CSV

# geo-id 78701 = Austin, TX
$ shovels contractors search \
    --geo-id 78701 \
    --permit-from 2024-01-01 \
    --permit-to 2024-12-31 \
    --tags electrical --limit 100 \
    | jq -r '.data[] | [.name, .permit_count] | @csv'
"KENNETH TUMLINSON",554
"JOSEPH H MARTINEZ",85
"CDX ELECTRICAL SERVICES",19
...

Resolve an address to a geo_id

$ shovels addresses search \
    -q "1600 Pennsylvania Ave" \
    | jq '.data[0] | {name, geo_id}'
{
  "name": "1600 Pennsylvania Ave Nw, Washington, DC",
  "geo_id": "Kw5MGExoU6Y"
}
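Because every command prints plain JSON, the resolution step chains straight into a permit search; a sketch, reusing the `.data[0].geo_id` path from the response shape shown above:

```shell
# Resolve an address to a geo_id, then feed it into a permit search.
geo_id=$(shovels addresses search -q "1600 Pennsylvania Ave" \
  | jq -r '.data[0].geo_id')

shovels permits search --geo-id "$geo_id" \
  --permit-from 2024-01-01 --permit-to 2024-12-31
```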

Agent-first design

Your AI agent's interface to construction data

No MCP headaches — no context bloat, no credential juggling, no host lock-in. Claude Code, OpenAI Codex, Cursor, or your own agents shell out to shovels and get structured data back instantly.

JSON to stdout. Always.
No tables, no colors, no spinners. stdout is always valid JSON. Errors go to stderr. Your parser never breaks.
Help text written for LLMs
Every --help is specific, example-rich, and jargon-free. An AI agent can read the help and construct the right command on the first try.
Meaningful exit codes
Exit 0 = success. Exit 2 = auth error. Exit 3 = rate limited. Your agent can branch on the exit code without parsing anything.
Zero interactivity
No prompts, no confirmations, no progress bars. It succeeds and prints JSON, or it fails and prints a JSON error. Nothing in between.

# Your AI agent runs this:
$ shovels permits search --help
...

# Reads the help, builds the query:
# geo-id 94110 = San Francisco, CA
$ shovels permits search \
    --geo-id 94110 \
    --permit-from 2024-06-01 \
    --permit-to 2024-12-31 \
    --tags roofing \
    --property-type residential \
    --limit 20

# Parses the JSON, reasons about the results,
# then asks for contractor details...

Unix philosophy

Pipe it. Chain it. Script it.

JSON output means you can compose the CLI with any tool in your stack. Build workflows that would take weeks with a GUI.

Build a solar contractor CSV in one line

shovels contractors search --geo-id CA \
  --permit-from 2024-01-01 --permit-to 2024-12-31 \
  --tags solar --limit all \
  | jq -r '.data[] | [.name, .phone, .permit_count] | @csv' \
  > solar_contractors_ca.csv

Monitor new permits in your market (cron job)

# crontab: 0 8 * * MON
# geo-id 94110 = San Francisco, CA
# (date -v-7d is BSD/macOS syntax; on Linux use date -d '7 days ago')
shovels permits search --geo-id 94110 \
  --permit-from $(date -v-7d +%Y-%m-%d) \
  --permit-to $(date +%Y-%m-%d) \
  | jq '.meta.count' \
  | xargs -I{} echo "{} new permits this week in 94110"

Why not just curl?

Because pagination shouldn't be your problem

With curl: a hand-rolled pagination loop

cursor=""
while true; do
  resp=$(curl -s -H "X-API-Key: $KEY" \
    "https://api.shovels.ai/v2/permits/search?geo_id=92024&permit_from=2024-01-01&permit_to=2024-12-31&cursor=$cursor")
  echo "$resp" | jq '.items[]'
  cursor=$(echo "$resp" | jq -r '.next_cursor')
  [ "$cursor" = "null" ] && break
done

With shovels: one command, all records

shovels permits search \
  --geo-id 92024 \
  --permit-from 2024-01-01 \
  --permit-to 2024-12-31 \
  --limit all

The CLI handles auth headers, cursor pagination, rate-limit retries, and credit tracking. You just say how many records you want.

Single binary.
No runtime, no dependencies. Download one file, put it on your PATH, done. macOS, Linux, and Windows.
Pagination handled.
--limit all fetches every record. The CLI manages cursors, page sizes, and the 100K ceiling internally.
Auto-retry.
Rate-limited? The CLI backs off with jitter and retries automatically. Respects Retry-After headers.
Credit tracking.
Every response includes credits_used and credits_remaining. Your agent always knows the cost.
Checksum-verified installs.
The install script downloads from GitHub Releases and verifies SHA256 checksums before touching your system.
Open source.
MIT licensed. Read the code, fork it, contribute. Built with Go and Cobra.
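Credit tracking composes with jq like everything else; a sketch — the exact JSON path for the credit fields is an assumption here, mirroring the `.meta` nesting shown in the examples above:

```shell
# Pull one record and report spend alongside it.
# The .meta.credits_* paths are assumed, not confirmed by the docs above.
resp=$(shovels permits search --geo-id 92024 \
  --permit-from 2024-01-01 --permit-to 2024-12-31 --limit 1)
echo "$resp" | jq '{used: .meta.credits_used, remaining: .meta.credits_remaining}'
```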

Get started in 10 seconds

curl -LsSf https://shovels.ai/install.sh | sh

Then set your API key and start querying:

$ export SHOVELS_API_KEY=your-key
$ shovels permits search --geo-id 92024 --permit-from 2024-01-01 --permit-to 2024-12-31