Fetch, extract, and crawl the web via GraphQL.

WebGraphQL lets you ask “fetch this page and pull out these fields” through a GraphQL API. In v1, the mesh is a small, invite-only network of known operators running the same stack, sharing the load and building a shared cache. You can run one-off extractions or bounded crawls without every participant re-scraping the same pages, and higher-risk jobs can be put behind allowlists and human approval.

What you can do in v1

  • Run a GraphQL API that returns structured JSON extracted from web pages.
  • Default to cache; use @fresh only when you want to hit the site again (see the example after this list).
  • Run a bounded crawl across discovered links (caps + per-origin concurrency + backoff).
  • Optionally support JS-heavy pages via an operator-supplied browser adapter (not bundled).
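
For a concrete feel, here is a minimal client sketch in TypeScript. The gateway URL and the schema shape (a page field with title and links selections) are illustrative assumptions; only the @fresh directive and the idea of selecting extracted fields come from the list above.

  // Minimal sketch of querying a gateway. The endpoint and schema
  // (page, title, links) are hypothetical; @fresh is the directive
  // for bypassing the shared cache and re-fetching the page.
  const GATEWAY_URL = "http://localhost:4000/graphql"; // assumed local gateway

  const query = /* GraphQL */ `
    query Extract($url: String!) {
      page(url: $url) @fresh {   # drop @fresh to accept a cached result
        title
        links
      }
    }
  `;

  async function extract(url: string): Promise<unknown> {
    const res = await fetch(GATEWAY_URL, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ query, variables: { url } }),
    });
    if (!res.ok) throw new Error(`gateway returned HTTP ${res.status}`);
    return res.json(); // structured JSON extracted from the page
  }

  extract("https://example.com").then(console.log).catch(console.error);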

Why not “just scrape”?

  • Shared cache: your mesh doesn’t re-fetch the same pages repeatedly.
  • Less glue: you write queries, not bespoke scrapers per site.
  • Safety rails: budgets/caps/backoff reduce “oops I hammered a site” (see the sketch after this list).
  • Human control: isolate membership + approve risky jobs before they run.
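
To make the safety-rails item concrete, here is a rough sketch of per-origin concurrency limiting with exponential backoff. The caps, delays, and function names are made-up values for illustration, not WebGraphQL’s actual defaults or code.

  // Sketch: cap concurrent fetches per origin and back off on failures.
  const MAX_PER_ORIGIN = 2; // assumed per-origin concurrency cap
  const MAX_RETRIES = 3;

  const inFlight = new Map<string, number>();
  const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

  async function fetchPolitely(url: string): Promise<Response> {
    const origin = new URL(url).origin;

    // Wait until this origin is below its concurrency cap.
    while ((inFlight.get(origin) ?? 0) >= MAX_PER_ORIGIN) {
      await sleep(100);
    }
    inFlight.set(origin, (inFlight.get(origin) ?? 0) + 1);

    try {
      for (let attempt = 0; ; attempt++) {
        const res = await fetch(url);
        if (res.ok || attempt >= MAX_RETRIES) return res;
        await sleep(500 * 2 ** attempt); // exponential backoff before retrying
      }
    } finally {
      inFlight.set(origin, (inFlight.get(origin) ?? 0) - 1);
    }
  }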

How it works (simple)

A gateway hosts the GraphQL API and serves cached results. When you add @fresh, the gateway announces a signed job offer to the mesh. A worker runs the job (simple HTTP by default; optional browser adapter for JS-heavy sites), then publishes signed results back to the gateway, which caches them for later queries.
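
The same flow can be sketched as data shapes plus one decision: serve the cache unless the query asked for fresh data. All names below are hypothetical, and the mesh transport and signing are stubbed out; this only mirrors the paragraph above.

  // Hypothetical shapes mirroring the flow above. A real gateway would use
  // its key material and gossip layer instead of these stubs.
  interface JobOffer { id: string; url: string; issuedBy: string; signature: string }
  interface JobResult { offerId: string; data: unknown; workerId: string; signature: string }

  const cache = new Map<string, unknown>();
  let nextId = 0;

  function signOffer(url: string): JobOffer {
    return { id: String(nextId++), url, issuedBy: "gateway-1", signature: "stub" };
  }
  function announceToMesh(_offer: JobOffer): void {
    // stub: gossip the signed offer to workers in the mesh
  }
  async function awaitResult(offerId: string): Promise<JobResult> {
    // stub: in reality a worker fetches/extracts and publishes this
    return { offerId, data: { title: "stub" }, workerId: "worker-1", signature: "stub" };
  }
  function verifyResult(_result: JobResult): boolean {
    return true; // stub: check the worker's signature
  }

  async function resolvePage(url: string, fresh: boolean): Promise<unknown> {
    // Default path: serve the shared cache.
    if (!fresh && cache.has(url)) return cache.get(url);

    // @fresh path: announce a signed job offer and wait for a worker.
    const offer = signOffer(url);
    announceToMesh(offer);

    const result = await awaitResult(offer.id);
    if (!verifyResult(result)) throw new Error("invalid result signature");

    cache.set(url, result.data); // cached for later queries
    return result.data;
  }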

Roles in the mesh

Gateway operator

Runs the GraphQL API and cache. Decides what domains/jobs are allowed.

Worker operator

Runs the executors that fetch pages and extract fields. Can opt into browser support.

Validator operator (optional)

Runs a verifier process that checks results before the gateway accepts them.

Query user

Sends GraphQL queries. Benefits from shared caching and safer execution controls.

Mesh safety controls

Isolation: network namespaces

Use --network-id so your mesh gossip doesn’t mix with anyone else’s.

Membership: allowlist with veto

Use --allowed-peer on gateway/worker/validator so everyone can ignore non-members.

Human review: offer approval gate

Use --require-offer-approval so risky @fresh jobs require a manual approve/deny step.

Defaults: deny-by-default

POST requests, form submission, cross-origin fetches, and browser execution are off unless explicitly enabled by both the job policy and the operator policy.
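
The deny-by-default rule reduces to a logical AND with a false default, something like the following sketch. The capability names and policy shape are assumptions for illustration.

  // Sketch: a capability is allowed only if BOTH policies opt in.
  type Capability = "post" | "submit" | "crossOrigin" | "browser";
  type Policy = Partial<Record<Capability, boolean>>;

  function isAllowed(cap: Capability, jobPolicy: Policy, operatorPolicy: Policy): boolean {
    // Missing entries count as false, so everything is off by default.
    return (jobPolicy[cap] ?? false) && (operatorPolicy[cap] ?? false);
  }

  // Example: the job asks for browser execution but the operator has not
  // enabled it, so the request is denied.
  console.log(isAllowed("browser", { browser: true }, {})); // false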

Use cases (v1)

Build a small product on top

Example: price/availability pages, documentation pages, public directories, changelogs. You get a stable API and a shared cache instead of running custom scraping code in your app.

Run safer scrapes as a team

Split duties: one person runs the gateway API, others run workers/validators. Add approvals for risky jobs. Great for small teams that want to share infra and a cache.

Who participates (and why)

Query users

  • One API to fetch+extract across targets.
  • Shared caching to reduce re-scraping and friction.

Operators

  • Share workload with a known group (“mesh”).
  • Control risk: membership allowlists, approvals, budgets, and optional domain allowlists.

What v1 is not

Important: WebGraphQL is an execution tool, not a compliance layer. You remain responsible for legal compliance, site terms, and safe operation. Start with strict domain allowlists and treat browser adapters as a trust boundary.