API Fundamentals

REST, auth, webhooks, rate limits, error handling — from scratch.

What an API actually is

An API is how one program asks another program for something. Every button that hits the internet, every Fathom webhook that shows up in your inbox, every Claude message you send — it's one piece of software sending a request over HTTP and another one sending a response back. Tab 3 gave you the one-paragraph version. This tab gives you the model you'll actually design against.

Client, server, and the plumbing between them

HTTP is a client-server protocol. Your code — a script, a serverless function, your Claude-built dashboard, a curl one-liner — plays the client. The platform on the other end (GHL, Fathom, Stripe, GitHub) plays the server. Between you and them sit caches, load balancers, and proxies; they stay invisible until something goes wrong.

The two sides exchange discrete messages, not streams. The client sends a request; the server sends back a response. Then the connection either closes or gets reused for the next request. The whole flow is four steps: open a TCP connection, send the request, read the response, close or reuse. That's it.
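
Those four steps can be watched in miniature without any real server. The sketch below (Python stdlib only; the one-shot localhost listener is purely hypothetical, a stand-in for a real server) opens a TCP connection, sends a hand-written GET, reads the response bytes, and closes.

```python
import socket
import threading

def one_shot_server(listener: socket.socket) -> None:
    """Accept one connection, read the request, answer, close."""
    conn, _ = listener.accept()
    request = conn.recv(4096)                  # the client's request bytes
    assert request.startswith(b"GET ")
    conn.sendall(
        b"HTTP/1.1 200 OK\r\n"
        b"Content-Type: text/plain\r\n"
        b"Content-Length: 2\r\n"
        b"\r\n"
        b"ok"
    )
    conn.close()                               # step 4: close (no reuse here)

listener = socket.socket()
listener.bind(("127.0.0.1", 0))                # port 0 = any free port
listener.listen(1)
port = listener.getsockname()[1]
threading.Thread(target=one_shot_server, args=(listener,)).start()

# step 1: open a TCP connection
client = socket.create_connection(("127.0.0.1", port))
# step 2: send the request
client.sendall(b"GET /ping HTTP/1.1\r\nHost: localhost\r\nConnection: close\r\n\r\n")
# step 3: read the response until the server closes
response = b""
while chunk := client.recv(4096):
    response += chunk
client.close()
listener.close()

print(response.decode().splitlines()[0])       # → HTTP/1.1 200 OK
```

Everything a framework or `fetch` call does for you is layered on top of exactly this exchange.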

Stateless by design, stateful by cookie

HTTP is stateless. The server doesn't remember your last request when the next one arrives. That sounds like a flaw — how do you stay logged in? — but it's the whole reason the web scales. Any server can handle any request at any time.

State gets layered on top. Cookies, API keys, Bearer tokens, and session IDs all do the same job: they ride along with every request to tell the server who you are right now. The server reads that identifier, looks up whatever state it needs, and responds. Stateless protocol, stateful interactions.

Why this matters for agency work

Every platform you'll wire into Claude (GHL, Fathom, Typeform, Zapier, Discord, Stripe, GitHub, OpenAI, your own) uses the same shape: an HTTPS URL, a method, some headers, an optional body, and a status code back. Learn the shape once. After that, every new API is just a different docs page filling in the same blanks.

REST verbs — five of them carry the work

Every request has a method. Five of them show up in 99% of real work: GET, POST, PUT, PATCH, DELETE. They're not arbitrary names — they signal intent to every layer in the pipeline. Proxies cache GETs. Load balancers retry idempotent ones. Your framework routes on them. Pick the right verb and the rest of the stack does the right thing automatically.

Two properties that decide retry safety

Safe = the method doesn't change server state. A safe request can be fired a thousand times without side effects. Idempotent = sending the request twice produces the same result as sending it once. Safe methods are always idempotent, but not all idempotent methods are safe.

Why it matters: if your retry logic retries a non-idempotent verb (POST, PATCH) after a network timeout, you've just risked creating a duplicate — a second charge, a second contact, a second email. Retry-safe code only retries on idempotent methods, or uses an idempotency key (see §5.9).

Method   Safe   Idempotent   Has body   Use for
GET      yes    yes          no         fetch a resource
POST     no     no           yes        create / trigger action
PUT      no     yes          yes        replace entire resource
PATCH    no     no           yes        partial update
DELETE   no     yes          no         remove a resource
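
A retry gate can encode the table directly. A minimal sketch — the sets follow the standard HTTP method properties, and the helper name can_retry_blindly is ours, not a standard API:

```python
# Standard HTTP method properties, encoded as sets.
SAFE = {"GET", "HEAD", "OPTIONS"}
IDEMPOTENT = SAFE | {"PUT", "DELETE"}

def can_retry_blindly(method: str) -> bool:
    """True if resending after a timeout cannot create a duplicate side effect."""
    return method.upper() in IDEMPOTENT

assert can_retry_blindly("GET")
assert can_retry_blindly("DELETE")
assert not can_retry_blindly("POST")    # risk: second charge / contact / email
assert not can_retry_blindly("PATCH")   # not guaranteed idempotent
```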

Auth — four patterns, one decision

You'll meet four auth patterns across every integration in this course. Two are trivial. Two are protocols with RFCs behind them. Knowing which one a platform uses — before you scaffold any code — saves you from writing the wrong integration.

1. API key in a header

The simplest pattern. The platform gives you a long random string. You put it in a header on every request. Fathom uses it, most webhook providers use it, internal APIs use it. Often in a custom header like X-Api-Key; sometimes in Authorization.

bash · API key · custom header
curl -H "X-Api-Key: $API_KEY" https://api.example.com/ping

2. Bearer token

Same idea, standard header. “Bearer” means anyone who has this token can use it — so treat it like a password. GitHub, Anthropic, Notion, most modern REST APIs use this. Often the token is short-lived and rotates.

bash · Bearer · RFC 6750
curl -H "Authorization: Bearer $TOKEN" https://api.anthropic.com/v1/models

3. OAuth 2.0 — Authorization Code (RFC 6749 §4.1)

Used when your app acts on behalf of a user. “Sign in with GitHub”, the GHL marketplace app you publish for other agencies, Google Calendar integrations. Six-step flow:

  1. Your app redirects the user's browser to the platform's authorization endpoint with your client ID and a redirect URI.
  2. The platform shows a consent screen. User clicks Allow.
  3. The platform redirects the browser back to your redirect URI with an authorization code.
  4. Your server POSTs the code to the platform's token endpoint with your client secret.
  5. The platform verifies and returns an access token (and usually a refresh token).
  6. Your app stores the token and uses it as a Bearer token on subsequent requests.
bash · step 4 · exchange code for token
curl -X POST https://example.com/oauth/token \
  -d "grant_type=authorization_code" \
  -d "code=$CODE" \
  -d "redirect_uri=$REDIRECT_URI" \
  -d "client_id=$CLIENT_ID" \
  -d "client_secret=$CLIENT_SECRET"

4. OAuth 2.0 — Client Credentials (RFC 6749 §4.4)

Pure service-to-service. No user involved. Your app is both the client and the authorized party. Used when you own both ends and just want a rotating short-lived token instead of a static API key.

bash · client credentials · server-to-server
curl -X POST https://example.com/oauth/token \
  -u "$CLIENT_ID:$CLIENT_SECRET" \
  -d "grant_type=client_credentials"

Which one to use

Pattern                    Best for                                        Complexity                         Expires?
API key                    one service calling another, internal tooling   very low                           rarely
Bearer token               modern REST APIs, most SaaS                     low                                often (rotate)
OAuth auth-code            user-delegated access, marketplace apps         high (redirects, state, refresh)   yes
OAuth client-credentials   service-to-service, you own both ends           medium                             yes (short-lived)

Headers that matter

Headers are metadata attached to every request and response — small key/value pairs sitting above the body. Most of them you never set yourself. A handful, you do, on every call you make.

Canonical one-liners from MDN:

  • Authorization — the credentials that identify you to the server (API key, Bearer token, Basic auth).
  • Content-Type — the media type of the body you're sending (application/json, multipart/form-data, etc.).
  • Accept — what media types you can handle in the response. Servers use this to decide between JSON, XML, HTML.
  • User-Agent — who's calling. Some APIs require a descriptive one (GitHub's does); others block generic curl defaults.

The X-prefix thing

Custom headers historically started with X-. That convention was deprecated in 2012 by RFC 6648 because too many “experimental” X-headers became standard and the rename broke everything. But the prefix still shows up everywhere: Fathom's X-Api-Key, GitHub's X-GitHub-Event, GHL's X-Signature.

Rule: follow the docs. If a platform sends X-Signature-256, read it. Don't invent your own X-* headers for new work — give them a real namespace instead (e.g., MyCompany-Request-Id).

bash · inspect every header with curl -v
curl -v -H "Accept: application/json" \
  -H "Authorization: Bearer $TOKEN" \
  https://httpbin.org/headers
# the -v flag shows request AND response headers inline — invaluable for
# debugging auth issues, missing Content-Type, CORS, cache behavior

Request and response bodies

Three body encodings carry everything you'll ever send over HTTP. Picking the right one is a Content-Type decision — the header tells the server how to parse what you sent. Miss it and the server rejects the payload.

JSON — the default

Modern REST APIs expect JSON. Set Content-Type: application/json and send a JSON string as the body. 99% of what you'll do in Tab 6 uses this.

bash · POST with JSON body
curl -X POST https://httpbin.org/post \
  -H "Content-Type: application/json" \
  -d '{"name":"Client A","stage":"qualified","score":87}'

Form-encoded — OAuth and legacy

application/x-www-form-urlencoded — the old HTML form format. Key/value pairs joined with &. You'll see it at OAuth token endpoints (required by RFC 6749) and some legacy APIs. curl's default body encoding is form-encoded, so no explicit Content-Type is needed.

bash · POST with form body
curl -X POST https://httpbin.org/post \
  -d "grant_type=client_credentials" \
  -d "scope=read"
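
If you ever need to assemble that body in code instead of with -d flags, the stdlib does the encoding. A small sketch using urllib.parse.urlencode (the redirect_uri value is a made-up example):

```python
from urllib.parse import urlencode

# The same body curl builds from repeated -d flags:
body = urlencode({"grant_type": "client_credentials", "scope": "read"})
print(body)  # → grant_type=client_credentials&scope=read

# Reserved characters are percent-encoded automatically:
print(urlencode({"redirect_uri": "https://app.example.com/cb"}))
# → redirect_uri=https%3A%2F%2Fapp.example.com%2Fcb
```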

Multipart — for files

multipart/form-data is the encoding for file uploads and mixed field/file forms. The body gets broken into boundary-delimited parts. You never hand-write this; use -F in curl or FormData in JS.

bash · POST with file upload
curl -X POST https://httpbin.org/post \
  -F "file=@./transcript.txt" \
  -F "caption=setter-call-2026-04-20"

Rate limits — 429, Retry-After, exponential backoff

Every serious API pushes back when you call too hard. You'll hit a 429 response. A polite server tells you how long to wait via a Retry-After header. An impolite one just sends 429 and leaves you to guess.

The 429 status

From MDN: “429 Too Many Requests indicates the client has sent too many requests in a given amount of time.” The response may include Retry-After with either a number of seconds (Retry-After: 120) or an HTTP-date. It's optional — many APIs send only the status and leave the timing to you.

live probe · httpbin rate-limit simulator
~curl -sI https://httpbin.org/status/429 | head -5
HTTP/2 429
date: Mon, 20 Apr 2026 12:00:00 GMT
content-type: text/html; charset=utf-8
content-length: 0
server: gunicorn

Exponential backoff with jitter

If Retry-After is set, respect it. If not, back off exponentially: 1s, 2s, 4s, 8s — capped somewhere sensible — plus a random jitter so two simultaneous clients don't synchronise their retries (“thundering herd”).

ts · fetchWithBackoff.ts · respects Retry-After, caps at 5 attempts
export async function fetchWithBackoff(
  url: string,
  init: RequestInit = {},
  attempt = 0,
  maxAttempts = 5,
): Promise<Response> {
  const res = await fetch(url, init);
  // only retry on 429 and 5xx
  if (res.status !== 429 && res.status < 500) return res;
  if (attempt >= maxAttempts) {
    throw new Error(
      `backed off ${attempt} times, giving up at ${res.status}`,
    );
  }
  const retryAfter = Number(res.headers.get("Retry-After"));
  const delay = Number.isFinite(retryAfter)
    ? retryAfter * 1000
    : Math.min(30_000, 2 ** attempt * 1_000);
  const jitter = Math.floor(Math.random() * 500);
  await new Promise((r) => setTimeout(r, delay + jitter));
  return fetchWithBackoff(url, init, attempt + 1, maxAttempts);
}

Pagination — three styles, same goal

No real list endpoint returns everything at once. APIs paginate — “give me the first page of 50 contacts” — and hand you a way to ask for the next page. Three styles cover 99% of real APIs.

1. Cursor-based

The server returns a list plus an opaque token pointing to the next page. You send the token back on your next request. Fathom, Stripe, Shopify, most modern platforms use this. Scales to any list size; the cursor hides offset/limit internals from you.

bash · cursor pagination · Fathom-style
# First page
curl -H "X-Api-Key: $KEY" "https://api.example.com/items?limit=50"
# response body: { "data": [...], "next_cursor": "eyJvZmZzZXQiOjUwfQ==" }

# Next page — pass the cursor back
curl -H "X-Api-Key: $KEY" \
  "https://api.example.com/items?limit=50&cursor=eyJvZmZzZXQiOjUwfQ=="
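
The loop that drains a cursor-paginated endpoint is the same everywhere. A sketch with the HTTP call abstracted behind a fetch_page callable — the response shape mirrors the example above, and the in-memory fake client is purely illustrative:

```python
from typing import Callable, Iterator, Optional

Page = dict  # {"data": [...], "next_cursor": str | None}

def paginate(fetch_page: Callable[[Optional[str]], Page]) -> Iterator[dict]:
    """Yield every item, following next_cursor until the server stops sending one."""
    cursor = None
    while True:
        page = fetch_page(cursor)
        yield from page["data"]
        cursor = page.get("next_cursor")
        if not cursor:
            break

# Fake in-memory client standing in for the curl calls above: two pages.
pages = {
    None: {"data": [{"id": 1}, {"id": 2}], "next_cursor": "abc"},
    "abc": {"data": [{"id": 3}], "next_cursor": None},
}
items = list(paginate(lambda cursor: pages[cursor]))
print([i["id"] for i in items])  # → [1, 2, 3]
```

Swap the lambda for a real HTTP call and the loop doesn't change.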

2. Offset / limit

The oldest pattern. ?page=3&per_page=50 or ?offset=100&limit=50. Simple to implement, simple to explain. Slow on deep pages — the server has to scan past all skipped rows.

bash · offset pagination
curl "https://api.example.com/items?page=3&per_page=50"
# response: { "items": [...], "total": 1432, "page": 3, "per_page": 50 }

3. Link header — RFC 8288

The server puts next/prev/first/last URIs in a Link response header. Your client follows them like hyperlinks. RFC 8288 replaced RFC 5988 in 2017 with the same pagination semantics — when docs reference either, they mean this.

http · GitHub · verbatim Link header example from docs
link: <https://api.github.com/repositories/1300192/issues?page=2>; rel="prev",
      <https://api.github.com/repositories/1300192/issues?page=4>; rel="next",
      <https://api.github.com/repositories/1300192/issues?page=515>; rel="last",
      <https://api.github.com/repositories/1300192/issues?page=1>; rel="first"
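
Pulling the rel targets out of that header is a few lines of regex. A simplified sketch that ignores the RFC's full parameter grammar (unquoted rel values, extra params):

```python
import re

def parse_link_header(value: str) -> dict[str, str]:
    """Map each rel ("next", "prev", ...) to its URL. Simplified RFC 8288 parsing."""
    return {
        rel: url
        for url, rel in re.findall(r'<([^>]+)>;\s*rel="([^"]+)"', value)
    }

header = (
    '<https://api.github.com/repositories/1300192/issues?page=2>; rel="prev", '
    '<https://api.github.com/repositories/1300192/issues?page=4>; rel="next"'
)
links = parse_link_header(header)
print(links["next"])  # → https://api.github.com/repositories/1300192/issues?page=4
```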

Webhooks — HMAC-SHA256 verification, side-by-side

A webhook is a reverse API. Instead of you calling the platform, the platform calls you. GHL sends a POST to your server when a contact's pipeline stage changes. Fathom sends one when a meeting transcript is ready. Stripe sends one when a customer pays.

The problem: anyone on the internet can POST to your public URL. How do you know it's actually GHL and not an attacker? Answer: the platform signs the body with a shared secret using HMAC-SHA256, and you verify the signature before trusting the payload.

Webhook verification, every inbound event: the platform sends the event plus a signature → your server reads the RAW body → computes HMAC sha256(secret, body) → compares with a timing-safe check. Match → process. Mismatch → 401, drop, log.

The self-test values for this section

Both snippets below produce the same digest from the same inputs. The expected digest is baked into each code comment so you can verify the pair works end-to-end before deploying anything.

  • Secret: claude-agency-playbook-v1
  • Payload: {"event":"test","ts":"2026-04-20T00:00:00Z"}
  • Expected HMAC-SHA256 (hex): b5dd7c1924a8ae47fc06ff12b9fe1cfcd354eb9d265e5f1ee8e44da2acb73949

TypeScript / Node

ts · verify.ts · Node built-in crypto
import { createHmac, timingSafeEqual } from "node:crypto";

export function verifyWebhook(
  rawBody: string,
  signatureHex: string,
  secret: string,
): boolean {
  const expected = createHmac("sha256", secret)
    .update(rawBody)
    .digest("hex");
  const a = Buffer.from(expected, "hex");
  const b = Buffer.from(signatureHex, "hex");
  return a.length === b.length && timingSafeEqual(a, b);
}

// self-test — run with: node --experimental-strip-types verify.ts
const sig = createHmac("sha256", "claude-agency-playbook-v1")
  .update('{"event":"test","ts":"2026-04-20T00:00:00Z"}')
  .digest("hex");
console.log(sig);
// → b5dd7c1924a8ae47fc06ff12b9fe1cfcd354eb9d265e5f1ee8e44da2acb73949

Python 3

python · verify.py · stdlib hmac module
import hmac
import hashlib


def verify_webhook(raw_body: bytes, signature_hex: str, secret: str) -> bool:
    expected = hmac.new(
        secret.encode(), raw_body, hashlib.sha256
    ).hexdigest()
    return hmac.compare_digest(expected, signature_hex)


# self-test — run with: python3 verify.py
sig = hmac.new(
    b"claude-agency-playbook-v1",
    b'{"event":"test","ts":"2026-04-20T00:00:00Z"}',
    hashlib.sha256,
).hexdigest()
print(sig)
# → b5dd7c1924a8ae47fc06ff12b9fe1cfcd354eb9d265e5f1ee8e44da2acb73949

Parity check — both implementations produce the same digest

parity test · both should print the same hex
~node verify.ts
b5dd7c1924a8ae47fc06ff12b9fe1cfcd354eb9d265e5f1ee8e44da2acb73949
~python3 verify.py
b5dd7c1924a8ae47fc06ff12b9fe1cfcd354eb9d265e5f1ee8e44da2acb73949

Error handling — 4xx, 5xx, idempotency, circuit breakers

Errors split cleanly into two families. The first digit of the status code tells you everything you need to know about whether retrying helps.

4xx · your request was wrong

400 bad body · 401 no auth · 403 forbidden · 404 not there · 422 validation failed. Retrying won't help — the request itself is broken. Fix it and resend.

5xx · the server broke

500 generic fault · 502 bad gateway · 503 unavailable · 504 timeout. Retrying might help — the server was transiently wrong. Back off and try again.

Idempotency · safe POST retries

Send Idempotency-Key: <uuid>. The server dedupes within a window (Stripe uses 24h). Same key → same result, no duplicate charge / contact / order.
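
The server-side half of the contract can be sketched in a few lines — an in-memory dedupe table standing in for whatever persistent store a real platform uses (names are ours, not Stripe's):

```python
import uuid

class IdempotentHandler:
    """Dedupe by key: a repeated key replays the stored result instead of re-executing."""
    def __init__(self) -> None:
        self._seen: dict[str, dict] = {}

    def handle(self, key: str, payload: dict) -> dict:
        if key in self._seen:
            return self._seen[key]              # duplicate request → same result
        result = {"charge_id": str(uuid.uuid4()), "amount": payload["amount"]}
        self._seen[key] = result                # real server: persistent store + TTL
        return result

server = IdempotentHandler()
key = str(uuid.uuid4())                         # client: one key per logical action
first = server.handle(key, {"amount": 5000})
retry = server.handle(key, {"amount": 5000})    # e.g. resent after a network timeout
assert first == retry                           # exactly one charge was created
```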

Circuit breaker · stop hammering a broken endpoint

After N consecutive failures, trip open. Don't send real requests for a cooldown window. Then send a probe; if it passes, close. Prevents cascading failure.
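
A minimal breaker fits in one small class. This sketch omits the cooldown timer — a real one would only allow the probe after a delay:

```python
class CircuitBreaker:
    """Open after `threshold` consecutive failures; a passing probe closes it again."""
    def __init__(self, threshold: int = 3) -> None:
        self.threshold = threshold
        self.failures = 0
        self.open = False

    def call(self, fn):
        if self.open:
            raise RuntimeError("circuit open: not calling downstream")
        try:
            result = fn()
        except Exception:
            self.failures += 1
            if self.failures >= self.threshold:
                self.open = True                # trip: stop hammering the endpoint
            raise
        self.failures = 0                       # any success resets the streak
        return result

    def probe(self, fn):
        """One real request after the cooldown; success closes the circuit."""
        result = fn()                           # a failure here leaves it open
        self.open = False
        self.failures = 0
        return result

breaker = CircuitBreaker(threshold=2)

def failing():
    raise ConnectionError("503")

for _ in range(2):
    try:
        breaker.call(failing)
    except ConnectionError:
        pass

assert breaker.open                             # tripped after 2 straight failures
assert breaker.probe(lambda: "ok") == "ok"      # probe passes → circuit closes
assert not breaker.open
```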

Wrong · blind retry on 422: if the server said the request body was invalid, retrying with the same body gets the same rejection. You just wasted quota and delayed the real fix.

Right · read the response, surface the reason: parse response.json(), log the validation error to your ops channel, and treat it as a “Claude needs to look at this” event — not retry fuel.

Storing keys safely

Every key in this course is a live credential. If you leak it, rotate immediately. If you commit it to git, rotate and treat everything before the rotation as compromised.

1. .env file + .gitignore

The baseline for local development. A plain-text file at the project root, loaded by the framework or by dotenv. Never committed. .gitignore must list it before your first commit.

bash · .env · always gitignored
# .gitignore
.env
.env.local
.env.*.local

# .env (at repo root — never committed)
ANTHROPIC_API_KEY=sk-ant-xxx...
FATHOM_API_KEY=fm_xxx...
GHL_PRIVATE_TOKEN=pit-xxx...

2. macOS Keychain

Secrets stored in the OS keychain, read into env at shell startup. Survives reboots, encrypted at rest, optionally syncs across your devices via iCloud Keychain. The security CLI has been on macOS forever.

keychain-backed env var
~security add-generic-password -a $USER -s claude-key -w sk-ant-xxx...
# stored in login keychain
~security find-generic-password -a $USER -s claude-key -w
sk-ant-xxx...
~# add this line to ~/.zshrc
~export ANTHROPIC_API_KEY=$(security find-generic-password -a $USER -s claude-key -w)

3. 1Password CLI

Team-friendly. Secrets live in a shared vault; the CLI pulls them on demand. Zero plaintext on disk, and rotation/sharing is a vault click rather than a grep-across-machines exercise.

bash · 1Password CLI · pull at runtime
# install once
brew install 1password-cli

# biometric signin (Touch ID on Mac)
eval $(op signin)

# read a single secret
export FATHOM_API_KEY=$(op read "op://Private/Fathom/api_key")

# or wrap an entire command so secrets never touch env
op run -- node fathom-sync.ts

4. Vercel environment variables

For deploys. Three scopes — Production, Preview, Development — each populated independently from the dashboard or the CLI. Pull them locally into .env.local with vercel env pull.

bash · vercel env · three scopes
# add a production-only var via CLI (prompts for value)
vercel env add ANTHROPIC_API_KEY production

# sync Development-scope vars to .env.local for local dev
vercel env pull .env.local

# list everything in the project
vercel env ls

Wrong · git add .env && git push: the key is now in the git history forever. Force-pushing doesn't help — GitHub indexes dangling commits and surveillance bots crawl new commits continuously.

Right · .env in .gitignore, Keychain locally, Vercel for deploys: plaintext never enters the repo. Local dev reads from the OS secret store. Production reads from the deploy platform. Rotation happens in one place, not a codebase scan.

Testing tools — curl, HTTPie, Postman, Insomnia

Before you wire a new API into a Claude-built pipeline, you hit it manually. Four tools cover 95% of that work. The first is always installed; the other three are conveniences you pick by taste.

curl — the one you always have

Ships with every Mac, every Linux, and WSL. No install. Ugly syntax, unmatched power. The examples throughout this tab already use it. Pair it with httpbin.org, a free live sandbox, to practice without spending real quota.

curl · live httpbin probe
~curl -s https://httpbin.org/anything -H "X-Api-Key: demo" | head -12
{
  "args": {},
  "data": "",
  "files": {},
  "form": {},
  "headers": {
    "Accept": "*/*",
    "Host": "httpbin.org",
    "User-Agent": "curl/8.7.1",
    "X-Api-Key": "demo"
  },
  "json": null,
  "method": "GET",

HTTPie — curl for humans

Same idea, friendlier syntax. Coloured output, JSON by default, short auth helpers. Best when you're debugging interactively. Install with brew install httpie on Mac or pipx install httpie via Python.

HTTPie · same probe
~http GET httpbin.org/anything X-Api-Key:demo
HTTP/2 200 OK
content-type: application/json

{
  "args": {},
  "headers": {
    "X-Api-Key": "demo"
  },
  "method": "GET",
  "url": "https://httpbin.org/anything"
}

GUIs — Postman and Insomnia

Postman

GUI platform for requests, collections, environments, test scripts, and team collaboration. Pre-request scripts set up state; test scripts assert responses. Free tier for solo use, paid tiers for teams and automation.

Insomnia (Kong)

Open-source GUI. Owned by Kong since 2019. Design-first via OpenAPI spec, local vault + Git sync, scratchpad mode for fully offline work. Slightly lighter than Postman; strong pick for privacy-focused shops.

Pick one — the short version

Tool       Install                                     Best for                              Cost
curl       already there                               scripts, CI, quick one-liners         free
HTTPie     brew install httpie · pipx install httpie   interactive debugging                 free
Postman    postman.com download                        team collections, runners, monitors   free + paid
Insomnia   insomnia.rest (Kong) download               design-first, OpenAPI, privacy        free + paid

Plug any new API into Claude — the meta-pattern

This is the single most reusable pattern in the course. The rest of Tab 6 — GHL, Fathom, Typeform, Zapier, Discord — is just this loop applied to named platforms. Once you've got it internalised, plugging a new API into your Claude workflow takes about 15 minutes.

The loop

  1. Paste docs URL into Claude. Give it the auth section and the one endpoint you care about today. Not the whole API.
  2. Ask Claude to scaffold. A minimal curl that exercises exactly that endpoint, with auth wired from env. Stop before any TypeScript.
  3. Verify with one test call. Run the curl. Paste the response back. Claude confirms it matches the docs — same status code, same shape, no surprises.
  4. Wrap in a typed function. Only after the curl works. Typed params and return, reads env, throws on non-2xx, one inline test call at the bottom.

Three prompt templates below, one per step. Copy them, paste into a Claude session, fill in the bracketed bits, and you're integrating a new API without hand-writing boilerplate.

prompt · template 1 · scaffold a new API
I'm going to use [API NAME] in [PROJECT NAME].

Official docs: [DOCS URL]

Read ONLY the auth section and the one endpoint I care about today: [ENDPOINT NAME + METHOD + PATH].

Then:
1. Show me the auth pattern (API key / Bearer / OAuth) verbatim from the docs.
2. Give me a minimal curl that exercises that endpoint — one I can paste into my terminal right now with [ENV VAR NAME] exported.
3. STOP. Wait for me to confirm the curl works before writing any TypeScript.

If the docs don't cover this endpoint, say so in one sentence — don't guess.
prompt · template 2 · verify with one test call
I ran the curl. Here's the response:

[PASTE STATUS LINE + HEADERS + BODY]

Confirm or flag:
- Status code matches what docs promised (2xx)?
- Response shape matches the example in docs?
- Any unexpected fields, auth prompts, rate-limit notices, version warnings?

One-sentence verdict. If anything is off, point at it. Don't scaffold code yet.
prompt · template 3 · wrap into a typed function
The curl works. Wrap it into a TypeScript function at [PATH/FILE.ts].

Requirements:
- Named export: [functionName]
- Params typed from the docs
- Return type modeled from the actual response I pasted (not from the docs' example)
- Reads auth from process.env.[VAR NAME]
- Use the standard fetch API with explicit Content-Type and Authorization headers — no hand-rolled wrapper library
- Throw a typed Error on non-2xx, including the response body as a string
- One self-contained import block at the top
- Add one inline test call at the bottom, commented out, that I can uncomment and run with tsx

Show me the function. Then stop and wait — I'll run it and report back.