The performance layer for AI agents

Just the deltas,
nothing more.

Every time your AI agent looks at a page, it processes the whole thing. Again. JD Codec is a stateful compression layer that sends only what changed. Your agent sees less, understands more, and acts faster.
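The principle is easy to sketch. The snippet below is not JD Codec's actual wire format, just a minimal illustration using Python's standard difflib: instead of re-sending the full page snapshot on every turn, send a line-level diff against the previous snapshot, which stays small whenever most of the page is unchanged.

```python
import difflib

def snapshot_delta(prev: str, curr: str) -> str:
    """Return a unified diff between two page snapshots.

    When only a few lines change between agent turns, this delta
    is a tiny fraction of the size of the full current snapshot.
    """
    diff = difflib.unified_diff(
        prev.splitlines(keepends=True),
        curr.splitlines(keepends=True),
        fromfile="previous", tofile="current",
    )
    return "".join(diff)

# Hypothetical page snapshots before and after applying a filter:
prev = "Products: 120 items\nFilter: none\nPage 1 of 12\n"
curr = "Products: 34 items\nFilter: status=active\nPage 1 of 4\n"
print(snapshot_delta(prev, curr))  # only the changed lines, not the whole page
```

On an unchanged page the delta is empty, so the agent pays nothing to "re-read" state it has already seen.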

86%
Token reduction
1,689k → 230k input tokens
+16.6pp
Success rate gain
66.7% → 83.3%
4.6%
Faster wall-clock
207.8s → 198.2s across 6 tasks

Same task. Same model. Same tools.

One finishes before
the other starts reading.

Live demo: the standard agent and the JD Codec agent run the same task ("Filter products by status"), each stepping through read page state → find filter control → apply filter → verify results, with running counters for tokens, cost, and time.

Same model · Same task · Same tools

99% fewer tokens. $2.23 saved per task.


Setup

Three steps
to 86% savings.

01

Install

One command. No dependencies beyond Python.

pip install jdcodec
02

Connect

Drop-in layer between your agent and the browser. No changes to your agent, model, or tools.

jdcodec start
03

Save

86% fewer input tokens. Higher task success rates. Faster execution. Your agent sees only what changed, and performs better because of it.

Benchmarked

Head-to-head with
standard browser automation.

Same model (Claude Sonnet 4). Same tasks. Same environment. The only difference: standard snapshots vs. JD Codec.

Task                       | Baseline | JD Codec | Baseline tokens | JD Codec tokens | Reduction
Navigate to Products       | PASS     | PASS     | 108.8k          | 10.2k           | -91%
Navigate to Customers      | PASS     | PASS     | 30.6k           | 8.6k            | -72%
Navigate to Orders         | PASS     | PASS     | 54.3k           | 9.9k            | -82%
Filter products by status  | PASS     | PASS     | 748.9k          | 5.2k            | -99%
Change customer group      | FAIL     | FAIL     | 203.2k          | 50.8k           | -75%
Toggle product status      | FAIL     | PASS     | 543.5k          | 145.3k          | -73%
Total (6 tasks)            | 4/6      | 5/6      | 1,689k          | 230k            | -86.4%

Methodology: 6 real-world tasks on a production web app. Input tokens counted with tiktoken. Wall-clock time includes all latency.
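The headline figure follows directly from the per-task totals above; a quick check, using the token counts from the table (in thousands):

```python
# Per-task input tokens (k), taken from the benchmark table above.
baseline = [108.8, 30.6, 54.3, 748.9, 203.2, 543.5]
jdcodec  = [10.2, 8.6, 9.9, 5.2, 50.8, 145.3]

total_base, total_jd = sum(baseline), sum(jdcodec)
reduction = (1 - total_jd / total_base) * 100
print(f"{total_base:.1f}k -> {total_jd:.1f}k ({reduction:.1f}% reduction)")
# -> 1689.3k -> 230.0k (86.4% reduction)
```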

Stop burning tokens
on things that didn't change.

Get early access to JD Codec. Be the first to know when we launch.