Puppeteer Stealth: How to Avoid Bot Detection (Without Getting Your IP Burned)
If you’re searching for puppeteer stealth, you’re probably here for one reason:
Your script works locally… and gets blocked the moment you run it at scale.
This post is the practical, 2026 version of stealth:
- what “stealth” actually means (and what it doesn’t)
- the most common fingerprint mistakes
- how to configure Puppeteer to look less like a bot
- how to avoid burning your IPs
- when to stop fighting and switch to a different approach
Stealth isn’t just plugins — it’s consistent network behavior, retries, and not reusing burned IPs. ProxiesAPI helps you rotate IPs and keep crawl coverage stable.
First principles: what sites detect
Modern bot defenses don’t rely on one signal. They blend:
- Network signals
  - IP reputation (datacenter vs residential)
  - request rate and burstiness
  - TLS fingerprint / JA3-like signals
- Browser fingerprint
  - headless indicators
  - WebGL, canvas, audio
  - fonts, screen size, locale
- Behavior
  - instant interactions
  - no scrolling
  - unrealistic navigation
- Consistency (or the lack of it)
  - same IP + new fingerprint every request
  - timezone mismatch with IP region
So “stealth” isn’t a single flag.
It’s a system that keeps your traffic plausible and consistent.
The stealth spectrum (don’t overpay)
Not every target needs a full stealth stack.
Here’s a good mental model:
| Target | Typical defenses | Recommended approach |
|---|---|---|
| Docs/blogs | low | requests + HTML parsing |
| Small e-comm | rate limits | requests + proxies + retries |
| JS-heavy apps | dynamic rendering | Playwright/Puppeteer (headful when needed) |
| High-value marketplaces | advanced | browser + residential proxies + strict pacing |
A lot of people jump straight to a headless browser plus stealth plugins.
Often the cheaper fix is simply to:
- slow down
- rotate IPs
- keep sessions consistent
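"Slow down" can be as simple as exponential backoff with jitter between retries. Here's a minimal sketch (the function name and constants are illustrative, not from any library):

```javascript
// Exponential backoff with "equal jitter": half the delay is fixed,
// half is random, so retries from many workers don't synchronize.
// attempt 0 → ~0.5–1s, attempt 1 → ~1–2s, …, capped at capMs.
function backoffDelay(attempt, baseMs = 1000, capMs = 30000) {
  const exp = Math.min(capMs, baseMs * 2 ** attempt);
  return Math.floor(exp / 2 + Math.random() * (exp / 2));
}
```

Sleep for `backoffDelay(attempt)` ms after each 403/429, and reset `attempt` to 0 on success.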
Core Puppeteer setup (baseline)
Use a recent Chromium, set realistic viewport + locale, and control headless mode.
```javascript
// npm i puppeteer
import puppeteer from "puppeteer";

const UA =
  "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) " +
  "AppleWebKit/537.36 (KHTML, like Gecko) " +
  "Chrome/123.0.0.0 Safari/537.36";

export async function launchBrowser({ headless = true } = {}) {
  const browser = await puppeteer.launch({
    headless,
    args: [
      "--no-sandbox",
      "--disable-setuid-sandbox",
      "--disable-dev-shm-usage",
      "--lang=en-US,en",
    ],
  });
  const page = await browser.newPage();
  await page.setUserAgent(UA);
  await page.setViewport({ width: 1366, height: 768 });
  await page.setExtraHTTPHeaders({ "Accept-Language": "en-US,en;q=0.9" });
  return { browser, page };
}
```
Why this matters
- Small viewport sizes and weird languages are common automation giveaways.
- Headless is fine for many sites, but some targets still treat headless differently.
Stealth plugins: useful, not magical
The popular option is puppeteer-extra + stealth plugin.
```javascript
// npm i puppeteer-extra puppeteer-extra-plugin-stealth
import puppeteer from "puppeteer-extra";
import StealthPlugin from "puppeteer-extra-plugin-stealth";

puppeteer.use(StealthPlugin());

export async function launchStealth({ headless = true } = {}) {
  const browser = await puppeteer.launch({
    headless,
    args: ["--no-sandbox", "--disable-setuid-sandbox", "--lang=en-US,en"],
  });
  const page = await browser.newPage();
  await page.setViewport({ width: 1366, height: 768 });
  return { browser, page };
}
```
Where it helps:
- removes obvious `navigator.webdriver` signals
- patches some headless-specific quirks
Where it doesn’t:
- bad IP reputation
- aggressive rate limiting
- behavior that looks automated
The biggest stealth mistake: “new fingerprint every request”
People rotate everything:
- user agent
- viewport
- timezone
- language
…on every request.
That often looks more suspicious.
A better model:
- Create a session profile and reuse it for a while.
- Rotate IP when you need to, but keep the browser fingerprint stable per session.
Session profile example
```javascript
export function makeSessionProfile(seed = 1) {
  // Keep it deterministic for a session.
  const viewports = [
    { width: 1366, height: 768 },
    { width: 1440, height: 900 },
    { width: 1536, height: 864 },
  ];
  const vp = viewports[seed % viewports.length];
  return {
    viewport: vp,
    locale: "en-US",
    timezone: "America/New_York",
  };
}
```
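To apply a profile, set it once per page at session start and then leave it alone. A sketch using real Puppeteer page methods (`setViewport`, `emulateTimezone`, `setExtraHTTPHeaders`); `applyProfile` itself is a hypothetical helper name:

```javascript
// Apply one session profile to a page, once, at session start.
// Reuse the same profile for every request in that session.
async function applyProfile(page, profile) {
  await page.setViewport(profile.viewport);
  await page.emulateTimezone(profile.timezone);
  await page.setExtraHTTPHeaders({
    "Accept-Language": `${profile.locale},en;q=0.9`,
  });
}
```

Usage: `await applyProfile(page, makeSessionProfile(42));` — same seed, same fingerprint, for the whole session.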
Behavior: add pacing and real navigation
If your script:
- loads a page
- instantly clicks five things
- extracts content
- closes
…that’s bot behavior.
Do this instead:
- add jitter
- scroll
- wait for network to go idle
```javascript
function sleep(ms) {
  return new Promise((r) => setTimeout(r, ms));
}

function jitter(baseMs, spreadMs = 300) {
  return baseMs + Math.floor(Math.random() * spreadMs);
}

export async function humanize(page) {
  await sleep(jitter(700));
  await page.mouse.move(200, 200);
  await sleep(jitter(400));
  await page.evaluate(() => window.scrollBy(0, 400));
  await sleep(jitter(800));
}
```
Proxies: how to not burn your IPs
If you only take one thing from this article, make it this:
Stealth without a proxy strategy just burns IPs more slowly.
Practical proxy rules
- don’t hammer one IP
- don’t use a single IP across multiple domains simultaneously
- rotate when you see block signals (403/429, captcha pages)
- keep a cooldown list for “burned” IPs
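A cooldown list can be as small as a map of "burned until" timestamps. A minimal in-memory sketch (`ProxyPool` is a hypothetical name, not a ProxiesAPI or Puppeteer API):

```javascript
// Round-robin pool that skips proxies still in their cooldown window.
class ProxyPool {
  constructor(proxies, cooldownMs = 10 * 60 * 1000) {
    this.proxies = proxies;
    this.cooldownMs = cooldownMs;
    this.burnedUntil = new Map(); // proxy -> timestamp it becomes usable
    this.i = 0;
  }

  // Next usable proxy, or null if everything is cooling down.
  next(now = Date.now()) {
    for (let n = 0; n < this.proxies.length; n++) {
      const p = this.proxies[this.i % this.proxies.length];
      this.i++;
      if ((this.burnedUntil.get(p) ?? 0) <= now) return p;
    }
    return null;
  }

  // Call on 403/429/captcha; the proxy sits out for cooldownMs.
  markBurned(proxy, now = Date.now()) {
    this.burnedUntil.set(proxy, now + this.cooldownMs);
  }
}
```

When `next()` returns null, back off instead of forcing a burned IP back into rotation.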
Using a proxy with Puppeteer
Puppeteer takes a proxy server via launch args. Note that Chromium ignores credentials embedded in the `--proxy-server` URL, so pass them through `page.authenticate()` instead:

```javascript
const browser = await puppeteer.launch({
  headless: true,
  args: [
    "--no-sandbox",
    "--disable-setuid-sandbox",
    "--proxy-server=http://HOST:PORT",
  ],
});
const page = await browser.newPage();
await page.authenticate({ username: "USERNAME", password: "PASSWORD" });
```
If you’re using ProxiesAPI as your proxy layer, the principle is the same:
- keep requests stable
- rotate IPs when blocked
- avoid bursty traffic patterns
Detection signals you should log
To debug stealth, log these per request:
- status code
- final URL (redirects)
- response size
- presence of keywords like `captcha` or `verify you are human`
- screenshot on failure
In Puppeteer:
```javascript
page.on("response", (res) => {
  const url = res.url();
  const status = res.status();
  if (status >= 400) {
    console.log("HTTP", status, url);
  }
});
```
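The status and keyword checks combine into a single block-signal test. A sketch — the pattern list is illustrative, so extend it for your targets:

```javascript
// Heuristic: treat these statuses and body keywords as block signals.
const BLOCK_PATTERNS = [/captcha/i, /verify you are human/i, /access denied/i];

function looksBlocked(status, html) {
  if (status === 403 || status === 429) return true;
  return BLOCK_PATTERNS.some((re) => re.test(html));
}
```

When `looksBlocked(...)` fires: take the screenshot, mark the IP burned, and back off before retrying.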
Comparison table: common stealth tactics
| Tactic | Helps? | When to use | Risk |
|---|---|---|---|
| Stealth plugin | sometimes | generic bot checks | can break sites |
| Headful mode | often | headless-blocked targets | slower |
| Residential proxies | big help | high-value targets | cost |
| Slow pacing | huge help | almost always | slower throughput |
| Randomize everything | usually no | rarely | looks inconsistent |
When to stop using Puppeteer (and do something else)
Use Puppeteer when you need rendering.
But if your target has usable underlying APIs or structured data:
- scrape the JSON endpoints
- parse JSON-LD
- use server-rendered HTML
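JSON-LD, for example, can often be pulled from server-rendered HTML with no browser at all. A regex-based sketch — fine for quick extraction, though a real HTML parser is safer for production:

```javascript
// Extract and parse every <script type="application/ld+json"> block
// from an HTML string. Malformed JSON blocks are skipped.
function extractJsonLd(html) {
  const re =
    /<script[^>]*type=["']application\/ld\+json["'][^>]*>([\s\S]*?)<\/script>/gi;
  const out = [];
  let m;
  while ((m = re.exec(html)) !== null) {
    try {
      out.push(JSON.parse(m[1]));
    } catch {
      // skip blocks that aren't valid JSON
    }
  }
  return out;
}
```

Fetch the page HTML with a plain HTTP client, run `extractJsonLd`, and you often get product/article data structured for free.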
Browsers are expensive. They should be your last resort, not your default.
Where ProxiesAPI fits (honestly)
ProxiesAPI won’t make a badly-behaved bot “undetectable.”
But it helps keep your crawl stable by:
- rotating IPs
- reducing repeated failures
- letting you pace requests without losing coverage
Combine it with realistic sessions, pacing, and failure logging — that’s real puppeteer stealth in 2026.