Integration Guide
Scrape Apollo leads, extract Google Maps businesses, find business emails, validate deliverability, look up mobile numbers, run skip traces, and export CSVs - all with natural language from your terminal.
Claude Code is Anthropic's terminal-based AI coding assistant. It reads files, runs commands, and executes multi-step workflows from your command line. On its own it has no access to B2B data. The ScraperCity MCP server bridges that gap.
MCP (Model Context Protocol) is an open standard that lets Claude Code connect to external tools and APIs through a standardized protocol. When you add the ScraperCity MCP server, Claude Code gains access to every ScraperCity scraper as a named tool it can call directly - no copy-pasting curl commands, no manual API calls, no scripting required on your end.
The result: you describe what data you need in plain English, and Claude Code figures out which scraper to use, handles authentication with your API key, polls for results, paginates through every page, and saves the output to a file. It can chain multiple scrapers together in a single session - for example, scraping Apollo leads, then running email validation on the results, then looking up mobile numbers, all from one prompt.
One config change. Claude Code gets access to every ScraperCity scraper. No plugins, no extra installs beyond npx.
Go to app.scrapercity.com/dashboard/api-docs and copy your API key. Any plan ($49, $149, or $649) gives full API access to all scrapers.
Add to ~/.claude/mcp.json:
```json
{
  "mcpServers": {
    "scrapercity": {
      "command": "npx",
      "args": ["-y", "--package", "scrapercity", "scrapercity-mcp"],
      "env": {
        "SCRAPERCITY_API_KEY": "your_api_key_here"
      }
    }
  }
}
```

Windows users: wrap the command with cmd /c so Windows can execute npx. Change "command": "npx" to "command": "cmd" and add "/c", "npx" as the first two entries in the args array.
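Applying that Windows note to the config above gives the following variant (same package and key, only the command wrapper changes):

```json
{
  "mcpServers": {
    "scrapercity": {
      "command": "cmd",
      "args": ["/c", "npx", "-y", "--package", "scrapercity", "scrapercity-mcp"],
      "env": {
        "SCRAPERCITY_API_KEY": "your_api_key_here"
      }
    }
  }
}
```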
Close and reopen Claude Code completely - not just the window. MCP server changes only take effect after a full restart. You can verify the server is registered by running claude mcp list in your terminal.
Claude Code can now scrape leads, find emails, validate contacts, and more. Just ask:
“Find 2000 marketing directors at SaaS companies in California, validate their emails, and save to leads.csv”
Claude Code chains Apollo scrape → email validation automatically. Downloads and merges the CSVs for you.
If you prefer not to use MCP, you can give Claude Code the API details directly and it will make HTTP requests on its own.
Add this to your shell profile (~/.bashrc, ~/.zshrc, etc.):
```bash
export SCRAPERCITY_API_KEY="your_api_key_here"
```

Alternatively, create a .env file in your project directory. Claude Code reads .env files automatically.
Paste this at the start of your session, or save it in a CLAUDE.md file in your project root so Claude Code loads it automatically every session:
I need you to call the ScraperCity API.
Base URL: https://app.scrapercity.com/api/v1
Auth: Bearer $SCRAPERCITY_API_KEY
Available endpoints:
- POST /scrape/apollo - B2B contacts by title, industry, location ($0.0039/lead, 11-48+ hour delivery)
- POST /scrape/maps - Google Maps businesses ($0.01/place, 5-30 min)
- POST /scrape/email-validator - Verify email deliverability ($0.0036/email)
- POST /scrape/email-finder - Find business email from name + company ($0.05/contact)
- POST /scrape/mobile-finder - Phone numbers from LinkedIn/email ($0.25/input)
- POST /scrape/people-finder - Skip trace ($0.02/result)
- POST /scrape/store-leads - Shopify/WooCommerce stores ($0.0039/lead)
- POST /scrape/builtwith - Sites using a technology ($4.99/search)
- GET /database/leads - millions of B2B contacts, instant query (requires $649/mo plan)
- GET /wallet - Check balance
- GET /scrape/status/:runId - Poll scrape status
- GET /scrape/logs/:runId - Download results
All POST /scrape endpoints accept { url, limit } and return { runId }.
Poll status until complete, then download results as CSV.

Tell Claude Code what data you need. It will pick the right endpoint, build the API call, handle polling and pagination, and save the output.
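The POST-then-poll flow described above can be sketched in Python. The base URL, endpoint paths, Bearer auth, and the { runId } response come from the context block; the "complete" status value and the shape of the status response are assumptions, so treat this as a starting point rather than a definitive client:

```python
import json
import os
import time
import urllib.request

BASE_URL = "https://app.scrapercity.com/api/v1"

def build_request(method, path, payload=None, api_key=None):
    """Construct an authenticated request for the ScraperCity API."""
    key = api_key or os.environ["SCRAPERCITY_API_KEY"]
    return urllib.request.Request(
        BASE_URL + path,
        data=json.dumps(payload).encode() if payload is not None else None,
        headers={"Authorization": f"Bearer {key}",
                 "Content-Type": "application/json"},
        method=method,
    )

def run_scrape(endpoint, body, poll_seconds=60):
    """Start a scrape, poll its status until done, return the raw results."""
    with urllib.request.urlopen(build_request("POST", endpoint, body)) as r:
        run_id = json.loads(r.read())["runId"]
    while True:  # "complete" status value is an assumption
        with urllib.request.urlopen(
                build_request("GET", f"/scrape/status/{run_id}")) as r:
            if json.loads(r.read()).get("status") == "complete":
                break
        time.sleep(poll_seconds)
    with urllib.request.urlopen(
            build_request("GET", f"/scrape/logs/{run_id}")) as r:
        return r.read()  # results payload, ready to write to a file
```

For Apollo scrapes, which run for many hours, prefer the webhook setup described later over a polling loop like this.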
“Find 2000 marketing directors at SaaS companies in California, validate their emails, find their mobile numbers, and save everything to leads.csv”
Claude Code chains 3 tools: Apollo scrape → email validation → mobile finder. Downloads and merges the CSVs automatically.
“Scrape all plumbers in Denver from Google Maps, then find the business owner's email for each one”
Maps scrape returns businesses. Claude Code extracts names + domains, feeds them to email finder, merges results.
“Check my ScraperCity balance, then pull 500 Shopify stores in the US that have Instagram accounts and export to CSV”
Claude Code calls wallet first to verify credits, then runs store-leads with the Instagram filter, downloads results.
“Query the lead database for all VPs of Sales at companies with 50-200 employees in New York. Paginate through every page and save to nyc-vps.csv”
Claude Code reads the pagination response, loops through all pages at 100 leads each, combines into a single CSV. Lead Database queries require the $649/mo plan.
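The pagination loop Claude Code runs here might look like the following sketch. The guide states results come back 100 leads per page; the query-parameter names (page, limit) and the JSON field holding each page's leads are assumptions about the response shape:

```python
import json
import urllib.request

BASE_URL = "https://app.scrapercity.com/api/v1"

def collect_all_pages(fetch_page, page_size=100):
    """Accumulate results page by page until a short page signals the end."""
    leads, page = [], 1
    while True:
        batch = fetch_page(page)
        leads.extend(batch)
        if len(batch) < page_size:  # last (partial or empty) page
            return leads
        page += 1

def fetch_leads_page(page, api_key):
    """One page of a lead database query (assumed request/response shape)."""
    req = urllib.request.Request(
        f"{BASE_URL}/database/leads?page={page}&limit=100",
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read()).get("leads", [])
```

Separating the loop from the fetch keeps the termination logic testable without hitting the API.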
“Write a bash script that scrapes Google Maps for "dentists" in every US state, polls until each finishes, downloads the CSVs, and merges them into all-dentists.csv”
Generates a reusable script you can run on a cron. Claude Code handles the state list, rate limiting, and file merging.
“I have a CSV of 300 people with first name, last name, and company domain. Find the business email for each person and add an email column to the CSV.”
Claude Code reads your file, batches requests to the email finder endpoint, and writes results back into the same CSV structure.
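That enrichment flow can be sketched with the standard csv module. The input column names (first_name, last_name, domain) are assumptions about your file, and find_email stands in for whatever call hits the email finder endpoint:

```python
import csv

def add_email_column(rows, find_email):
    """Return copies of `rows` with an 'email' key filled by the lookup."""
    enriched = []
    for row in rows:
        row = dict(row)  # copy so the original rows stay untouched
        row["email"] = find_email(row["first_name"], row["last_name"],
                                  row["domain"])
        enriched.append(row)
    return enriched

def enrich_csv(in_path, out_path, find_email):
    """Read a contact CSV, enrich it, write it back with the new column."""
    with open(in_path, newline="") as f:
        rows = list(csv.DictReader(f))
    rows = add_email_column(rows, find_email)
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
        writer.writeheader()
        writer.writerows(rows)
```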
“Find all commercial real estate listings in Chicago on Crexi, then skip trace the property owners to find their contact info”
Crexi scrape returns listings. Claude Code feeds owner names and addresses to People Finder to surface phone numbers and emails.
Claude Code is not just a one-off query tool - it can write scripts, set up automation, and chain scrapers into multi-step pipelines. Here are some real workflows teams use.
Ask Claude Code to write a Python script for this. Run it on a daily cron for a continuous inbound feed.
Works for any local business type - HVAC, dental, legal, real estate. Repeat for every city in your target market.
Target Shopify or WooCommerce stores by niche. Claude Code can filter the CSV results for you before finding emails.
Great for SaaS companies targeting users of a competitor or complementary tool.
| Tool | What It Does | Cost | Plan |
|---|---|---|---|
| Apollo | B2B contacts by title, industry, location | $0.0039/lead | All plans |
| Google Maps | Local businesses with phones, emails, reviews | $0.01/place | All plans |
| Email Validator | Verify deliverability, catch-all, MX records | $0.0036/email | All plans |
| Email Finder | Business email from name + company | $0.05/contact | All plans |
| Mobile Finder | Phone numbers from LinkedIn or email | $0.25/input | All plans |
| People Finder | Skip trace by name, email, phone, address | $0.02/result | All plans |
| Store Leads | Shopify/WooCommerce stores with contacts | $0.0039/lead | All plans |
| BuiltWith | All sites using a technology | $4.99/search | All plans |
| Criminal Records | Background check by name | $1.00 if found | All plans |
| Airbnb Email | Host emails by city or listing URL | $0.019/listing | All plans |
| YouTube Email | Business emails for YouTube channels | Per channel | All plans |
| Website Finder | Contact info from website domains | Per domain | All plans |
| Yelp | Business listings from Yelp | $0.01/listing | All plans |
| Angi | Service providers from Angi (formerly Angie's List) | $0.01/listing | All plans |
| Zillow Agents | Real estate agent listings | Per agent | All plans |
| BizBuySell | Businesses for sale listings | $0.01/listing | All plans |
| Crexi | Commercial real estate listings | $0.029/listing | All plans |
| Property Lookup | Property data + owner contact | $0.15/address | All plans |
| Lead Database | millions of B2B contacts, instant query | Included | $649/mo only |
The MCP server is the fastest way to get started. One config change and Claude Code can call any ScraperCity tool by name.
Apollo scrapes typically take 11-48+ hours. Configure a webhook at app.scrapercity.com/dashboard/webhooks instead of having Claude Code poll in a loop. All other scrapers complete in minutes.
Always have Claude Code check wallet balance before running expensive scrapes. This prevents 402 errors mid-workflow.
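A pre-flight check like the sketch below is what that tip amounts to. GET /wallet is listed in the API context block above; the name of the balance field in its response is an assumption:

```python
import json
import os
import urllib.request

BASE_URL = "https://app.scrapercity.com/api/v1"

def enough_credits(estimated_cost, get_wallet=None):
    """Return True if the wallet covers the estimated scrape cost."""
    if get_wallet is None:
        def get_wallet():
            req = urllib.request.Request(
                f"{BASE_URL}/wallet",
                headers={"Authorization":
                         f"Bearer {os.environ['SCRAPERCITY_API_KEY']}"},
            )
            with urllib.request.urlopen(req) as resp:
                return json.loads(resp.read())
    balance = float(get_wallet().get("balance", 0))  # field name assumed
    return balance >= estimated_cost
```

For example, 2000 Apollo leads at $0.0039/lead is about $7.80, so check enough_credits(7.80) before kicking off the run.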
Create a separate API key for Claude Code use. If you need to revoke it, your manual access is unaffected. Keep the key in your .env file, not hardcoded in scripts.
For recurring pulls, ask Claude Code to generate a standalone bash or Python script. Run it on a cron and you have a daily lead pipeline.
After editing your MCP config, restart Claude Code completely - not just the terminal tab. Run 'claude mcp list' to confirm the ScraperCity server shows as connected before issuing scrape requests.
Break large jobs into batches. Instead of one prompt for 10,000 leads, ask for 1,000 at a time. This makes it easier to resume if something interrupts, and you can validate quality before spending more credits.
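The batching idea reduces to splitting one large limit into fixed-size request limits, which also gives you natural resume points:

```python
def batch_limits(total, size=1000):
    """Per-request limits that cover `total` leads in `size`-lead chunks."""
    full, rest = divmod(total, size)
    return [size] * full + ([rest] if rest else [])
```

Run one scrape per entry, validating quality after each batch before spending credits on the next.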
Save the API context block (base URL, endpoints, auth) in a CLAUDE.md file in your project root. Claude Code loads this automatically every session so you never need to paste it again.
Most MCP connection issues fall into a small set of categories. Work through these in order before filing a support ticket.
MCP server not connecting / "Connection closed" error
The most common cause is a JSON syntax error in your config file. JSON does not allow trailing commas after the last item in an object or array. Run your ~/.claude/mcp.json through a JSON linter before restarting. On Windows, you also need to wrap the npx command with cmd /c - without that wrapper, Windows cannot execute npx directly.
▶"command not found: npx" or server fails to start
Claude Code launches MCP server processes with a different shell environment than your terminal, so npx may not be on its PATH. Either use the full absolute path to your npx binary (find it with 'which npx'), or make sure your shell profile exports PATH correctly for non-interactive shells. If you use nvm, add the nvm initialization block to ~/.zshrc or ~/.bashrc, not just ~/.zprofile.
Tools not appearing after config change
Configuration changes do not take effect until you fully restart Claude Code. Close the session entirely and reopen it. Then run 'claude mcp list' in your terminal to verify the scrapercity server appears in the list. If it does not appear, the config file path or JSON structure is wrong - the mcpServers key must be at the root level of the JSON document.
401 Unauthorized from ScraperCity API
Your SCRAPERCITY_API_KEY value in the MCP config env block is missing or wrong. Copy the key fresh from app.scrapercity.com/dashboard/api-docs and paste it directly into the config. Make sure there are no extra spaces, quotes inside the value, or line breaks introduced by your editor.
402 Payment Required mid-workflow
Your wallet balance ran out. Have Claude Code check your balance first with a wallet call before running expensive scrapes. Top up at app.scrapercity.com before resuming. For large jobs, ask Claude Code to check balance after every N results.
Apollo scrape stuck on 'processing' for hours
Apollo scrapes are intentionally asynchronous and take 11-48+ hours. This is normal. Rather than having Claude Code poll in a loop, set up a webhook at app.scrapercity.com/dashboard/webhooks to receive a notification when results are ready. All other scrapers (Maps, Email Finder, Email Validator, etc.) complete within minutes.