
Integration Guide

ScraperCity + Claude Code

Scrape Apollo leads, extract Google Maps businesses, find business emails, validate deliverability, look up mobile numbers, run skip traces, and export CSVs - all with natural language from your terminal.

What This Integration Does

Claude Code is Anthropic's terminal-based AI coding assistant. It reads files, runs commands, and executes multi-step workflows from your command line. On its own it has no access to B2B data. The ScraperCity MCP server bridges that gap.

MCP (Model Context Protocol) is an open standard that lets Claude Code connect to external tools and APIs through a standardized protocol. When you add the ScraperCity MCP server, Claude Code gains access to every ScraperCity scraper as a named tool it can call directly - no copy-pasting curl commands, no manual API calls, no scripting required on your end.

The result: you describe what data you need in plain English, and Claude Code figures out which scraper to use, handles authentication with your API key, polls for results, paginates through every page, and saves the output to a file. It can chain multiple scrapers together in a single session - for example, scraping Apollo leads, then running email validation on the results, then looking up mobile numbers, all from one prompt.

Option 1: MCP Server Setup (Recommended)

One config change. Claude Code gets access to every ScraperCity scraper. No plugins, no extra installs beyond npx.

1

Get your ScraperCity API key

Go to app.scrapercity.com/dashboard/api-docs and copy your API key. Any plan ($49, $149, or $649) gives full API access to all scrapers.

2

Add the MCP server to your Claude Code config

Add to ~/.claude/mcp.json:

{
  "mcpServers": {
    "scrapercity": {
      "command": "npx",
      "args": ["-y", "--package", "scrapercity", "scrapercity-mcp"],
      "env": {
        "SCRAPERCITY_API_KEY": "your_api_key_here"
      }
    }
  }
}

Windows users: wrap the command with cmd /c so Windows can execute npx. Change "command": "npx" to "command": "cmd" and add "/c", "npx" as the first two entries in the args array.
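Following those instructions, the Windows version of the config looks like this (same file, npx moved into the args array):

```json
{
  "mcpServers": {
    "scrapercity": {
      "command": "cmd",
      "args": ["/c", "npx", "-y", "--package", "scrapercity", "scrapercity-mcp"],
      "env": {
        "SCRAPERCITY_API_KEY": "your_api_key_here"
      }
    }
  }
}
```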

3

Restart Claude Code

Close and reopen Claude Code completely - not just the window. MCP server changes only take effect after a full restart. You can verify the server is registered by running claude mcp list in your terminal.

Claude Code can now scrape leads, find emails, validate contacts, and more. Just ask:

Find 2000 marketing directors at SaaS companies in California, validate their emails, and save to leads.csv

Claude Code chains Apollo scrape → email validation automatically. Downloads and merges the CSVs for you.

Option 2: Manual Setup

If you prefer not to use MCP, you can give Claude Code the API details directly and it will make HTTP requests on its own.

1

Set your API key as an environment variable

Add this to your shell profile (~/.bashrc, ~/.zshrc, etc.):

export SCRAPERCITY_API_KEY="your_api_key_here"

Alternatively, create a .env file in your project directory. Claude Code reads .env files automatically.

2

Give Claude Code the API context

Paste this at the start of your session, or save it in a CLAUDE.md file in your project root so Claude Code loads it automatically every session:

I need you to call the ScraperCity API.

Base URL: https://app.scrapercity.com/api/v1
Auth: Bearer $SCRAPERCITY_API_KEY

Available endpoints:
- POST /scrape/apollo - B2B contacts by title, industry, location ($0.0039/lead, 11-48+ hour delivery)
- POST /scrape/maps - Google Maps businesses ($0.01/place, 5-30 min)
- POST /scrape/email-validator - Verify email deliverability ($0.0036/email)
- POST /scrape/email-finder - Find business email from name + company ($0.05/contact)
- POST /scrape/mobile-finder - Phone numbers from LinkedIn/email ($0.25/input)
- POST /scrape/people-finder - Skip trace ($0.02/result)
- POST /scrape/store-leads - Shopify/WooCommerce stores ($0.0039/lead)
- POST /scrape/builtwith - Sites using a technology ($4.99/search)
- GET /database/leads - Millions of B2B contacts, instant query (requires $649/mo plan)
- GET /wallet - Check balance
- GET /scrape/status/:runId - Poll scrape status
- GET /scrape/logs/:runId - Download results

All POST /scrape endpoints accept { url, limit } and return { runId }.
Poll status until complete, then download results as CSV.
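The submit-then-poll pattern above can be sketched in Python. The base URL and endpoint paths come from the summary; the exact status string ("complete") and response field names are assumptions to verify against the API docs. The status check is injected as a plain function so the loop is testable without network access:

```python
import json
import time
import urllib.request

BASE_URL = "https://app.scrapercity.com/api/v1"

def api_get(path, api_key):
    """GET a ScraperCity endpoint with bearer auth (response shape assumed)."""
    req = urllib.request.Request(
        BASE_URL + path,
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

def poll_until_complete(run_id, get_status, interval=30, max_wait=1800):
    """Poll a scrape run until its status reaches 'complete'.

    In real use, get_status would be something like
    lambda rid: api_get(f"/scrape/status/{rid}", key)["status"].
    The 'complete' value is an assumption for illustration.
    """
    waited = 0
    while waited <= max_wait:
        status = get_status(run_id)
        if status == "complete":
            return status
        time.sleep(interval)
        waited += interval
    raise TimeoutError(f"run {run_id} still '{status}' after {max_wait}s")

# Simulated run: the fake status source finishes on the third poll.
states = iter(["queued", "processing", "complete"])
print(poll_until_complete("run_123", lambda rid: next(states), interval=0))
```

In a real session Claude Code writes this loop for you; the sketch just shows the shape of the work it is doing on your behalf.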

3

Ask for what you want in plain English

Tell Claude Code what data you need. It will pick the right endpoint, build the API call, handle polling and pagination, and save the output.

Example Prompts

Find 2000 marketing directors at SaaS companies in California, validate their emails, find their mobile numbers, and save everything to leads.csv

Claude Code chains 3 tools: Apollo scrape → email validation → mobile finder. Downloads and merges the CSVs automatically.

Scrape all plumbers in Denver from Google Maps, then find the business owner's email for each one

Maps scrape returns businesses. Claude Code extracts names + domains, feeds them to email finder, merges results.

Check my ScraperCity balance, then pull 500 Shopify stores in the US that have Instagram accounts and export to CSV

Claude Code calls wallet first to verify credits, then runs store-leads with the Instagram filter, downloads results.

Query the lead database for all VPs of Sales at companies with 50-200 employees in New York. Paginate through every page and save to nyc-vps.csv

Claude Code reads the pagination response, loops through all pages at 100 leads each, combines into a single CSV. Lead Database queries require the $649/mo plan.

Write a bash script that scrapes Google Maps for "dentists" in every US state, polls until each finishes, downloads the CSVs, and merges them into all-dentists.csv

Generates a reusable script you can run on a cron. Claude Code handles the state list, rate limiting, and file merging.

I have a CSV of 300 people with first name, last name, and company domain. Find the business email for each person and add an email column to the CSV.

Claude Code reads your file, batches requests to the email finder endpoint, and writes results back into the same CSV structure.

Find all commercial real estate listings in Chicago on Crexi, then skip trace the property owners to find their contact info

Crexi scrape returns listings. Claude Code feeds owner names and addresses to People Finder to surface phone numbers and emails.

Common Workflows You Can Build

Claude Code is not just a one-off query tool - it can write scripts, set up automation, and chain scrapers into multi-step pipelines. Here are some real workflows teams use.

Automated Lead Pipeline

  1. Apollo scrape by title + industry + location
  2. Email validation to filter undeliverable addresses
  3. Mobile finder for phone numbers
  4. Export to CRM-ready CSV

Ask Claude Code to write a Python script for this. Run it on a daily cron for a continuous inbound feed.
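A minimal sketch of that pipeline, with the three API calls abstracted behind plain callables so the chaining logic is visible. The function names and the lead dictionary fields are illustrative assumptions, not the real response schema:

```python
import csv

def run_pipeline(scrape_apollo, validate_email, find_mobile, out_path):
    """Chain: Apollo scrape -> email validation -> mobile lookup -> CSV.

    The three callables stand in for ScraperCity API calls; their
    return shapes below are assumptions for illustration.
    """
    leads = scrape_apollo()  # assumed: [{"name": ..., "email": ...}, ...]
    deliverable = [l for l in leads if validate_email(l["email"])]
    for lead in deliverable:
        lead["mobile"] = find_mobile(lead["email"]) or ""
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["name", "email", "mobile"])
        writer.writeheader()
        writer.writerows(deliverable)
    return len(deliverable)

# Dry run with stub data instead of live API calls.
stub_leads = [
    {"name": "Ann", "email": "ann@example.com"},
    {"name": "Bob", "email": "bounce@example.com"},
]
n = run_pipeline(
    scrape_apollo=lambda: stub_leads,
    validate_email=lambda e: not e.startswith("bounce"),
    find_mobile=lambda e: "+1-555-0100",
    out_path="leads.csv",
)
print(n)  # number of deliverable leads written
```

Swap the stubs for real API calls (submit, poll, download) and the same structure becomes the cron script.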

Local Business Outreach

  1. Google Maps scrape by category + city
  2. Email finder on business names + domains
  3. Deduplicate against previous exports
  4. Output to outreach spreadsheet

Works for any local business type - HVAC, dental, legal, real estate. Repeat for every city in your target market.

Ecommerce Prospecting

  1. Store Leads scrape by platform + country
  2. Filter by revenue signal or social presence
  3. Email finder for store owners
  4. Validate emails before sending

Target Shopify or WooCommerce stores by niche. Claude Code can filter the CSV results for you before finding emails.

Technology-Based Outreach

  1. BuiltWith scrape for sites using a specific tool
  2. Website Finder for contact info on each domain
  3. Email validation pass
  4. Export segmented by technology stack

Great for SaaS companies targeting users of a competitor or complementary tool.

Available Tools

| Tool | What It Does | Cost | Plan |
| --- | --- | --- | --- |
| Apollo | B2B contacts by title, industry, location | $0.0039/lead | All plans |
| Google Maps | Local businesses with phones, emails, reviews | $0.01/place | All plans |
| Email Validator | Verify deliverability, catch-all, MX records | $0.0036/email | All plans |
| Email Finder | Business email from name + company | $0.05/contact | All plans |
| Mobile Finder | Phone numbers from LinkedIn or email | $0.25/input | All plans |
| People Finder | Skip trace by name, email, phone, address | $0.02/result | All plans |
| Store Leads | Shopify/WooCommerce stores with contacts | $0.0039/lead | All plans |
| BuiltWith | All sites using a technology | $4.99/search | All plans |
| Criminal Records | Background check by name | $1.00 if found | All plans |
| Airbnb Email | Host emails by city or listing URL | $0.019/listing | All plans |
| YouTube Email | Business emails for YouTube channels | Per channel | All plans |
| Website Finder | Contact info from website domains | Per domain | All plans |
| Yelp | Business listings from Yelp | $0.01/listing | All plans |
| Angi | Service providers from Angie's List | $0.01/listing | All plans |
| Zillow Agents | Real estate agent listings | Per agent | All plans |
| BizBuySell | Businesses for sale listings | $0.01/listing | All plans |
| Crexi | Commercial real estate listings | $0.029/listing | All plans |
| Property Lookup | Property data + owner contact | $0.15/address | All plans |
| Lead Database | Millions of B2B contacts, instant query | Included | $649/mo only |

Tips for Better Results

The MCP server is the fastest way to get started. One config change and Claude Code can call any ScraperCity tool by name.

Apollo scrapes typically take 11-48+ hours. Configure a webhook at app.scrapercity.com/dashboard/webhooks instead of having Claude Code poll in a loop. All other scrapers complete in minutes.

Always have Claude Code check wallet balance before running expensive scrapes. This prevents 402 errors mid-workflow.
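The pre-flight check is a few lines. Here the GET /wallet call is stubbed out and the "balance" field name is an assumption to verify against the real response:

```python
def ensure_balance(get_wallet, estimated_cost):
    """Abort before a scrape if the wallet cannot cover it.

    get_wallet stands in for GET /wallet; the 'balance' field
    name is an assumption for illustration.
    """
    balance = get_wallet()["balance"]
    if balance < estimated_cost:
        raise RuntimeError(
            f"Insufficient credits: have ${balance:.2f}, "
            f"need about ${estimated_cost:.2f}"
        )
    return balance

# Example: 2000 Apollo leads at $0.0039 each = $7.80 estimated.
print(ensure_balance(lambda: {"balance": 25.0}, 2000 * 0.0039))
```

Asking Claude Code to "check my balance first" produces the same guard without you writing any of this.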

Create a separate API key for Claude Code use. If you need to revoke it, your manual access is unaffected. Keep the key in your .env file, not hardcoded in scripts.

For recurring pulls, ask Claude Code to generate a standalone bash or Python script. Run it on a cron and you have a daily lead pipeline.

After editing your MCP config, restart Claude Code completely - not just the terminal tab. Run 'claude mcp list' to confirm the ScraperCity server shows as connected before issuing scrape requests.

Break large jobs into batches. Instead of one prompt for 10,000 leads, ask for 1,000 at a time. This makes it easier to resume if something interrupts, and you can validate quality before spending more credits.
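Splitting a large pull into resumable chunks is a one-function helper; a sketch:

```python
def batches(total, size):
    """Yield (offset, count) pairs covering `total` items in chunks of `size`."""
    for offset in range(0, total, size):
        yield offset, min(size, total - offset)

# 10,000 leads in batches of 1,000 -> ten jobs you can run,
# inspect for quality, and resume independently if one fails.
jobs = list(batches(10_000, 1_000))
print(len(jobs), jobs[0], jobs[-1])  # 10 (0, 1000) (9000, 1000)
```

Each (offset, count) pair maps to one scrape request, so a failed batch can be retried without re-spending credits on the batches that already succeeded.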

Save the API context block (base URL, endpoints, auth) in a CLAUDE.md file in your project root. Claude Code loads this automatically every session so you never need to paste it again.

Troubleshooting

Most MCP connection issues fall into a small set of categories. Work through these in order before filing a support ticket.

MCP server not connecting / "Connection closed" error

The most common cause is a JSON syntax error in your config file. JSON does not allow trailing commas after the last item in an object or array. Run your ~/.claude/mcp.json through a JSON linter before restarting. On Windows, you also need to wrap the npx command with cmd /c - without that wrapper, Windows cannot execute npx directly.

"command not found: npx" or server fails to start

Claude Code launches MCP server processes with a different shell environment than your terminal, so npx may not be on its PATH. Either use the full absolute path to your npx binary (find it with 'which npx'), or make sure your shell profile exports PATH correctly for non-interactive shells. If you use nvm, add the nvm initialization block to ~/.zshrc or ~/.bashrc, not just ~/.zprofile.

Tools not appearing after config change

Configuration changes do not take effect until you fully restart Claude Code. Close the session entirely and reopen it. Then run 'claude mcp list' in your terminal to verify the scrapercity server appears in the list. If it does not appear, the config file path or JSON structure is wrong - the mcpServers key must be at the root level of the JSON document.

401 Unauthorized from ScraperCity API

Your SCRAPERCITY_API_KEY value in the MCP config env block is missing or wrong. Copy the key fresh from app.scrapercity.com/dashboard/api-docs and paste it directly into the config. Make sure there are no extra spaces, quotes inside the value, or line breaks introduced by your editor.

402 Payment Required mid-workflow

Your wallet balance ran out. Have Claude Code check your balance first with a wallet call before running expensive scrapes. Top up at app.scrapercity.com before resuming. For large jobs, ask Claude Code to check balance after every N results.

Apollo scrape stuck on 'processing' for hours

Apollo scrapes are intentionally asynchronous and take 11-48+ hours. This is normal. Rather than having Claude Code poll in a loop, set up a webhook at app.scrapercity.com/dashboard/webhooks to receive a notification when results are ready. All other scrapers (Maps, Email Finder, Email Validator, etc.) complete within minutes.
