
Automate Your Lead Generation Pipeline

Connect ScraperCity to your CRM, cold email tool, AI agent, or automation platform. Pick a guide below.

Cursor

Scrape leads, find emails, and validate contacts from inside Cursor. Add ScraperCity as an MCP server and use Composer to pull B2B data with natural language.

AI Agent · MCP · IDE
Claude Code

Scrape leads, validate emails, and export CSVs from your terminal with natural language. Claude Code calls the ScraperCity API and handles everything automatically.

AI Agent · CLI
HubSpot

Create HubSpot contacts from scraped Apollo or Google Maps leads. Map emails, phones, titles, and company data to HubSpot properties automatically.

CRM · Marketing
ChatGPT

Build a Custom GPT that searches the Lead Database by title, industry, and location using natural language inside ChatGPT.

AI Agent · Custom GPT · Lead Database
Clay

Add an HTTP API enrichment column to enrich your Clay tables with B2B contact data. Find emails, validate contacts, and pull leads for every row.

Enrichment · GTM
Zapier

Trigger a Webhooks by Zapier action to query leads and route them to Google Sheets, HubSpot, Salesforce, or 6,000+ apps.

Automation · No-Code
n8n

Use the HTTP Request node with built-in pagination to pull leads into any n8n workflow and push them to your CRM or outreach tools.

Automation · Self-Hosted
Instantly

Pull scraped leads and push them directly into Instantly campaigns via the V2 API. Leads start receiving sequences within minutes.

Cold Email · Outreach
GoHighLevel

Push scraped leads into GoHighLevel as contacts with custom fields. Trigger GHL workflows for nurture sequences, pipeline stages, and team notifications.

CRM · Agency
Airtable

Build a searchable lead database inside Airtable. Push scraped contacts as rows with name, email, phone, title, company, and LinkedIn URL.

Database · No-Code
GitHub Copilot

Add ScraperCity as an MCP server in Copilot CLI or VS Code. Scrape leads, find emails, and commit results in one git-native workflow.

AI Agent · CLI
Perplexity

Connect ScraperCity as a local MCP connector in Perplexity. Pull live B2B data into your research sessions on Mac.

AI Agent · Research
Pipedrive

Fill your Pipedrive sales pipeline with scraped leads. Create persons and deals from Apollo contacts or Google Maps businesses.

CRM · Sales
Gemini CLI

Connect ScraperCity to Google Gemini CLI as an MCP server. Also works with Gemini Code Assist in VS Code and Firebase Studio.

AI Agent · CLI
Windsurf

Add ScraperCity as an MCP server in Windsurf. Use Cascade to scrape leads, find emails, and validate contacts from inside the editor.

AI Agent · MCP · IDE
Smartlead

Push scraped leads into Smartlead campaigns via the API. Send up to 100 leads per request with custom fields for merge tags.

Cold Email · Outreach
Lemlist

Scrape leads and push them into Lemlist multichannel campaigns. Email, LinkedIn, and phone steps all personalized with scraped contact data.

Cold Email · Outreach
Pipedream

Build lead scraping automations with full code access. Paginate through ScraperCity results and route leads to any of 2,000+ apps.

Automation · Developer
Google Sheets

Export scraped leads to Google Sheets automatically. Build a lead spreadsheet that fills itself on a schedule via Zapier or n8n.

Spreadsheet · No-Code
Cline

Add ScraperCity as an MCP server in the Cline VS Code extension. Scrape leads, find emails, and validate contacts from inside your editor.

AI Agent · MCP
EmailBison

Import ScraperCity leads into EmailBison campaigns through their REST API. Attach leads to sequences running on dedicated IP pools.

Cold Email · Outreach
AI Agents (MCP)

The full guide to connecting ScraperCity to any AI agent via MCP server, CLI, or skill file. Covers every compatible tool.

MCP · AI Agent · CLI

What You Can Automate

Outbound prospecting

Scrape leads from Apollo or Google Maps, validate emails, and push contacts into Instantly, Smartlead, or EmailBison campaigns automatically.

Account-based enrichment

Feed a list of target company domains into Clay or n8n and get back decision-maker contacts with verified emails for every account.

AI agent workflows

Give Claude Code, Gemini CLI, or Copilot access to every ScraperCity scraper via MCP. The agent chains scrapes, validation, and exports in one prompt.

Scheduled lead pipelines

Set up n8n or Zapier workflows that pull fresh leads daily and route them to Google Sheets, HubSpot, Salesforce, or Slack.

API Quick Start

Every integration on this page is powered by the same REST API. Base URL: https://app.scrapercity.com/api/v1. All requests use Bearer token authentication. Get your API key at app.scrapercity.com/dashboard/api-docs.

1. Authenticate

Every request requires your API key as a Bearer token. Keep it in an environment variable - never hard-code it in client-side code.

bash
curl -H "Authorization: Bearer $SCRAPERCITY_API_KEY" \
  https://app.scrapercity.com/api/v1/wallet

2. Start a scrape and capture the runId

Every scrape is async. POST to the relevant endpoint, capture the runId from the response, then poll or use a webhook to know when results are ready.

bash
# Start a Google Maps scrape
curl -X POST https://app.scrapercity.com/api/v1/scrape/maps \
  -H "Authorization: Bearer $SCRAPERCITY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "searchStringsArray": ["plumbers"],
    "locationQuery": "Denver, CO",
    "maxCrawledPlacesPerSearch": 200
  }'
# Response: { "runId": "abc123", "message": "Maps scrape started" }

3. Poll for completion

Check the status endpoint until status returns SUCCEEDED. Most scrapers complete in 1-30 minutes. Apollo takes 11-48+ hours - use webhooks for those runs.

bash
curl https://app.scrapercity.com/api/v1/scrape/status/abc123 \
  -H "Authorization: Bearer $SCRAPERCITY_API_KEY"
# Response: { "status": "SUCCEEDED", "count": 200 }
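The polling step can be wrapped in a small script. The sketch below is illustrative: the status endpoint and response shape are the ones shown above, while the helper names and the 60-second interval are assumptions.

```shell
# Hypothetical polling helper for the async pattern above.
# extract_status pulls the "status" field out of the JSON response;
# poll_run re-checks every 60 seconds until the run succeeds or fails.

extract_status() {
  # Crude JSON field extraction; assumes the response shape shown above.
  printf '%s' "$1" | sed -n 's/.*"status"[[:space:]]*:[[:space:]]*"\([A-Z_]*\)".*/\1/p'
}

poll_run() {
  local run_id="$1" body
  while true; do
    body=$(curl -s "https://app.scrapercity.com/api/v1/scrape/status/${run_id}" \
      -H "Authorization: Bearer $SCRAPERCITY_API_KEY")
    case "$(extract_status "$body")" in
      SUCCEEDED) echo "Run ${run_id} finished"; return 0 ;;
      FAILED)    echo "Run ${run_id} failed" >&2; return 1 ;;
      *)         sleep 60 ;;
    esac
  done
}
```

For long-running Apollo jobs, prefer webhooks over this loop (see the performance tips below the troubleshooting section).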

4. Download results as CSV

Once status is SUCCEEDED, download the CSV. The download endpoint streams the file directly - pipe it to a file or pass the URL to your automation tool.

bash
curl -o leads.csv https://app.scrapercity.com/api/downloads/abc123 \
  -H "Authorization: Bearer $SCRAPERCITY_API_KEY"
# Saves leads.csv with name, email, phone, title, company, LinkedIn URL

Email Validation API

The ScraperCity Email Validator is a purpose-built email validation API for B2B cold email workflows. Submit a batch of email addresses and receive a deliverability verdict, catch-all flag, MX record check, and quality rating (high / medium / low) for each one. Validation costs $0.0036 per email and completes in 1-10 minutes.

Unlike standalone email verification tools, the Email Validator lives inside the same API you use to scrape leads - so you can chain a scrape and a validation pass in one n8n workflow or Clay table without switching providers or managing separate API keys.

Email validation API example

bash
curl -X POST https://app.scrapercity.com/api/v1/scrape/email-validator \
  -H "Authorization: Bearer $SCRAPERCITY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "emails": [
      "[email protected]",
      "[email protected]",
      "[email protected]"
    ]
  }'
# Response: { "runId": "val_xyz", "message": "Validation started for 3 emails" }

After polling to SUCCEEDED, the downloaded CSV includes one row per email with email, status, quality, is_catch_all, and mx_found columns. Filter to quality = high before loading into any cold email platform.
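That filter can be a one-line awk pass. The sketch below assumes quality is the third column, matching the header order described above; the function name is hypothetical.

```shell
# Keep the header row plus rows whose quality column (third field) is "high".
# The column position is an assumption based on the header order described above.
filter_high_quality() {
  awk -F',' 'NR == 1 || $3 == "high"' "$1"
}
# usage: filter_high_quality validated.csv > high_quality.csv
```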

Lead Enrichment API and Contact Enrichment API

Lead enrichment takes a partial contact record and appends the missing fields. ScraperCity covers this with four complementary endpoints that work well in sequence inside any automation tool.

| Endpoint | Input | Output | Cost | Speed |
| --- | --- | --- | --- | --- |
| Email Finder | First name, last name, company domain | Verified business email | $0.05/contact | 1-10 min |
| Email Validator | Email address | Deliverability, quality, catch-all, MX | $0.0036/email | 1-10 min |
| Mobile Finder | LinkedIn URL or email address | Mobile phone number | $0.25/input | 1-5 min |
| People Finder | Name, email, phone, or address | Full contact profile, relatives, addresses | $0.02/result | 2-10 min |

Email finder (contact enrichment) API example

bash
curl -X POST https://app.scrapercity.com/api/v1/scrape/email-finder \
  -H "Authorization: Bearer $SCRAPERCITY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "contacts": [
      { "firstName": "Jane", "lastName": "Doe", "domain": "acme.com" },
      { "firstName": "Bob", "lastName": "Smith", "domain": "widgetco.io" }
    ]
  }'
# Response: { "runId": "ef_abc", "message": "Email finder started for 2 contacts" }

Typical enrichment workflow in Clay or n8n

  1. Upload a CSV of company domains and decision-maker names to Clay or load them as an n8n array.
  2. Call the Email Finder endpoint for each row - pass firstName, lastName, and domain.
  3. Pass returned emails to the Email Validator to filter to high-quality addresses only.
  4. Optionally run the Mobile Finder on the verified contacts to append phone numbers.
  5. Push the enriched rows to HubSpot, Instantly, Smartlead, or a Google Sheet.
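Step 2's request body can also be generated from a CSV in a plain script. A hypothetical helper (the headerless firstName,lastName,domain column order is an assumption):

```shell
# Hypothetical helper: build the Email Finder "contacts" body from a headerless
# CSV of firstName,lastName,domain rows.
csv_to_contacts() {
  awk -F',' '
    BEGIN { printf "{ \"contacts\": [" }
    {
      printf "%s{ \"firstName\": \"%s\", \"lastName\": \"%s\", \"domain\": \"%s\" }",
             (NR == 1 ? " " : ", "), $1, $2, $3
    }
    END { printf " ] }" }
  ' "$1"
}
# usage: csv_to_contacts contacts.csv | curl -X POST ... -d @-
```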

ScraperCity as a B2B Data API

ScraperCity is a B2B data API with 25 endpoints under one key and one subscription. Every plan from $49/mo includes API access to all scrapers. Below is a summary of the full catalog organized by use case.

Lead Sourcing

  • Apollo - B2B contacts by title, industry, location ($0.0039/lead, 11-48+ hrs)
  • Google Maps - Local businesses with phones, emails, reviews ($0.01/place, 5-30 min)
  • Lead Database - 4.6M+ B2B contacts, instant query ($649/mo plan, 100k/day limit)
  • Yelp - Business listings with reviews ($0.01/listing, 5-15 min)
  • Angi - Service providers from Angie's List ($0.01/listing, 5-15 min)

Contact Enrichment

  • Email Finder - Business email from name + domain ($0.05/contact, 1-10 min)
  • Email Validator - Deliverability, catch-all, MX ($0.0036/email, 1-10 min)
  • Mobile Finder - Phone from LinkedIn or email ($0.25/input, 1-5 min)
  • People Finder - Skip trace by name, email, phone ($0.02/result, 2-10 min)
  • Website Finder - Contact info from domain (priced per domain, 5-15 min)

Niche Scraping

  • Store Leads - Shopify/WooCommerce stores ($0.0039/lead, instant)
  • BuiltWith - Sites using a technology ($4.99/search, 1-5 min)
  • YouTube Email - Business emails from channels (per channel, 5-15 min)
  • Airbnb Email - Host emails by city or listing ($0.019/listing, 10-30 min)
  • Zillow Agents - Real estate agent listings (per agent, 5-15 min)

Real Estate & Property

  • Property Lookup - Owner contact + property data ($0.15/address, 2-10 min)
  • Crexi - Commercial real estate listings ($0.029/listing, 5-15 min)
  • BizBuySell - Businesses for sale ($0.01/listing, 5-15 min)
  • Criminal Records - Background check by name ($1.00 if found, 2-5 min)

All endpoints share the same async pattern: POST to start, GET status to poll, GET download to retrieve results. The Lead Database endpoint is the only synchronous endpoint - it returns paginated results immediately (100 leads/page, 100,000 leads/day limit) and requires the $649/mo plan.

MCP Server for AI Agents

ScraperCity ships an MCP (Model Context Protocol) server that exposes every scraper as a callable tool inside any compatible AI agent. Add the config block below to your MCP client and your agent can scrape leads, find emails, validate contacts, and export CSVs using natural language - no manual API calls required.

Compatible clients: Claude Code, Cursor, Windsurf, GitHub Copilot (VS Code), Cline, Gemini CLI, Perplexity (Mac), and any MCP-compatible agent.

json
{
  "mcpServers": {
    "scrapercity": {
      "command": "npx",
      "args": ["-y", "--package", "scrapercity", "scrapercity-mcp"],
      "env": { "SCRAPERCITY_API_KEY": "your_api_key_here" }
    }
  }
}

Add this block to your agent's config file (e.g. .mcp.json in your project root for Claude Code, or .cursor/mcp.json for Cursor). Then prompt your agent to use ScraperCity directly.

Example agent prompts

  • "Scrape 200 plumbers in Denver, CO from Google Maps and save the results as a CSV."
  • "Validate the emails in leads.csv and keep only the high-quality addresses."

CLI commands

bash
npx scrapercity login          # authenticate with your API key
npx scrapercity wallet         # check credit balance
npx scrapercity maps -q "plumbers" -l "Denver, CO" --limit 200
npx scrapercity poll <runId>   # wait for results
npx scrapercity download <runId> -o leads.csv

Troubleshooting Common Integration Issues

401 Unauthorized

Your API key is missing, expired, or malformed. Confirm the header is exactly Authorization: Bearer YOUR_API_KEY with no extra whitespace. Regenerate your key at app.scrapercity.com/dashboard/api-docs if needed.

429 Duplicate Request Blocked

ScraperCity blocks identical requests within 30 seconds to prevent duplicate charges. If you are retrying after a timeout, wait at least 30 seconds before resubmitting. In n8n or Zapier, add a 35-second delay node between a failed attempt and a retry.
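Outside n8n or Zapier, the same backoff can be scripted. A minimal sketch, assuming your request command prints only the HTTP status code (e.g. curl -s -o resp.json -w '%{http_code}' ...); the function name and RETRY_DELAY variable are illustrative:

```shell
# Hypothetical retry wrapper: run a command that prints the HTTP status code;
# on 429, wait and retry once. RETRY_DELAY defaults to 35 seconds, matching
# the guidance above.
retry_on_429() {
  local cmd="$1" code
  code=$("$cmd")
  if [ "$code" = "429" ]; then
    sleep "${RETRY_DELAY:-35}"
    code=$("$cmd")
  fi
  echo "$code"
}
```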

Status stuck on PENDING or RUNNING

Apollo scrapes take 11-48+ hours. All other scrapers complete in 1-30 minutes. If a non-Apollo run stays in PENDING beyond 45 minutes, check your credit balance - runs will not start if your wallet balance is insufficient. Check it with GET /wallet or npx scrapercity wallet.

Lead Database returns 403 Forbidden

The Lead Database (GET /database/leads) requires the $649/mo plan specifically. It is not available on the $49/mo or $149/mo plans. Upgrade your plan at app.scrapercity.com if you need instant access to the 4.6M+ contact database.

n8n HTTP Request node returns empty body

Set the response format to "JSON" in the n8n HTTP Request node options. If you are downloading a CSV, switch the response format to "File" and pass the file binary to a Write Binary File node or Google Sheets import node. Confirm your Content-Type: application/json header is set on POST requests.

MCP server not found in Cursor or Claude Code

Make sure Node.js 18+ is installed and npx is on your PATH. Confirm the config file is valid JSON and the SCRAPERCITY_API_KEY value is your actual key, not the placeholder. Restart the editor after saving the config.

Clay enrichment column returns no data

Clay's HTTP API enrichment column runs synchronously but ScraperCity's scrapers are async. Use the Email Validator and Email Finder endpoints - these are the fastest (1-10 min) and return results that Clay can poll for. Set your Clay request timeout to at least 15 minutes and map the runId from the initial response to a second column that polls the status endpoint.

Performance Tips and Best Practices

Use webhooks for Apollo

Apollo scrapes take 11-48+ hours. Polling the status endpoint every 5 minutes wastes requests and can trip duplicate-request protection. Configure a webhook at app.scrapercity.com/dashboard/webhooks and let ScraperCity POST to your n8n or Zapier webhook trigger when the run succeeds.

Validate emails before enriching phones

Mobile lookups cost $0.25/input. Run the Email Validator first and filter to high-quality addresses before passing emails to the Mobile Finder. This avoids spending $0.25 on a contact whose email is already invalid.

Paginate the Lead Database in batches of 100

The Lead Database returns 100 leads per page with a 100,000 lead/day limit. In n8n, use a loop node that increments a page parameter and stops when the returned array length is less than 100. In Pipedream, use a standard while loop with an async await on each paginated GET.
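In a plain script, the same stop condition looks like this. The sketch is illustrative: fetch_page stands in for whatever command retrieves one page of JSON, and count_leads is a crude counter that assumes exactly one "email" key per lead record.

```shell
# Crude lead counter: one "email" key per lead record (an assumption).
count_leads() {
  { grep -o '"email"' || true; } | wc -l | tr -d ' '
}

# Hypothetical pagination driver: print each page, stop when a page
# returns fewer than 100 leads (the documented page size).
paginate() {
  local fetch_page="$1" page=1 body n
  while :; do
    body=$("$fetch_page" "$page")
    printf '%s\n' "$body"
    n=$(printf '%s' "$body" | count_leads)
    [ "$n" -lt 100 ] && break   # last page reached
    page=$((page + 1))
  done
}
```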

Deduplicate before pushing to CRM

ScraperCity blocks duplicate scrape requests within 30 seconds but does not deduplicate contacts across different runs. Before pushing to HubSpot, Pipedrive, or GoHighLevel, check for existing contacts by email using the CRM's search API to avoid creating duplicate records.
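A minimal pre-import dedupe pass, assuming email is the second CSV column (matching the name, email, phone, title, company, LinkedIn URL layout mentioned in the quick start):

```shell
# Keep the header plus only the first occurrence of each email address.
# The email-is-column-2 assumption must match your actual CSV layout.
dedupe_by_email() {
  awk -F',' 'NR == 1 || !seen[$2]++' "$1"
}
# usage: dedupe_by_email leads.csv > deduped.csv
```

This removes duplicates within a file; cross-run duplicates against contacts already in your CRM still need the CRM's search API, as described above.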

Schedule scrapes during off-peak hours

For large Google Maps or Apollo jobs, trigger scrapes overnight with a scheduled n8n or Zapier workflow. Results are ready by morning, and you avoid peak-hour slowdowns on the underlying data sources.

Store runIds for redownload

Completed run CSVs are available for redownload using the same runId. Log runIds to a Google Sheet or Airtable table so you can redownload any run without re-paying scraping credits.

About the ScraperCity API

ScraperCity is a B2B data platform with a full REST API covering 25 tools under a single authentication key. Scrape Apollo lead lists ($0.0039/lead, 11-48+ hours), extract Google Maps businesses ($0.01/place, 5-30 min), find business emails ($0.05/contact, 1-10 min), validate email deliverability ($0.0036/email, 1-10 min), look up mobile phone numbers ($0.25/input, 1-5 min), run skip traces ($0.02/result, 2-10 min), and pull ecommerce store data instantly. Every tool works on every plan starting at $49/mo.

For AI agents, ScraperCity ships an MCP server compatible with Claude Code, Cursor, Windsurf, Gemini CLI, GitHub Copilot, Cline, Perplexity, and any MCP client. For automation, connect via Clay, n8n, Zapier, or Pipedream. For CRMs, push leads directly to HubSpot, GoHighLevel, Pipedrive, or Airtable. For cold email, load contacts into Instantly, Smartlead, Lemlist, or EmailBison.

The async pattern is consistent across all 25 endpoints: POST to start a run and receive a runId, GET the status endpoint to poll for completion, then GET the download endpoint to retrieve a CSV. The Lead Database is the only synchronous endpoint - it returns paginated JSON immediately on every GET. This consistent design means any integration you build for one scraper works for all of them.
