Integration Guide
Scrape B2B leads without leaving Cursor. Add ScraperCity as an MCP server and Cursor can pull Apollo contacts, extract Google Maps businesses, find and validate emails, look up phone numbers, and export results to CSV - all through natural language in Composer.
Model Context Protocol (MCP) is an open standard that lets Cursor's AI agent talk to external tools and data sources. Without MCP, Cursor can only work with files open in your editor. With an MCP server connected, Cursor's Agent can call APIs, query databases, and run scrapers as part of any task - all without you switching tabs or writing boilerplate HTTP code.
ScraperCity runs as a local MCP server via npx. When you open Composer in Agent mode, Cursor automatically discovers every ScraperCity tool - Apollo scraping, Google Maps extraction, email finding, email validation, phone lookup, skip tracing, and more - and decides which ones to call based on your prompt. You write plain English; Cursor handles the API calls, async polling, and file output.
Sign up at app.scrapercity.com and copy your API key from the API Docs page. Plans start at $49/mo and include access to all 25 scrapers.
Create or edit .cursor/mcp.json in your project root. This is a project-scoped config - it only activates when you open that project in Cursor. To make ScraperCity available in every project instead, use ~/.cursor/mcp.json in your home directory.
```json
{
  "mcpServers": {
    "scrapercity": {
      "command": "npx",
      "args": ["-y", "--package", "scrapercity", "scrapercity-mcp"],
      "env": {
        "SCRAPERCITY_API_KEY": "your_api_key_here"
      }
    }
  }
}
```
Replace your_api_key_here with your actual key. The env block passes the key to the MCP server process as an environment variable, so it never appears in your application code - but note that mcp.json itself contains the key, so keep the file out of version control.
After saving the config, restart Cursor. Navigate to Cursor Settings > Tools & MCP. You should see scrapercity listed with a green dot indicating it is connected. A red dot means the server failed to start - see the Troubleshooting section below.
You can also verify quickly by opening a new Composer session in Agent mode and asking: "What ScraperCity tools do you have access to?" Cursor will list all available scrapers.
Open Composer (Cmd+I on Mac, Ctrl+I on Windows), switch to Agent mode, and describe what you need in plain English. MCP tools only run in Agent mode - they are not available in normal Cursor chat. Cursor will ask for confirmation before running each tool call unless you enable Yolo mode in settings.
“Scrape all real estate agents in Miami from Google Maps. Save to data/miami-agents.csv and write a Python script that sends each one a personalized email using their business name.”
Cursor scrapes the data, saves the CSV, then builds a working email script in the same session - with full context of the scraped fields.
“Find the business email for every company in my targets.csv file. Add an email column to the file with the results.”
Cursor reads the CSV, calls the ScraperCity email finder for each row, and writes the enriched file back. All inside the IDE.
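The script Cursor generates for this kind of enrichment typically follows a simple read-enrich-write pattern. Here is a minimal sketch of that pattern - `find_business_email` is a hypothetical stand-in for the actual ScraperCity email-finder call, and the `company`/`domain` column names are assumptions about your CSV:

```python
import csv

def find_business_email(company: str, domain: str) -> str:
    """Hypothetical stand-in for a ScraperCity Email Finder call."""
    return f"contact@{domain}"  # placeholder result

def enrich_csv(in_path: str, out_path: str) -> None:
    # Read the target list, add an "email" column, write it back out.
    with open(in_path, newline="") as f:
        rows = list(csv.DictReader(f))
    if not rows:
        return
    for row in rows:
        row["email"] = find_business_email(row["company"], row["domain"])
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
        writer.writeheader()
        writer.writerows(rows)
```

The useful part is that Cursor writes, runs, and verifies this kind of glue code for you in the same session, with the real column names from your file.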
“Build me a Node.js script that queries the ScraperCity API for marketing managers at SaaS companies, validates their emails, and pushes them to HubSpot via the Contacts API.”
Cursor writes the complete script with proper error handling, rate limiting, and field mapping - using the ScraperCity tools to test the API calls live.
“Check my ScraperCity wallet balance. If I have credits, scrape Apollo for 1000 CTOs at companies with 50-200 employees in the UK.”
Cursor checks the balance first, then conditionally runs the scrape. Results download as a CSV into your project directory.
“Scrape the top 200 Shopify stores selling fitness equipment. For each one, find the owner's email and validate it. Export everything to leads/fitness-stores.csv.”
Cursor chains three ScraperCity tools in sequence: Store Leads for the Shopify data, Email Finder for the contacts, Email Validator to check deliverability. One prompt, three API calls, one output file.
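Conceptually, the agent executes a pipeline like the sketch below. Each function is a hypothetical stand-in for one ScraperCity tool call; the point is the sequencing, where each tool's output feeds the next:

```python
def scrape_stores(niche: str, limit: int) -> list[dict]:
    """Stand-in for the Store Leads tool."""
    return [{"store": f"{niche}-{i}", "domain": f"shop{i}.example"} for i in range(limit)]

def find_owner_email(domain: str) -> str:
    """Stand-in for the Email Finder tool."""
    return f"owner@{domain}"

def is_deliverable(email: str) -> bool:
    """Stand-in for the Email Validator tool."""
    return "@" in email

def build_leads(niche: str, limit: int) -> list[dict]:
    # Chain the three tools: scrape -> find email -> keep only deliverable.
    leads = []
    for store in scrape_stores(niche, limit):
        email = find_owner_email(store["domain"])
        if is_deliverable(email):
            leads.append({**store, "email": email})
    return leads
```

In practice you never write this loop yourself - the agent decides the call order from your prompt - but it helps to know this is the shape of what runs.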
“I have a list of 50 LinkedIn URLs in prospects.txt. Look up the mobile number for each one and add it to a new column.”
Cursor reads the file, calls the Mobile Finder tool for each URL, and writes the enriched output back to your project. Results typically come back within 1-5 minutes per contact.
Every ScraperCity scraper is available through the MCP server. Cursor's Agent picks the right tool automatically based on your prompt, or you can name a specific tool explicitly.
| Tool | What It Does | Cost | Time |
|---|---|---|---|
| Apollo Scraper | B2B contacts by title, industry, and location | $0.0039/lead | 11-48+ hrs |
| Google Maps | Local businesses with phones, emails, and reviews | $0.01/place | 5-30 min |
| Email Finder | Business email from name + company domain | $0.05/contact | 1-10 min |
| Email Validator | Verify deliverability, catch-all detection, MX records | $0.0036/email | 1-10 min |
| Mobile Finder | Phone numbers from LinkedIn profile or email | $0.25/input | 1-5 min |
| People Finder | Skip trace by name, email, phone, or address | $0.02/result | 2-10 min |
| Store Leads | Shopify/WooCommerce stores with owner contacts | $0.0039/lead | Instant |
| BuiltWith | All websites using a specific technology | $4.99/search | 1-5 min |
| Website Finder | Contact info scraped from any domain | Per domain | 5-15 min |
| Yelp | Business listings from Yelp by category and city | $0.01/listing | 5-15 min |
| Zillow Agents | Real estate agent listings and contact data | Per agent | 5-15 min |
| Property Lookup | Property data and owner contact information | $0.15/address | 2-10 min |
| Lead Database | Query 3M+ B2B contacts instantly ($649/mo plan) | Included | Instant |
| Wallet | Check your ScraperCity credit balance | Free | Instant |
The real power of connecting ScraperCity to Cursor is combining scraping with code generation in a single session. Here are complete end-to-end workflows you can run in one Composer thread.
This pattern removes bad addresses before you send - protecting sender reputation without leaving the IDE.
Cursor keeps context of the scraped fields throughout - it knows the column names when writing the email template and the send script.
Set a webhook at app.scrapercity.com/dashboard/webhooks to get notified when long-running Apollo scrapes finish.
- Cursor sees your full project. Ask it to scrape leads AND build the application that uses them in the same conversation.
- Composer Agent mode lets Cursor chain multiple tool calls autonomously - scrape, validate, enrich, and export in one prompt.
- Cursor works with Claude, GPT, and other models. All of them can call ScraperCity tools through MCP regardless of which model you pick.
- .cursor/mcp.json scopes the ScraperCity connection to specific projects. Different projects can use different API keys.
- Cursor reads your existing CSVs, enriches them with ScraperCity data, and writes the results back - no copy-paste between tools.
- In Agent mode, Cursor can run terminal commands alongside ScraperCity tool calls - install dependencies, run scripts, and verify output in the same thread.
Cursor loads tool descriptions for every enabled MCP server into the agent's context. With too many servers active at once, the agent gets slower at picking the right tool. Disable MCP servers you are not actively using via Cursor Settings > Tools & MCP.
When validating large lists, ask Cursor to validate in batches of 100-200 at a time rather than one by one. This keeps individual tool calls fast and makes it easy to retry a batch if a call fails without reprocessing the entire list.
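If you ask Cursor to write the batching helper, it usually comes out looking something like this sketch. `validate_batch` is a hypothetical stand-in for the Email Validator call; the batching and failure-isolation logic is the part that matters:

```python
def validate_batch(batch: list[str]) -> dict[str, bool]:
    """Hypothetical stand-in for a ScraperCity Email Validator call."""
    return {e: "@" in e for e in batch}

def chunked(items: list, size: int) -> list[list]:
    # Split a list into consecutive batches of at most `size` items.
    return [items[i:i + size] for i in range(0, len(items), size)]

def validate_all(emails: list[str], batch_size: int = 150):
    results: dict[str, bool] = {}
    failed: list[list[str]] = []
    for batch in chunked(emails, batch_size):
        try:
            results.update(validate_batch(batch))  # one tool call per batch
        except Exception:
            # A failed call only costs one batch, not the whole list;
            # collect it and retry later instead of reprocessing everything.
            failed.append(batch)
    return results, failed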
Apollo results take 11-48+ hours. Instead of leaving Cursor polling overnight, set up a webhook at app.scrapercity.com/dashboard/webhooks to receive a notification when the results are ready. Then start a new Cursor session to download and process them.
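If you want to receive those webhook notifications on your own machine, a minimal receiver can be a few lines of stdlib Python. This is a sketch only - the payload schema is whatever ScraperCity actually sends, so inspect a real notification before building on specific fields:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

received = []  # completed-run notifications land here

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the notification body and stash it for processing.
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        try:
            received.append(json.loads(body))
        except json.JSONDecodeError:
            received.append({"raw": body.decode(errors="replace")})
        self.send_response(200)
        self.end_headers()

    def log_message(self, *args):
        pass  # keep request logging quiet
```

Run it with `HTTPServer(("0.0.0.0", 8080), WebhookHandler).serve_forever()` and point the dashboard webhook at a publicly reachable URL for that port (e.g. through a tunnel).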
Ask Cursor to check your ScraperCity wallet balance before running expensive scrapes. This is free and instant - the Wallet tool takes a second and confirms you have enough credits before committing to a large Apollo run.
If multiple people are using the same codebase, each person should set their own SCRAPERCITY_API_KEY in their local .cursor/mcp.json. Add .cursor/mcp.json to .gitignore to prevent API keys from being committed.
Check that the server shows a green dot in Cursor Settings > Tools & MCP. If it shows a red dot, open the Output panel (Cmd+Shift+U on Mac, Ctrl+Shift+U on Windows), select MCP from the dropdown, and read the startup error. The most common causes are: Node.js not installed or not on your PATH, a missing or incorrect SCRAPERCITY_API_KEY, or a JSON syntax error in mcp.json.
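Since two of those three causes live in mcp.json itself, a quick sanity check can rule them out. This sketch only checks for the config shape shown earlier in this guide; run it from your project root:

```python
import json

REQUIRED = ["command", "args", "env"]

def check_mcp_config(path: str = ".cursor/mcp.json") -> list[str]:
    # Returns a list of problems; an empty list means the config looks sane.
    try:
        with open(path) as f:
            config = json.load(f)  # catches JSON syntax errors
    except FileNotFoundError:
        return [f"{path} not found"]
    except json.JSONDecodeError as e:
        return [f"JSON syntax error: {e}"]
    server = config.get("mcpServers", {}).get("scrapercity")
    if server is None:
        return ["no 'scrapercity' entry under 'mcpServers'"]
    problems = [f"missing '{key}' field" for key in REQUIRED if key not in server]
    api_key = server.get("env", {}).get("SCRAPERCITY_API_KEY", "")
    if not api_key or api_key == "your_api_key_here":
        problems.append("SCRAPERCITY_API_KEY is not set")
    return problems

if __name__ == "__main__":
    print(check_mcp_config() or "config looks OK")
```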
Make sure you are using Agent mode in Composer. MCP tools are only called in Agent mode - they are not available in regular Cursor chat. Switch the mode toggle in the Composer panel to Agent before sending your prompt.
Test your API key by running npx scrapercity wallet in your terminal. If that returns an error, the key is invalid or expired - generate a new one from app.scrapercity.com/dashboard/api-docs. Also confirm that the env block in mcp.json is using the key name SCRAPERCITY_API_KEY exactly as shown.
For most scrapers this is expected behavior - Apollo scrapes take 11-48+ hours and Cursor may time out waiting. Ask Cursor to save the run ID to a file in your project, then start a new session later and ask it to poll that run ID and download results. You can also set a webhook to get notified when the scrape completes.
Cursor loads the MCP config on startup. After editing mcp.json, you must fully restart Cursor (not just reload the window) for changes to take effect. You can also click the Reload button next to the server in Cursor Settings > Tools & MCP.
ScraperCity blocks identical requests submitted within 30 seconds to prevent duplicate charges. If Cursor retries a failed tool call immediately with the same parameters, it will get a duplicate protection error. Wait 30 seconds before retrying, or change a parameter slightly (e.g. adjust the result limit by 1).
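If you are scripting against the API directly rather than going through the agent, the same rule applies: wait out the window before resubmitting identical parameters. A retry helper respecting the 30-second window might look like this sketch, where `run_tool` is a hypothetical stand-in for whatever actually submits the request:

```python
import time

DUPLICATE_WINDOW = 30  # seconds during which identical requests are blocked

def call_with_dedupe_retry(run_tool, params: dict, retries: int = 2,
                           cooldown: float = DUPLICATE_WINDOW):
    """Retry a tool call, waiting out the duplicate-protection window.

    `run_tool` should raise on failure and return the result on success.
    """
    last_error = None
    for attempt in range(retries + 1):
        try:
            return run_tool(params)
        except Exception as e:
            last_error = e
            if attempt < retries:
                # Identical params within 30s trigger duplicate protection,
                # so sleep through the window before retrying.
                time.sleep(cooldown)
    raise last_error
```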