Integration Guide
Scrape B2B data from inside Windsurf. Add ScraperCity as an MCP server and Cascade can pull Apollo contacts, extract Google Maps businesses, find and validate emails, look up phone numbers, and export results - all through natural language in your editor.
The Model Context Protocol (MCP) is an open standard that lets AI agents like Windsurf's Cascade connect to external tools and data sources through a unified interface. Instead of writing API integration code yourself, you configure an MCP server once and Cascade can call it directly in plain English.
ScraperCity's MCP server gives Cascade access to all 25 B2B data scrapers: Apollo contact lists, Google Maps business extraction, an email validation API, an email finder API, mobile number lookup, skip tracing, ecommerce store data, property records, and more. Every endpoint is exposed as a tool Cascade can call, chain, and orchestrate as part of a multi-step workflow - without you touching a single line of code.
This makes ScraperCity one of the most practical MCP servers for lead generation workflows. You describe what you want in the Cascade chat panel and the agent selects the right endpoint, submits the job, polls for completion, downloads the CSV, and saves it to your project directory.
Follow these steps to connect ScraperCity to Windsurf Cascade. The whole process takes under five minutes.
Sign up at app.scrapercity.com and navigate to the API Docs section of your dashboard. Copy your API key - you will add it to the Windsurf config in the next step. Plans start at $49/mo and all plans include access to the full scraper catalog.
Open Windsurf and click the MCPs icon in the top-right corner of the Cascade panel, then click Configure to open the config file directly in your editor. Alternatively, open the file manually:
macOS/Linux: ~/.codeium/windsurf/mcp_config.json
Windows: %USERPROFILE%\.codeium\windsurf\mcp_config.json

The file may not exist yet if you have not added any MCP servers. Create it at the path above if needed.
Paste the following into your mcp_config.json, replacing your_api_key_here with your real key. If you already have other MCP servers configured, add the scrapercity block inside the existing mcpServers object.

```json
{
  "mcpServers": {
    "scrapercity": {
      "command": "npx",
      "args": ["-y", "--package", "scrapercity", "scrapercity-mcp"],
      "env": {
        "SCRAPERCITY_API_KEY": "your_api_key_here"
      }
    }
  }
}
```

Save the file, then click the refresh icon in the Cascade MCP panel (or restart Windsurf). After a moment you should see a green dot next to the ScraperCity server in the MCPs list. Cascade now has access to every ScraperCity tool. If you see a red indicator instead, check the troubleshooting section below.
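If you already run other MCP servers, the scrapercity entry sits alongside them inside the same mcpServers object rather than in a second top-level block. A sketch of a merged config, where example-server is a placeholder for whatever server you already have:

```json
{
  "mcpServers": {
    "example-server": {
      "command": "npx",
      "args": ["-y", "example-server-mcp"]
    },
    "scrapercity": {
      "command": "npx",
      "args": ["-y", "--package", "scrapercity", "scrapercity-mcp"],
      "env": {
        "SCRAPERCITY_API_KEY": "your_api_key_here"
      }
    }
  }
}
```

A common failure mode is pasting a second mcpServers object after the first, which produces invalid JSON and a red indicator in the MCP panel.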
Open the Cascade chat panel and type a simple test prompt to confirm the server is responding:
Check my ScraperCity wallet balance.

Cascade will call the wallet endpoint and return your current credit balance. If it does, the integration is working and you are ready to scrape.
Cascade understands multi-step instructions and chains ScraperCity endpoints automatically. Here are examples of what you can ask directly in the chat panel.
“Scrape Google Maps for all gyms in Austin, TX. For each one that has a website, find the owner email using ScraperCity email finder. Merge into one CSV.”
Cascade chains the Maps scraper, filters results, runs the email finder API on each website, and merges both datasets into a single output file.
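The merge step at the end of that chain is straightforward to sketch. This is not ScraperCity's own output format; the field names (website, email, owner_email) are illustrative assumptions, and the two inputs stand in for the Maps results and the email finder results:

```python
def merge_owner_emails(places: list[dict], found: list[dict]) -> list[dict]:
    """Attach finder results (keyed by website) to scraped place records.

    Field names are illustrative; adjust to the actual CSV columns.
    """
    email_by_site = {f["website"]: f["email"] for f in found}
    merged = []
    for place in places:
        site = place.get("website")
        if site:  # only places with a website were sent to the finder
            merged.append({**place, "owner_email": email_by_site.get(site)})
    return merged
```

Cascade performs an equivalent join for you when you ask it to "merge into one CSV", but knowing the shape of the operation helps when you want to spot-check the output file.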
“Build a Python script that takes a CSV of company names, finds 3 contacts at each company using ScraperCity, validates their emails, and outputs an enriched CSV.”
Cascade writes the script with proper API calls, error handling, and rate limiting - testing the ScraperCity endpoints live as it builds. You get a working enrichment pipeline, not just boilerplate.
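The core of such a pipeline looks roughly like this. ScraperCity's actual endpoint URLs and response shapes are not documented in this guide, so the network calls are abstracted as injected callables; Cascade would fill those in with real API calls when it writes the script:

```python
from typing import Callable

def enrich_companies(
    companies: list[str],
    find_contacts: Callable[[str], list[dict]],
    is_valid_email: Callable[[str], bool],
    per_company: int = 3,
) -> list[dict]:
    """For each company, take up to `per_company` found contacts and keep
    only those whose email passes validation."""
    rows = []
    for company in companies:
        for contact in find_contacts(company)[:per_company]:
            if is_valid_email(contact["email"]):
                rows.append({"company": company, **contact})
    return rows
```

Because the API calls are injected, the same skeleton works whether the lookups hit ScraperCity directly or a cached local copy of earlier results.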
“Check my ScraperCity balance. If I have at least $20, scrape Apollo for 5000 marketing managers at ecommerce companies in the US and validate their emails.”
Cascade checks the wallet, conditionally triggers the Apollo scrape, waits for delivery, then runs the email validation API on the results - all without you touching the terminal.
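The conditional gate Cascade applies here is simple to express. The function names are hypothetical stand-ins for the wallet check and the Apollo submission:

```python
from typing import Callable

def run_if_funded(
    get_balance: Callable[[], float],
    min_balance: float,
    run_job: Callable[[], str],
) -> dict:
    """Submit a scrape job only if the wallet balance covers the minimum."""
    balance = get_balance()
    if balance < min_balance:
        return {"submitted": False, "balance": balance}
    return {"submitted": True, "result": run_job()}
```

Checking the balance first matters for Apollo in particular, since a failed large submission wastes the 11-48+ hour turnaround.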
“Scrape 200 Shopify stores selling fitness supplements. Export store name, URL, estimated revenue, and any contact emails to a CSV called shopify-fitness-leads.csv.”
Cascade calls the Store Leads endpoint, filters by platform and category, downloads the results, and saves them with the filename you specified.
“I have a list of 500 email addresses in leads.csv. Validate all of them with ScraperCity. Flag any that are catch-all or undeliverable and write a cleaned CSV with only valid contacts.”
Cascade reads your file, batches the addresses through the email validation API endpoint, parses deliverability status for each, and writes a filtered output file.
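Batching 500 addresses means chunking the list before submission. The chunk size below is illustrative; ScraperCity's actual per-request limit is not stated in this guide:

```python
def batch(items: list, size: int) -> list[list]:
    """Split a list into consecutive chunks of at most `size` items."""
    return [items[i:i + size] for i in range(0, len(items), size)]
```

With a chunk size of 100, a 500-address file becomes five requests, which also keeps any single failed request from losing the whole run.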
Once configured, Cascade has access to the full ScraperCity catalog. Below is a reference for the most commonly used tools.
| Tool | What It Does | Cost | Delivery |
|---|---|---|---|
| Apollo | B2B contacts by title, industry, location | $0.0039/lead | 11-48+ hrs |
| Google Maps | Local businesses with phones, emails, reviews | $0.01/place | 5-30 min |
| Email Validator | Verify deliverability, catch-all, MX records | $0.0036/email | 1-10 min |
| Email Finder | Business email from name + company domain | $0.05/contact | 1-10 min |
| Mobile Finder | Phone numbers from LinkedIn or email | $0.25/input | 1-5 min |
| People Finder | Skip trace by name, email, phone, or address | $0.02/result | 2-10 min |
| Store Leads | Shopify/WooCommerce stores with contacts | $0.0039/lead | Instant |
| BuiltWith | All sites using a specific technology | $4.99/search | 1-5 min |
| Yelp | Business listings from Yelp | $0.01/listing | 5-15 min |
| Property Lookup | Property data and owner contact info | $0.15/address | 2-10 min |
| Lead Database | 3M+ B2B contacts, instant query | $649/mo plan | Instant |
| Wallet / Status / Download | Check balance, poll job status, download CSV | Free | Instant |
Apollo scrapes are async and take 11-48+ hours. All other scrapers complete in minutes. Cascade handles polling and download automatically.
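Under the hood, that polling amounts to a wait loop like the one below. The status strings ("completed", "failed") and the status call are assumptions, injected as a callable so the pattern is independent of ScraperCity's actual endpoint names:

```python
import time
from typing import Callable

def poll_until_complete(
    get_status: Callable[[], str],
    interval_s: float = 30.0,
    timeout_s: float = 3600.0,
) -> str:
    """Poll a job-status callable until it reports 'completed' or times out."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        status = get_status()
        if status == "completed":
            return status
        if status == "failed":
            raise RuntimeError("scrape job failed")
        time.sleep(interval_s)
    raise TimeoutError("job did not complete within the timeout")
```

For Apollo's multi-hour jobs, a webhook (configurable in the ScraperCity dashboard) is the better fit than a long-lived polling loop; Cascade uses polling for the fast scrapers.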
These are the most common ways developers use ScraperCity through Windsurf Cascade. Each workflow is something you can describe in natural language and Cascade will execute end to end.
- Apollo delivery: 11-48+ hours. Email validation and finding: minutes. Cascade handles the full pipeline.
- Google Maps delivers in 5-30 minutes. Email Finder runs at $0.05/contact.
- Ask Cascade to read your existing CSV, enrich it, and write a new file - it will handle the whole loop.
- Store Leads delivers instantly. BuiltWith is $4.99/search and delivers in 1-5 minutes.
One of the most commonly used ScraperCity tools in Windsurf is the email validation API. It checks deliverability, catch-all status, and MX records for each address at $0.0036/email, delivering results in 1-10 minutes. This is useful for cleaning Apollo exports, validating inbound signups, or filtering any contact list before sending.
Unlike standalone email validation tools, ScraperCity's validator is integrated directly into the same workflow as your lead scraping. You can ask Cascade to scrape, then immediately validate in the same prompt. No copy-pasting between tools, no CSV uploads to third-party services. Cascade calls the endpoint, gets the deliverability status per address, and writes a filtered output file.
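The filtering step reduces to keeping rows with a passing status. The field name ("status") and its values ("valid", "catch_all", "undeliverable") are assumptions about the validator's output, not its documented schema:

```python
def keep_deliverable(results: list[dict]) -> list[dict]:
    """Keep only rows whose validation status is 'valid', dropping
    catch-all and undeliverable addresses."""
    return [r for r in results if r.get("status") == "valid"]
```

Whether to also drop catch-all addresses is a judgment call; they may still deliver, but they inflate bounce risk, so the conservative filter above excludes them.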
The validator checks the following for each address:

- Deliverability - whether the mailbox can actually receive mail
- Catch-all status - whether the domain accepts mail for any address, which masks invalid mailboxes
- MX records - whether the domain is configured to receive email at all
You can also use ScraperCity as a contact enrichment API by chaining the Email Finder and Mobile Finder after validation. Ask Cascade to fill in missing emails with the finder, validate everything, and add mobile numbers for your highest-priority records - all in one multi-step prompt.
Red indicator next to ScraperCity in the MCPs panel
The server failed to start. Check that Node.js and npm are installed and that npx is accessible from your terminal. Open Windsurf Settings, navigate to Advanced Settings, and confirm MCP is enabled under the Cascade section. Save the config file again and click the refresh icon in the MCP panel.
Cascade says it does not have access to ScraperCity tools
Windsurf has a limit of 100 total tools across all active MCP servers. If you have multiple MCP servers installed, you may be hitting the cap. Navigate to the MCPs icon in the Cascade panel, open each server, and disable tools you do not need to stay within the 100-tool limit.
401 Unauthorized errors when Cascade calls ScraperCity
Your API key is missing or incorrect in the env block of mcp_config.json. Open the file at ~/.codeium/windsurf/mcp_config.json, confirm the SCRAPERCITY_API_KEY value matches your key from app.scrapercity.com/dashboard/api-docs, save, and refresh the MCP server.
Duplicate request blocked error
ScraperCity blocks identical requests submitted within 30 seconds to prevent accidental duplicates. Wait 30 seconds and ask Cascade to retry. If you are running tests in quick succession, slightly vary the request parameters.
Apollo scrape status shows pending for a long time
Apollo scrapes take 11-48+ hours by design - this is normal. Ask Cascade to poll the status by providing the run ID, or set up a webhook at app.scrapercity.com/dashboard/webhooks to receive a notification when the job completes. Do not submit the same request again or you will incur duplicate charges.
Config changes are not taking effect
Windsurf reads mcp_config.json on startup and when you click the refresh button in the MCP panel. After saving changes to the config file, click the refresh icon in the Cascade MCP section rather than fully restarting the editor. On Teams or Enterprise plans, your admin may need to whitelist the scrapercity server ID before it can connect.
npx command not found on Windows
Ensure Node.js is installed and that the npm binaries directory is on your system PATH. Open a new terminal window after installing Node.js and verify npx --version returns a version number. If using Windows Subsystem for Linux (WSL), confirm that Windsurf is configured to use the WSL terminal for MCP server processes.
When running multi-step workflows, ask Cascade to give each output file a descriptive name (e.g., maps-output.csv, validated-leads.csv). This makes multi-step workflows easier to track and avoids Cascade overwriting previous results.

Both Windsurf and Cursor are AI-powered code editors that support MCP servers and connect to the same ScraperCity tooling. The difference is in the AI agent experience, not the data access.
Windsurf uses Cascade, which is built around multi-step planning and what Windsurf calls “flow awareness” - Cascade maintains context across a long sequence of actions and can orchestrate complex scraping pipelines in a single prompt. Cursor uses Composer for agentic tasks. Both agents can call ScraperCity tools, handle async polling, and write the results to your project.
The MCP config format is also different: Windsurf stores its config at ~/.codeium/windsurf/mcp_config.json while Cursor uses its own settings path. These are separate files - editing one does not affect the other. If you use both editors, you will need to add the ScraperCity server block to each config independently. The ScraperCity MCP server package and API key are identical either way.