Integration Guide
Add ScraperCity as an MCP server in GitHub Copilot CLI or VS Code. Copilot can scrape Apollo leads, extract Google Maps businesses, validate emails, find phone numbers, and enrich contacts without leaving your development environment.
The Model Context Protocol (MCP) is an open standard that lets AI assistants connect to external tools and data sources through a standardized interface. When you add an MCP server to Copilot, the AI gains the ability to call that server's tools directly from natural language - no copy-pasting between apps, no manual API calls.
ScraperCity exposes all 25 of its B2B data scrapers as MCP tools. Once the server is connected, Copilot can find leads, validate emails, look up phone numbers, run skip traces, and pull Google Maps business data just by understanding what you ask it to do. The result is a complete lead generation workflow that lives inside your editor or terminal.
There are two places to use ScraperCity with Copilot: the Copilot CLI in your terminal, and Copilot Chat in VS Code. Both use the same underlying MCP server - the configuration is slightly different for each environment, and both are covered below.
You will need Node.js installed (the MCP server runs via npx) and a ScraperCity API key, available at app.scrapercity.com/dashboard/api-docs. Use this path if you prefer terminal-first workflows. Copilot CLI stores MCP configs in ~/.copilot/mcp-config.json by default.
Inside an interactive Copilot CLI session, run /mcp add. A configuration form appears. Set the command to npx with args -y --package scrapercity scrapercity-mcp, and in the environment variables field enter:

```json
{"SCRAPERCITY_API_KEY": "your_api_key_here"}
```

Press Ctrl+S to save. Alternatively, you can edit the config file directly:
```json
// ~/.copilot/mcp-config.json
{
  "mcpServers": {
    "scrapercity": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "--package", "scrapercity", "scrapercity-mcp"],
      "env": {
        "SCRAPERCITY_API_KEY": "your_api_key_here"
      }
    }
  }
}
```

If you prefer to manage the key outside the config file, export it from your shell profile or a .env file instead:
```shell
export SCRAPERCITY_API_KEY="your_api_key_here"
```

Never commit your API key to source control. Use a .gitignored .env file or your OS keychain for team environments.
In your Copilot CLI session, run /mcp show scrapercity to confirm the server is connected and see the full list of available tools. You should see all 25 ScraperCity scrapers listed.
```
/mcp show scrapercity
```

If the server shows as offline, check that Node.js is installed and that your API key is set correctly. Run /mcp edit scrapercity to update the config without starting over.
Use this path if you want ScraperCity available inside VS Code's Copilot Chat panel. The workspace config in .vscode/mcp.json can be committed to your repo so every team member gets the same tools automatically.
Add ScraperCity to .vscode/mcp.json in your project root. Create the file if it does not exist:
```json
// .vscode/mcp.json
{
  "servers": {
    "scrapercity": {
      "command": "npx",
      "args": ["-y", "--package", "scrapercity", "scrapercity-mcp"],
      "env": {
        "SCRAPERCITY_API_KEY": "your_api_key_here"
      }
    }
  }
}
```

If you want the server available across all projects rather than just one workspace, place the same config in your VS Code user settings file (settings.json) under the mcp key instead.
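As a sketch, the user-level variant might look like this (the exact nesting under the mcp key can vary across VS Code versions, so treat this as an assumption and check your version's MCP documentation):

```json
// settings.json (user scope) -- same server definition, sketched under the mcp key
{
  "mcp": {
    "servers": {
      "scrapercity": {
        "command": "npx",
        "args": ["-y", "--package", "scrapercity", "scrapercity-mcp"],
        "env": {
          "SCRAPERCITY_API_KEY": "your_api_key_here"
        }
      }
    }
  }
}
```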
Open .vscode/mcp.json in the editor. A Start button appears above the server definition - click it. VS Code launches the ScraperCity MCP process and discovers the available tools.
Then open Copilot Chat (Ctrl+Alt+I on Windows/Linux, Ctrl+Cmd+I on Mac) and select Agent from the mode dropdown. Click the tools icon to confirm ScraperCity tools are listed.
Replace the literal key value with a VS Code input variable so teammates supply their own key on first start:
```json
// .vscode/mcp.json (with input variable)
{
  "inputs": [
    {
      "id": "scrapercity_key",
      "type": "promptString",
      "description": "ScraperCity API key",
      "password": true
    }
  ],
  "servers": {
    "scrapercity": {
      "command": "npx",
      "args": ["-y", "--package", "scrapercity", "scrapercity-mcp"],
      "env": {
        "SCRAPERCITY_API_KEY": "${input:scrapercity_key}"
      }
    }
  }
}
```

VS Code prompts each user for the key once and stores it securely. You can safely commit this version of the file.
Once connected, Copilot has access to all 25 ScraperCity tools. Here are the most commonly used ones for B2B lead generation workflows:
| Tool | What Copilot can do | Cost | Turnaround |
|---|---|---|---|
| Apollo Scraper | Scrape B2B contacts by title, industry, company size, and location | $0.0039/lead | 11-48+ hrs |
| Google Maps | Extract local businesses with phones, emails, and reviews | $0.01/place | 5-30 min |
| Email Finder | Find a business email from a name and company domain | $0.05/contact | 1-10 min |
| Email Validator | Verify deliverability, catch-all status, and MX records | $0.0036/email | 1-10 min |
| Mobile Finder | Look up mobile phone numbers from a LinkedIn URL or email | $0.25/input | 1-5 min |
| People Finder | Skip trace by name, email, phone, or address | $0.02/result | 2-10 min |
| Lead Database | Query 3M+ B2B contacts instantly with filters ($649/mo plan) | Plan required | Instant |
| Property Lookup | Get property data and owner contact from an address | $0.15/address | 2-10 min |
These prompts work in both Copilot CLI and Copilot Chat in VS Code. Copilot selects the right ScraperCity tool, handles async polling, and delivers results - all without you writing a line of code.
“Run an Apollo scrape for CTOs at cybersecurity companies in the US. Save results to data/cto-leads.csv and create a GitHub issue with the count and a sample of the first 10 rows.”
Copilot triggers the scrape, waits for results, writes the CSV, then uses its native GitHub integration to create an issue documenting the output.
“Read the target accounts from accounts.txt, look up the first 3 contacts at each company using ScraperCity email finder, and write a Python script that formats them for our CRM import.”
Copilot reads the file, calls the email finder API per domain, and generates a formatting script tailored to your codebase patterns.
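As an illustration, the generated formatter might reduce to something like this minimal sketch (the input columns first_name, last_name, company, email and the CRM column names are assumptions, not ScraperCity's actual output schema):

```python
import csv

def format_for_crm(rows):
    """Map email-finder rows onto hypothetical CRM import columns."""
    return [
        {
            "Full Name": f"{row['first_name']} {row['last_name']}".strip(),
            "Account": row["company"],
            "Email Address": row["email"].lower(),
        }
        for row in rows
    ]

def write_crm_csv(rows, fileobj):
    """Write the formatted rows as a CRM-ready CSV with a header line."""
    writer = csv.DictWriter(fileobj, fieldnames=["Full Name", "Account", "Email Address"])
    writer.writeheader()
    writer.writerows(format_for_crm(rows))
```

In practice you would load the email-finder results with csv.DictReader and pass the rows plus an open output file to write_crm_csv.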
“Scrape all yoga studios in San Francisco from Google Maps. For each one that has a website, run the email finder to get the owner contact. Export to a single CSV.”
Copilot chains the Maps scrape, filters for businesses with websites, enriches each with the email finder, and merges the two datasets into one file.
“Validate every email in our seed list at data/cold-outreach.csv. Remove invalid ones and commit the cleaned file.”
Copilot runs validation in batches, filters out addresses that fail deliverability checks, writes the cleaned file, and commits it to your repo.
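The filtering step Copilot writes might look roughly like this sketch (the status column name and the "valid" label are assumptions about how the validator's verdict is stored back in the CSV):

```python
def keep_deliverable(rows, status_field="status"):
    """Keep only rows the email validator marked deliverable.
    The column name and the "valid" label are illustrative, not
    ScraperCity's documented output schema."""
    return [r for r in rows if r.get(status_field, "").strip().lower() == "valid"]
```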
“I have a list of LinkedIn URLs in linkedin.txt. Look up the mobile phone number for each one using the ScraperCity mobile finder and append the results to contacts.csv.”
Copilot reads the file, calls the mobile finder for each URL, and merges the phone data back into the existing CSV - preserving all other columns.
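The column-preserving merge could be sketched like this (the linkedin_url join key and mobile column are hypothetical names chosen for the example):

```python
def merge_phones(contacts, phone_results, key="linkedin_url"):
    """Append a mobile column to existing contact rows, matched on a shared key.
    All original columns are preserved; unmatched rows get an empty value."""
    phones = {r[key]: r.get("mobile", "") for r in phone_results}
    return [{**row, "mobile": phones.get(row.get(key, ""), "")} for row in contacts]
```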
“Skip trace the 20 names in prospects.txt using the People Finder. Flag any matches that also have a criminal record and write a summary report to reports/flagged.md.”
Copilot calls People Finder and Criminal Records in sequence, cross-references the results, and writes a structured Markdown report - ready to commit or share.
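The cross-reference and report step might be sketched as follows (the name and has_record fields are assumptions about the two tools' result shapes, not their documented schemas):

```python
def flag_report(people_results, criminal_results):
    """Cross-reference skip-trace matches against criminal-record hits
    and render a short Markdown summary. Field names are illustrative."""
    flagged = {r["name"] for r in criminal_results if r.get("has_record")}
    lines = ["# Flagged prospects", ""]
    for person in people_results:
        status = "FLAGGED" if person["name"] in flagged else "clear"
        lines.append(f"- {person['name']}: {status}")
    return "\n".join(lines)
```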
ScraperCity tools compose naturally. Here are multi-step pipelines that developers run through Copilot to automate lead generation end to end.
Apollo delivery takes 11-48+ hours. Copilot polls in the background and resumes the rest of the pipeline when results arrive.
Google Maps results come back in 5-30 minutes. For large city+category queries, split into multiple smaller scrapes to stay within your wallet balance.
Lead Database results are instant. Copilot can write the import-ready formatter in whatever schema your CRM expects.
Copilot can commit scraped data, create branches for different lead lists, and open PRs with enrichment results - all in one prompt. Your lead data lives in version control alongside your code.
Copilot sees your repo. Ask it to write scraping scripts that match your existing import formats, data models, or CI pipeline - no explaining the schema from scratch each time.
Use Shift+Tab in Copilot CLI to enter plan mode and review a multi-step scraping workflow before it executes. Verify each step, adjust the plan, then let it run.
Copilot CLI supports Claude, GPT, and Gemini models. Switch between them mid-session with /model. All models have access to the full set of ScraperCity tools.
Apollo scrapes take 11-48+ hours. Copilot handles the polling loop automatically. You don't need to write a separate polling script or remember to check back manually.
Commit .vscode/mcp.json (with input variables for API keys) to your repo. Every developer on the team gets ScraperCity tools in Copilot the moment they clone the project.
ScraperCity tools do not appear after adding the server
In Copilot CLI, run /mcp show to see all configured servers and their status. A server shown as offline usually means Node.js is not installed or not on your PATH. Confirm with node --version in a new terminal. In VS Code, make sure you clicked the Start button in mcp.json and that Copilot Chat is in Agent mode - tools are not available in Ask or Edit modes.
401 Unauthorized error when Copilot calls ScraperCity
Your SCRAPERCITY_API_KEY is missing or incorrect. Double-check the key at app.scrapercity.com/dashboard/api-docs. If you set the key in your shell profile, make sure you opened a new terminal session after adding it. If you used the env block in mcp.json, verify there are no extra spaces or quotes around the value.
'MCP servers in Copilot' is disabled for my organization
On Copilot Business and Enterprise plans, the MCP policy is disabled by default. An org admin must go to Organization Settings > Copilot > Policies and enable 'MCP servers in Copilot'. This policy does not apply to Copilot Free, Pro, or Pro+ users.
Apollo scrape is still pending after several hours
Apollo scrapes take 11-48+ hours depending on query size. This is normal. You can check the status by asking Copilot to poll the run ID, or visit app.scrapercity.com/dashboard to monitor active jobs. Duplicate requests submitted within 30 seconds are blocked automatically - this prevents accidental double charges.
npx takes a long time to start the MCP server
The first run downloads the scrapercity package from npm. Subsequent starts use the npm cache and are much faster. If startup time is a concern in a team environment, pre-install the package globally with npm install -g scrapercity and change the command to scrapercity-mcp instead of the npx invocation.
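For example, after the global install the CLI config can point at the binary directly (a sketch of the change; keep your own key and adjust paths as needed):

```json
// ~/.copilot/mcp-config.json -- after running: npm install -g scrapercity
{
  "mcpServers": {
    "scrapercity": {
      "type": "stdio",
      "command": "scrapercity-mcp",
      "env": {
        "SCRAPERCITY_API_KEY": "your_api_key_here"
      }
    }
  }
}
```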
Insufficient wallet balance error
Each scrape deducts from your ScraperCity wallet per result. Check your balance by asking Copilot to call the Wallet tool, or visit app.scrapercity.com/dashboard. Top up before running large Apollo or Google Maps jobs. You can set a budget limit in your prompt - for example, ask Copilot to stop after 500 results.