Integration Guide

ScraperCity + GitHub Copilot

Add ScraperCity as an MCP server in GitHub Copilot CLI or VS Code. Copilot can scrape Apollo leads, extract Google Maps businesses, validate emails, find phone numbers, and enrich contacts without leaving your development environment.

What Is a GitHub Copilot MCP Server?

The Model Context Protocol (MCP) is an open standard that lets AI assistants connect to external tools and data sources through a standardized interface. When you add an MCP server to Copilot, the AI gains the ability to call that server's tools directly from natural language - no copy-pasting between apps, no manual API calls.

ScraperCity exposes all 25 of its B2B data scrapers as MCP tools. Once the server is connected, Copilot can find leads, validate emails, look up phone numbers, run skip traces, and pull Google Maps business data just by understanding what you ask it to do. The result is a complete lead generation workflow that lives inside your editor or terminal.

There are two places to use ScraperCity with Copilot: the Copilot CLI in your terminal, and Copilot Chat in VS Code. Both use the same underlying MCP server - the configuration is slightly different for each environment, and both are covered below.

Prerequisites

  • A ScraperCity account and API key (copy it from app.scrapercity.com/dashboard/api-docs)
  • Node.js installed and on your PATH - the npx command launches the MCP server
  • GitHub Copilot access: the Copilot CLI in your terminal, or VS Code with Copilot Chat

Setup: Copilot CLI

Use this path if you prefer terminal-first workflows. Copilot CLI stores MCP configs in ~/.copilot/mcp-config.json by default.

Step 1: Add the ScraperCity MCP server

Inside an interactive Copilot CLI session, run /mcp add. A configuration form appears. Fill in the following fields:

  • Server Name: scrapercity
  • Server Type: Local or STDIO
  • Command: npx -y --package scrapercity scrapercity-mcp
  • Environment Variables: {"SCRAPERCITY_API_KEY": "your_api_key_here"}

Press Ctrl+S to save. Alternatively, you can edit the config file directly:

// ~/.copilot/mcp-config.json
{
  "mcpServers": {
    "scrapercity": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "--package", "scrapercity", "scrapercity-mcp"],
      "env": {
        "SCRAPERCITY_API_KEY": "your_api_key_here"
      }
    }
  }
}
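
If you edit the config file by hand, a quick sanity check can catch a malformed entry before Copilot tries to start the server. This is an illustrative sketch only - the checks and the placeholder detection are assumptions, not part of ScraperCity:

```python
import json

def check_scrapercity_config(raw: str) -> list[str]:
    """Return a list of problems found in a Copilot CLI MCP config (illustrative check)."""
    problems = []
    server = json.loads(raw).get("mcpServers", {}).get("scrapercity")
    if server is None:
        return ["no 'scrapercity' entry under 'mcpServers'"]
    if server.get("command") != "npx":
        problems.append("command should be 'npx'")
    key = server.get("env", {}).get("SCRAPERCITY_API_KEY", "")
    if not key or key == "your_api_key_here":
        problems.append("SCRAPERCITY_API_KEY is missing or still the placeholder")
    return problems

# A config that still contains the placeholder key is flagged:
sample = '{"mcpServers": {"scrapercity": {"command": "npx", "env": {"SCRAPERCITY_API_KEY": "your_api_key_here"}}}}'
print(check_scrapercity_config(sample))
```
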
Step 2: Set your API key in your shell profile

If you prefer to manage the key outside the config file, export it from your shell profile or a .env file instead:

export SCRAPERCITY_API_KEY="your_api_key_here"

Never commit your API key to source control. Use a .gitignored .env file or your OS keychain for team environments.

Step 3: Verify the connection

In your Copilot CLI session, run /mcp show scrapercity to confirm the server is connected and see the full list of available tools. You should see all 25 ScraperCity scrapers listed.

/mcp show scrapercity

If the server shows as offline, check that Node.js is installed and that your API key is set correctly. Run /mcp edit scrapercity to update the config without starting over.

Setup: Copilot Chat in VS Code

Use this path if you want ScraperCity available inside VS Code's Copilot Chat panel. The workspace config in .vscode/mcp.json can be committed to your repo so every team member gets the same tools automatically.

Step 1: Create the workspace MCP config

Add ScraperCity to .vscode/mcp.json in your project root. Create the file if it does not exist:

// .vscode/mcp.json
{
  "servers": {
    "scrapercity": {
      "command": "npx",
      "args": ["-y", "--package", "scrapercity", "scrapercity-mcp"],
      "env": {
        "SCRAPERCITY_API_KEY": "your_api_key_here"
      }
    }
  }
}

If you want the server available across all projects rather than just one workspace, place the same config in your VS Code user settings file (settings.json) under the mcp key instead.

Step 2: Start the server and switch to Agent mode

Open .vscode/mcp.json in the editor. A Start button appears above the server definition - click it. VS Code launches the ScraperCity MCP process and discovers the available tools.

Then open Copilot Chat (Ctrl+Alt+I on Windows/Linux, Ctrl+Cmd+I on Mac) and select Agent from the mode dropdown. Click the tools icon to confirm ScraperCity tools are listed.

Step 3: Keep the API key out of source control

Replace the literal key value with a VS Code input variable so teammates supply their own key on first start:

// .vscode/mcp.json (with input variable)
{
  "inputs": [
    {
      "id": "scrapercity_key",
      "type": "promptString",
      "description": "ScraperCity API key",
      "password": true
    }
  ],
  "servers": {
    "scrapercity": {
      "command": "npx",
      "args": ["-y", "--package", "scrapercity", "scrapercity-mcp"],
      "env": {
        "SCRAPERCITY_API_KEY": "${input:scrapercity_key}"
      }
    }
  }
}

VS Code prompts each user for the key once and stores it securely. You can safely commit this version of the file.

Available ScraperCity Tools in Copilot

Once connected, Copilot has access to all 25 ScraperCity tools. Here are the most commonly used ones for B2B lead generation workflows:

Tool            | What Copilot can do                                                | Cost          | Turnaround
----------------|--------------------------------------------------------------------|---------------|-----------
Apollo Scraper  | Scrape B2B contacts by title, industry, company size, and location | $0.0039/lead  | 11-48+ hrs
Google Maps     | Extract local businesses with phones, emails, and reviews          | $0.01/place   | 5-30 min
Email Finder    | Find a business email from a name and company domain               | $0.05/contact | 1-10 min
Email Validator | Verify deliverability, catch-all status, and MX records            | $0.0036/email | 1-10 min
Mobile Finder   | Look up mobile phone numbers from a LinkedIn URL or email          | $0.25/input   | 1-5 min
People Finder   | Skip trace by name, email, phone, or address                       | $0.02/result  | 2-10 min
Lead Database   | Query 3M+ B2B contacts instantly with filters ($649/mo plan)       | Plan required | Instant
Property Lookup | Get property data and owner contact from an address                | $0.15/address | 2-10 min
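
The per-result prices above make job costs easy to estimate before you launch. A minimal sketch - prices copied from the table, so check your dashboard for current rates before relying on them:

```python
# Per-result prices from the table above (subject to change - verify in the dashboard).
PRICE_PER_RESULT = {
    "apollo": 0.0039,
    "google_maps": 0.01,
    "email_finder": 0.05,
    "email_validator": 0.0036,
    "mobile_finder": 0.25,
    "people_finder": 0.02,
    "property_lookup": 0.15,
}

def estimate_cost(tool: str, results: int) -> float:
    """Rough wallet cost for a job, in dollars."""
    return round(PRICE_PER_RESULT[tool] * results, 2)

print(estimate_cost("apollo", 5000))          # 5,000 Apollo leads
print(estimate_cost("email_validator", 10000))  # validating 10,000 emails
```
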

What You Can Ask Copilot

These prompts work in both Copilot CLI and Copilot Chat in VS Code. Copilot selects the right ScraperCity tool, handles async polling, and delivers results - all without you writing a line of code.

Run an Apollo scrape for CTOs at cybersecurity companies in the US. Save results to data/cto-leads.csv and create a GitHub issue with the count and a sample of the first 10 rows.

Copilot triggers the scrape, waits for results, writes the CSV, then uses its native GitHub integration to create an issue documenting the output.

Read the target accounts from accounts.txt, look up the first 3 contacts at each company using ScraperCity email finder, and write a Python script that formats them for our CRM import.

Copilot reads the file, calls the email finder API per domain, and generates a formatting script tailored to your codebase patterns.

Scrape all yoga studios in San Francisco from Google Maps. For each one that has a website, run the email finder to get the owner contact. Export to a single CSV.

Copilot chains the Maps scrape, filters for businesses with websites, enriches each with the email finder, and merges the two datasets into one file.

Validate every email in our seed list at data/cold-outreach.csv. Remove invalid ones and commit the cleaned file.

Copilot runs validation in batches, filters out addresses that fail deliverability checks, writes the cleaned file, and commits it to your repo.
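
A cleanup step like the one Copilot generates for that prompt might look like this sketch. The validation_status column name and the "valid" label are assumptions - your validator output may label results differently:

```python
import csv
import io

def keep_deliverable(csv_text: str, status_col: str = "validation_status") -> str:
    """Drop rows whose validator status is not 'valid' (column name is hypothetical)."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    if not rows:
        return csv_text
    kept = [r for r in rows if r.get(status_col) == "valid"]
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(kept)
    return out.getvalue()

sample = "email,validation_status\na@example.com,valid\nb@example.com,invalid\n"
print(keep_deliverable(sample))
```
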

I have a list of LinkedIn URLs in linkedin.txt. Look up the mobile phone number for each one using the ScraperCity mobile finder and append the results to contacts.csv.

Copilot reads the file, calls the mobile finder for each URL, and merges the phone data back into the existing CSV - preserving all other columns.

Skip trace the 20 names in prospects.txt using the People Finder. Flag any matches that also have a criminal record and write a summary report to reports/flagged.md.

Copilot calls People Finder and Criminal Records in sequence, cross-references the results, and writes a structured Markdown report - ready to commit or share.

Common Workflows You Can Build

ScraperCity tools compose naturally. Here are multi-step pipelines that developers run through Copilot to automate lead generation end to end.

Full B2B outreach pipeline

  1. Apollo scrape by job title and industry to pull a raw contact list
  2. Email Finder fills in missing addresses from company domains
  3. Email Validator removes catch-all and undeliverable addresses
  4. Export cleaned list to CSV, commit to repo, open a PR for review

Apollo delivery takes 11-48+ hours. Copilot polls in the background and resumes the rest of the pipeline when results arrive.

Local business lead generation

  1. Google Maps scrape for a business category and city
  2. Filter results to those with a website domain
  3. Email Finder or Website Finder to enrich each record with contact info
  4. Merge datasets and export to a single enriched CSV

Google Maps results come back in 5-30 minutes. For large city+category queries, split into multiple smaller scrapes to stay within your wallet balance.
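
Splitting is easy to script. A generic sketch for chunking one large target list - say, neighborhoods within a city - into smaller scrape jobs:

```python
def batch(items: list, size: int) -> list[list]:
    """Split a large scrape target list into jobs of at most `size` items each."""
    return [items[i:i + size] for i in range(0, len(items), size)]

# Example: scrape one neighborhood pair at a time instead of the whole city at once.
neighborhoods = ["SoMa", "Mission", "Sunset", "Richmond", "Marina"]
print(batch(neighborhoods, 2))
```
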

Ecommerce store prospecting

  1. Store Leads scrape for Shopify or WooCommerce stores by niche keyword
  2. Filter for stores above a revenue threshold
  3. Email Finder to get the decision-maker contact at each store
  4. Format results for a Klaviyo, HubSpot, or custom CRM import

Store Leads results are instant. Copilot can write the import-ready formatter in whatever schema your CRM expects.
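
The formatter Copilot writes depends entirely on your CRM's import schema. As an illustration only - every field name below is hypothetical - mapping scraped store records to flat import rows might look like:

```python
def to_crm_rows(stores: list[dict]) -> list[dict]:
    """Map scraped store records to a hypothetical flat CRM import schema."""
    return [
        {
            "Company": s.get("name", ""),
            "Email": s.get("contact_email", ""),
            "Website": s.get("domain", ""),
        }
        for s in stores
    ]

stores = [{"name": "Acme Candles", "contact_email": "owner@acmecandles.com", "domain": "acmecandles.com"}]
print(to_crm_rows(stores))
```
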

Why Use ScraperCity Through Copilot

Git-native workflows

Copilot can commit scraped data, create branches for different lead lists, and open PRs with enrichment results - all in one prompt. Your lead data lives in version control alongside your code.

Codebase context

Copilot sees your repo. Ask it to write scraping scripts that match your existing import formats, data models, or CI pipeline - no explaining the schema from scratch each time.

Plan mode in Copilot CLI

Use Shift+Tab in Copilot CLI to enter plan mode and review a multi-step scraping workflow before it executes. Verify each step, adjust the plan, then let it run.

Model choice

Copilot CLI supports Claude, GPT, and Gemini models. Switch between them mid-session with /model. All models have access to the full set of ScraperCity tools.

Async polling handled for you

Apollo scrapes take 11-48+ hours. Copilot handles the polling loop automatically. You don't need to write a separate polling script or remember to check back manually.

Team config via source control

Commit .vscode/mcp.json (with input variables for API keys) to your repo. Every developer on the team gets ScraperCity tools in Copilot the moment they clone the project.

Troubleshooting

ScraperCity tools do not appear after adding the server

In Copilot CLI, run /mcp show to see all configured servers and their status. A server shown as offline usually means Node.js is not installed or not on your PATH. Confirm with node --version in a new terminal. In VS Code, make sure you clicked the Start button in mcp.json and that Copilot Chat is in Agent mode - tools are not available in Ask or Edit modes.

401 Unauthorized error when Copilot calls ScraperCity

Your SCRAPERCITY_API_KEY is missing or incorrect. Double-check the key at app.scrapercity.com/dashboard/api-docs. If you set the key in your shell profile, make sure you opened a new terminal session after adding it. If you used the env block in mcp.json, verify there are no extra spaces or quotes around the value.

'MCP servers in Copilot' is disabled for my organization

On Copilot Business and Enterprise plans, the MCP policy is disabled by default. An org admin must go to Organization Settings > Copilot > Policies and enable 'MCP servers in Copilot'. This policy does not apply to Copilot Free, Pro, or Pro+ users.

Apollo scrape is still pending after several hours

Apollo scrapes take 11-48+ hours depending on query size. This is normal. You can check the status by asking Copilot to poll the run ID, or visit app.scrapercity.com/dashboard to monitor active jobs. Duplicate requests submitted within 30 seconds are blocked automatically - this prevents accidental double charges.

npx takes a long time to start the MCP server

The first run downloads the scrapercity package from npm. Subsequent starts use the npm cache and are much faster. If startup time is a concern in a team environment, pre-install the package globally with npm install -g scrapercity and change the command to scrapercity-mcp instead of the npx invocation.
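
With the package installed globally, the CLI config entry no longer needs npx. A sketch of the adjusted ~/.copilot/mcp-config.json entry (same env block as before):

```json
{
  "mcpServers": {
    "scrapercity": {
      "type": "stdio",
      "command": "scrapercity-mcp",
      "args": [],
      "env": {
        "SCRAPERCITY_API_KEY": "your_api_key_here"
      }
    }
  }
}
```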

Insufficient wallet balance error

Each scrape deducts from your ScraperCity wallet per result. Check your balance by asking Copilot to call the Wallet tool, or visit app.scrapercity.com/dashboard. Top up before running large Apollo or Google Maps jobs. You can set a budget limit in your prompt - for example, ask Copilot to stop after 500 results.
