
Integration Guide

ScraperCity + Instantly

Pull leads from ScraperCity and push them straight into Instantly campaigns via the V2 API. The example below uses the Lead Database endpoint, but you can use the same approach with any ScraperCity data source - Apollo scraper, Google Maps, Store Leads, and more.

No CSV exports. No manual uploads. Your ICP contacts go from query to active sequence in minutes.

The Pipeline

ScraperCity API (GET leads) → Your Script (transform fields) → Instantly V2 API (POST leads to campaign)

You query ScraperCity for contacts matching your ICP (via the Lead Database, Apollo scraper, or any other endpoint), then POST those contacts to an Instantly campaign. Instantly picks them up and starts sending your sequence.

Both APIs are REST-based with Bearer token authentication and JSON payloads, so any language or automation tool that can make HTTP requests will work. The full integration takes under 30 minutes to set up.

What You Need

ScraperCity API Key

From app.scrapercity.com/dashboard/api-docs. Any plan includes API access. The Lead Database endpoint requires the $649/mo plan.

Instantly API Key (V2)

From your Instantly workspace: Settings > Integrations > API. Create a key with the leads:create scope. Note: V2 keys are separate from V1 keys - you must generate a new one.

Instantly Campaign ID

Open your campaign in Instantly and copy the ID from the URL, or use the GET /api/v2/campaigns endpoint to list all campaigns and their IDs.

A script runner

curl, Python, Node.js, n8n, or any tool that can make HTTP requests. The examples below use curl for clarity, with a full Python script in Step 4.

Important: The Instantly V2 API is not compatible with V1. If you have an existing V1 key, you must generate a new V2 key from your Instantly dashboard. V2 uses Authorization: Bearer authentication, not the query-string api_key= format used in V1.

Step by Step

1

Pull leads from ScraperCity

curl -s "https://app.scrapercity.com/api/v1/database/leads?title=CTO&country=United%20States&hasEmail=true&limit=100" \
  -H "Authorization: Bearer $SCRAPERCITY_API_KEY"

Returns up to 100 leads with email, name, title, company, and LinkedIn. Paginate with &page=2, &page=3, etc. The Lead Database returns up to 100,000 leads per day.

You can filter by title, industry, country, state, city, companySize, and more to narrow your ICP before pulling. Only pull contacts you intend to contact - tight targeting improves reply rates and keeps your Instantly sender reputation healthy.

2

Push leads into an Instantly campaign

For each lead, POST to the Instantly V2 leads endpoint. You can send them one at a time or loop through your list:

curl -X POST "https://api.instantly.ai/api/v2/leads" \
  -H "Authorization: Bearer $INSTANTLY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "campaign": "YOUR_CAMPAIGN_ID",
    "email": "[email protected]",
    "first_name": "John",
    "last_name": "Smith",
    "company_name": "TechCorp",
    "website": "techcorp.com",
    "custom_variables": {
      "title": "CTO",
      "linkedin": "https://linkedin.com/in/johnsmith"
    }
  }'

The endpoint returns the created lead object on success (HTTP 200). If the email already exists in the campaign, Instantly skips it silently - no error is thrown.

3

Field mapping reference

Map ScraperCity response fields to Instantly lead fields:

ScraperCity field       Instantly field
email                →  email
first_name           →  first_name
last_name            →  last_name
company_name         →  company_name
company_domain       →  website
mobile_number        →  phone
title                →  custom_variables.title
linkedin_url         →  custom_variables.linkedin
company_industry     →  custom_variables.industry
city + state         →  custom_variables.location

Any field you put in custom_variables is available as a merge tag in your Instantly sequences. Use {{title}}, {{industry}}, or {{location}} in your email copy to personalize at scale.
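As a sketch, the whole mapping table can be wrapped in one helper. The ScraperCity field names follow the table above; treat them as assumptions if your response shape differs:

```python
def to_instantly_payload(lead, campaign_id):
    """Map one ScraperCity lead record to an Instantly V2 lead payload.

    Field names on the ScraperCity side follow the mapping table above.
    """
    # city + state -> custom_variables.location, skipping missing parts
    location = ", ".join(p for p in (lead.get("city"), lead.get("state")) if p)
    return {
        "campaign": campaign_id,
        "email": lead.get("email"),
        "first_name": lead.get("first_name"),
        "last_name": lead.get("last_name"),
        "company_name": lead.get("company_name"),
        "website": lead.get("company_domain"),
        "phone": lead.get("mobile_number"),
        "custom_variables": {
            "title": lead.get("title"),
            "linkedin": lead.get("linkedin_url"),
            "industry": lead.get("company_industry"),
            "location": location or None,
        },
    }
```

Missing source fields come through as None rather than raising, so partial records still load.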

4

Full Python script

This script pulls one page of leads from ScraperCity and loads them all into an Instantly campaign. Extend with a loop over pages for larger batches.

import os
import requests
import time

SCRAPERCITY_KEY = os.environ["SCRAPERCITY_API_KEY"]
INSTANTLY_KEY   = os.environ["INSTANTLY_API_KEY"]
CAMPAIGN_ID     = os.environ["INSTANTLY_CAMPAIGN_ID"]

# 1. Pull leads from ScraperCity
sc_resp = requests.get(
    "https://app.scrapercity.com/api/v1/database/leads",
    headers={"Authorization": f"Bearer {SCRAPERCITY_KEY}"},
    params={"title": "VP of Sales", "country": "United States", "hasEmail": "true", "limit": 100}
)
sc_resp.raise_for_status()
leads = sc_resp.json().get("data", [])
print(f"Fetched {len(leads)} leads from ScraperCity")

# 2. Push each lead into Instantly
success, errors = 0, 0
for lead in leads:
    payload = {
        "campaign": CAMPAIGN_ID,
        "email":        lead.get("email"),
        "first_name":   lead.get("first_name"),
        "last_name":    lead.get("last_name"),
        "company_name": lead.get("company_name"),
        "website":      lead.get("company_domain"),
        "custom_variables": {
            "title":    lead.get("title"),
            "linkedin": lead.get("linkedin_url"),
            "industry": lead.get("company_industry"),
        }
    }
    resp = requests.post(
        "https://api.instantly.ai/api/v2/leads",
        headers={
            "Authorization": f"Bearer {INSTANTLY_KEY}",
            "Content-Type": "application/json"
        },
        json=payload
    )
    if resp.status_code == 200:
        success += 1
    elif resp.status_code == 429:
        # Rate limit hit - back off and retry once
        time.sleep(2)
        retry = requests.post(
            "https://api.instantly.ai/api/v2/leads",
            headers={"Authorization": f"Bearer {INSTANTLY_KEY}", "Content-Type": "application/json"},
            json=payload
        )
        if retry.status_code == 200:
            success += 1
        else:
            errors += 1
    else:
        errors += 1

print(f"Done: {success} added, {errors} errors")

Store your keys in environment variables, not in the script itself. Never hardcode API credentials in source files.

Ways to Automate This

Bash/Python script on a cron

Write a script that pulls from ScraperCity, transforms the data, and POSTs to Instantly. Run it daily or weekly via cron or a scheduled task. This is the simplest approach for recurring campaigns targeting a stable ICP filter.
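For example, a crontab entry along these lines runs the sync every Monday morning (the script path and log path are illustrative, not from ScraperCity's docs):

```
# m h dom mon dow  command
0 8 * * 1  /usr/bin/python3 /opt/outbound/push_leads.py >> /var/log/push_leads.log 2>&1
```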

n8n workflow

Use the HTTP Request node to pull from ScraperCity (with pagination), then a second HTTP Request node to POST each lead to Instantly. Use the Split In Batches node to loop through results. Schedule the workflow to run on any interval. n8n is self-hostable and free for basic use.

Zapier or Make

Both platforms support HTTP request actions and can connect to any REST API. Build a Zap or Scenario that pulls leads from ScraperCity on a schedule and pushes each one to Instantly. Good option if you are already using either platform for other automations.

Claude Code or Cursor (MCP)

Tell Claude Code: "Pull 500 VP of Sales contacts from ScraperCity and load them into my Instantly campaign [ID]. Map title, company, and industry to custom variables." With the ScraperCity MCP server configured, it writes and runs the script for you without leaving your editor.

ScraperCity MCP Server Config

Add this to your MCP client config (Claude Code, Cursor, Windsurf, Cline, or Devin) to control ScraperCity directly from your AI editor:

{
  "mcpServers": {
    "scrapercity": {
      "command": "npx",
      "args": ["-y", "--package", "scrapercity", "scrapercity-mcp"],
      "env": { "SCRAPERCITY_API_KEY": "your_api_key_here" }
    }
  }
}

Common Workflows

The ScraperCity + Instantly integration covers a wide range of outbound use cases. Here are the most common ones teams build.

ICP targeting by job title and geography

Query the Lead Database for a specific job title (e.g. Head of Engineering) in a specific country or city. Push the results into an Instantly campaign with location-specific copy. This is the most common use case - targeted lists by role and region tend to outperform broad sprays.

Industry vertical campaigns

Filter ScraperCity by company_industry to build a list for a specific vertical - SaaS, fintech, healthcare, logistics, etc. Use the industry custom variable in your Instantly sequences to personalize the opening line and pain points.

Technographic targeting via BuiltWith scraper

Use ScraperCity's BuiltWith scraper to find all companies using a specific technology (e.g. Salesforce, HubSpot, Shopify). Then run those company domains through the Email Finder or Lead Database to get contacts. Push into Instantly with technology-specific messaging.

Local business outreach via Google Maps

Pull local business listings from ScraperCity's Google Maps scraper with phone numbers and emails. Push into an Instantly campaign for a location-specific outreach sequence. Good for agencies targeting businesses in a specific city or metro.

Ecommerce store prospecting via Store Leads

Query ScraperCity's Store Leads endpoint for Shopify or WooCommerce stores by category or revenue range. Load the store owner contacts into Instantly for an ecommerce-specific cold email campaign.

Performance Tips

A few practices that keep the pipeline running cleanly at volume.

Filter for verified emails before pushing

Use ScraperCity's hasEmail=true filter when querying the Lead Database to only pull contacts that have a known email address. For Apollo scraper results, run the emails through ScraperCity's Email Validator before loading them into Instantly. This keeps your bounce rate low and protects your sender reputation.

Stay under the Instantly rate limit

The Instantly V2 API allows up to 6,000 requests per minute. For most use cases this is more than enough, but if you are pushing thousands of leads in a tight loop, add a small sleep (10-20ms) between requests or use a queue. If you receive a 429 response, back off with exponential delay before retrying.
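One way to sketch that backoff. The `post_fn` parameter is a stand-in for whatever performs the actual POST (e.g. a `functools.partial` around `requests.post`); nothing here is Instantly-specific:

```python
import time

def post_with_backoff(post_fn, payload, max_retries=4, base_delay=1.0):
    """Call post_fn(payload); on HTTP 429, wait and retry with a doubling delay."""
    delay = base_delay
    for attempt in range(max_retries + 1):
        resp = post_fn(payload)
        if resp.status_code != 429:
            return resp          # success or a non-rate-limit error: caller decides
        if attempt < max_retries:
            time.sleep(delay)    # 1s, 2s, 4s, ... per the guidance above
            delay *= 2
    return resp                  # still 429 after all retries
```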

Keep ScraperCity pagination tight

The Lead Database returns up to 100 leads per page and allows up to 100,000 leads per day. For large pulls, loop through pages using the page parameter. Start with page=1 and increment until you get fewer results than your limit, which signals you have reached the last page.

Use targeted custom variables for personalization

The more custom_variables you pass from ScraperCity to Instantly, the more you can personalize your sequences without manual work. At minimum, pass title, company_name, and industry. These three fields let you write one sequence that reads as personalized for thousands of prospects.

Deduplicate across campaigns at the source

Instantly deduplicates within a single campaign by email address, but not across campaigns. If you are running multiple campaigns in parallel, track which emails you have already pushed in a simple local set or database table before each POST to avoid sending the same contact multiple sequences.

Troubleshooting

Common errors and how to fix them.

401 Unauthorized or 403 Forbidden

Wrong API key, missing scope, or incorrect header format. Confirm you are using a V2 key (not a V1 key), that the Authorization header is exactly Authorization: Bearer YOUR_KEY, and that the key has the leads:create scope enabled.

400 Bad Request

Invalid payload or missing required fields. The email and campaign fields are required on every POST. Validate that your JSON is well-formed and that the campaign ID is a valid UUID from your Instantly workspace.
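A quick pre-flight check along those lines can catch both problems before the POST. This is a sketch covering only the two requirements stated above (email present, campaign a valid UUID):

```python
import uuid

def payload_problems(payload):
    """Return a list of reasons this payload would likely be rejected with a 400."""
    problems = []
    if not payload.get("email"):
        problems.append("email is required")
    try:
        uuid.UUID(str(payload.get("campaign")))
    except ValueError:
        problems.append("campaign must be a valid UUID")
    return problems
```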

429 Too Many Requests

You have exceeded the rate limit. Add a retry loop with exponential backoff. Wait at least 1 second before retrying, and double the wait time on each subsequent failure. The Instantly V2 API allows up to 6,000 requests per minute.

404 Not Found

Wrong endpoint path. Confirm the URL is https://api.instantly.ai/api/v2/leads (not /api/v1/ and not app.instantly.ai). Also verify the campaign ID exists in your workspace.

Lead added but custom variables are empty in sequences

Custom variable keys must exactly match the merge tag names in your Instantly sequence templates. If your template uses {{title}}, the custom_variables key must be title (lowercase, no spaces). Check your sequence templates for the exact tag names.

ScraperCity returns 0 leads

Your filter combination returned no results. Try broadening the query - remove the state or city filter, widen the title filter, or remove hasEmail=true temporarily to see the full result set. The Lead Database requires the $649/mo plan - confirm your plan includes access.
