Integration Guide
Pull leads from ScraperCity and push them into Smartlead campaigns via the API. The example below uses the Lead Database endpoint, but you can use the same approach with any ScraperCity data source.
Smartlead exposes a REST API for adding leads to campaigns in bulk. You query ScraperCity for contacts matching your ICP (via the Lead Database, Apollo scraper, Google Maps scraper, or any other endpoint), map the fields to Smartlead's format, and POST them to a campaign. Leads start receiving your sequence based on the campaign schedule you have configured.
Smartlead authenticates via API key as a query parameter - not a Bearer token in headers. You can send up to 100 leads per request, and the API enforces a rate limit of 60 requests per 60 seconds per API key. That means you can load up to 6,000 leads per minute when batching at the maximum size.
The Smartlead API base URL is https://server.smartlead.ai/api/v1. Every endpoint requires your API key appended as a query parameter: ?api_key=YOUR_KEY.
ScraperCity API key: From app.scrapercity.com/dashboard/api-docs. Any plan includes API access, but the Lead Database endpoint requires the $649/mo plan.
Smartlead API key: Settings > Your Profile > API settings. Requires the PRO plan or higher; the Basic plan does not include API access.
Smartlead campaign ID: The numeric ID of your Smartlead campaign. Visible in the URL when viewing a campaign, or retrieve all IDs via GET /api/v1/campaigns/?api_key=YOUR_KEY.
HTTP client: curl, Python, Node.js, n8n, or any tool that can make HTTP requests. All examples below use curl for clarity.
curl -s "https://app.scrapercity.com/api/v1/database/leads?title=CEO&industry=computer%20software&country=United%20States&hasEmail=true&limit=100" \
-H "Authorization: Bearer $SCRAPERCITY_API_KEY"

Returns up to 100 leads with email, name, title, company, and LinkedIn. Paginate with &page=2, &page=3, etc. The Lead Database supports filtering by title, industry, country, company size, and more. The daily limit is 100,000 leads.
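The pagination above is easy to script. Here is a Python sketch of the collection loop; the page-fetching function is injected so the logic stands alone, and in practice fetch_page would wrap the GET request shown above (the exact JSON envelope of the response is an assumption - adjust the parsing to match what your plan actually returns).

```python
def pull_all_pages(fetch_page, max_pages=50):
    """Collect leads page by page until an empty page or max_pages is hit.

    fetch_page(page) should return a list of lead dicts -- in practice it
    wraps GET /api/v1/database/leads with &page=N appended.
    """
    leads = []
    for page in range(1, max_pages + 1):
        batch = fetch_page(page)
        if not batch:
            break  # an empty page means there is nothing left to pull
        leads.extend(batch)
    return leads

# Stand-in fetcher for illustration: two non-empty pages, then nothing.
pages = {
    1: [{"email": "[email protected]"}, {"email": "[email protected]"}],
    2: [{"email": "[email protected]"}],
}
result = pull_all_pages(lambda p: pages.get(p, []))
```

The injected fetcher also makes the loop trivial to test without hitting the live API.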
If you do not know your Smartlead campaign ID, retrieve all campaigns in your account. The response is an array ordered by ID descending (newest first):
curl -s "https://server.smartlead.ai/api/v1/campaigns/?api_key=$SMARTLEAD_API_KEY"

Each object in the response includes an id field - that is your campaign ID. Campaign status values are: ACTIVE, PAUSED, STOPPED, ARCHIVED, DRAFTED. Only add leads to an ACTIVE campaign.
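If you script this step, a small filter over the campaigns response keeps you from loading leads into a paused or drafted campaign. This is a sketch assuming each campaign object carries the id and status fields described above:

```python
def active_campaign_ids(campaigns):
    """Return the IDs of campaigns whose status is ACTIVE.

    Each item is one object from GET /api/v1/campaigns/ -- only ACTIVE
    campaigns should receive new leads.
    """
    return [c["id"] for c in campaigns if c.get("status") == "ACTIVE"]

# Illustrative response shape (IDs are made up):
sample = [
    {"id": 412, "status": "ACTIVE"},
    {"id": 398, "status": "PAUSED"},
    {"id": 377, "status": "DRAFTED"},
]
```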
POST to the Smartlead campaign leads endpoint. You can send up to 100 leads per request in the lead_list array:
curl -X POST "https://server.smartlead.ai/api/v1/campaigns/YOUR_CAMPAIGN_ID/leads?api_key=$SMARTLEAD_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"lead_list": [
{
"email": "[email protected]",
"first_name": "John",
"last_name": "Smith",
"company_name": "TechCorp",
"website": "techcorp.com",
"location": "San Francisco, CA",
"linkedin_profile": "https://linkedin.com/in/johnsmith",
"custom_fields": {
"Title": "CEO",
"Industry": "Computer Software"
}
}
],
"settings": {
"ignore_global_block_list": false,
"ignore_unsubscribe_list": false,
"ignore_duplicate_leads_in_other_campaign": false,
"ignore_community_bounce_list": false
}
}'

Note the API key goes in the URL as a query parameter, not in headers. A successful response returns { "ok": true }.
Map ScraperCity response fields to Smartlead lead fields:
email → email
first_name → first_name
last_name → last_name
company_name → company_name
company_domain → website
linkedin_url → linkedin_profile
city + state → location
mobile_number → phone_number
title → custom_fields.Title
company_industry → custom_fields.Industry

Anything in custom_fields is available as a merge tag in your Smartlead sequences. Use {{Title}} or {{Industry}} in your email copy. Smartlead supports a maximum of 20 custom fields per lead.
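The mapping is mechanical enough to capture in one Python function. A sketch: field names follow the mapping above, missing source fields are dropped rather than sent as empty strings, and city plus state are joined into location.

```python
def to_smartlead(lead):
    """Map one ScraperCity lead dict to Smartlead's lead_list format."""
    out = {
        "email": lead.get("email"),
        "first_name": lead.get("first_name"),
        "last_name": lead.get("last_name"),
        "company_name": lead.get("company_name"),
        "website": lead.get("company_domain"),
        "linkedin_profile": lead.get("linkedin_url"),
        "phone_number": lead.get("mobile_number"),
        "custom_fields": {},
    }
    # city + state collapse into Smartlead's single location field.
    if lead.get("city") and lead.get("state"):
        out["location"] = f'{lead["city"]}, {lead["state"]}'
    if lead.get("title"):
        out["custom_fields"]["Title"] = lead["title"]
    if lead.get("company_industry"):
        out["custom_fields"]["Industry"] = lead["company_industry"]
    # Drop fields the source lead did not have, and an empty custom_fields.
    return {k: v for k, v in out.items() if v not in (None, {})}

example = to_smartlead({
    "email": "[email protected]", "first_name": "Jane", "last_name": "Doe",
    "company_name": "TechCorp", "company_domain": "techcorp.com",
    "city": "San Francisco", "state": "CA",
    "title": "CEO", "company_industry": "Computer Software",
})
```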
After uploading, confirm the leads landed correctly by fetching the lead list for your campaign:
curl -s "https://server.smartlead.ai/api/v1/campaigns/YOUR_CAMPAIGN_ID/leads?api_key=$SMARTLEAD_API_KEY&offset=0&limit=100"

Each lead in the response includes a status field. New leads will show as STARTED (scheduled, not yet sent), then transition to INPROGRESS once the first email goes out, and finally COMPLETED when all sequence steps are exhausted.
The settings object in the POST body controls how Smartlead handles incoming leads. All four flags default to false:
ignore_global_block_list: When set to true, leads that appear on your global block list will still be added to the campaign. Default false. Leave this false unless you have a specific reason to override.
ignore_unsubscribe_list: When set to true, leads who previously unsubscribed will be added back. Default false. Keep this false to stay compliant with CAN-SPAM and GDPR.
ignore_duplicate_leads_in_other_campaign: When set to true, a lead already active in another campaign will be added to this one as well. Default false. Set to true when running parallel campaigns targeting the same audience on purpose.
ignore_community_bounce_list: When set to true, leads that have bounced across the Smartlead user base will bypass that check and be added anyway. Default false. Only override this if you have independently verified the email address.
The three-step pattern - pull from ScraperCity, transform fields, POST to Smartlead - is easy to wire up in whatever automation layer you already use. Here are the most common approaches:
Write a script that pulls from ScraperCity, transforms the data into Smartlead's lead_list format, and POSTs in batches of 100. Run it daily or weekly via cron. Add a 1-second sleep between batch requests to stay comfortably under the 60-requests-per-minute rate limit.
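The batching loop at the heart of such a script can be sketched in a few lines of Python: split the transformed leads into groups of 100 and pause between POSTs. Here post_batch is a placeholder for whatever function wraps the POST /campaigns/{id}/leads call.

```python
import time

def chunks(seq, size=100):
    """Split a lead list into Smartlead-sized batches (max 100 per POST)."""
    return [seq[i:i + size] for i in range(0, len(seq), size)]

def push_in_batches(leads, post_batch, delay=1.0):
    """POST each batch via post_batch(batch), sleeping between calls.

    The 1-second delay between batches keeps a full run under the
    60-requests-per-60-seconds rate limit.
    """
    batches = chunks(leads)
    for i, batch in enumerate(batches):
        post_batch(batch)
        if i < len(batches) - 1:
            time.sleep(delay)
```

Because post_batch is injected, the same loop works whether you POST with requests, urllib, or a subprocess call to curl.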
In n8n, an HTTP Request node pulls from ScraperCity with pagination. A Function node transforms the fields and splits the array into chunks of 100. A second HTTP Request node POSTs each chunk to Smartlead. Add a Wait node between POST requests to respect the rate limit. n8n's built-in error handling can retry failed requests automatically.
In Make, use an HTTP module to call the ScraperCity API, an Iterator module to loop through leads, and a second HTTP module to build and POST each batch to Smartlead. Make's built-in router can branch on errors - send failures to a separate data store for review without stopping the whole run.
Tell Claude Code: "Pull 1000 VPs of Sales from ScraperCity and load them into my Smartlead campaign [ID]. Map title and industry to custom fields. Batch in groups of 100 with a 1-second delay between requests." It writes and runs the full script in one shot, handling pagination and batching automatically.
Install the ScraperCity MCP server (npx -y --package scrapercity scrapercity-mcp) in Claude Desktop, Cursor, or any MCP-compatible client. From there you can pull leads by natural language prompt, then use the Smartlead API to push them - no manual scripting required.
This integration is not just for one-off imports. Below are four repeatable workflows teams use to keep their Smartlead campaigns filled with fresh, targeted contacts.
Every Monday, pull the 500 most recently added CEOs and VPs at Series A-C SaaS companies from the Lead Database, deduplicate against leads already in your campaigns, and add net-new contacts to a nurture campaign. Your pipeline stays full without any manual list building.
Use the Google Maps scraper to pull restaurants, salons, or contractors in a target city. Filter for businesses with fewer than 50 reviews (a proxy for low digital maturity) and push them into a Smartlead campaign offering website or SEO services. Repeat per city.
When a company posts a new job on LinkedIn for a VP of Sales or Revenue Operations, that is a buying signal for sales tools. Scrape relevant job postings, enrich the company record via the Apollo scraper to get the decision-maker's email, and add them to a timely Smartlead campaign within 24 hours of the job going live.
Use the BuiltWith scraper to find all e-commerce sites running Shopify. Enrich with the Store Leads scraper to get owner contact details and revenue estimates. Push only stores above a revenue threshold into a campaign offering your B2B service. Technographic targeting dramatically improves reply rates because the pitch is specific.
Pushing leads into Smartlead is only half the automation. The other half is reacting to what happens after emails go out. Smartlead webhooks fire on these events:
EMAIL_SENT: Log send events to your own analytics or CRM.
EMAIL_OPEN: Track open rates without manual export.
EMAIL_LINK_CLICK: Fire a CRM task when a prospect clicks a link.
EMAIL_REPLY: The most important signal - route to sales immediately.
LEAD_UNSUBSCRIBED: Suppress from all future ScraperCity pulls.
LEAD_CATEGORY_UPDATED: Sync intent signals (Interested, Not Interested) to your CRM.
Register a webhook for a campaign by POSTing to https://server.smartlead.ai/api/v1/campaigns/YOUR_CAMPAIGN_ID/webhooks?api_key=YOUR_KEY with a JSON body specifying your webhook_url and an array of event_types.
A practical closed-loop setup: when Smartlead fires an EMAIL_REPLY webhook, your endpoint updates the lead's status in your CRM, removes them from all other active Smartlead campaigns via the delete endpoint, and creates a follow-up task for a sales rep. This prevents prospects from receiving additional cold emails after they have already engaged.
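The routing decision itself is worth keeping as a pure function, separate from the HTTP endpoint that receives the webhook. This sketch assumes the payload carries an event_type field matching the event names above (verify the exact payload shape against Smartlead's webhook docs); the action names are placeholders for your own CRM and Smartlead API calls.

```python
def route_event(event):
    """Decide what to do with a Smartlead webhook payload.

    Returns a list of action names; a real handler would perform them,
    e.g. call the CRM API or Smartlead's delete-lead endpoint.
    """
    actions = []
    etype = event.get("event_type")
    if etype == "EMAIL_REPLY":
        # Engaged prospect: stop cold outreach, hand off to a human.
        actions += ["update_crm_status",
                    "remove_from_other_campaigns",
                    "create_sales_task"]
    elif etype == "LEAD_UNSUBSCRIBED":
        actions.append("suppress_in_scrapercity_pulls")
    elif etype == "LEAD_CATEGORY_UPDATED":
        actions.append("sync_category_to_crm")
    return actions
```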
Most errors when connecting ScraperCity to Smartlead fall into a small set of categories. Here is what each one means and how to fix it:
Cause: Your Smartlead API key is missing, incorrect, or being passed in the wrong place.
Fix: Smartlead does not use Bearer token auth. The key must be a query parameter: ?api_key=YOUR_KEY. Confirm the key is from Settings > Your Profile > API settings, not a webhook secret or any other credential.
Cause: The campaign ID in the URL does not exist or belongs to a different account.
Fix: Verify the ID by calling GET /api/v1/campaigns/?api_key=YOUR_KEY and matching the name. If you manage multiple Smartlead accounts, confirm you are using the API key that belongs to the account that owns the campaign.
Cause: One or more lead objects failed validation. Common causes: malformed email address, lead_list array exceeds 100 entries, or a required field is missing.
Fix: Check that every object in lead_list has a valid email field. Trim your batch to 100 or fewer leads. Validate email strings before sending - the ScraperCity Email Validator endpoint ($0.0036/email) can pre-screen addresses before they reach Smartlead.
Cause: You are exceeding Smartlead's limit of 60 requests per 60 seconds.
Fix: Add a delay between requests. A 1-second sleep between each batch of 100 gives you a comfortable ceiling of 60 batches (6,000 leads) per minute. In n8n, use a Wait node. In Python, use time.sleep(1). Do not hammer the API with immediate retries on 429 - use exponential backoff.
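That backoff advice can be sketched as a small retry wrapper. Here do_post stands in for the actual batch POST and should return the HTTP status code; the sleep function is injectable so the logic can be exercised without real waiting.

```python
import time

def post_with_backoff(do_post, max_retries=5, base_delay=1.0, sleep=time.sleep):
    """Retry a POST on 429 with exponential backoff (1s, 2s, 4s, ...).

    do_post() performs the request and returns its HTTP status code.
    Any non-429 status is returned immediately; after max_retries
    consecutive 429s, give up and return 429.
    """
    for attempt in range(max_retries):
        status = do_post()
        if status != 429:
            return status
        sleep(base_delay * (2 ** attempt))
    return 429
```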
Cause: The campaign is paused, stopped, or still in draft status.
Fix: Check campaign status via GET /api/v1/campaigns/YOUR_CAMPAIGN_ID?api_key=YOUR_KEY. The status field should read ACTIVE. If it is PAUSED or DRAFTED, activate it in the Smartlead dashboard before uploading leads. Leads added to a paused campaign queue up and begin sending when the campaign resumes.
Cause: The ScraperCity Authorization header is malformed or the API key has been rotated.
Fix: ScraperCity uses Bearer token auth in the header: Authorization: Bearer YOUR_KEY. The key comes from app.scrapercity.com/dashboard/api-docs. If you recently rotated your key, update it in all scripts and automation workflows.
Smartlead's add-leads endpoint accepts up to 100 leads per POST. Sending 1 lead per request instead of 100 means 100x more API calls for the same volume, burning your rate limit budget 100x faster. Always fill the batch.
Use GET /api/v1/campaigns/YOUR_CAMPAIGN_ID/leads to check whether an email already exists in the campaign before uploading. Alternatively, query GET /api/v1/leads?api_key=YOUR_KEY&email=ADDRESS to check across all campaigns. Uploading duplicates wastes credits on both platforms.
Run your lead list through the ScraperCity Email Validator endpoint before pushing to Smartlead. Bounced emails hurt your sender reputation. The validator checks deliverability, catch-all status, and MX records. Filter out invalid and catch-all addresses before they enter your Smartlead campaign.
Instead of uploading 5,000 leads at once, load 100-200 per day into an active campaign. Smartlead supports drip-feeding leads into live campaigns via the API, which spreads sending volume and is gentler on deliverability. A cron that runs once or twice a day is more sustainable than a weekly bulk import.
Any key you include in custom_fields maps directly to a merge tag in your Smartlead sequence. Passing Title, Industry, and Company_Size as custom fields lets you write one email template that dynamically adapts to each prospect. More personalization at the field level means better reply rates without rewriting copy.