Integration Guide
Import leads from ScraperCity into EmailBison campaigns through the API. The example below uses the Lead Database endpoint, but the same approach works with any ScraperCity data source.
EmailBison exposes a full REST API for managing leads and campaigns. You create leads through the API, then attach them to a campaign. Once attached, leads start receiving your sequence steps based on the campaign schedule - routed across dedicated IP pools for deliverability.
The integration is two API calls: one to ScraperCity to pull contacts matching your ICP, and one to EmailBison to import those contacts into a campaign. You can do this manually with curl, automate it with a script, or wire it up through n8n.
ScraperCity gives you 25 data sources to build those contact lists - from the 4.6M+ contact Lead Database to Apollo scrapes, Google Maps local businesses, Shopify store owners, and more. Every source returns structured contact data you can pipe directly into EmailBison without manual cleanup.
ScraperCity API key: from app.scrapercity.com/dashboard/api-docs. Every plan includes API access, but the Lead Database endpoint requires the $649/mo plan.
EmailBison API token: Settings > Developer API > New API Token. Use the api-user type. The token is shown once, so save it immediately.
EmailBison base URL: your workspace URL + /api. Example: https://dedi.emailbison.com/api
Campaign ID: the numeric ID of the campaign you want to load leads into. Find it in the EmailBison campaign URL.
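The examples below read these credentials from environment variables. A setup sketch with placeholder values (substitute your own):

```shell
# Placeholder values - replace with your own credentials.
export SCRAPERCITY_API_KEY="your-scrapercity-key"    # from app.scrapercity.com/dashboard/api-docs
export EMAILBISON_TOKEN="your-api-user-token"        # from Settings > Developer API
export EMAILBISON_BASE="https://dedi.emailbison.com/api"
```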
Query the Lead Database endpoint with your ICP filters. The example below pulls VP of Sales contacts in the computer software industry who have a verified email. You can swap in any combination of title, industry, location, company size, and more.
```bash
curl -s "https://app.scrapercity.com/api/v1/database/leads?title=VP+of+Sales&industry=computer%20software&hasEmail=true&limit=100" \
  -H "Authorization: Bearer $SCRAPERCITY_API_KEY"
```

The Lead Database returns up to 100 leads per page. Use the page parameter to paginate. The daily limit is 100,000 leads. Requires the $649/mo plan.
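In a script, pagination is just a loop over the page parameter. A sketch of the URL construction (the filter names match the curl example above; swap in your own ICP):

```javascript
// Build a paginated Lead Database query URL.
// Filter names (title, industry, hasEmail, limit, page) match the curl example.
function leadsUrl(page, filters = {
  title: 'VP of Sales',
  industry: 'computer software',
  hasEmail: 'true',
  limit: '100',
}) {
  const params = new URLSearchParams({ ...filters, page: String(page) });
  return `https://app.scrapercity.com/api/v1/database/leads?${params}`;
}
```

Fetch page 1, then increment page until a response comes back with fewer than 100 leads.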
For each contact from the ScraperCity response, POST to the EmailBison leads endpoint. Include all fields you want available for personalization inside custom_variables.
```bash
curl -X POST "https://dedi.emailbison.com/api/leads" \
  -H "Authorization: Bearer $EMAILBISON_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "email": "[email protected]",
    "first_name": "Jane",
    "last_name": "Chen",
    "company": "SaaSCorp",
    "custom_variables": {
      "title": "VP of Sales",
      "linkedin": "https://linkedin.com/in/janechen",
      "industry": "Computer Software",
      "city": "San Francisco",
      "employee_count": "250"
    }
  }'
```

This creates the lead in your EmailBison workspace. Note the lead ID in the response for the next step. If the email already exists, EmailBison returns the existing lead record.
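In a script, the translation from a ScraperCity record to this payload is one small mapping function. A sketch, assuming the ScraperCity field names used in the Step 4 script (linkedin_url and employee_count are assumptions about the response shape; verify them against your actual API response):

```javascript
// Map one ScraperCity lead record to an EmailBison lead payload.
// Source field names (linkedin_url, employee_count) are assumptions -
// check them against your actual ScraperCity response.
function toBisonLead(lead) {
  return {
    email: lead.email,
    first_name: lead.first_name,
    last_name: lead.last_name,
    company: lead.company,
    custom_variables: {
      title: lead.title,
      linkedin: lead.linkedin_url,
      industry: lead.industry,
      city: lead.city,
      employee_count: lead.employee_count,
    },
  };
}
```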
Add the leads to your target campaign by passing an array of lead IDs. You can collect IDs from step 2 and batch them together:
```bash
curl -X POST "https://dedi.emailbison.com/api/campaigns/6/leads" \
  -H "Authorization: Bearer $EMAILBISON_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "lead_ids": [101, 102, 103, 104, 105]
  }'
```

Leads added to an active campaign sync within 5 minutes and begin receiving sequence steps according to your campaign schedule.
For ongoing prospecting, wrap the three curl calls in a script that runs on a schedule. Here is the full flow in Node.js:
```javascript
// import-leads.mjs
const SCRAPERCITY_KEY = process.env.SCRAPERCITY_API_KEY;
const BISON_TOKEN = process.env.EMAILBISON_TOKEN;
const BISON_BASE = 'https://dedi.emailbison.com/api';
const CAMPAIGN_ID = 6;

function sleep(ms) {
  return new Promise((r) => setTimeout(r, ms));
}

async function bisonPost(path, body, attempt = 0) {
  const res = await fetch(`${BISON_BASE}${path}`, {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${BISON_TOKEN}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify(body),
  });
  if (res.status === 429 && attempt < 5) {
    // Respect rate limits - back off and retry
    await sleep(Math.min(500 * 2 ** attempt, 30000));
    return bisonPost(path, body, attempt + 1);
  }
  return res.json();
}

async function run() {
  // 1. Pull leads from ScraperCity
  const sc = await fetch(
    'https://app.scrapercity.com/api/v1/database/leads?title=VP+of+Sales&industry=computer%20software&hasEmail=true&limit=100',
    { headers: { Authorization: `Bearer ${SCRAPERCITY_KEY}` } }
  );
  const { leads } = await sc.json();

  // 2. Create each lead in EmailBison
  const leadIds = [];
  for (const lead of leads) {
    const result = await bisonPost('/leads', {
      email: lead.email,
      first_name: lead.first_name,
      last_name: lead.last_name,
      company: lead.company,
      custom_variables: {
        title: lead.title,
        linkedin: lead.linkedin_url,
        industry: lead.industry,
        city: lead.city,
      },
    });
    if (result.id) leadIds.push(result.id);
    await sleep(200); // stay under rate limits
  }

  // 3. Attach to campaign
  await bisonPost(`/campaigns/${CAMPAIGN_ID}/leads`, { lead_ids: leadIds });
  console.log(`Imported ${leadIds.length} leads into campaign ${CAMPAIGN_ID}`);
}

run().catch(console.error);
```

Run with node import-leads.mjs or schedule via cron. The 200ms delay between lead creation calls keeps you well within EmailBison rate limits.
Any field you include in custom_variables when creating the lead becomes available as a Liquid variable in your EmailBison sequence. EmailBison supports the full Liquid syntax - conditionals, templates, and date combinations - so you can personalize every email with data pulled directly from ScraperCity.
Beyond simple merge fields, you can use Liquid conditionals to vary copy based on industry, company size, or location. For example, reference a prospect's city only when it is populated, or use different value propositions for different verticals - all in the same campaign.
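For example, a step might mention the prospect's city only when it is populated. A sketch in standard Liquid syntax (the variable names match the custom_variables from Step 2):

```liquid
Hi {{ first_name }},

{% if city != blank %}Next time I am in {{ city }}, I would love to grab coffee.{% else %}I would love to find 15 minutes to connect.{% endif %}
```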
| Variable | Example value | Sample usage in a sequence |
| --- | --- | --- |
| {{title}} | VP of Sales | Hi {{first_name}}, as a {{title}} at {{company}}... |
| {{industry}} | Computer Software | I work with other {{industry}} companies... |
| {{city}} | San Francisco | Next time I am in {{city}}... |
| {{linkedin}} | linkedin.com/in/... | Saw your profile on LinkedIn... |
| {{employee_count}} | 250 | For a team your size at {{company}}... |
| {{technology}} | Salesforce | Since you are using {{technology}}... |
The Lead Database is not the only way to build lists for EmailBison. Every ScraperCity data source returns structured contact data on the same API pattern - pull once, import anywhere. Here are the most common sources for cold outreach automation:
Lead Database: millions of B2B contacts filterable by job title, industry, location, company size, and more. Returns email, name, title, company, and LinkedIn URL. Requires the $649/mo plan. Results are instant.
Apollo Scraper: scrape contacts from Apollo.io by search URL. Returns the same fields as Apollo exports without a separate Apollo subscription. Delivery in 11-48+ hours.
Google Maps Scraper: pull local business listings with phone, email, and owner contact. Useful for outreach to SMBs, restaurants, medical offices, and service businesses. Delivery in 5-30 min.
Email Finder: have a name and company but no email? The Email Finder generates a verified business email from name + domain. Pair it with any list of contacts missing emails.
Ecommerce store owners: Shopify and WooCommerce store owners with verified emails. Filter by platform, category, and revenue range. Good for SaaS tools targeting ecommerce operators.
Email Validator: before importing a large batch into EmailBison, run it through the Email Validator to check deliverability, catch-all status, and MX records. This protects sender reputation.
The basic three-step flow above covers most use cases, but several patterns are worth knowing when building a production-grade prospecting pipeline.
Schedule a cron job to run every morning. It queries ScraperCity for new leads matching your ICP added in the last 24 hours, creates them in EmailBison, and attaches them to a rolling campaign. Your sequence always has fresh contacts without manual work.
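As a sketch, the crontab entry for the Step 4 script might look like this (the path and schedule are placeholders; adjust them to your setup):

```cron
# Every day at 07:00 server time: pull new leads and import them.
0 7 * * * cd /path/to/project && node import-leads.mjs >> import.log 2>&1
```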
Use the Google Maps scraper to pull businesses in a target city and vertical. Filter by presence of a business email, then import directly into an EmailBison campaign aimed at local decision makers. Works well for agencies, software tools, and service providers targeting SMBs.
Use the BuiltWith scraper to identify companies running a specific tech stack - then target the relevant buyer persona at those companies. A tool that integrates with Salesforce, for example, should only prospect companies that use Salesforce.
EmailBison can push interested replies directly into a follow-up campaign via the API or from the master inbox. Combine this with a webhook to notify your CRM or Slack when a reply is marked as interested, so your team can follow up immediately while the lead is warm.
EmailBison sends webhook events for key actions: email sent, reply received, interested, unsubscribed, bounced, and tag attached. You can subscribe to these in Settings > Webhooks.
This means you can build a full loop: ScraperCity sources the leads, EmailBison sequences them, and webhooks notify your CRM or spreadsheet when someone replies or shows interest. Use n8n or Zapier to catch the webhook and route the data wherever it needs to go.
A typical webhook payload includes the workspace ID, lead details, and event type. Here is what a TAG_ATTACHED event looks like:
```json
{
  "event": {
    "type": "TAG_ATTACHED",
    "name": "Tag Attached",
    "instance_url": "https://dedi.emailbison.com",
    "workspace_id": 2,
    "workspace_name": "Red Team"
  },
  "lead": {
    "id": 101,
    "email": "[email protected]",
    "first_name": "Jane",
    "company": "SaaSCorp"
  },
  "tag": {
    "name": "interested"
  }
}
```

Catch this in an n8n Webhook node, then branch: push the lead into a follow-up campaign, create a deal in your CRM, or send a Slack notification to your sales rep.
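A minimal routing sketch for these payloads (only the TAG_ATTACHED type string is confirmed by the sample above; the other event type strings are assumptions to verify against your own webhook logs):

```javascript
// Route an incoming EmailBison webhook payload to an action name.
// Only TAG_ATTACHED is confirmed by the sample payload; UNSUBSCRIBED
// and BOUNCED are assumed type strings - check your webhook logs.
function routeWebhook(payload) {
  const type = payload.event && payload.event.type;
  const tag = payload.tag && payload.tag.name;
  if (type === 'TAG_ATTACHED' && tag === 'interested') {
    return 'notify-sales'; // e.g. Slack ping + follow-up campaign
  }
  if (type === 'UNSUBSCRIBED' || type === 'BOUNCED') {
    return 'suppress-lead'; // keep out of future imports
  }
  return 'log-only';
}
```

Wire the returned action names to whatever your n8n or Zapier flow does next.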
Most issues with this integration fall into one of three categories: auth, rate limits, or data shape. Here is how to handle each one.
Symptom: API calls return 401 Unauthorized.
Cause: Your EmailBison Bearer token is missing, expired, or scoped to the wrong workspace.
Fix: Regenerate the token at Settings > Developer API. Confirm you are using an api-user token (not super-admin). Verify the token matches the workspace whose campaign ID you are targeting. The Authorization header must read Authorization: Bearer YOUR_TOKEN with no extra spaces.
Symptom: API calls return 429 Too Many Requests.
Cause: You are hitting the EmailBison rate limit by creating leads too quickly in a tight loop.
Fix: Add a 200-300ms delay between individual lead creation calls. If you continue to hit 429s, implement exponential backoff: wait 500ms, then 1s, then 2s, doubling with each retry up to a cap. The script example in Step 4 above includes this pattern.
Symptom: The campaign attach call fails with a not-found error.
Cause: The campaign ID in your URL does not exist or belongs to a different workspace.
Fix: Open the campaign in the EmailBison UI and check the numeric ID in the browser URL. Confirm your API token is scoped to the same workspace that owns the campaign.
Symptom: Leads are created and attached, but no emails go out.
Cause: The campaign is paused, the lead was added to an inactive campaign, or the schedule has not reached a send window.
Fix: Check campaign status in the EmailBison UI. Confirm the campaign is active and the schedule includes the current time window. Leads added to an active campaign typically sync within 5 minutes.
Symptom: The ScraperCity query returns zero leads.
Cause: Your filter combination is too narrow, or the Lead Database does not have contacts matching all criteria simultaneously.
Fix: Broaden one filter at a time. Remove hasEmail=true first to see total volume, then re-add it. Use the ScraperCity dashboard UI to test queries before scripting them.
Run your ScraperCity leads through the Email Validator endpoint ($0.0036/email) before creating them in EmailBison. Removing catch-all and invalid addresses upfront protects your sender reputation on the dedicated IP pool.
The Lead Database returns up to 100 leads per page. Loop through pages sequentially rather than in parallel to avoid hitting the 100,000 leads/day cap accidentally on a misconfigured script.
You can pass multiple lead IDs in a single POST to /api/campaigns/{id}/leads. Collect all IDs from your lead creation loop, then send one attach request instead of one per lead. This reduces API calls significantly on large imports.
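On very large imports you may still want to cap the batch size per request. A small chunking helper keeps each attach call bounded (the 500-per-request cap here is an arbitrary assumption, not a documented EmailBison limit):

```javascript
// Split lead IDs into batches for POST /api/campaigns/{id}/leads.
// The default batch size of 500 is an assumption - tune it to taste.
function chunkIds(ids, size = 500) {
  const batches = [];
  for (let i = 0; i < ids.length; i += size) {
    batches.push(ids.slice(i, i + size));
  }
  return batches;
}
```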
EmailBison recommends api-user tokens for integrations. They are scoped to a single workspace and do not carry the elevated permissions of super-admin tokens, reducing risk if a token is accidentally exposed in logs or environment files.
Check whether an email already exists in EmailBison before POSTing it as a new lead. Duplicate leads in the same campaign cause sequence confusion. Store created lead IDs in your database and skip any email you have already imported.
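A simple in-memory guard, assuming you keep a set of already-imported emails (in production, back this set with the database where you store created lead IDs):

```javascript
// Skip leads whose email was already imported; emails compared
// case-insensitively. Back importedEmails with your own database.
function filterNewLeads(leads, importedEmails) {
  return leads.filter((lead) => {
    const email = (lead.email || '').toLowerCase();
    if (!email || importedEmails.has(email)) return false;
    importedEmails.add(email);
    return true;
  });
}
```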
Use the same key names across all leads in a campaign. If some leads have a 'city' key and others do not, Liquid will render a blank string by default - but inconsistent keys make debugging harder. Normalize your ScraperCity output before importing.
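One way to normalize is to project every lead onto a fixed set of keys before import, so every lead in the campaign carries identical custom_variables (empty string where ScraperCity has no value):

```javascript
// Project every lead's data onto a fixed set of custom_variables keys
// so Liquid templates see the same keys on every lead in the campaign.
const VARIABLE_KEYS = ['title', 'linkedin', 'industry', 'city', 'employee_count'];

function normalizeVariables(raw) {
  const vars = {};
  for (const key of VARIABLE_KEYS) {
    vars[key] = raw[key] != null ? String(raw[key]) : '';
  }
  return vars;
}
```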