AI Development · Calculator · 12 min read · Published May 4, 2026

Custom agents vs Zapier and Make.com TCO across four scale tiers — build cost, run cost, change cost, and where each path actually wins.

Agent vs Zapier TCO: workflow-automation economics across four scale tiers.

A worked TCO comparison of three workflow-automation paths — Zapier, Make.com, and custom agentic builds — modelled across 1, 10, 100, and 1,000 workflow scales. Build cost, run cost, change cost, and the break-even thresholds where the math actually flips.

Digital Applied Team · Automation engineers
Published: May 4, 2026 · Read time: 12 min · Horizon: 24-month TCO
Paths compared: 3 · Scale tiers: 4 · Custom break-even: ~500 workflows · Agency migration ROI: 8-month payback

The agent vs Zapier TCO question is the workflow-automation decision most teams get wrong — not because the answer is hard, but because the brochure economics rarely match the lived ones. Three paths, three cost structures, three break-even curves. The right choice is decided by run volume and change cadence, not by which vendor has the better landing page.

Zapier wins the long tail — low volume, broad integration, near-zero build cost. Make.com wins the mid-tier — modestly higher build effort, sharper per-action pricing, better long-running workflow support. Custom agentic builds win the head — high build cost, marginal cost approaching zero, and the only path that survives a change-cadence stress test at scale. Most real-world stacks blend all three.

This guide is the worked TCO across all four tiers — 1, 10, 100, and 1,000 workflows — with the change-cost variable surfaced explicitly because it is the line item that decides most decisions and that no vendor calculator includes. Numbers are calibrated against a real 50-workflow agency migration that paid back in eight months.

Key takeaways
  1. Zapier wins under 100 monthly runs per workflow at fewer than 20 workflows. Build cost dominates the model at low scale. Zapier's zero-engineering setup beats any custom path until run volume or workflow count crosses the threshold where per-run pricing starts compounding.
  2. Make.com wins the mid-tier — 100 to 10K runs, 20 to 200 workflows. Per-action pricing scales more gracefully than Zapier's per-task model. The long-running workflow primitive and the visual builder make it the right default for teams who have outgrown Zapier but cannot justify custom.
  3. Custom agents win at 1M+ runs per month or 500+ workflows. Marginal cost approaches zero — LLM inference plus a few cents of infrastructure. Build cost is real, but amortised across a year of high-volume operation it becomes the cheapest path by an order of magnitude.
  4. Change cost is what kills no-code at scale. Versioning, testing, multi-environment promotion, and rollback all become harder on Zapier and Make.com as the workflow count grows. The line item is invisible until it dominates — usually around the 100-workflow mark.
  5. Hybrid stacks are the real-world answer. Zapier for low-volume, infrequently-changed workflows. Make.com for mid-volume orchestration. Custom agents for the hot paths and AI-flavoured work. The vendor question is not which tool wins — it is which tool handles which class of workflow.

01 · Three Paths: Zapier, Make.com, custom agents — three economics, three break-even curves.

Every workflow-automation conversation starts in the same place: should we Zap it, should we build it, or should we drop to Make.com? The right framing is not which tool is best — all three have legitimate roles — but which cost curve matches the workload. The three paths have meaningfully different economics, and the crossover points are where most decisions go wrong.

Zapier is the no-code default. Workflow setup measured in minutes, per-task pricing that scales linearly with usage, the broadest third-party integration library in the category. The cost curve is flat at the bottom and steep at the top. Make.com sits one rung below — visual builder, per-action pricing rather than per-task, stronger primitives for long-running and conditional workflows. Custom agentic builds are the engineering path — weeks of build cost, near-zero marginal cost per run, and the ceiling that actually matches enterprise scale.

No-code
Zapier — the long tail
Zero build · per-task pricing

8,000+ integrations, point-and-click setup, the canonical no-code automation tool. Wins below 100 runs per month per workflow and below 20 active workflows. Becomes expensive fast above either threshold.

Best for occasional automations
Low-code
Make.com — the mid-tier
Days build · per-action pricing

Visual builder, sharper per-action pricing, better long-running workflow primitive. Wins the 100-to-10K runs per month band and the 20-to-200 workflow range. The right default for teams growing out of Zapier.

Best for orchestration
Custom
Agentic — the head
Weeks build · cents per run

LLM-driven workflows on a Next.js or Python runtime, deployed to Vercel or your cloud. Wins above 1M monthly runs or 500+ workflows. The only path with marginal cost that stays flat as volume scales.

Best for hot paths & AI work
Scope of this TCO model
The numbers in this guide are illustrative — modelled on May 2026 list pricing for Zapier Professional, Make.com Pro, and a representative custom-agent build at South African and US blended engineering rates. Re-run the math against your current vendor quote and your team's actual rate before committing. The shape of the curves is the durable part; the absolute numbers move quarterly.

The decision rarely comes down to a single workflow. Most real stacks are hybrids — Zapier handling a hundred low-volume utility automations, Make.com running the medium-cadence orchestration layer, and a custom agent handling the two or three hot paths that drive most of the throughput. The right question is per-workflow, not per-vendor: which class does this automation belong in, and what does that imply for the next ten like it.

02 · Build Cost: zero (Zapier), days (Make.com), weeks (custom).

Build cost is the most visible line item and the one the brochure economics get right. A Zapier workflow is genuinely a 15-minute job — pick a trigger, pick an action, map a few fields, turn it on. A Make.com scenario is hours to a day or two for anything beyond a simple connect-the-dots. A custom agentic workflow with proper integration, observability, error handling, and tests is weeks of engineering for the first one and days for each marginal addition on the same stack.

The trap in this category is treating build cost as a sunk cost. It is — for the workflows that never change. But automation workflows do change, and the time-to-modify is closely correlated with the time-to-build. A Zapier workflow that took ten minutes to build takes ten minutes to modify. A custom workflow that took a week to build can take a day to modify. That asymmetry compounds.

Zapier
15min
Per workflow, first build

Trigger, action, field mapping, test, enable. Operations staff build their own — no engineering team required. Time-to-modify is the same as time-to-build, which is why iteration is cheap at low scale.

Ops-team buildable
Make.com
4h
Per scenario, first build

Visual modules, routing, conditional logic, error handlers. A semi-technical operator can build most scenarios; complex ones benefit from a developer review. Long-running workflows and webhook orchestration are first-class primitives.

Semi-technical operator
Custom
2-3wk
Per workflow, first build · stack from scratch

First custom workflow includes building the runtime — auth, queue, observability, deployment, alerting. Each subsequent workflow on the same stack is days, not weeks. By workflow ten the marginal build cost approaches Make.com.

Senior engineering

One implication worth lifting out. The custom-build curve is non-linear. Workflow one is expensive because you are building the runtime; workflow two is half the cost because the runtime is already paid for; workflow ten is days. By workflow fifty, the marginal build cost on a mature custom stack is competitive with Make.com for anything beyond trivial scenarios. The decision has to consider the second workflow, not the first.

The opposite applies on the no-code side. The hundredth Zapier workflow takes exactly as long as the first, because there is no shared runtime to amortise across. What scales linearly on the way up scales linearly on the way down — every change to every workflow is the same cost, every time.
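The linear-versus-amortised contrast can be sketched in a few lines. This is an illustrative model, not measured data: the hour figures are round-number assumptions standing in for the ranges quoted in this section (15 minutes per Zap, about 4 hours per scenario, a 2-3 week first custom build shrinking to days once the runtime exists).

```python
# Illustrative sketch: cumulative build-hours for n workflows on each path.
# All constants are assumed round numbers, not vendor or client data.

ZAPIER_HOURS = 0.25      # ~15 min per workflow, every workflow, forever
MAKE_HOURS = 4.0         # ~4 h per scenario
CUSTOM_FIRST = 100.0     # ~2-3 weeks, including building the runtime itself
CUSTOM_MARGINAL = 16.0   # ~2 days per workflow once the runtime exists

def cumulative_build_hours(n: int) -> dict[str, float]:
    """Total build effort for a fleet of n workflows, per path."""
    return {
        "zapier": ZAPIER_HOURS * n,   # linear: no shared runtime to amortise
        "make": MAKE_HOURS * n,       # linear, with a higher slope
        "custom": CUSTOM_FIRST + CUSTOM_MARGINAL * max(n - 1, 0),  # front-loaded
    }

for n in (1, 10, 50):
    print(n, cumulative_build_hours(n))
```

By workflow 50 the custom marginal cost per added workflow is a fixed 16 hours under these assumptions, while the no-code lines keep the same slope they started with.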

03 · Run Cost: per-task, per-action, per-run — the cost ladder.

Run cost is where the curves diverge most sharply. Zapier prices per task — every action in a workflow is a task, every multi-step workflow consumes multiple tasks per run. Make.com prices per operation, which is similar but typically lands at 30-50% of the per-run cost of an equivalent Zap. Custom agents price as LLM tokens plus a few cents of compute and storage — typically measured in fractions of a cent per run for non-AI workflows and single-digit cents for AI-augmented ones.

The right way to compare is not by sticker price but by cost per workflow per month at a representative run volume. The matrix below shows a typical three-action workflow at three monthly volumes — the level where most real-world ops automations actually live.

Low volume
100 runs / month · Zapier wins

Zapier: ~300 tasks, well inside any paid plan, marginal cost effectively zero. Make.com: similar but with a higher plan minimum. Custom: build cost is unrecovered. At this volume, the no-code path is correct on every axis.

Pick Zapier
Mid volume
10K runs / month · Make.com wins

Zapier: ~30K tasks puts you in the Professional or Team tier, $73-$103 per month per workflow at typical usage. Make.com: ~30K operations on the Pro plan lands closer to $16-$29 per workflow. Custom: starting to compete, build cost still dominates.

Pick Make.com
High volume
1M runs / month · Custom wins

Zapier: 3M tasks per month is a custom-quoted enterprise tier, four-figure monthly minimum per workflow. Make.com: a million operations is also enterprise pricing. Custom: cents per run, single workflow operating at well under $50 per month in compute.

Pick custom
Hot path
10M+ runs / month · Custom is the only path

At this volume, iPaaS pricing is no longer linear and rarely public. A single hot-path workflow at ten million runs per month on a custom stack runs in the low hundreds of dollars; on Zapier it is a procurement exercise.

Custom only

The shape worth internalising: Zapier is cheap below the threshold and expensive above it, with no in-between. Make.com is the mid-volume flat spot. Custom is expensive at the low end and cheap forever once you cross the build-cost recovery point. The decision is not which tool is cheaper per run — each is the cheapest at some volume — but which volume your workflow actually runs at.
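As a sketch, the run-cost ladder for the three-action workflow looks like this. All per-unit prices are assumptions chosen to match the ratios described above (Make.com at roughly 40% of Zapier's per-task rate, custom at a fraction of a cent per run), not vendor quotes, and the model covers run cost only, ignoring build and change cost.

```python
# Hedged run-cost sketch for a three-action workflow. Per-unit prices are
# illustrative assumptions, not list prices: Zapier bills per task
# (3 tasks per run here), Make.com per operation, custom per run.

ZAPIER_PER_TASK = 0.02   # $/task, assumed blended plan rate
MAKE_PER_OP = 0.008      # $/operation, ~40% of the assumed Zapier task rate
CUSTOM_PER_RUN = 0.001   # $/run, compute + storage, non-AI workflow
ACTIONS_PER_RUN = 3

def monthly_run_cost(runs_per_month: int) -> dict[str, float]:
    """Monthly run cost per workflow, per path (build cost excluded)."""
    return {
        "zapier": runs_per_month * ACTIONS_PER_RUN * ZAPIER_PER_TASK,
        "make": runs_per_month * ACTIONS_PER_RUN * MAKE_PER_OP,
        "custom": runs_per_month * CUSTOM_PER_RUN,
    }

for volume in (100, 10_000, 1_000_000):
    costs = monthly_run_cost(volume)
    print(volume, min(costs, key=costs.get), costs)
```

Note that on run cost alone, custom is cheapest at every volume; it is the unrecovered build cost that makes Zapier the correct low-volume answer, which is exactly why run cost cannot be compared in isolation.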

04 · Change Cost: the variable most teams forget.

Build cost and run cost dominate every TCO calculator we have seen. Change cost is the line item nobody includes, and it is the one that decides most real outcomes. A workflow is not built once — it is changed, on average, four to six times per year as the underlying business processes evolve. That is the variable the brochure economics never surface.

Change cost has three components: the time to modify, the risk of a bad change reaching production, and the operational overhead of maintaining a fleet of workflows. Each scales differently across the three paths, and the differences compound brutally as the workflow count grows.

Time to modify
Minutes on Zapier, hours on Make, sub-hour on a mature custom stack
Per-workflow modification cost

Single-workflow modifications are fastest on Zapier. By workflow fifty on a mature custom stack with shared abstractions, a typical change is sub-hour and shipped through the same review and deployment process as application code.

Per-change time
Change risk
No-code path lacks safety rails
Versioning · testing · rollback

Zapier and Make.com have no native concept of branches, no diff view, no automated tests, no clean rollback. Custom workflows inherit the entire software-engineering stack — git history, PR review, CI tests, blue-green deployment, instant rollback.

Hidden until something breaks
Fleet overhead
Linear on no-code, sub-linear on custom
Cost of running N workflows

Every no-code workflow is a snowflake. Every custom workflow can share abstractions, libraries, observability, alerting, and tests. Fleet maintenance cost grows linearly with workflow count on Zapier and Make.com; it grows sub-linearly on a well-built custom stack.

Compounds with scale
The change-cost pivot
The reason most enterprise teams migrate away from Zapier is not the per-task bill. It is the moment they have 200 workflows, a change shipped to production breaks an upstream system, and there is no rollback, no test environment, no audit trail of who changed what. That is the moment the change-cost line item becomes visible — and it has usually been the dominant cost for months by then.

The pragmatic way to price change cost into a TCO model is to assume each workflow is modified five times per year and multiply by the per-change time. At workflow scales above 100, add a fleet-overhead line that scales linearly on no-code (~10% of an FTE per 100 workflows) and sub-linearly on custom (~10% of an FTE per 500 workflows on a mature stack). Those two additions usually flip which path is cheapest at any scale beyond mid-volume operation.
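That pricing rule translates into a short model. Per-change hours and the 2,000-hour FTE year are illustrative assumptions, the sub-linear custom scaling is approximated by the larger 500-workflow denominator from the text, and for simplicity the fleet-overhead line is applied at every scale rather than only above 100 workflows.

```python
# Change cost as the text prices it: five modifications per workflow per
# year times per-change hours, plus a fleet-overhead line. Per-change hours
# and the FTE-hour figure are assumptions for illustration.

CHANGES_PER_YEAR = 5
FTE_HOURS_PER_YEAR = 2000          # assumed working hours in one FTE-year
PER_CHANGE_HOURS = {"zapier": 0.25, "make": 1.5, "custom": 1.0}  # assumed

def annual_change_hours(path: str, n_workflows: int) -> float:
    """Annual change-cost hours: modification time plus fleet overhead."""
    modify = CHANGES_PER_YEAR * PER_CHANGE_HOURS[path] * n_workflows
    if path == "custom":
        # ~10% of an FTE per 500 workflows on a mature custom stack
        fleet = 0.10 * FTE_HOURS_PER_YEAR * (n_workflows / 500)
    else:
        # ~10% of an FTE per 100 workflows of no-code snowflakes
        fleet = 0.10 * FTE_HOURS_PER_YEAR * (n_workflows / 100)
    return modify + fleet

for n in (10, 100, 1000):
    print(n, {p: round(annual_change_hours(p, n)) for p in PER_CHANGE_HOURS})
```

Under these assumptions the custom path is already cheaper to change than Make.com at 100 workflows, despite a slower per-change time than Zapier, because the fleet-overhead term dominates.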

05 · Four Tiers: 1, 10, 100, 1,000 workflows.

The four-tier model below collapses build, run, and change cost into a single 24-month TCO figure per workflow tier. Each row assumes the three-action workflow profile from Section 03, running at the median monthly volume for that tier, modified five times per year. Numbers are blended South African and US engineering rates and May 2026 vendor list pricing.

24-month TCO by tier · all three paths · per-workflow blended
Source: Digital Applied TCO model · May 2026 list pricing · illustrative

1 workflow · 100 runs / month: Zapier ~$600 (wins) · Make.com ~$1.2K · Custom ~$10K
10 workflows · 10K runs / month: Zapier ~$3.6K · Make.com ~$2.4K (wins) · Custom ~$12K
100 workflows · 1M runs / month: Zapier ~$72K · Make.com ~$42K · Custom ~$36K (wins)
1,000 workflows · 100M runs / month: Zapier ~$900K · Make.com ~$540K · Custom ~$180K (wins)

The pattern is clear enough to memorise. At tier 1 (a single workflow), Zapier wins by an order of magnitude. At tier 10 (ten workflows, mid-volume), Make.com is the right default — Zapier is still close, custom is still too expensive. At tier 100 (a hundred workflows, high volume), the custom path crosses below no-code and stays below it. At tier 1,000, the custom advantage is roughly 5× — the build cost is fully amortised and marginal cost dominates.

The numbers vary by workflow profile — a workflow that hits a simple webhook costs less than one that pulls a CRM record, transforms it, calls an LLM, and writes back. But the curve shape is robust. We have run this model against half a dozen client workloads with very different per-workflow profiles, and the crossover points land within roughly a factor of two of the tiers above in every case.
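One way to collapse build, run, and change cost into a single 24-month figure, in the spirit of the tier model above. Every rate in the profile is an illustrative assumption (a hypothetical $60/h blended rate, round-number build hours, rough per-run prices), so the absolute outputs will not match the published tiers exactly; the crossover behaviour is the point.

```python
# 24-month TCO sketch: build + run + change per path. All rates are
# illustrative assumptions, not the exact inputs behind the published model.

RATE = 60.0   # $/h, assumed blended engineering/ops rate
MONTHS = 24
CHANGES_PER_YEAR = 5
CUSTOM_RUNTIME_HOURS = 84   # assumed one-off runtime build on the custom path
PROFILE = {
    #          build_h/wf  $/run   change_h/change
    "zapier": (0.25,       0.06,   0.25),
    "make":   (4.0,        0.024,  1.5),
    "custom": (16.0,       0.001,  1.0),   # marginal build, runtime amortised
}

def tco_24mo(path: str, n_workflows: int, runs_per_wf_month: int) -> float:
    """24-month total cost for a fleet on one path, under the assumptions above."""
    build_h, per_run, change_h = PROFILE[path]
    build = build_h * n_workflows * RATE
    if path == "custom":
        build += CUSTOM_RUNTIME_HOURS * RATE   # pay for the runtime once
    run = per_run * runs_per_wf_month * n_workflows * MONTHS
    change = CHANGES_PER_YEAR * change_h * n_workflows * RATE * (MONTHS / 12)
    return build + run + change

for n, runs in ((1, 100), (10, 1_000), (100, 10_000)):
    costs = {p: round(tco_24mo(p, n, runs)) for p in PROFILE}
    print(n, "workflows:", min(costs, key=costs.get), costs)
```

Swapping in your real vendor quote, run volumes, and team rate is the whole exercise; the function shape stays the same.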

"The brochure compares per-task pricing. The reality compares fleet maintenance at workflow 200. Those are different decisions."
— Digital Applied automation team

06 · Break-Even: where the custom path wins.

The crossover between no-code and custom is decided by two variables that work together: the total number of workflows, and the average monthly run volume per workflow. The break-even surface is a curve, not a line — neither dimension alone gives the answer.

The rule of thumb that emerges from the model: custom becomes the cheaper path when either workflow count exceeds roughly 500 or run volume exceeds roughly 1 million per month for a hot-path workflow. Below both thresholds, no-code wins on TCO. At either threshold, the math is close enough that the qualitative factors — change cadence, governance, AI features — should decide.
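The rule of thumb reduces to a two-variable classifier. The thresholds come from this section and the key takeaways; treating them as hard cut-offs is a simplification, since near either threshold the qualitative factors should decide.

```python
# The break-even rule of thumb as a classifier. Thresholds are the rough
# figures from the text, applied as hard cut-offs for illustration.

def recommend_path(n_workflows: int, runs_per_wf_month: int) -> str:
    """Map (workflow count, runs/month per workflow) to a default path."""
    if n_workflows >= 500 or runs_per_wf_month >= 1_000_000:
        return "custom"   # either threshold alone is enough to flip the math
    if n_workflows < 20 and runs_per_wf_month < 100:
        return "zapier"   # long tail: build cost dominates
    return "make"         # mid-tier default between the two
```

Applied per workflow class rather than per company, this is the classifier the hybrid-stack argument later in the guide asks you to build once.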

Low / Low
< 100 workflows · < 100K runs

No-code territory. Zapier for the long tail under 1K runs per workflow, Make.com for the rest. Build a custom stack only if AI features or governance requirements demand it — pure cost comparison favours no-code by a wide margin.

Stay no-code
High / Low
> 500 workflows · < 100K runs each

Fleet maintenance dominates. Custom wins on change cost and operational overhead even though run cost is comparable. The 500-workflow threshold is where snowflake maintenance overhead crosses engineered abstractions.

Pick custom
Low / High
< 100 workflows · 1M+ runs each

Hot-path territory. The run-cost line dominates the TCO; custom wins by an order of magnitude on per-run economics. A small number of high-volume workflows is the textbook case for custom.

Pick custom
High / High
> 500 workflows · 1M+ runs each

Enterprise scale. Custom is the only viable path on cost; no-code pricing at this volume becomes a procurement negotiation. Build the runtime once, ship workflows on it, route the residual long tail to no-code for ops convenience.

Custom + long-tail Zapier

The qualitative factors that flip the math earlier than pure TCO suggests: AI features that are awkward to express in iPaaS, governance and audit requirements that demand version control, data-residency constraints that rule out US-based iPaaS vendors, and integration with internal systems that no third-party connector covers. Each of these can pull the break-even point down from 500 workflows toward 50 or even 10. Pure cost is rarely the full story.

07 · Migration Playbook: a real 50-workflow agency migration.

The migration patterns in this section come from a real client engagement — a mid-sized marketing agency with roughly 50 active Zapier workflows accumulated over four years. The combined monthly Zapier bill had crossed $3,000 and the operations team was losing most of a day each week to maintenance, debugging, and the occasional silent failure. The brief: cut cost without losing workflows, improve reliability, and create the runtime for the next 50.

The migration ran across eight months and paid back the engineering investment in roughly that same window. The pattern below is the playbook we use on every similar migration.

Phase 1
1mo
Audit and classify

Export every active Zapier workflow. Classify each by monthly run volume, change frequency, business criticality, and integration surface. The output is a four-quadrant map that decides which workflows migrate, which stay, and which get deleted.

Discovery
Phase 2
2mo
Build the runtime

Stand up the shared custom stack — Next.js application, Vercel deployment, Supabase for persistence, queue and cron primitives, observability, alerting. By the end of phase 2, two pilot workflows are live on the runtime in production.

Foundation
Phase 3
4mo
Migrate hot paths

Port the top 15 highest-volume workflows in priority order. Each migration follows a fixed shape — build, parallel-run with Zapier for one cycle, cut over, retire the Zap. Average of one workflow per week, two engineers part-time.

Bulk migration
Phase 4
1mo
Wind down and document

Retire migrated Zaps, downgrade the Zapier subscription to a starter tier for the residual long tail, write the operator runbook, train the ops team on the new stack. Zapier remains for the bottom 20 utility workflows where it is still the right tool.

Wind-down

The economics of the migration: an upfront engineering cost roughly equal to ten months of the original Zapier bill, a steady-state Zapier bill cut to less than 15% of the original (one starter tier subscription), and a custom-stack run cost of roughly $80 per month for the migrated workflows combined. Payback came in month 8; from month 9 onward the saving runs at over $2,500 per month, before accounting for the operational time recovered.

Two operational gains that did not show up in the cost model mattered as much as the bill cut. First, change cost dropped sharply — modifying a migrated workflow now happens through the same PR workflow as application code, with proper review and instant rollback. Second, the agency now ships AI-augmented workflows that were impossible on Zapier — LLM-driven lead scoring, generative email drafting, multi-step research agents — on top of the same runtime, with marginal build cost. The runtime is the gift that keeps giving.

We have shipped variations of this pattern across half a dozen agencies and mid-market clients, all of which fed back into the CRM automation engagements that anchor most of our build work. The migration math is consistent across engagements when the model assumes 24 months of TCO and includes change cost — both of which most internal calculators miss.

Conclusion

Workflow economics are decided by run volume and change cadence — pick by the math, not the brochure.

The TCO question has a structural answer. Zapier wins the low end of both dimensions. Make.com wins the middle. Custom agents win the top. The crossover surface is decided by two variables — workflows and runs — both of which are knowable in advance. The actual decision rarely needs more than the back-of-envelope model in this guide, applied against your current vendor quote and your real run volumes.

The variable nobody includes is change cost — and it is the line item that flips most real-world decisions earlier than pure run cost suggests. Five modifications per year per workflow, multiplied by per-change time, plus a fleet-maintenance overhead that scales linearly on no-code and sub-linearly on custom — that is the missing addition. Put it in the model and the break-even point moves left by roughly a factor of three at every tier.

The pragmatic stack for most teams is a hybrid — Zapier for the long tail of low-volume utility workflows where build cost dominates, Make.com for mid-volume orchestration where per-action pricing is sharpest, and a custom runtime for the hot paths and anything AI-augmented. The vendor question is not which tool wins — it is which class each workflow belongs in. Build the classifier once and the per-workflow decision becomes mechanical.

Decide your automation path

Workflow automation economics flip at scale — custom agents win once run volume and workflow count compound.

Our team designs and ships custom agentic workflows that replace iPaaS at scale — with the TCO comparison and migration playbook included.

Free consultation · Expert guidance · Tailored solutions
What we build

Workflow-automation engagements

  • Three-path TCO comparison and decision framework
  • Custom agentic workflow design and implementation
  • Zapier / Make.com migration playbooks with no downtime
  • Hybrid-stack design — no-code where it makes sense
  • Per-workflow cost telemetry and quarterly review
FAQ · Workflow TCO

The questions ops teams ask before committing to a workflow path.

When does Make.com beat Zapier on cost?

Make.com beats Zapier on per-run economics from roughly 1,000 runs per month per workflow, and beats it more decisively from 10,000 runs. The crossover is driven by two design differences. First, Make.com prices per operation (a single module run) while Zapier prices per task (each action in a Zap), so a multi-step workflow typically lands at 30-50% of the per-run cost of an equivalent Zap. Second, Make.com has a first-class long-running and conditional workflow primitive — branching, looping, and waiting are native rather than bolted on, which makes mid-volume orchestration genuinely cheaper to express. Stay on Zapier when integration breadth matters more than per-run cost; switch to Make.com when run volume crosses the four-figure-per-month mark and the workflow needs branching or batching.