Process Billions of Jobs Without Losing Sleep
Thousands of Elixir applications rely on Oban Pro for background jobs with advanced workflows, distributed concurrency, highly optimized performance, and a trove of enterprise-grade features.
Upgrade to Oban Pro in Minutes

OSS Foundation. Next-Level Productivity
Oban Pro builds on open source Oban's rock-solid foundation with the advanced features you need to build, control, and scale complex background job systems with confidence.
Oban
Reliability
Persistent background jobs with automatic retries, error handling, and transaction safety
Consistency
Guaranteed job execution with database-backed persistence and ACID compliance
Observability
Built-in instrumentation and job retention for monitoring live execution and inspecting historical jobs
Oban Pro
Productivity
Workflows, decorators, and relay features to build complex job systems faster
Performance
Smart engine with advanced concurrency control, global limits, rate limiting, and burst mode
Flexibility
Granular queue partitioning, chaining, and dynamic configuration for every use case
Advanced Workflows
Oban Pro Workflows orchestrate complex multi-step processes with ease. Perfect for AI tasks, billing pipelines, media processing, and reporting systems that require reliable coordination across distributed jobs.
Fully Distributed
Workflows are fully distributed between nodes, ensuring high availability and scalability across your infrastructure.
Cascading Context
Build workflows from vanilla Elixir functions, passing cumulative context between jobs for seamless data flow.
Nested Sub-Workflows
Compose complex workflows hierarchically by nesting sub-workflows for better organization and reusability.
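For example, an agent-style pipeline that fans out from a single step could be sketched like this. This is a minimal sketch only: the worker modules are hypothetical, and the exact Workflow module and option names vary between Pro versions, so check the docs for your installed release.

    alias Oban.Pro.Workers.Workflow

    # Each worker below is a hypothetical module built on a Pro worker.
    workflow =
      Workflow.new()
      |> Workflow.add(:decompose, MyApp.DecomposeWorker.new(%{prompt: "summarize Q3"}))
      |> Workflow.add(:research, MyApp.ResearchWorker.new(%{}), deps: [:decompose])
      |> Workflow.add(:analyze, MyApp.AnalyzeWorker.new(%{}), deps: [:decompose])
      |> Workflow.add(:respond, MyApp.RespondWorker.new(%{}), deps: [:research, :analyze])

    # Insert every job in the workflow; dependent jobs wait for their deps to finish.
    Oban.insert_all(workflow)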
Smart Concurrency
Oban Pro's Smart engine provides advanced concurrency control across your entire cluster. With Pro you have global limits, rate limiting, partitioning for granular control, and burst mode to maximize throughput when capacity is available.
Granular Partitioning
Split a single queue into multiple partitions, each with its own concurrency and rate limits for independent job processing.
Distributed Rate Limiting
Control execution rates across your cluster by limiting how many jobs run within a specific time window, with limits enforced globally across all nodes.
Burst Performance
Maximize throughput temporarily by exceeding per-partition limits when there's available capacity in the system.
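These controls combine as queue options on the Smart engine. A sketch of what that can look like, where the queue name, limits, and the customer_id partition key are illustrative and option names may differ between Pro versions:

    import Config

    config :my_app, Oban,
      engine: Oban.Pro.Engines.Smart,
      queues: [
        api: [
          # Up to 10 concurrent jobs on this node and 50 across the whole cluster
          local_limit: 10,
          global_limit: 50,
          # At most 100 jobs per hour, tracked separately for each customer_id
          rate_limit: [
            allowed: 100,
            period: {1, :hour},
            partition: [fields: [:args], keys: [:customer_id]]
          ]
        ]
      ]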
Productive Extensions
Oban Pro's powerful extensions boost productivity with decorators for any function, relay for distributed job execution, hooks for lifecycle events, recording for debugging, structured args for validation, chains for sequential job orchestration, and more.
Decorators
Decorated functions act as fully fledged background jobs with retries, scheduling, and the other guarantees you'd expect from Oban jobs.
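A rough sketch of the decorator style, where the annotation options and module names are assumptions rather than the exact API:

    defmodule MyApp.Reports do
      use Oban.Pro.Decorator

      # The @job annotation (options assumed here) turns calls to this plain
      # function into retryable, schedulable background jobs.
      @job max_attempts: 3, queue: :reports
      def weekly_report(account_id) do
        # ...gather data and deliver the report...
        {:ok, account_id}
      end
    end

Enqueueing then goes through a helper the decorator generates for weekly_report/1; the Pro Decorator guide has the exact helper name for your version.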
Relay
Relay lets you insert and await the results of jobs locally or remotely, across any number of nodes, so you can seamlessly distribute jobs.
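For instance, a sketch of inserting a job and awaiting its result, where the worker module is hypothetical and the return shape is assumed:

    alias Oban.Pro.Relay

    # Insert the job and get back a relay handle
    relay =
      %{customer_id: 123}
      |> MyApp.SyncWorker.new()
      |> Relay.async()

    # Await the result, no matter which node ran the job (5 second timeout)
    {:ok, result} = Relay.await(relay, 5_000)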
Chains
Chains link jobs together for strict sequential execution, ensuring downstream jobs run only after upstream jobs are completed.
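A minimal sketch of a chained worker. The chain module and the by: option have shifted between Pro versions, so treat the names here as illustrative:

    defmodule MyApp.WebhookWorker do
      # Jobs chained by account_id run strictly one at a time, in insertion
      # order, even when the queue itself allows more concurrency.
      use Oban.Pro.Workers.Chain, queue: :webhooks, by: [args: :account_id]

      @impl true
      def process(%Oban.Job{args: %{"account_id" => _account_id} = args}) do
        # ...deliver this account's webhook...
        {:ok, args}
      end
    end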


Support is Here
Need help unlocking Pro features, scaling your system, or tackling gnarly troubleshooting? Expedite the support process with a pairing session!
Learn About Pairing Sessions
Fewer Queries, Faster Job Processing
Want more reasons to use Pro? The Smart engine delivers increased job throughput with dramatically reduced database load.
92%
Fewer Queries
96%
Fewer Transactions
Total queries to process N unique jobs (more is worse)
Insert Optimization
Less data over the wire with batched, parameterized inserts, as sketched below
Async Tracking
Process jobs with far fewer transactions thanks to async tracking
Accurate Recovery
Accurate job retry after shutdown, without waiting for a timeout
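The batched inserts mentioned above go through the standard Oban.insert_all call, for example (the worker module is hypothetical):

    # Build many job changesets and insert them in one call; inserts are
    # batched and parameterized instead of issuing one statement per job.
    1..10_000
    |> Enum.map(&MyApp.SyncWorker.new(%{id: &1}))
    |> Oban.insert_all()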
Per-Customer Rate Limiting: give each customer its own request budget (e.g. 80/100 and 60/100 requests per hour) against a shared AI API
Agentic Task Decomposition: a decompose step fans out into parallel downstream jobs that run and complete independently
Training Pipeline Orchestration: prep, train, eval, and deploy stages, each tracked through its own batch of jobs
Scheduled Batch Processing: batches roll up through summary and output steps scheduled for low-cost, off-peak GPU time

Oban AI
Oban AI uses machine learning to optimize your workflows and…syke! All you need is Oban Pro; it's ideal for building AI-integrated applications.
Learn How to Use Pro with AI
Built for Enterprise
Tailored to organizations that want dedicated support, private chat, video pairing, priority access to new features, and ongoing configuration review.

“The best tooling decision we've made was Oban Pro for async tasks. We love the way our system functions with it as the backbone. Your work really takes ours to the next level.”

Britton Kowalk
Crew
“Oban is probably the greatest library I have ever used, especially with Pro on top. The value you get from Pro, for a very reasonable amount, is completely mind-blowing.”

Peter Ullrich
IndieCourses
“We 100% couldn't have put our new, monolithic ETL into production without Oban Pro. FLAME + Oban is actually a dream. Every time we double down on the BEAM, we win.”

Christopher Grainger
Amplified