The AI Agent Orchestrator
Every wave of workplace automation has produced a new role to manage it. The factory floor produced the foreman. The mainframe produced the systems administrator. The software era produced the product manager, who coordinated engineers, designers, and data to ship outcomes that no individual function owned. Each role exists because a new class of worker was introduced into the business, and someone had to set the goals, allocate the work, and remain accountable for the result. AI agents are now becoming a class of worker, and the role that manages them is starting to take shape. Call it the agent orchestrator.
The orchestrator's mandate is straightforward to describe and difficult to execute: take a business outcome, decompose it into work, and assign that work across a mixed team of humans and agents in whatever combination produces the best result. The unit of management is no longer a person or a project. It is a workflow, and the workflow is staffed dynamically. A claims-handling outcome might be 80% agent and 20% human one quarter, and 95% agent the next, as model capability improves and edge cases get codified. The orchestrator owns that ratio, and is measured on the outcome rather than on how the work was divided to produce it.
This is not a rebranding of the project manager or the operations lead. The skill set overlaps but the substrate is different. A project manager allocates a known quantity of human capacity against a defined scope. An orchestrator allocates against a capacity that is heterogeneous (some workers are humans with judgment and relationships, some are agents with speed and scale), nondeterministic (an agent's reliability on a given task is a probability, not a guarantee), and rapidly changing (the capability frontier moves on a roughly seven-month doubling cadence, per METR's task-horizon work, which we covered at https://alten.capital/blog/the-capability-of-llms-and-ai-agents-to-do). Yesterday's manual step is today's agent step. The org chart has to keep up.
Three capabilities define whether an orchestrator succeeds. The first is workflow decomposition. Most business processes were designed for humans, which means they bundle judgment, execution, and communication into single roles because that was the cheapest way to staff them. Orchestration requires unbundling: identifying which sub-steps require human judgment, which are deterministic enough for agents, and which are the messy connective tissue that still needs a person to hold together. This is closer to industrial engineering than to people management. The orchestrators who do it well will treat workflows as living artifacts, instrumented and revised, rather than as policies written once and inherited.
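One way to picture this unbundling: the workflow becomes a data structure whose steps are explicitly tagged by the kind of capacity they need, and the staffing mix is recomputed as agent reliability changes. A minimal sketch, where the step names, reliability figures, and threshold are illustrative rather than drawn from any real deployment:

```python
from dataclasses import dataclass

@dataclass
class Step:
    name: str
    needs_judgment: bool       # requires human judgment or relationships
    agent_reliability: float   # measured pass rate on this step, 0..1

def staff(workflow: list[Step], reliability_floor: float = 0.95) -> dict[str, str]:
    """Assign each step to 'human' or 'agent'.

    Judgment-heavy steps stay human; the rest go to agents once their
    measured reliability clears the floor. Re-running this as the
    reliability numbers move is what shifts the human/agent ratio.
    """
    assignments = {}
    for step in workflow:
        if step.needs_judgment or step.agent_reliability < reliability_floor:
            assignments[step.name] = "human"
        else:
            assignments[step.name] = "agent"
    return assignments

# Illustrative claims-handling workflow (hypothetical numbers).
claims = [
    Step("intake", needs_judgment=False, agent_reliability=0.99),
    Step("coverage_check", needs_judgment=False, agent_reliability=0.92),
    Step("negotiation", needs_judgment=True, agent_reliability=0.80),
]
print(staff(claims))
```

The point of the sketch is that the ratio is an output of measurement, not a one-time org decision: lower the floor or improve the measured reliability, and the same function reassigns the work.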
The second is evaluation. Agents fail differently than humans do. A human employee who is uncertain usually escalates; an agent that is uncertain often produces a confident, plausible, wrong answer. Managing agents therefore requires building the equivalent of a quality function around them: defining what "good" looks like for each task, sampling outputs, measuring drift, and deciding when a workflow needs human review, retraining, or a different model entirely. The orchestrator role draws on QA, data science, and operations, and the firms that staff it well will treat eval infrastructure as a first-class internal product rather than an afterthought.
The third is judgment about where to keep humans in the loop. Not every task should be automated, even when it can be. Some work is high-stakes enough that the cost of a rare agent failure exceeds the savings from automating the common case. Some work generates the relationships and context that make a business defensible, and removing humans from it erodes the moat. Some work is how junior employees learn, and automating it hollows out the talent pipeline. The orchestrator is the person making these calls explicitly, rather than letting them get made by default through whichever team adopts agents fastest.
For technology services firms, this role is particularly consequential. As we have written, services pricing is shifting from time to outcomes (https://alten.capital/blog/from-time-to-outcomes), and outcome-based delivery only produces software-like margins if the cost of producing each outcome decouples from human labor. That decoupling does not happen automatically when a firm "adopts AI." It happens when someone is accountable for redesigning the delivery model for a mixed workforce and is measured on the unit economics of the resulting model. Without an orchestrator, AI tools get distributed across existing teams, used to make existing workflows marginally faster, and the margin profile barely moves. With one, the workflow itself gets rebuilt, and the leverage shows up in the P&L.
The role does not yet have a settled title, job description, or reporting line. It will. The firms that figure out what it looks like and staff it deliberately rather than by accident will have a structural advantage over the firms that treat agents as a tooling decision rather than an organizational one.
Alten Capital invests in technology services businesses. Reach out to explore potential partnerships.