Strategy that survives contact with delivery.
Clear problem framing, sharp prioritization, and product strategy that connects to business goals — not just OKR theatre. I write the kind of strategy doc engineering and finance both nod at.
I'm a Director of Product working at the intersection of AI, automation, and customer experience — translating ambitious ideas into focused roadmaps, calm delivery, and outcomes that hold up in the operating review.
Customer research, qualitative insight, and disciplined evidence — turned into bets the team can actually run.
A repeatable lens for scoring AI use cases against value, feasibility, risk, and operational fit — so the roadmap reflects reality, not hype.
Tight feedback loops between product, engineering, design, and operations. Ship, learn, adjust — every sprint, every quarter.
I align execs, ops leaders, and engineers around the same picture of value.
I instrument what matters — handle time, deflection, accuracy, CSAT, cost-to-serve — and report on them honestly. Real outcomes, not vanity metrics dressed up as wins.
A short cross-section of the work I lead: AI-assisted customer support, internal automation platforms, and CX strategy programs.
A multi-region BPO operation supporting a global consumer brand, with thousands of agents handling voice and chat across complex product lines.
Agents were spending up to a third of their day on summary writing, after-call work, and switching between knowledge tools — eroding both quality and capacity.
Led discovery with frontline agents and ops leaders, then defined a focused AI assist surface: real-time context retrieval, automated wrap-up, and intent-aware routing. Shipped in stages alongside QA so guardrails matured with the model.
A growing operations org running a dozen separate tools, with manual handoffs between hiring, training, scheduling, QA, and reporting.
Workflows were brittle, ownership was unclear, and every new client launch added more glue work. The team needed a platform spine, not another point tool.
Defined the v1 of an internal automation platform around a small set of reusable primitives — entities, events, and workflow steps. Partnered closely with engineering and ops leadership to retire two legacy tools and make the new platform the default for client launches.
An enterprise CX program with strong executive sponsorship, a sprawling AI wishlist, and no shared lens for what to fund first.
Thirty-plus candidate AI initiatives, competing stakeholders, and an honest concern that the program would optimize for activity over impact.
Built a use-case scoring framework grounded in customer value, feasibility, operational risk, and time-to-impact. Ran structured working sessions with ops, tech, and finance, then converted the survivors into a sequenced roadmap with named owners and clear kill criteria.
There's no secret method. Just a working rhythm I keep coming back to, whether the project is an AI assist on the floor or a multi-quarter platform bet.
Start by writing the problem in language a customer would recognize. If the team can't agree on the problem in a sentence, no roadmap will save us.
Talk to the people who live with the problem — agents, admins, end customers. Pair what they say with what the data shows, then look for the gap.
Score against value, effort, risk, and strategic fit. Be honest about what we won't do this quarter and why.
Decide whether AI is the right tool, the right surface, and the right risk profile. Cheap deterministic logic still wins more often than anyone admits.
Ship in slices that real users can react to. Wire the metrics in from day one so we can tell whether we're improving or just shipping.
Keep execs, ops, engineering, and design on the same picture of value, with the same risks visible. Boring updates beat surprise announcements.
Short, opinionated essays for product leaders working through the same questions I am.
A simple framework for separating AI use cases that earn their place on the roadmap from the ones that should stay in the slide deck.
Automation programs love efficiency metrics. The good ones design backwards from the customer experience — and pay it back in retention.
Most AI pilots produce a demo and a memo. Turning them into product takes a different kind of discipline. A practical look at what changes between the two.
I'm Jason Donaghy — Director of Product at IntouchCX, focused on AI software, automation, and the customer experiences they reshape.
My work sits at the seam between business strategy and shipped product. I help organizations decide which AI bets are worth making, define the products that deliver them, and lead the cross-functional teams that get them into customers' hands. The throughline across the last decade is the same: pair deep customer understanding with disciplined delivery, and let outcomes do the talking.
I lead with calm, candor, and a strong bias toward writing things down. I believe great product leadership is mostly the work of holding a clear picture of value steady while everyone else negotiates the messy reality of building it. I prefer small, capable teams; I trust craftsmanship; and I'd rather kill a feature in discovery than ship one that quietly damages the customer.
Outside of product, I think a lot about how AI changes the operator's job — and how product leaders can be honest stewards of that change for the people doing the work.
Open to senior product leadership roles, fractional advisory work, and the occasional thoughtful intro. Recruiters and founders welcome — please include a sentence on the work, not just the title.