The Beautifully Boring Project Plan: A PMP-Based Playbook for Your AI Transformation
Kelli Abbott ServiceNow Employee | AI Center of Excellence
I set out to create the most boring project plan I possibly could. Not boring like "fall asleep in the status meeting" boring. Boring like your favorite pair of jeans — predictable, reliable, fit just right. That kind of boring.
Too many AI transformations turn into a circus of competing priorities and "we'll figure out governance later" hand-waving. The technology is exciting. The project management shouldn't be. So I built a PMP-based framework for ServiceNow AI rollouts: five use cases, clear phases, no mysteries. The use cases I've included are common starting points — swap them for whatever your organization actually needs. Your five will look different from the next team's five. That's by design.
This article includes a PowerPoint Action Guide and an Excel Project Tracker with formula-driven dates. Grab them, make them yours, and tell us what you think in the comments.
Why Five? Why Not Thirty?
You know the pattern: an organization identifies 30 AI opportunities, launches pilots everywhere, and six months later has zero production value. Welcome to pilot purgatory.
Five is big enough to prove enterprise value and small enough to govern. Your first two use cases (governance and analytics) are foundational and enable everything after. For the remaining three, plug in whatever fits your organization.
The Starter Set (Customize These)
UC1 — AI Control Tower Governance (Cross-Product) Not optional. Establish your operating model for how AI gets approved, monitored, and controlled before anyone touches a prompt. Approval gates, audit logging, release controls. This one enables the other four.
UC2 — Usage Analytics & Quality Baseline (Cross-Product) Also non-negotiable. Your before/after KPI dashboards, instrumentation guidance, and guardrails for hallucination incidents and user overrides. Without this, every AI value conversation is just vibes. (A sketch of what that instrumentation might capture follows the starter set.)
UC3 — KB Auto-Generation (ITSM) — Generate KB drafts from resolved incidents with a human review workflow. Popular because it delivers visible value quickly with a built-in safety net. But if your pain point is ITOM event correlation or Creator workflow automation — plug that in instead.
UC4 — Change Risk Assessment (ITSM) — Surface risk signals and generate risk narratives for CAB. Directly reduces failed changes and speeds approval cycles. Or swap in auto-assignment, a CSM agent, whatever you need.
UC5 — HR Case Summarization (HRSD) — Summarize long cases and standardize resolution docs. Shows the framework works beyond IT. Swap for any domain — just know that heavier automation means more governance effort, so be honest about readiness.
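To make the UC2 instrumentation a little more concrete, here is a minimal, platform-agnostic sketch in Python. The event fields (acceptance, override, hallucination flag) and names are illustrative assumptions, not a ServiceNow schema; in practice this data would come from your Now Assist logs and reviewer workflow.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical record for each AI-assisted interaction.
# Field names are illustrative, not a ServiceNow schema.
@dataclass
class AIUsageEvent:
    use_case: str                 # e.g. "UC3-KB-AutoGen"
    user_id: str
    accepted: bool                # reviewer accepted the draft as-is
    overridden: bool              # reviewer substantially rewrote or rejected it
    hallucination_flagged: bool   # reviewer flagged factually wrong content
    latency_ms: int
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def quality_baseline(events: list[AIUsageEvent]) -> dict:
    """Roll raw events up into the handful of numbers a dashboard needs."""
    total = len(events)
    if total == 0:
        return {"events": 0}
    return {
        "events": total,
        "acceptance_rate": sum(e.accepted for e in events) / total,
        "override_rate": sum(e.overridden for e in events) / total,
        "hallucination_rate": sum(e.hallucination_flagged for e in events) / total,
    }
```

Capture the same fields before and after go-live and the "before/after" conversation stops being vibes.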
Choosing your three:
- High value + low complexity = start here (data exists, workflow clear, human-in-the-loop built in).
- High value + high complexity = plan for later.
- Low value + any complexity = skip it.
Ask your team: Where is repetitive work highest? Where is data cleanest? Where do we have exec sponsorship?
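If it helps to make the triage mechanical, here is a small Python sketch of the same decision rule. The candidate names and 1-5 scores are made-up examples your team would assign in a workshop; the point is the rule, not the numbers.

```python
# Illustrative value/complexity scores (1-5) for candidate use cases.
candidates = {
    "KB Auto-Generation":      {"value": 4, "complexity": 2},
    "Change Risk Assessment":  {"value": 4, "complexity": 3},
    "HR Case Summarization":   {"value": 3, "complexity": 2},
    "ITOM Event Correlation":  {"value": 5, "complexity": 5},
    "Chatbot for Everything":  {"value": 2, "complexity": 4},
}

def triage(value: int, complexity: int) -> str:
    if value <= 2:
        return "skip"               # low value: not worth the governance overhead
    return "start here" if complexity <= 3 else "plan for later"

# Highest value first, then lowest complexity.
for name, score in sorted(candidates.items(),
                          key=lambda kv: (-kv[1]["value"], kv[1]["complexity"])):
    print(f"{name:25s} -> {triage(**score)}")
```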
The Delivery Framework
The plan is structured as a governed "use case factory": shared governance and analytics workstreams run in parallel with iterative delivery per use case. The framework stays the same no matter which use cases you select.
Phase 0 — Mobilize (2–3 weeks): Charter, RACI, environments, security posture, definition of done. Measure twice, cut once.
Phase 1 — Define & Feasibility (3–5 weeks): Requirements, workflow mapping, data checks, KPI baselines. The unsexy work that separates success from stall.
Phase 2 — Configure & Build (4–10 weeks/UC): Now Assist config, prompt tuning, workflow integration, UAT. Having governance in place means you're not waiting on approvals.
Phase 3 — Validate (2–6 weeks/UC): Technical checks plus business validation. Not just "does it work?" but "does it deliver value?"
Phase 4 — Rollout & Adopt (2–4 weeks/UC): Phased enablement, training, hypercare. Go-live is the starting line, not the finish line.
Phase 5 — Operate & Optimize (Ongoing): Dashboards, drift monitoring, enhancements backlog.
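The Excel tracker chains phase dates with formulas; the snippet below is a hypothetical Python equivalent of the same idea: each phase's start is derived from the previous phase's end, so changing one duration or the kickoff date reflows the whole plan. The kickoff date and durations here are placeholders, not the tracker's actual values.

```python
from datetime import date, timedelta

# Durations use rough midpoints of the ranges above; adjust per use case.
phases = [
    ("Phase 0 - Mobilize",             2.5),
    ("Phase 1 - Define & Feasibility", 4.0),
    ("Phase 2 - Configure & Build",    7.0),
    ("Phase 3 - Validate",             4.0),
    ("Phase 4 - Rollout & Adopt",      3.0),
]

start = date(2025, 1, 6)  # example kickoff; the tracker derives this from one input cell
for name, weeks in phases:
    end = start + timedelta(weeks=weeks)
    print(f"{name:32s} {start:%Y-%m-%d} -> {end:%Y-%m-%d}")
    start = end + timedelta(days=1)  # next phase starts the day after this one ends
```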
Governance, Risks, and Metrics
Governance accelerates delivery; it doesn't slow it down. Three forums: biweekly steering committee (scope, value, risk), weekly AI governance review (approvals, guardrails, incidents), and weekly delivery demos (working increments, feedback). Clear roles: exec sponsor owns outcomes, product owners own adoption, governance lead owns compliance, platform engineering owns environments. No "I thought someone else was handling that."
The risks that matter: PII exposure, hallucinated outputs, inconsistent quality, low adoption. Mitigations: least-privilege access, human-in-the-loop checkpoints, output logging, red-team testing, and clear escalation paths. Use Control Tower and your analytics baseline to catch problems early.
Metrics that mean something: At the portfolio level — active users, utilization, hours saved, safety incidents. For each delivery use case, define 3–4 KPIs tied to the business outcome, instrument them in Phase 1, and track through your dashboard. The Excel tracker has a full KPI catalog with examples.
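To make "instrument them in Phase 1" concrete, here is a tiny Python sketch of tracking a few use-case KPIs against their Phase 1 baseline. The KPI names, numbers, and direction-of-better are invented for illustration; the tracker's KPI catalog is the real starting point.

```python
# Illustrative KPIs for one delivery use case: baseline captured in
# Phase 1, "current" fed from the live dashboard.
kpis = {
    "avg_resolution_hours":   {"baseline": 18.0, "current": 13.5, "better": "lower"},
    "kb_drafts_published":    {"baseline": 0,    "current": 42,   "better": "higher"},
    "reviewer_override_rate": {"baseline": None, "current": 0.12, "better": "lower"},
}

def improvement(k: dict) -> str:
    if k["baseline"] in (None, 0):
        return "n/a (no baseline)"
    delta = (k["current"] - k["baseline"]) / k["baseline"]
    good = delta < 0 if k["better"] == "lower" else delta > 0
    return f"{delta:+.0%} ({'improving' if good else 'regressing'})"

for name, k in kpis.items():
    print(f"{name:24s} {improvement(k)}")
```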
Start This Week
This week: Download the Excel tracker and PowerPoint. Walk through the starter use cases with your team. Decide what to swap in.
This month: Stand up governance (UC1) and analytics (UC2). These enable everything else in this example.
This quarter: Launch your first delivery use case through the phased framework. Track it boringly.
This year: Scale what works. Document what didn't. Build repeatable AI operating muscle for the next five.
The use cases in the tracker are starting points — make them yours. Then come back and tell us what worked. Because the best AI transformation isn't the flashiest one. It's the boring one that actually delivers.
Drop your questions in the comments. We're building this for you.
Resources:
- PowerPoint: AI Transformation Project Plan — Action Guide
- Excel: AI Transformation 5-Use-Case Project Tracker (formula-driven dates)
Views expressed are my own and do not represent ServiceNow, my team, partners, or customers.

