MI for different audiences in a contact centre

Leadership · ~9 minute read

The trap of giving everyone the same dashboard

A contact centre has five distinct MI audiences — agents, team leaders, operations managers, directors, and the board. Each one needs a different view, on a different cadence, at a different level of detail, framed around a different set of decisions. Most operations get this wrong. They build one dashboard, usually designed for and by the planning team, and they try to make every audience use it. The result is that everyone is underserved: agents drown in detail they can’t act on, team leaders are exposed to strategy they don’t need, directors are dragged into interval-level operational chatter, and the board is fed metrics it can’t connect to business outcomes. This article walks through what each audience actually needs and the design principles for serving them well.

The agent self-view

The agent needs a view of their own performance, updated daily, comparing them to themselves over time and to a sensible peer cohort. Three metrics are usually enough: adherence, quality (or QA score), and one customer-outcome metric (FCR, CSAT, or transferred-out rate depending on the operation). What agents don’t need is the operational dashboard — total queue depth, service level, attrition. Showing agents those numbers tells them very little about what they personally can do, and creates the impression that everything bad is their responsibility.

The agent view should answer three questions: am I doing what was asked of me, am I improving, and where would attention to my own behaviour move the needle. Cadence: daily, with a weekly summary. Format: simple, visual, comparative. Owned by: the team leader, supported by the planning team.
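Those three questions can be sketched as a small calculation: compare the agent to their own recent history and to a peer-cohort average. The data shape, metric names, and five-day window below are illustrative assumptions, not a standard schema:

```python
from statistics import mean

def agent_self_view(agent_days, cohort_days,
                    metrics=("adherence", "qa_score", "fcr")):
    """Summarise one agent against themselves and a peer cohort.

    agent_days / cohort_days map metric name -> list of daily scores.
    Metric names and the 5-day window are illustrative assumptions.
    """
    view = {}
    for m in metrics:
        recent = agent_days[m][-5:]            # this week's daily scores
        prior = agent_days[m][:-5] or recent   # earlier history, if any
        view[m] = {
            "this_week": round(mean(recent), 1),
            # am I improving? (vs my own earlier scores)
            "vs_self": round(mean(recent) - mean(prior), 1),
            # where do I sit? (vs the peer cohort average)
            "vs_cohort": round(mean(recent) - mean(cohort_days[m]), 1),
        }
    return view
```

A positive vs_self says the agent is improving; vs_cohort places them in the peer group without exposing any other individual's scores.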

The team leader daily pack

The team leader is the most under-served audience in most contact centre MI. They sit at the boundary between operations and planning. They need a daily view of their team’s performance, an intraday view of the queue they’re responsible for, a weekly view of where their team sits in the wider operation, and clarity on the two or three things they can influence today.

The team leader pack typically includes: yesterday’s adherence by team member, today’s scheduled vs actual coverage, the current queue position and service level, the week’s rolling QA pass rate, and the planning-team callout for anything unusual happening today. What it doesn’t include: strategic capacity metrics, attrition trend lines, forecast accuracy detail, anything that isn’t actionable in the next 24 hours.

Cadence: daily morning, with an intraday refresh available on demand. Format: one screen, no scrolling, headline visible from across the room. Owned by: the planning team, consumed by the team leader. The metrics we recommend in the numbers team leaders should track cover the right set in more detail.

The operations manager weekly review

The operations manager (the person running a section or a site) needs a weekly view that supports tactical decisions: do we re-forecast, do we open overtime next week, is shrinkage drifting, is the recruit class on track, are any agents at risk of leaving. Slightly longer horizon, slightly broader scope, slightly more diagnosis-oriented than the team leader pack.

The pack typically includes: service level achieved against target by day, forecast accuracy by day, intraday adherence trend, shrinkage build-up against plan, attrition early-warning signals from the leading indicators (see leading vs lagging indicators), and a one-line callout per area on what changed and why.

Cadence: weekly, ideally tied to the regular operations review meeting. Format: 3–5 pages, headline summary, drill-down available. Owned by: the planning team manager, consumed by the operations manager.
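As a sketch of one of those tactical questions (do we re-forecast), daily forecast accuracy might be computed like this. The 1 minus absolute-percentage-error convention and the 95% tolerance are illustrative assumptions; operations differ on both:

```python
def daily_forecast_accuracy(forecast, actual):
    """Per-day accuracy as 1 - absolute percentage error vs actual volume.
    One common convention among several; assumes actual volumes are non-zero."""
    return [round(1 - abs(f - a) / a, 3) for f, a in zip(forecast, actual)]

def reforecast_flags(forecast, actual, tolerance=0.95):
    """Days whose accuracy falls below tolerance (illustrative threshold),
    surfaced as candidates for a re-forecast conversation."""
    return [day for day, acc in enumerate(daily_forecast_accuracy(forecast, actual))
            if acc < tolerance]
```

A run of flagged days is a stronger re-forecast signal than a single miss, which is why the pack shows accuracy by day rather than one number for the week.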

The director monthly review

The director (operations director, head of customer service, similar) needs a monthly view that supports strategic decisions: is the contact centre on plan, is cost per contact moving the right way, is attrition stable, is the recruitment pipeline keeping up, are we ahead or behind on the capacity model. Longer horizon, broader scope, more accountability-oriented than the operations weekly view.

The director pack should include: service level achieved vs target by week (the month in a single chart), forecast accuracy trend over the last 13 weeks, capacity vs plan, attrition by tenure cohort, the cost-per-contact trajectory, and a one-page narrative on what changed in the month and why. The narrative is the most valuable part — directors don’t want to be data analysts; they want the planning team’s view of the operation.

Cadence: monthly, with the planning team explaining it in person (a director-level audience deserves the conversation, not just the document). Format: 8–10 pages, with the headline and the narrative on page 1. Owned by: the head of planning, consumed by the operations director.
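One line in that pack, attrition by tenure cohort, reduces to leavers over average headcount per band. The band labels and the monthly-rate convention below are assumptions for illustration:

```python
def attrition_by_tenure(leavers, avg_headcount,
                        bands=("0-3m", "3-12m", "12m+")):
    """Monthly attrition rate per tenure cohort: leavers / average headcount.
    Band names and the rate convention are illustrative, not a standard."""
    return {b: round(leavers[b] / avg_headcount[b], 3) for b in bands}
```

The cohort split is what makes the number actionable: early-tenure attrition dominating usually points at recruitment or onboarding rather than the wider operation.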

The board quarterly pack

The board needs a quarterly view that connects the contact centre to the business. Five questions: are customers being served at the agreed level, is the unit cost on plan, is the workforce stable, is the technology investment paying back, and is the operation on track to meet the year. Each question deserves one or two metrics — no more.

The board pack typically includes: service level achieved over the quarter, cost per contact vs budget, attrition vs target, NPS or CSAT trend, total FTE vs plan, and a one-page commentary that links these to the broader business performance — usually EBITDA contribution and revenue at risk. The board doesn’t need (and won’t read) the operational detail; it needs the executive summary and the assurance that the operation is being run well.

Cadence: quarterly, with appendices available for any board member who wants to dig in. Format: 4 pages plus appendices. Owned by: the operations director (the planning team builds it, but the director presents it). The chain that ties contact centre metrics to enterprise value is covered in understanding contact centre finance.
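The cost-per-contact line in that pack is simple arithmetic; a minimal sketch, with the cost and volume figures treated as inputs rather than anything prescriptive:

```python
def cost_per_contact(total_cost, contacts):
    """Fully loaded cost for the period divided by contacts handled."""
    return total_cost / contacts

def cpc_variance_pct(actual_cost, actual_contacts, budget_cost, budget_contacts):
    """Percentage variance of actual cost per contact against budget.
    Positive means more expensive than planned."""
    actual = cost_per_contact(actual_cost, actual_contacts)
    budget = cost_per_contact(budget_cost, budget_contacts)
    return round((actual - budget) / budget * 100, 1)
```

Presenting the variance rather than the raw figure is deliberate: the board question is "on plan or not", and a signed percentage answers it in one glance.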

Design principles across audiences

Three principles apply regardless of audience.

Match level of detail to authority. An agent shouldn’t be expected to interpret strategic metrics; a board member shouldn’t be expected to interpret interval data. The metric should sit at the level the audience can act on.

Match cadence to decision rhythm. Daily MI for daily decisions. Weekly for tactical. Monthly for strategic. Quarterly for governance. Misaligning cadence and decision rhythm wastes both the report and the reader’s attention.

Match format to context. The team leader needs the pack on screen at the team brief. The director needs the pack in their inbox before the operations review. The board needs the pack in the meeting room two weeks before the board meeting. Each format has different production discipline.

Common mistakes

Three patterns recur.

One pack for all audiences. Usually the planning team’s pack, dressed up. Serves nobody well, takes longer to produce than five separate packs would, and trains every audience to ignore most of it.

Producer-led design. The format that’s easy for the planning team to build isn’t the format that’s easy for the audience to use. Designing backwards from the audience matters more than designing forwards from the data.

No narrative for senior audiences. A director or board member can read the numbers. What they need from the planning team is the interpretation — what changed, why, and what we’re doing about it. The narrative is the value-add, not the chart.

Conclusion

A contact centre with five audiences and one dashboard is under-serving four of them. The discipline of designing per-audience MI isn’t about more reports; it’s about fewer, sharper ones — each tuned to the cadence, format, and decision frame of the audience it serves. The operations that get this right find their MI pack does more with less, the executive audience pays attention to fewer numbers but takes them more seriously, and the planning team’s influence rises across every level of the operation. The starting point is the simplest one: ask each audience what decision they need MI to help them make, and design backwards from there.

Pair this with designing meaningful MI in contact centres, leading vs lagging indicators, and the numbers team leaders should track.