Five Numbers and the Truth
Most executive dashboards fail because they are built by people who are not executives. They include everything that could possibly matter instead of the five things that actually do. A well-designed executive dashboard is closer to a car's warning lights than to the engine manual. It tells you when to pay attention, not how to fix the problem.
The executive who opens a CEO dashboard crammed with thirty metrics, twelve charts, and three scrollable tables does what any rational person would do: closes the tab and asks someone for an update instead. That dashboard cost months to build. It gets used for days. The pattern is so common it barely registers as a failure any more. It is just what dashboards do.
They do not have to. The executive dashboards that earn daily attention share a single trait: every element on screen passed a decision test before it was included. Editorial restraint, not technical ambition, is the difference between a dashboard someone relies on and one they tolerate.
The warning light principle: A car dashboard does not show engine temperature, oil pressure, coolant level, and RPM with equal prominence. It shows a warning light when something needs attention. The rest of the time, you drive. Executive dashboards should work the same way: invisible when things are fine, unmissable when they are not.
The Executive Mindset
Executives make two types of decisions: resource allocation and risk management. "Where should we direct attention?" and "Is anything off track?" cover nearly every question a CEO, MD, or department head needs answered on a given morning. This shapes everything about how the dashboard should be designed.
The executive is not analysing data. They are scanning for signals. They want to know if the business is broadly on track and, if not, where the problem sits. The depth comes later, in a conversation or a drill-down view. They do not need granular detail. They do not need raw data. They do not need metrics they cannot influence. An executive cannot act on "average API response time" or "emails sent by marketing this week." Those metrics belong on operational dashboards, not here.
The cadence matters. An executive dashboard serves three distinct rhythms: the daily glance, the weekly review, and the monthly drill-down. The design must support all three without requiring separate views.
If the dashboard cannot serve the 30-second daily glance, it has already failed. Everything else builds on that foundation. Design for the glance first, then layer in support for the weekly review and the monthly drill-down. Stephen Few, whose Information Dashboard Design remains the canonical reference, makes the same point: a dashboard should present the most important information needed to achieve one or more objectives, consolidated on a single screen so it can be monitored at a glance. The key phrase is "needed to achieve." Not nice to see. Not interesting to know. Needed.
Choosing the Right 5-7 Metrics
Every metric on an executive dashboard must pass one test: would this number change what the executive does today? If the answer is no, the metric does not belong. This single filter eliminates most of what typically clutters business dashboards. George Miller's research on working memory suggests people hold roughly seven items in mind at once, plus or minus two. For dashboards, aim for the low end: five to seven executive dashboard metrics, with most landing closer to five.
Four categories cover the decisions most business leaders actually make.
Revenue and Financial Health
Monthly revenue vs target, cash position, gross margin. The numbers that determine whether the business is viable right now. For a SaaS business, this becomes MRR and runway. For an agency, project margin and cash flow.
Pipeline and Growth
Total pipeline value, conversion rate, new business coming in. The numbers that determine whether the business will be viable next quarter. A thin pipeline today is next quarter's revenue problem.
Risk and Issues
Projects at risk, overdue deliverables, client escalations. The items that need executive intervention before they become crises. This is where the dashboard earns its keep.
Team Capacity and Customer Health
Utilisation rate, open roles, NPS, retention rate. The constraints that determine what the business can deliver and whether clients stay. These are leading indicators that revenue metrics lag behind.
Not every business needs all four categories. A pre-revenue startup might replace financial health with runway and burn rate. A professional services firm might weight utilisation more heavily than pipeline. The categories are a framework, not a prescription. Industry specifics shape the exact KPI dashboard selection: ecommerce tracks average order value, conversion rate, and return rate; SaaS tracks MRR, churn, and LTV:CAC ratio; manufacturing tracks output, defect rate, and capacity utilisation.
What to Include vs What to Exclude
Knowing what to exclude is as important as knowing what to include. Every metric competes for the same limited attention. The wrong ones dilute focus on the ones that matter.
| Area | Exclude | Include |
|---|---|---|
| Activity | Emails sent, meetings held, tasks completed | Pipeline value, deals closed, revenue recognised |
| Vanity | Social followers, website visits, newsletter subscribers | Conversion rate, qualified leads, customer retention |
| Influence | Metrics the executive cannot change (server uptime, code commits) | Metrics where executive attention changes the outcome |
| Granularity | Individual task status, daily support ticket counts | Project health summary, escalation count, trend direction |
The Collaborative Selection Process
Metric selection is where most executive dashboards go wrong, and the reason is that neither executives nor data teams can do it alone. Executives without data context choose vanity metrics: numbers that look impressive in board presentations but do not drive daily decisions. Data teams without executive context choose activity metrics: numbers that are easy to pull from the database but do not connect to anything the executive would act on. The overlap between what the executive actually decides and what the data can reliably measure is where useful dashboards live.
The selection process starts with one question asked of each executive: what decisions do you make regularly, and what information would change your behaviour? The answers, not a list of available data fields, determine what goes on the dashboard. We have found that this conversation typically reduces an initial wish list of 20-30 metrics down to 5-7 that genuinely pass the decision test.
Context Is Everything
A number alone is meaningless. Revenue: 47,000. Good or bad? Without context, no one can tell. The number needs an anchor, and every metric on an executive dashboard should show three things: the current value, a comparison (vs target, vs last period, vs forecast), and a trend direction (improving, stable, declining). This is the "compared to what?" principle. A metric without comparison is decoration.
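One way to make the "compared to what?" principle concrete is to treat the three pieces of context as part of the metric itself, not as an afterthought in the chart. This is a minimal sketch; the field names and formatting are illustrative, not a prescribed schema:

```python
from dataclasses import dataclass

@dataclass
class Metric:
    """One KPI with the three pieces of context every number needs."""
    name: str
    value: float     # current value
    target: float    # comparison anchor (vs target)
    previous: float  # last period, used for trend direction

    def summary(self) -> str:
        # value + comparison + trend: the "compared to what?" triple
        gap_pct = (self.value - self.target) / self.target * 100
        if self.value > self.previous:
            trend = "improving"
        elif self.value < self.previous:
            trend = "declining"
        else:
            trend = "stable"
        return f"{self.name}: {self.value:,.0f} ({gap_pct:+.0f}% vs target, {trend})"

print(Metric("Revenue", 47_000, 55_000, 52_000).summary())
# → Revenue: 47,000 (-15% vs target, declining)
```

The point of the structure is that a metric without a `target` and a `previous` value simply cannot be constructed, which forces the comparison question at design time rather than at reading time.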
Colour coding provides instant status recognition. Green means on or above target. Amber means within tolerance but worth watching: a trend heading the wrong direction, a metric approaching its threshold. Red means attention required now. This vocabulary must be consistent across every metric on the dashboard. Mixed conventions destroy trust. The brain processes colour faster than it processes numbers, so an executive scanning five KPI cards reads the colour before the value. Five green cards and a red one: the red card gets attention. That is exactly the behaviour the dashboard should encourage.
A practical example: revenue at £47k against a £55k target might display as green if the business is seasonal and historically hits its number in the final week. Or it might display as amber if the trend suggests the gap will widen. Context and calibration determine the colour, not a fixed percentage rule. Thresholds must be defined deliberately: "red" should mean "materially off track and requiring executive intervention," not "slightly below target." Overly sensitive thresholds train people to ignore colour coding entirely. We calibrate thresholds with the people who will act on them, and review them quarterly.
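The calibration logic can be written down explicitly. The sketch below uses fixed tolerance bands purely as placeholders; as noted above, real thresholds come from historical variance and business tolerance, agreed with the people who will act on them, and reviewed quarterly:

```python
def rag_status(value: float, target: float,
               amber_tolerance: float = 0.10,
               red_tolerance: float = 0.20) -> str:
    """Map a metric to green/amber/red against deliberate thresholds.

    The tolerances here are illustrative defaults, not a rule: "red"
    should mean materially off track, never "slightly below target".
    """
    shortfall = (target - value) / target
    if shortfall <= amber_tolerance:
        return "green"   # on target, above it, or within tolerance
    if shortfall <= red_tolerance:
        return "amber"   # worth watching: approaching the threshold
    return "red"         # attention required now

assert rag_status(60_000, 55_000) == "green"  # above target
assert rag_status(47_000, 55_000) == "amber"  # ~15% short: watching
assert rag_status(38_000, 55_000) == "red"    # ~31% short: intervene
```

A seasonal business would widen the tolerances in quarters where late catch-up is normal, which is exactly the £47k-vs-£55k judgement call described above expressed as a parameter rather than a hard-coded rule.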
Exception-Based Attention
The best executive dashboard is boring when everything is fine. Green across the board means no action needed. The check takes five seconds. Close the tab, start the day. This is a feature, not a limitation. It is the single most important executive dashboard best practice.
The boring dashboard principle: Designing for boredom means the dashboard earns attention only when attention is genuinely required. Red items stand out precisely because they are rare. The executive trusts the dashboard because it does not cry wolf.
When something does turn red, the dashboard must provide enough context to act. Not just "Revenue: Red" but "Revenue: £38k of £55k target, down 12% vs last month, trending down for 3 consecutive weeks." The executive knows the severity, the direction, and the duration without clicking anywhere. The red item earns attention because the green items do not demand it. This is the core mechanism of exception-based attention design: trust is built through quiet accuracy, and focus is directed through deliberate contrast.
Threshold Calibration and Alert Fatigue
Alert fatigue is what happens when a dashboard presents so many warnings that the executive learns to ignore all of them. It is the same phenomenon that plagues hospital monitoring systems: when every other indicator is amber or red, the brain stops treating colour as a signal. The green/amber/red hierarchy only works when green is the dominant state. If your dashboard routinely shows more than 10 to 15 percent of its indicators in a warning state, your thresholds are miscalibrated.
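The 10 to 15 percent guideline can be checked mechanically as part of a quarterly threshold review. A hypothetical audit helper, assuming statuses are collected as simple strings:

```python
def thresholds_healthy(statuses: list[str],
                       max_warning_share: float = 0.15) -> bool:
    """Return True if the share of amber/red indicators is within the
    ~10-15% guideline; a False result suggests miscalibrated thresholds
    (or a genuine crisis). The 0.15 cap is an illustrative default."""
    warning = sum(s in ("amber", "red") for s in statuses)
    return warning / len(statuses) <= max_warning_share

# One red among ten indicators: green is the dominant state.
assert thresholds_healthy(["green"] * 9 + ["red"]) is True
# Half the board in a warning state: colour has stopped being a signal.
assert thresholds_healthy(["green", "amber", "red", "green"]) is False
```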
Calibrating thresholds requires three decisions. Who defines them: the executive, in conversation with whoever understands the underlying metric. How they are set: based on historical variance and business tolerance, not arbitrary percentages. When they are reviewed: quarterly, or whenever the business context changes materially. A growing business should expect revenue targets to shift. A seasonal business needs different thresholds in different quarters. Thresholds that were right six months ago may be wrong today.
The exception list at the bottom of the dashboard is the action centre. It shows only items currently amber or red, with the metric name, current state, the person responsible, and how long the item has been flagged. Three items, not thirty. If more than five items are red simultaneously, the business has a bigger problem than dashboard design.
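The exception list described above is easy to express as a filter plus a sort. This is a sketch with illustrative field names, not a real dashboard API; the ordering puts red before amber, and longer-flagged items first within each colour:

```python
def exception_list(metrics: list[dict], limit: int = 5) -> list[dict]:
    """Action centre: only items currently amber or red, red first,
    then by how long each item has been flagged."""
    flagged = [m for m in metrics if m["status"] in ("red", "amber")]
    flagged.sort(key=lambda m: (m["status"] != "red", -m["days_flagged"]))
    return flagged[:limit]  # three items, not thirty

metrics = [
    {"name": "MRR",      "status": "green", "owner": "CFO", "days_flagged": 0},
    {"name": "Churn",    "status": "amber", "owner": "COO", "days_flagged": 5},
    {"name": "Pipeline", "status": "red",   "owner": "CRO", "days_flagged": 12},
]
for item in exception_list(metrics):
    print(f'{item["name"]}: {item["status"]} '
          f'({item["owner"]}, flagged {item["days_flagged"]}d)')
```

Green items never appear, which is the point: the list is empty on a good day, and short on a bad one.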
What an Executive Dashboard Looks Like
Simplicity is harder than complexity. Anyone can display thirty metrics. Choosing five requires understanding which five actually matter. The executive dashboard design follows a clear hierarchy: status at the top, trend in the middle, exceptions at the bottom. The entire dashboard fits on one screen. No scrolling. If it does not fit, something needs to be removed, not shrunk.
Consider the information density. A KPI card is roughly 200 pixels wide. Five of them fit comfortably across a standard screen. Each card communicates one metric with full context: name, value, comparison, sparkline, and status colour. That is 1,000 pixels of width conveying the health of the entire business. No amount of complexity can compete with that efficiency. For more on how spatial position, whitespace, and visual weight guide the viewer's eye, see the dashboard layout patterns guide.
The data refresh cadence for executive dashboards is typically hourly or daily, not real-time. These numbers do not change minute by minute, and polling more frequently adds server load without adding value. Match the refresh to the decision rhythm: daily numbers for the daily glance, weekly aggregates for the weekly review, monthly summaries available in the drill-down.
Adapting by Role
The same design principles apply across the C-suite, but the specific metrics shift by role. A CFO dashboard emphasises cash flow, profitability margins, budget variance, and working capital ratios. A COO dashboard tracks operational throughput, team capacity, delivery timelines, and process bottlenecks. A CMO dashboard monitors marketing-qualified leads, customer acquisition cost, channel conversion rates, and LTV:CAC ratio. The structure stays the same (KPI cards, trend chart, exception list). The metrics reflect the decisions each role actually makes.
Drill-Down, Not Clutter
The executive dashboard is the landing page, not the whole application. It answers "do I need to worry about anything?" and provides one-click access to find out why. The detail lives behind the summary, available on demand, never cluttering the top-level view.
The interaction pattern follows three deliberate steps.
Spot. The executive sees a red KPI card during their morning glance. Revenue is below target and trending down. The colour and sparkline tell the story before the number does.
Click. One click on the revenue card reveals the breakdown: by region, by team, by product line, by time period. The executive can see where the shortfall is concentrated without opening a separate report or spreadsheet.
Understand. The drill-down shows context: which deals slipped, which clients delayed, which pipeline items moved. The executive has enough information to direct a conversation, assign a follow-up, or escalate.
This pattern keeps the top-level view clean while providing depth on demand. The executive never needs to open a separate report, email someone for context, or dig through a spreadsheet. The answer is one click away. The executive who has to send an email to get context has already lost confidence in the dashboard.
Building effective drill-down requires anticipating the questions that follow from each metric. Revenue is down: the executive will want to know which region, which team, which product line. A project is at risk: they will want to know the timeline, the owner, and the proposed resolution. Make the next question one click away. For a deeper treatment of progressive disclosure and interaction patterns, see the dashboard UX patterns guide.
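The drill-down answer to "which region, which team?" is, at its core, a group-and-total over the records behind the headline number. A sketch with illustrative data, sorted so the weakest contributor surfaces first:

```python
from collections import defaultdict

def breakdown(records: list[dict], dimension: str) -> list[tuple[str, float]]:
    """Total revenue by a chosen dimension (region, team, product line)
    so the shortfall's concentration is visible in one click."""
    totals: defaultdict[str, float] = defaultdict(float)
    for r in records:
        totals[r[dimension]] += r["revenue"]
    # smallest totals first: the likely source of the gap leads the list
    return sorted(totals.items(), key=lambda kv: kv[1])

deals = [
    {"region": "North", "team": "A", "revenue": 9_000},
    {"region": "South", "team": "B", "revenue": 21_000},
    {"region": "North", "team": "C", "revenue": 8_000},
]
print(breakdown(deals, "region"))
# → [('North', 17000.0), ('South', 21000.0)]
```

The same records answer the next question too: `breakdown(deals, "team")` reuses the data with a different dimension, which is why one well-modelled dataset can back several drill-down views.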
Signs Your Executive Dashboard Is Failing
Dashboard failure is rarely dramatic. It is gradual: people stop checking, revert to asking colleagues, and the dashboard becomes background noise. Those are the signals to watch for.
The fix for each of these is the same: return to the decision test. Which decisions does this dashboard support? Which metrics connect to those decisions? Everything else goes. A quarterly review of dashboard usage (which metrics get viewed, which get ignored) keeps the tool sharp. Governance sounds heavy, but it breaks down into one practical habit: every 90 days, audit usage, prune what is dead, and check whether the business questions have changed. A leadership dashboard that is not actively maintained will be passively abandoned.
The Dashboard That Gets Checked Every Morning
The best executive dashboard earns a place in the morning routine. It gets checked with the first coffee because checking it is faster and more reliable than asking someone. It surfaces problems before they become crises. It confirms that things are on track without requiring investigation. When leadership visibly uses a dashboard, the rest of the organisation follows. The morning glance becomes a shared habit, and the metrics on the dashboard become the shared language of the business.
That only happens when the CEO dashboard is simple, honest, and connected to decisions. Five numbers and the truth.
- Checked daily, not monthly: it answers questions faster than asking a colleague.
- Problems visible before they escalate: amber items get attention before they turn red.
- Decisions grounded in data, not instinct: every metric connects to something the executive can act on.
- Context that prevents misreading: every number is shown with target, trend, and status colour.
- One click from summary to cause: drill-down eliminates the need for follow-up emails and spreadsheets.
- Trust built through restraint: fewer metrics, better chosen, consistently accurate.
A dashboard that earns daily use pays for itself many times over: problems caught a week earlier, decisions made with data instead of guesswork, leadership meetings that start with shared context instead of status updates. The executive dashboard is a small piece of the overall dashboard design strategy, but it is the one that sets the tone for how the entire organisation uses data.
Build a Dashboard Your Leadership Team Will Actually Use
We design executive dashboards around the decisions your leadership team makes. Your metrics, your thresholds, your business rhythm. Not a generic template with thirty charts no one reads. A focused tool that earns its place in the morning routine.
Book a discovery call →