Shipping analytics is easy. Making it useful is not.
- Marc Alexander
- Jan 7
- 7 min read

The bit nobody puts in the project plan
Most analytics implementations “work”.
Events fire. GA4 fills up. A dashboard appears. Someone says “we’re in a much better place now” with the confidence of a person who has not yet tried to use the data in a board meeting.
Then comes the first proper question. Not “is GA4 installed?” (it is), but something deeply inconvenient, like: “Are we generating better leads than last quarter?” or “Which part of the site actually drives enquiries we’d want to ring back?”
That’s where the polite British silence arrives. The sort where everyone suddenly becomes fascinated by the rim of their mug.
Because shipping analytics is easy. Making it useful is not.
I saw this up close on an 18-month rebuild of Knight Frank’s UK website, moving from a heavily customised legacy CMS to Optimizely.
The risk wasn’t whether tracking would exist on launch day.
The risk was whether the measurement would still mean anything once the site was alive, changing weekly, edited by humans, constrained by consent, and pulled in ten directions by different teams with perfectly reasonable goals.
“Useful” is not a synonym for “present”. Useful means the data stays coherent when the real world gets involved.
Launch-day analytics is a fairy tale
There are two types of analytics in the world.
The first is launch-day analytics: pristine, tested, approved, and briefly adored. Everything has been checked in staging by someone heroic enough to click through every journey while whispering “please work” under their breath.
The second is Tuesday-in-November analytics: somebody has created a new page, someone else has renamed a category because it “reads better”, and an urgent campaign has landed with a form nobody told you about until it was already live.
If your measurement plan only works in the first world, it’s not a measurement plan. It’s a launch checklist.
Knight Frank’s rebuild exposed exactly the kinds of problems that tend to hide until a big change forces you to look at them properly: data continuity at risk during a full rebuild; an overcomplicated dataLayer and tagging framework with inconsistent naming conventions; an outdated privacy banner that wasn’t fit for modern GDPR expectations; and lead-gen forms that were clunky on mobile, with abandonment exceeding 80%.
You can “ship” analytics in that environment. You can also ship a canoe in a storm. The question is whether anyone will get where they’re trying to go.
What “useful” actually means
Here’s the definition I use in practice. A measurement plan is successful when it becomes infrastructure.
Not a spreadsheet. Not a slide. Infrastructure. The sort of thing that quietly holds the building up and only gets noticed when it’s missing.
In a rebuild like this, that infrastructure has to do four jobs at once. It needs to keep meaning consistent across the site, capture behaviour in a way that explains outcomes, connect web activity to business systems, and survive modern privacy reality without turning the dataset into Swiss cheese.
If that sounds demanding, it is. Websites are demanding. They are never finished. They just keep happening to you.
Make the CMS do the hard work, because humans will not
The quickest way to ruin analytics is to rely on people remembering the rules.
If page context is optional, it will be missing. If naming is free-text, it will be creative. If the rules are in a Confluence page called “FINAL_v7_MEASUREMENT_PLAN_REALLY_FINAL”, they will be ignored with impressive consistency.
The unglamorous solution is to force meaning at source. In this case, that meant building a CMS-integrated settings framework with mandatory dropdowns for core attributes such as page type, category, division, and transaction type, so that every new page automatically emitted a page-load dataLayer and consistent metadata flowed into GA4 from day one.
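To make that concrete, here’s a minimal sketch of the pattern, assuming hypothetical field names rather than the actual Knight Frank schema: the CMS renders a dataLayer push from its mandatory dropdowns before any tags load, so the context is never optional.

```typescript
// Minimal sketch of a CMS-rendered page-load dataLayer.
// Field names and values are illustrative, not the production schema.
interface Window { dataLayer: Record<string, unknown>[]; }

interface PageContext {
  pageType: string;        // controlled value from a mandatory dropdown
  category: string;        // e.g. "residential"
  division: string;        // e.g. "uk-sales" — never free text
  transactionType: string; // e.g. "sales" or "lettings"
}

// The CMS renders this into the template before the GTM snippet loads,
// so every downstream tag can rely on the context being present.
function pushPageContext(ctx: PageContext): void {
  window.dataLayer = window.dataLayer || [];
  window.dataLayer.push({ event: "page_context", ...ctx });
}

pushPageContext({
  pageType: "property-listing",
  category: "residential",
  division: "uk-sales",
  transactionType: "sales",
});
```

The detail that matters is the order: context first, tags second. Everything else is negotiable.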
This is the sort of thing that sounds boring until you’ve tried to report on “performance by division” and discovered half the site is labelled “Divison”, “Division”, “Div”, “Sales”, and “KF (old)”, with the remainder labelled “TBC”.
When the CMS is made to declare what a page is, you stop having philosophical debates about what a page “counts as”. You start having useful conversations about what happened and why.
And crucially, it scales. It still works when new pages appear. It still works when teams change. It still works when nobody involved in the original build remembers how it was put together. Which is ideal, because six months after launch, nobody remembers anything. We’re all just doing our best.
Forms are not buttons. They are journeys.
Most organisations measure forms like they measure a light switch.
Off. On. Submitted.
Then they wonder why they can’t improve conversion rates without arguing for three weeks about whether the problem is “traffic quality” or “UX”. If you only track submissions, you are measuring the ending and calling it a story.
The useful approach is to treat forms as journeys you can diagnose. In Knight Frank’s case, an “all pages – all forms” solution was developed so that form opens, submissions, and closes were automatically pushed as dataLayer events for any form generated across the website.
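As a rough sketch of the pattern, with event names and an abandonment heuristic that are my own assumptions: one set of document-level listeners covers every form on the site, present and future, instead of a hand-wired tag per form.

```typescript
// Sketch of an “all pages – all forms” listener. Event names and the
// opened/submitted flags are assumptions, not the production schema.
interface Window { dataLayer: Record<string, unknown>[]; }

function trackAllForms(): void {
  window.dataLayer = window.dataLayer || [];

  const push = (event: string, form: HTMLFormElement) =>
    window.dataLayer.push({ event, formId: form.id || "unknown" });

  // “Open”: the first time a user focuses any field in a form.
  document.addEventListener("focusin", (e) => {
    const form = (e.target as HTMLElement).closest("form");
    if (form && !form.dataset.opened) {
      form.dataset.opened = "true";
      push("form_open", form);
    }
  });

  // Submit, captured at document level so late-injected forms count too.
  document.addEventListener("submit", (e) => {
    const form = e.target as HTMLFormElement;
    form.dataset.submitted = "true";
    push("form_submit", form);
  });

  // “Close”: the user leaves with a form opened but never submitted.
  // (A production version would want a more reliable abandon signal
  // than beforeunload, which browsers don’t guarantee to deliver.)
  window.addEventListener("beforeunload", () => {
    document
      .querySelectorAll<HTMLFormElement>("form[data-opened]:not([data-submitted])")
      .forEach((form) => push("form_close", form));
  });
}

trackAllForms();
```

The document-level delegation is the point: it keeps working when an urgent campaign lands with a form nobody told you about.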
That seemingly small difference changes everything.
Instead of “submissions are down”, you can see whether people are opening forms and abandoning them, whether they’re failing on mobile, whether a particular step is acting like a bouncer outside a nightclub, and whether design changes have genuinely reduced friction or merely moved it around.
It also makes optimisation less emotional. You don’t have to rely on the loudest opinion in the room, or the person who “has a gut feel” (which is very often a polite way of saying “I am guessing”). You can point to behaviour and move on with your life.
The lead doesn’t end at the website, so measurement can’t either
Here’s the bit that separates decorative analytics from grown-up analytics.
A form submission is not the outcome. It’s a handover.
Most attribution breaks at the exact moment the lead becomes real. GA4 records the conversion. Marketing celebrates. Sales looks at the CRM and says, “Yes, but was it any good?” The two systems then sit in opposite corners, refusing to make eye contact.
If your measurement plan stops at “form_submit”, you’ve built a lovely instrument panel that disconnects the moment the plane leaves the runway.
The fix is to build continuity across platforms, and the simplest reliable way to do that is shared identifiers. In this implementation, a unique conversion ID from Knight Frank’s internal form database was fed into GA4 on submission via the dataLayer and Google Tag Manager, and also passed into LeadPro via an API, enabling accurate funnel tracking and CRM attribution.
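In outline, the handover fits in one function. The endpoint URLs and payload shapes below are placeholders, not LeadPro’s real API; the shared conversion ID is the part that matters.

```typescript
// Sketch of one shared identifier travelling across systems.
// Endpoints and payloads are placeholders; LeadPro’s actual API differs.
interface Window { dataLayer: Record<string, unknown>[]; }

async function handleLeadSubmission(formData: FormData): Promise<void> {
  // 1. The internal form database stores the lead and returns a unique conversion ID.
  const res = await fetch("/api/forms/submit", { method: "POST", body: formData });
  const { conversionId } = (await res.json()) as { conversionId: string };

  // 2. Push the ID into the dataLayer so GTM can attach it to the GA4
  //    conversion event as an event parameter.
  window.dataLayer = window.dataLayer || [];
  window.dataLayer.push({ event: "form_submit", conversion_id: conversionId });

  // 3. Send the same ID to the CRM, so the web conversion and the CRM
  //    record can be joined later for lead-quality reporting.
  await fetch("/api/leadpro/leads", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ conversionId, source: "website" }),
  });
}
```

Once both systems hold the same ID, “which channels produce leads sales actually want to ring back” stops being a philosophical debate and becomes a join.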
That’s the difference between optimising for “more leads” and optimising for “more worthwhile leads”. Without that bridge, you can get very good at generating enquiries that go absolutely nowhere, which is a bit like being proud you’ve increased the number of people walking into your shop while forgetting to mention they all leave immediately because the floor is lava.
Privacy is not a banner you bolt on at the end
Consent is not something you sprinkle on after the build like parsley.
It changes the shape of your dataset, and it changes it in ways that can quietly destroy decision-making. If your consent experience is confusing, heavy-handed, or out of date, you don’t just create compliance risk. You create patchy, biased data, where the people you can measure are a weird subset of the people you actually serve.
Knight Frank’s privacy banner was flagged as outdated and lacking granular consent controls, creating regulatory risk, so a modern banner with clear, user-friendly consent options was designed and implemented. Acceptance rates were reported at 67% on desktop and 63% on mobile, above industry benchmarks.
Those numbers matter because they affect what you can reliably compare. When acceptance is healthy, channel performance and landing-page trends are less likely to be statistical theatre. When acceptance is poor, every “insight” arrives with an invisible footnote: “Based on the behaviour of a subset of users who are not necessarily representative of your audience.”
Which is not the sort of footnote you want when someone asks why revenue is down.
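For what it’s worth, the mechanics of granular consent aren’t the hard part. The standard mechanism is Google Consent Mode: deny storage by default, update when the user chooses. A minimal sketch, with the banner callback as my own assumption:

```typescript
// Sketch of granular consent via Google Consent Mode. The gtag consent
// API is standard; the banner callback below is an illustrative assumption.
declare function gtag(...args: unknown[]): void;

// Before any tags load: deny storage by default.
gtag("consent", "default", {
  analytics_storage: "denied",
  ad_storage: "denied",
});

// Called when the user makes a granular choice in the banner.
function onConsentChoice(analytics: boolean, advertising: boolean): void {
  gtag("consent", "update", {
    analytics_storage: analytics ? "granted" : "denied",
    ad_storage: advertising ? "granted" : "denied",
  });
}
```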
Success is not “tags firing”. Success is outcomes.
At some point, every analytics project has to face its final exam.
Not “is GA4 live?” Not “is the dataLayer populated?” Those are table stakes. The exam is whether the measurement plan enables real improvements without everyone arguing about definitions, data quality, or what something “really means”.
On this project, the outcomes reported were a 27.4% uplift in overall form completion rates within the first month post-launch, and a 49.2% conversion rate on the critical Market Appraisal form from organic traffic.
The reason those results belong in an article about measurement plans is not because they’re nice numbers. It’s because they show the point of doing the unsexy infrastructure work. When you can see where friction lives, when your page context is consistent, and when your form tracking is diagnostic rather than decorative, you can actually improve things with confidence.
You stop playing whack-a-mole with guesses. You make targeted changes, measure the impact, and keep going.
The quiet win nobody celebrates
The best measurement plan reduces the amount you talk about measurement.
Not because people stop caring, but because the conversation changes. It moves away from “Can we trust this?” and “What does this event mean?” and towards “What should we do next?”
That’s when analytics stops being a performance and becomes a utility. Like plumbing. You don’t praise plumbing daily, but you notice immediately when it’s gone.
Shipping analytics is easy. Making it useful is the work.
And if you’re planning an implementation project, that’s the bar worth setting. Not “we installed GA4”, but “we built measurement that survives change, survives humans, survives privacy reality, and still tells the truth when someone asks the simplest question imaginable.”
Which, in Britain, will usually happen at 4:47pm on a Friday.
Whether you need GA4 implementation, analytics strategy, technical integration, or ongoing support, I can help transform your data into reliable business intelligence.
I'm surprisingly friendly, even when telling you your GA4 tracking's a disaster.