
How to Measure If Users Read Your Page in GA4: Meet Strix

  • Writer: Marc Alexander
  • Dec 2
  • 3 min read
[Strix logo: cartoon owl beside bold black text "STRIX"]
Strix: Your Shortcut to Content Truth. Metrics to Supercharge Your GA4

Most “engagement” numbers reward motion, not attention. A fast flick looks like progress. An idle tab looks like time spent. A sticky header CTA looks like intent. If you ship content and need to know whether people actually read it, you need a different signal. That’s what Strix provides: a clear, first-party, 0–100 reading score with the few supporting metrics that matter, emitted as a single, SQL-ready event per view.


The problem Strix solves with GA4


Marketing and product teams often end up reconciling two sources of truth—GA4 (or warehouse events) and a replay platform’s analytics API. The models don’t match: different session clocks, SPA handling, consent states, and proprietary “engagement” labels. You spend meetings debating whose number is right instead of deciding what to change. Strix removes the reconciliation step by owning the attention KPI outright and keeping the method transparent.


What Strix measures (in plain language)


Strix answers two simple questions with defensible math:


Did they progress through the content?


The content container is divided into adaptive slices (usually 8–14, never wafer-thin). A slice only counts after it has been in view for at least two seconds while the user has been recently active. That combination blocks false positives like drive-by scrolls, idle tabs, and content hidden under sticky headers.
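The dwell-plus-activity gate can be sketched in a few lines. This is a minimal illustration, not Strix's implementation: the 2-second dwell threshold comes from the article, while the "recently active" window (30 s here) and the function shape are assumptions.

```python
DWELL_S = 2.0           # slice must be in view at least this long (from the article)
ACTIVE_WINDOW_S = 30.0  # hypothetical: how recent the last interaction must be

def slice_counts(in_view_start, in_view_end, last_interaction):
    """Return True if a slice earns credit for this visibility span."""
    dwell = in_view_end - in_view_start
    recently_active = (in_view_end - last_interaction) <= ACTIVE_WINDOW_S
    return dwell >= DWELL_S and recently_active

# A 5 s view with a recent interaction counts ...
print(slice_counts(10.0, 15.0, 12.0))   # True
# ... a 1 s drive-by scroll does not ...
print(slice_counts(10.0, 11.0, 10.5))   # False
# ... and neither does a tab left open with no recent activity.
print(slice_counts(10.0, 120.0, 0.0))   # False
```

Both conditions must hold at once, which is exactly why the drive-by scroll and the idle tab each fail on a different term.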


How much time did they actively spend?


Two clocks run in parallel: active time (only ticks after recent interaction) and total time (wall-clock since the view started). Active time anchors the score to genuine reading, while total time helps you reason about latency and cadence.
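The two clocks can be simulated by walking the view in small ticks. A sketch under stated assumptions: the 5-second idle cutoff and the tick-based tally are illustrative, not Strix's actual mechanics.

```python
IDLE_CUTOFF_S = 5.0  # hypothetical: active clock stops this long after the last interaction

def tally_clocks(view_start, view_end, interactions, tick=1.0):
    """Walk the view in ticks; active time accrues only near an interaction."""
    total = view_end - view_start        # wall-clock since the view started
    active = 0.0
    t = view_start
    while t < view_end:
        if any(0 <= t - i <= IDLE_CUTOFF_S for i in interactions):
            active += tick
        t += tick
    return active, total

# Three interactions across a 60 s view: the clocks diverge sharply.
print(tally_clocks(0.0, 60.0, interactions=[0.0, 10.0, 20.0]))  # (18.0, 60.0)
```

The gap between the two numbers is the point: 60 seconds of wall clock, only 18 of plausible reading.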


Optionally, Strix records time to first CTA click when it happens. Raw "CTA visible seconds" are not published because they're easy to misinterpret; they can still feed the score internally if you want landing pages to reward promotional exposure.


The Strix score (0–100)


The score blends three terms you can explain in 30 seconds:

  1. Reading progress: full credit by halfway down the content.

  2. Active time versus expected time: expected time comes from words/WPM, capped so very long pages don’t dominate.

  3. Optional CTA term: contributes on promotional pages; many teams set it to zero for editorial.


Weights are configurable (default 0.6 / 0.3 / 0.1). Because the score is bounded and the inputs are explicit, it travels cleanly across dashboards, experiments, and weekly reviews.
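The blend is simple enough to sketch. The default weights (0.6 / 0.3 / 0.1), the full-credit-at-halfway rule, and the words/WPM expected time all come from the article; the exact normalization and the 250 WPM default are assumptions for illustration.

```python
def strix_score(progress_pct, active_s, word_count, cta_clicked,
                w=(0.6, 0.3, 0.1), wpm=250):
    # Term 1: reading progress, full credit by the halfway point.
    progress = min(progress_pct / 50.0, 1.0)
    # Term 2: active time vs. expected time from words/WPM, capped at 1
    # so very long pages don't dominate.
    expected_s = (word_count / wpm) * 60.0
    time_term = min(active_s / expected_s, 1.0) if expected_s else 0.0
    # Term 3: optional CTA term; set w[2] to zero for editorial pages.
    cta_term = 1.0 if cta_clicked else 0.0
    return round(100 * (w[0] * progress + w[1] * time_term + w[2] * cta_term), 1)

# 60% progress, 90 s active on a 500-word page, CTA clicked:
print(strix_score(progress_pct=60, active_s=90, word_count=500, cta_clicked=True))  # 92.5
```

Because each term is capped at 1 and the weights sum to 1, the output is bounded to 0–100 by construction.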


Why use Strix as the primary KPI (and keep replay as the assistant)


Strix is first-party, consent-aware, and warehouse-native. Every field is a numeric second or percent, with stable join keys (view_id, content_id). There’s no GA4 stitching, no API sampling, no sessionization gotchas.


You make decisions from one event per view, then open session replay when you need qualitative context (rage-clicks, form friction, layout thrash). Replay is for why. Strix is for how much reading happened.


What the data looks like


A single compact payload lands in your data layer and warehouse:


[Screenshot: a "strix_score_index" event payload with engagement metrics and numeric timestamps]

No string times. No derived joins. Everything is ready to aggregate.
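For a concrete sense of the shape, here is a hypothetical payload. Field names are inferred from the SQL queries below and the screenshot caption; every value is made up for illustration.

```python
# Hypothetical per-view payload (field names inferred from the queries below).
payload = {
    "event": "strix_score_index",       # event name from the screenshot
    "view_id": "v_8f2c91",              # stable join key (illustrative value)
    "content_id": "post_1042",          # stable join key (illustrative value)
    "engagement_score": 92.5,           # the 0-100 Strix score
    "read_progress_pct": 60.0,
    "active_time_s": 90.0,
    "total_time_s": 140.0,
    "cta_clicked": True,
    "cta_time_to_first_click_s": 28.4,
    "snapshot_ts": 1733140800,          # numeric epoch seconds, no string times
}
print(payload["engagement_score"])
```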


How teams can actually use it


A content lead compares Strix by headline variant and ships the version that lifts the score and shortens time to CTA. A product manager evaluates a new template by active time versus expected and keeps the layout that earns more reading with fewer pixels.


A growth analyst ranks channels by Strix to find sources that deliver real readers, not just page views. Leadership sees a KPI that sits comfortably next to revenue without requiring a reconciliation preamble.


Quick queries you’ll run on day one


Average Strix by template with CTA latency:


SELECT
  DATE_TRUNC(snapshot_ts, WEEK) AS week,
  page_template,
  COUNT(*) AS views,
  ROUND(AVG(engagement_score),1) AS avg_strix,
  ROUND(AVG(active_time_s),1)    AS avg_active_s,
  ROUND(AVG(read_progress_pct),1) AS avg_progress_pct,
  ROUND(AVG(CASE WHEN cta_clicked THEN cta_time_to_first_click_s END), 2) AS avg_ttf_click_s
FROM analytics.reading_engagement
GROUP BY 1,2
ORDER BY 1,2;


Find pages that earn attention but delay action (high Strix, slow CTA):



SELECT
  content_id,
  ROUND(AVG(engagement_score),1) AS avg_strix,
  ROUND(AVG(read_progress_pct),1) AS avg_progress,
  ROUND(AVG(cta_time_to_first_click_s),2) AS avg_ttf_click_s
FROM analytics.reading_engagement
WHERE cta_clicked = TRUE
GROUP BY 1
HAVING AVG(engagement_score) >= 70
   AND AVG(cta_time_to_first_click_s) > 30
ORDER BY avg_ttf_click_s DESC
LIMIT 25;


Implementation notes (what’s under the hood)


Strix runs as a lightweight, SPA-safe script. It detects your content container, adapts the number of segments to length, offsets sticky headers so covered content doesn’t count, requires a brief visibility guard before dwell starts, and rebuilds once or twice if the page’s height changes materially (lazy images, accordions). It respects consent states and emits on route change or unload—one event per view.
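The segment-adaptation step can be sketched too. The 8–14 range is from the article; the pixels-per-slice heuristic is purely an assumption about how such an adaptation might work.

```python
MIN_SLICES, MAX_SLICES = 8, 14  # range from the article
TARGET_SLICE_PX = 600           # hypothetical target slice height

def slice_count(content_height_px):
    """Clamp a height-based slice count into the 8-14 range."""
    return max(MIN_SLICES, min(MAX_SLICES, round(content_height_px / TARGET_SLICE_PX)))

print(slice_count(3000), slice_count(12000))  # 8 14
```

Short pages floor at 8 slices and long pages cap at 14, which keeps individual slices from ever being wafer-thin.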


Access and demo


I’ll demo Strix live—method, data, and real pages—so you can see exactly how it works and how it resolves “two sources of truth” debates.


Code and implementation details are provided only to clients under contract. 


If you want it in your stack, book a demo; if we’re a fit, we’ll scope rollout, wire the event to your warehouse, and tune the score to your content model.


Measure if users read your page. Then act with confidence.
