## What we were trying to measure
When a paid click lands on a site, the ad platform appends a ?utm_source=... query string to the landing URL. By 2026, every non-trivial attribution stack does the same thing: capture those UTM parameters at landing time and store them under some kind of session-scoped identifier so that the conversion event, hours or days later, can be joined back to the original ad source.
The problem the cross-site tracking post is built around is that this storage is degrading. Safari ITP caps the lifetime of script-set cookies (anything written via document.cookie) at 7 days. Firefox Total Cookie Protection partitions cookies per top-level site. localStorage is cleared on tab close in private browsing. Even CNAME-cloaked first-party cookies are capped at 7 days. Every storage mechanism that used to last 30+ days now lasts at most a week, often less.
That means: the longer between paid click and conversion, the higher the chance that the original UTM is gone by the time the user converts. The conversion ends up attributed to (direct)/(none) instead of (google)/(cpc). We wanted to know:
What fraction of paid sessions still have their original UTM attached when the user finally converts via Stripe, as a function of how many days passed between click and conversion?
## The data
We ran the measurement across two of our own properties (the attrifast.com marketing site itself and a sister bootstrapped SaaS) from January 6, 2026 through April 26, 2026 (16 weeks). Combined paid traffic across both sites in that window was modest but non-trivial:
- ~3,400 paid clicks with a UTM source-medium pair attached at landing
- ~210 of those paid clicks eventually completed a Stripe checkout (i.e. became paying customers)
- Average return-visit delay for the converters: 6.1 days (median 4 days)
- Browser mix on the converters: 48% Chrome, 32% Safari, 12% Firefox, 8% other
The sample is small. The number is directional, not statistically tight. A site with 100x our paid traffic should run its own measurement; a site with similar volume can use 0.55 as a planning estimate but should expect a wide confidence interval.
## The method
We wrote the original UTM source-medium pair into a first-party localStorage entry on landing, plus a server-stored session identifier derived from a hash of the visitor IP, user agent, and click timestamp. On Stripe checkout, the server-side webhook joined the session ID back to the localStorage entry and recorded the outcome.
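The landing-side half of that setup can be sketched in a few lines of JavaScript. This is an illustration, not Attrifast's production code: the storage keys (`landing_utm`, `landing_at`) and function names are invented for the example.

```javascript
// Extract the source/medium pair from a landing URL, e.g.
// "https://example.com/?utm_source=google&utm_medium=cpc" -> "google/cpc".
// Returns null when either parameter is missing (organic/direct traffic).
function utmSourceMedium(landingUrl) {
  const params = new URL(landingUrl).searchParams;
  const source = params.get('utm_source');
  const medium = params.get('utm_medium');
  return source && medium ? `${source}/${medium}` : null;
}

// On landing, persist the pair first-party so a later conversion can read it.
// Key names are hypothetical; any stable first-party keys work the same way.
function storeLandingUtm(landingUrl) {
  const pair = utmSourceMedium(landingUrl);
  if (pair !== null) {
    localStorage.setItem('landing_utm', pair);              // survives until evicted
    localStorage.setItem('landing_at', Date.now().toString());
  }
  return pair;
}
```

Whatever `landing_utm` still contains at checkout time is what the checkout page forwards to the server, which is exactly the value the retention measurement compares against.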
The day-N UTM-retention rate is the percentage of converted sessions where the localStorage UTM at conversion time matched the UTM that was stored at landing time, bucketed by elapsed days between landing and conversion.
The pseudo-SQL of the join, simplified:
```sql
SELECT
  DATE_DIFF(checkout_at, landing_at, DAY) AS delay_days,
  COUNT(*) AS converters,
  SUM(CASE WHEN landing_utm = checkout_utm THEN 1 ELSE 0 END) AS utm_retained,
  SUM(CASE WHEN landing_utm = checkout_utm THEN 1 ELSE 0 END) * 1.0 / COUNT(*) AS retention_rate
FROM (
  SELECT
    s.session_id,
    s.utm_source_medium AS landing_utm,
    s.created_at AS landing_at,
    c.utm_source_medium AS checkout_utm,
    c.created_at AS checkout_at
  FROM sessions s
  JOIN stripe_checkout_completed c
    ON c.session_id = s.session_id
  WHERE s.utm_source_medium IS NOT NULL
) AS joined
GROUP BY delay_days
ORDER BY delay_days;
```
landing_utm is what we recorded at the first paid landing. checkout_utm is whatever localStorage contained at Stripe checkout time. If the original UTM survived in storage to that moment, the two match. If localStorage was partitioned, evicted, or cleared, they don't.
## The results
Bucketed by delay days, retention rate looked like this:
| Delay (days) | Converters in bucket | UTM retained | Retention rate | Penalty factor (1 − retention) |
|---|---|---|---|---|
| 0 | 74 | 74 | 100% | 0.00 |
| 1 | 21 | 20 | 95.2% | 0.05 |
| 2–3 | 34 | 27 | 79.4% | 0.21 |
| 4–6 | 38 | 17 | 44.7% | 0.55 |
| 7–10 | 27 | 9 | 33.3% | 0.67 |
| 11–14 | 11 | 3 | 27.3% | 0.73 |
| 15+ | 5 | 1 | 20.0% | 0.80 |
The 4–6-day bucket is the modal one for our converters (38 of 210, 18%) and contains the median converter delay of 4 days. Its penalty factor is 0.55, meaning ~55% of paid conversions in that delay window had lost their original UTM by checkout time. That is the number used as the default in the Lost-Attribution Calculator on the cross-site-tracking post.
The day-0 bucket retains 100% as expected (single session, no storage eviction window). The drop-off concentrates between days 2 and 6, which is exactly where Safari ITP's 7-day cap and Firefox's partitioning behaviour bite. After day 7, retention is dominated by users who happen to bookmark the site or come back via direct entry (so the original UTM was never relevant anyway), which is why the curve flattens around 25-30% instead of going to zero.
## How to use this for your own site
If you have a similar tracking setup (first-party UTM in localStorage + server-side join via session ID), the formula in the Lost-Attribution Calculator is:
Lost paid conversions = (Safari + Firefox traffic share) × (paid traffic share) × (penalty factor at your average delay)
Plug in your Safari/Firefox share (most US SaaS sites are 30-50%, iOS-heavy properties higher), your paid-channel mix, and a penalty factor from the table above keyed to your average return-visit delay. If your typical paid converter comes back within 24 hours, your penalty is closer to 0.05; if you sell to CFOs who take three weeks to approve a $1k purchase, the penalty is closer to 0.80.
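As a worked example, the formula can be written as a small function. Names and argument shape are illustrative; the 0.55 penalty is the 4–6-day figure from the table above.

```javascript
// Fraction of all conversions that are paid conversions misattributed to
// (direct)/(none) because the original UTM was lost before checkout.
// All three inputs are fractions in [0, 1].
function lostPaidConversionShare({ safariFirefoxShare, paidTrafficShare, penaltyFactor }) {
  return safariFirefoxShare * paidTrafficShare * penaltyFactor;
}

// Example: 40% Safari+Firefox traffic, 30% paid traffic, ~5-day average
// return delay (penalty 0.55 from the table).
const lost = lostPaidConversionShare({
  safariFirefoxShare: 0.40,
  paidTrafficShare: 0.30,
  penaltyFactor: 0.55,
}); // ~0.066, i.e. roughly 6.6% of all conversions
```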
For the average bootstrapped SaaS with a 5-7 day evaluation window, a 0.55 default is a useful planning estimate. Anything more precise needs the same measurement on your own data.
## Limitations of this measurement
- Sample size is small (210 converters). The top-end and bottom-end buckets (15+ days, day 1) have low counts and wide confidence intervals.
- Two properties, both ours. Industry, ICP, and traffic mix all skew SaaS-bootstrapped. A consumer ecommerce site with shorter consideration windows would see far less attribution loss; a B2B enterprise site with 30+ day cycles would see more.
- Browser-side storage only. We did not test the additional decay from users switching devices mid-funnel (mobile click → desktop convert), which adds a separate loss channel that the localStorage approach cannot recover. For our own data this affected ~6% of converters.
- 2026 numbers; will change. Browser policy is still moving. Chrome's eventual third-party cookie deprecation will not affect this number directly (we are measuring first-party storage), but Privacy Sandbox or Total Cookie Protection extensions could. Expect to re-run the measurement annually.
- Median converter delay was 4 days, mean 6.1 days. The 4–6 day bucket is what most people will hit, but if your distribution is bimodal you should use a weighted average of penalty factors, not a single number.
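For the bimodal case flagged in the last point, the weighted average of penalty factors is just bucket shares times bucket penalties, summed. A sketch with invented bucket shares (the penalties are the day-1 and 15+ figures from the table above):

```javascript
// Weighted-average penalty across delay buckets.
// buckets: array of { share, penalty }, with shares summing to 1.
function weightedPenalty(buckets) {
  return buckets.reduce((acc, b) => acc + b.share * b.penalty, 0);
}

// Hypothetical bimodal distribution: 60% of converters come back the next
// day, 40% come back after three weeks.
const penalty = weightedPenalty([
  { share: 0.6, penalty: 0.05 },  // day-1 bucket
  { share: 0.4, penalty: 0.80 },  // 15+ day bucket
]); // 0.6*0.05 + 0.4*0.80 = 0.35
```

Note how different 0.35 is from what either mode alone would suggest, which is why a single penalty number misleads for bimodal delay distributions.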
## Reproducing the measurement
If you want to run this on your own data and you already have UTMs in localStorage and Stripe webhooks, the SQL above is ~30 lines of work to adapt. Email vincent@sproutfi.xyz with "methodology" in the subject and I'll share our actual SQL plus the BigQuery schema we use. If you don't have UTMs in localStorage yet, that is what Attrifast's cookieless revenue analytics does out of the box.
## Updates
2026-05-09: Initial publication. The next planned re-measurement is December 2026, after Chrome ships final Privacy Sandbox decisions and any iOS 19 ITP changes ship in September 2026.