How Helpful Was the Helpful Content Update?
Search Engines | Content Marketing
On August 25, Google started rolling out the Helpful Content Update, an ongoing effort to reward sites with “people-first” content (i.e., content not written specifically for SEO). MozCast measured rankings flux peaking at 92°F on August 26, which sounds relatively high, but this is what the two weeks on either side of the update looked like:
The dotted blue line shows the 30-day average for the period prior to the start of the update, which came in at 87°F. Ranking flux actually peaked on August 23 above any day of the update rollout. To make matters worse, we had to remove August 8-9 from the 30-day average, because Google’s data center outage completely disrupted search results.
Let me sum up: it’s a mess. I like to think I’m pretty good at handling messes, but this is like trying to find one particular drop of water in two weeks of rain during a summer-long storm. If you like messes, read on, but for the rest of you, I’ll tell you this — I found no clear evidence that this first iteration of the Helpful Content Update moved the needle for most sites.
Averages, lies, and damned lies
Given the extended rollout, I attempted to look at the difference in visibility for individual domains for the 14 days before and after the start of the rollout (which helps smooth out single-day outliers and keeps the days of the week consistent across both sides). One “loser” that immediately popped up was Conch-House.com, with nearly a 50% visibility loss in our data set. I admit, I even got a little judgmental about the hyphen in the domain name. Then, I looked at the daily data:
The averages don’t tell even half of this story. Whatever happened to Conch-House.com, they were completely knocked out of the rankings for 20 out of the 28 days analyzed. Note that the MozCast data set is limited, but our much larger STAT data set showed a similar pattern, with Conch-House.com ranking for up to 14,000 keywords on one day during this period.
What happened? I have no idea, but it quite definitely, almost certainly, very probably maybe was not the Helpful Content Update.
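For the curious, the windowed comparison described above is easy to sketch. Here is a minimal illustration in Python with made-up numbers (the dates, visibility values, and six-day knockout are hypothetical, not MozCast data): it averages daily visibility over the 14 days before and after the rollout date, and also counts the “knockout” days with no rankings at all.

```python
from datetime import date, timedelta

def window_average(visibility, start, days=14):
    """Mean daily visibility over `days` consecutive days starting at `start`.
    Days missing from the dict count as zero (no rankings observed)."""
    return sum(visibility.get(start + timedelta(d), 0.0) for d in range(days)) / days

# Hypothetical daily visibility for one domain (share of voice, 0-1 scale).
rollout = date(2022, 8, 25)
visibility = {rollout + timedelta(d): 1.0 for d in range(-14, 14)}
for d in range(6):  # simulate a domain knocked out for six days post-rollout
    visibility[rollout + timedelta(d)] = 0.0

before = window_average(visibility, rollout - timedelta(days=14))
after = window_average(visibility, rollout)
knockout_days = sum(1 for v in visibility.values() if v == 0.0)

print(f"before={before:.3f} after={after:.3f} change={(after - before) / before:+.1%}")
print(f"knocked out on {knockout_days} of {len(visibility)} days")
```

As the Conch-House.com example shows, a large drop in the windowed average can mask an all-or-nothing pattern in the daily data, which is exactly why the averages alone aren't enough.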
Confirmed content coincidence
Here’s an example I got pretty excited about. WhiteHouse.gov saw a +54% total visibility gain across the two time periods. The keyword set was pretty small, so once again I dug into the daily numbers:
Looks great, right? There’s a clear spike on August 25 (although it fades a bit), and while the spike wasn’t as large, I was able to confirm it against a larger data set. If I’d been smart, I would’ve stopped the analysis right here. My friends, I was not smart.
One of the challenges of the Helpful Content Update is that Google has explicitly stated that helpful (or unhelpful) content will impact rankings across a domain:
Any content — not just unhelpful content — on sites determined to have relatively high amounts of unhelpful content overall is less likely to perform well in Search …
Even so, it’s interesting to dig into specific pieces of content that improved or declined. In this case, WhiteHouse.gov clearly saw gains for one particular page:
This brief was published on August 24, immediately followed by a storm of media attention driving people to the official details. The timing relative to the Helpful Content Update was almost certainly a coincidence.
Is it helpful content (regardless of your take on the issue)? Almost certainly. Could WhiteHouse.gov be rewarded for producing it? Quite possibly. Was this increase in visibility due to the Helpful Content Update? Probably not.
Is this blog post helpful content?
Hey, I tried. I’ve probably lost three nights of sleep over the past three weeks thanks to the Helpful Content Update. The truth is that extended rollouts mean extended SERP churn. Google search results are real-time phenomena, and the web is always changing. In this case, there was no clear spike (at least, no clear spike relative to recent history) and every once-promising example I found ultimately came up short.
Does that mean the update was a dud? No, I think this is the beginning of something important, and reports of niche impacts on sites with clear quality issues may very well be accurate (and, certainly, some have been reported by reputable SEOs whom I know and respect). The most plausible explanation I can come up with is that this was a first step in rolling out a “helpfulness” factor, but that factor is going to take time to iterate on, ramp up, and fully build into the core algorithm.
One mystery that remains is why Google chose to pre-announce this update. Historically, for updates like Mobile-friendly or HTTPS, pre-announcements were Google’s way of influencing us to make changes (and, frankly, it worked), but this announcement arrived only a week before the update began, and after Google stated they had updated the relevant data. In other words, there was no time between the pre-announcement and the rollout to fix anything.
Ultimately, Google is sending us a signal about their future direction, and we should take that signal seriously. Look at the HTTPS update — when it first rolled out in 2014, we saw very little rankings flux and only about 8% of page-one organic results in MozCast were HTTPS URLs. Over time, Google turned up the volume and Chrome began to warn about non-HTTPS sites. In 2017, 50% of page-one organic results in MozCast were HTTPS. Today, in late 2022, it’s 99%.
The Helpful Content Update probably isn’t going to change the game overnight, but the game will change, and we would all do well to start learning the new rules.