What is the Quality Score?
The Google Quality Score is an important yet widely misunderstood cost metric in Google Ads. It is a quantified estimate of the user experience (UX) your ad and landing page deliver, reported on a scale from 1 to 10.
Because Google believes that better ads benefit everyone, the Google Ads system is set up to identify and reward quality ads. High-quality ads can give you a higher Ad Rank and lead to other potential benefits, including:
Better ad positions
Eligibility for ad extensions and other ad formats
Does Google Really Lose Money on Better Quality Scores?
Throughout my digital advertising career, I have met digital marketing managers of all kinds, from large charities to large corporations that spend millions of dollars per year on digital ads. Arguments against the effectiveness of the Quality Score are abundant:
Quality Score is a myth
Google would never lower their fees, as this would hurt their business model.
Quality Score is Not a Myth
Google’s mission is to organize the world’s information. It therefore makes sense to charge more to show ads that are less relevant to a query, and less to someone advertising something more relevant to it. On a fundamental level, roughly two-thirds of the Quality Score is based on relevance and one-third on landing page experience. The Quality Score thus deters advertisers from promoting products that do not match the user’s intent, or from sending users to websites with a bad experience, since either could hurt the Google brand.
The User Experience is Quantifiable
The user experience can be quantified, and those metrics surface in Google PageSpeed Insights and in Google Ads Quality Scores.
The Google Quality Score for any keyword is based on three factors: expected clickthrough rate, ad relevance, and landing page experience.
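Google does not publish the actual formula, but as an illustration only, a toy model consistent with the rough weighting described above (expected CTR and ad relevance together carrying about two-thirds, landing page experience about one-third) might look like this. The function name and the normalized 0-to-1 sub-scores are my own assumptions, not anything Google exposes:

```python
def toy_quality_score(expected_ctr: float, ad_relevance: float,
                      landing_page_exp: float) -> int:
    """Hypothetical illustration only; Google's real formula is not public.

    Each input is a normalized sub-score in [0, 1]. Following the
    article's rough weighting, the two relevance factors together carry
    ~2/3 of the weight and landing page experience ~1/3.
    """
    blended = (expected_ctr + ad_relevance) / 3 + landing_page_exp / 3
    return round(1 + 9 * blended)  # map [0, 1] onto the reported 1-10 scale

# A well-rounded ad group scores high:
print(toy_quality_score(0.8, 0.9, 0.9))  # → 9

# An ad with weak relevance and no landing page (landing_page_exp = 0)
# is capped low no matter what:
print(toy_quality_score(0.2, 0.2, 0.0))  # → 2
```

The point of the sketch is structural: if landing page experience contributes a third of the score, an ad with no landing page at all forfeits that entire third before anything else is measured.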
I decided to try a campaign of call-only ads alone. I was already using call-only ads in my other campaigns, but those campaigns also had text ads and responsive ads, a good mix of formats rather than call-only ads by themselves. Some keywords in those mixed campaigns, like "dumpster rental," had 10/10 Quality Scores.
In the first period, the call-only ads were more expensive than similar ads in other campaigns. Comparing the same ad group campaign by campaign: $2.47 average CPC in the Oklahoma City campaign versus $14.31 average CPC in the Cincinnati campaign, roughly 5.8 times the cost, a 479% increase in cost per click.
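The comparison above is straightforward arithmetic; a quick sketch using the CPC figures reported above:

```python
# CPC comparison between the two campaigns (figures from the text above)
okc_cpc = 2.47   # Oklahoma City campaign, same ad group, mixed ad formats
cin_cpc = 14.31  # Cincinnati campaign, same ad group, call-only ads

ratio = cin_cpc / okc_cpc                     # how many times more expensive
pct_increase = (cin_cpc - okc_cpc) / okc_cpc  # increase relative to baseline

print(f"{ratio:.2f}x the cost")        # → 5.79x the cost
print(f"{pct_increase:.0%} increase")  # → 479% increase
```

Note the distinction the sketch makes explicit: $14.31 is about 579% *of* $2.47, but the *increase* over the baseline is 479%.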
Fixing A Quality Score
To fix the quality score, I added a landing page and responsive and text ads to the Cincinnati Campaign.
Quality Score can have a dramatic effect on cost per click. In the example below, the keyword "+Rent +A +Dumpster" had a 1/10 Quality Score in the first period, which put its CPC at $30.43 per click.
In the second period, after adding more ad variety and a landing page, the exact same campaign and keyword ended with a Quality Score of 8/10, which lowered the CPC for "+Rent +A +Dumpster" to $2.64.
$30.43/$2.64 = 11.52x more cost-effective.
For a term highly relevant to a dumpster rental website, the Quality Score went from 1/10 to 8/10 period over period, and the average cost per click fell from $30.43 to $2.64.
I suspect this is because the call-only ads had no landing page for users to click through to, so landing page experience scored zero. And since the ad group also lacked variety in ad types, ad relevance and expected CTR suffered as well. The exact same search phrase and the exact same advertisement in the same campaign had entirely different cost metrics.
Quality Score matters. Period over period, the CPC for "+Rent +A +Dumpster" fell from $30.43 to $2.64: the first-period cost was more than eleven times the second, a drop of over 91%.
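The period-over-period math works out like this, using the CPC figures from the example above:

```python
# Period-over-period CPC for "+Rent +A +Dumpster" (figures from the text)
cpc_before = 30.43  # first period, Quality Score 1/10
cpc_after = 2.64    # second period, Quality Score 8/10

ratio = cpc_before / cpc_after                     # times cheaper per click
reduction = (cpc_before - cpc_after) / cpc_before  # fraction of cost removed

print(f"{ratio:.1f}x cheaper per click")  # → 11.5x cheaper per click
print(f"{reduction:.0%} reduction")       # → 91% reduction
```

The same $30.43 budget that once bought a single click buys more than eleven clicks at the second-period rate.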