Why isn’t it worthwhile to work out the full semantics for every campaign (500–1,000+ keywords), and how does the Pareto principle apply here (20% of the effort gives 80% of the result)?
Two years ago, we took over a client’s account (he wanted to change advertising agencies). The account held 55,000+ keywords, 2,000+ ad groups and 5,000+ ads, all divided across 85+ advertising campaigns.
The structure was cumbersome, slow and inconvenient to manage. Simply downloading the account took a full 10 minutes, and every round of mass changes across all campaigns, followed by uploading and re-downloading the account, took another 10 minutes per iteration.
Analysis of the account showed that 90% of the budget was spent on less than 1% of the keywords. The remaining 99% hung on the account as dead weight: all the work of collecting, structuring and writing ads for those words had been done for nothing.
Focus your efforts where they have the greatest impact: an account should start with a small number of keywords, with the exact number depending on the situation (a landing page should start with a really small set, 20–50 keywords; an online store is a different case).
Using the Search Queries report, we can add new keywords to the campaign. This way we avoid working through a huge number of keywords whose efficiency we have not yet verified. Instead, we expand the semantic core from the search query report, paying attention to the metrics each query has and drawing conclusions about how effective those words are for our advertising campaigns.
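The filtering step described above can be sketched in a few lines of Python. This is a hypothetical sketch, not any ad platform’s real API: the column names (`query`, `clicks`, `cost`, `conversions`) and the thresholds are assumptions you would adapt to your own report export and economics.

```python
import csv

def promising_queries(report_path, min_clicks=30, max_cpa=15.0):
    """Pick search queries with enough data and an acceptable cost per action.

    Column names are assumed; adapt them to your platform's actual export.
    """
    picked = []
    with open(report_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            clicks = int(row["clicks"])
            cost = float(row["cost"])
            conversions = int(row["conversions"])
            if clicks < min_clicks:
                continue  # not enough data yet to judge this query
            if conversions and cost / conversions <= max_cpa:
                picked.append(row["query"])  # cheap enough per conversion
    return picked
```

Queries that pass the filter are the candidates for promotion into full keywords; everything else either stays under observation or becomes a negative keyword.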
Since this approach uses words with high traffic potential and high competition, the problem of a high cost per click arises. We solve it as follows: we create a structure of three campaigns, one for each match type – exact, phrase and broad.
| Match type | CPC (ratio to broad match) |
|---|---|
Because we effectively turn one campaign into three (each campaign contains the same keys, just in a different match type), we pull the overall CPC down. This method, tested on dozens of accounts, lets us concentrate on strategy: the bidding algorithm keeps our rates low in the niche while we attend to other things.
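The three-campaign structure can be illustrated with a short sketch. The campaign dictionaries and naming convention here are illustrative assumptions, not a real ad platform object model:

```python
# One keyword list, cloned into three campaigns, one per match type.
MATCH_TYPES = ("exact", "phrase", "broad")

def build_campaigns(base_name, keywords):
    """Return one campaign per match type, all sharing the same keyword list."""
    return [
        {"name": f"{base_name} [{match}]",
         "match_type": match,
         "keywords": list(keywords)}  # copy so campaigns stay independent
        for match in MATCH_TYPES
    ]
```

Keeping the keyword list identical across the three campaigns is what makes the later cross-negative step mechanical: each campaign only needs negatives derived from that same shared list.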
With such a structure, to keep the same keys in different match types from blocking each other’s impressions across campaigns, you need to cross-minus them: add each campaign’s keys as negative keywords to the other campaigns.
As the diagram shows, the keys from the left column are added as negative keywords to the campaigns in the right column. As a result, the exact-match campaign receives only exact queries, while the phrase-match campaign receives only queries with extra words before or after the key: the key itself, “as is”, is blocked there by the negative keyword, since that key is a negative in exact match.
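The cross-negative scheme from the diagram can be written down as a small function. This is a sketch of the logic as described in the text, with the usual `[...]` / `"..."` notation for exact and phrase negatives assumed:

```python
def cross_negatives(keywords):
    """Map each campaign to the negatives that keep identical keys apart.

    Exact campaign: no negatives from this scheme, it catches the key "as is".
    Phrase campaign: the keys as exact-match negatives, so the bare key
    query falls through to the exact campaign.
    Broad campaign: the keys as exact- AND phrase-match negatives, so both
    narrower campaigns keep their traffic.
    """
    return {
        "exact": [],
        "phrase": [f"[{kw}]" for kw in keywords],
        "broad": [f"[{kw}]" for kw in keywords]
                 + [f'"{kw}"' for kw in keywords],
    }
```

With this split, each query type has exactly one campaign that can show for it, so the three campaigns never compete with each other in the auction.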
To sum up, in order to launch such a campaign regardless of goals, budget, etc., we need to:
- Find 20–50 keywords most typical for your niche and launch 3 campaigns with them, one per match type, so that queries do not overlap and the cost per click stays below the niche average.
- Add the campaign’s semantic core as negative keywords to the corresponding campaigns, as in the diagram above.
- Expand the campaign’s semantic core not by mechanically adding every possible key from keyword collection services, but from the search query report, based on metrics we have actually received and can analyze.
- Add every valuable key from the search query report to all 3 campaigns – in exact, phrase and broad match – and update the negative keyword lists accordingly (for convenience, shared lists are created at the account level and regularly topped up with new negatives, which are then automatically applied to the right campaigns).