BERT & Google’s Recent Local Algorithm Update

Tracking Google’s algorithm updates is a full-time job, mainly because Google rarely explains why the frequent changes in the SERP (search engine results page) occur. Broadly speaking, these updates are designed to increase relevance in the SERP, and they tend to benefit sites that create fresh, high-quality, relevant content.

But what happens when your site traffic takes a hit? We’re going to discuss a couple of Google algorithm updates that occurred in the past couple of weeks, one “official” and one unofficial. We’re also going to talk about how to respond to any SEO changes that may have occurred on your site as a result of these updates.

BERT (Bidirectional Encoder Representations from Transformers) 

BERT stands for Bidirectional Encoder Representations from Transformers. If you can’t remember that, it will probably have minimal effect on your website’s life. What you should remember is that, in Google’s words, it

“represent[s] the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of Search.”

BERT’s impact on the organic search landscape began when it rolled out the week of October 21, 2019, for English-language queries.

What is BERT? 

BERT allows Google to better understand words in the context of search queries.

Queries are getting more informal these days as users increasingly treat their devices like companions, and as voice search continues to proliferate. The Hummingbird update, in 2013, was Google’s first great acknowledgment of this trend. Where BERT differs is in how it goes about processing and interpreting that language.

How does BERT work? 

BERT is a neural network-based technique for natural language processing (NLP) pre-training. Let’s break that down in layman’s terms: “neural network” roughly means “pattern recognition,” and natural language processing means “a system that helps computers process how people communicate.” Combine the two, and BERT is a system by which Google’s algorithm uses pattern recognition to better understand how people communicate, so that it can return more useful, purposeful results for users.
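To make that idea concrete, here is a deliberately tiny, hypothetical sketch of the core intuition: predicting a masked word using context on both sides of it. This uses simple word-overlap counting over a two-sentence toy corpus, nothing like BERT’s actual transformer architecture; `predict_masked`, the corpus, and the query are all invented for illustration.

```python
def predict_masked(masked_sentence, candidates, corpus):
    """Score each candidate word for the [MASK] slot by how much of the
    surrounding context (words on BOTH sides of the mask) it shares with
    sentences in the corpus — a crude stand-in for bidirectional context."""
    context = set(masked_sentence.split()) - {"[MASK]"}
    scores = {}
    for cand in candidates:
        score = 0
        for line in corpus:
            words = set(line.split())
            if cand in words:
                # Candidate gets credit proportional to shared context.
                score += len(context & words)
        scores[cand] = score
    return max(scores, key=scores.get)

# Toy corpus of two sentences with different "worlds" of context.
corpus = [
    "can you pick up a prescription for someone else at the pharmacy",
    "can you pick up a package for someone else at the post office",
]

# The word "pharmacy" AFTER the mask is what disambiguates the query.
query = "can you get a [MASK] for someone else at the pharmacy"
print(predict_masked(query, ["prescription", "package"], corpus))  # prescription
```

A left-to-right model reading only “can you get a …” has no way to prefer either candidate; it is the right-hand context (“pharmacy”) that tips the score, which is the essence of what “bidirectional” buys you.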

How should I react to BERT? 

Here’s the kicker: there is nothing you can do to “respond to” or “prepare for” BERT.

Why? As with all algorithm updates, Google did not create BERT to penalize individual sites, nor was it designed to benefit others. If your site got hit by BERT, you were likely accruing traffic from a group of specific queries that shouldn’t have been getting you traffic, because your site wasn’t the best fit for them. That traffic was probably of pretty low quality. For example, MedlinePlus might miss out on some traffic from the query “can you get medicine for someone pharmacy,” but the users who made that query were, at any rate, looking for information, not for a commercial solution.

If you want to optimize for BERT, focus on creating genuinely useful content, and keep doing it. If you lost traffic from BERT, take consolation in the fact that you will likely start benefiting from it in the future, as it sends you better-matched traffic.

Possum 2.0? The November 6 local algorithm update

Let’s talk about everyone’s favorite kind of algorithm update rumor, also known as the “unconfirmed update.”

This one has caused a lot of chatter in SEO forums and on Twitter. An update is suspected of having taken place, but Google hasn’t confirmed it. It’s exhilarating, and very unsettling, because nobody knows for sure what’s going on, and speculation runs rampant. Here’s what we know about the local algorithm update that took place last week, how it may or may not be affecting your local SEO, and how to react if it did make a change.

What was the November 6 local algorithm update? 

What in the world is a “local” algorithm update? Well, it’s an algorithm update aimed at delivering more relevant results for local queries. The most famous example of this was Possum, which rolled out in 2016 and improved local search results by giving businesses just outside the city limits their fair share of the local search pie (if they were, in fact, closer to the searcher).

Since then, local search, at least on the organic side of the coin, has remained relatively stagnant. Until now! The update that rolled out last week echoes the intentions of Possum: It’s all about proximity. The sites that saw the most significant losses in 2016 were those that weren’t in the zip code of the user at the time the query was made.

The results after this past week’s local algorithm update again have to do with proximity: Google My Business listings that were closer to the searcher won out, while those that were farther lost traction. 
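Proximity here is literal physical distance. As a hypothetical illustration of just that one signal, the sketch below sorts some made-up business listings by great-circle (haversine) distance to a searcher. Google’s actual local ranking blends proximity with relevance and prominence, so this models only the distance component, and the listings and coordinates are invented.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # Earth radius ~6371 km

def rank_by_proximity(searcher, listings):
    """Sort business listings by distance to the searcher, closest first."""
    return sorted(
        listings,
        key=lambda b: haversine_km(searcher[0], searcher[1], b["lat"], b["lon"]),
    )

# Hypothetical listings around a searcher in midtown Manhattan.
searcher = (40.7549, -73.9840)
listings = [
    {"name": "Downtown Deli", "lat": 40.7074, "lon": -74.0113},
    {"name": "Midtown Deli",  "lat": 40.7580, "lon": -73.9855},
    {"name": "Uptown Deli",   "lat": 40.8116, "lon": -73.9465},
]
print([b["name"] for b in rank_by_proximity(searcher, listings)])
# ['Midtown Deli', 'Downtown Deli', 'Uptown Deli']
```

Under a proximity-weighted update, the midtown listing wins this searcher’s query even if the downtown business is otherwise the stronger brand, which is exactly the reshuffling many local SEOs reported.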

How do I optimize for a local algorithm update?

This one’s a little more complicated. In the past, algorithm updates that focus on proximity have come with an unwanted byproduct: an increase in spam. It’s a byproduct Google has not yet been able to clean up this time around.

However, it’s most likely not going to be the reason you lose traffic from this update—if you lose traffic at all. If you see an influx of local business listings that weren’t there before, report them using Google’s Business Redressal Complaint Form.

Otherwise, you can safely assume that if you lost traffic from this update, your local listing wasn’t the closest listing to the users in your location who were making most of the queries. 

Lastly, the best way to “optimize” for this local update may not be via organic search at all. Marketers active on Google Ads and/or Facebook Ads can compensate for lost local traffic with precise geotargeting, more aggressive bidding, local service ads, and locally inspired ad copy and creative.

What we talk about when we talk about algorithm updates

Don’t panic about these algorithm updates or any that occur in the future. For the most part, Google’s algorithm updates increase relevance and deliver better search results. 

That’s not to say that all sites that lose traffic from an algorithm update “deserve” to do so. But it would be a mistake to think there’s a surefire way to react that will neutralize the effects of a given update. In organic search (as in life!), it’s better to be proactive than reactive. If you can focus on creating relevant, fresh content that meets the needs of your users, and if you can shore up any technical holes that might be bogging down performance, you’ll win more often than you lose.

