seo-nerd® – digital success
Bundesallee 39-40a, 10717, Berlin, Berlin, Germany


Search engine optimization changes daily. With the seo-nerd SEO news, you stay up to date.


Latest SEO news


These are the topics of our SEO Weekly Review this time:

  • Google’s fight against search spam and spammy websites
  • clarifications about mobile-first indexing
  • whether it matters for rankings where in the title the keyword appears
  • whether ratings are ranking signals
  • why images are becoming increasingly important for SEO
  • what a blog needs in order to deliver rankings


Google’s search engine lives from providing users with content that is as relevant as possible to their search queries. To ensure this, the algorithms not only search for the best results, they also identify pages that use tricks to give the impression of being relevant for certain search queries (keywords).

Google calls such pages search spam, deliberately drawing an analogy to the well-known email spam. Without spam filters, our email inboxes would be useless: it would take far too long to distinguish the important emails from the spam. In addition to search spam that violates Google’s policies, Google also fights spammy pages. These include, for example, fake shops, pages that lead to subscription traps, and hacked pages used to spread malware and the like.

By using its own systems and working with website operators, Google does a lot to keep the web healthy. Google has now announced on its blog that in 2017 more than 45 million website operators were warned or alerted to possible problems on their websites. Around 6 million webmasters received a penalty in 2017, i.e. they were specifically asked to change certain things on their website. In the worst case, such a penalty leads to the exclusion of a domain from the rankings.

In the blog post, Google asks that users continue to contact the company when they come across search spam or spammy pages.


Google now treats the mobile version of a page as primary, rather than the desktop version. This, in short, is the mobile-first indexing program that Google has been rolling out since March/April 2018. Pages are indexed immediately after crawling (roughly speaking: “scanning”) them.

With indexing, Google assigns meanings to the crawled pages. If the meanings of a page (X) match those of a search query (Y), the page (X) comes into question as a result for the search query (Y). Indexing is therefore a central component of the search engine, and the switch from prioritizing desktop pages to prioritizing mobile pages for ranking is correspondingly important.

Because there has been some confusion about the implications and consequences of mobile-first indexing in the last few months, Google has now corrected some misunderstandings via Twitter:

  • URLs displayed on the search results page: If a page has both a mobile AND a desktop version, Google always indexes the mobile version, but still shows the desktop URL to users on desktop devices.
  • Crawling frequency: Google will crawl pages neither more nor less frequently than before the switch to mobile-first indexing. Only during the actual switch to mobile-first indexing are pages temporarily crawled more often.
  • Pages stored in the cache: Google currently shows no cache version for some pages that are already indexed mobile-first. This is not intentional but a bug that Google is working to fix.
  • Page speed and mobile-first: In July 2018 Google rolls out its “Speed Update”. The speed with which mobile pages load becomes a ranking factor. However, this update has nothing to do with mobile-first indexing.
  • Design (1): “Accordions” (expandable content sections) and “hamburger menus” (the three-bar icon for a menu hidden behind it) are good solutions for mobile websites from Google’s point of view.
  • Design (2): For mobile-first indexing, it does not matter how the page is built – as a separate mobile page, as a responsive design, or even for desktop devices only. However, Google recommends switching to mobile-friendly designs.
  • Mobile-first indexing itself is not a ranking factor; mobile-friendliness, on the other hand, already is.



This clarification was provided by Google spokesman John Mueller in a tweet this week. He wrote: “If you’re swapping the order of keywords in a title, I wouldn’t expect that to have an effect on ranking.” However, Mueller points out that it may make sense to try out whether swapping the word order leads to a title “working” better, i.e. causing more clicks. Such tests could be carried out in advertisements or social media.


Google does not use page ratings and surveys for the page recommendations on the search results pages. This was revealed by John Mueller in a hangout. One of the most important reasons: Google usually does not know how a rating should be weighted, as there are too many different rating systems (e.g. 5 stars, 3 cups, etc.). For local SEO, the case is different: Google can clearly classify the ratings there, which is why they are included in the rankings.


The good old keyword search is getting competition from visual search: apps like Google Lens make it possible to scan objects with the phone camera and search for information or for online shops with matching offers. In the future, webmasters and SEOs will therefore have to concentrate more on images in order to show up for search queries made purely via images. Jes Scholz wrote down what becomes important in this context on the Moz blog.


Blogs have become part of the content strategy of many sites. The problem: blog texts rarely guarantee high rankings – at least that is the result of a current Sistrix study. The most important reason: blog texts often treat topics so narrowly that they can hardly be assigned to specific search queries or keywords.

However, the study should not be generalized. Sistrix admits this and gives examples of blog texts that achieve very good rankings. Finally, Sistrix uses the study to point to its in-house concept of high-performance content formats (HPC formats), meaning content formats that achieve an above-average number of top-10 rankings (with over 20 percent of their keywords). Such HPC formats are optimized both in content and technically. In short: simply writing texts down will hardly improve rankings. Blog texts must be well thought out and analytically planned.


Latest SEO news


No time to keep up with all the relevant SEO news in blogs and news portals? Then our summary of the SEO news of the week is just right for you. Let’s get started right away: there was a lot of excitement about a European Court of Justice ruling on the use of Facebook fan pages. Google is also giving website operators new post types for local search. In the miscellaneous news, we tell you why loading-time optimization continues to gain in importance.


Since June 5, 2018, this question has occupied everyone who has created and operates a page on Facebook. The reason is a judgement of the European Court of Justice (ECJ). Specifically, the verdict concerns the operation of a fan page on Facebook, in particular the Facebook Insights tool, which analyzes the data of visitors to such fan pages. Data protection advocates consider it non-compliant with the law.

The tool cannot be deactivated by fan page operators, so it is always active. It became a data protection problem in the case dealt with by the ECJ because neither Facebook nor the operator points out that personal data is collected and stored during a visit. The operator of the fan page in question (Wirtschaftsakademie Schleswig-Holstein) had argued that it was not responsible for data collection and storage.


The ECJ judgement now clarifies: whoever creates a Facebook page bears joint responsibility for possible data protection violations of the social media network. Attorney Dr. Thomas Schwenke believes that other social networks may also be affected by this judgement. At the same time, the lawyer advises not to panic immediately.

The ECJ ruling serves only as the basis for a pending decision of the Federal Administrative Court. If it agrees with the ECJ’s judgment, German data protection authorities may prohibit the use of Facebook pages. The consequence would be a wave of warning letters, unless Facebook changes its current handling of personal data. This would probably affect not only the pages of companies and authorities, but also those of private individuals. However, their risk would be lower than that of those who run Facebook pages for business reasons.


If you want to be found on Google for search queries with a local or regional reference, you should have a Google My Business account. It has been possible to publish posts there for some time. Google has now announced on Twitter that there are two new post types for Google My Business:

  1. product posts
  2. offer posts

Posts written for Google My Business are displayed directly in Google search in the Knowledge Panel or on Google Maps (in the “Overview” tab). They make it possible to give users current information about discounts, promotions or news. For example, the new post type “Offer” lets you enter the period of the offer, possible voucher codes and terms of use in addition to the post text.

The image shows an example of a local business entry.

For product posts, you can state the name and price of the product and provide a description. A text length between 100 and 300 words is currently defined for all post types. Posts can and should always come with an image. See the Google My Business help for a detailed description of how to publish posts on Google.


  • Google takes a generous view of spelling and grammar errors in its ranking, says Google’s J. Müller in a tweet. Texts with errors can rank too. Most website users, however, will probably be less generous than the search engine and turn to more serious-looking sites. Too many spelling and grammar errors should therefore not occur on your pages.
  • A bug crept into Yoast’s popular WordPress SEO plugin with the 7.0 update that can lead to ranking losses. The error causes images and other media files to get their own URLs, which then have no content. The bug is fixed in the current version. To be safe, Yoast customers should update the plugin to the current version and manually make sure that the setting for the question “Forward attachment URLs to attachment files” is set to “Yes” in the “Media” tab.

    Joost de Valk of Yoast apologized to affected users. The story also had a media aftermath: Google spokesman John Müller criticized SEO blogs that had not covered the bug thoroughly enough and instead only stoked fear to collect clicks.

  • Google will evaluate page speed in a more differentiated way in the future. Google spokesman John Müller announced in a hangout that the fastest websites will secure places at the top of the rankings. Optimizing loading times thus becomes an even more important SEO task.


Latest SEO news


As recently as December 2017, Google had given snippets (i.e. the short previews or link tips on the search results pages) more space: instead of a meta description running over only two lines, three or even four lines became possible. Now Google shows that it can also roll things back: since this week, snippets are again displayed on only two lines. This returns Google to the state before the December 2017 change.


Danny Sullivan, spokesman for Google’s public relations team, has confirmed the change in a tweet. According to him, the snippets are a little longer than before December (which so far does not hold true, see below). Nevertheless, Google leaves all doors open: according to Sullivan, Google gives no general recommendation for the length of snippets. The display shows whatever the algorithms consider most sensible for the respective search query.

From an SEO point of view, this “wait and see if it fits” information is of course of little help. After all, the display on the search results page affects the click-through rate. According to studies, users mainly pay attention to a snippet’s title (i.e. the headline) and URL, but in case of doubt an appealing description is decisive for whether users click on a page. Since Google does not help here, the SEO scene has to help itself. The following has been found out so far:


According to RankRanger, the average length of descriptions displayed in Germany on May 18 was only around 151 characters for the desktop view. There is even less space available in the mobile view. In its review, the seo-nerd® came to average lengths of 110 to 115 characters. At the moment, however, it looks as if these are approximate values at best.

Our recommendation for writing meta-descriptions is: Try to include all the essentials in the first 110 characters so that the user understands what he or she can expect on the page. If you expect your customers to use desktop computers (typical for B2B offers), you can also work with 150 characters.

Since Google decides for itself in more than half of the cases what is written in the description (and often also chooses excerpts from the text of the page), your descriptions can and should be longer anyway. If you’ve already expressed the most important thing within the 110/150 mark, you can use as much text as you like to place a call to action.
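As a rough sketch of this recommendation, a small check against the two thresholds could look like the following (the 110/150-character values are the guideline figures above, not a Google rule, and the example description is made up):

```python
def check_description(description: str,
                      mobile_limit: int = 110,
                      desktop_limit: int = 150) -> dict:
    """Check a meta description against the rough visibility thresholds above."""
    return {
        "length": len(description),
        "mobile_safe": len(description) <= mobile_limit,    # likely shown in full on mobile
        "desktop_safe": len(description) <= desktop_limit,  # likely shown in full on desktop
        "visible_on_mobile": description[:mobile_limit],    # the part users most likely see
    }

result = check_description(
    "SEO news of the week: mobile-first indexing, snippet lengths and more. "
    "Read the seo-nerd weekly review now and stay up to date!"
)
```

The point of the `visible_on_mobile` slice is exactly the advice above: read it back and check that the essential message is complete within it.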


That always depends on which users you are thinking about. Basically, not much changes for mobile users. They are much more strongly influenced by the recent deletion of pagination in the mobile SERP display. In mobile use, scrolling down is part of the normal habit anyway.

The situation is different with desktop use. Here, people don’t like to scroll. Users therefore often choose one of the results they see at first glance (above the fold). With the change back to shorter descriptions, two to two and a half additional results become visible at common screen resolutions (1366×768 or 1920×1080). This means that positions seven and eight in the search results in particular benefit from Google’s latest turnaround.


Latest SEO news


As in the previous week, there are new things to report about Google this week as well.


At its in-house developer conference I/O, Google announced several innovations. For many users and SEOs, the most exciting one is the improved voice assistant. By the end of 2018 at the latest, it should be possible to talk more or less “normally” with it. There will then be six different voices to choose from, and the “Ok Google” can be omitted during the “conversation” in the future.

At the same time, the Google Assistant will understand more complex queries such as “What is the weather like in Berlin and Hamburg?” Anyone using Google Home will be able to set up routine commands. For example, when the command “Ok Google, dinner’s ready” is given, the TV is automatically switched off, the favourite music is played, and the family members are notified on their devices that they can now come to the table.

If you think this is nice but not really necessary, perhaps this will please you: the assistant is going to become your personal secretary. It will soon reserve tables for you at restaurants, manage your calendar by voice command, type messages for you (e.g. while driving) and send them right away. Of course, the assistant will also play DJ for you and put on your music if you wish.


  • Gmail users will soon get “Smart Compose”. This is apparently the writing aid already known from messenger programs, which draws on frequently used phrases and thus helps you write faster and with fewer errors at the same time.
  • App developers will be able to benefit from Google’s experience with artificial intelligence. Under the title ML Kit, the company from Mountain View offers developers APIs that help, for example, with text recognition, face recognition or image labeling.
  • Those using Google Photos will soon receive AI-based suggestions on how to optimize photos (by adjusting brightness, angle, etc.). In addition, it will soon also be possible to automatically colorize black and white photos.
  • With Google Lens you could already take pictures of objects, people or animals in order to search the web for information about them. In the future, Google Lens will also be a scanner: you will be able to scan any text with it. Lens will soon be available in German, French, Italian, Spanish and Portuguese.
  • With Lookout, visually impaired or blind people receive an app that provides them with information about their immediate surroundings.


Google has also updated its guidelines for the use of images on websites in week 19. As with other optimizations, Google advises not to optimize for the search engine, but to think of the user first. Therefore, when using images, the user experience should be in the foreground. Images should therefore be selected and used according to the following criteria:



  • Images should match the content of the page. Arbitrarily used images are rated negatively by Google.
  • Images should always be placed as close as possible to the text passages they refer to (conversely: the surrounding text gives Google important hints for interpreting the image).
  • Images should not contain important text elements such as headings or menu items; these should always be included in the HTML text.
  • Images alone cannot be used to build informative pages – at least not from Google’s point of view. The search engine remains dependent on texts worth reading in order to recommend pages.
  • Pages should be mobile-friendly.
  • It helps Google if you create an image sitemap. It may also contain URLs of other domains, for example if the images are delivered via a content delivery network.
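As a sketch of the last point, an image sitemap can be generated with Python’s standard library. The URLs below are placeholders; the `image` namespace is Google’s sitemap-image extension:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
IMAGE_NS = "http://www.google.com/schemas/sitemap-image/1.1"

def build_image_sitemap(pages: dict) -> str:
    """pages maps a page URL to the image URLs used on that page
    (which may live on another domain, e.g. a CDN)."""
    ET.register_namespace("", SITEMAP_NS)
    ET.register_namespace("image", IMAGE_NS)
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    for page_url, image_urls in pages.items():
        url = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        ET.SubElement(url, f"{{{SITEMAP_NS}}}loc").text = page_url
        for image_url in image_urls:
            image = ET.SubElement(url, f"{{{IMAGE_NS}}}image")
            ET.SubElement(image, f"{{{IMAGE_NS}}}loc").text = image_url
    return ET.tostring(urlset, encoding="unicode")

xml_out = build_image_sitemap(
    {"https://www.example.com/": ["https://cdn.example.com/images/cat.jpg"]}
)
```

The CDN image URL illustrates the cross-domain case mentioned above: the page lives on one domain, the image on another.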


Besides the user experience, these things are also important when dealing with images:

  • The loading time of the images should be as short as possible.
  • The images should match the title and description of the page, as these are also used for Google image search.
  • Those who use structured data may benefit from better rankings (e.g. an image marked up as a recipe has a better chance of being displayed for suitable search queries).
  • The same applies to the quality of images: sharp, well-lit ones are preferred.
  • Use a descriptive alternative text. Don’t just use the keyword; describe what the keyword has to do with the photo’s subject.


Better not. John Mueller has now clarified this in a tweet: if there are two versions of a website optimized for mobile devices, Google does not know which one to display. It is therefore best to simply implement the website responsively.

Do you have questions about the topics raised here, or do you need search engine optimization? The seo-nerd® looks forward to your inquiry!


Latest SEO news


SEO news of week 18, 2018

Google announced some changes that are important to everyone who needs Local SEO. We also take a look into the near future, for which Google plans to make audio content searchable. Last but not least, we remind you of May 25th, the deadline for the new General Data Protection Regulation (GDPR).


More and more Google users are making local search queries. Compared to the previous year, their number increased by 50 percent. Especially on smartphones, people like to search for service providers or offers from their region; around a third of all mobile search queries are location-based. Google is now responding to the growing importance of local search and is expanding Google My Business.

My Business was created in 2014 by merging Google Places, Google Local and Google+ Local and has since been something like the “contact point” at Google for local SEO measures.

The latest updates extend the range of APIs. In the future, posts with offers and descriptions can also be integrated into local search results, so customers can be informed in a more targeted way.


In addition, agencies will soon be able to create their own My Business accounts, as Anita Yuen, Product Management Director at Google My Business, announced on the Google blog. Agencies will receive their own dashboard to manage any number of locations.

The previous restriction of 100 locations per My Business user will then be lifted for approved agencies. The dashboard will also facilitate collaboration between several employees, for example by sending out invitations and managing listings.

Google plans to launch a new partner program for Google My Business shortly. It is not yet known which criteria agencies have to fulfil in order to be admitted there.


Google Maps plays an important role in local searches, so it is no coincidence that Google announced changes to Google Maps at the same time. In order to use Google Maps, everyone will have to store a payment profile. However, Google is generous (at least for the time being) and credits $200 a month to each user. This should be sufficient for standard uses of smaller projects.

Good news: Embedding maps remains free of charge. The new prices are valid from June 11 – click here for the price list.

Google Maps price list


Podcasts are becoming increasingly popular thanks to streaming services such as Spotify or Deezer. The amount of knowledge that comes together in the many good conversations that you can now listen to on the net is immense. And it is precisely this pool of knowledge that Google would like to bring closer to its users in the future. The search engine is therefore working on making the knowledge slumbering in podcasts usable for the search, as this interview with Zack Reneau-Wedeen, an employee of Google’s podcast team, reveals.

One goal is to make audio content searchable directly from the Google search box. With one click from the SERP, you would be taken to a suitable podcast; audio content would then be just as easy to find as, say, videos matching the search term.

Should Zack Reneau-Wedeen’s team make good progress, search engine optimization would gain a new field of activity: audio SEO. This development is certainly fuelled by the success of digital voice assistants such as Amazon Alexa or Google Home. The digital helpers like to read things aloud themselves, but they are also almost perfect for playing suitable conversations or other audio content.


On May 25, the new General Data Protection Regulation (GDPR) enters into force. After that you can still use Google Analytics, but there are some things you have to consider or apply for and change:

  • conclude a contract for order data processing
  • confirm additional agreements
  • implement GDPR-compliant tracking codes
  • optimize the right of objection
  • define the retention period for the data
  • delete old data

The nerd has written down exactly what to do and how it works under the keywords GDPR and Google Analytics. If you cannot make the changes yourself or are unsure, you are welcome to contact the seo-nerd®.

Return of the PageRank?

Latest SEO news


This week, many SEOs thought they were seeing a ghost, by the name of PageRank. This ranking factor, once so decisive and now mostly smiled at mildly, has gone quiet in recent years. Now a new PageRank patent has been announced, and SEO experts are pondering: is this the return of the undead? In addition, Google’s SEO contact John Müller gave enlightening answers to questions about website indexing and the ranking of hreflang references.


Not so long ago, backlinks were considered one of the most important tools for search engine optimization in the SEO scene. Google’s PageRank algorithm rates websites based on the links they receive. Simply put: The more links, the more important Google rated the page.

In 2016 Google buried the toolbar PageRank, with which you could easily see the current rating of a website. The end was in line with Google’s strategy, since the Hummingbird update put more emphasis on valuable website content.

Now Google has applied for a new PageRank patent. But what exactly does this patent mean for the further development of search engine optimization? Will it soon again be all about placing backlinks, and will other factors be devalued as a result? From experience the seo-nerd® can say: with new Google patents, hysteria and hyperventilation are rather inappropriate. A patent does not mean it will ever be used. It may be that links remain important for longer than many suspect. This will not change the semantic-holistic orientation Google has been pursuing for years. King Content will therefore not be easily pushed from the throne.


While the patent may leave you with one or two questions, Google’s spokesman John Müller has solved two other mysteries of the Google algorithm this week:


The HTML attribute “hreflang” is used to help Google classify the geographical and language orientation of a web page. This makes it easier for the search engine to show users the appropriate language version or the regional URL of a website.

This week, a Twitter user publicly considered whether these hreflang references to other language and country versions of a website would ensure that signals from the individual versions were transmitted. John Müller answered with a clear “No”. The different versions of a website rank separately, even if they are connected by hreflang. He also provided the logical explanation for this: Just because one page is relevant in a certain region, it does not necessarily have to be relevant for another.
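For illustration, hreflang annotations are plain link elements in a page’s head. This hedged sketch just generates them for a set of language versions (the URLs and codes are made up):

```python
def hreflang_links(versions: dict) -> str:
    """Render <link rel="alternate" hreflang=...> tags for language/region versions.

    Each page should list all versions, including itself, and the
    references should be reciprocal across all pages."""
    lines = [
        f'<link rel="alternate" hreflang="{code}" href="{url}" />'
        for code, url in versions.items()
    ]
    return "\n".join(lines)

tags = hreflang_links({
    "de-DE": "https://www.example.com/de/",
    "en-US": "https://www.example.com/en/",
    "x-default": "https://www.example.com/",
})
```

Note that, per Müller’s clarification above, these annotations only help Google pick the right version; they do not pool ranking signals between the versions.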


During the week, a webmaster asked on Twitter why only 5 of the 22 pages of his website were indexed. Simple question, simple answer: according to John Müller, very few websites are fully indexed; often thousands of pages are left out. This is usually due to small things that get overlooked, such as “nofollow” attributes, canonicals or redirects. It is therefore worthwhile to let an expert like the seo-nerd take a look at your pages.
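Two of the “small things” mentioned can be spotted mechanically. This sketch uses Python’s standard HTML parser to flag a noindex robots meta tag and a canonical pointing elsewhere (the example HTML is made up; a real audit would check more signals, such as redirects and robots.txt):

```python
from html.parser import HTMLParser

class IndexabilityCheck(HTMLParser):
    """Collect the robots meta directive and the canonical link from a page's HTML."""
    def __init__(self):
        super().__init__()
        self.robots = ""
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.robots = a.get("content", "").lower()
        if tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonical = a.get("href")

def check_page(html: str, page_url: str) -> list:
    parser = IndexabilityCheck()
    parser.feed(html)
    problems = []
    if "noindex" in parser.robots:
        problems.append("robots meta tag contains noindex")
    if parser.canonical and parser.canonical != page_url:
        problems.append(f"canonical points elsewhere: {parser.canonical}")
    return problems

issues = check_page(
    '<head><meta name="robots" content="noindex,follow">'
    '<link rel="canonical" href="https://example.com/other"></head>',
    "https://example.com/page",
)
```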

If you have any questions about these or other SEO topics, please feel free to contact us. The nerd is eagerly waiting to talk to you about his favorite topic.


Latest SEO news


From an SEO perspective, nothing really earth-shattering happened this week. We take this opportunity to point out a study and an article offering useful tips for search engine optimization.


Felix Meyer from Seokratie has measured and checked which snippet tools deliver the best results. Google itself does not offer a tool into which you could enter title and description and see the optimal length. There is a simple reason for this: Google wants to keep all doors open when it comes to the length of titles and descriptions.

An incorrect length of title and description is therefore not a direct ranking signal. In principle, you could write a very long title; in its full splendour, however, no user would ever see it. Space on the search results page (SERP) is limited, so Google simply cuts off titles that are too long. Not only does this look unattractive, it also usually doesn’t encourage clicks. Indirectly, therefore, the length of title and description is a ranking signal.

As far as descriptions are concerned, Google takes the view that it knows best what users would like to read there. In almost 60% of cases, Google extracts the text for the description from the page content itself. This way it can react to the entered search term and the individual search history. Nevertheless, nobody should refrain from entering a description: it provides Google with important clues as to what the page in question is about.


It is therefore not so easy to give general recommendations for their length. The Seokratie study comes to the conclusion that a title should have a maximum of 569 pixels or 65 characters. Since mobile wraps the title onto a second line, your title should also be at least 40 characters or about 330 pixels long. This is how your site gets more attention in the SERPs.

A description should be at least 100 and no more than 290 characters long. Whether the full description is displayed is decided by Google on a case-by-case basis. It is recommended to always put the most important things at the beginning of the description; the end is the place for an invitation to visit the site or a call to action. Even if that part is not displayed, users still know what to expect on the site.
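The study’s character ranges can be turned into a quick validator. This is only a sketch against the guideline values quoted above (they are Seokratie’s findings, not hard Google limits, and pixel widths are ignored here):

```python
# Guideline ranges from the Seokratie study quoted above (characters).
TITLE_RANGE = (40, 65)
DESCRIPTION_RANGE = (100, 290)

def snippet_warnings(title: str, description: str) -> list:
    """Return a warning per field that falls outside its guideline range."""
    warnings = []
    if not TITLE_RANGE[0] <= len(title) <= TITLE_RANGE[1]:
        warnings.append(
            f"title has {len(title)} characters, aim for {TITLE_RANGE[0]}-{TITLE_RANGE[1]}"
        )
    if not DESCRIPTION_RANGE[0] <= len(description) <= DESCRIPTION_RANGE[1]:
        warnings.append(
            f"description has {len(description)} characters, "
            f"aim for {DESCRIPTION_RANGE[0]}-{DESCRIPTION_RANGE[1]}"
        )
    return warnings
```

A character count is only a proxy: whether the snippet actually fits also depends on pixel width, which is why the tools compared below measure in pixels.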

According to the study, Screaming Frog’s tool was the most reliable; however, it first needs to be downloaded. If you just want to check the length quickly, you can also use Torben Leuschner’s tool. It does not warn you when your title or description gets too long, but since you now know how long they can or should be, that is enough.


This suspicion is obvious when you look at a new patent, according to which Google is working on getting to know things, people and places better. The key term for this is “entities”: in semantics, clearly identifiable things with characteristic properties. Google has patented an answer to the question of how such entities can be used for a search engine.

At its core, this patent amounts to a huge database of semantic entities in which the relationships between the entities are recorded. Google is thus further freed from dependence on backlinks. For each search query, Google can now refer to this database to see which terms semantically belong to the search term entered. The names Barack Obama, George Bush and Bill Clinton, for example, often appear on pages that at least touch on the topic of “US President”. Google recognizes these relationships and can therefore also refer to pages in which the individual names mentioned appear, but not the term “US President”.

All this is not entirely new, but joins seamlessly into the holistic-semantic restructuring of the search engine. The patent and the entity database refine and complete Google’s understanding of terms. Ronald Reagan, for example, should be mentioned far more frequently as “US President” than as an actor. Although Reagan was an actor for much longer, Google “knows” that the term “Ronald Reagan” is almost always aimed at the term “US President”.

This is extremely important for online marketing, because it lets Google weight the content of pages better. For a query about “Ronald Reagan”, it will therefore prefer pages that also have something to say about “US President”. In other words, Google no longer relies on links alone to establish relationships between terms and topics; it has a database in which the relationships between words (or rather “entities”) are stored and can be retrieved. The whole thing works like a kind of entity PageRank – only much more powerful. Since this is only a patent, you should definitely continue to work on the link profile of your pages.


Latest SEO news


The seo-nerd regularly reviews SEO blogs and news pages for you. In calendar week 15 of 2018, he noticed the following:


When you enter a search term in Google search on a mobile device, page numbers (pagination) no longer appear at the bottom of the page; instead there is a “More results” button (see screenshot). Tapping it loads further results. The advantage: the further results now load faster, since they are integrated into the existing page.

Google’s “More results” button

For publishers, this creates new opportunities: even pages that appear further back in the search results (classically speaking: from page 2 of the results onwards) can probably now hope for clicks more often. By the way, Google does not rule out introducing the “More results” button for desktop search soon as well. That would mean nothing less than the end of page 2 of the SERPs.


In a hangout, John Müller (Google’s “contact point” for webmasters on SEO issues) explained why even well-done pages with strong content can sometimes lose rankings. The reason for this is that Google not only considers the quality of web pages, but also their relevance for the respective search query. And this can change over time.

Ranking losses do not always indicate errors or poor quality on a page. In some cases, the search behavior of users simply changes. An example would be the release of a new iPhone model: this would automatically change the weighting, and thus the relevance, of pages on the topic of smartphones. As soon as the hype around the newest model subsides, more general pages on the topic can be in demand again.

Müller nevertheless recommends taking a closer look at a page whenever its rankings fluctuate. Is all content still indexable? Are the outgoing links still correct? In addition, it is of course always useful to ask your own users for feedback on the site to find out what should be changed.


If you use images of a stock photo provider for your site, you should not use the provider’s search from an SEO perspective. You will achieve better rankings if you select a photo that already ranks well with your keyword in Google’s image search.

Use the “site:” command in Google’s image search like this:

“site:STOCK-PROVIDER-DOMAIN KEYWORD” (with a space between the domain and the keyword)

This way you’ll get the photo that matches your keyword best on Google. This strengthens your site for the keyword you are targeting. The reason: Google stores a kind of meta-image for each image. If the word “cat” also appears in the alt text, in the title of the image, in the description, and in the textual environment of the image, Google will serve this meta-image as a search result for the keyword “cat”.
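The query format described above can be illustrated with a short snippet that assembles such an image-search query. The stock provider domain is a placeholder, and the helper function names are invented for this sketch; only the `site:` operator and the `tbm=isch` image-search parameter are standard Google query conventions:

```python
from urllib.parse import quote_plus

def image_search_query(provider_domain, keyword):
    """Build a Google query restricted to one stock provider.
    Note the space between the site: operator part and the keyword."""
    return f"site:{provider_domain} {keyword}"

def image_search_url(provider_domain, keyword):
    # tbm=isch selects Google's image search tab.
    q = quote_plus(image_search_query(provider_domain, keyword))
    return f"https://www.google.com/search?tbm=isch&q={q}"

print(image_search_query("example-stock.com", "cat"))
# site:example-stock.com cat
```

Typing the first line’s output directly into Google’s image search achieves the same thing; the URL form is just a convenience.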

Therefore, make sure that you always label images as consistently as possible. If you used the cat picture in an article about tigers, lions and other big cats, for example, Google would at least be confused. You could still use the picture, of course. However, it would be advisable to add at least one more picture that ranks well for the term “big cat”. This way Google understands the intention of your article better, and your article has a better chance of ranking for the desired keyword.

By the way, be consistent when labeling images: your keyword should appear in the alt text, the title, the description, and the text around the image.


Ranking with two keywords (or more) on one page is no problem. The requirement is that both keywords match the topic of the page. This fact, already known in SEO circles since 2005, has now once again been confirmed by Google employee Aaseeh Marina in the Webmaster Central Help Forum.


Maccabees Update

The Maccabees Update is part of the Google strategy to boost the quality of webpages.

The Maccabees Update is part of a long tradition Google prefers to deny. The cat-and-mouse game follows a certain dramaturgy: SEO analysts observe ups and downs in rankings and assume a recent update to Google’s algorithms. Google plays down the fluctuations and rejects any rumours of a recent update. This is how the relationship between Google and SEO analysts has been for years. The reason is very simple: both sides focus on the same thing – the search engine result pages, also known as SERPs – but from different angles.

The dispute between Google and SEO Analysts

Google by now applies 1,000 to 1,500 updates per year. Usually these cause only minimal changes in the SERPs. Since the introduction of the Hummingbird algorithm with its focus on semantic, holistic search (between 2013 and 2015), Google has mainly published smaller updates. The reason is quite obvious: neither users nor most webmasters get nervous. The tactic of applying smaller updates is Google’s reaction to its experiences with huge updates in former years. Large penalty updates like Penguin and Panda in particular had massive impacts on the visibility of webpages and sometimes crashed them, causing the death of many smaller businesses.

SEO analysts consider themselves the lawyers of webmasters. They have a strong interest in the methods websites use to rank. The predominant opinion among SEO experts is that they need to be the ones to find out, because Google holds back core information. Yes, there are the Google Webmaster Guidelines. But even there you will only find very shallow rules about what webmasters should or should not do. Google is evasive when asked what is essential for a page to rank. To find out nevertheless, SEO analysts constantly trawl the SERPs for salient patterns. Ideally, they arrive at conclusions about which measures rank webpages successfully in the SERPs (and which do not). The Maccabees Update needs to be viewed with this in mind.

A short history of the Maccabees Update

Tweets by Google team member Danny Sullivan about the Maccabees Update

On December 12th, 2017, SEO analysts tracked noticeable ups and downs in the SERPs. They monitored losses of 20 to 30 percent, especially in organic traffic. The fluctuations were still trackable on December 13th and 14th. On his website Search Engine Roundtable, Barry Schwartz named the update “Maccabees” after the Jewish holiday of Hanukkah, which was being celebrated on those days. A short glimpse into the background of the name: the successful Maccabean Revolt, lasting from 167 to 160 B.C., enabled the ritual cleansing of the Second Temple in Jerusalem and re-established it as the place for Jewish worship. Hanukkah celebrates this part of Jewish history.

Not until December 20th, 2017 did Google comment on the update, reassuringly as usual. Google’s Search Liaison Danny Sullivan merely spoke of “some usual changes due to routine”. He did not confirm an update, which is why there is no official name for it.

The usual guessing game: What effects does the Maccabees Update have?

With Google denying the existence of a new update (again), the SEO community was left guessing (again) about what the update might be about. Roger Montti outlined the most frequent presumptions on Search Engine Journal:

    • Thesis 1: The Maccabees Update is part of the Mobile First strategy. The thesis was discarded very soon when it turned out that pages already optimized for mobile devices had also dropped in rankings.
    • Thesis 2: Maccabees affected visibility in desktop searches more strongly than on mobile devices. A very interesting thesis, but discarded too, due to a significant lack of evidence.
    • Thesis 3: It is a seasonal update connected with the upcoming Christmas season. The thesis was tempting but could not answer what exactly such an update would be meant to achieve, so this track was abandoned very soon as well.
    • Thesis 4: The update has something to do with links. This is a classic when talking about Google updates, and there are plenty of examples of Google proceeding against aggressive link strategies by now. Still, it does not fit the bigger picture of all the anomalies of the Maccabees Update combined.

Finally a trace: The Maccabees Update is a member of the Fred update family

Name giver Barry Schwartz concluded from his analysis that Maccabees shows

      • typical symptoms of the Fred Update.
      • a very noticeable pattern: the affected pages exist only to vary keywords.

Symptoms of Fred

Fred was never officially confirmed by Google. In March 2017, SEO analysts worldwide were busy finding out what the update was about. It soon became clear that especially websites that served only the purpose of earning money (through an overload of ads or affiliate links) had been punished. Characteristic features of those pages are

      • to contain content that is old, inferior, or simply too poor to serve the user’s search intent.
      • to overwhelmingly display ads.
      • to offer no service other than paid affiliate links to other shops.

Since the first rollout in March 2017, SEO analysts have been able to track several smaller follow-ups to the Fred Update. As webpages with very little relevant content fell victim to it, Fred is widely considered a Quality Update. The Maccabees Update falls into the same category.

No landing pages with keyword variants, please!

One of the most popular SEO strategies so far has been building separate landing pages for keyword variants. With the Maccabees Update, Google has obviously put a stop to this. The idea behind the strategy is indeed spammy: SEOs want to rank not only for one keyword but also for all its variants.

Here is an example: a craftsman wants his service listed on Google. He offers this service not only in Cologne but also in several other German cities such as Hannover, Frankfurt, and Berlin. He therefore builds a subpage for each city following the pattern:

Keyword = SERVICE + keyword variant = NAME OF CITY

The service is exactly the same in every city, so the pages’ content varies only in the name of the city. Since Google tries to take the user’s view and aims to deliver relevant search results, the strategy of landing pages with keyword variants is unnecessary and even annoying.
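The near-duplicate nature of such city landing pages is easy to demonstrate: swap out the city name and almost nothing changes. A toy similarity check (all text and names here are invented for illustration, using Python’s standard `difflib` module):

```python
from difflib import SequenceMatcher

# Invented template text for the sketch: only {city} varies per page.
TEMPLATE = ("We offer professional plumbing in {city}. "
            "Our team repairs pipes, fits bathrooms and is available "
            "around the clock in {city}.")

def doorway_page(city):
    """Generate a keyword-variant landing page: only the city changes."""
    return TEMPLATE.format(city=city)

def similarity(a, b):
    """Rough textual similarity between two pages, in [0.0, 1.0]."""
    return SequenceMatcher(None, a, b).ratio()

cologne = doorway_page("Cologne")
berlin = doorway_page("Berlin")
print(round(similarity(cologne, berlin), 2))  # close to 1.0 -> near-duplicates
```

A similarity close to 1.0 is exactly the pattern a quality update can pick up on: many URLs, essentially one piece of content.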

According to Google, users should only be shown search results that match their search intent. It is therefore perfectly appropriate to explain the SERVICE once – but really well, and with only as many words as necessary. Explaining the SERVICE again and again on different subpages confuses Google: which page should rank in the SERPs? The Maccabees Update has an answer to that: NONE.

How to handle keyword variants

Mid- and long-tail keywords will become more and more interesting for SEO competition. Voice search with tools like Alexa, Cortana, and Siri is the reason: spoken queries tend to be longer. Webmasters who embrace this major change will have the best chances to dominate rankings in the future. Still, not every single mid-tail or long-tail keyword needs its own subpage.

The better strategy is to think in topics. For our example, this means writing one page for the SERVICE and mentioning the cities in which the SERVICE is also available.

It can still be appropriate to produce several different landing pages if the service genuinely varies from city to city. The same applies to products or product categories. Users get the basic information they need when the product is explained adequately on just one page; it is not necessary to copy the text for several different target groups just to serve the keyword PRODUCT + TARGET GROUP.

Dominic Woodman presented an analysis on Moz that comes to a very similar conclusion. He does not limit the effects of the update to pages that only use keyword variations to rank; he also sees an impact on pages dealing with a topic that has already been widely covered by another article. Woodman is not taking a counter-position to Schwartz, but extends his thesis with another aspect.

Conclusion: The Maccabees Update perfectly goes along with Google’s strategy of the past years

The main aim of this strategy is to rank pages higher that contain relevant content for search queries, while pages that obviously only want to attract users’ attention with keyword tactics are pushed to the bottom. If your texts succeed in describing your services and products well and also manage to answer customers’ questions, you do not need to worry about the Maccabees Update. Even better: you do not need to worry about upcoming Quality Updates either, because you are obviously already on the right track.