Search Engines & SEO Blog

Google Quality-Update: initial data

By rolling out a new algorithm update during the night, after it had been quiet on that front for a while, Google likely caused quite a bit of amazement among German SEOs this morning. They also published a blog post on the subject to shed some light on the reasoning behind this update, which is to fight webspam of any kind whatsoever. In that respect, this update is on par with the Panda update. While these changes should only be relevant for about 3% of all search queries, it can be assumed that extremely competitive search queries are disproportionately often the target, which will make the “perceived” rate of change much higher in the SEO community.

To evaluate the impact on the SERPs, we ran a special analysis of our keyword base this morning. We were unable to make out any major fluctuations in the SERPs this time, which means you will need to get used to the way things are now, at least until the next iteration of the underlying algorithm rolls out. Here are the winning and losing domains for last week, based on their VisibilityIndex score:

At first glance, it seems that Google got exactly the results they were hoping for: sites with high-quality content are returned more often, while domains with a multitude of affiliate links and affiliate content tend to show up less. We can surely expect Google to fine-tune their filters multiple times in the coming weeks and months, but their line of approach is set.

The next few days will likely spawn many more articles on possible causes and reasons for the changes in ranking. One thing that should already be clear is that, in this case, Google is not betting all their chips on one signal, but weighing and evaluating a host of different references. This makes it harder and harder to gain a ranking advantage without having the corresponding “substance”, in terms of content and user behavior, to back it up.

Additional posts on this subject (in German):

Johannes Beus - 25.04.2012 12:00

Google Social Search & Social Connections

The much discussed assumption that Google will value social signals more highly in the future has also given momentum to discussions about Google fake accounts. This led me to take a look at the kinds of connections that Google already recognizes and uses in their Social Search.

Google explains that an online review by a friend might be more relevant than one by a stranger, an assumption that can hardly be denied. Accordingly, results from established connections to friends are highlighted, though this is not always apparent, as not all of your friends will be actively using Google Plus to publicly review or share content. This also means it is not necessary to have a Google Plus account to influence Social Search, as there are a multitude of factors at play:

  • Blogs, websites and other content that a user has created or shared

  • Pictures that have been shared by your own social connections

  • Google Reader subscriptions

  • All known web-profiles like Twitter or Flickr

  • Content that has been recommended or shared through the use of the +1 button in the SERPs

What's interesting is the way in which your own social connections are identified. There is, once again, more to it than just Google Plus. All you need to receive personalized results in Social Search is a Google Mail account.

Social Connections

By using the contacts in your Google Mail account, Google can identify further connections which, in turn, become part of the Social Search results. This means I will also get recommendations from the friends of my friends, something that can hardly be controlled anymore, but which also opens up the opportunity of influencing what third parties see. If the average Google Mail account had 100 contacts, then by sharing a link I could influence my 100 contacts directly, but also the roughly 100 contacts each of them has, for a total of 10,000 people. On the flip side, this also means that those 10,000 users would be influencing my own Social Search results. This is quite an impressive number, and the actual number is likely much higher still, since Google is also able to identify the contacts on my Twitter account, for example, which are then added to my social connections as well. Incidentally, Twitter is once again in a growth spurt in Germany and has crossed the 4 million user threshold for the first time.
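The back-of-the-envelope estimate above can be written out as a short calculation. The contact counts are illustrative assumptions from the text, not real figures:

```python
# Rough reach estimate for shared content in Social Search.
# Assumption (from the text): a Google Mail account has ~100 contacts.
contacts_per_account = 100

# First degree: my own contacts see what I share.
first_degree = contacts_per_account

# Second degree: each of my contacts has ~100 contacts of their own,
# whose Social Search results my shares can reach indirectly.
second_degree = contacts_per_account * contacts_per_account

print(first_degree)   # 100
print(second_degree)  # 10000
```

In reality the second-degree sets overlap heavily (friends of friends are often already my friends), so this is an upper bound rather than an exact count.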

As far as Google is concerned, they divide your connections into public (e.g. Twitter) and private (e.g. Google Chat) contacts.

If you are interested in why someone shows up in your Social Search results, you can hover your mouse over that person's name or use the link to get to the Social Connections summary page (the link will only work if you are logged in).

Quite the nice feature, but it also gives me personal insight into the large number of factors that need to be kept in mind when setting up a Google fake account. The difficulty increases even more if you are planning on setting up a “subject” or “expert” profile to increase the thematic quality of the profile, as you will need to pick connections that also have the right topical focus. While this is not an impossible feat, it surely will be a challenge, and I am already looking forward to the first creative talks and articles on the subject of “setting up social connections”.

Anna-Lena Radünz - 23.04.2012 09:30

Google vs. Linktrade, part 42

The trade in links, in order to manipulate the Google result pages, is nearly as old as the search engine itself. While it started out as a reasonably small market segment with a relatively small effect on the SERPs, it seems that 2007 was the year Google decided it was a worrisome development. From then on, they have been fighting a relentless war against both link sellers and link buyers. “Google's conscience”, appearing most of the time in the form of Matt Cutts, threatens, pleads, intimidates – and at the end of the performance, nothing has changed and we get to see the same play again next year.

A couple of weeks ago, that time had come once again: this year, Google decided to add more of a personal touch and addressed a very large number of Google Webmaster Tools accounts with a short greeting, in which they speak of unnatural links pointing at the webmasters' websites. They did this without getting into specifics (of course), deliberately opting to stay on the ominous side. They encourage webmasters to get rid of those unnatural links and then ask Google nicely for reconsideration. This measure was accompanied by vague insinuations that links would be evaluated differently in the future, as well as by Matt Cutts' musings on the subject of overoptimization of websites. To make sure that everybody is also aware of Google's resolve in the matter, they decided to sacrifice some pawns (appropriately enough in the leadgen sector, which, in the future, will surely be of no interest to Google).

When I look at the reactions this year's show provoked within the SEO sector, it seems that Google managed to be more successful with this approach than in years prior. Google's deftly scattered remarks are often incorporated without a second thought, links are removed and link sellers are ratted out in the Webmaster Tools under the guise of self-protection. This, of course, leaves a lasting impression, even on those who, at the moment, do not see a cause for action: maybe there is something to these threats and I might be better off not buying this link?

Personally, I am looking forward to seeing what play Google will put on next year.

Johannes Beus - 17.04.2012 09:18

Geotargeting for Websites

Today, while looking at this week's “Movers & Shakers” for Germany, I noticed the websites of the car-rental company Hertz. One of their domains shows up among the Top-3 winners, while the other is in the Top 3 on the losing side. This is a great example of how Google deals with geotargeting for websites.

Movers & Shakers excerpt for CW 16, 2012 (Germany)

First, let's take a look at the VisibilityIndex history for hertz.de from 2009 on. We can see a rather nice, steady increase in visibility, but also some remarkable drops, especially since August of 2011.

VisibilityIndex history for hertz.de (Germany)

Oftentimes, we can explain large breaks in visibility with a failed relaunch (with insufficient 301 redirects) or a Google penalty. Seeing how we have a continuous pattern of ups and downs on hertz.de, we can be relatively certain that neither of these applies here.

These zig-zag patterns in visibility are usually indicative of a domain-wide duplicate-content problem. We can confirm this theory by looking at the following screenshot, which compares the VisibilityIndex history for hertz.de to that of hertz.ch.

Comparison of the VisibilityIndex histories for hertz.de and hertz.ch (Germany)

Whenever we see a drop in the visibility of hertz.de, we see a simultaneous increase in the visibility score of hertz.ch, and vice versa. It becomes apparent that Google is repeatedly having a hard time allocating these two domains to the correct countries, Germany and Switzerland. For Hertz, this is problematic for two reasons. First, during those weeks, the results for Hertz show up less often in the SERPs. Second, the CTR in Germany should be lower for results on hertz.ch than for those on hertz.de. When we take a look at the Toolbox data for Switzerland, we are treated to the same spectacle, with the only difference being the reversed positions of hertz.de and hertz.ch.

A possible explanation for why Google continuously has problems assigning the correct geographic targets to both domains can be found by looking at the backlink profile of hertz.ch. There, we can see that the website's links are not predominantly coming from other Swiss sites; a similar number of strong links is coming in from Germany, too.

Link overview for hertz.ch

The .de and .ch domains are ccTLDs (country-code top-level domains), which means that you are unable to change the geographic target for these domains in the Google Webmaster Tools (GWT). This can only be done for generic domains like .com or .eu, or for individual subdomains and directories. Google notes that the country-specific domain ending is one of the signals they use when figuring out the geographic orientation of a website; others are the server location (the server's IP address) and additional indicators. The above example for Hertz shows that there is still potential for problems when it comes to geotargeting. This goes hand in hand with Google's statement that “Geotargeting isn’t an exact science, so it's important to consider users who land on the "wrong" version of your site.” It is also important to note that Google has announced that they do not use any code-level language information such as lang attributes, or meta tags such as geo.position or distribution.

Going back to the Hertz case, one way to fix the problem is to use the rel="alternate" hreflang="x" annotations (not to be confused with the lang attribute) in the HTML <head> of the website; the hreflang attribute can be used across domains. This means that you can use the hreflang values de-DE (German content for German users) and de-CH (German content for Swiss users), for example. This enables the website operator to point to the correct version of the website, and it becomes easier for Google to understand the relationship between the respective domains. Detailed information on how to use hreflang can be found on Google's Webmaster Tools help page.
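A minimal sketch of what such annotations could look like in the <head> of each page, using the de-DE/de-CH values mentioned above (the URLs are illustrative placeholders, not Hertz's actual page structure; each language version should carry the full set of annotations, including one pointing to itself):

```html
<head>
  <!-- German content for users in Germany -->
  <link rel="alternate" hreflang="de-DE" href="http://www.hertz.de/" />
  <!-- German content for users in Switzerland -->
  <link rel="alternate" hreflang="de-CH" href="http://www.hertz.ch/" />
</head>
```

Because hreflang works across domains, this lets the two ccTLDs reference each other explicitly instead of leaving Google to infer the country assignment from links and server location alone.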

Hanns Kronenberg - 15.04.2012 23:27

Social Signals everywhere!

Looking back at SMX Munich, you will notice that there were very few presentations that did not at least touch on “Social Signals”. This surprised me a bit, seeing that the question of whether social signals are relevant for search engine optimization was, one year ago, cause for controversy among SEOs. I remember talking to SEOs who disputed the idea that these signals had any relevancy whatsoever and who were sure that “SEO can be done without Social Media”. I started getting into Social SEO in 2009 and I am pretty sure of its increasing relevancy, especially now that Google has made its Google Plus network their highest priority. Another factor to keep an eye on is the new social media star Pinterest, which is not only one of the top traffic boosters, but is also “still” handing out follow links.

A few weeks ago, I started listening to the “Google Story” as an audiobook, which took me back to Google's beginnings. With everything we do in SEO, we should never forget Google's main idea: “Return the most relevant results for the user's search query”. Originally, the web was a place where users needed to know a website to get to it – or they got there via a link from a website they already knew that recommended the site. This means links are basically a form of recommendation, which Google uses to calculate the relevancy of a website. There was no other “currency” out there – that is, until social networks came along. There, users recommend content on the web all day long, which makes it seem natural for Google to incorporate these recommendations into their algorithms, “to give the users the most relevant results for their search queries”.

This means that, sooner or later, everyone who has anything to do with search engine optimization will be unable to avoid the subject of social media – for some keywords more than for others. Granted, it is not likely that every subject matter will be discussed on social networks in the future, but many topics are already being discussed and “recommended” extensively, many times a day. There are numerous companies that were only able to increase their visibility and reach because their content was shared through social media. Not every product is being searched for actively; it may just be that we do not yet know it exists. Social networks make it possible to create demand and then spread it virally. This viral distribution is the high art of creating reach, inspiring discussions and, by doing so, generating links, because others will report on it. SEO and social media are moving ever closer together.

As far as this relationship is concerned, I learned to love a special function in the SISTRIX Toolbox.

In the Toolbox (only accessible when logged in), you are shown the “Top Social Signal” winners for the day. This makes it a great compendium of best-practice examples for good social media campaigns and viral distribution. One feature that I enjoy immensely is the visibility history combined with social signals.

At first, you notice the usual suspects (YouTube, Facebook, BBC, Stern...), but you have the ability to sort by network. It gets especially interesting when you sort by Google Plus, as you will notice which companies are already successfully using Google's new service. At the moment, Google Plus best cases are still quite rare.

I decided to take Pizza Hut as an example, as I asked myself where they got all their Google Plus ones from. When I looked into the matter, Pizza Hut had managed to gather a total of 850 Google Plus votes in the SISTRIX “Social Winners” list. Pizza Hut does have a Google Plus account, but one with no posts on it, and one which is not linked from their own website.

Daily Social Signals winners (sorted by number of total Google+ votes)

In conclusion, this would mean that the controversial Google Plus button on the Google SERPs is actually being used. Personally, I was always skeptical of this, because who would give a “+1” to a site they have not even looked at, or who comes back to Google just to click the “+1” button? Now I know that, for Pizza Hut at least, 850 users do.

+1 button for Pizza Hut on the SERPs

Google can use these votes to identify brands, seeing that it is rather likely that only well-known brands, with which users identify or which they love, will be able to reach these numbers of “+1” votes.

Just in case you are wondering: all those tweets that Pizza Hut managed to rack up to become one of the Social Winners for that day are due to the ability to tweet about a successful online order from Pizza Hut:

Another interesting example is the US TV series “Smash”. As a short intermission, this is a good place to call attention to the possibility of sorting the results by country, which is surely another way to arrive at interesting conclusions about the social winners.

Now back to Smash. If you click on the + behind the domain in the Toolbox, you are shown a graphical representation of the history of the social reach. This is also where you will notice that the social reach for Smash increased greatly with the start of its first season.

Chronological history of social signals for Smash (Source: SISTRIX Toolbox)

This may seem a little like the age-old question of which came first, the chicken or the egg. I would venture the guess that the strong viral distribution is part of why the series is so successful and has such reach. A series without continued success would generate a lot of links at the beginning but would stagnate over time. The idea that the social votes mirror the series' success can be confirmed by looking at the numbers for the series on Wikipedia.

More evidence of the series' success can be found in the SISTRIX Toolbox: the page for the series is one of the top social media URLs for its domain.

Anna-Lena Radünz - 11.04.2012 10:30

SEO Regulars' Table Bonn on 04.26.2012

Now that the SEO Campixx in Berlin and the SMX Munich are over, April gives us a great opportunity to get the next SEO regulars' table in Bonn on its way. The plan is to have a cozy get-together on Thursday, April 26th, 2012. As always, the regulars' table will start at 7 pm CET. Everyone interested in SEO is cordially invited to attend.

To sign up, please use this form. We will send you all the necessary information about the location a few days prior to the event. Please remember to sign up soon, as there is an attendance limit of 50 people.

Hanns Kronenberg - 11.04.2012 09:34


Some of you might have noticed that there have been some changes this week in the VisibilityIndex scores of both keywords and websites in the local search. This is due to the way the Toolbox handles local hybrid results (blended Places search).

These examples should make it easier to understand the difference between a normal Universal Search integration and a local hybrid result: here is the example of hybrid local results, and here is the example of a Universal Search integration. At the moment, the only way to differentiate between the two types, both visually and in the source code, is through the heading “Places for ...”. When we look at the second example above, “radiologie hamburg” (radiology in Hamburg), we are shown the heading “Places for radiology near Hamburg”. Until a short time ago, Google also added a snippet to the hybrid local results, identical to the organic results, which made differentiating the two much easier. This snippet has now disappeared from all hybrid local results altogether.

There is another important difference between these two result types: when it comes to hybrid local results, there is a very high chance that Google will show an organic result as the first result in that block, but display it like they would a Places result. Oftentimes there is more than one organic result in these blocks.

How can we tell? Simple: just run the same search on T-Online or AOL. Both are in a search partnership with Google, but they will always display the results without a Universal Search integration, meaning they show the pure, unaltered organic results. If you want to read more on this, here and here are two blog posts I wrote on the matter.

As a fresh example, let me use today's search results from both Google and T-Online for “sprachschule münchen” (language schools in Munich). I put both results in a table to make comparing them more accessible.

As you can see, the first four results of the hybrid local results are identical to the organic results we get from T-Online, and the fifth result can also be found in T-Online's Top 10. It seems that Google was able to clearly identify a specific Places profile for each of these five organic results, which they then combined into one block of hybrid search results.

When we look at positions 6 and 7, we notice that they are absent from the T-Online Top 10. It is likely that Google padded the “7-pack” of hybrid local results with additional Places results. Results 8 (prisma-ev.de), 9, 10, 11 and 12 can, once again, be found in T-Online's Top 10.

When we add the hybrid local results to the mix, we notice that the Google SERPs have two more results than the SERPs of T-Online. These two results exactly match the number of Places results that Google is presumably adding. Apart from that, both SERPs are identical.
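The comparison described above can be sketched as a simple list diff: any result that appears in Google's SERP but not in the partner's pure organic SERP is presumably a padded-in Places result. The domain lists here are illustrative placeholders, not the actual SERPs from the example:

```python
# Sketch: compare Google's hybrid SERP against a search partner's SERP
# (e.g. T-Online) to spot results that only appear in the hybrid local block.
google_serp = ["a.de", "b.de", "c.de", "places-1.de", "d.de", "e.de"]
partner_serp = ["a.de", "b.de", "c.de", "d.de", "e.de"]

# Results present only on Google are presumably padded-in Places results.
places_only = [d for d in google_serp if d not in partner_serp]

# Results present on both sides are genuine organic rankings,
# even when Google displays them inside the local block.
organic = [d for d in google_serp if d in partner_serp]

print(places_only)  # ['places-1.de']
print(organic)      # ['a.de', 'b.de', 'c.de', 'd.de', 'e.de']
```

This is exactly the reasoning used in the “sprachschule münchen” example: five of the seven block entries survive the diff and are therefore counted as organic.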

This leaves us with a giant question mark: how should we handle hybrid local results in the Toolbox? Seeing how our customers use the data on a day-to-day basis and report it to their own customers, we want to hear what they have to say. This means that today is your day: we want to hear your thoughts on the matter.

We believe it would be a mistake to label these hybrid local results entirely as a pure Universal Search integration without counting them as organic rankings. Such a site obviously holds an organic Top-10 ranking for “sprachschule münchen”, even when Google decides to display this result differently from other organic results. If we did not count these results, many SEOs would certainly be at a loss for words when asked why the website they are in charge of cannot be found in the organic Top-10 rankings for important keywords. For that reason, we started to incorporate them this week as a test run.

When we treat the hybrid local results as organic results, we will count some results that are not really organic results at all. Such results are shown only because Google needs to fill up a “7-pack”, or because they believe they can guess the city of origin for a non-city-specific search query. We are not entirely happy with these results, and they are also the reason for the changes in VisibilityIndex scores for the Movers & Shakers that could be observed this week.

It is not really possible to clearly attribute these results to one category or the other because, as the name already states, these are hybrid results, and there are some exceptions on top of that. The trend seems to be that hybrid local results have become important, even though, at the moment, Google is obviously still experimenting a lot with the integration and the way the local results are displayed. Google's so-called Venice update also seems to be indicative of this.

Our suggestion is to count the hybrid local results as both organic results and Universal Search results, and to clearly label them as such in the Toolbox.

Ultimately, it seems that we now have a third established result type besides organic results and Universal Search results, one that does not fit into any classification system we SEOs and SEO tools have at present. We are looking forward to your input.

Hanns Kronenberg - 05.03.2012 14:45

Brave new Signal-World

We had just gotten used to the idea that SEO does not only mean the mandatory listing of all meta keywords, but also consists of link building, and already the world has turned again: new signals like user behavior and social media data now take the high seat in the public's perception. And just in case this wasn't enough, Google has created a smokescreen with their monthly blog posts, which regularly makes it harder to focus on what's really important. This also leads to interesting discussions in numerous blogs and networks. I want to use this posting to add some points to the discussion at large.

It might sometimes be hard to remember with all the new features and verticals coming out all the time, but Google is still a full-text search engine. I don't want to go on and on about the basics, but I believe they can be quite helpful in comprehending certain relationships. Google uses a webcrawler that goes through large parts of the public Internet and uses the words it finds there to fill its index. Now, when someone comes to Google for advice, Google first looks at its index to find the sites where the queried word is actually present. Depending on the query, this may be a list with a few million URLs. Only then, in a second step, does Google use its ominous algorithm, the one we deal with on a daily basis. Google sorts the list of URLs from step one with the help of a presumably huge set of rules and processes, just to show us the first 10 results.

To actually make it into the algorithmic sorting, two preconditions have to be met: first, Google needs to have crawled the site and saved it in its index, and second, Google needs to classify the site as relevant for the particular search query. The first condition can usually be achieved with a solid page layout: use an orderly information structure and sensible internal linking to show the Google crawler the way. As far as the second condition is concerned, Google will use a rather simple indicator 99% of the time: the word (or a synonym) that is being searched for can be found on the page or within the title of the page. Only once these conditions are met do we get to the sorting and ranking of URLs. So how do user behavior and social network signals fit into this system?
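The two-step process described above, filter the index for pages that contain the query term at all, then rank only that candidate set, can be sketched with a toy inverted index. The index contents and the scoring values are invented for illustration; Google's actual signals are of course vastly more complex:

```python
# Toy full-text search: filter first, rank second.
# Inverted index: term -> pages containing that term (illustrative data).
index = {
    "hotels": ["site-a.example", "site-b.example", "site-c.example"],
    "munich": ["site-b.example", "site-c.example", "site-d.example"],
}

# Invented popularity scores standing in for the real ranking signals.
scores = {"site-a.example": 0.4, "site-b.example": 0.9,
          "site-c.example": 0.7, "site-d.example": 0.2}

def search(query_terms):
    # Step 1: candidate set = pages containing every query term.
    candidates = set(index.get(query_terms[0], []))
    for term in query_terms[1:]:
        candidates &= set(index.get(term, []))
    # Step 2: only now apply the ranking function to the candidates.
    return sorted(candidates, key=lambda d: scores[d], reverse=True)

print(search(["hotels", "munich"]))  # ['site-b.example', 'site-c.example']
```

The key point the sketch illustrates: a page that fails step 1 (not indexed, or missing the query term) never even reaches the ranking step, no matter how strong its other signals are.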

I am rather certain that Google will only use these two signals during the last step, the sorting of results. And even there we see obvious difficulties, which is likely the reason why these two factors don't carry much weight in the algorithm at the moment. When we look at user behavior, you notice that the fun only starts once you put it in relation to the actual search query: meaning a bounce rate for one URL for one keyword, instead of a global bounce rate for the domain. If we take a look at the click rates on the Google results pages, it quickly becomes apparent that the click rate takes a massive plunge once you are past the first page of results. This means that Google will not be able to get much meaningful user data from there, and the further we go towards the long tail, the more inadequate the coverage becomes. By implication, this actually means that this signal could be used to decide whether to rank a site on position 3 or 4 (“ReRanking”), while it will clearly be unable to help with the decision of whether the site belongs in the top 10 or the top 1,000 at all.
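The "ReRanking" idea, using user data only to fine-tune neighboring positions rather than to build the ranking, can be illustrated with a small sketch. This is purely hypothetical: the per-keyword bounce rates and the swap rule are invented for illustration and are not Google's actual mechanism:

```python
# Hypothetical "ReRanking": swap two adjacent results only when the user
# data for the lower one is clearly better. Bounce rates are invented
# per (keyword, URL) figures, as discussed in the text.
results = ["url-1", "url-2", "url-3", "url-4"]
bounce_rate = {"url-1": 0.30, "url-2": 0.35, "url-3": 0.80, "url-4": 0.40}

def rerank(serp, bounce, threshold=0.2):
    serp = list(serp)
    for i in range(len(serp) - 1):
        # Promote the lower result only if its bounce rate is markedly better;
        # small differences leave the original (link-based) order untouched.
        if bounce[serp[i]] - bounce[serp[i + 1]] > threshold:
            serp[i], serp[i + 1] = serp[i + 1], serp[i]
    return serp

print(rerank(results, bounce_rate))  # ['url-1', 'url-2', 'url-4', 'url-3']
```

Note what the sketch cannot do: it only shuffles an existing top list; it has no way to pull a page from position 500 into the top 10, which matches the argument that user signals can refine but not build the ranking.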

When we look at the social signals, the situation is even more deadlocked: at the moment, Google does not have a reliable source for this data. After the contract with Twitter for a complete feed of all tweets was canceled, Twitter converted their system to replace all URLs on publicly available pages with their own URL shortener and to set them to “nofollow”. And the relationship between Facebook and Google could hardly be called so friendly that Facebook would home-deliver the necessary data to their competitor. All that is left as a possible source is Google+. We have been gathering the signals for URLs for a while now, and it is impossible to make out a trend that Google+ is actually being used more. A new Spiegel Online article, for example, has 1,608 mentions on Facebook, 227 tweets and a whopping 3 Google+ votes. Not exactly what you would call a solid foundation for an elementary part of an algorithm that is responsible for 95% of the revenues of a publicly traded company. So, how can we measure the significance a ranking signal has in Google's algorithm? When Google starts to publicly warn people against manipulating these signals, then it is about time to start giving these signals some thought ...

Johannes Beus - 29.02.2012 09:46

SEO-Qualification: What does the future hold?

Even though SEO as a topic has been around for more than ten years, and while it has found a regular place in the marketing mix, there is still a stark disparity between the SEO sector and other disciplines like SEM or display marketing. On the one hand, we have a comparatively large number of people getting into the discipline, thanks to low entry requirements coupled with a rather flat initial learning curve. On the other hand, we have a very dynamic environment that requires a lot of specialized knowledge and experience to thank for the fact that the circle of really good 'all-round SEOs' is not that large. I believe everyone can agree that, even though we get a “SEO will die out tomorrow” article every quarter, search engines will still be around in the next few years. This means that the question of how to meet the demand for good SEOs is an important one for our sector. But where to take them from?

You will often hear those who propose the 'classic' way: read the massive library of available SEO knowledge on the Internet, run your own projects, keep learning and repeat this process until the results are what you want them to be. I can't say that I don't hold a certain sympathy for this procedure, seeing how I got my SEO knowledge in exactly the same way. It was thrilling, I learned a lot of new things and had a lot of fun along the way. But there are also drawbacks: it takes quite a while, so its duration cannot be compared to the time frame of an apprenticeship, and it is not necessarily the right option for every type of learner.

It is rather obvious that, when it comes to many areas of knowledge, each new SEO does not have to reinvent the wheel; they can instead be taught knowledge that is already available and established in a compact and time-saving manner. While this has already been happening in the larger SEO agencies for a few years now, where they train their new employees themselves, we are starting to see initiatives that go beyond single companies. Mario Fischer should be mentioned with his engagement at the FH Würzburg, but veteran SEO Gerald Steffens also wants to push the market forward with his new “Akademie für Fortbildung in Suchmaschinenoptimierung” (academy for advanced training in search engine optimization – whose name itself is already optimized for search engines), which started today.

Personally, I approve of such initiatives, because I believe that the SEO sector needs to become more professional, especially here in Germany. While other sectors have already taken this step, outsiders still view the “amateurish SEO krauts” with some resentment. Every step in the direction of a higher level of professionalism in day-to-day SEO work is welcome. I am interested to see how this new offer is taken up and wish Gerald the best of success.

Johannes Beus - 19.02.2012 19:24

SEO: a perceived need for action

When I take a look at my Twitter stream or catch up on the latest SEO news, I temporarily get the feeling that the SEO world is spinning faster and faster: it feels as though every minute there is a new format, vertical, integration, algorithm change or improvement. This news is usually worded in such a way that each and every professional SEO will feel they have to get on top of every one of these new features, just so they are not greeted by a position in Google Hades beyond position 100 the very next day. For the benefit of the lasting mental health of our kind readers, let me point out a different perspective:

More than 10 years ago, when Google set out to reinvent the world of search engines, they managed to revolutionize the field by using links in their ranking calculations: the SERPs saw a massive increase in quality and it didn't take long for their market share to reach numbers close to those you see in elections in North Korea. In the beginning, you could see massive changes in the SERPs during the regular “Google Dances” – Google needed to experiment with a large number of factors and dampening values (buffers) to figure out the right balance between them. After a few years, Google had a good handle on this and managed to get closer and closer to their desired optimum; consequently, the occasions where they needed to change anything became less frequent. Over the last couple of years, Google has also been noble enough to mostly hold back on making huge changes to their algorithms that would have an impact on large parts of the index. This explains why an update like Panda, which only affects a small percentage of sites, causes such a big commotion, while its reach can't hold a candle to something like the Florida update (2003).

Those who want to be successful in today's index basically optimize for the same things they would have 5 years ago: a clean, well-structured site using fitting keywords, as well as getting the highest-quality links possible, are still the way to go on the road to SEO success. For that reason, an SEO should focus on the above – all the other topics, be it social signals or any of the smoke that Google keeps blowing in everyone's faces on a regular basis, should not cloud your view of the factors that really matter.

Johannes Beus - 14.02.2012 12:09
