You might sometimes notice a fall in your organic traffic, and there can be many causes for this.
Many studies confirm it: SEO is today one of the most powerful web marketing levers. But although organic search brings many benefits, it has to be recognised that certain factors can jeopardise it and make a strategy that had been working so well fail, at least on paper. Here is a recap of the 9 main reasons why your SEO strategy isn't getting the expected results and why your website is showing a decrease in organic traffic.
How to diagnose a fall in organic Google traffic
Anyone talking about deploying a strategy is obviously talking about KPIs, and about tools for measuring its efficiency, identifying obstacles more easily and intervening as quickly as possible to correct them. For this reason, we cannot recommend strongly enough that you use these two unbeatable Google tools, and combine them with others to get a 360° view of your performance.
Google Analytics for an overall view of your traffic and conversions
Your traffic and sales are excellent indicators of the quality of your SEO strategy. Google Analytics allows you to measure the leads generated by organic traffic, as well as the revenue it brings in. Identifying the pages which rank best and drive the most traffic to your website allows you to redirect your marketing strategy accordingly. Ideally, design your own dashboards according to your objectives and segment your audience so you can see which targets to prioritise in order to increase your results.
From the launch of your website, or during a redesign, it is therefore essential to integrate the Analytics tracking code immediately, so that you start collecting data as quickly as possible.
If you notice a significant decrease in traffic, you first have to determine whether it is mainly your SEO traffic that is affected, only certain pages of your website, or your traffic more generally. The measures to take differ from case to case.
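As a minimal sketch of this triage (assuming you have exported daily sessions per acquisition channel from Analytics; the channel names and figures below are hypothetical), you could compare each channel before and after the drop to see whether organic is the outlier:

```python
from collections import defaultdict

def traffic_change_by_channel(rows):
    """Compare sessions per channel between two periods.

    rows: iterable of (period, channel, sessions) tuples, where
    period is "before" or "after". Returns {channel: pct_change}.
    """
    totals = defaultdict(lambda: {"before": 0, "after": 0})
    for period, channel, sessions in rows:
        totals[channel][period] += sessions
    changes = {}
    for channel, t in totals.items():
        if t["before"]:
            changes[channel] = round(100 * (t["after"] - t["before"]) / t["before"], 1)
    return changes

# Hypothetical export: organic fell sharply while other channels held steady,
# which points at an SEO-specific problem rather than a site-wide one.
rows = [
    ("before", "organic", 1000), ("after", "organic", 550),
    ("before", "direct", 300), ("after", "direct", 290),
    ("before", "referral", 200), ("after", "referral", 210),
]
print(traffic_change_by_channel(rows))  # organic down 45%, others stable
```

If every channel falls together, look at site-wide causes (accessibility, seasonality) rather than SEO alone.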
Google Search Console for monitoring the development of your KPIs
Indispensable for any SEO practitioner, Google Search Console (formerly Google Webmaster Tools) allows you to keep a permanent eye on the evolution of the KPIs you have defined in advance and to intervene on your website in just a few clicks. It is also the only tool which gives the true number of clicks coming from organic traffic.
You can know precisely:
- the keywords your website ranks for, and those which generate the most traffic;
- how your rankings evolve on your strategic keywords;
- the CTR (click-through rate) of your various links. If the percentage of clicks is low, that may mean you need to optimise your Title and Meta tags;
- the backlinks pointing to your website, so you can evaluate the quality of your netlinking.
Via this tool, you can very quickly see whether your website has errors in its sitemap.xml or robots.txt files, has been hit by a Google penalty, or has even been hijacked.
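As an illustration of the CTR point above (the URLs, thresholds and figures are invented, not taken from any real export), a low-CTR filter over a Search Console performance export could look like this:

```python
def low_ctr_pages(pages, min_impressions=100, ctr_threshold=0.02):
    """Flag pages whose click-through rate suggests weak Title/Meta tags.

    pages: list of (url, clicks, impressions) tuples from a
    Search Console performance export (hypothetical data below).
    Pages with too few impressions are skipped: their CTR is noise.
    """
    flagged = []
    for url, clicks, impressions in pages:
        if impressions >= min_impressions and clicks / impressions < ctr_threshold:
            flagged.append((url, round(clicks / impressions, 3)))
    return flagged

pages = [
    ("/pricing", 4, 500),     # CTR 0.008 -> snippet worth rewriting
    ("/blog/seo", 90, 1200),  # CTR 0.075 -> healthy
    ("/contact", 1, 40),      # too few impressions to judge
]
print(low_ctr_pages(pages))  # [('/pricing', 0.008)]
```

The 2% threshold is an assumption for the example; a sensible cut-off depends on the page's average position, since CTR falls naturally as rank drops.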
Position monitoring tools
If you are used to monitoring your positions daily, the advantage is that you can identify a decrease in traffic very quickly and remedy the situation immediately. To carry out this check, we advise you to list the 20 strategic key phrases which generate the most traffic. If you notice significant fluctuations two days in a row, sound the alarm immediately!
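A sketch of that two-days-in-a-row rule (the keywords, positions and drop threshold are invented for the example) might look like this:

```python
def rank_alerts(history, max_drop=3):
    """Flag keywords whose position worsened on two consecutive days.

    history: {keyword: [pos_day1, pos_day2, pos_day3, ...]}, where a
    lower position is better. A keyword is flagged when it slid on
    both of the last two days and the total slide is >= max_drop.
    """
    alerts = []
    for keyword, positions in history.items():
        if len(positions) >= 3:
            d1 = positions[-2] - positions[-3]  # slide on day n-1
            d2 = positions[-1] - positions[-2]  # slide on day n
            if d1 > 0 and d2 > 0 and d1 + d2 >= max_drop:
                alerts.append(keyword)
    return alerts

# Hypothetical three-day position history for three tracked keywords
history = {
    "seo agency": [3, 3, 4],         # minor wobble, no alert
    "seo audit": [2, 5, 8],          # two consecutive drops -> alert
    "content marketing": [6, 4, 4],  # improving
}
print(rank_alerts(history))  # ['seo audit']
```

Requiring two consecutive drops filters out the normal day-to-day SERP wobble while still catching a genuine slide early.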
Among the various tools available to you, you can use Ahrefs, SEMrush, Google Search Console, or even measurement tools developed to fit your needs, either by your SEO agency or internally by your R&D department.
At Semji, we have designed our own tool which lists all the indicators to be monitored and offers dashboards which give us a 360° view in real time of our partners’ situations.
What are the reasons for a fall in SEO traffic?
Google's algorithms have recently been updated
Seeing a sudden fall in your traffic, you might also suspect an algorithm change. Google generally communicates the dates of its algorithm updates rather vaguely, confirming the arrival of a new one only when the first significant sanctions hit websites. No one knows the formula in advance, even if everyone is aware of the possible consequences: a gain in positions, a drop in the SERPs, or even de-indexing. A website might equally see no change at all.
Obviously, your website cannot evolve with every algorithm change, and the reason is simple: Google makes changes to its algorithms more than 1,500 times a year! It is not always easy to make the necessary corrections before an algorithm cracks down.
To stop fearing algorithm changes, the best advice is still to build ALL of your website FOR the web user! With an RM Tech audit, for example, you can check the quality of your website and its content and carry out corrections progressively.
Your website is inaccessible
This can mean a website that is inaccessible to Google or to users. In both cases, it is damaging for your SEO.
Have you scheduled maintenance in the middle of the night, thinking that no one visits your website at that time? Think again. Your usual visitors may not be there, but that is not the case for Googlebot! If it cannot crawl your website when it has decided to do so, it will not hesitate to penalise your website.
From the user's side, a website that is inaccessible because of a server failure or a botched update is also extremely damaging. The type of error displayed when you try to reach your website helps you identify the problem: file and directory permissions, a syntax error in the .htaccess file, an overloaded server queue. In any case, if the problem lies with your hosting company and keeps recurring, perhaps you should think about going elsewhere!
In addition, your site might be so slow to load that your visitors get discouraged and quickly move on to a competitor's website they can actually look at. Some studies reveal that a loading time greater than 3 seconds leads half of visitors to leave. Website performance is a criterion which can work for or against your SEO. Be careful!
Your website has encountered a crawling or indexing problem
Another important point to check: does your website appear correctly in search results? If not, take a look at the files which control crawling and tell robots how to proceed, and which pages and content should be prioritised or ignored: robots.txt and sitemap.xml. Don't forget to look at the server logs, which will give you some precious information!
As far as the robots.txt file is concerned, a single misplaced directive can tell Googlebot to stay away from your entire website, and it won't be crawled at all. If the file contains bad instructions, you can easily deprive yourself of an enormous amount of traffic!
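To illustrate how a single directive can shut Googlebot out, here is a sketch using Python's standard-library robots.txt parser; the domain and the directives are invented for the example:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: one stray "Disallow: /" blocks the whole site,
# including pages the more specific rules never mentioned.
robots_txt = """
User-agent: *
Disallow: /admin/
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())  # parse content directly, no fetch

for url in ("https://example.com/", "https://example.com/blog/article"):
    verdict = "allowed" if parser.can_fetch("Googlebot", url) else "BLOCKED"
    print(url, "->", verdict)  # both BLOCKED
```

The same parser can be pointed at your live file with `set_url(...)` plus `read()`, which makes it easy to script a recurring check that your key URLs are still crawlable.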
If this first step has been carried out correctly, the problem perhaps lies in the sitemap.xml file, which may contain errors. For example, it may be that not all of your website's URLs are indexed. To check, compare the number of URLs present in the sitemap.xml file with the number of URLs actually indexed.
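A sketch of that comparison (the sitemap content and the indexed count are invented; in practice you would fetch your real sitemap and read the indexed figure from the Search Console coverage report):

```python
import xml.etree.ElementTree as ET

# Hypothetical sitemap.xml content
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/blog/</loc></url>
  <url><loc>https://example.com/contact</loc></url>
</urlset>"""

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
urls = [loc.text for loc in ET.fromstring(sitemap_xml).findall("sm:url/sm:loc", ns)]

indexed_count = 2  # hypothetical figure observed in Search Console
print(f"{len(urls)} URLs in sitemap, {indexed_count} indexed")
if len(urls) > indexed_count:
    print("Some sitemap URLs are not indexed - investigate coverage")
```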
Perhaps only the mobile version of your website is affected, and with the Mobile-First Index that can hurt. Here again, the Search Console will help you put your finger on these various obstacles.
Where crawling is blocked, the indexing of your pages is impossible, and search users therefore have no chance of finding your website.
If you are carrying out an SEO-driven redesign, poorly executed 301 redirect work can have a very negative impact on your traffic. On the one hand, visitors trying to reach your website through old URLs will land on a 404 error. On the other hand, if you benefit from backlinks and haven't sent the websites in question the new URLs, or asked them to update their links, all the potential traffic from those websites is also lost.
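Beyond missing redirects, chains and loops in a 301 map are also worth catching before a relaunch. Here is a hypothetical sketch of such a check (the URL map is invented):

```python
def check_redirect_map(redirects):
    """Detect chains and loops in a 301 redirect map (old URL -> new URL).

    Chains (A -> B -> C) dilute link equity and slow crawling; loops
    make pages unreachable. Returns (chains, loops) as lists of paths.
    """
    chains, loops = [], []
    for old in redirects:
        path, seen = [old], {old}
        current = old
        while current in redirects:
            current = redirects[current]
            path.append(current)
            if current in seen:
                loops.append(path)
                break
            seen.add(current)
        else:
            if len(path) > 2:  # more than one hop
                chains.append(path)
    return chains, loops

# Hypothetical redirect map from a site redesign
redirects = {
    "/old-pricing": "/pricing",  # fine: single hop
    "/old-blog": "/news",        # chain: /old-blog -> /news -> /blog
    "/news": "/blog",
    "/a": "/b", "/b": "/a",      # loop
}
chains, loops = check_redirect_map(redirects)
print("chains:", chains)
print("loops:", loops)
```

Collapsing each chain so old URLs point straight at their final destination keeps both crawlers and visitors one hop away from the content.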
You have suffered a Google penalty
A website which falls victim to a Google penalty is obviously going to see a significant decrease in its organic traffic. There are two possible cases:
- either the website is going to lose SERP positions and if it’s the most strategic pages which are affected, that hurts a lot,
- or it quite simply won’t appear any more in the search engine.
Most often, Google penalties affecting a website are caused by the Panda, Penguin and Fred algorithm filters, but they can also come from a manual action by a real human!
If your website is subject to a Panda penalty, your content is considered to be of bad quality (duplicate content, thin price-comparison pages, link farms).
If it's Penguin which has ruled, you are being sanctioned for abusive over-optimisation. This may be due to poor-quality backlinks or a link acquisition strategy which doesn't comply with Google's guidelines (link buying, link exchanges, a suspicious rate of new link acquisition, etc.).
Finally, the Fred algorithm hunts down websites which make abusive use of advertising and seek primarily to generate income, to the detriment of the quality of the content offered to the user. This typically shows up as many low-quality backlinks, too many ads relative to content, over-optimised anchors, or badly designed siloing which makes browsing difficult. A website hit by a Fred penalty might see its traffic fall by 50%: if this is your case, a good clean-up will be necessary!
No matter which algorithm you have upset, you can identify the nature of the penalty via the Search Console and bring the website back up to standard. You can then request a re-examination so that your website is restored as soon as possible. In any event, it is more necessary than ever to act QUICKLY!
Your competitors’ websites have overtaken you
It's the same battle for everyone on Google (and in real life too, for that matter!): getting ahead of your competitors. And if you're thinking about it, you can be sure your competitors are too!
Websites positioned in the SERP TOP 3 capture more than 80% of clicks for themselves. If your website sits in this TOP 3 and drops to 4th or 5th place, it is obvious that you are going to lose a large part of your usual organic traffic. Worse, if you land on the second page, your visibility will be significantly affected!
You are the victim of a Negative SEO attack
If you suffer a Negative SEO attack, it's because someone is out to deliberately damage you. The results of such actions all converge in the same direction: the demotion and de-indexing of the website. And who could benefit from your website's disappearance? These malicious actors are none other than your competitors. Why? Because mounting Negative SEO operations against a website takes time, energy and sometimes even money, but knocking out the leaders is well worth the effort!
Negative SEO essentially consists of turning Black Hat SEO practices against you: techniques which manipulate rankings in breach of Google's guidelines are applied to your website, so that it is your website which suffers the heavy sanctions! A sudden surge of bad backlinks, a penalty warning in the Search Console, or a meteoric fall in your traffic are all signals which should alert you.
A Negative SEO attack is difficult to foresee, but some good practice can allow you to quickly detect it:
- regularly audit your website;
- keep an eye on your netlinking at all times;
- regularly check for duplicate content scraped from your website.
Following this Negative SEO action, the reputation you’ve spent so long building can collapse in no time: enhanced scrutiny is the answer!
Your website has been hacked
Unlike a Negative SEO attack, whose aim is to damage you personally, "SEO hacking" is financially motivated hijacking which damages your website's SEO as a side effect. For example, a hacker detects security flaws in your website and uses them to create a multitude of pages with the aim of earning money through affiliate schemes. He can, for example, publish advertisements on thousands of pages created just for the occasion without you even noticing.
To avoid finding yourself in this situation, keep your website and any plugins you use regularly updated. This will allow you to patch security flaws quickly. If you use an off-the-shelf template from a CMS, check it for vulnerabilities too!
Be aware that any website which suffers a hack, no matter what kind, is in Google's sights and will see its SEO suffer severely. Here again, maintain a high level of vigilance via Analytics, the Search Console and position monitoring tools. They will allow you to react quickly and, with luck, avoid a catastrophe.
Your sales are seasonal
Before panicking because visitors seem to be deserting your website, it is important to determine whether your sales are seasonal. Obviously, the website of a ski hire store will be much busier during the winter holidays than in the middle of August; for a website selling ice cream, the opposite will be true. Some event websites are also subject to large variations in traffic: don't panic, it's quite normal!
To know whether your website really is losing visitors abnormally, the best approach is to compare your metrics with the same period last year.
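A minimal sketch of that year-on-year comparison (the monthly figures are invented for a hypothetical ski-hire website):

```python
def yoy_change(this_year, last_year):
    """Compare monthly organic sessions with the same month last year.

    Returns {month: pct_change}. A seasonal dip shows up as low traffic
    in BOTH years (small change), while a genuine loss shows up as a
    large negative change for the same month.
    """
    return {
        month: round(100 * (this_year[month] - last_year[month]) / last_year[month], 1)
        for month in this_year
        if month in last_year and last_year[month]
    }

# Hypothetical monthly organic sessions
last_year = {"Jan": 9000, "Feb": 8500, "Aug": 1200}
this_year = {"Jan": 8800, "Feb": 4100, "Aug": 1150}

print(yoy_change(this_year, last_year))
```

Here August is quiet in both years (normal seasonality), but February is down by more than half year-on-year, which is the month worth investigating.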
Google Trends can also give you precious information about changes in query search volume. As its name indicates, this tool allows you to observe trends and therefore define whether your situation is in line with these or not!
How to react to loss of positioning on Google
A loss of organic traffic is an uncomfortable situation which any website can suffer during its lifetime; no one is immune! If you come up against it, it is essential to know what attitude to adopt to limit the damage, and to surround yourself with SEO experts used to dealing with this sort of situation.
The first goal is to identify precisely the origin of your fall in traffic: is it technical or malicious? This information is indispensable for knowing where and how to intervene. By reviewing your traffic and rankings day to day, you will find it much easier to spot a disaster before it happens!
Setting up alerts is also an excellent way of putting "prevention is better than cure" into practice. Thanks to this method, you will be able to detect Negative SEO actions quickly, for example. While you're at it, also monitor algorithm updates: if their dates coincide with your recorded fall in traffic, you know what you have to do.
As soon as you see a change which might foreshadow a loss of positions, don't wait to react. A proactive attitude is the best defence against a fall in traffic, so stay on the lookout and be ready to react at any moment!