Domain Authority dropped 10 points and I don't know why
-
In my latest site crawl, my Domain Authority dropped 10 points for no apparent reason. There have been no changes to the site; the only change I made this month was blocking referral spam. My competitors' DAs have stayed the same.
website name: https://knowledgefront.co.uk/
Any ideas?
-
Check your incoming links and see if there's a pattern. Check referral traffic to see whether any bot traffic is landing on your website. Try the following steps:
- Create a filter in Analytics to exclude spam referrals
- Block spam IPs
- Block bot traffic from reaching your website
- Build quality backlinks
- Identify pages ranking lower in the SERPs (e.g., positions 5-9), optimize them, and push for better CTR
Once you're done, resubmit your sitemap and give it some time. I hope this helps you recover the lost DA; it has worked for me.
Regards,
Ravi Kumar Rana
TheSEOGuy
-
@lisababblebird said in Domain Authority dropped 10 points and I don't know why:
With just a quick look, you've got 53-ish links, and many of them are questionable at best, or directory links.
So link quality may be an issue. Also, did you manually disavow links? If so, you may have just drawn unwanted attention to your site.
We never tell clients to disavow on their own; we only take that step if they receive a manual penalty.
Instead, focus on building genuinely valuable links, as Google and other search engines have gotten pretty good at recognizing and ignoring spam links on their own.
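If it helps, here's a rough way to triage a backlink export for directory-style links like those. The keyword hints and the (source, target) row layout are assumptions for illustration, not a real Link Explorer export schema; anything it flags still needs a human look.

```python
from urllib.parse import urlparse

# Hypothetical hints of low-value directory/listing sites; tune for your niche.
DIRECTORY_HINTS = ("directory", "listing", "links", "bookmark")

def triage_backlinks(rows):
    """Split (source_url, target_url) pairs into 'review' and 'ok' buckets
    based on directory-style keywords in the linking URL."""
    buckets = {"review": [], "ok": []}
    for source_url, _target_url in rows:
        parts = urlparse(source_url)
        haystack = (parts.netloc + parts.path).lower()
        if any(hint in haystack for hint in DIRECTORY_HINTS):
            buckets["review"].append(source_url)
        else:
            buckets["ok"].append(source_url)
    return buckets
```

Run it over your exported link list and manually review the "review" bucket before deciding anything; again, don't disavow unless you have a manual penalty.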