Today, while poking around the Louisville, Kentucky personal injury lawyer market, I discovered a site whose Google search result had a disastrous message underneath it: “A description for this result is not available because of this site’s robots.txt”.
Unfortunately, this happens often when a new site goes live. It is a rookie mistake, but a very costly one.
If you Google the site’s URL, that bare robots.txt message is all that shows up in the search results.
How does this happen?
When designers shut down one website and 301 redirect the old site to a new one, the new site is often set to block search engine spiders from crawling it until it is fully developed. Once the site goes live, the marketing company is supposed to unblock the spiders, but sometimes they forget, or there is a miscommunication between them and their outsourced web developer.
This is what the code of the robots.txt file should NOT look like:
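# Blocks every search engine from crawling the entire site
User-agent: *
Disallow: /

That single slash after Disallow tells every crawler to stay away from every page on the site, which is exactly why Google cannot show a description for any of its pages.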
To check your own site, you can go to yourURL.com/robots.txt and take a look at what is disallowed. On WordPress websites, the /wp-admin/ folder is typically disallowed along with one or two other folders. This is done to prevent search engines from indexing pages that hackers like to target.
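For comparison, a healthy WordPress robots.txt usually looks something like this (the exact folders can vary from setup to setup):

# Only the admin area is blocked; the rest of the site stays crawlable
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

The difference is a single character: disallowing /wp-admin/ hides one folder, while disallowing / hides the whole site.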
I checked out the law firm’s original site, and it had indeed been recently redirected to the new site, so this was an epic fail on the SEO expert’s part. Unfortunately, the law firm may never know what went wrong until it is too late and all of the SEO value of their domain has been lost.
I’m nice, so I’d normally reach out and tell them, but I’m already working with a competitor in that area. It isn’t going to matter anyway, as they’d be buried in the search results thanks to their awful marketing team.
Oh well.
If you are experiencing a total loss of traffic on your personal injury website and suspect technical SEO problems are to blame, feel free to reach out to me for a free and confidential consultation.