pixel
Or how the site: search can tell you so much

 

04 April, 2018
“Who was the guy in that one movie we saw that time?” A question like this doesn’t mean much without context, but between people who know each other well and have history, the answer is Robert Downey Jr. (Honestly, the answer to almost all of life’s mysteries is Robert Downey Jr., but I digress.)

One of the amazing things about Google is that it learns our search habits over time. This means we get to the answers we want without being overly specific about the question. Kind of like the shorthand you develop with your significant other or best friend – you can type in something misspelled or non-specific and, between your search history and location settings, Google will figure it out. Super fast. So fast, we don’t even understand the complexity of what is happening.

Search parameters can help you dig through the clutter. There are a ton of options, from the typical Boolean-style operators we use in databases other than Google’s index (AND, OR, +, -, “exact phrase” quotes) to specific parameters that help us SEO peeps diagnose indexing issues.
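To give you a taste, here are a couple of made-up queries (the keywords are just placeholders, not anything from a real project):

best pizza “wood fired” -frozen – returns pages containing the exact phrase “wood fired” while excluding anything that mentions frozen.
inurl:menu OR intitle:menu – returns pages with “menu” in either the URL or the page title.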

There’s a complete list and a fantastic post on Beyond, so I’m not going to copy that. But I am going to mention a use case that came up when I got a phone call from a development company that was stumped.

 

Last year, I got a phone call from a valuable development partner who was scoping out a project for a regional Alcoholics Anonymous website. The client had never really paid attention to their SEO but was thinking about making it a priority for the new site, since they weren’t getting any organic traffic. I would need to spec out this part of the proposal. As we were talking, I wanted to get an idea of how many pages the site had and what would need to be done.

Enter the ‘site:’ search. I searched Google for site:theirdomain.com to get a page count (the number of results is roughly the number of pages on the website, if all is well) and found nothing. That meant there were no pages in Google’s index at all. So what was blocking it? Without access to the website, I was limited to public information, so I checked the site’s robots.txt file – the publicly viewable file (it lives at yourdomain.com/robots.txt) that tells Google and other crawlers which parts of a website should and shouldn’t be crawled. For example, you might want your pages crawled but not individual image files. And in this robots.txt file, I discovered that the entire site was being blocked from Google.
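For the curious, a fully blocked robots.txt looks something like this (a hypothetical example, not the client’s actual file):

User-agent: *
Disallow: /

Those two lines tell every crawler to stay away from every page on the site. A healthier version allows crawling and only fences off what you don’t want crawled – something along these lines, with example.com standing in for your own domain:

User-agent: *
Disallow: /wp-admin/
Sitemap: https://example.com/sitemap.xml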

It took about 3 minutes to find all of this, and we were still on the phone. So I asked my development partner how long the site had been live in its current state. 3 years, I was told. I let her know about the index problem and, after it was fixed, the client decided they didn’t need SEO. Their traffic increased dramatically all on its own.

3 years of people looking for help. 3 years of people seeking an AA meeting calendar. People traveling through, people living in the region, and people who needed help right away. And it could have been discovered with a simple site: search.

Google’s index is a great double-check on the health of your website. I check my own often and encourage you to do the same!

Like what you read? BadCat is even better in person.