For the SEO community, the news is arriving fast and furious. Google often "refreshes" its algorithms monthly, but "updates" are less frequent, and usually more consequential. So we feel it's important to write about these updates, and to discuss their potential consequences for your website.
With all our talk about algorithms, however, we've received some feedback from several readers--mostly owners of small Internet businesses--that our talk has gotten a little "esoteric."
[Image caption: The word "esoteric" is often associated with cults. However, the definition of esoteric is more benign: "intended for or likely to be understood by only a small number of people with a specialized knowledge or interest." Picture Source: Sensei Marketing]
The Organic SEO Blog was created with the intention of writing about SEO in a way that would be accessible for specialists and laypeople alike. So, honestly, we hate to be referred to as "esoteric."
In reality, however, it is hard to write about Google's algorithms without sounding, well, a bit esoteric. After all, this stuff is complex.
As Google's own John Mueller wrote in a help thread in September:
"In practice, a site is never in a void alone with just a single algorithm. We use over 200 factors in crawling, indexing, and ranking."
If you're an SEO layperson, pay attention to this comment. It's simple, yet profound, and it gets to the heart of what the word "algorithm" really means.
In fact, we believe this quote offers a good working definition of an algorithm: a system for crawling, indexing, and ranking websites.
Viewing these three functions--crawling, indexing, and ranking--separately also offers a helpful view of exactly why algorithms--and SEO specialists--exist.
So below we'll try to define each function as simply as possible. Just a note: we often refer to Google's algorithms, but the definitions below can also apply to how search engines, like Yahoo or Bing, offer reliable search results.
Web "crawlers" are often called "spiders," and this is the easiest way to envision their function. Spiders crawl the Internet in an attempt to gather valuable information from websites. If a certain website links to another page on the same site, or to a different site, the spider will follow the link and gather more information. In this way, the spider creates a web of information that is sent back to Google for indexing.
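That link-following behavior can be sketched in a few lines of code. This is a toy illustration only: the site below is a hypothetical in-memory map of pages to their links, where a real spider would fetch pages over HTTP and extract links from the HTML.

```python
from collections import deque

# A toy "web": each page URL maps to the URLs it links to.
# (Hypothetical data -- a real crawler fetches live pages.)
site = {
    "example.com/": ["example.com/about", "example.com/blog"],
    "example.com/about": ["example.com/"],
    "example.com/blog": ["example.com/blog/post-1", "other-site.com/"],
    "example.com/blog/post-1": [],
    "other-site.com/": [],
}

def crawl(start):
    """Breadth-first crawl: visit a page, queue its links, repeat."""
    seen = {start}
    queue = deque([start])
    visited = []
    while queue:
        url = queue.popleft()
        visited.append(url)  # "gather information" from this page
        for link in site.get(url, []):
            if link not in seen:  # never crawl the same page twice
                seen.add(link)
                queue.append(link)
    return visited

print(crawl("example.com/"))
```

Note how following one link on the blog page leads the spider off-site entirely -- this is exactly how a crawl of one website spreads across the web.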
It's important to note: not every website is crawled. This is why it is important to follow today's web development standards, and why, we believe, it is important that your website developer understands the basics of SEO. Please read: "Website Development: The Perfect Job for Spock." Beyond making sure your website is crawled, a website developer with knowledge of SEO can make "granular choices" that further refine how the site is crawled. Again, your website developer should know about these choices. As Google writes here:
"Most websites don’t need to set up restrictions for crawling, indexing or serving, so their pages are eligible to appear in search results without having to do any extra work. That said, site owners have many choices about how Google crawls and indexes their sites through Webmaster Tools and a file called “robots.txt”. With the robots.txt file, site owners can choose not to be crawled by Googlebot, or they can provide more specific instructions about how to process pages on their sites."
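To make the robots.txt idea concrete, here is a minimal example. The paths and sitemap URL are hypothetical; the directives themselves (User-agent, Disallow, Allow, Sitemap) are the standard ones Googlebot understands.

```
# Apply these rules to all crawlers, including Googlebot.
User-agent: *

# Keep crawlers out of a private directory (hypothetical path).
Disallow: /private/

# Everything else remains eligible for crawling.
Allow: /

# Tell crawlers where to find the sitemap (hypothetical URL).
Sitemap: https://www.example.com/sitemap.xml
```

The file lives at the root of the site (e.g., example.com/robots.txt), and most sites -- as Google says above -- don't need one at all.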
The information obtained from crawlers needs to be organized. This is the function of indexing. This process is self-evident: Once Google receives information from the crawler, it creates an index, much like the back of a non-fiction book, so that it can easily retrieve the information for future use.
As Google says:
"The web is like an ever-growing public library with billions of books and no central filing system. Google essentially gathers the pages during the crawl process and then creates an index, so we know exactly how to look things up. Much like the index in the back of a book, the Google index includes information about words and their locations. When you search, at the most basic level, our algorithms look up your search terms in the index to find the appropriate pages."
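Google's book-index analogy maps directly onto a classic data structure: the inverted index, which records each word and the pages where it appears. Here is a toy sketch with hypothetical page text; the real Google index also stores word positions, link data, and far more.

```python
# Hypothetical crawled pages: URL -> text found on the page.
pages = {
    "example.com/coffee": "fresh roasted coffee beans",
    "example.com/tea": "loose leaf tea and coffee alternatives",
    "example.com/mugs": "ceramic mugs for tea lovers",
}

def build_index(pages):
    """Indexing: map each word to the set of pages containing it."""
    index = {}
    for url, text in pages.items():
        for word in text.split():
            index.setdefault(word, set()).add(url)
    return index

def search(index, term):
    """At the most basic level: look the search term up in the index."""
    return sorted(index.get(term, set()))

index = build_index(pages)
print(search(index, "coffee"))
print(search(index, "tea"))
```

The payoff is speed: rather than re-reading every page for each search, Google looks the term up in the index -- just like flipping to the back of a book.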
An SEO specialist can have a profound influence on how a website is discovered by the crawlers, but indexing is purely a Google function.
Last week Google implied that it would be adding a new ranking signal: websites that offer better mobile experiences will rank higher on mobile search--and, possibly, desktop rankings. This new signal, however, is merely one of hundreds that define precisely how each algorithm will affect your website's ranking.
Ranking is the end result of crawling and indexing. Once the information is gathered and sorted, it is then ranked in the way that Google believes is most beneficial to the searcher's experience. Attentiveness to crawling and indexing is an important function of an SEO specialist, but attentiveness to the many ranking signals is really what sets the best SEO specialists apart.
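To see how many signals can collapse into a single ranking, here is a deliberately simplified sketch. Every signal name and weight below is invented for illustration -- Google's real formula uses hundreds of factors and is not public.

```python
# Hypothetical signals and weights (illustration only -- not Google's).
WEIGHTS = {"relevance": 0.5, "backlinks": 0.3, "mobile_friendly": 0.2}

def score(signals):
    """Combine a page's signal values into one weighted score."""
    return sum(WEIGHTS[name] * value for name, value in signals.items())

# Two hypothetical pages with different strengths.
pages = {
    "example.com/a": {"relevance": 0.9, "backlinks": 0.2, "mobile_friendly": 1.0},
    "example.com/b": {"relevance": 0.7, "backlinks": 0.9, "mobile_friendly": 0.0},
}

# Ranking: sort pages by combined score, highest first.
ranked = sorted(pages, key=lambda url: score(pages[url]), reverse=True)
print(ranked)
```

Even in this toy version, note how a page weak in one signal (page A has few backlinks) can still outrank a competitor on the strength of others -- which is why attending to the full range of signals matters.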
Backlinko has a very helpful "complete list" of Google ranking factors. Some of this list is speculative (Google is often stingy about offering ranking signal information), but much of it is really spot-on. Some factors are more important than others, and some matter very little, although it's good to be attentive to each and every one.
If you're looking for more information on algorithms, try Google itself.
Here on The Organic SEO Blog we discuss algorithms not only as functions but ideas. If you're looking for a more philosophical view of algorithms, please read: "A Frank Look at Algorithms."
For a more topical view of algorithms, try: "Algorithms Have Consequences."
And, of course, if you'd like to speak directly to an SEO specialist who understands the complexity of algorithms, we suggest calling our sponsor, Alex Stepman, of Stepmans PC: 215-900-9398.