In This Chapter
• Analyzing how a website fits in its "web neighborhood"
• Viewing websites like an SEO
• Assessing good site architecture and webpages from an SEO perspective
• Assessing website content like an SEO
When people surf the Internet, they generally view each domain as its own island of information. This works perfectly well for the average surfer but is a big mistake for beginner SEOs. Websites, whether they like it or not, are interconnected. This is a key perspective shift that is essential for understanding SEO.
Take Facebook, for example. It started out as a "walled garden" with all of its content hidden behind a login. It thought it could be different and remain completely independent. This worked for a while, and Facebook gained a lot of popularity. Eventually, an ex-Googler and his friend became fed up with the locked-down communication silo of Facebook and started a wide open website called Twitter. Twitter grew even faster than Facebook and challenged it as the media darling. Twitter was smart and made its content readily available to both developers (through APIs) and search engines (through indexable content).
Facebook responded with Facebook Connect (which enables people to log in to Facebook through other websites) and opened its chat protocol so its users could communicate outside of the Facebook domain. It also made a limited amount of information about users visible to search engines. Facebook is now accepting its place in the Internet community
and is benefiting from its decision to embrace other websites. What it misjudged early on was that websites work best when they are interconnected. Being able to see these connections is one of the skills that separates SEO professionals from SEO fakes.
I highly recommend writing down everything you notice in a section of a notebook identified with the domain name and date of viewing.
In this chapter you learn the steps that the SEO professionals at SEOmoz go through either before meeting with a client or at the first meeting (depending on the contract). When you view a given site in the way you are about to learn in this chapter, you need to take detailed notes. You are likely going to notice a lot about the website that could use improvement, and you need to capture this information before details distract you.
Keep Your Notes Simple
The purpose of the notebook is simplicity and the ability to go back frequently and review your notes. If actual physical writing isn't your thing, consider a low-tech text editor on your computer, such as Windows Notepad or the Mac's TextEdit.
Bare-bones solutions like a notebook or text editor help you avoid the distraction of the presentation itself and focus on the important issues: the characteristics of the website that you're evaluating.
If you think it will be helpful and you have Internet access readily available, I recommend bringing up a website you are familiar with while reading through this chapter. If you choose to do this, be sure to take a lot of notes in your notebook so you can review them later.
The 1,000-Foot View—Understanding the Neighborhood
Before I do any work on a website I try to get an idea of where it fits into the grand scheme of things on the World Wide Web. The easiest way to do
this is to run searches for some of the competitive terms in the website's niche. If you imagine the Internet as one giant city, you can picture domains as buildings. The first step I take before working on a client's website is figuring out in which neighborhood its building (domain) resides.
This search result page is similar to seeing a map of the given Internet neighborhood. You usually can quickly identify the neighborhood anchors (due to their link popularity) and specialists in the top 10 (due to their relevancy). You can also start to get an idea of the maturity of the result based on the presence of spam or low-quality websites.
During client meetings, when I look at the search engine result page for a competitive term like advertising, I am not looking for websites to visit but rather trying to get a general idea of the maturity of the Internet neighborhood. I am very vocal when I am doing this and have been known to question out loud, "How did that website get there?" A couple times, the client momentarily thought I was talking about his website and had a quick moment of panic. In reality, I am commenting on a spam site I see rising up the results.
Also, take note that regardless of whether you are logged into a Google account, the search engine will automatically customize your search results based on the links you click most. This can be misleading because it will make your favorite websites rank higher for you than they do for the rest of the population. To turn this personalization off, append "&pws=0" to the end of the Google search URL.
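If you want to see exactly what appending the parameter does to a search URL, the following short sketch (Python, standard library only) adds &pws=0 to an arbitrary Google search URL. The helper name `depersonalize` is my own for illustration; only the pws=0 parameter itself comes from the text above.

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def depersonalize(url):
    """Return the same Google search URL with pws=0 added,
    which disables personalized results."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query["pws"] = "0"  # pws=0 turns personalization off
    return urlunparse(parts._replace(query=urlencode(query)))

print(depersonalize("http://www.google.com/search?q=advertising"))
# http://www.google.com/search?q=advertising&pws=0
```

In practice you can simply paste &pws=0 onto the end of the URL in your browser; the code just makes the transformation explicit.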
Along with looking at the results themselves, I look at the other data present on the page. The number of advertisements on the search result page gives a rough idea of how competitive it is. For example, a search for buy viagra will return a full page height worth of ads, whereas a search for women that look like Drew Carey won't likely return any. This is because more people are searching for the blue pill than are searching for large, bald women with nerd glasses.
In addition to the ads, I also look for signs of temporal algorithms. Temporal algorithms are ranking equations that take the element of time into account with regard to relevancy. These tend to manifest themselves as news results and blog posts.
Taking Advantage of Temporal Algorithms
You can use the temporal algorithms to your advantage. I accidentally did this once with great success. I wrote a blog post about Michael Jackson's death and its effect on the search engines a day after he died. As a result of temporal algorithms my post ranked in the top 10 for the query "Michael Jackson" for a short period following his death. Because of this high ranking, tens of thousands of people read my article. I thought it was because I was so awesome, but after digging into my analytics I realized it was because of unplanned use of the temporal algorithms. If you are a blogger, this tactic of quickly writing about news events can be a great traffic booster.
After scanning search result pages for the given website's niche, I generally get a sense for that neighborhood of the Internet. The important takeaway is to get an idea of the level of competition, not to figure out the ins and outs of how specific websites are ranking. That comes later.
Easy De-Personalization in Firefox and Chrome
Most SEOs perform searches dozens or hundreds of times per day, and when you do, it's important that de-personalized results appear so that you see what a "typical" searcher would see, as opposed to search results influenced by your own search history.
Firefox is a terrific browser for SEOs for many reasons, but one of its most helpful features is the ability to search right from the address field of the browser, the area at the top of the browser where you normally see the URL of the web
page you're on. Better yet, with a little customization, you can easily perform Google searches that are de-personalized (although not de-geotargeted).
1. From the Bookmarks | Organize Bookmarks... menu, select any bookmarks folder in the left pane. (Do not simply select the All Bookmarks folder, because it won't work.)
2. Right-click the folder and select New Bookmark...
3. Add the following values to the fields:
Name: Google de-personalized search
Location: http://www.google.com/search?&q=%s&pws=0
Tags: (Optional. Add any tags you want.)
Keyword: g
Description: (Optional. Use this to describe the search.)
4. Click Add.
That's it. Now, go to the Address field in Firefox (where you see a URL at the top of the browser) and type something like this:
g hdmi cables
This tells Google (g) to search for "hdmi cables". More important, because your Location field included &pws=0, that URL parameter will carry over to your search result. From now on, if you want to perform a de-personalized Google search, simply type "g" (no quotes) and the query term in your URL field.
Use this process for creating as many custom searches as you like, keeping these important factors in mind:
1. The Location field must contain the exact URL of the search result, with the exception of the %s variable, which will be replaced with your query term automatically.
2. The Keyword field is the term you'll type before your search query to tell Firefox which custom query you'll be running. Be brief and accurate. I use terms like "b" for Bing, "tc" for text cache, and so on.
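The keyword bookmarks behave like a small lookup table: Firefox matches the first word you type against your keywords and substitutes the rest of what you typed into the %s slot of the matching Location. A minimal sketch of that behavior follows (the table entries and the `expand` helper are illustrative assumptions, not anything Firefox actually ships with):

```python
from urllib.parse import quote_plus

# Example keyword -> URL-template table, mirroring the bookmarks above.
# "%s" marks where the query text is inserted.
SEARCHES = {
    "g": "http://www.google.com/search?&q=%s&pws=0",  # de-personalized Google
    "b": "http://www.bing.com/search?q=%s",           # Bing
}

def expand(command):
    """Split a command like 'g hdmi cables' into keyword + query,
    then fill the keyword's URL template with the escaped query."""
    keyword, _, query = command.partition(" ")
    template = SEARCHES[keyword]
    return template.replace("%s", quote_plus(query))

print(expand("g hdmi cables"))
# http://www.google.com/search?&q=hdmi+cables&pws=0
```

This is why brief, unique keywords matter: they are the lookup keys, and each one maps to exactly one URL template.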
This functionality carries over to Google's Chrome browser too, because Chrome can import bookmarks from any other browser you use. If you're a Chrome user, simply import your Firefox bookmarks from the Chrome | Import Bookmarks and Settings menu, and you can search from the Chrome address bar just like you did in Firefox.
Action Checklist
When viewing a website from the 1,000-foot level, be sure to complete the following:
• Search for the broadest keyword that the given site might potentially rank for
• Identify the maturity of the search engine results page (SERP) based on the criteria listed in this chapter
• Identify major competitors and record them in a list for later competitive analysis
This section discussed analyzing websites at their highest level. At this point, the details don't matter; rather, it is the macro patterns that are important. The following sections dive deeper into the website to figure out how everything is related. Remember, search engines use hundreds of metrics to rank websites. This is possible because the same website can be viewed in many different ways.
The 100-Foot View—The Website
When professional SEOs first come to a website that they plan to work with, they view it through a very different lens than if they were just idly surfing. They instinctively start viewing it from the perspective of a search engine. The following are the elements that my colleagues and I pay the most attention to.