Monday, March 23, 2009

55 Quick SEO Tips Even Your Mother Would Love

Everyone loves a good tip, right? Here are 55 quick tips for search engine optimization that even your mother could use to get cooking. Well, not my mother, but you get my point. Most folks with some web design and beginner SEO knowledge should be able to take these to the bank without any problem.

1. If you absolutely MUST use JavaScript drop-down menus, image maps or image links, be sure to put text links somewhere on the page for the spiders to follow.

2. Content is king, so be sure to have good, well-written and unique content that will focus on your primary keyword or keyword phrase.

3. If content is king, then links are queen. Build a network of quality backlinks using your keyword phrase as the link. Remember, if there is no good, logical reason for that site to link to you, you don’t want the link.

4. Don’t be obsessed with PageRank. It is just one itsy-bitsy part of the ranking algorithm. A site with lower PR can actually outrank one with a higher PR.

5. Be sure you have a unique, keyword focused Title tag on every page of your site. And, if you MUST have the name of your company in it, put it at the end. Unless you are a major brand name that is a household name, your business name will probably get few searches.

6. Fresh content can help improve your rankings. Add new, useful content to your pages on a regular basis. Content freshness adds relevancy to your site in the eyes of the search engines.

7. Be sure links to your site and within your site use your keyword phrase. In other words, if your target is “blue widgets” then link to “blue widgets” instead of a “Click here” link.

8. Focus on search phrases, not single keywords, and put your location in your text (“our Palm Springs store” not “our store”) to help you get found in local searches.

9. Don’t design your web site without considering SEO. Make sure your web designer understands your expectations for organic SEO. Doing a retrofit on your shiny new Flash-based site after it is built won’t cut it. Spiders can crawl text, not Flash or images.

10. Use keywords and keyword phrases appropriately in text links, image ALT attributes and even your domain name.

11. Check for canonicalization issues - www and non-www domains. Decide which you want to use and 301 redirect the other to it. In other words, if http://www.domain.com is your preference, then http://domain.com should redirect to it.

12. Check the link to your home page throughout your site. Is index.html appended to your domain name? If so, you’re splitting your links. Outside links go to http://www.domain.com and internal links go to http://www.domain.com/index.html.

Ditch the index.html or default.php or whatever the page is and always link back to your domain.

13. Frames, Flash and AJAX all share a common problem - you can’t link to a single page. It’s either all or nothing. Don’t use Frames at all and use Flash and AJAX sparingly for best SEO results.

14. Your URL file extension doesn’t matter. You can use .html, .htm, .asp, .php, etc. and it won’t make a difference as far as your SEO is concerned.

15. Got a new web site you want spidered? Submitting through Google’s regular submission form can take weeks. The quickest way to get your site spidered is by getting a link to it through another quality site.

16. If your site content doesn’t change often, your site needs a blog because search spiders like fresh text. Blog at least three times a week with good, fresh content to feed those little crawlers.

17. When link building, think quality, not quantity. One single, good, authoritative link can do a lot more for you than a dozen poor quality links, which can actually hurt you.

18. Search engines want natural language content. Don’t try to stuff your text with keywords. It won’t work. Search engines look at how many times a term is in your content and if it is abnormally high, will count this against you rather than for you.

19. Not only should your links use keyword anchor text, but the text around the links should also be related to your keywords. In other words, surround the link with descriptive text.

20. If you are on a shared server, do a blacklist check to be sure you’re not on a proxy with a spammer or banned site. Their negative notoriety could affect your own rankings.

21. Be aware that by using services that block domain ownership information when you register a domain, Google might see you as a potential spammer.

22. When optimizing your blog posts, optimize your post title tag independently from your blog title.

23. The bottom line in SEO is Text, Links, Popularity and Reputation.

24. Make sure your site is easy to use. This can influence your link building ability and popularity and, thus, your ranking.

25. Give link love, Get link love. Don’t be stingy with linking out. That will encourage others to link to you.

26. Search engines like unique content that is also quality content. There can be a difference between unique content and quality content. Make sure your content is both.

27. If you absolutely MUST have your main page as a splash page that is all Flash or one big image, place text and navigation links below the fold.

28. Some of your most valuable links might not appear in web sites at all but be in the form of e-mail communications such as newsletters and e-zines.

29. You get NOTHING from paid links except a few clicks unless the links are embedded in body text and NOT obvious sponsored links.

30. Links from .edu domains are given nice weight by the search engines. Run a search for possible non-profit .edu sites that are looking for sponsors.

31. Give them something to talk about. Linkbaiting is simply good content.

32. Give each page a focus on a single keyword phrase. Don’t try to optimize the page for several keywords at once.

33. SEO is useless if you have a weak or non-existent call to action. Make sure your call to action is clear and present.

34. SEO is not a one-shot process. The search landscape changes daily, so expect to work on your optimization daily.

35. Cater to influential bloggers and authority sites who might link to you, your images, videos, podcasts, etc. or ask to reprint your content.

36. Get the owner or CEO blogging. It’s priceless! CEO influence on a blog is incredible as this is the VOICE of the company. Response from the owner to reader comments will cause your credibility to skyrocket!

37. Optimize the text in your RSS feed just like you should with your posts and web pages. Use descriptive, keyword rich text in your title and description.

38. Use captions with your images. As with newspaper photos, place keyword rich captions with your images.

39. Pay attention to the context surrounding your images. Images can rank based on text that surrounds them on the page. Pay attention to keyword text, headings, etc.

40. You’re better off letting your site pages be found naturally by the crawler. Good global navigation and linking will serve you much better than relying only on an XML Sitemap.

41. There are two ways to NOT see Google’s Personalized Search results:

(1) Log out of Google

(2) Append &pws=0 to the end of your search URL in the search bar

42. Links (especially deep links) from a high PageRank site are golden. High PR indicates high trust, so the back links will carry more weight.

43. Use absolute links. Not only will it make your on-site link navigation less prone to problems (like links to and from https pages), but if someone scrapes your content, you’ll get backlink juice out of it.

44. See if your hosting company offers “Sticky” forwarding when moving to a new domain. This allows temporary forwarding to the new domain from the old, retaining the new URL in the address bar so that users can gradually get used to the new URL.

45. Understand social marketing. It IS part of SEO. The more you understand about sites like Digg, Yelp, del.icio.us, Facebook, etc., the better you will be able to compete in search.

46. To get the best chance for your videos to be found by the crawlers, create a video sitemap and list it in your Google Webmaster Central account.

47. Videos that show up in Google blended search results don’t just come from YouTube. Be sure to submit your videos to other quality video sites like Metacafe, AOL, MSN and Yahoo to name a few.

48. Surround video content on your pages with keyword rich text. The search engines look at surrounding content to define the usefulness of the video for the query.

49. Use the words “image” or “picture” in your photo ALT descriptions and captions. A lot of searches are for a keyword plus one of those words.

50. Enable “Enhanced image search” in your Google Webmaster Central account. Images are a big part of the new blended search results, so allowing Google to find your photos will help your SEO efforts.

51. Add viral components to your web site or blog - reviews, sharing functions, ratings, visitor comments, etc.

52. Broaden your range of services to include video, podcasts, news, social content and so forth. SEO is not about 10 blue links anymore.

53. When considering a link purchase or exchange, check the cache date of the page where your link will be located in Google. Search for “cache:URL” where you substitute “URL” for the actual page. The newer the cache date the better. If the page isn’t there or the cache date is more than a month old, the page isn’t worth much.

54. If you have pages on your site that are very similar (you are concerned about duplicate content issues) and you want to be sure the correct one is included in the search engines, place the URL of your preferred page in your sitemaps.

55. Check your server headers. Search for “check server header” to find free online tools for this. You want to be sure your URLs report a “200 OK” status or “301 Moved Permanently” for redirects. If the status shows anything else, check to be sure your URLs are set up properly and used consistently throughout your site.
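
If you’d rather script that last check yourself instead of hunting for an online tool, a few lines of PHP will do it. This is only a rough sketch using PHP’s built-in get_headers() function; the example.com URLs are placeholders for your own pages.

  // Quick status check for a handful of URLs (see tip 55).
  // Each URL should come back "200 OK", or "301 Moved Permanently" for redirects.
  $urls = array(
      'http://www.example.com/',             // should report "200 OK"
      'http://example.com/',                 // should 301 to the www version
      'http://www.example.com/old-page.htm'  // anything else needs a closer look
  );

  foreach ($urls as $url) {
      $headers = get_headers($url);          // element 0 holds the status line
      echo $url . ' => ' . ($headers ? $headers[0] : 'no response') . "\n";
  }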

Richard V. Burckhardt, also known as The Web Optimist, is an SEO trainer based in Palm Springs, CA with over 10 years experience in search engine optimization, web development and marketing.

Google algorithm update Vince favors Big Brands

Recently there was a rumor on the Internet that there had been a change in the Google algorithm which seems to be helping big brands rank for major keywords. Matt Cutts, the head of Google’s Search Quality group, confirmed in the following video that the rumor was indeed true, and that a Google ranking algorithm update named ‘Vince’ is responsible for the sudden boost in rankings for big companies on major keywords.


As Matt Cutts explains in the video, this is not a major Google update, but just one of the over 300 changes which Google carries out a year to enhance the quality of their search results. He further explains that this change does not favor any specific brands, but rather gives a boost in rankings based on relevant criteria such as trust, authority, PageRank or reputation.

Website Grader - Check SEO Score of Your Site

Website Grader is a useful tool for any webmaster: a free SEO tool that measures the marketing effectiveness of a website.

It checks things like how many del.icio.us bookmarks your site has, its Google PageRank, Alexa traffic rank, inbound links and social popularity, and then rolls them up into an overall SEO score for your site.

Link : website.grader.com

SEO Score: explaining Googlebot’s perspective

SEO Text Browser is an easy tool which allows the user to view a website from the same perspective that Google does when it crawls it. As Google builds its search engine every day, it sends an agent out into the world called Googlebot. This agent could be described as a robot scout because it looks around your website and then reports back to the Google mothership on everything it has seen. For those of you new to Search Engine Optimization (SEO), Googlebot is the most important visitor to a website. Depending on the experience, it could recommend a website either be awarded top placement or be buried in a dusty janitor’s closet in building 46 at Google World Headquarters.

Today we are taking a look at our SEO Text Browser. This is a tool we wrote which loads a page in real time and then gives quick SEO tips. There is no reason for a sloppy page, so the first thing it does is make a quick check to see if there are any missing tags. Missing meta descriptions and Alt tags on images are common mistakes made by beginners and experts alike. For example, to the right we are taking a look at MattCutts.com. Every time a whois record is loaded on DomainTools.com, we fetch a copy of the homepage for that record (sometimes a cached copy). We then display how any search engine robot would view that page. You can locate this widget on the DomainTools’ whois page on the right hand side of any record.

Above the whois, we also have a summary display of what the search engine robot would have seen. As we can see in the example, Matt Cutts does not have a meta description for his page. For those of you that do not know, Matt is a Google Engineer, so it is a bit ironic that he does not populate this. Overall Matt’s front page lacks a lot of things and his score was a 68% for on-page content.

It is important to note, the SEO Score is only calculated for on-page information found on the front page of a website, so it is possible for everyone to achieve a perfect SEO Score of 100%. When we first launched the SEO Text Browser, we discovered we had missed a few Alt tags on our own front page. The browser can also be undocked and taken for a test drive on any web page, not just the front page. Just click on the browser in any whois page and a full browser will pop out of the page. As you browse the website, you are now viewing the website as a robot would see it. Robots cannot see images or read them, so be sure to use Alt tags and text on webpages.

The SEO Text Browser runs on JSON, so it will be possible for us in the future to allow anyone to take a copy of this program and embed it on any website. Look forward to this in a future release.

What’s Your Web Site’s SEO Score?

Almost anyone who has used the internet for the past three years or so would have stumbled upon the term SEO. Those with web sites would actually care what SEO means.

Anyway, if you want a simple indicator of how potent your web site is in the SEO department, then you might want to check out Web Site Grader. The ranking score and explanation are easy to understand and would not scare away those new to SEO.

In case you’re wondering; HTNet got a score of 89. Not bad, eh? ;)

Thursday, March 12, 2009

Handling 301 Redirects with PHP

What To Do With Your 404 Errors

Part 2 of the SEO for PHP Series

Blast! I’m late for my Nunchucks class and I can’t find my keys! I was pretty sure I left them right here by my laptop. Ah ha, there they are—glad I finally found them. Good thing I did, ’cause I need the practice. Check me out at last week’s class.

Don’t you hate it when stuff gets lost? I sure do. When you’re sure that something is there and then all of the sudden, 404! It’d be helpful to have a little 404 post-it note show up on my shelf when my wallet is not in its place. Or, when sitting down to dinner I’d see a big foam 404 cutout where my son should be but is instead still outside playing.

Well, as much as it bugs you, it’s probably just as bad for your site’s visitors to find a link somewhere in a search engine or blog, and come to your site just to get zinged by a big 404 message (the system is down, yo!).

Even cooler than the 404 post-it notes would be little 302 messages, with detailed information pointing me to where the object of my search hides. I can see it now, “302: Keys temporarily moved between the couch cushions.” Of course, you never really see 302 messages on the web, but it would be nice in real life.

So why not equip your site with a clever little something to take care of any misplaced pages? You may even go as far as to write a little script (or borrow the one I’ll include here) to transform those 404s into 301s and get your users back on the road again. Here’s the process I’d follow:

  1. Get yourself a Google account and add your site to your webmaster tools if you haven’t already.
  2. Make a list of 404 errors that your users have come across on your site.
    • Find out what 404 errors Google has found (look through your webmaster tools).
    • Look in your server logs for other 404 errors.
  3. Search through search engine results for problem pages.
  4. Create a PHP “map” page (essentially an array of missing pages and their new locations).
  5. Add a small piece of code before all other code processed on your site to handle redirects.

*A quick side note: right in the middle of typing this entry, I found Nelson’s enormous water jug here in my office—must have been left here sometime today. I just sent it back to him with a friendly 302 post-it note attached “302 Redirect: The water jug you’ve been looking for was temporarily moved to Peter’s office”. Wow, these notes are sure helpful! I need to get a patent attorney on the phone quick! Alright, back to the post…

Remember that not all the 404 messages that are dished out on your server need to be redirected. Some of them are legitimate status codes for pages people have requested that you really don’t have on your site. So sift through and create your list of only the pages you want people to still be able to find. Once you’ve identified these pages, organize them into an array map list like so (I’ve created a file called redir_map.php):

  $redir_map_arr = array(
      'productsconnect.htm' => 'products',
      'press-releases/press-main.htm' => 'news/press-releases',
      'current-news/news-main.htm' => 'news'
  );

OK, I’ve only got 3 pages to redirect here in this example, but it should still be pretty fun. Save the redir_map.php file somewhere that makes sense on your site. Now on to the implementation.

Depending on the type of structure you have on your site for delivering pages, you’ll have to find a way to include this little redirection map before ANYTHING else on your site is processed. This is very important because we’re going to send some HTTP status codes to the agents requesting pages on our site. If you’ve already sent headers, you’re likely going to run into problems.

I personally like to build sites to redirect all incoming traffic to the index.php page (done using an .htaccess file and explained in a future blog entry). This keeps my sites DRY–I don’t have to put includes on every page for a header and footer and what not. So I put the following PHP code in my index page before just about everything else (you’ll have to determine where to put this script on your own site—just ensure it is the very first piece of code on your site that sends headers):

  require("redir_map.php");

  foreach ($redir_map_arr as $old_url => $map_to) {
      // Does the requested URI contain one of the old page paths?
      if (strpos($_SERVER['REQUEST_URI'], $old_url) !== false) {
          // Tell the agent the page has moved for good...
          header("HTTP/1.1 301 Moved Permanently");

          $host  = $_SERVER['HTTP_HOST'];
          $uri   = rtrim(dirname($_SERVER['PHP_SELF']), '/\\');
          $query = !empty($_SERVER['REDIRECT_QUERY_STRING']) ? '?' . $_SERVER['REDIRECT_QUERY_STRING'] : '';

          // ...then send it to the new location and stop further processing.
          header("Location: http://$host$uri/$map_to/$query");
          exit;
      }
  }

Alright, as you can see, I’m requiring the redir_map.php file we created previously. I then loop through each name value pair in the array (the key as $old_url and the value as $map_to). Then I look for the page that’s being requested by the incoming agent, and see if it matches something in my list. You could write a regular expression for more sophisticated matching, but this works for simple stuff.

Then comes the meat of this operation. The first thing I’m going to do is send a header telling the agent that the file it’s requesting has been moved permanently. This way, Google, Yahoo et al will make sure to index the correct page, leaving the old reference url alone.

The $host, $uri and $query variables should be obvious (see exception below). After setting those, I simply redirect the request to the new location using the PHP header function again.

Last, I exit out of PHP execution. You’ll want to do this just to make sure your script doesn’t continue on with whatever it would have done had you not just redirected your visitor to the new place.

That’s it! I’m going to leave comments open on this to see what other ideas others may have, or see what others have done for this same problem. Thanks everyone—and keep an eye on those keys and water-jugs!

Optimizing for Navigational Searches

People search for many different reasons. These myriad reasons can be broken down into a few wide segments: informational, transactional, and navigational searches. There are millions of searches where people simply need information, but often even informational searches are just the early stages of the purchasing process. It is important to understand the different ways and reasons that people search, and to be sure that your site is optimized to be found at each stage.

The type of search that SEO people rarely talk about is the navigational search. Navigational searches are search queries where people search for your exact domain name or brand name. Basically, they know exactly what they are looking for, but they choose to search to find it instead of typing it into the browser address bar.

Don’t think navigational search is a big deal? Check out the top 10 overall searches for July 2008 from Hitwise: myspace, craigslist, ebay, myspace.com, youtube, mapquest, yahoo, facebook, www.myspace.com, craigs list

Compete data shows 17% of searches are navigational. Why are there so many searches for domain names and site names? Many people will enter the URL or site name into the search box, it magically appears as the top link, and they click on it. Sure, it adds an extra click that really isn’t necessary, but that’s how they learned to find things on the web, so why change if it works?

As marketers, we need to understand that this is a huge part of the way people navigate the web and we need to be sure our sites are optimized to show up when people are looking for us.

Domain Name Searches
First off, make sure your site is indexed in the search engines. Does your site show up first in a search for your domain name? Unless your site is penalized, it should show up at the top of the results when you search for the domain name. Follow SEO best practices with regards to your website architecture and content. Don’t hide your content behind javascript or Flash navigation. Include a link to your HTML sitemap in the footer of your site. Make sure your XML sitemap is updated and submitted to the search engines. Be sure you are blocking any directories and pages of your site that you don’t want indexed, but be careful to not deny the search bots from indexing the pages you want them to find. Adding a / in the wrong place in your robots.txt file can get your site removed from the search engines.
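
To make that last robots.txt warning concrete, the difference between blocking one folder and blocking your whole site is a single stray slash. A hypothetical example (the /private/ directory is made up):

  User-agent: *
  # Keeps crawlers out of one directory only:
  Disallow: /private/

  # A lone slash, by contrast, tells every bot to stay away from the ENTIRE site:
  # Disallow: /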

Company/Brand Name Searches
Brand name searches can be an interesting challenge. It seems like it would be so obvious to the search engines and your site would automatically show up at the top for all searches for your brand name. Sometimes that is the case, but the larger your brand is, the more competition you will have for your own name. This could come in the form of affiliates, review sites, news articles, press releases, and many other types of pages. Usually it is fine to have that stuff showing up, as long as your official site shows up at the top. To make sure it does show up, be sure to feature your brand name(s) prominently on the site in textual format, not just graphics. Most of the time you will have plenty of links to your site using your brand name as the anchor text. If your site isn’t on top, however, or if you have a newer site or brand, it might take a while to get enough link juice to move to the top for your own name.

Another issue that I’ve seen on occasion is when you run into competition from other companies or products that have the same or similar brand name. They might be in a completely different market, but if they have the same name, you could have a hard time beating them for your own name if they are more established and have more link equity than your site. Another thing to consider with name searches is misspellings or spelling variations. You usually won’t want to look stupid by misspelling your own company name in the content on your site, but a few low-profile links with the misspelled version will often do the trick. Sometimes Google figures out what they meant to type and serves up your site anyway, so you should check to see what shows up for common misspellings.

International Search Engines
If you have content catered to an international audience, you should make sure your site is showing up for brand/domain searches in those countries. Google is the biggest search engine in most countries, except Baidu in China, Yahoo in Japan, and Yandex in Russia. Make sure your site is listed and shows up for your name in the top 3 search engines for the countries you are targeting. The easiest way to get top billing in the country-specific search engines, including the “pages from [country]” searches in the localized Google search, is to have a separate, localized site on the country’s preferred TLD. You can also set up international sites on subdomains or subdirectories of your main domain. Then you can specify to Google which country each subdomain or directory pertains to.

Product Name Searches
Although not exactly navigational searches per se, I wanted to mention product name searches. People might search for a product name or SKU when they know exactly what they want to buy. They might be comparing prices or just looking for the best place to purchase. Or they may be looking for reviews and feedback from other purchasers of the same item. Or it could be existing customers looking for support information or accessories for their product. Whatever the reason, you would be well served to have your product pages showing up for these keywords. The best approach will depend on how competitive your products are. If you are the only one selling the product, all it will take is to get the pages indexed. If you are competing against thousands of other resellers, it will prove more difficult and you’ll need to do some serious optimizing and link building to those specific pages. Start off with the basics of getting the pages indexed. Spend some time searching for your products and see what you find. Analyze the level of competition and put together a plan to get your pages to show up on the first page. You might not get a ton of searches on any single page, but if you sell thousands of products, the aggregate traffic and sales from those keywords will make a big impact.

PPC for Navigational Searches
You should be able to show up for all of your brand terms without having to pay for the traffic. One thing to consider, however, is that if you augment your organic SEO with paid listings, you increase the chances of getting people to click through to your site. This can be especially important if you have competitors bidding on your brand or other navigational searches. You want to do everything you can to make sure they get to your site and not your competition. On the flip side, you could use the same strategy to try to capture some of the navigational search traffic to your competitors by bidding on their brand names and offering a compelling alternative product or service.

More than Just Rankings
One last thing I wanted to mention about optimizing for navigational searches: showing up for your domain and brand names is just the first step. You really want as many of those searchers as possible to click through to your site, so you should pay close attention to the title and description snippet that show up in the search listings. If you have a number one ranking, but no title or description, it will be easy for searchers to skip it for the more compelling link right below.

Stop Worrying About Rankings

Do you search Google for your trophy keyword every morning to see if you’re still on the first page? Do you check more than once a day? If so, you are a rankings junkie and it’s time to shake the habit. Things have changed with the search engines to the point where rankings are no longer the best indicator of SEO success. Honestly, rankings have never been the best indicator of success, but this is becoming more and more important for marketers to understand. You should be focusing on how much traffic is coming from search, which keywords are driving that traffic, and most importantly, which keywords are driving sales.

The most recent twist in the search universe was Google’s announcement this week of the launch of SearchWiki. SearchWiki gives registered Google users the opportunity to mess with the position of sites in the SERPs. Basically, it gives me as a searcher the opportunity to pick which site shows up in the top spot and get rid of all the crap that isn’t relevant for any given search. Sound incredible? Don’t get too excited, any changes I make will only be visible to me when I’m logged into my Google account. However, position adjustments and comments people make in SearchWiki can be viewed by other searchers if they click on the “See all notes for this SearchWiki” link at the bottom of the page. If this catches on, and isn’t ruined by spammers, I expect SearchWiki to gradually gain more importance in what people see when they search. The first step will be to allow the option to let people’s search results be influenced by friends’ wiki changes, and then Google could start including aggregate wiki data as part of their search algorithm for the general public.

At the recent PubCon conference, representatives from all three major search engines spoke about how each is trying to offer more than “ten blue text links.” What they mean by this is that rather than the traditional 10 text links to web pages when you search, they are starting to serve up other types of content that is relevant to your search query. We’ve been hearing about this trend for the past couple years, and it has gradually become more prominent in the search results–known as a Universal or Blended search. Any given search could yield results for images, video, shopping, blogs, local maps or news. Rather than just links to these other types of media, they are often embedded right in the search page. With this shift away from the standard “10 blue text links,” it changes the paradigm of search engine optimization. While optimizing web pages is still important, if you aren’t creating and optimizing a wide array of digital assets, you are missing out on a huge opportunity to get your brand in front of searchers.

Another major shift in the search engines that will continue to have a huge impact on search marketers is personalization of the search results. The search engines are starting to customize the search results for each individual searcher based on their search history, geographic location, or other demographic factors. This change makes it futile to focus on search engine rankings, because the ranking will vary depending on who’s searching.

Mobile search is another important area to consider. The newest smart phones like the G1 and iPhone make mobile search look a lot like regular web search, but it is still a different experience searching the web from a mobile device. It’s a much smaller screen, and people aren’t usually searching for the same reasons they would search at the office or at home. Mobile search centers more around local search–it’s about finding restaurant reviews, phone numbers, directions, stuff they need to know when they’re on the go. Often, the default Google search from a mobile phone serves up search results that are localized to the searcher’s location. Search tools like Google’s recently launched voice search application for the iPhone, and ChaCha, which has been around for a while, give people the option to speak their search queries, or even send them via SMS text messages.

What does all this mean? It means we need to rethink how we look at search engine optimization. We need to do all the little things to make our sites relevant for our keywords. We also need to think beyond our own websites and provide unique, valuable content in as many different formats as possible. Focus on being relevant to whatever and however people might search, and your traffic and sales numbers will tell you if you are hitting your target.

New Canonical Link Tag Will Help with Duplicate Content

Just like the lamb lying down with the lion, the big three search engines came together for a rare joint effort to announce the launch of a new feature that will help ease the world’s duplicate content problems. The new feature, called the canonical link tag, tells the search engines which version of a URL is the correct one to index.

The link tag goes in the head section of the page and looks like this:
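
  <link rel="canonical" href="http://www.example.com/product-page/" />

(The example.com URL is only a placeholder; point the href at whichever version of the page you want the engines to index, and put the tag on each duplicate variation of that page.)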

The search engines have been doing a pretty good job figuring out the right URL when people use redirects properly, but it’s not always easy to get it right–especially when we build these nutty sites with 50 different URLs pulling up the same content. This change should help a lot of webmasters sleep easier knowing that their proper, canonical URL will be indexed and given all the link juice it deserves. I look at it like a page by page sitemap to tell the search engines what URL to index.

Official blog posts about the new link / canonical tag:

Yahoo

Google

Microsoft

Coverage on other blogs:

Search Engine Land

Joost already created a WordPress plugin for this feature

3 Reasons to Use rel=canonical, 4 Reasons not to use it

Trouble with Google Search Partners Metrics

Back in October, Google Adwords separated the metrics for traffic from Google and their Search Partners. We all had our assumptions about how our campaigns were performing in the Search Network, but this update allowed us to see the actual numbers.

Google Search Partners Statistics

For the most part, this new information is more informative than actionable. Without knowing which exact partners the traffic is coming from and without having the ability to opt out of individual partner sites it is hard to make any significant changes to your campaigns. But there are still some little adjustments that can be made to improve your campaigns.

One of the main things you can do with these metrics is work to improve your Quality Score. Your Quality Score is only affected by ads running on Google. So if you look at a campaign and take out all the Search Partner traffic, you can see the exact metrics that are driving your Quality Score and make adjustments to keywords and ads where needed.

I was making these exact adjustments to one of my campaigns a few weeks ago and ran into some trouble. The trouble didn’t come on the day I made the changes to try to improve my Quality Score; it came a few days later. I came back to the account after a few days, with the intent of making some major bid increases in anticipation of increased traffic for the holidays. I jumped in and started looking at data for the last 30 days to see which terms were performing well. Here is where my trouble came. My settings in the account were still set to look at Google traffic only. So as I started to make bid changes, I was making my changes based on incomplete data.

Fortunately, I noticed that the numbers weren’t adding up and changed the settings before any of the new bids went live, but it still scared me a bit.

I now have a new step in my process whenever I make major changes to my Adwords account: change the “statistics” setting to “summary.” Hopefully, this will help me avoid ever making this mistake again.

3 Ways to Increase Sales in Google Adwords

The competition in PPC advertising is increasing daily. With this increased competition, it is increasingly important for advertisers to continually build and improve their campaigns. Everyone should be expanding their keyword list, improving their ads through split testing, and managing their bids on a regular basis.

In addition to these account best practices and other account management techniques, there are three things that can help you drive more traffic and generate more sales for your site. They are: increase your negative keywords, use the Placement Network, and try out the Display Ad Builder.

Increase your negative keywords

Adgroups that only have [exact] and “phrase” match keywords are missing out on related keywords that are only triggered through broad match terms. Some people shy away from broad match terms because they can bring in unrelated traffic. No one wants to pay for traffic from unrelated keywords. The answer is to expand your negative keyword list.

By expanding your negative keyword list, I’m not talking about adding 10 to 20 more terms. I’m talking about adding hundreds to thousands of negative keywords. Aside from adding the common sense negative terms that you can come up with on your own, there are two places you can look for negative keywords in your Adwords account. The first is by running a Search Query Performance Report.

You can create a Search Query Performance Report by going to the Reports tab, creating a new report, and then selecting the Search Query Performance under Report Type. Run this report on all the campaigns in your account that have broad match keywords in them. This report will show you the exact keywords that triggered your ad for the time period. Pick out the terms that don’t relate to your product and add these to your negative keyword list for that campaign. Running these reports on a campaign level is the way to go, since you add the negative keywords on a campaign level as well. Don’t forget to also look for the terms that just aren’t performing. If you have a keyword that appears to be related but isn’t making any sales, add it to your negative list as well. There is no point spending money on a term that doesn’t perform.

The second place you can go to build your negative keyword list is the Keyword Tool in the Tools section of your account. Put in one of your keywords and search the results for any terms that don’t relate to your product. Add all of these terms to your negative list.

Building out your negative keyword lists will prevent you from showing up for unrelated terms while still taking advantage of all the long tail keywords related to your web site.

Placement Network

Many Adwords users have tried the Content Network, wasted a lot of money on worthless clicks, and have written off the Content Network as useless. It is time for you to reconsider. The Placement Network now allows you to pick the exact web sites you want your ad to show up on. You don’t have to worry about showing up on random sites throughout the network and wasting your ad dollars. You pick related sites where your customers will be and your ads will only show up on those sites.

Here is an example of how this works. Let’s say you sell women’s briefcases. Once you’ve activated placement targeting in your campaign settings, go to the Placement Tool to search for related sites. You can search by category, topic, URL or demographics. Use all these different search functions to find possible sites. In this case, the demographic search will be very helpful as you can search according to gender, age, and income. For women’s briefcases, I could look at sites that target women ages 25 – 64, with household incomes over $100,000.

Here is a short list of some of the sites that showed up using these search tools: www.downtownwomensclub.com, www.womanowned.com, www.thefashionspot.com, and www.businessknowhow.com. Just add the web sites that you know for sure your customers visit. Don’t worry about being everywhere. Your customers visit more sites than just yours and Google.com. Using the Placement Network in the right way will expand your presence and bring in more traffic and sales.

Display Ad Builder

The Display Ad Builder is a new tool Adwords introduced to help advertisers who don’t want to (or can’t) pay a graphic designer easily create display ads of their own. If you have a successful Content Network campaign, you should create some display ads and see how they perform compared to your text ads. In many cases, you’ll find you get a better CTR. In some cases though, the extra traffic won’t necessarily mean extra sales. Keep a close eye on these ads and make sure they are performing and bringing in sales.

Top 6 SEO tips for bloggers

Millions of blogs, and only the top 10 results. It seems like a recipe for headaches and back pain. If you don’t mind, I’ll see if I can give you some pain medication in the form of tried and true methods for getting your blog to show up in those top 10 results.

1. If you are using WordPress, start by installing the All in One SEO Pack. You can specify unique titles and descriptions for each post using the plugin. The reason you want unique titles and descriptions is so that single posts show up more easily for long tail keywords, which sometimes bring in visitors or clients 6 months down the road. It’s always nice to be there for an obscure search term, so you can beat out your competitors who aren’t showing up for that term.

2. Headlines: Keep them to no more than 60 characters. How-tos and top 5 or 10 lists usually work best to bring in readers. It is also easy to Digg a “How To” list or “Top 10” list. If your keywords are “Internet Business” or “Movie Critiques”, an example could be “The Top 5 Movie Critiques for Online Shoppers.”

3. Text: Make your paragraphs no more than 6-7 sentences. Lists and bullet points are easier on the eyes and help people read your whole post. Obviously your keywords are an important consideration; have your keyword research handy when writing any post. First and foremost is your reader, though, so don’t sacrifice semantics/readability for your keywords.

4. Links: Link to influential blogs or sources as much as possible (no more than 8 links in one post though, you don’t want the reader to get all confused by the abundance of outbound links). There are at least two reasons for this: 1) If you link to an influential blogger, they will see your link, and possibly reciprocate one back to you if your article is good enough. 2) Trusted sources are worth their weight in gold if you want to be seen as an industry expert and to keep your readers coming back for more.

5. Ultimately you will want people to read what you write on your blog. That’s why we have to prepare your blog in such a way that the search engines will find it easier. There are ways to get immediate traffic to your site using StumbleUpon or Digg or some other social bookmarking service, which we use extensively. But the real value in having your post or article go “viral” is in the links that almost automatically come from people who like what you have written. Relevant, keyword-targeted text links from a high profile blog or site count a lot toward how well you rank for a particular keyword phrase. Use Stumbling and Digging as much as needed, but first write good content and make your site search engine friendly.

6. Last but not least, make your RSS feed readily available at the top of your page somewhere. That is usually the first place people look for an RSS feed if they like your post, and if someone wants to link to you or a post of yours, you should give them as many options as possible to do so. An RSS feed is great for publishing content that will then get read and possibly linked to more often.

Using Facebook to Promote Your Business

Again I come to you with an offering of sweet tips on getting links to your site as well as promoting your business. My platform today is Facebook. I seem to spend a lot of time on that site, because they have everything I need in a social network: Video, Audio, Images, Apps, Games, links, etc.

So how and where do you go to promote your business on Facebook so that your personal profile isn’t associated with your professional site?

Here’s a step by step guide:

1. Login to your Facebook account. If you don’t have one, it takes 5 minutes to set an account up.

2. Scroll down to the very bottom of your home page, and click on the ‘Advertising’ link in the footer.

3. Click on the ‘Pages’ link at the top of this page

4. Then click on the link that says ‘Create a Page’ (it’s kind of hard to see)

5. Now Select Your Category (let’s say you’re in the printing industry)

6. Now go through the process of uploading your image, writing in some info about yourself, and publish the page. Now comes the meat and potatoes. Go to your admin area by clicking on Edit Page:

7. Then click on the link that says ‘More Useful Apps’

8. Now you’ll see a list of the most useful apps for pages. I’ve circled the best ones for SEO and user friendliness:

9. The FBML app allows you to add static HTML to a widget in your page’s sidebar. That means links, links to images, and there are a number of FBML markup tags that you can use in it as well. All dofollow!

10. The Blog RSS Feed Reader sometimes doesn’t work correctly, and you have to know the RSS feed of your blog (but I’m assuming that if you have a blog, you’ll know what your RSS feed is). When this app works correctly, though, it will add an RSS feed of your 5-10 most recent posts, pulling in the title of each post, with dofollow links to each blog post. Very handy!

To sum up, if you are using Facebook, the 2 best apps for links that are helpful to your fans are the Blog RSS Reader, and the Static FBML App.

Beware of Spammers in Capitalist’s Clothing

Time for a little rant.

I’ve been researching Google’s initiative to crack down on websites selling and buying paid links. I’ve let myself get a bit riled up as I’ve re-discovered the following:

  1. Too many people still think Capitalism is somehow inherently evil
  2. Some usually bright people have an amazingly hard time distinguishing between spam and good content

I’m not going to spend a lot of time on why Capitalism is not inherently bad. Go read Mike Mann’s book on making change—it’s a free book (making it appealing even to anti-capitalists).

On point number two I’ll voice a few more thoughts. Aaron Wall wrote an insightful post on new link strategies that people have employed to avoid having to purchase links outright. Some of the comments to that post just about killed me.

One comment reads, “It can’t be long until Google starts detecting these types of strategies.” An astute retort followed shortly, “Never going to happen. What is there to detect? Good content written by an author who writes about the field? Sorry, writing guest posts/content is as legitimate as it gets.”

After reading so many comments on similar blog posts, I got the feeling that there are many people out there who must have been bitten so many times by the spam bug that they can no longer see the difference between junk and good content.

What do people expect? Should Google be penalizing online newspapers because their journalists get paid to produce the content? Should Google ban their own site for offering up paid listings?

I think some people have this idea that any website actively trying to get links, traffic, or any other type of attention is spam, or at least in the same category. They think that any site attempting to draw traffic must be doing so surreptitiously, or behind some clandestine operation. Not so! These are surely the same people who think Wikipedia would turn to the dark side by posting ads on the site. I’ve got news for you, people: most of the sites you read that have content worth reading exist because someone is getting paid (refer to the link to Mike Mann’s book above).

The difference between spam and good content lies in context and relevance—two things that these spam crying scuttlebutts should be able to determine. Google doesn’t claim any artificial intelligence and they seem to be able to do a good job most of the time.

Don’t get me wrong, I hate spam too. But you need to know the difference. Here is a very succinct and simple way to distinguish spam from quality content for those of you who have a hard time telling the difference: spam will always appear unsolicited and out of context. Both attributes must accompany any content for it to be categorized as spam. If you have a site that has relevant content about a particular subject and it is accompanied by pertinent ads, you are not looking at spam. Read this entry by Matt Cutts for other good insights.

How do I get my Site Indexed in Google?

So you have put together a website and have it published on the web. All the content is in place and all your site’s functionality is working properly. What comes next? This question can be answered by answering another question I have seen several times on the forums at SEO.com and other forums that I frequently visit. This question is, “how do I get my site indexed in Google?”

How does someone get their site indexed in Google and the other search engines? I have seen many different answers to this question. Some of these answers are, “submit the site to Google” or “link to your site from blogs and forums” or even “submit your site to some directories.” While these methods may all work, there is one really effective way to get your site indexed fairly quickly in Google. I have used this method for several sites that were not indexed yet, and each site was indexed in Google within about one week. This is done by submitting a sitemap in XML format to Google through Google Webmaster Tools.

When you create your XML sitemap for Google you can also submit it to Live Search through their Webmaster Center. You should also be sure to reference your sitemap in your robots.txt file. To reference it, just add the following line to your robots.txt file and insert your own domain name.

  • Sitemap: http://www.example.com/sitemap.xml
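
If you have never built one before, a bare-bones sitemap.xml is simply a list of your URLs in the sitemaps.org format, something along these lines (example.com, the date and the change frequency are placeholders):

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>http://www.example.com/</loc>
      <lastmod>2009-03-01</lastmod>
      <changefreq>weekly</changefreq>
    </url>
    <url>
      <loc>http://www.example.com/about/</loc>
    </url>
  </urlset>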

Submitting a sitemap doesn’t mean you don’t have to do those other things I mentioned as answers to getting indexed. Building inbound links to your site will not only help with getting indexed, but will also help with improving the rankings of your site. You should continue to look for good places to get inbound links. Directories, social media, blogs, and forums are just some of the places where inbound links can be acquired. But building inbound links is a topic for another blog post.

Keep up with the latest SEO techniques by joining me at our SEO Forums.

Where did I put that SEO button?

For all you people out there who are looking to do SEO, I have some seriously bad news. You might want to sit down for this. Ready? There is no instant SEO button. I’m sorry to have to bear this bad news to you, and I hate having to be the one to break it to you. There’s no switch either. Or simple form to sign, trick to use, or connections you can have with people on the inside. If you want to be there in the top of the field with the best sites, you can’t just call up Google and say “Ok I’m ready.” It takes work and it takes time.

Even though this is fairly well known by now, it’s tempting to think that SEO is that simple. Regardless of how good your SEO firm is, you still have to be a relevant site. Even then, it will still take work and effort to get you to the top of the search engine rankings.

It will happen, from time to time, that a new site will get into a contract for SEO and stop its own development, essentially filling that one basket with all the eggs. This actually makes things more difficult for the SEO firms. The fact of the matter is that we don’t suddenly make your site more relevant to people searching on your terms. We work to make it so that Google can see your site better so they can decide how relevant you are. We will make suggestions on how to make your site more relevant to your users. A hard, but necessary, question to ask yourself is whether your site really is the best site to show up for the given keyword. If it’s not, perhaps you need to make some adjustments.

Here are a few tips to making your site the best site available for your keywords:

  • Make sure you have some method of keeping your site up to date, and a source of information (where possible). A blog, or news section works well for this idea.
  • Don’t fall too in love with the overall design and look of your site. Be willing to make changes, and reorganize and restructure how the site works.
  • Consider keywords that don’t have corresponding pages. Perhaps pages need to be created to fit that missing piece.
  • Most importantly, continue to develop your site like you would your business.

In the end, having a website that people want to find makes SEO that much better and faster. Working with your SEO service provider to make sure that you really are the best site out there will do wonders, and not to mention make a lot of people happy–including your SEO firm.

On-Page Optimization Tips

There are many factors that contribute to showing up well in search engine results. Some of the most important items are found on the web pages themselves and are called “On-Page Optimization.” Some of the areas of concern on websites include:
  • Page Title
  • URL
  • Meta Description
  • Alt Attributes for Images
  • Content

Before doing anything with On-Page Optimization (or any other SEO, for that matter) you first have to choose the right keywords for your site. You can learn more about keyword research in one of our previous posts.

Page Titles

Titles are very important to on-page optimization. The titles show up at the top of the browser and are often used by search engines as the title in search engine results. The title is a good place for keywords. Make the title direct, to the point and ensure that all the pages on your site have their own unique titles. For example, a page about PVC pipes on a plumbing site could have a title of “Cheap PVC Plumbing Pipes | Joes Plumbing Supplies” (I had to use Joe the Plumber as an example, couldn’t help myself). Notice that with one long keyword term many targeted words are included. By using this keyword phrase a search engine is likely to serve this page when people search for PVC pipes, cheap PVC pipes, cheap plumbing, cheap plumbing pipes, cheap plumbing supplies etc.

URL

Often when sites are designed using a Content Management System (CMS), the URLs on deeper pages look like you just spilled a cup of alphabet soup on your desk. Sometimes these URLs are dynamic, so they change every time you visit the page, making it a huge problem for the bots to properly index the pages. Unfortunately the URLs also have nothing to do with your targeted keywords, unless you want to show up for po=J0ro57v. Many CMSs have modifications that can be added to make URLs search engine friendly. For example: www.joesplumbingsupplies.com/products/pvc_pipes is a URL that is search engine friendly, because it is short, descriptive, and includes relevant keywords.
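
On an Apache server, the usual way to get a clean URL like that is a rewrite rule in your .htaccess file. A minimal sketch, assuming a hypothetical product.php script that looks items up by name:

  RewriteEngine On
  # Map a keyword-friendly path to the real dynamic page,
  # e.g. /products/pvc_pipes -> /product.php?item=pvc_pipes
  RewriteRule ^products/([a-z0-9_-]+)/?$ /product.php?item=$1 [L,QSA]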

Meta-Description

The meta description shows up in the code of the page. Make sure each page on your site has its own unique meta description tag. The meta description is often used to describe the page in the search engine results after listing the page title. Having a relevant meta description increases the chances that people will click on your listing. It can also be used to include a phone number or some other call to action.
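
Pulling the title and meta description advice together, the head section of the hypothetical Joe’s Plumbing page might look something like this (the phone number and wording are purely illustrative):

  <head>
    <title>Cheap PVC Plumbing Pipes | Joes Plumbing Supplies</title>
    <meta name="description" content="Discount PVC pipes and fittings in every size. Order online or call 555-0123 for same-day shipping from Joes Plumbing Supplies." />
  </head>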

Code Structure

When structuring the body of the site, use h1 and h2 tags. The search engines place a high priority on words in the h1 and h2 tags. Getting ranked first in Google will be of little worth if the content on your site is not relevant or the content is confusing. Sometimes coders forget that people are reading the page and not just some random Google bot. When designing and coding sites, always place priority on the humans coming to the web site over the search engine bots. Be cautious about what content you place in Flash. Although Google can now detect the text inside of Flash files, it is still better to place keywords in the HTML surrounding the Flash. Some sites use JavaScript extensively for site structure and navigation. Because bots tend to get confused when reading JavaScript, avoid using it this way. Many coding geeks have debated whether the bold tag or the strong tag is better for SEO. According to Matt Cutts, Google places the same weight on both tags.

Alt Attributes

Oftentimes when people do on-site optimization they overlook placing alt attributes on pictures. By properly describing the pictures on a site, you show the search engines that even the pictures are relevant to your keywords. You are also more likely to show up in the results when people search for images.
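For example, a product photo might be marked up like this (the file name and description are hypothetical):

<img src="/images/pvc-pipe-10ft.jpg" alt="10 foot PVC plumbing pipe">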

Content

The last point is one of the most important. Whether you have content to sell your products or your product is the content, it’s important to have enough relevant content on all your pages. Make sure the entire page reads smoothly and is understandable, because humans buy products, not search engine bots. It’s good to link out to other sites as long as they are relevant, but be careful of having too many outbound links on any one page. Many people have the misconception that you build a website and then kick back and watch the money roll in. Building a website is a job that is truly never done. You should always be updating and modifying the information on your site. If you want people coming back to your site, you have to give them something to come back for. By adding new content to a site you also improve your chance of ranking well in Google. Regular updates will help your site get indexed faster, and search engines will see frequently updated sites as up to date, accurate and relevant.

On-site Optimization should be the first thing you do after determining your keywords. If you follow these simple guidelines you will find that you will rank much better in search engines.

What Does Alex Rodriguez Have to Do With SEO?

Disappointed? Yeah. Threw me into my low-mid-life crisis. We should have held a memorial for baseball as we knew it.

I remember the years when baseball was a man’s sport, not a cheater’s sport. I loved it. I couldn’t miss a Mariners game rain or shine…more rain than shine, but the Kingdome ruled. I was there to watch Griffey’s rookie season, the Jr. and Sr. tandem, his first grand slam and A-Rod’s inaugural home run. Jay Buhner, Pete O’Brien, Dave Valle, Harold Reynolds, Tino Martinez, Omar Vizquel, The Big Unit…all ancient memories now. We’re talking stone-age ancient, or so it seems.

When the recent news came out about Alex Rodriguez, I was broken for a day. Extremely disappointed. Literally every last one of my hopes for baseball’s future was finally strung out to fade in the sun on an unreachable limb of shame.

How does baseball bounce back from having its icon player completely, publicly bamboozled? Who knows. Alas, we’re not here to talk about the MLB and roids. Thankfully.

Something Profound Amidst the Muck

And what else is there to do but apply the A-Fraud stuff to SEO? Yup, we can apply what A-Rod is going through to SEO. It’s brilliant, check it out.

A-Fraud broke the cardinal rule of SEO, and, well, of life too. Though steroids weren’t part of the day-to-day conversation back then, he marginalized his personal brand integrity and partook of ‘grey area’ stuff without even considering the consequences. Steroids were part of the ‘loosey goosey’ era in baseball where players, coaches and trainers just looked the other way and let it happen (I would have considered it flat out cheating, but, to be fair, it actually was a grey area back then. It baffles me as to why, but that’s just how it was).

Every day at SEO.com we come across webmasters that have doomed their business to SEO failure. They’re doing the exact same thing that A-Rod did, but to their businesses. They marginalize their site’s integrity by partaking in grey area, loosey-goosey stuff without concern for future consequences. They pump the juice to get some cool short term results without a thought for future ailments that may come of it. Bad idea. Very bad idea.

When you have an all-seeing and omniscient powerhouse like Google looming over you, as a webmaster, you can’t hide under ‘ignorance’ like A-Rod did for so many years. Google doesn’t care who you are. They know everyone and are everywhere, watching you. Kinda creepy. More creepy knowing they control your web business, at least the free traffic part of it. So you HAVE to make them happy. You have no choice. They are no respecter of persons [or websites]. They will find you and they will dump your site as if it didn’t even exist. And search engine placement resurrection is a difficult, lengthy process.

Avoid Grey-ness Like You Would Steroids

Using A-Rod as an example of what happens when you go the grey hat route: you get hammered when the sun exposes your true colors or when the market/boss changes expectations. In other words, once Google finds out or changes policies (and they will), you get hammered. Once the U.S. found out about her golden boy baller’s ugly, old, stale grey stuff, golden boy got hammered. Wouldn’t A-Rod have been better off steering clear of anything that could be considered cheating? Duh. Your business is no different.

For your main site, things that are considered ‘grey hat’ should be avoided completely. It’s not worth the long term repercussions. It just isn’t. Build your site right the first time. Keep as far from the ledge as possible and you’ll be loved by Google and you’ll find a place in your niche’s Google hall of fame. Good luck A-Rod.

For long term sustainability, steady and white hat will win the race. Don’t go for the ‘quick’ score.

What is Grey Hat?

Here’s the legit definition so bookmark this page or tweet it to your friends because this is good:

Grey hat SEO is the employment of techniques that are not technically against Google’s policies but that don’t provide value and are done solely for SEO purposes without regard for user experience.

Here’s the kicker. Pay attention. Grey hat today can literally become black hat tomorrow.

I’ve seen grey turn to black too many times. Here’s an example: Remember when blog commenting was the craze about 18 months ago? Everyone came out with their own software to almost automate blog spamming because it was cool. An immediate billion anchor text links!

All of a sudden, Google laid the smack down and it hurt. Blog spamming will kill you now. That’s not to say you can’t comment on a blog and point to your site, but you have to actually make a real comment and join the conversation.

***Don’t risk sandboxing your brand for the latest SEO craze. Be real all the time***

What Are Grey Hat Techniques?

Before we lay out the common grey hat mojo, know that there are literally unlimited grey hat areas. When building SEO, focus on adding value and user experience and you will win, always. Period. Limit the things that you do that are ONLY for SEO purposes. Remember, grey hat today can be black hat tomorrow morning, so pick your battles wisely. Don’t blow it like A-Rod did.

These are a few grey-hat tactics that could be just as bad for your site in the long run as steroids were to A-Rod:

• Stuffing keywords in alt tags, link titles and image titles.
• Linking to sites that have nothing to do with your niche or industry.
• Obtaining inbound links that have nothing to do with your niche or industry.
• Purchasing links under the guise of ‘advertising’ or ‘traffic purposes’ when they are really for SEO.
• Social media spam.
• Blog comment spamming is now black hat.

Have you committed the grey hat A-Rod steroid sin? Repent. Do you have anything to add to the grey hat areas that we’ve gone over?

Bling.

Killer Robots From Outer SEO Space: How to Dominate the Robots.txt File

If you haven’t heard of Mr. Robots, don’t blame yourself. It wasn’t even on the SEO map till just a couple years ago. Most of you, however, know what it is but don’t know exactly how to dominate the robots.

Robots.txt files are no secret. You can spy on literally anyone’s robots file by simply typing “www.domain.com/robots.txt.” The robots.txt should always, and only, be in the root of the domain, and EVERY website should have one, even if it’s generic. I’ll tell you why.

There’s mixed communication about the robots. Use it. Don’t use it. Use meta-robots instead. You may have also heard advice to abandon the robots.txt altogether. Who is right?

Here’s the secret sauce. Check it out.

First things first, understand that the robots.txt file was not designed for human usage. It was designed to tell search ‘bots’ exactly how they may behave on your site. It sets parameters that compliant bots obey and spells out what information they can and cannot access.

This is critical for your site’s SEO success. You don’t want the bots looking through your dirty closets, so to speak.

What is a Robots.txt File?

The robots.txt is nothing more than a simple text file that should always sit in the root directory of your site. Once you understand the proper formats it’s a piece of cake to create. This system is called the Robots Exclusion Standard.

Always be sure to create the file in a basic text editor like Notepad or TextEdit and NOT in an HTML editor like Dreamweaver or FrontPage. That’s critically important. The robots.txt is NOT an html file and is not even remotely close to any web language. It has its own format that is completely different than any other language out there. Lucky for us, it’s extremely simple once you know how to use it.

Robots.txt Breakdown

The robots file is simple. It consists of two main directives: User-agent and Disallow.

User Agent
Every item in the robots.txt file is specified by what is called a ‘user agent.’ The user agent line specifies the robot that the command refers to.

Example:

User-agent: googlebot

On the user agent line you can also use what is called a ‘wildcard character’ that specifies ALL robots at once.

Example:

User-agent: *

If you don’t know what the user agent names are, you can easily find these in your own site logs by checking for requests to the robots.txt file. The cool thing is that most major search engines have names for their spiders. Like pet names. I’m not kidding. Slurp.

Here are some major bots:

Googlebot
Yahoo! Slurp
MSNbot
Teoma
Mediapartners-Google (Google AdSense Robot)
Xenu Link Sleuth

Disallow
The second most important part of your robots.txt file is the ‘disallow’ directive line, which is usually written right below the user agent. Remember, just because a disallow directive is present does not mean that the specified bots are completely shut out of your site; you can pick and choose what they can and can’t index or download.

The disallow directives can specify files and directories.

For example, if you want to instruct ALL spiders to not download your privacy policy, you would enter:

User-agent: *
Disallow: /privacy.html

You can also specify entire directories with a directive like this:

User-agent: *
Disallow: /cgi-bin/

This will block all spiders from your cgi-bin directory.

Again, if you only want a certain bot to be disallowed from a file or directory, put its name in place of the *.

Super Ninja Robots.txt Trick

Security is a huge issue online. Naturally, some webmasters are nervous about listing the directories that they want to keep private thinking that they’ll be handing the hackers and black-hat-ness-doers a roadmap to their most secret stuff.

But we’re smarter than that aren’t we?

Here’s what you do: If the directory you want to exclude or block is “secret” all you need to do is abbreviate it and add an asterisk to the end. You’ll want to make sure that the abbreviation is unique. You can name the directory you want protected ‘/secretsizzlesauce/’ and you’ll just add this line to your robots.txt:

User-agent: *
Disallow: /sec*

Problem solved.

This directive will disallow spiders from indexing directories that begin with “sec.” You’ll want to double check your directory structure to make sure you won’t be disallowing any other directories that you wouldn’t want disallowed. For example, this directive would disallow the directory “secondary” if you had that directory on your server.

To make things easier, the disallow directive works as a prefix match, much like a wildcard. If you disallow /tos, it will by default block any URL beginning with /tos, such as /tos.html, as well as any file inside the /tos directory, such as /tos/terms.html.
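For example, assuming a hypothetical /tos directory and terms-of-service page, this single line covers them all:

User-agent: *
Disallow: /tos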

Important Tactics For Robot Domination

  • Always place your robots in the root directory of your site so that it can be accessed like this: www.yourdomain.com/robots.txt
  • If you leave the disallow line blank, it indicates that ALL files may be retrieved.
    User-agent: *
    Disallow:
  • You can add as many disallow lines under a single user agent as you need, but every user agent must have a disallow line, whether it actually disallows anything or not.
  • To be SEO kosher, at least one disallow line must be present for every user agent directive. You don’t want the bots to misread your stuff, so be sure and get it right. If you don’t get the format right they may just ignore the entire file and that is not cool. Most people who have their stuff indexed when they don’t want it to be indexed have syntax errors in their robots.
  • Use the Analyze Robots.txt tool in your Google Webmaster Account to make sure you set up your robots file correctly.
  • An empty robots.txt is exactly the same as not having one at all. So, if nothing else, use at least the basic directive that allows the entire site.
  • How do you add comments to a robots.txt? Just put a # at the front of a line and that entire line will be ignored. DO NOT put comments at the end of a directive line; that is bad form and some bots may not read it correctly. (There’s a short example of the comment syntax right after this list.)
  • What stuff do you want to disallow in your robots?
    • Any folder that you don’t want the public eye to find, or folders that should be password protected but aren’t.
    • Printer-friendly versions of pages (mostly to avoid the duplicate content filter).
    • Your image directory, to protect your images from leeches and to make your content more spiderable.
    • The CGI-BIN, which houses some of the programming code on your site.
    • Bots you find in your site logs that are sucking up bandwidth and not returning any value.
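As promised, here’s a quick sketch of the comment syntax (the directive itself is just an illustration):

# Keep all bots out of the cgi-bin directory
User-agent: *
Disallow: /cgi-bin/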

Killer Robot Tactics

• This setup allows the bots to visit everything on your site, and sometimes on your server, so use it carefully. The * specifies ALL robots and the open disallow directive applies no restrictions to ANY bot.

User-agent: *
Disallow:

• This setup prevents your entire site from being indexed or downloaded. In theory, this will keep ALL bots out.

User-agent: *
Disallow: /

• This setup keeps out just one bot. In this case, we’re denying the heck out of Ask’s bot, Teoma.

User-agent: Teoma
Disallow: /

• This setup keeps ALL bots out of your cgi-bin and your image directory:

User-agent: *
Disallow: /cgi-bin/
Disallow: /images/

• If you want to disallow Google from indexing your images in their image search engine but allow all other bots, do this:

User-agent: Googlebot-Image
Disallow: /images/

• If you create a page that is perfect for Yahoo!, but you don’t want Google to see it:

User-Agent: Googlebot
Disallow: /yahoo-page.html
#don’t use user agents or robots.txt for cloaking. That’s SEO suicide.


If You Don’t Use a Robots.txt File…

A well-written robots.txt file helps most sites get indexed up to 15% deeper. It also lets you control your content so that your site’s SEO footprint is clean, indexable, and ready fodder for the search engines. That is worth the effort.

Everyone should have and employ a solid robots.txt file. It is critical to the long term success of your site.

Get it done.

Bling.

Is Your Content Link-Thirsty?

Let’s face it. Link building is drab work. It’s the kind of work most of us like to hand off to the “next guy.” Nevertheless, I firmly advocate its role in Search Engine Optimization. Google’s key indicator of the “good,” “better,” and “best” websites is the number of backlinks a site has and where those links come from.

I assume you, the reader, have already figured this out. An SEO firm that doesn’t offer link building strategies to its clients might as well pack up. A site with no links is like a telephone pole with no wires—useless. Consequently, instead of asking “if” link building should be done, online marketers are always asking “how” it should be done. What’s the best strategy?

A relative of mine did his research and created an informative website about “Good Security Questions.” That was over two years ago. The site lacked all the bells and whistles that come with newer Web 3.0 sites, but it was content-rich, and, more importantly, there was a niche market that needed the information the site provided.

He didn’t know much about Search Engine Optimization. He didn’t think much about keyword research or quality link building. I spoke to him recently about the results he’s been getting. His site now ranks #1 for “security questions.” He also told me that some larger corporate websites such as American Express, Delta, Prudential, and ING Direct recently changed their security questions to match the list of questions he had written on his website. People are also beginning to link to his site and use his content, with little to no SEO effort on his part.

His results reveal an old rule that’s been around ever since the first Neanderthal showed his prehistoric friends how to make fire: if you want people to listen to you, say something they want to hear. Thus, before you spend time and money developing an extensive link building strategy, first make sure your content is “link-thirsty”—make it useful, interesting, timely, or outrageous (see some of Adam’s tips about Buzz Marketing)!

Even better, make “quality content” the first priority in your link building plan. Making your content appropriate (appropriate: suitable or fitting for a particular purpose, person, or occasion) is, by far, the most successful link building strategy on the planet. If done right, link building will work for you—no more mindless directory submissions. As a bonus, you’ll get traffic from relevant sources, which means higher conversion rates. Here are a few tips for making your content appropriate (this stuff isn’t breaking news…just common sense principles that many companies fail to apply):

  • Research and know your topic. This shouldn’t be difficult because you are already a guru of your industry.
  • Be informative and as detailed as possible. (Word of caution: Quantity is good as long as you know how to present it in a comfortable, readable manner. Use lists, headings, and varied amounts of italicizing and bolding. Remember, people like to learn, but they don’t like to read)
  • Link out to your sources. Google looks for links on your site as extra avenues to gather further information on your topic. Don’t be a dead end on the web.
  • Make lists and tables. People like to gather information fast. Tables make it easy to compare items. You could even provide an objective microsite comparing your product with your competitors’ products (this only works if you, honestly, have the best product/service on the market).
  • Tell stories and be narrative. This adds flavor to your content. Write with flamboyant, playful, or exciting language. Don’t always be so serious. People want to know there is a human being behind your content and your business.
  • Be original. Are you telling something people already know? If it’s already known, take a new twist or add your own opinion. (Don’t copy content from other sites! First, this is plagiarism and second, Google doesn’t like duplicate content).
  • Involve Users. Yes, you will need to jump on the Web 3.0 bandwagon. Provide means to comment, write reviews, vote on items, etc.
  • Check your spelling and grammar. Nothing is better at killing your credibility than poor writing. Hint: Get a professional editor to check your work.

Don’t Deceive Your Users

Last month I posted a brief guide of “The 5 Don’ts of SEO” listing suggestions of things not to do when designing a website. These guidelines include a few of the common “Don’ts” and suggestions from Google to help you keep your site compliant with search engines’ webmaster guidelines. I wanted to expand a little further and give a little more insight on each guideline. I’ll start with the first guideline: Don’t Deceive Your Users.

There are many ways search engines could view a web page as deceiving an end user. Probably the most common form of deceit on a web page is presenting different content to search engines than you display to human visitors. This is commonly referred to as cloaking. Doing this may cause your site to be perceived as deceptive and can result in removal from search engines.

A common form of cloaking is serving up a page of HTML text to search engines, while showing a page of images (or Flash or JavaScript) to your site visitors. You may be doing this on your site right now not intending to deceive your users or the search engines. If you are employing these cloaking tactics, or are designing a website rich with Flash or JavaScript, you should make sure that your end users are your main priority.

To prevent your site from being a deceiver, there are a few ways to correctly provide crawlable data for the search engines. These will also be helpful to your visitors who have screen readers or images turned off in their browser.

  • Provide alt text that describes images on your pages.
  • Provide the textual content of your JavaScript in a noscript tag.

Make sure the content you provide is the same text for both the JavaScript and the noscript tag. Having substantially different content in these different elements is viewed as extremely deceptive to the search engines and they may take action against your site.
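Here’s a minimal sketch of that approach, with identical text in the script and in the noscript tag (the copy is hypothetical):

<script type="text/javascript">
  document.write("Cheap PVC plumbing pipes shipped nationwide.");
</script>
<noscript>
  Cheap PVC plumbing pipes shipped nationwide.
</noscript>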

Keep your site visitors in mind as you build out your site. As I’ve mentioned before, a good rule of thumb is to think about what you are doing and who it is for. Ask yourself whether what you are doing helps your users, and whether you would do the same thing if the search engines didn’t exist.

Blackhat SEO and “Lie to Me”

You may have heard of a new T.V. show called “Lie to Me” that recently started airing on FOX. I have watched several episodes and I really enjoy it. If you have not had a chance to watch the show, here is a quick overview of what it is about. “Lie to Me” involves several specialists who are experts at detecting whether someone is lying. These investigators are called in to look into all types of situations where people are suspected of lying to cover something up. “Lie to Me” stars well-known actor Tim Roth.

The reason I mention this show in my blog post is to compare it to blackhat SEO. Blackhat SEO is the practice of webmasters or search marketers trying to deceive, or lie to, the search engines. But just like the experts in “Lie to Me,” the search engine algorithms have learned how to detect lying or spamming websites. The algorithms are constantly being updated to look for new patterns or signs that a website is a spam site and then remove it from the search results. They are so good at detecting spam now that many current blackhat techniques will not only get sites banned from the search engines, but some are even moving toward being outright illegal, as Matt Cutts noted at PubCon 2008.

Just like in the show “Lie to Me,” the lying or spamming website is going to get caught. So it is better not to try to deceive the search engines with blackhat techniques. Use legitimate optimization and internet marketing techniques that are approved by the search engines, and avoid the inevitable consequences that come from trying to manipulate search rankings through blackhat SEO.