
Friday, June 10, 2011

7 SEO Tips for your Django Site

Forget content. When it comes to the Internet, Google is king. If you please the king, you can do well. If you cross the king, your disloyal ass will be thrown into the tower to watch your site wither and die, unless the king deigns to set you free.

 


Some of the things that most please the king are “thick” (i.e. not “thin”) content pages, well structured site navigation, and honest links to and from your site.

In building my new site, Wantbox, I’ve both pleased and inadvertently crossed the king. The problem with this monarch: he doesn’t speak to his subjects, so if you end up in the tower, you’re never quite sure why.




Your best strategy is to know the rules of the kingdom and do your best to follow them. Below are some of the best-practice, white-hat SEO techniques I have used to make sure the king stays happy.

1) Automatically NOINDEX “Thin” Pages, Part 1

As I write this post, I’m mentally air-quoting the word “thin” because nobody knows for sure how “thin” is defined. I’ve arbitrarily decided that a page should have a minimum of 400-600 words of good, original content, not counting navigation and other necessary on-page fluff. Conversely, I red-flag pages with fewer than 250 words of core, original content.
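As a rough sketch, a check like that could even be automated in a Django template; the want.description field name here is an assumption, and wordcount is Django’s built-in template filter:

{% if want.description|wordcount < 250 %}
{# Hypothetical check: flag the page as "thin" when the core content is short #}
<meta name="ROBOTS" content="NOINDEX, FOLLOW" />
{% endif %}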

For Wantbox I have implemented the useful django-tagging application which has, in the short term, produced some automated “thin” content. As more “wants” get added to Wantbox, more tags get created, and more tag listing pages are dynamically produced (side note: a want is the equivalent of a user’s posted question, but is specific to what that poster wants to buy or find). This is an example of a legitimate, but currently “thin” tag listing page on Wantbox: Wants Tagged with “ergonomic”.

 



Now, as the Wantbox user base increases, and submitted wants grow, this listing page could be a useful resource for visitors. In general, I don’t want to remove pages like this. Instead, in the Django template for these want listing pages, I have added the following code in the <head> section of the page:

{% if object_list.count < 5 %}
<meta name="ROBOTS" content="NOINDEX, FOLLOW" />
{% endif %}


I use this same template to display all want lists on Wantbox: wants by state, wants by tag, wants by submit date, wants by user, etc. With the above template addition, if any of these pages has fewer than 5 wants listed, the page is deemed “thin” and I direct any crawler not to add the page to its index, but to still follow (and allow PageRank to flow from) the links on the page. To be clear, when I say “any crawler” I mean the king’s Googlebot.
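For context, here is a minimal sketch of the kind of view that could feed such a list template; the Want model, the state field, and the URL keyword argument are assumptions rather than Wantbox’s actual code. Django’s generic ListView hands its queryset to the template as object_list by default, which is why the single template check above covers every list page:

from django.views.generic import ListView

from wantbox.models import Want  # assumed app and model names

class WantsByStateView(ListView):
    # All of the want list pages share this one template, so the
    # NOINDEX check in its <head> applies to every list view.
    template_name = "wants/want_list.html"

    def get_queryset(self):
        # "state" is captured by the URL pattern (assumed field name)
        return Want.objects.filter(state=self.kwargs["state"])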

2) Automatically NOINDEX “Thin” Pages, Part 2

Another problem for Wantbox, or any Q&A-type site, is that when a user posts a want it remains “thin” until it gets a reply or two. This unanswered Valentine’s Day post is a “thin” want, but this sump pump one is not. To prevent new, unanswered wants from piling up and harming the site, I add the same NOINDEX, FOLLOW meta tag shown above to each want until it gets a reply.
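A minimal template sketch of that check, assuming the Want model exposes its replies through a related manager named replies:

{% if want.replies.count == 0 %}
{# No replies yet: keep the want out of the index, but still let link equity flow #}
<meta name="ROBOTS" content="NOINDEX, FOLLOW" />
{% endif %}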

3) Use 301 Redirects when Page URLs Change

For Wantbox, I use a slug to construct the URL of each want page. For example, this want for a rooftop cargo box looks like:

http://wantbox.com/I-want-a-rooftop-cargo-box-that-fits-a-Lexus-RX400h_Boston-MA-02111.html

I believe this is good for Wantbox visitors since the resulting URL is clean, readable (although a bit long) and very pertinent to the page. However, since the slug is composed of the want title and city, it can change if the want is edited.

To deal with this, after any want is edited, if the title or city changes, I add an entry to a Want301 table with the old value and the new. If a visitor encounters a 404 error when attempting to view a want, I check the Want301 table to see if there is a recorded change. If one is found, I redirect the user to the new page using:

django.http.HttpResponsePermanentRedirect

This sends the HTTP 301 Moved Permanently redirect. If a search crawler encounters this, it should update its index to account for the change.
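Condensed into a single sketch, the lookup might look like this; the Want301 fields, the Want import, and the .html URL scheme are assumptions rather than Wantbox’s actual code:

from django.db import models
from django.http import Http404, HttpResponsePermanentRedirect
from django.shortcuts import render_to_response

from wantbox.models import Want  # assumed app and model names

class Want301(models.Model):
    # Maps a retired slug to the slug it was renamed to
    old_slug = models.CharField(max_length=255, unique=True)
    new_slug = models.CharField(max_length=255)

def want_detail(request, slug):
    try:
        want = Want.objects.get(slug=slug)
    except Want.DoesNotExist:
        # The want may have been renamed after an edit; consult the
        # 301 table before giving up with a 404.
        try:
            moved = Want301.objects.get(old_slug=slug)
        except Want301.DoesNotExist:
            raise Http404
        return HttpResponsePermanentRedirect("/%s.html" % moved.new_slug)
    return render_to_response("wants/want_detail.html", {"want": want})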

4) Use the Canonical Meta Tag

When new wants are added to Wantbox, I create a shortcode URL for the want and tweet it out.

These shortened URLs end up getting archived and re-posted by random Twitter scrapers, even though they are exact duplicates of the full slug-based URLs I have constructed. Both the shortened and slug-based URLs end up rendering the same Django template.



 

For example, the real URL for this roof replacement want is:

http://wantbox.com/I-want-to-find-local-roofer-contractors-to-replace-a-roof_Burbank-CA-91503.html

The king despises duplicate content. In order to indicate to his majesty which URL reigns supreme, I add the canonical meta tag to the top of my “want detail” template:

<link rel="canonical" href="http://wantbox.com/I-want-to-find-local-roofer-contractors-to-replace-a-roof_Burbank-CA-91503.html" />

This lets the search engines know the preferred, or canonical, location of this page.
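Rather than hard-coding the URL, the same tag can be built from the object itself. Here is a small sketch that assumes the Want model defines get_absolute_url() returning the slug-based path:

{# Sketch: derive the canonical URL from the want itself, so edits stay in sync #}
<link rel="canonical" href="http://wantbox.com{{ want.get_absolute_url }}" />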

5) XML Sitemap

If you have a dynamic site, you should have a dynamic sitemap that updates as new content is added and specifies the most important pages for your site. Django has an entire framework for dealing with XML sitemaps, but I found it just as easy to create a simple sitemaps.xml template and populate it with the specifics for Wantbox.

The Wantbox sitemap is generated automatically on each request and pulls from the database all the current content that I want crawled. It also updates the <lastmod> value to tell the crawlers when a page has changed, and lets me assign my own page priorities with the <priority> tag.
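As a rough sketch, a hand-rolled sitemap template might look like the following; the wants context variable, the modified timestamp field, and get_absolute_url are all assumptions:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
{% for want in wants %}
  <url>
    <loc>http://wantbox.com{{ want.get_absolute_url }}</loc>
    {# lastmod tells crawlers when the page last changed #}
    <lastmod>{{ want.modified|date:"Y-m-d" }}</lastmod>
    <priority>0.8</priority>
  </url>
{% endfor %}
</urlset>

A plain Django view can render this template with an XML content type and pass in the queryset of pages you want crawled.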

Once you have a sitemap, go to your Google Webmasters account (you do have one, right?) and submit it to Google.

6) Useful Chrome SEO Extensions

These days, I mostly use Chrome as my web browser, although I do hate that it doesn’t have a master password. Two good SEO extensions I use:

  • SEO Site Tools

  • Chrome SEO


I particularly like that both of these tools visually highlight the “nofollow” links on the page that you are viewing.

7) Clicky Analytics

When monitoring SEO traffic to your site, I’ve found no better analytics package than Clicky. Specifically, it does three things that I really love:

  1. Shows your current traffic and compares it to where you were at the same time of day yesterday, 7 days ago, or versus a 7-day moving average. This is a great way to see how you are trending throughout the day.

  2. Splits out traffic sources by direct, search, media searches, links and social media and shows where you are now for each traffic source category versus where you were yesterday, 7 days ago, or versus a 7-day moving average.

  3. Lists all the outbound links that were clicked on (which is particularly helpful for monitoring your generated affiliate traffic).


I’ve also found that the dashboard page has everything on it that I want: traffic sources, outbound clicks, top viewed pages, recent visitors (with country, IP address and referrer), recent search terms and an hourly visitor graph overlaid on top of yesterday’s or last week’s visitor graph.

It’s awesome and I actually paid to upgrade to a real subscription. Check it out.

Closing Thoughts

I hope this post helps at least a couple of you be more loyal subjects. Follow the king’s rules and you should be OK.

I am very interested to hear from my fellow peasants how you please the king. Please add comments below with your favorite SEO best practices.

One last thing: the king loves links, likes and retweets, so if I have helped you out, please show your gratitude by liking, linking or retweeting this post or, even better, doing so on Wantbox if you like anything you see over there.

 
