Capitalization in Google SERPs Destination URLs

Posted in Organic SEO

I’m seeing something new on google.com today (not on google.co.uk): there seems to be selective capitalization within the destination URL fragment displayed in organic SERPs.

As you can see, the results look like:

Forums.seoChat.com
seoForums.org

Anyone else seeing this?

 

+ Interestingly, about two months ago they stopped capitalization within AdWords results. I’m not sure why the approach differs here.

Google Panda Update UK Analysis

Posted in Organic SEO

Many of you will know that I work in-house at a company that operates a pan-European ticket marketplace, Seatwave.

It’s a great industry to work in (lots of nights out, a constantly evolving marketplace, etc.) but one of the drawbacks we’ve struggled with historically has been content sourcing. While there are a million things that can be said about most artists (just look at Justin Bieber on Twitter as a case in point), the actual mechanism of buying your tickets isn’t the piece of the value chain that fans are interested in or indeed passionate about.

However, since the MayDay update the writing has been on the wall for all eCommerce sites, big or small – either get content, or get out.

That’s pretty much the same message in both the Panda and MayDay updates, so I’m not really sure why people are suddenly upset.

Data Capture

I track competitive SERPs for around 700 queries – these are primarily “artist name + tickets” searches; “Kylie Minogue Tickets” is a good example.  If you’re interested, I use Advanced Web Ranking by Caphyon for this reporting; I’ve found it to be the most dependable of all my rank trackers and I fully endorse it.

Big Losers

Let’s look at an example:

SoldOutEventTickets.com

Sold Out Event Tickets are the oldest of all the secondary ticket marketplaces in the UK (probably in Europe for that matter) and are part of TicketMaster, the monopolistic entertainment ticketing company with an overwhelming market share worldwide.

For years now they have maintained competitive rankings due purely to a few factors:

1) domain age
2) links from Ticketmaster (again, the authority in this vertical)

They have never, however, invested time or money into:

1) unique content (all of theirs is aggregated from a variety of sources, and whatever is unique sits so far below the fold that it’s irrelevant to the average visitor)
2) social interaction (they have none)
3) UGC (reviews etc.)

The screenshot below shows my Advanced Web Ranking data comparing yesterday to today:
(please note, this screenshot was taken a few hours into my AWR scrape; I will replace it with the full update tomorrow)

It’s important to point out that this isn’t the only site affected by this update; in reality the “lower order” of automated sites have all been hit to some degree, but as this example has consistently ranked well due to domain-level factors, the drop looks particularly bad.

Running the same analysis on the fresher generation of sites, they are up on average – not, in my opinion, due to any improvements on those properties, but because SoldOutEventTickets and other players with similar site structures have been penalised.
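For what it’s worth, the day-over-day comparison itself is simple arithmetic once the rank data is exported. Here’s a rough sketch, assuming a generic CSV export with keyword, domain and rank columns per day (the column names are my assumption – AWR’s actual export format differs):

```python
import csv
from collections import defaultdict

def load(path):
    """Load one day's scrape as {(keyword, domain): rank}."""
    with open(path) as fh:
        return {(row["keyword"], row["domain"]): int(row["rank"])
                for row in csv.DictReader(fh)}

yesterday = load("ranks_yesterday.csv")
today = load("ranks_today.csv")

movement = defaultdict(list)
for key, old_rank in yesterday.items():
    if key in today:
        # positive delta = moved up the SERPs, negative = dropped
        movement[key[1]].append(old_rank - today[key])

for domain, deltas in sorted(movement.items()):
    print(f"{domain:35} avg change {sum(deltas)/len(deltas):+.1f} "
          f"across {len(deltas)} keywords")
```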

Analysing Site Content

Take a look at these pages as a direct comparison:

Kylie Minogue
Sold Out Event Tickets | Seatwave

Take That
Sold Out Event Tickets | Seatwave

As you can see from the above examples, the affected site is missing “unique” content, UGC, and social signals.  This is nothing new; this is the SEO 101 stuff that Google has been preaching and reputable SEOs have been practising for years now.  This update is just weeding out sites that haven’t followed Google’s mantra in the 21st century.

Arguably the domains that have been booted in this update had few “page level indicators” of value; all of the strength that was keeping them ranked well came from domain-level metrics.

If that’s the case then this update is probably good for “the little guys” out there who have the mobility to create great content but don’t have a monolithic domain behind them – domains that have, perhaps unfairly, been allowed to outrank better content on smaller sites.

The Sum Up

If you don’t have a content-rich site, you’re not going to get any love out of Google any more.  Make sure your sites are updated regularly, fresh, and full of compelling content and data; utilise UGC, push social signals, etc.

IMAGE DISCOVERY CREDITS:
I asked on Twitter earlier if anyone had any “good” images of pandas to use. The first one was suggested by Simon Panting, a self-proclaimed Apple fanboy and purveyor of banter; the second was suggested by Bob Meijer, a good friend of mine and, coincidentally, a London-based Googler.

Lost in Translation: Google Algo Update 06-04-2011

Posted in Organic SEO

I started noticing a large-ish drop-off in Google organic traffic to seoforums.org last night.

This is something that I’ve been through a few times before so as soon as these changes occur it generally means a late night, lots of coffee and much concern.

Around 2am I finished my analysis and pretty much knew what was going on, but needed to wait until this morning to publish the findings just in case it was a matter of GA being slow to update (that doesn’t happen anywhere near as often as it did three or four years ago, but it was still a possibility).

Google Algo Update: Lost in Translation

Since September last year, I’ve been using a plugin here on seoforums.org which auto-translates all of the content on pageload to a new URL in the member’s selected language of choice.  It then saves that page into a cache database to increase performance for future pageloads (refreshing every so often depending on the page type).

So far this just sounds like pretty common Google Translate spam, right?

Wrong.
The plugin also detects posts made in foreign languages and translates them out to all 30+ other languages used here on seoforums.org – therefore (and lots of people do this) you can see someone posting in Greek talking to someone posting in Arabic, and someone else speaking in French.  Each one of those posts is seen in each user’s native language as defined when they signed up.

It’s a beautiful system. That is NOT Google auto-translate spam; in my opinion, that is a legitimate translation use of their API.
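To make the mechanics concrete, here’s a minimal sketch of the translate-and-cache flow described above – all names are hypothetical and the real plugin looks nothing like this, but the idea is the same: translate on first pageload, store the result, and serve from the cache until the TTL for that page type expires.

```python
import hashlib
import sqlite3
import time

# Refresh rates depend on page type, as described above (values illustrative).
CACHE_TTL = {"thread": 3600, "index": 600}

db = sqlite3.connect("translation_cache.db")
db.execute("""CREATE TABLE IF NOT EXISTS cache
              (key TEXT PRIMARY KEY, html TEXT, created REAL)""")

def translate(html, target_lang):
    """Placeholder - wire this up to the Google Translate API."""
    raise NotImplementedError

def translated_page(url, html, lang, page_type):
    key = hashlib.sha1(f"{url}|{lang}".encode()).hexdigest()
    row = db.execute("SELECT html, created FROM cache WHERE key=?",
                     (key,)).fetchone()
    if row and time.time() - row[1] < CACHE_TTL[page_type]:
        return row[0]                       # serve the cached translation
    result = translate(html, lang)          # translate on pageload...
    db.execute("REPLACE INTO cache VALUES (?,?,?)",
               (key, result, time.time()))  # ...and cache it for next time
    db.commit()
    return result
```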

The Setup:

When I started using the plugin, I registered each of the subfolder sites as specific websites in Google Webmaster Tools, as you can see from this screenshot.  I then went on to geo-target each of the subfolders to the relevant countries.  Link building to these subfolders has been entirely organic (I’ve never bought a link for this site, full stop).

I’ve also submitted a specific XML sitemap for every regional variation of seoforums.org, and each of the sites within GWT is correctly reporting the search queries, volumes, site speed etc., all as you would expect for any other site.
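As an aside, the per-language sitemap files are trivial to generate. Here’s a minimal sketch with illustrative language codes and paths (the real forum’s URL structure differs); each resulting file gets submitted in GWT under its own geo-targeted site entry:

```python
from xml.sax.saxutils import escape

BASE = "http://seoforums.org"
LANGS = ["es", "fr", "el", "ar"]              # a subset of the 30+ languages
PATHS = ["/", "/forum/google-seo/"]           # illustrative paths only

for lang in LANGS:
    entries = "\n".join(
        f"  <url><loc>{escape(f'{BASE}/{lang}{path}')}</loc></url>"
        for path in PATHS
    )
    xml = ('<?xml version="1.0" encoding="UTF-8"?>\n'
           '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
           + entries + "\n</urlset>\n")
    with open(f"sitemap-{lang}.xml", "w", encoding="utf-8") as fh:
        fh.write(xml)
```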

In short, I didn’t build this like a spam site – I built it properly.

The Results:

Until yesterday I was pleasantly surprised with the results.  After a slow-ish start it was building up a nice percentage of overall Google organic traffic.  Over the last week or so these landing pages have been receiving more or less a thousand additional visits per day, at a bounce rate no worse than the rest of the site (and much better than some pages in English – but that’s another story).

As you can see from this graph, there was a heck of a fall off sometime yesterday:

google update

Additionally, when searching google.co.uk with a site:seoforums.org command, you saw lots of “junk”, i.e. the homepages for each foreign language were listed in the top 50 results.

This wasn’t really great UX, but then how many people who aren’t SEOs actually search with the site: command? Furthermore, as I’d submitted each subfolder as a separate geo-targeted site, Google frankly shouldn’t have been displaying them in the .co.uk results anyway.

I must stress that I DO NOT BELIEVE that my use of the Google Translate API in this case is spammy.

It’s creating a legitimate forum in dozens of languages, and allowing people of diverse nationalities to discuss topics in their native languages.  Some of them (e.g. the Spanish one) are great translations. I know some are rubbish, but I’m relying on Google’s own API for quality.

My Conclusions:

It does appear that Google yesterday rolled out an algo update which has improved their detection of auto-translated pages and removed them from the index.

That will benefit most consumers by removing yet more of the junk that has been out there for some time, but in my case I do feel a little let down – there is no other way for me to translate a site that is 100% UGC and constantly updated.

 

 

Blackhat SEO is a Joke

Posted on by in Organic SEO

This post was inspired by Kris Roadruck’s post yesterday, titled “Whitehat SEO is a Joke”.

Background: Kris and I both spoke at the recent Distilled linkbuilding conferences and shared the same topic, “Lessons from the Darkside” – while I filled the slot in London, Kris filled the same slot in New Orleans.  Kris is a proud badge-wearing grey hat; I spent years working in poker SEO and know a heck of a lot about black hat tactics, but no longer practise them commercially.

You know, there was a time when I called myself a Black Hat, and was proud to be part of THAT community – using automated bulk tools, finding PHP vulnerabilities, injecting links, mass spam submission, building orphaned pages using poorly constructed search forms; the list goes on (and on).

I long preached that people in the white hat community were all unicorn-believing naive marketers, with their “build it and they will come” mentality to linkbuilding which was never going to compete on valuable search terms.

Things have changed, however, and it’s nothing to do with ethics, or the black/white hat debate (which I’ve found myself thrust into a bit recently).

The reason black hat sucks is that it’s not scalable in a commercial sense.

Black hat techniques are fine for “lower order” websites.  You know the kind: something put together in 20 minutes on WordPress using a $0.99-per-month hosting package, whose owner expects to rank for valuable commercial terms in 48 hours.

If that’s your situation, then black hat may well be the way to go.  You’re likely to get some traffic and traction if you do it well enough, but you’re always going to be in the situation where you start every morning by checking if today is the day you get booted out of Google’s index.  Trust me, I used to do that EVERY morning.  When it happened it sucked, and it did happen.

If however you have a web property that has some value to it, then true black hat strategies are not the way forward.  If you are working agency side, having to explain to your client just why their website has disappeared is NOT fun.  If (heaven forbid) you are client side, or it’s your own property that’s just been penalised, then you’re in the situation where you have to start everything from scratch, or get your house in order and submit a reconsideration request (which I still think is a sham).

Notice I’m not saying that black hat is a ‘bad thing’ to do:

because it isn’t.  Google guidelines are just that: GUIDELINES.  They are mainly there to serve two purposes: 1) making the web as easy as possible to crawl, and 2) protecting their PageRank algo, which until recently – with the advent of social scoring – was the only real mechanism they had to sort the signal from the noise.

The reason you shouldn’t do black hat is that you’re very unlikely to build a huge web presence or business out of it.  Even if you’re in the 0.0001% that does manage it, you’re always running the risk of losing it all, either to an algo update that nullifies your competitive advantage or, if you’re big enough, to a manual booting from the webspam team.

Black Hat = Risk Management

What it all boils down to, then, is risk management.  If you’re working on a project where you simply don’t care whether it lives or dies, then you may well engage in some of the nefarious tactics that (still) work.  If, on the other hand, you’re planning on building something of value, why would you risk it?

The same goes for client work: if you’re engaging in these strategies without your client’s full knowledge and consent, you are at best misguided and more likely professionally negligent as an SEO – and the world needs fewer of these so-called “SEO experts” sullying the name of our trade.

</rant over>

Not a black hat SEO.

Posted in Organic SEO

So it seems that my slide deck (and that of Russ Jones @virante) caused a bit of a stir in some circles.

Firstly – my slide deck was evenly balanced.  It starts off with why you shouldn’t do black hat, while noting that in the past some very well-known companies have, and went on to build huge brands.

I go on to say NOT to buy links on webmaster forums (like mine, ironically), NOT to buy mass directory submissions, NOT to buy link wheels, and so on.

Wil Reynolds summed it up nicely (and I quoted him several times during my session): “do you want to be the kind of SEO that wakes up every morning worried about whether today is the day you get a penalty?”

The answer, of course, is NO – you do NOT want to be that kind of SEO professional.


In his reply to this post, Sam Crocker quotes me several times as saying things like “don’t do this for clients”, “I would never do this any more, but here’s what I learned”, and “don’t do this, it will get you banned”.

The strategies I did give examples of include building your own affiliate network to keep control of your inbound links rather than sacrificing that equity to an affiliate network – something Google has never conclusively advised against – and building a mechanism to keep control of your links in widgets and embeds should you ever want to change them in future (which seems legitimate to me, if used correctly).

I would define neither of those strategies as black hat – in fact, immediately before my session I pulled Will Critchlow aside and we agreed that my talk, despite being called “Lessons from the Darkside”, was not in any way black hat; rather, it covered the kind of things you would normally only discuss over a drink, which was my brief after all.

Therefore – I would like to state for the record: I do NOT advise people to use “black hat” tactics to rank.

EPIC SEO Fail by readingfestival.com

Posted in Organic SEO

Today is a big day in the UK ticketing space.  It’s the day that Reading and Leeds festival tickets go on sale.

This normally prompts a huge storm of competitive PPC, social and SEO activity by all the big ticketing sites, as there are literally hundreds of thousands of searches for the term “reading festival”, and traditionally the website that takes the lion’s share is (quite naturally) the official site: readingfestival.com.

The demand is off the scale, with AdWords impressions in the hundreds of thousands – this Google Insights graph gives a good idea of the spike this week every year:

Reading Festival Ticket Demand

The graph below shows site visits as tracked by Alexa.com – the massive spike at the start is last year’s onsale, and the subsequent spike is the week of the festival itself:

ReadingFestival.com Traffic

Most of this traffic every year is fueled by Google organic – pure SEO traffic.  As they don’t have much margin they don’t buy paid search ads, nor do they need to… they OWN the brand.

This year, however, is a different story.  If you take a look at the SERPs in the UK right now, their site is nowhere to be seen on google.co.uk:

Reading Festival SERPs

How could they have made such a monumental cockup as to be de-indexed on the biggest day of their year?

Simple: massive internal duplicate content creation.

Typically their site goes down during the onsale and they replace it with a holding “coming soon” type page, explaining that they are experiencing huge demand.  Last year this page was put in place during the actual spike, and they 301’d the rest of their site back to this page.

This year, some enterprising soul at their web company has simply rendered EVERY page of their site as the (identical) holding page.  Not great for duplicate content, methinks.

Furthermore, they put the holding page up in lieu of the entire site about 12-18 hours ago.

Google has been spidering all of the pages in its index over the course of the day – in a much accelerated fashion thanks to the huge spike in volume – and gets the same page returned for every URL.

example 1: the Homepage…

reading festival homepage

example 2: any other URL under readingfestival.com = an HTTP 200 page with identical content.
reading festival duplicate content
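Mistakes like this are easy to catch before they bite. As a quick sanity check – the paths below are illustrative – you can fetch a handful of URLs and compare status codes and body hashes. A maintenance page should be answering 503 (ideally with a Retry-After header), not a 200 with identical content everywhere:

```python
import hashlib
import urllib.error
import urllib.request

# Illustrative paths - swap in real URLs from the site you're checking.
URLS = [
    "http://www.readingfestival.com/",
    "http://www.readingfestival.com/line-up/",
    "http://www.readingfestival.com/tickets/",
]

results = []
for url in URLS:
    try:
        with urllib.request.urlopen(url) as resp:
            status, body = resp.status, resp.read()
    except urllib.error.HTTPError as err:
        status, body = err.code, err.read()   # a 503 would land here
    results.append((url, status, hashlib.md5(body).hexdigest()))
    print(url, status, results[-1][2][:8])

if (all(status == 200 for _, status, _ in results)
        and len({digest for _, _, digest in results}) == 1):
    print("WARNING: every URL answered 200 with identical content")
```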

It’s things like this that scare me as an SEO.  Presumably a company like Festival Republic has an agency, or perhaps an in-house SEO.  The fact of the matter is that whoever handles this either did not check how it was going to work, or did not understand what serving a whole site of dupe content can do to your rankings.

I guess the moral of the story is: whenever dev gets involved, make sure you know what’s happening. Then THINK about it.

How-To Guide for Reaching U.S. and Latin American Hispanic Markets

Posted in Guest Posts, Organic SEO

If you are looking for niche markets that are largely untapped, look no further than the U.S. Hispanic and Latin American markets. To dominate in these markets you need to be well-versed in search engine optimization (SEO), especially if you’re trying to capture the attention of Hispanics. While the Latin American and U.S. Hispanic markets have some similarities as far as demographics go, they are still two very distinct markets and should be treated as such.

A combination of methods can be used – either local SEO or a more regional approach – but the main differences between English and Spanish SEO need to be considered for these markets. Below we will attempt to define each in order to help you develop a strategy that will work best for you.

1. Hispanics Who Reside in U.S. and Latin America and the Differences Between Them

Look at the Hispanics who reside in the United States versus those who live in Latin America and it’s easy to see the main differences between the two. Economy and culture are two of the main differences between the two locations, and the strategy for targeting the two demographics will depend on them. For example, Hispanics in the United States fall into two categories – immigrants and the offspring of immigrants – and have thus been somewhat Americanized, even while holding on to their own culture.

What does this mean? For the Hispanics being targeted who live in the United States, copy that promotes self-improvement or choices in education would be appropriate, while copy for a culture centered more around family would be appropriate for Hispanics living in Latin America.

2. The Difference Between Regional and Local Aspects of Reaching Hispanics Online

When considering the Hispanics that need to be reached online, it’s important to consider local aspects (specific cities and states), like Pittsburgh, Pennsylvania, and regional aspects (a general geographic area), like the Midwest.

By approaching SEO in this manner, you are able to pinpoint the best strategy for reaching the targeted Hispanics living both in the United States and Latin America – which allows greater control and ultimately higher success.

3. The Importance of Language in the United States and Latin America

Cultural and location differences have been discussed, but it is important not to overlook language barriers when considering the Hispanic demographic, both in the United States and Latin America – particularly when trying to reach this demographic through ads and web content.

The Geoscape 2010 Census Report states that there are more than 50 million Hispanic residents in the United States, of whom 60% are bilingual while 40% are dependent on either English or Spanish.

The following is an excerpt from a presentation by César M. Melgoza, founder and CEO, Geoscape, at Versailles Breakfast Club on Oct. 08, 2010: “Will the 2010 Census Results Change the Way Businesses Market to America?”

A combination of content in both languages might sound like a good idea when trying to reach this segment, and in many cases it is. But in many other cases it may not be necessary at all. It depends entirely on the specific Hispanics online you’re trying to target.

Generally speaking, if you are targeting an English-dominant group such as the younger Hispanic audience in the United States, you should target it mainly with English-based ads and website content, since they will be more likely to speak English and relatively less likely to speak Spanish. On the other hand, if you are targeting Spanish-dominant U.S. Hispanics or residents of Latin America, you should almost always use Spanish.

Between those two ends of the continuum are the bilinguals, whether English-preferred, English & Spanish, or Spanish-preferred. When trying to connect with these sub-segments, you may want to use a combination of English and Spanish content to capture the entire spectrum.

To sum up everything we’ve covered: the three areas discussed are the ones that every marketer worth their salt should consider when trying to target the Hispanic market – whether in the United States or in Latin America. How specific your marketing effort is will determine how much time you will need to understand the targeted audience and how much money should be allotted to these efforts.

So, what did you think of this post? Post a comment (good or bad) and if you liked it, please tweet or email about it!


ABOUT THE AUTHOR

Sebastian Aroca is an entrepreneur and a customer centric professional. He co-founded Hispanic Market Advisors, a company that offers Spanish SEO and English to Spanish translation services. Sebastian has over 10 years of professional experience managing regional customer programs and client acquisition strategies in the areas of sales and marketing communications, primarily for the U.S. Hispanic and Latin American & Caribbean markets.

Creating The Right SEO Strategy

Posted in Guest Posts, Organic SEO

Note from MOGmartin: this is a guest post by Jon Quinton (@jonquinton1 on twitter) of Go Search Marketing.

About Jon: I run my own small consulting company, Go Search Marketing, and work with a variety of clients, from local businesses through to niche ecommerce sites.

Creating The Right SEO Strategy

Coming up with an effective SEO strategy and ongoing plan is key to the success of your website. When I first delved into the world of SEO I was definitely guilty of just ‘jumping in’ and not spending too much time planning. This was down to experimenting with various techniques more than anything else, and it soon became apparent that strategising my approach was vital.

I thought it might make a useful and interesting blog post to share a few things that I do to create a plan specific to the website in question. I personally feel that there is definitely an element of gut feeling and intuition involved, but all of this comes based on data and research. There are also many more things I check than can be discussed in one single post, but here are a few things that I always make sure I cover.

What Does ‘Strategy’ Mean To Me?

Whenever I start an SEO strategy there are a few things that I keep asking myself: what are the needs and goals of the website in question, and what’s going on in the market?

To me, an SEO plan should answer the needs of the website in question and the market it relates to.

Keyword Research

Aim: Find a direction

With any project, this is always my first step. Until I conduct even the most basic amount of keyword research, I feel somewhat in the dark. I usually start off by simply entering a load of relevant keywords into Google’s keyword tool to try and get an overall idea of what’s going on. This is a really easy and effective way of comparing a large set of keywords and trying to suss out a direction to start going in. This information all gets exported into a spreadsheet for use later down the line.

After a while of doing this I’ll start to build up an idea of certain ‘hotspots’ and opportunities to look into further. I’ve recently started using SEOgadget’s keyword tool to get more of a ‘visualised’ overview, and also to start sorting the keywords into categories. Not only is this really useful for me to see, it’s also a great way to start organising your data into a presentable and understandable format for clients.
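To give a flavour of the categorisation step (the category rules and column names below are placeholders of my own, not anything the SEOgadget tool does internally), simple substring matching over an exported keyword list gets you surprisingly far:

```python
import csv
from collections import defaultdict

# Placeholder category rules - in practice these come from knowing the market.
CATEGORIES = {
    "tools": ["tool", "checker", "analyser"],
    "link building": ["link", "backlink"],
    "local": ["local", "maps"],
}

def categorise(keyword):
    for category, terms in CATEGORIES.items():
        if any(term in keyword.lower() for term in terms):
            return category
    return "uncategorised"

groups = defaultdict(list)
with open("keyword_export.csv") as fh:   # assumed export from the keyword tool
    for row in csv.DictReader(fh):       # assumed columns: keyword, search_volume
        groups[categorise(row["keyword"])].append(row)

for category, rows in groups.items():
    volume = sum(int(row["search_volume"]) for row in rows)
    print(f"{category}: {len(rows)} keywords, {volume} monthly searches")
```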

Open Site Explorer

To really get into the ‘nitty gritty’ I’ll use the SEOmoz keyword difficulty tool. This allows me to start judging how tricky it’s going to be to rank for certain keywords. I’m constantly comparing different keywords against each other in an effort to keep building an even stronger picture of the market, on which I can then base my recommendations.

Competitor Analysis

Aim: Get the ideas flowing

Looking at websites that are currently enjoying success in your market is a great way of getting some ideas going. I’m not saying copy other people’s work – more that you should look at what they are doing for a bit of inspiration. On another note, it’s vital to know what the competition’s doing in order to compete effectively, and keep up with them in the long run.

I usually look into how well optimised their sites are, how active they are in social media, where their links are coming from, and whether their links are natural or manually built. This can all be very useful stuff when you’re trying to prioritise your SEO tasks.

SEOmoz’s Linkscape tool is very useful for this, and I’ll normally make use of it by looking through competitors’ backlinks to try and spot opportunities or areas that I should probably be focusing on. Even without a particular goal in mind, just manually looking through these links will build a really good picture of what’s going on. It might be quite a laborious task, but in my mind it’s well worth the time.

Website Audit

Aim: Unearthing any potential problems

Every project is different, and every job will present its own unique challenges and potential issues. However, even if someone contacts me asking for help with link building only, I will always insist on a site audit, even if it’s at a basic level. There’s no point in going out and spending time getting great links into a website only to find nothing happens because it’s an absolute mess. I believe that would be a slight disservice to the client, I’m sure you’ll agree!

XENU link sleuth

(Xenu’s Link Sleuth is a great way of delving into a website)

Depending on what arises from looking into the website, the first step in any SEO plan of mine is to get the website into good shape. The aim of doing a site audit is to find out if there are any SEO issues, and also to find out what can be improved. From this I can then create an easy-to-follow task sheet with a prioritised list of recommendations.

Going forwards, you will also be able to effectively uncover areas for future development and build that into your long term plan.
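If you fancy a crude, single-page cousin of Xenu’s Link Sleuth, the sketch below fetches one page and reports any internal links that answer with an error status. The site URL is a placeholder, and a real audit tool also crawls recursively, throttles its requests and respects robots.txt:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
import urllib.error
import urllib.request

SITE = "http://www.example.com/"    # placeholder - the site under audit

class LinkCollector(HTMLParser):
    """Collect href targets from anchor tags on a single page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(SITE, value))

with urllib.request.urlopen(SITE) as resp:
    collector = LinkCollector()
    collector.feed(resp.read().decode("utf-8", errors="replace"))

for link in sorted(set(collector.links)):
    if urlparse(link).netloc != urlparse(SITE).netloc:
        continue                    # check internal links only
    try:
        with urllib.request.urlopen(link) as resp:
            status = resp.status
    except urllib.error.HTTPError as err:
        status = err.code
    if status >= 400:
        print(f"BROKEN ({status}): {link}")
```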

Budget/Time Constraints

Aim: Time to get realistic

Perhaps this should have been my first point, because there’s no point in putting together the most detailed and world-conquering SEO plan if the client can’t afford to action any of it. To provide real value to the client, any strategy has to be realistic. One of the first things I’ll ask a prospective client is how much they can afford to spend on SEO. With that in mind I can then create a plan that can actually be put into action.

If money’s tight then I might start to suggest that the client themselves take on various tasks, but the most important thing for me in this instance is to prioritise the most important tasks.

On the flip side to this, I can’t afford to work for free and I don’t want to end up doing way more work than I’m getting paid to do. This is where having a solid strategy not only helps the client and the overall success of the campaign, it should also serve to protect you and make it clear what can be expected from you.

I hope you’ve found this post useful, please leave some comments to discuss things further or feel free to get in touch with a tweet or two: @jonquinton1

The 4 Most Overlooked On-Page Factors for SEO

Posted in Guest Posts, Organic SEO

Note from MOGmartin, this is a guest post by Sam Page:

Sam Page is the in-house SEO Manager for Southwest Equipment in Lewisville, TX. He started working online in 2006 with his own website and has carefully developed his SEO skills over the last four years. He specializes in e-commerce and sales websites. Follow him on Facebook.

So, you think you have your website perfectly optimized? Perfect keyword density, great internal link structure, title tags & meta tags nailed, plus perfectly placed landing pages, huh? Those are great things to do, but like everything in SEO, you can refine your page and come even closer to the ideal search engine algorithm buster.

1. Google Caffeine introduced an emphasis on loading speed. With the internet needing to be faster and faster, the quicker your site can load, the better. There are several things you can do to speed up loading times, including fixing broken links, optimizing images, editing Flash or Java, streamlining your landing page layout, and checking that your host isn’t slowing you down (most common with shared hosting). Should this be your first priority? Probably not; Matt Cutts mentioned that this affects only about 1% of search queries. Still, speed up your site; Google did (there’s a rough timing sketch after this list).

2. Be careful with your bold text. While bold text has its place, I don’t recommend using it outside of a citation or heading. I come across pages all the time with keywords in bold and can’t help but think it looks like spam.  If it looks over-optimized to me, it will to Google; I am very sure the staff of spam fighters at Google have already addressed this issue. I still believe the best way to organize your site is by keeping it more formal and using the h1, h2, h3… tags.

3. Don’t forget about social networking or bookmarking. You want to integrate your entire social network as seamlessly as possible. This may not necessarily impact on-page optimization, but it will affect the professional appearance of your site. Just like having a blog on your site to keep your content fresh, I believe that in the next year or two social media will bring relevance and freshness to your site. Google is ranking with cues from Twitter and Facebook, and Bing and Yahoo are also incorporating Facebook ‘likes’ into their search algorithms.

4. Keep your content FRESH. Want to know how to beat some very important sites in the SERPs? Be one of the first to report a news story or update and you will likely be on your way to the top. Last night, I was listening to Joe Laratro speak about a site that had beaten major sites like The New York Times and ESPN because they were one of the first to post a story about the officiating of the NY Jets vs. NE Patriots game. He also mentioned that Google drastically moved them up because of how quickly they wrote the story. To Google, authority and timeliness are king. Sure, you can write a story, but it may not be picked up by Google very fast, right? Wrong. You can submit that link to Twitter, Facebook, and Google Buzz to get it indexed much quicker. Sometimes you’ll even get your tweets included in the Google SERPs.
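Back on point 1: if you want a rough baseline before deciding whether speed is worth the effort, even a crude timing loop is informative. This sketch (the URL is a placeholder) measures server response plus transfer time – only part of what a browser experiences, but a start:

```python
import time
import urllib.request

URL = "http://www.example.com/"   # placeholder - your landing page
SAMPLES = 5

timings = []
for _ in range(SAMPLES):
    start = time.perf_counter()
    with urllib.request.urlopen(URL) as resp:
        resp.read()               # pull the full body, not just the headers
    timings.append(time.perf_counter() - start)

print(f"min {min(timings):.2f}s / avg {sum(timings) / len(timings):.2f}s")
```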

Utilize these tools and cut your website from a different cloth. If you are in a highly competitive market, these tips are absolutely necessary to ranking well and converting traffic.

Review: SEOgadget Keyword Tool

Posted in Organic SEO

seogadget keyword tool review

The SEOgadget Keyword Tool (beta)

As a professional SEO I regularly get asked to review new products for search engine optimization.  I infrequently publish these reviews because the majority of the tools are fairly repetitive and add nothing over existing free or well-known popular tools (how many other SEO tools have similar functions to the rather excellent set available over at SEOmoz?).

This review, however, is a bit different.  Firstly, I know Richard Baxter, founder and CEO of SEOgadget, from having seen some of his excellent presentations at UK search conferences over the last three years or so.

Secondly, because unlike 99% of the products I get asked to review, this one actually has some absolutely unique features, and provides real actionable data that is useful to site owners and agency SEOs alike.

Anyway, on to the good part: what does this tool actually do?

Put simply, it’s a great keyword research tool that allows you to quickly and accurately determine which keywords your site is currently ranking for and receiving traffic on, but could improve on to bring in more traffic.

Furthermore, it also has some awesome keyword grouping functions – a powerful way to group content on your site more effectively, and ensure that you have the optimum on-site architecture to maximise your organic search traffic around these keyword silos.

seogadget keyword tool graph

click to enlarge

I have connected it to the Google Analytics (GA) account here on seoforums.org – here are some screenshots:

In this first one you can see a nice ranking vs. traffic graph of popular “tools”-related keywords that seoforums.org ranks for.

This type of interpretation is valuable to me when I’m writing the content in the seo tools section of this site, which I review every few months to try and find any changes in search behavior that I can take advantage of.

All very pretty, and it’s a nice visualization, but I want some actual numbers, please, that I can base my judgements on…

seogadget keyword tool data

click to enlarge

So the table on the left gives you the equivalent data in raw numbers: keyword, your Google rank, total visits you have received on that term (last 30 days), and the overall search volume for that keyword across Google.

This enables me, as a webmaster, to determine the best candidates for increasing this site’s search traffic – by concentrating both my content creation (or adaptation) and internal linking on pushing results that have high search volume and an average position of, say, 6th or worse into the top 5 results on Google.

I have also filtered this list to show keywords where I rank on the first three pages of Google, so I know I can move the rankings quite quickly to attract more traffic.
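That query is just a filter and a sort, so you can reproduce it against any export of the same data. A sketch with my own assumed column names and thresholds (not the tool’s actual export format):

```python
import csv

# Assumed columns: keyword, rank, visits, search_volume
with open("keyword_data.csv") as fh:
    rows = list(csv.DictReader(fh))

# Opportunity = decent search volume, ranking 6th-30th (pages 1-3, below top 5).
candidates = [
    row for row in rows
    if 6 <= int(row["rank"]) <= 30 and int(row["search_volume"]) >= 1000
]
candidates.sort(key=lambda row: int(row["search_volume"]), reverse=True)

for row in candidates[:20]:
    print(f'{row["keyword"]:30} rank {row["rank"]:>3}  '
          f'volume {row["search_volume"]}')
```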

All this data is really cool, but what do I do with it to get more traffic?

Well, in my example you can see that I have added some new links on the homepage of seoforums.org to the pages that were ranking on the second page of Google and have high search volumes.  These links are in the section on the right titled “handy links”.  They should gradually move up the search rankings over the next few weeks, resulting in more free traffic for the site.

The other strategy I have employed has been renaming some of the internal links to the tools pages, concentrating on higher-search-volume keywords in the anchor text.  Another step I took was renaming the AdWords forum page title and URL based on the data from the SEOgadget Keyword Tool.

These changes by themselves, which took me less than an hour from setting up my profile on the keyword tool to rolling out to the website, should result in another 8 or 10 thousand visitors per month to seoforums.org within a few months.  Not bad for an hour’s work!

My recommendation would be to check out this tool and spend some time playing with its features and analyzing your traffic – I’m pretty sure you will learn something new in the way it slices and dices your analytics data, and combines it with keyword volume from the (these days) pretty reliable AdWords API.