23
Oct 13

How can nofollow links hurt your site’s ranking?

The title of this blog post may sound contradictory, because nofollow links can’t hurt your SEO rankings directly; Google doesn’t count nofollow links as a ranking factor. But Google has recently started looking at nofollow links to identify culprits trying to take advantage of that leniency. Let me explain.

There are many SEOs and search marketers out there commenting or participating in forums and leaving their website’s link with their comments. These comments with links also help them get direct traffic from comment readers. I agree that these marketers’ intention is usually not to build external links but to generate traffic from valid thoughts shared as comments. The links also let comment readers find out whether the comment writer is an expert in the subject by visiting his or her website or blog. One thing to understand here is that Google doesn’t count these links for SEO rankings, since most of them are nofollow links.

But recently Google started looking at these links to understand commenters’ intentions and whether the comments are added for legitimate reasons. Some of the cases they look at are:

  • Do you post comments on every blog or forum out there? That clearly suggests the comments may not be legitimate.
  • Are your comments frequently treated as spam, i.e. do most blog owners mark them as spam as soon as they receive them?
  • Do your comments generally make no sense or add no value to anybody, like a typical spammy comment? 🙂

The behaviors above can lead your website to a manual penalty from Google. They are a clear signal that a website is trying to be a little too creative in generating direct traffic by spamming the internet. Matt Cutts from Google has recently confirmed that Google has the technology to identify these websites and put them on the wrong list.

To be on the safe side, here are some recommendations:

  • Only comment or participate in discussions with links if you are sharing genuine thoughts. Adding the same comment to hundreds or thousands of posts can get you into trouble.
  • If you have outsourced your SEO to a third-party company and they recommend this tactic, be aware that it can hurt your future SEO health.
  • Do keep adding thoughtful comments to generate direct traffic to your website; it’s a good way to create brand awareness and even build a personal brand.

If you still have questions, please add a comment and I will reply with answers (again, mind how you comment…).

 




21
Oct 11

Effective SEO for test websites

Enterprises often create internal but publicly accessible sub-domains for various reasons, mostly for testing. A company at site.com, for example, tends to create sub-domains like beta.site.com, test.site.com, demo.site.com or qa.site.com. SEOs should incorporate a few checks before the company launches these test or beta websites.

It is always recommended that you propose a few SEO requirements for beta websites so that they don’t affect your main site’s SEO. Here are some of the ways they can hurt your main website’s SEO:

  • Duplicate content
  • PageRank sharing
  • Excessive crawl activity
  • Incorrect indexation

Here are some tips for controlling web crawlers’ access to these internal websites and avoiding SEO issues.

Implement the correct blocking rules in robots.txt

I recommend adding the following rules to the sub-domain’s robots.txt file:

User-agent: *
Disallow: /
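
Once the file is live, you can sanity-check it with Python’s built-in robots.txt parser. Here is a minimal sketch, using the hypothetical beta.site.com sub-domain from above:

import urllib.robotparser

# point the parser at the sub-domain's robots.txt (hypothetical host)
rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://beta.site.com/robots.txt")
rp.read()

# with "Disallow: /" in place, every page should be blocked for every bot
print(rp.can_fetch("Googlebot", "http://beta.site.com/test-url/"))  # expect: False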

Implement Canonical tags

Make sure the pages under the sub-domain contain a canonical tag pointing to the primary domain. For example, the canonical URL for the page www.beta.site.com/test-url/ is www.site.com/test-url/. This way, even if pages inside the beta site are crawled, the search engines will index the correct URL.
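
In HTML, the tag sits inside the <head> of the beta page and would look like this (using the example URLs above):

<link rel="canonical" href="http://www.site.com/test-url/" />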

Add noindex, nofollow to internal pages

If the internal website has pages that are not present on the main website, add the “robots” meta tag to block the search engines so those pages don’t get indexed. The tag is <meta name="robots" content="noindex, nofollow"> and goes before the </head> tag.

Register the sub-domain in Google Webmaster Tools (GWT)

If you register the sub-domain in GWT, you can see whether any crawling activity is happening inside the site.




05
Oct 11

Google Analytics adds SEO reporting

Google has launched SEO (Search Engine Optimization) reports in Google Analytics, available once you integrate it with your Google Webmaster Tools account. This is great news for SEOs around the world. I would guess this is Google’s first attempt to give direct reports to SEO professionals who are not strong at web analytics (though I always recommend that a good SEO be a good web analyst too).


So let’s look at the reports Google launched recently. Three kinds of reports are available to everybody starting today:

  1. Query reports
  2. Landing pages reports
  3. Geographical summary

The query report lists the top thousand user queries along with the number of impressions, clicks, the average position and the click-through rate (CTR). The impressions, clicks and average position come directly from GWT and are joined with the Google Analytics CTR data. I would love to have this data when building custom reports in Analytics.

The landing pages report shows similar data but lists the top pages on your website. It helps you learn which pages on your site perform best and are most sought after, along with their click-through information. With effective use of the built-in filters you can identify pages with low CTRs or few clicks despite high impressions or good average positions, apply more SEO work to them and improve results. These are great details for making good SEO decisions.
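
If you prefer slicing this data offline, here is a minimal sketch, assuming a hypothetical CSV export of the landing pages report with columns page, impressions and clicks:

import csv

# flag pages that are shown often but rarely clicked
# (the thresholds are illustrative; tune them to your traffic)
with open("landing_pages.csv") as f:
    for row in csv.DictReader(f):
        impressions = int(row["impressions"])
        clicks = int(row["clicks"])
        ctr = clicks / impressions if impressions else 0.0
        if impressions > 1000 and ctr < 0.01:
            print(row["page"], impressions, f"{ctr:.2%}")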

The geographical summary reports the top countries where you are doing well. If your business operates in multiple countries, this report comes in handy for analyzing your performance in each of them. It also has a cool feature named “Google Property” which breaks performance down by Google property, such as web, image and mobile search (interesting data now that mobile traffic is so important).

If you have not yet discovered how to access these reports, please visit this “how to enable the feature” blog post.




28
Sep 11

How are site speed and SEO related?


There have already been lots of hints from Google employees who guide SEOs, like Amit Singhal and Matt Cutts, that Google is indeed incorporating site speed into search rankings. Even the Google Webmaster team blogged about it recently. I would like to add to that and describe in more detail how site speed relates to SEO and how it can improve your rankings, user experience and more.
Here is some of the evidence that site speed is important to Google:

  • Google Webmaster Tools (GWT) reports the average site speed for your website; log in to GWT > Labs > Site performance (more details here). This clearly shows that Google wants you to know your website’s average page load speed as seen by Google’s crawlers.
  • Google’s recent Panda update mentions page speed. The Panda updates point to conversion rates and time on site, and I would stress that both are closely related to site speed. I will add more details to support that.
  • Google’s efforts to make the web faster through Google Code – Page Speed.
  • Google’s recent stride into content delivery – Google Page Speed Service.

From the links above it’s very clear that Google considers site speed very important and one of the factors in deciding search rankings. On top of higher rankings, there are other benefits to improving your site speed:

  • Site speed brings more conversions (if you sell something, or even if you are growing your user base through subscriptions).
  • Users visit more pages if you have a faster website.
  • Users spend more time on your site.
  • Users are more likely to come back, because they had a good experience the last time they were on your site.


You can find the complete AOL optimization report here.
All the points mentioned above help Google decide that your website is high quality, and they will help it avoid being affected by future Panda updates. Please check the following link, which clearly explains some of the questions the Panda update tries to answer – Amit Singhal’s update.
Please check back for future blog posts that talk more about site speed and SEO, how you can improve it, and the various tools for checking site speed.




14
Jul 11

Top 10 SEO blogs any SEO expert should follow

There are a lot of SEO blogs and resource sites that share the latest news in search technology, and it’s always important to stay up to date with what’s happening around search. As the technology moves and changes so fast, it’s recommended that SEO experts follow the top SEO blogs to keep up with the latest SEO trends and advice.

Here are my top 10 favorite SEO blogs that I follow and share:

1. Google Webmaster Tools blog
2. Bing Webmaster Tools blog
3. SEOmoz Blog (don’t miss their weekly SEO discussion called Whiteboard Friday)
4. Search Engine Land
5. Matt Cutts’ blog
6. Search Engine Journal
7. Search Engine Roundtable
8. Search Engine Guide
9. SEO Book blog
10. Search Engine Watch

I recommend creating a category called SEO blogs in your Google Reader and adding them all. These blogs are really important for keeping you up to date with search engine news.




30
Jun 11

Top enterprise SEO tools for businesses


Here is a set of SEO tools that I use the most and recommend.

Google Webmaster Tools (GWT)
Google Webmaster Tools is a very useful enterprise SEO tool and gives the enterprise SEO expert a lot of insight into the current SEO details of their website. I recommend that enterprise SEO start with adding your website to Google Webmaster Tools. I will be writing a separate post explaining the advantages of having GWT.

Majestic SEO
Majestic SEO is a great tool for understanding the diversity of the external links coming to your website, or to a competitor’s website (competitive analysis). The tool helps you evaluate how an SEO campaign (building new external links) performed during a marketing campaign over a certain period of time. They have a pretty good list of features on their site and are working on adding more.
Get more details here.


SEOmoz Tools
SEOmoz tools provide best-in-class data for enterprise SEO professionals to make wise decisions about their current organic search results. I recommend this SEO tool if your company can afford a PRO account, or you can start with a trial account via this link.

HTTPFox
HTTPFox is a Firefox add-on and a very useful plug-in for analyzing how your web servers respond to requests for your web pages. It comes in handy when you are moving pages to new URLs and need to confirm that they return valid 301 redirect status codes. You can get the add-on here.
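
If you want to script the same check instead of watching it in HTTPFox, here is a minimal sketch using Python’s standard library (the host and paths are hypothetical):

import http.client

# request the old URL without following redirects
conn = http.client.HTTPConnection("www.example.com")
conn.request("HEAD", "/old-page/")
resp = conn.getresponse()

# a healthy permanent redirect shows status 301 plus the new location
print(resp.status, resp.getheader("Location"))
conn.close()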

Link Sleuth
This SEO tool looks a little hacky, but it is one of the best tools for finding broken links on your site. Be careful when running it against live sites, as it can put a heavy load on your server. I recommend running the tool at off-peak times (when you have the fewest visitors). You can get the tool here.
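
To give a feel for what such a tool does, here is a minimal single-page sketch in Python (Link Sleuth crawls your whole site; this only checks the links found on one hypothetical page):

import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkCollector(HTMLParser):
    # collects the href value of every anchor tag on the page
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

page = "http://www.example.com/"
collector = LinkCollector()
with urllib.request.urlopen(page) as resp:
    collector.feed(resp.read().decode("utf-8", errors="replace"))

# re-request each link and report its status; 4xx/5xx raise an error
for link in collector.links:
    target = urljoin(page, link)
    if not target.startswith("http"):
        continue  # skip mailto:, javascript:, etc.
    try:
        with urllib.request.urlopen(target) as r:
            print(r.status, target)
    except Exception as err:
        print("BROKEN:", target, err)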

UserAgent Switcher
I have written a detailed blog post on this tool, which helps you analyze a webpage’s accessibility as a search engine crawler. The tool helps you perform SEO testing on web pages that have been modified or newly built. UI developers use all sorts of the latest techniques on web pages (most of the time to make them cross-browser compatible), and sometimes these code snippets prevent crawlers from accessing the pages. Please see my previous blog post on accessing web pages as crawlers.

Page load speed testing tool
As you know, Google loves web pages that load fast and are light. Pages with large images (in file size), tons of JavaScript calls (external and internal) and so on make the crawling experience bad and indirectly hurt the page’s performance in search engines. I mostly recommend webpagetest.org, where you can analyze how fast your pages load and get a detailed report on which page elements took the longest to load.
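
webpagetest.org gives you the full waterfall, but for a quick first number you can time a single request yourself. A minimal sketch (the URL is hypothetical, and this measures only the raw HTML response, not images or scripts):

import time
import urllib.request

start = time.monotonic()
with urllib.request.urlopen("http://www.example.com/") as resp:
    body = resp.read()

# rough server + HTML transfer time; a full-render tool reports much more
print(f"{len(body)} bytes in {time.monotonic() - start:.2f}s")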




27
Jun 11

How to view a webpage like a search engine crawler

I have come across many situations where we SEO professionals need to test a website (mostly when launching new pages) and need to access the web pages the same way a search engine crawler does. Many times software engineers assume that crawlers see the same webpage you see in a browser window. My answer to that is ‘not always’.
Modern web applications are complex, and web developers can introduce logic that blocks crawlers (most of the time via scripts that check user agents to deal with cross-browser compatibility). Load balancers and servers can also block search engine crawlers accidentally (most often when you intend to block certain user agents, such as scraping sites or search engines you don’t want indexing you, like Baidu).
The best way to test your website for search engine accessibility is to browse it with a search engine user agent in Mozilla Firefox. Here is the step-by-step procedure:
1. Get the user agent switcher for Firefox
I prefer the add-on “User Agent Switcher”, which can be downloaded from here. If you access that link from Firefox itself, it’s a one-click install.


2. Change the user agent in your browser
After you install the add-on, go to Firefox’s options and you will find User Agent Switcher in the list of add-ons. It comes with a preset Googlebot user agent that you can select. If you don’t want to use the preset (presets can become outdated when Google changes its user agents), I recommend editing it or adding your own user agents.

For example, the current user agents for the major search engines are:
Google :
Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
Bing :
msnbot-webmaster/1.0 (+http://search.msn.com/msnbot.htm)
Yahoo:
Mozilla/5.0 (Yahoo-MMCrawler/4.0; mailto:vertical-crawl-support@yahoo-inc.com)
Change the description and user agent entry as shown in the image below.

3. Start visiting web pages without closing the browser

With this setting active, any page you visit is served to you the way a crawler sees it when analyzing your page content. You can verify that your pages load completely and that all the important content sections look good.
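
If you’d rather script this check than click through the browser, the same idea works from Python. A minimal sketch using the Googlebot user agent listed above (the URL is hypothetical):

import urllib.request

ua = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
req = urllib.request.Request("http://www.example.com/", headers={"User-Agent": ua})

# if the server treats crawlers differently, the status code or body size
# will differ from what you see in a normal browser session
with urllib.request.urlopen(req) as resp:
    print(resp.status, len(resp.read()))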




24
Jun 11

Enterprise SEO definition

Search Engine Optimization (SEO) is a very broad and complex topic, and of course it differs a lot depending on your requirements and the type of content being promoted on the web. Optimizing a blog is different from optimizing a 10-page static website, or a Content Management System (CMS). Recently the SEO industry has defined a new category of SEO, more advanced than normal SEO, called enterprise SEO. Let me define what enterprise SEO is:

Enterprise SEO can be defined as the process of implementing and maintaining a set of SEO-related projects, and of evangelizing an SEO-focused culture, inside large web-related enterprises. So now we have to define ‘large web-related enterprise’ more precisely. Large enterprises are the ones that share most of the attributes below:

  • A large number of public landing pages (more than 5,000), the majority of them dynamically generated rather than static HTML.
  • Multiple hosting servers, with traffic controlled by load balancers.
  • Regular updates to existing landing pages, both content and structural changes.
  • Frequent addition of new landing pages.
  • Dynamic user-generated content: reviews, ratings and comments.
  • International presence.
  • A good number of competing websites in their industry.
  • Search engine marketing (like Google AdWords) as one of their key marketing campaigns.
  • A strategy to generate high ROI (return on investment).

Here are some enterprises that depend heavily on SEO managers to perform enterprise SEO:

  • Amazon
  • eBay
  • Overstock
  • Top e-commerce websites
  • Top content sites like Yelp, Zagat, etc.
  • Top product companies like Apple, Adobe, etc.

Please check back for my future posts, which will focus more on topics like top enterprise SEO tools and the role of an enterprise SEO manager.




17
Jan 11

My predictions for SEO in 2011

Here are some of my predictions for SEO in 2011.

  • Google SEO is first, and Bing SEO is now second

In 2010 the SEO market was clearly focused on Google SEO, the main reason being that Bing’s search market share was a mere 11%. That trend has changed this year; it was recently reported to be somewhere around 25-30%. So I truly believe that SEO engineers will now have to deal with the two search engines equally for their clients and follow each engine’s technology updates closely.

  • Anti-social will fail your SEO

We have seen a lot of SEO engineers who are not serious about social media and push that work to the marketing team. We have also seen corporate leaders who feel social media and SEO are unrelated, two entirely different sectors. All of that is going to change in 2011, as link building and link bait can be improved through social media. Social media marketing is going to become part of corporate SEO strategies. The best supporting evidence is Bing incorporating Facebook likes into its search results, while Google is still trying hard to work out how to incorporate social search into its main result pages.

  • Click-through rate (CTR) and the ranking algorithm will mingle more

Google is pretty serious about using click-through rates (CTRs) and improving its search algorithm to incorporate decisions based on this ‘data about data’. This usage-based data will be worked into the ranking algorithms this year to remove spam and showcase the content people actually read.

  • Link exchange is going to die in 2011

2004-2005 saw the rise of link exchanges, but the trend has since declined drastically. We will see SEO experts putting no more time into link exchange, concentrating instead on bringing the latest SEO strategies into play.

  • SEO experts will need a lot of skill sets

SEO experts will need to be experts in a lot of areas, and the list is going to keep growing: web analytics, social media, improving site accessibility, improving links and traffic, competing in vertical results (YouTube, Google Images, Google News, Google Blog Search) and so on. And of course many new items will be added to this list.

I think we will see tons of new predictions, and soon we can consolidate them into one list.
