17 Sep 14

Coca-Cola’s latest ad is truly heart-touching!

Coke launched this new ad (“The Happiest Thank You”) on September 11, 2014, and it has drawn more than 1.3 million views in just a few days. The three-minute commercial is definitely going viral, despite many raising the question of whether the ad was staged (like the one Adweek suggested here).

Even if it was staged, Coke’s marketing really touches on humanity and human emotions in this ad. Please enjoy the ad!

Client: Coca-Cola
Agency: McCann Worldgroup
Director: Paolo Villaluna




17 Nov 13

Top 3 key disciplines for CRO

There are certain disciplines you should encourage within your conversion team to drive continuous growth in conversion rate improvements. If you make sure that you and your team strictly follow these practices, they can bring significant gains to your conversion rate optimization efforts. In my opinion, there are three key disciplines you should strictly follow to achieve continuous success:

  1. Experimentation
  2. Data gathering
  3. User testing

Experimentation should begin with setting goals for your overall end result – answering questions like: what are the macro and micro goals for the CRO team? There are many steps involved in experimentation, from setting goals to tracking the results of your experiments so you can repeat your successes and build on top of them. I will write a separate, detailed blog post on the key steps in experimentation and how to find success in this focus area.

Data gathering in conversion optimization is very important to your team’s success, simply because data helps you learn the next steps and make wise decisions. Find easy ways to identify what data to gather and the best tools to gather it. For example, say you are testing a widget with two arrows that let users scroll left and right through a list. If you set up events in Google Analytics to track these user actions, the data can give you meaningful details on user behavior on this widget. If the data shows that only a small percentage of users use the left arrow while the majority use the right arrow, that’s a clear indication to remove the left arrow and give users just the right-arrow functionality (see the sketch below).
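
As a rough illustration, here is how you might wire up those arrow events with the classic Google Analytics (ga.js) snippet – a minimal sketch where the element IDs are made up for the example and the standard _gaq queue from the GA snippet is assumed to already be on the page:

<script>
// Hypothetical IDs – adjust to your widget's actual markup.
// Run this after the widget markup so the elements exist.
var leftArrow = document.getElementById('widget-arrow-left');
var rightArrow = document.getElementById('widget-arrow-right');

leftArrow.addEventListener('click', function () {
  // Classic GA event: category, action, label
  _gaq.push(['_trackEvent', 'list-widget', 'scroll', 'left-arrow']);
});
rightArrow.addEventListener('click', function () {
  _gaq.push(['_trackEvent', 'list-widget', 'scroll', 'right-arrow']);
});
</script>

Once a few days of data comes in, the Events report will show you the left/right split directly.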

User testing covers any tests focused on users’ interaction with your webpage. These can be tests focused on making it easy for users to reach desirable outcomes, like filling in a mandatory text box, or tests to reduce errors on a page. Take a login page, for example: if you know that a lot of users never reach the logged-in page, perform user testing on the login form to identify the erroneous user behavior and make it easier for users to avoid those known or common points of confusion.

Happy Testing and optimizing :)

 




23 Oct 13

How can nofollow links hurt your site’s ranking?

The title of this blog post sounds contradictory, since nofollow links can’t hurt your SEO rankings – Google doesn’t count nofollow links as a ranking factor. But recently Google has been looking at nofollow links to identify a few culprits trying to take advantage of that leniency. Let me explain.

There are many SEOs and search marketers out there commenting or participating in forums and leaving their website’s link with their comment. These comments with links also bring them direct traffic from comment readers. I agree that these marketers’ intention is usually not to build external links but to generate traffic from valid thoughts shared as comments. The links also let comment readers find out whether the comment writer is an expert in the subject by looking at his or her website or blog. One thing to understand here is that Google definitely doesn’t count these links for SEO rankings, as most of them are nofollow links (see the example below).
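
For context, here is what a typical comment link looks like once the blog platform adds the nofollow attribute (the URL and anchor text are placeholders):

<a href="http://www.example.com/" rel="nofollow">My SEO blog</a>

The rel="nofollow" part is exactly what tells Google not to pass ranking credit through the link.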

But recently Google started looking at these links to understand commenters’ intentions and whether the comments are added for legitimate reasons. Some of the signals they look at are:

  • Do you post comments to every blog/forum you can find? That clearly suggests your comments may not be legitimate.
  • Are your comments routinely treated as spam – do most blog owners mark them as spam as soon as they receive them?
  • Do your comments generally not make sense or offer value to anybody – in other words, do they read like spam? :)

These signals can lead your website to a manual penalty from Google. They are a clear sign to Google that a website is trying to be a little too creative in generating direct traffic by spamming the internet. Matt Cutts from Google has recently confirmed that Google has the technology to identify these websites and put them on the wrong list.

To be on the safe side, here are some recommendations:

  • Only comment or participate with links if you are sharing genuine thoughts. Adding the same comment to hundreds or thousands of posts can get you into trouble.
  • If you have outsourced your SEO to a third-party company and they recommend this tactic, be aware that it can hurt your future SEO health.
  • Do keep adding thoughtful comments to generate direct traffic to your website – it’s a good way to create brand awareness or even personal branding.

If you still have questions, please add your comment and I will reply with answers. (Again, ‘mind’ how you comment…)

 




21 Oct 11

Effective SEO for test websites

Enterprises often create internal, publicly accessible sub-domains for various reasons, mostly for testing. For example, an enterprise running site.com tends to create sub-domains like beta.site.com, test.site.com, demo.site.com or qa.site.com. SEOs should run through a few SEO checks before the company launches test or beta websites.

It is always recommended that you propose a few SEO requirements for beta websites so that they don’t affect your main site’s SEO. The following are some of the ways they can affect your main website’s SEO:

  • Duplicate content
  • PageRank sharing
  • Excessive crawl activity
  • Incorrect indexation

Here are some tips for controlling web crawlers’ access to these internal websites so they don’t cause any SEO issues.

Implement the correct blocking rules in the robots.txt file

I recommend you add the following rules to the sub-domain’s robots.txt file:

User-agent: *
Disallow: /

Implement Canonical tags

Make sure the pages under the sub-domain contain a canonical tag pointing to the primary domain. For example, the canonical URL for the page www.beta.site.com/test-url/ should be www.site.com/test-url/. This way, if pages inside the beta site are crawled, search engines will index the correct URL.
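
Using the example URLs above, the tag placed in the <head> of the beta page would look like this:

<link rel="canonical" href="http://www.site.com/test-url/" />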

Add noindex, nofollow to internal pages

If the internal website has pages that are not present on the main website, add the “robots” meta tag to block search engines so those pages don’t get indexed. Place the following tag before the </head> tag:

<meta name="robots" content="noindex, nofollow">

Register the sub-domain in Google Webmaster Tools (GWT)

If you register the sub-domain in GWT, you can find out whether any crawl activity is happening on the site.




05 Oct 11

Google Analytics adds SEO reporting

Google has launched SEO (Search Engine Optimization) reports in Google Analytics, available once you integrate your Google Webmaster Tools account. This is great news for SEOs around the world. I would guess this is Google’s first attempt to give direct reports to SEO professionals who are not strong at web analytics (but I always recommend that a good SEO be a good web analyst too).


So let’s look at the reports Google launched recently. Three kinds of reports are available to everybody from today:

  1. Query reports
  2. Landing pages reports
  3. Geographical summary

The query report lists the top thousand user queries along with the number of impressions, clicks, average position and click-through rate (CTR). The impressions, clicks and average position come directly from GWT and are joined with Google Analytics’ CTR data. I would love to have this data when building custom reports in Analytics.
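
As a quick refresher, CTR is simply clicks divided by impressions: a query with 1,000 impressions and 50 clicks has a CTR of 50 / 1,000 = 5%.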

The landing pages report shows similar data and lists the top pages on your website. It helps you learn the top-performing and most sought-after pages on your site, along with click-through information. With effective use of the built-in filters, you can identify pages with lower CTRs, or with fewer clicks despite high impressions or high average positions, then perform more SEO optimization and improve results. These are great details for making good decisions on SEO items.

The geographical summary reports the top countries where you are doing well. If your business operates in multiple countries, the report comes in handy for analyzing your performance in each of them. This report also has a cool feature named “Google Property” which breaks performance down by Google property, like web, image and mobile search (interesting data now that mobile traffic is so important).

If you have not yet discovered how to access these reports, please visit this “how to enable the feature” blog post.




29 Sep 11

Google Analytics adds real time stats feature

The Google Analytics team has done it again with a new, innovative report: GA Real Time. The real-time report gives you an idea of your metrics as things happen on your site – something any marketer or product owner wants to know when launching something new on their website. This was a feature that only Adobe Omniture’s SiteCatalyst used to have, and now Google is ready to be on par with it.
What are the pros of this new GA Real Time feature?

  • Hourly metric analysis

Web analysts and website optimizers can get a better understanding of their daily traffic numbers. They will be able to answer questions like:

  1. What are the peak traffic times for the website?
  2. How do traffic sources (direct/search/referral) break down across time slots, such as hourly?
  3. How do usage metrics – bounces, conversions – vary at different times of the day?

  • Campaign analysis and measurement

If you are a marketer launching a new campaign, real-time data can give you a lot of good metrics for analyzing your campaign’s performance. The data gives you a better understanding of what went well and what didn’t, which can eventually help you plan your next campaign better.

  • Instant Social Media tracking

If you manage social media campaigns, this feature can help you understand social metrics better and faster. It can help you decide which social media channel converts best at which time of day.

How do you access GA Real Time now?

If you want to access the Google Analytics Real Time feature, you need to switch to the new interface. Right now the reports can be accessed in the dashboards section, but they will eventually move near the “home” section.
Google is still rolling the feature out to more users, but if you want to be an early adopter, please visit this link and send the team an access request. These reports are accessible to everybody and completely free.




28 Sep 11

How are site speed and SEO related?


There are already lots of hints from Google employees who guide SEOs, like Amit Singhal and Matt Cutts, that Google is indeed incorporating site speed into search rankings. Even the Google Webmaster team blogged about it recently. I would like to add more to that and describe in more detail how site speed relates to SEO and how it can improve your rankings, user experience and more.
Here are some of the reasons, and pieces of evidence, that site speed is important to Google:

  • Google Webmaster Tools (GWT) reports average site speed for your website: log in to GWT > Labs > Site performance (more details here). This clearly shows Google wants you to know your website’s average page load speed as its crawlers saw it.
  • Google’s recent Panda update mentions page speed. The Panda updates point to conversion rates and time on site, but I would stress that both are closely related to site speed; I will add more details to support that.
  • Google’s efforts to make the internet and the web faster through Google Code – Page Speed.
  • Google’s recent stride into content delivery – the Google Page Speed Service.

From the links above it’s very clear that Google considers site speed very important and one of the factors in deciding search rankings. On top of higher rankings, there are other benefits to improving your site speed:

  • Site speed brings more conversions (whether you sell something or grow your user base through subscriptions).
  • Users visit more pages on a faster website.
  • Users spend more time on your site.
  • Users are more likely to come back, since they had a good experience the last time they were on your site.


Please find the complete AOL optimization report here.
All the points mentioned above help Google decide that your website is high quality, and they will help it avoid being affected by future Panda updates. Please check the following link, which clearly explains some of the questions the Panda update tries to answer – Amit Singhal’s update.
Please check back for future blog posts that talk more about site speed and SEO, how you can improve it, and various tools for checking site speed.




14 Jul 11

Top 10 SEO blogs any SEO expert should follow

There are a lot of SEO blogs and resources that share all the latest news in search technology, and it’s always important to stay updated on what’s happening around search. As the technology moves and changes so fast, it’s always recommended that SEO experts follow the top SEO blogs to keep up with the latest SEO trends and advice.

Here are my top 10 favorite SEO blogs that I follow and share:

1. Google Webmaster Tools blog
2. Bing Webmaster Tools blog
3. SEOmoz Blog (don’t miss their weekly SEO discussion called Whiteboard Friday)
4. Search Engine Land
5. Matt Cutts’ blog
6. Search Engine Journal
7. Search Engine Roundtable
8. Search Engine Guide
9. SEO Book blog
10. Search Engine Watch

I recommend you create a category called “SEO blogs” in your Google Reader and add these feeds to it. These blogs are really important for keeping up with search engine news.




30 Jun 11

Top enterprise SEO tools for businesses


Here is a set of SEO tools that I use most and recommend.

Google Webmaster Tools (GWT)
Google Webmaster Tools is a very useful enterprise SEO tool that gives the enterprise SEO expert a lot of insight into their website’s current SEO status. I recommend that enterprise SEO start with adding your website to Google Webmaster Tools. I will write a separate post explaining the advantages of using GWT.

Majestic SEO
Majestic SEO is a great tool to understand the diversity of the external links coming to your website or to your competitor’s website (competitive analysis). The tool helps evaluate how an SEO campaign (building new external links) performed over a certain period of time. They have a pretty good list of features on their site and are working on adding more. Get more details here.


SEOmoz Tools
SEOmoz tools provide best-in-class data for enterprise SEO professionals to make wise decisions about their current organic search results. I recommend this SEO tool if your company can afford a PRO account, or start with a trial account via this link.

HTTPFox
HTTPFox is a Firefox add-on and a very useful plug-in for analyzing how your web servers respond to requests for your web pages. It comes in pretty handy when you are moving your web pages to new URLs and need to confirm that they return valid 301 redirect status messages. You can get the add-on here.
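
For reference, a healthy permanent redirect shows up in HTTPFox with a response like this (the URLs are placeholders):

HTTP/1.1 301 Moved Permanently
Location: http://www.site.com/new-url/

Anything returning a 302, or worse a 200 with a meta refresh, is a sign the move was not set up as a true permanent redirect.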

Link Sleuth
This SEO tool looks a little hacky, but it is one of the best tools for finding broken links on your site. Be a little careful when running it against live sites, as it can use a lot of your server’s resources. I recommend running the tool at off-peak times (when you have the fewest visitors to your website). You can get the tool here.

UserAgent Switcher
I have written a detailed blog post on this tool, which helps you analyze a webpage’s accessibility as a search engine crawler. The tool helps you perform SEO testing on web pages that were modified or newly built. UI developers use all sorts of the latest UI development techniques (most of the time to make pages cross-browser compatible), and sometimes these code snippets end up disallowing crawlers from accessing the pages. Please see my previous blog post on accessing web pages as crawlers.

Page load speed testing tool
As you know, Google loves web pages that load fast and are light. Pages with big images (in file size), tons of JavaScript calls (external and internal) and the like make the search engine’s crawling experience bad and indirectly hurt the page’s performance in search engines. I mostly recommend webpagetest.org, where you can analyze how fast your web pages load and get a detailed report on which page elements took the longest to load.
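
One small, low-risk example of trimming JavaScript cost – assuming the script has no strict load-order dependencies – is loading it asynchronously so it doesn’t block the rest of the page (the path here is a placeholder):

<script async src="/js/widget.js"></script>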




27 Jun 11

How to view a webpage like a Search engine crawler

I have come across many situations where we SEO professionals need to test a website (mostly when launching new pages) and need to access the web pages the same way a search engine crawler would. Many times software engineers assume that crawlers see the same webpage you see in a browser window, and my answer to that is ‘not always’.
Nowadays web applications are complex, and web developers can introduce logic that blocks crawlers (most of the time when a script checks user agents to deal with cross-browser compatibility). Load balancers and servers can also block search engine crawlers accidentally (most of the time when you want to block certain user agents – scraping sites, or certain search engines you don’t want indexing you, like Baidu).
The best way to test your website for search engine accessibility is to browse it with a search engine user agent in Mozilla Firefox. Here is the step-by-step procedure to start your testing:
1. Get the user agent switcher for Firefox.
I prefer the add-on “User Agent Switcher”, which can be downloaded from here.
If you access that link from Firefox itself, it’s a one-click install to add it to your browser.


2. Change the user agent in your browser
After you install the add-on, go to Firefox’s options and you will find User Agent Switcher added to the list of add-ons. It comes with a preset Googlebot user agent that you can select. If you don’t want to use the preset user agent (presets can get outdated when Google changes its user agents), I recommend you edit it and add your own user agents.

For example, the latest user agents for the major search engines are:
Google :
Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
Bing :
msnbot-webmaster/1.0 (+http://search.msn.com/msnbot.htm)
Yahoo:
Mozilla/5.0 (Yahoo-MMCrawler/4.0; mailto:vertical-crawl-support@yahoo-inc.com)
Change the description and user agent entry as shown in the image below.

3. Start visiting web pages without closing the browser

With this setting active, any page you visit will be served the same way a crawler sees it when analyzing your page content. You can then verify that your pages load completely and that all the important content sections look good.
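
A quick way to confirm the switch actually took effect is a tiny local test page that echoes the user agent string the browser is presenting – a minimal sketch (note that some switcher versions only change the HTTP header, so checking your server’s access log is another option):

<script>
// With the switcher active, this should print the Googlebot string.
document.write(navigator.userAgent);
</script>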
