Welcome to My Blog

Here I share information about Google and SEO. Please check it out and share your suggestions on these subjects.

Tuesday, June 26, 2012

30 Free Ways To Market Your Small Business Site

Are you looking for ways to market your small business website with a limited budget?
Whether it’s with established sites such as Google and Facebook, or newer outlets like Pinterest, there are plenty of options available to promote your site.

There are at least 30 ways to market your website with nothing more than an investment of time – no credit card required. Some of these are oldies but goodies, while others are newer, more exciting avenues you may not have tried yet.

Here are 30 things you can do today to get started marketing your website for free.

  1. Press releases still work. Granted, a submission to PRWeb or a Vocus account makes the pickup and link benefit much easier, but those cost dollars – so for this article let's reiterate the best free press release sources:
    • 24-7PressRelease.com
    • PRLog.org
    • IdeaMarketers.com
  2. Send the press release to your local media outlets, or any niche media outlets that may be interested in what you do.
  3. Claim, verify, and update your Google Local Business listing. This is extremely important. Google Local Listings have been absorbed into Google+, so be sure to check out this great resource over at Blumenthals.com to keep up to date on how to manage your Google Local Listing.
  4. Find a niche social media site that pertains to your exact business and participate. Be helpful, provide relevant and useful information, and your word of mouth advertising will grow from that engagement.
    • Examples:
      • Travel or hospitality business – Tripadvisor.com forums
      • Photography store – Photo.net or RockTheShotForum.com
      • Wedding Planning or Favor site – Brides.com or Onewed.com forums
      • Search your niche or service plus forums to find ideas. If there isn’t a forum out there, consider starting one.
  5. Build a Google+ page for your business and follow businesses that are related to your product or service niche. Share informative and relevant content and link to your profile from your website. You should also consider allowing users to +1 your content on a page-by-page basis.
  6. Setting up mutually beneficial relationships with local businesses or others in your niche can help you reach eyes you never reached before. Be sure to answer the question "Will my user find this information beneficial as they shop and purchase?" every time you link to a resource, or request a link or listing on another site.
  7. Comment and offer original, well thought out, sensible information, opinion and help on blogs that are relevant to your website's topic and be sure to leave your URL. Even if a nofollow tag is attached, you could gain a bit of traffic and some credibility as an authority on the subject matter. This is not blog comment spamming, this is engaging in a conversation relevant to your website's topic.
  8. Set up and verify a Webmaster Central Account at Google.
  9. Set up a Bing Webmaster Tools account and verify it.
  10. Update or create your XML sitemap and upload it to Google Webmaster Tools and Bing Webmaster Tools (a minimal sitemap example follows this list).
  11. Write a "how-to" article that addresses your niche for Wikihow.com or Answers.com. This is kind of fun and a good resource for getting mentions and links. Looking at your product or service in a step-by-step manner is often enlightening in several ways. It can help you better explain your products and services on your own website. I will say I don't know why some of these sites still rank well; many of them are junk. I do like most of the answers on the two sites mentioned above. Be picky with where you participate.
  12. Write unique HTML page titles for all of your pages. This is still extremely important, don’t skimp on this one.
  13. Share your photos at Flickr – get a profile, write descriptions, and link to your website. Don't share photos you don't own or have permission to use.
  14. Start a blog. There's nothing wrong with getting the basics of blogging down by using a free service from Blogger or WordPress.
  15. Make sure your Bing and Yahoo Local listings are up to date.
  16. Update and optimize your description and URL at YP.com. They'll try to get you to spend money on an upgraded listing or some other search marketing options. Don't bother with that, but make sure the information is accurate and fresh.
  17. Use your Bing Webmaster Tools account to look at your incoming links. How do they look? Are all of the sites relevant and on-topic? If not, reevaluate your link building practices and start contacting any of the irrelevant sites you can and ask them to take down your link. A clean and relevant incoming link profile is important; cleaning up bad links is a necessity until we can tell Google and Bing which links we want them to ignore.
  18. Make a slideshow of your products or record an original how-to video and upload to YouTube. Be sure to optimize your title and descriptions. Once it's uploaded, write a new page and embed the video on your own Web site. Add a transcription of the video if possible.
  19. Try a new free keyword tool for researching website optimization, then see #20.
  20. Add a page to your site focused on a top keyword phrase you found in #19.
  21. Build a Facebook Page and work to engage those that are interested in your product or service. Facebook is so much more robust than it ever was! Create groups, events, and photo albums. Link to your Facebook profile from your site and allow visitors to your site to like and share your content.
  22. Install Google Analytics if you don’t have any tracking software. The program is pretty amazing and it's free. You need to do this if you haven’t already. It's that important.
  23. Start Twittering or start doing it much better than you are now – it's a great way to network with like-minded individuals.
  24. Pinterest is hot right now. If you have visually stimulating content that is relevant to the site's demographic, you can find great success right now. Be sure you're using solid practices for marketing on Pinterest as you get started.
  25. Create a new list in Twitter and follow profiles of industry experts you know and trust. Use this as your modern feed reader. I don’t use RSS feed readers anymore. I like content that has been vetted by my peers and is worthy of a tweet or two.
  26. Try a new way to write an ad for a struggling PPC ad group or campaign.
  27. Review your Google Analytics In-Page insights and take note of how users are interacting with your page. Where do they click? What is getting ignored? Make changes based on this knowledge.
  28. Set up a Google Content Experiment through your Analytics account and test with the information you obtained and the changes you made in #27.
  29. Build a map at Google Maps and add descriptions for your storefront, locations, and nearby useful points of interest. Make your map public and embed it on your own website. Add links back to relevant content on your site if possible to each point of interest.
  30. Keep reading Search Engine Watch for more free tips and tricks.
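
As a reference for #10 above, this is roughly what a minimal XML sitemap looks like under the sitemaps.org protocol (the URLs are hypothetical placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2012-06-26</lastmod>
      </url>
      <url>
        <loc>http://www.example.com/products</loc>
      </url>
    </urlset>

Save it as sitemap.xml in your site root, then submit that URL in both webmaster tools accounts.
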
There you have it – 30 ways to market your website. Get to work and make something happen! There's no reason to say you can't be successful because you don’t have a huge advertising budget. Time is all you need.

Source: http://searchenginewatch.com/article/2048588/30-Free-Ways-To-Market-Your-Small-Business-Site

Why Content Marketing is a Great SEO Strategy, Not a Short-Term Tactic

Content marketing is a great SEO strategy – even better than link building. Shifting your strategy from search marketing to content marketing is increasingly leading to higher search rankings and more organic traffic.

Some tests in May that looked at the impact of Google+ on organic search performance produced some interesting results. I analyzed two sets of clients I was working with and categorized them as:
  1. Websites with strong social profiles.
  2. Websites with weak social profiles.
What this analysis showed was:
  • Websites with weak social profiles saw a 19.5 percent reduction in organic traffic.
  • Websites with strong social profiles saw a 42.6 percent increase in organic traffic.
Google is now valuing authorship, natural links, and social signals far more highly. So the next natural step is content marketing.

With many SEO campaigns, it can be easy to over-analyze, often at the expense of the most important ranking factor: doing. Yes, analysis and auditing are important, but if you don't take action and change anything, your results aren't going to change. If anything, they'll probably get worse because your competitors will be out there doing instead.

Every site will have different SEO needs and requirements – but too often the actionable outcome of SEO audits and analysis is that a website needs more great content and it needs more high quality links. In these cases, why not just get on with building great content and attracting high-quality natural links?

Why Link Building is a Short-term Tactic

If your main focus for achieving search success is via SEO-based link building, then I think this can only ever be a short-term tactic at best.

The algorithms are looking to catch anything that appears unnatural – so when the next Penguin or Panda updates come around (or Platypus or Pigeon, whatever stupid name they give it next!) you're unlikely to be in a defensible position where you can expect to see a benefit rather than a drop. In fact, you're probably going to be pretty scared and concerned about what's around the corner, even if you haven't been hit yet.

Link Building Should be a Byproduct of Great Content

The main difference is that link building is a tactic, while content marketing is a strategy. What I mean by this is that if you're just trying to build links for SEO purposes and nothing else, you're basically just chasing Google's algorithm and making the most of what works while it's still getting you results. It can still work, but it's not a long-term strategy.

Great content, however, can send you targeted traffic for years. And I don't just mean search traffic, but referral, social and viral/word of mouth traffic.

Getting a great link shouldn't be your only end goal – you should think about other target metrics such as audience reach, traffic, mentions, citations, eyeballs, rankings, followers – or, more importantly, revenue!

What Happens if Links are no Longer Valued by Search Algorithms?

I can't see this happening in the near future – certainly not with Google – but who knows what's ahead of us. The 2011 ranking correlation factors from SEOmoz showed that Facebook likes/shares had the highest correlation to rankings out of all factors. This is correlation, not causation.

Google has said that they don't use this data for rankings - but it showed how powerful social data can be in terms of identifying the best content. So what happens if Google change their mind and start using it? Or what if Facebook/Twitter search becomes a real threat to Google?

You need to have something else to fall back on.

If a piece of content has 100 or more links and no social footprint, it's a clear sign that those links have been built to a page – they've not been naturally generated. Likewise, if you have many social votes for a piece of content, yet no links, it's also not the best sign that this is a high-quality page that demands trust and relevancy from the search engines. You need a mix of both – and it's becoming much more difficult to fake and make shortcuts.

Where to go From Here?

Whether it's content marketing, inbound marketing, earned media or just online marketing – what it's called is largely irrelevant. What's important is that you've got a great content strategy in place and you're able to make the most of this by promoting your content to generate attention online.

If you've got great content and you can attract/build an engaged audience, you don't have to rely on search. And even if you don't notice your organic traffic rising straight away, I would be confident that this is the best method right now for achieving long-term success.

Source: http://searchenginewatch.com/article/2186953/Why-Content-Marketing-is-a-Great-SEO-Strategy-Not-a-Short-Term-Tactic

Thursday, May 31, 2012

The 10 most common Landing Page Optimization mistakes

The first thing that most people think of when they want to boost their online sales is increasing their website traffic. Thus they focus on getting more links, achieving better search engine rankings, investing more in advertising, sending more emails and newsletters, finding more affiliates, becoming more active on social media, and using other ways that can help them increase the number of their visitors. But is this always the right approach? Certainly not! Before spending all of your resources trying to acquire expensive traffic, you should first ask yourself: "Are my landing pages optimized? Do they help me generate sales?"

Landing page optimization is probably one of the most important, yet most ignored, parts of an online marketing strategy. Keep in mind that in many cases, increasing your traffic by 10% will not have the same impact on your revenues as improving your conversion rates by the same percentage. This is primarily because increasing your traffic incurs several costs that can squeeze your profit margins. Thus, as we discussed in a previous article, building effective landing pages should be a top priority in order to improve your conversion rates and increase your sales.

That is why in this article we discuss the 10 most common landing page optimization mistakes and explain how each problem affects an online marketing campaign:

1. Hidden or Unclear Call to Action

As we said in a previous blog post, the call to action is a simple way to interact directly with your visitors and encourage them to take an immediate action (such as buy, call, subscribe, register, download, etc.) after visiting your website or viewing your page. The call to action should be distinctive, it should be visible at the top of the page, and it should communicate a clear message to your readers. Failing to use a clear call to action, or failing to focus the user's attention on it, can definitely lead to lower conversion rates.
[Image: eye tracking data]

2. Having too much Text

Having too much text on the landing pages of your website is not helpful for your visitors. Keep in mind that the landing page should get the user's attention and help him/her understand the basic features of your product/service. The rule "less is more" applies in this situation. Present the most important aspects of your product/service and provide a way to read more details if the user wants them. Use clever graphics and visuals to convey messages quickly, and don't forget that a picture is worth a thousand words.
Now I am sure that many SEOs will argue that such an approach will have a dramatic impact on the SEO strategy of the website. Think again! You can place the longer text below the fold (where it will not distract users), or use JavaScript/jQuery to hide the content and show it only when visitors want more information. There are several ways to solve this problem technically. Just think outside the box and keep in mind that visitors have limited time, so you have only a few seconds to get their attention.
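
Here's a minimal sketch of the hide/show approach (it assumes jQuery is loaded; the element IDs are hypothetical):

    // Keep the detailed copy in the HTML, but out of the first view.
    $('#full-details').hide();

    // Reveal it only for visitors who ask for it.
    $('#read-more').on('click', function () {
      $('#full-details').toggle();
      return false; // don't follow the link
    });

Because the text remains in the page source, search engines can still crawl it, while users see a short, focused page by default.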

3. Too many Links/Choices

Providing too many choices to users is not advised, since it can confuse them and make them leave your website. Keep the number of choices small: this makes it easier for users to navigate your site and increases the odds of focusing them on the call to action. This applies especially when the visitor lands on the page from an AdWords or PPC advertisement.
[Image: your landing page]
Again, you don't have to sacrifice your SEO campaign and destroy the link architecture of your website in order to create effective landing pages. You can resolve these issues easily by using several web development techniques and by implementing clever designs. The links can remain on the page as long as you focus the user's attention on the choices that you want him/her to make.

4. Visual Distractions

Avoid at all costs any visual distractions that will make the user ignore the call to action. Removing flashing advertising banners, floating boxes, popups, and other similar distractions is essential to increasing conversion rates. Keep in mind that using too many calls to action on the same page can also be a distraction. Even though you can have more than one call to action on the same page, make sure you prioritize them and do not confuse or distract the user from your main goal.

5. Requesting too much information on forms

In many cases landing pages contain HTML forms where the visitor has to fill in his/her information in order to proceed (download, buy, subscribe, register, contact sales, etc.). If possible, eliminate the forms that are not necessary, since many users consider them barriers. Where a form is needed, request only the information that is absolutely necessary and avoid asking for things that are not useful or relevant, or that the user might be reluctant to provide immediately. Don't forget that you can ask for this information later, once you gain the user's trust.

6. Message Mismatch

It's a common mistake to use advertising campaigns with misleading text in order to drive traffic to the landing page. A message mismatch between the ad and the landing page can lead to high bounce rates and low conversions. By using misleading ads, you not only attract untargeted traffic and waste your resources, but you also risk getting banned by the advertising networks, since this is a clear violation of their terms of use. The same applies not only to ads but to all of your promotional campaigns and the links that you place across the web.
[Image: landing pages and ads]

7. Lack of Trust

Another reason why you might have low conversion rates is that the potential clients who visit your website do not trust you. To resolve this problem, first of all make sure you have a well-designed website, which is an indication that you are a serious company. Also include the logos and reviews of your satisfied clients, or provide reviews from experts in the industry. Finally, keep in mind that you should display in a visible position any trust seals or badges that show your website is safe for online transactions.

8. Not tracking the results of your landing pages

In order to optimize a landing page, you must know how well it performs. A major mistake that lots of people make is that they don't monitor and track the results of their landing pages. Make sure you invest time and effort in reviewing your traffic logs, tracing the behaviour of the user within your website and, above all, tracking every click of the user by using Google Analytics Event Tracking, Virtual Pageviews, or similar techniques. This information will be invaluable when you start evaluating and optimizing the results of each landing page.
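
For example, with the classic (2012-era) asynchronous Google Analytics snippet loaded, a click can be recorded as an event or as a virtual pageview (the category/action/label names and the virtual URL below are hypothetical):

    // Record a click on the main call to action as an event.
    _gaq.push(['_trackEvent', 'Landing Page', 'CTA Click', 'Free Trial Button']);

    // Or log a virtual pageview for a step that has no real URL of its own.
    _gaq.push(['_trackPageview', '/virtual/free-trial-form-submitted']);

Both appear in your reports and let you measure exactly which elements of the landing page are doing their job.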

9. Not testing the results of your landing pages

If you don't test different versions of the landing page, you can't be sure that it works properly (or how well it works). By using A/B testing you can gather lots of useful data about the behaviour and needs of the user, and use this information to improve your pages and make the right marketing decisions. Don't forget that in order to evaluate the results of A/B testing you need to gather results over a period of time, and of course to set up all the necessary mechanisms that will help you monitor the results of each page. Google's Website Optimizer is a useful service that can help you perform such tests; nevertheless, there are several other solutions and tools that you can use.
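
Tools like Website Optimizer handle the mechanics for you, but the underlying idea is simple. A naive sketch, assuming jQuery and the classic Google Analytics snippet are loaded (all names are hypothetical):

    // Assign each visitor to variant A or B once, and remember it in a cookie.
    function getVariant() {
      var match = document.cookie.match(/ab_variant=(A|B)/);
      if (match) return match[1];
      var variant = Math.random() < 0.5 ? 'A' : 'B';
      document.cookie = 'ab_variant=' + variant + '; path=/; max-age=2592000';
      return variant;
    }

    var variant = getVariant();
    if (variant === 'B') {
      $('#headline').text('Start your free trial today'); // the alternative copy
    }

    // Tag the variant onto your tracking so the two versions can be compared.
    _gaq.push(['_trackEvent', 'AB Test', 'Variant ' + variant, location.pathname]);
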
[Image: A/B testing]

10. Not optimizing your Landing Pages

Webmasters and online marketers optimize their ad/SEO/social media/online marketing campaigns all the time. Why shouldn't we do the same for landing pages? Redesigning the landing pages of the website by taking into account the results that we track, the A/B testing, and the feedback from users is absolutely necessary in order to improve conversion rates. Don't forget that it's your client who should "design" your website, not your web designer or online marketing consultant. And since different clients have contradictory needs, you need to find a solution that satisfies most of them. This can only be done by dedicating time and effort to monitoring, testing, and optimizing your landing pages.

Landing page optimization is a difficult and challenging task that requires you to present information in an optimal way, to know your clients and understand their needs, to monitor closely the performance of your pages, and to invest time and effort in improving their results and trying new approaches. Still, landing page optimization is not rocket science, and in many cases, by dedicating the appropriate time, you can achieve very good results with just a few targeted changes to your marketing approach.

Source: http://www.webseoanalytics.com/blog/the-10-most-common-landing-page-optimization-mistakes/

Thursday, May 24, 2012

Internal Linking to Promote Keyword Clusters

In “Keyword Clustering for Maximum Search Profitability” we discussed the idea of clustering keywords and how doing so can increase the speed and effectiveness of your link building efforts. The article was based on the principle of passing internal strength between the pages of your website.

Today we'll discuss this subject in more detail. Be warned, it'll involve a little bit of math (we are dealing with Google's algorithm after all, so there's almost always going to be an element of that). That said, I'm going to do the math for you to provide what's most important: an understanding of why the formulas work and simple ways to determine what needs to be done. If you understand the why, the math essentially does itself.

What Are Internal Links?

We all know that internal links are the links within your website that enable visitors to get from one page to another, a point we won't dwell on further. The question you want to answer here is, “What do internal links mean to a search engine and how are they weighted?”

At its core, an internal link adds value to your pages in a manner similar to third-party links to your site. That said, this would be a poor attempt at education if I assumed knowledge, so let's take it from the top and answer first the question… how does strength pass from one page to another? (Note: many of the principles here apply to both internal and external links to a page.) From there we'll look at how external links impact the weight flow.

When I ponder the value of a link, either internal or from a third party, I consider the world of Orwell's "Animal Farm". The first and foremost thought in my head is that each page has a vote – a chance to cast its ballot in favor of other resources. Where it gets Orwellian is in his infamous quote, which I will bastardize for my use here: "All votes are equal, but some are more equal than others."

To put this in the context of internal links, a link from the homepage of a site or another strong page will be weighted higher than a link from a weak page 12 levels deep in its hierarchy. With this sentiment we know that the old real estate adage of "location, location, location" holds as true in SEO as it does in the "real world"; however, it gets even more true when we consider the other elements that come into play.

A picture is worth a thousand words, so below you'll find an image of a simple site hierarchy. Due to my complete lack of design ability, hopefully this picture is worth at least 75 words or at least doesn't draw from the overall word count of this article. It will serve the purpose needed here at least.

A Simple Website Structure

Below is a seven-page website (six internal pages and a homepage). Now let's consider how the weight will pass from one page to another. In my calculations I am not factoring in the evaporation of weight that occurs with every link (Matt Cutts discussed this about 40 seconds into his video here).

Because this happens with every link on your site and on your competitors' sites as well, it can be viewed as a level playing field and ignored for simplicity, though it does reinforce the case for limiting the number of links on a page to minimize evaporation. But back to the point at hand.

No matter what the site is, one can assume the homepage value is 100. This is because I'm only factoring in the link passing within the site, not valuing the site against others. So let's begin.

[Image: internal-link-example-1]
If the homepage value is 100, the value passes as:
  • Homepage – 100
  • One – 33.3
  • Two – 33.3
  • Three – 33.3
  • Four – 16.7
  • Five – 16.7
  • Six – 33.3
This assumes that each of these pages links only to the pages in the diagram and the weight is split evenly among all links. In the real world, however, it would be more realistic (though messy in the illustration) to assume that each page links to each page higher in the hierarchy plus the homepage. So let's look at what each page will pass.

The homepage starting value is 100, meaning that it will indeed pass 33.3 in value to each of the pages one level down.

Rather than only linking downward, however, these pages will also link back to the home page, giving the following values:

The Home page passes:
  • 33.3 to Page One
  • 33.3 to Page Two
  • 33.3 to page Three
Page One passes:
  • 11.1 to Home
  • 11.1 to Page Four
  • 11.1 to Page Five
Page Two passes:
  • 33.3 to Home
Page Three passes:
  • 16.7 to Home
  • 16.7 to Page Six
Page Four passes:
  • 5.6 to Home
  • 5.6 to Page One
Page Five passes:
  • 5.6 to Home
  • 5.6 to Page One
Page Six passes:
  • 8.4 to Home
  • 8.4 to Page Three
So at the end we end up with the following values:
  • Home – 180.7
  • Page One – 44.5
  • Page Two – 33.3
  • Page Three – 41.7
  • Page Four – 11.1
  • Page Five – 11.1
  • Page Six – 16.7
So clearly we see a situation where the second level in a hierarchy gains value by having a larger number of pages as sub-sections of it.
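
To make the mechanics concrete, here is a small JavaScript sketch of the even-split model used in this example (the page names mirror the diagram; the tiny differences from the figures above are rounding):

    // Outbound links for each page: the diagram's downward links plus the
    // links back up the hierarchy.
    var links = {
      home:  ['one', 'two', 'three'],
      one:   ['home', 'four', 'five'],
      two:   ['home'],
      three: ['home', 'six'],
      four:  ['home', 'one'],
      five:  ['home', 'one'],
      six:   ['home', 'three']
    };

    // Weight each page has available to pass, from the top-down cascade
    // (home starts at 100 and each level splits its weight evenly).
    var value = { home: 100, one: 33.3, two: 33.3, three: 33.3,
                  four: 11.1, five: 11.1, six: 16.7 };

    // Each page splits its value evenly across its outbound links;
    // the homepage also keeps its starting 100.
    var total = { home: 100, one: 0, two: 0, three: 0, four: 0, five: 0, six: 0 };
    Object.keys(links).forEach(function (page) {
      var share = value[page] / links[page].length;
      links[page].forEach(function (target) { total[target] += share; });
    });

    console.log(total);
    // ≈ { home: 180.5, one: 44.4, two: 33.3, three: 41.7,
    //     four: 11.1, five: 11.1, six: 16.7 } – the rounded totals above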

Now let's look at a more realistic (albeit advanced) example and consider the weight passing if each page links to those within its cluster as well as all the pages above it.

For example, Page Four would link to pages One and Five as they're within its cluster and also the homepage and pages Two and Three. This mimics an environment where pages One, Two, and Three are in the main navigation along with the homepage.

Further to that, we'll consider the inclusion of breadcrumb navigation adding additional links to pages within the direct hierarchy and thus part of the cluster. As discussed in the video above (and witnessed many times in practical application) the addition of these links passes more “juice” to those pages (the simple math is, two links to a page gives it twice the weight – we'll go with that for our purposes here).

So let's look at how that weight passes this time:

The Home page passes:
  • 25 to itself
  • 25 to Page One
  • 25 to Page Two
  • 25 to Page Three
Page One passes:
  • 7.1 to Home
  • 3.6 to itself
  • 3.6 to Page Two
  • 3.6 to Page Three
  • 3.6 to Page Four
  • 3.6 to Page Five
Page Two passes:
  • 10 to Home
  • 5 to Page One
  • 5 to itself
  • 5 to Page Three
Page Three passes:
  • 8.3 to Home
  • 4.2 to Page One
  • 4.2 to Page Two
  • 4.2 to itself
  • 4.2 to Page Six
Page Four passes:
  • 0.9 to Home
  • 0.9 to Page One
  • 0.5 to Page Two
  • 0.5 to Page Three
  • 0.5 to itself
  • 0.5 to Page Five
Page Five passes:
  • 0.9 to Home
  • 0.9 to Page One
  • 0.5 to Page Two
  • 0.5 to Page Three
  • 0.5 to Page Four
  • 0.5 to itself
Page Six passes:
  • 1.2 to Home
  • 0.6 to Page One
  • 0.6 to Page Two
  • 1.2 to Page Three
  • 0.6 to itself
So at the end we end up with the following values:
  • Home – 153.4
  • Page One – 40.2
  • Page Two – 39.4
  • Page Three – 40.0
  • Page Four – 4.6
  • Page Five – 4.6
  • Page Six – 4.8
So we end up with a curious situation in the math. What one may conclude is that it's better to keep a limited number of sub-pages in your site, after all – Page Six is carrying more weight than Page Four so it must be a better structure.

The only takeaway I hope you draw at this stage is that it's a good idea to take out expired products or useless pages (for your users as much as for the engines). To illustrate why, I'm going to shift our site into the real world and imagine once again that we're selling bike parts (see the last article for reference here).

Let's imagine the following page definitions:
  • Homepage – My bike store
  • Page One – Suspension forks page
  • Page Two – Privacy Policy
  • Page Three – Dual-suspension frames page
  • Page Four - Marzocchi 44 Rlo page
  • Page Five - Marzocchi 44 TST2 page
  • Page Six - Banshee spitfire page
To see the impact on ROI, let's imagine a scenario where I build a link to both pages Four and Six. The link built will give the same weight (let's give it the arbitrary value of 5 by our weight model above) and let's now calculate what happens.

I'm going to omit the math and simply list the final numbers. We'll assume that the starting values are those defined above, so the weighting and re-factoring on the engine's part will produce higher numbers (as the values from the pages low in the hierarchy add weight to the pages above them).
  • Homepage – 248.8
  • Page One – 131
  • Page Two – 125.6
  • Page Three – 129.4
  • Page Four – 25.4
  • Page Five – 20.4
  • Page Six – 26.2
So here we see that the weight given to Page Six is higher than either of the two other third-tier pages; however, there are two important points we need to consider before asserting that a hierarchy that puts a single path to each product is superior.

Had we divided the paths off into four from the homepage to facilitate each of the product pages, the split in weight from the homepage would have yielded the following example.

[Image: internal-link-example-2]

If this is the case we would have found the following to be the final values (assuming the link additions to pages four and six as above and the same breadcrumb navigation):
  • Homepage – 228.5
  • Extra Page – 87.1
  • Page One – 85.1
  • Page Two – 80.6
  • Page Three – 87.1
  • Page Four – 10.8
  • Page Five – 9.8
  • Page Six – 10.8
The weakening of the links off the homepage weakened the entire site, reducing all the potential rankings of the internal pages.

While the link to Page Six in the first example yielded a higher page weight on that specific page when it was in a single path on the third tier of the site, the benefit was limited to that page alone. When we linked into a cluster, the benefit on the individual page was reduced slightly as the weight shifted with the increased number of internal links; however, the weight of a number of pages improved (including the product category page).

By clustering your targeted keywords together you'll be building links to groupings of pages that will, by design, help the rankings and page weight of each other.

A Beneficial Issue With The Calculations

It's only fair to note when there are known issues and unknowns with the data. In the above calculations, where I added in weight from third-party links, I reduced the weight of those links along with the internal weight.
The treatment of weight from external sources by the major search engines is undoubtedly different than internal weight and it's likely that the target page of the link would hold the full weight or a larger portion of it and then pass the weighting along without diminishing from itself.

What I mean by this is that the link to Page Four in the initial example held a weight of 5 and was divided in our math by 8 (the total number of links on the page). It's far more likely that the page would keep all, or nearly all, of that 5 and then proceed to pass on weight internally without diminishing its own, or diminishing it only slightly.

Essentially what this means is that building a logical clustered hierarchy is, if anything, even more effective than outlined in the examples above.

Final Word

The data matches closely to what the search engines are trying to push webmasters toward: provide a logical and well-coded site architecture that serves your visitors well and you'll be rewarded. Imagine if Amazon tried to apply the example from the second graphic above and provide a link to each product page on their homepage. Usability would be horrible, page load speed would be a disaster, and because math works well, their rankings would be non-existent.

I don't expect all the readers of this article to draw out diagrams and do the math behind this for their own sites; it's time-consuming enough with a seven-page example – let alone a 1,000-plus page site. However, as the numbers expand, the math stays the same and the benefits only amplify. And that's why clustering keywords as discussed in last month's article works.

Source: http://searchenginewatch.com/article/2179376/Internal-Linking-to-Promote-Keyword-Clusters

Tuesday, May 22, 2012

Two Weeks In, Google Talks Penguin Update, Ways To Recover & Negative SEO

It’s been about two weeks since Google launched its Penguin Update. Google’s happy the new spam-fighting algorithm is improving things as intended. But some hurt by it are still wondering how to recover, and there remain concerns about “negative SEO” as a threat. I caught up with Matt Cutts, the head of Google’s web spam team, on these and some related questions.

Penguin: “A Success”

The goal of any algorithm update is to improve search results. So how’s Penguin been for Google?
“It’s been a success from our standpoint,” Cutts said.

What About Those Weird Results?

Of course, soon after Penguin was released, people quickly started citing examples of odd results. The official Viagra site wasn’t listed, while hacked sites were. An empty web site was listed for “make money online,” and there were reports of other empty sites ranking well. Scraper sites were reported outranking the sites they scraped.

How could Penguin be a success with these types of things happening?

Cutts said that many of these issues existed before Penguin launched and were not caused by the new spam-fighting algorithm.

Indeed, the Viagra issue, which has now been fixed, was a problem before Penguin hit. Penguin didn’t cause it.

False Positives? A Few Cases

How about false positives, people who feel they’ve been unfairly hit by Penguin when they weren’t doing any spam?

“We’ve seen a few cases where we might want to investigate more, but this change hasn’t had the same impact as Panda or Florida,” Cutts said.

The Panda Update was Google’s big update that targeted low-quality spam last year. The Florida Update was a major Google update in 2003 intended to improve its search quality.

I’d agree that both of those seemed to have impacted more sites than Penguin has, based on having watched reactions to all these updates. Not everyone will agree with me, of course. It’s also worth the regular reminder that for any site that “lost” in the rankings, someone gained. You rarely hear from those who gain.
Bottom line, Google seems pretty confident that the Penguin Update is indeed catching people who were spamming, as was intended.

Why Spam Still Gets Through

Certainly when I’ve looked into reports, I’ve often found spam at the core of why someone dropped. But if Penguin is working, why are some sites that are clearly spamming still getting through?

“No algorithm is perfect. While we’d like to achieve perfection, our litmus test is, ‘Do things get better than before?’,” Cutts said.

Cutts also explained that Penguin was designed to be quite precise, to act against pages when there was extremely high confidence that spam was involved. The downside is that some spam might get through, but the upside is that you have fewer false positives.

How Can You Recover?

One of the most difficult things with this update is telling people how to recover. Anyone hit by Penguin was deemed to be spamming Google.

In the past, if you spammed Google, you were told to file a reconsideration request. However, Google’s specifically said that reconsideration requests won’t help those hit by Penguin. They’ll recover naturally, Google says, if they clean the spam up.

However, one of the main causes I've seen when looking at sites hit by Penguin seems to be bad linking practices. People have used sponsored WordPress themes, engaged in poor-quality reciprocal linking, purchased links, or participated in linking networks, such as those recently targeted by Google.
How do people pull themselves out of these link networks, if perhaps they don’t have control over those links now?

“It is possible to clean things up,” Cutts said, and he suggested people review two videos he’s done on this topic:

[Two embedded videos]

“The bottom line is, try to resolve what you can,” Cutts said.

Waiting On Penguin To Update Again

If you do clean things up, how will you know? Ideally, you’ll see your traffic from Google recover, the next time Penguin is updated.

That leads to another important point. Penguin, like Panda, is a filter that gets refreshed from time-to-time. Penguin is not constantly running but rather is used to tag things as spam above-and-beyond Google’s regular spam filtering on a periodic basis.

Is Penguin a site-wide penalty like Panda or page-specific? Cutts wouldn’t say. But given that Panda has site-wide impacts, I think it’s a fair assumption that Penguin works the same.

What that means is that if some of your site is deemed Penguin-like, all of it may suffer. Again, recovery means cleaning up the spam. If you’ve cleaned and still don’t recover, ultimately, you might need to start all over with a fresh site, Cutts said.

New Concerns Over Negative SEO

Before Penguin, talk of "negative SEO" had been ramping up. Since then, it seems to have gotten worse in some places. I've seen post after post making it sound as if anyone is now in serious danger that some competitor can harm them.

At the core of these fears seems to be a perfect storm of assumptions. Google recently targeted some linking schemes. That caused some people to lose traffic. Google also sent out warnings about sites with “artificial” or “unnatural” links. That generated further concerns in some quarters. Then the Penguin Update hit, which caused more people to lose traffic as they were either hit for link spam or no longer benefited from link spam that was wiped out.

These things made it ripe for people to assume that pointing bad links at a site can hurt it. But as I wrote before, negative SEO concerns aren’t new. They’ve been around for years. Despite this, we’ve not seen it become a major concern.

Google has said it’s difficult for others to harm a site, and that’s indeed seemed to be the case. In particular, pointing bad links at a good site with many other good signals seems to be like trying to infect it with a disease that it has antibodies to. The good stuff outweighs the bad.

Cutts stressed again that negative SEO is rare and hard. “We have done a huge amount of work to try to make sure one person can’t hurt another person,” he said.

Cutts also stressed again what Google said before. Most of those 700,000 messages to publishers that Google sent out earlier this year were not about bad link networks. Nor were they all suddenly sent on the same day. Rather, many sites have had both manual and algorithmic penalties attached to them over time, but these were never revealed. Google recently decided to open up about them.

After Negative SEO Campaign, A Link Warning

Of course, new messages do go out, which leads to the case of Dan Thies. His site was targeted by some trying to show that negative SEO works. He received an unnatural link warning after this happened. He also lost some rankings. Is this the proof that negative SEO really works?

Thies told me that his lost rankings were likely due to changes he made himself, when he removed a link across all pages on his site that led back to his home page. After restoring that, he told me, he regained his rankings.

His overall traffic, he said, never got worse. That tends to go against the concerns that negative SEO is a lurking threat, because if it had worked enough to tag his site as part of the Penguin Update, he should have seen a huge drop.

Still, what about the link warning? Thies did believe that came because of the negative SEO attempt. That's scary stuff. He also said he filed three reconsideration requests, each of which returned messages saying that no spam actions were found. Was he hit with a warning, but not one that was also associated with a penalty?

I asked Cutts about the case, but he declined to comment on Thies’s particular situation. He did say that typically a link warning is a precursor to a ranking drop. If the site fixes the problem and does a reconsideration request quickly enough, that might prevent a drop.

Solving The Concerns

I expect we’ll continue to see discussions of negative SEO, with a strong belief by some that it’s a major concern for anyone. I was involved in one discussion over at SEO Book about this that’s well worth a read.
When it’s cheaper to buy links than ever, it’s easy to see why there are concerns. Stories like what happened to Thies or this person, who got a warning after 24,000 links appeared pointing at his site in one day, are worrisome.

Then again, the person’s warning came after he apparently dropped in rankings because of Penguin. So did these negative SEO links actually cause the drop, or was it something else? As is common, it’s hard to tell, because the actual site isn’t provided.

To further confuse matters, some who lost traffic because of Penguin might not be victims of a penalty at all. Rather, Google may have stopped allowing some links to pass credit, if they were deemed to be part of some attempt to just manipulate rankings. If sites were heavily dependent on these artificial links, they’d see a drop just because the link credit was pulled, not because they were hit with a penalty.

I've seen a number of people now publicly wishing for a way to "disavow" links pointing at them. When I asked, Google had no comment about adding such a feature at this time. I certainly wouldn't wait around for it now, if you know you were hit by Penguin. I'd do what you can to clean things up.

One good suggestion out of the SEO Book discussion was that Google not penalize sites for bad links pointing at them. Ignore the links, don’t let the links pass credit, but don’t penalize the site. That’s an excellent suggestion for defusing negative SEO concerns, I’d say.

I’d also stress again that from what I’ve seen, negative SEO isn’t really what most hit by Penguin should probably be concerned about. It seems far more likely they were hit by spam they were somehow actively involved in, rather than something a competitor did.

Recovering From Penguin

Our Google Penguin Update Recovery Tips & Advice post from two weeks ago gave some initial advice about dealing with Penguin, and that still holds up. In summary, if you know that you were hit by Penguin (because your traffic dropped on April 24):
  • Clean up on-page spam you know you’ve done
  • Clean up bad links you know you've been involved with, as best you can
  • Wait for news of a future Penguin Update and see if you recover after it happens
  • If it doesn’t, try further cleaning or consider starting over with a fresh site
  • If you really believe you were a false positive, file a report as explained here
Just in, by the way, a list of WordPress plug-ins that apparently insert hidden links. If you use some of these, and they have inserted hidden links, that could have caused a penalty.

I’d also say again, take a hard look at your own site. When I’ve looked at sites, it’s painfully easy to find bad link networks they’ve been part of. That doesn’t mean that there’s not spam that’s getting past Penguin. But complaining about what wasn’t caught isn’t a solution to improving your own situation, if you were hit.


Sunday, May 20, 2012

SEO checklist - 10 things to check right now

If you’re not used to SEO, and frankly, even if you are, there’s a lot to take in, a lot to remember. To make sure your bases are covered, here are ten things to check right now. (Start at the top of your web pages and work down.)

1) The blue text you see in the search results (see the picture further down) is called the 'title tag'. What do yours look like? Are they enticing? Keyword rich? If someone read yours, would they want to click through?

2) Does your meta description, the bit that comes underneath the title in the search results, describe your web page accurately? Is it inviting? If I was a random user, would I click it? The answer to these questions should always be yes.

[Image: SERPs listing]
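
To make items 1 and 2 concrete, the markup behind such a listing looks roughly like this (the business and copy are invented):

    <title>Handmade Wedding Favors | Example Gifts Co.</title>
    <meta name="description" content="Browse over 200 handmade wedding favors.
      Free delivery on orders over $50 - order by 2pm for same-day dispatch.">

The title becomes the blue link; the meta description usually becomes the snippet beneath it.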

3) Are all your URLs (e.g., www.example.com) easy to understand? If a human read them, could they get a sense of what your page was about? Test it now. Pick any page on your website (not your home page), copy what appears in the address bar of your browser, and email it to a relative. An old one. Could they roughly describe your page without seeing it?

4) If someone was to look at your page for three seconds, could they tell you what it's about? Make sure your headers, the headlines in the copy, and the H1 and H2 tags in the code, are descriptive and keyword rich.
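
For example, a category page might be structured like this (hypothetical markup):

    <h1>Women's Road Bikes</h1>
    <h2>Lightweight Carbon Frames</h2>
    <h2>Entry-Level Aluminium Models</h2>

A skim of the headings alone should tell that three-second visitor what the page is about.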

5) Is your navigation clear and intuitive? If you asked a stranger to find something on one of your deep-level pages, let’s say, find a specific product or article, could they do it just by using their mouse? Great! OK, now test it. Ask a friend to do this now.

6) Are your images optimized? (Right click on an image, choose 'inspect element', and look at where it says "img alt" - is this an accurate description of the image? Does it even exist?) The alt tag is what search engines 'see' when you upload an image. It's also used by screen readers for the visually impaired. Ensuring your alt tags are descriptive makes your site more accessible and better for search engines.
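
A descriptive alt attribute looks like this (hypothetical image):

    <img src="red-road-bike.jpg" alt="Red women's road bike with drop handlebars">

Both a search engine and a screen reader can now 'see' what the picture shows.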

7) Does your copy (text) sing? Are the words on your pages inviting, informative and punchy? And do they use the language your customers use, or are they jargonized?

8) Is the copy on every page unique? (It should be.)

9) Does the footer (the bit at the bottom of the web page) offer some more navigation options? Navigation that’s keyword rich, and helpful for users (perhaps they have a slow internet connection - could they still get to where they want to go quickly?). If I just looked at the footer of your home page, would I see links to your most important pages?

10) Do you have clean code? (Right click on your web page and click 'view page source' to see the code.) Is it free of JavaScript and Flash where possible? If you're not a coder and you asked this question of someone who is, what would they say?

Source: http://www.wordtracker.com/academy/seo-checklist

Wednesday, May 16, 2012

Life After Google Penguin – Going Beyond the Name

In looking back at my recent posts here it seems, though not by design, there was a theme emerging. Have a look...
And that was all pre-Penguin no less. Seems my Spidey-sense was tingling. The world of search engine optimization just keeps getting more convoluted. Now more than ever, very little is clear.

To date I have not touched upon the Penguin update because, well, we just didn't know. There wasn't enough data to say much. Of course that really hasn't changed, but there are a few things we can certainly look at to help better understand the situation at hand.
But let's give it a go anyway shall we?


A Name is Just a Name

The first thing we need to consider is that there are numerous Google algorithm updates, some of which aren't named. In the weeks before the infamous Penguin rolled out, there was a Panda hit and another link update. The three of them landing within a five-week period makes a lot of the analysis problematic.

And that's the point worth mentioning. Don't try too hard to look for dates and names. Look more to the effects.

We're here to watch the evolution of the algos and adapt accordingly. Named or not, doesn't matter. Sure, it can be great for diagnosing a hit, but beyond that, it means little.

Regardless of the myriad of posts on the various named updates, none of us really knows what is going on. That's where the instinct part of the job comes in. Again, knowing the evolution of search goes a long way.

What is Web Spam?

To understand how web spam is defined, you need to look at how search engineers view SEO. While there are many definitions, I like this one:
“any deliberate human action that is meant to trigger an unjustifiably favorable relevance or importance for some web page, considering the page's true value.” (from Web Spam Taxonomy, Stanford)
And:
“Most SEOs claim that spamming is only increasing relevance for queries not related to the topic(s) of the page. At the same time, many SEOs endorse and practice techniques that have an impact on importance scores to achieve what they call "ethical" web page positioning or optimization. Please note that according to our definition, all types of actions intended to boost ranking, without improving the true value of a page, are considered spamming.” (emphasis mine)
Well la-dee-da huh? We can intimate that Google has eased that stance by trying to define white hat and black hat, but at the end of the day any and all manipulation is seen in a less than favorable light.

The next part of your journey is to establish in your mind what types of activities are commonly seen as web spam. Here's a few:
  • Link manipulation: Paid links, hidden, excessive reciprocal, shady links etc.
  • Cloaking: Serving different content to users and Google.
  • Malware: Serving nastiness from your site.
  • Content: Spam/keyword stuffing, hidden text, duplication/scraping.
  • Sneaky JavaScript redirects.
  • Bad neighborhoods: Links, server, TLD.
  • Doorway pages.
  • Automated queries to Google: Tools on your site, probably a bad idea.
That's about the core of the main offenders. To date with the Penguin update, people have been mostly talking about links. Imagine that... SEOs obsessed with links!

However, we should go a bit deeper and surely consider the other on-site aspects. If not on your site, then on the site links are coming from.

On-site Web Spam

Hopefully most people reading this, those with experience in web development and SEO (or running websites), don't use borderline tactics with their sites. We do know there are certainly on-site elements to both the Penguin and Panda updates... so it's worth looking at.
Here are some common areas search engines look at for on-site web spam:
  • Domain: Some testing has shown that .info and .biz domains are far more spam laden than more traditional TLDs.
  • Words per page: Interestingly it seems spam pages have more text than non-spam pages (although over 1,500 words, the curve receded). Studies have shown the spam sweet spot to be in the 750-1,500 word region.
  • Keywords in title: This was mentioned in more than a few papers and should be high on the audit list. Avoid stuffing; be concise.
  • Anchors to anchor text: In other studies engineers looked at the ratio of text to anchor text on a page.
  • Percentage of visible text: This involves hidden text and nasty ALT text – what percentage of the text is actually being rendered on the page?
  • Compressibility: As a mechanism used to fight keyword stuffing, search engines can also look at compression ratios – or more specifically, at repetitious or spun content (a naive illustration follows this list).
  • Globally popular words: Another good way to find keyword stuffing is to compare the words on the page to existing query data and known documents. Essentially if someone is keyword stuffing around given terms, they will be in a more unnatural usage than user queries and known good pages.
  • Query spam: By looking at the pattern of the queries, in combination with other signals, behavioral data manipulation would become statistically apparent.
  • Phrase-based: Looking for textual anomalies in the form of related phrases. This is like keyword stuffing on steroids. Looking for statistical anomalies can often highlight spammy documents.
(some snippets taken from my post "Web Spam; the Definitive Guide")
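
To illustrate the compressibility idea, here's a naive Node.js sketch – just the intuition that heavily repetitious, stuffed text compresses far better than natural copy, not Google's actual method:

    var zlib = require('zlib');

    // Ratio of raw size to gzipped size: higher means more repetitious text.
    function compressionRatio(text) {
      var raw = Buffer.byteLength(text, 'utf8');
      var compressed = zlib.gzipSync(Buffer.from(text, 'utf8')).length;
      return raw / compressed;
    }

    var stuffed = 'cheap widgets best widgets buy widgets '.repeat(200);
    var natural = 'We stock a small range of widgets, each tested by hand ' +
                  'before shipping. Delivery times vary by region.';

    console.log(compressionRatio(stuffed)); // very high - repetitious
    console.log(compressionRatio(natural)); // much lower - varied copy
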
And yes, there's actually more. The main thing to take from this is that there are often many ways that the search engines look at on-site spam, not just the obvious ones. Once more, this is about your site and the sites linking to you.

A lot of the on-site web spam that's a true risk will be from hacking. Sure, your CMS might be spitting out some craziness, or your WordPress plug-in created a zillion internal links, but those are the exceptions. If you're using on-site spam tactics, I am sure you know it. Few people actually use on-site crap post-Panda; many times it's the site being hacked that causes issues. So be vigilant.

Link Spam

Is the Penguin update all about links? I'd go against the grain and say no. Not only do we have to consider some of the above elements, but also there seems to be an element of 'trust' and authority at play here as well. If anything, we may be seeing a shift away from the traditional PageRank model of scoring, which of course many may perceive as a penalty, due to links.
But what is link spam? That answer has been a bit of a moving target over the years, but here are some common elements:
  • Link stuffing: Creating a ton of low-value pages and pointing all the links (even on-site ones) at the target page. Spam sites tend to have a higher ratio of these types of unnatural appearances.
  • Nepotistic links: Everything from paid links to traded (reciprocal) and three-way links.
  • Topological spamming (link farms): Search engines will look at the percentage of links in the graph compared to known "good" sites. Typically those looking to manipulate the engines will have a higher percentage of links from these locales.
  • Temporal anomalies: Another area where spam sites generally stand out from other pages in the corpus are in the historical data. There will be a mean average of link acquisition and decay with "normal" sites in the index. Temporal data can be used to help detect spammy sites participating in unnatural link building habits.
  • TrustRank: This method has more than a few names, TrustRank being the Yahoo flavor. The concept revolves around having "good neighbors". Research shows that good sites link to good ones and vice versa.
(some snippets taken from my post "Web Spam; the Definitive Guide")

I could spend hours on each of these, but you get the idea. With many people theorizing about networks, anchor text, etc., the larger picture often evades us. There are so many ways that Google might be dealing with 'over optimization' that we're not talking about.
The last 18 months or so we have seen a lot of changes including the spate of unnatural-linking messages that went out. Again, Penguin or not doesn't matter. What matters is that Google is certainly looking harder at link spam, so you should be too.

It wouldn't hurt to keep a tinfoil hat handy as well… Look no further than this Microsoft patent that talks about spying on SEO forums. Between that and the fact that SEOs write about their tactics far and wide, it's not exactly hard for search engineers to see what we're up to.


How Are We Adapting in a Post-Penguin World?

What's it all mean? Well, I haven't a bloody clue. Anyone who says they've got it sorted likely needs to take their head out of a certain orifice.

What you should do is become more knowledgeable in how search engines work and the history of Google. Operate from intelligence, not ignorance.

Have you considered the elements outlined in this post when analyzing data and trying to figure out what's going on? I know I didn't. It was researching this post that reminded me of the myriad of various spam signals Google might look at.

Here's some of my thinking so far:
  • It really is a non-optimized world: Don't try too hard for that perfect title. Avoid obsessing over on-page ratios. You don't need that exact-match anchor all the time; in fact, you don't even need a link (think named entities). In many ways, less-is-more is the call of the day.
  • Keep a history: Be sure to always track everything. And when doing link profile or other types of forensic audits, compare fresh and historic data (such as in Majestic).
  • Watch on-site links: From internal link ratios to anchors and outbound links, they all matter. From spam signals to trust scoring, they can potentially affect your site.
  • Faddish: Another interesting element, though how much it plays into things we don't know, is that Google may take aim at whatever tactic is currently in fashion. If everyone piles into the tactic du jour, expect it to draw scrutiny.
  • Watch your profile: In the new age of SEO it likely pays to be tracking your link profiles. If something malicious pops up, deal with it and make notes of dates and contact attempts.
  • On site: Hammer it and make it squeaky clean. The harder links get, the more one needs to watch the on-site. Schedule audits more frequently to watch for issues.
  • Topical-relevance: When looking at links, think about topical relevance. Are the links coming from sites/pages that are overly diverse (and have weak authority)?
  • Link ratios: Watch for a low spread of anchor texts, as well as total links vs. referring domains (the lower the ratio the better; it generally means fewer site-wide links). A quick audit sketch follows this list.
  • Cleaning up: When possible look at link profiles and clean up suspect links. And I wouldn't wait until you get an unnatural linking message or tanked rankings.
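Putting the "keep a history" and "link ratios" points into practice, here is a minimal audit sketch. The CSV layout is hypothetical (adjust the column names to whatever your backlink tool, such as Majestic, actually exports), and the warning thresholds are rules of thumb, not anything Google has published.

```python
import csv
from collections import Counter

# Hypothetical backlink export: one row per link, columns "domain", "anchor".
with open("backlinks.csv") as f:
    rows = list(csv.DictReader(f))

total_links = len(rows)
referring_domains = len({row["domain"] for row in rows})
anchors = Counter(row["anchor"].strip().lower() for row in rows)

links_per_domain = total_links / referring_domains
top_anchor, top_count = anchors.most_common(1)[0]
top_anchor_share = top_count / total_links

print(f"Total links: {total_links}, referring domains: {referring_domains}")
print(f"Links per referring domain: {links_per_domain:.1f}")
print(f"Top anchor '{top_anchor}' accounts for {top_anchor_share:.0%} of links")

# Invented rules of thumb -- tune them against your own history.
if links_per_domain > 10:
    print("Warning: many links per domain (site-wide links likely).")
if top_anchor_share > 0.5:
    print("Warning: anchor text heavily concentrated on one phrase.")
```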
We've seen a ton of data (this one is interesting) since this all went down, and while there are common elements, nothing is conclusive (again, there has been a spate of updates). What is more important is to understand what Google wants and where they're headed. It's just another step in the long road of search evolution; don't get caught up in the names.
Taking the easy way out rarely works for success in life. SEO is no different.

Understand how a threshold might be used. This thing of ours is like the old story of two of us in the woods when a hungry bear appears: I don't have to outrun the bear, just you. Ensure your strategy is within a safe threshold and it should work out just fine. A purely illustrative sketch follows.
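Purely as an illustration of the threshold idea (the signals, weights, and cutoff below are invented, not anything Google has disclosed), imagine your link profile reduced to a composite score that you simply want to keep under the line:

```python
# Invented, normalized risk signals (0 = clean, 1 = worst case).
signals = {"anchor_concentration": 0.30,
           "sitewide_link_ratio": 0.20,
           "link_velocity_spike": 0.10}
weights = {"anchor_concentration": 0.5,
           "sitewide_link_ratio": 0.3,
           "link_velocity_spike": 0.2}

score = sum(weights[name] * value for name, value in signals.items())
THRESHOLD = 0.5  # hypothetical line you want to stay safely under

print(f"Composite risk score: {score:.2f} (threshold {THRESHOLD})")
print("Inside the safe threshold" if score < THRESHOLD else "Bear food")
```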

It's About Time

To close out, there is one part of this that keeps nagging at me: history. If you've been squashed by the recent updates (including Penguin), it may not be entirely about recent activities. There is a sense that Google is indeed keeping a history, and that this may be playing into the larger scheme of things.

Some of the most interesting Google patents were the series on historical elements, so be sure to go back and read the older posts covering them. Sure, they're 3-4 years old, but they capture some of the more telling parts of the mindset change many in the world of SEO need.
Source : http://searchenginewatch.com/article/2174997/Life-After-Google-Penguin-Going-Beyond-the-Name

7 Time-Saving Google Analytics Custom Reports

Google Analytics Custom Reports can be incredible time savers if you have the right reports. Instead of spending time digging around for important metrics, you can find what you need separated neatly into columns, ready for analysis that leads to actionable insights.

1. Content Efficiency Analysis Report

This report is from none other than the master of Google Analytics, Avinash Kaushik. Brands all over the world are starting to double down on content, so it's important to answer questions such as:
  • What types of content (text, videos, pictures, etc.) perform best?
  • What content delivers the most business value?
  • What content is the most engaging?
[Screenshot: Content Efficiency Analysis Report in Google Analytics]

The Content Efficiency Analysis Report comes in handy by putting all the key content metrics into one spot.

Here are the columns that the report will pull in:
  • Page title
  • Entrances
  • Unique Visitors
  • Bounces
  • Pageviews
  • Avg. Time on Page
  • Per Visit Goal Value
  • Goal Completions
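If you'd rather pull these columns programmatically than through the custom report UI, the Core Reporting API exposes equivalent metrics. A sketch, assuming you've already built an authorized `service` object with google-api-python-client (per Google's Hello Analytics tutorial); the function name and profile ID are hypothetical placeholders:

```python
# Sketch: content-efficiency columns via the Core Reporting API (v3).
# `service` must be an authorized Analytics service object built with
# google-api-python-client (see Google's Hello Analytics tutorial).
def content_efficiency(service, profile_id="ga:12345678"):  # hypothetical ID
    return service.data().ga().get(
        ids=profile_id,
        start_date="2012-05-01",
        end_date="2012-05-31",
        dimensions="ga:pageTitle",
        metrics=("ga:entrances,ga:visitors,ga:bounces,ga:pageviews,"
                 "ga:avgTimeOnPage,ga:goalValuePerVisit,"
                 "ga:goalCompletionsAll"),
        sort="-ga:pageviews",
        max_results=50,
    ).execute()

# Usage: for row in content_efficiency(service)["rows"]: print(row)
```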

2. Keyword Analysis Report

[Screenshot: Keyword Analysis Report in Google Analytics]

If you're doing SEO, you want to make sure that your optimization efforts are working as intended. Is the right keyword pointing to the right page?

The first tab of this report, Targeting, breaks things down by placing the page title and keyword side by side. The four metrics you'll see are:
  • Unique Visitors
  • Goal Completions
  • Goal Conversion Rate
  • Avg. Page Load Time (sec)
Using the four metrics above, you'll be able to judge whether you need to make adjustments to your campaign.

The second tab, Engagement, will tell you how effective each page is by looking at the following six metrics:
  • Unique Pageviews
  • Pages/Visit
  • Avg. Time on Page
  • Bounce Rate
  • Percentage Exit
  • Goal Conversion Rate
The third and final tab, Revenue, will tell you how much money a keyword is bringing in, based on three metrics:
  • Revenue
  • Per Visit Value
  • Ecommerce Conversion Rate

3. Link Analysis Report

[Screenshot: Link Analysis Report in Google Analytics]

What websites are sending you the best traffic? If you're link building, which links are worth going back to for more? Link building isn't all about rankings; it's about increasing traffic and conversions as well. If you find a few gems, it's worth looking into them further (a quick triage sketch follows the column list below).
Here are the columns you'll see with the report:
  • Source
  • Landing Page
  • Visits
  • Goal Completions
  • Pages/Visit
  • Bounce Rate
  • Percentage New Visits
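Once the report is exported, triage is straightforward: rank sources by goal completions per visit so the gems float to the top. A minimal sketch with made-up rows:

```python
# Hypothetical rows exported from the Link Analysis report.
referrers = [
    {"source": "niche-forum.com", "visits": 120, "goals": 18},
    {"source": "big-portal.com", "visits": 900, "goals": 12},
    {"source": "partner-blog.net", "visits": 60, "goals": 9},
]

# Links worth going back to for more: best goal rate first.
for ref in sorted(referrers, key=lambda r: r["goals"] / r["visits"],
                  reverse=True):
    rate = ref["goals"] / ref["visits"]
    print(f"{ref['source']:18s} {rate:.1%} goal rate on {ref['visits']} visits")
```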

4. PPC Keywords Report

If you're paying for search traffic, you obviously want to discover high performing keywords. You can then take this data and use it for future SEO campaigns.

Here are the metrics in this report:
  • Visits
  • CPC
  • Goal Completions
  • Cost per Conversion
By breaking things down cleanly, you'll be able to home in on which keywords to put on hold and which ones to pour more cash into.
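The arithmetic behind that call is simple: approximate spend is visits × CPC, and cost per conversion is spend divided by goal completions. A small sketch with invented numbers and an arbitrary target:

```python
# Hypothetical rows from the PPC Keywords report.
keywords = [
    {"kw": "blue widgets", "visits": 400, "cpc": 0.75, "goals": 20},
    {"kw": "cheap widgets", "visits": 650, "cpc": 0.40, "goals": 5},
    {"kw": "widget accessories", "visits": 90, "cpc": 1.10, "goals": 9},
]
TARGET_CPA = 20.00  # arbitrary target cost per conversion

for kw in keywords:
    spend = kw["visits"] * kw["cpc"]  # approximate spend
    cpa = spend / kw["goals"]         # cost per conversion
    action = "pour in more cash" if cpa <= TARGET_CPA else "put on hold"
    print(f"{kw['kw']:20s} ${cpa:6.2f} per conversion -> {action}")
```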

5. Social Media Report

[Screenshot: Social Media Report in Google Analytics]

Ah yes, a report that tells you how the different social media channels are performing for you. This is a simple way to figure out where your social media time is best invested.
The social media report looks at:
  • Visits
  • Social Actions
  • Goal Completions
  • Goal Conversion Rate
  • Goal Value

6. E-commerce Traffic Report

[Screenshot: E-commerce Traffic Report in Google Analytics]

If you run an e-commerce site, it's important to break down your different traffic channels to see which one performs best. Why is one channel performing better than the other? Is it worth it to invest more in a campaign that is trending upwards? Is your investment with paid advertising effective?

This report answers some of your e-commerce questions by looking at the following metrics:
  • Visits
  • Percentage New Visits
  • Bounce Rate
  • Pages/Visit
  • Revenue
  • Average Value
  • Per Visit Value

7. Browser Report

[Screenshot: Browser Report in Google Analytics]

This report will tell you how different browsers are performing for your site. You'll immediately see which browsers are your winners and which ones might have problems.

For example, if Chrome and Firefox seem to be doing OK but Internet Explorer has extremely high bounce rates, you might want to look into how your site behaves in Internet Explorer. After all, Internet Explorer still holds a substantial share of the browser market.
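A minimal sketch of that comparison, flagging any browser whose bounce rate sits well above the average of the others (the rates and the margin are made up):

```python
# Hypothetical bounce rates per browser from the Browser report.
bounce_rates = {"Chrome": 0.32, "Firefox": 0.35, "Safari": 0.38,
                "Internet Explorer": 0.71}

for browser, rate in bounce_rates.items():
    others = [r for b, r in bounce_rates.items() if b != browser]
    baseline = sum(others) / len(others)
    if rate > baseline + 0.15:  # arbitrary "worth investigating" margin
        print(f"{browser}: {rate:.0%} bounce vs. {baseline:.0%} baseline "
              "-- check how the site renders there")
```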

Bonus: Custom Reporting in Google Analytics

Jaime from SEOmoz created a wonderful realtime Google Analytics report. Here's what it looks like:

[Screenshot: realtime Google Analytics data in a Google Docs spreadsheet]
This spreadsheet allows you to compare different metrics of your choice with different start and end dates as well. You can easily see how your campaigns are performing from a high level all in the comfort of a clean Google Doc.

Want even more custom reports? Make sure to read Greg Habermann’s top five most used Google Analytics Custom Reports to learn about and get custom reports for Unique Visitors by Page; Conversion by Time of Day; Customer Behavior; Top Converting Landing Pages; and Long Tail Converters.

Conclusion

Google Analytics Custom Reports ultimately save you a lot of time and help you make actionable decisions that benefit your bottom line. Take a few minutes to set these reports up and explore them. You won't regret it.

What are some useful Google Analytics Custom Reports that you use?

Source : http://searchenginewatch.com/article/2175001/7-Time-Saving-Google-Analytics-Custom-Reports

Sunday, May 13, 2012

Google: Can't Recover From Penguin? Start A New Site

Danny Sullivan published a new story yesterday named Two Weeks In, Google Talks Penguin Update, Ways To Recover & Negative SEO.

In that article, he interviews Google's spam lead, Matt Cutts, on ways to recover from the Google Penguin update. There are some solid tips there, but some scary ones as well.

Here is one that is scary for those who were hit by the Penguin update:
If you've cleaned and still don't recover, ultimately, you might need to start all over with a fresh site, Cutts said.
Yes, that is scary for someone who was hit and is frantically making changes but has not seen any recovery. That said, if you have not seen a recovery yet, I wouldn't worry; I don't think Google has refreshed the update yet, so in my opinion there wouldn't be any recoveries to see.

But Google is not going to roll this back. Google's Matt Cutts said, "It's been a success from our standpoint." Were there false positives? Few, Cutts said: "we've seen a few cases where we might want to investigate more, but this change hasn't had the same impact as Panda or Florida." Very interesting.

Key Takeaways:

(1) Google is not going to roll this update back.
(2) Google says it had less of an impact than Panda or Florida.
(3) Don't take drastic measures yet; do what you can now so that when Google does refresh the update, maybe you can break free.

Source :  http://www.seroundtable.com/google-penguin-recovery-15136.html

Friday, May 11, 2012

How Google Creates a New Era for SEO

Google has been making substantial changes to its search engine algorithms, which has led many website owners to reconsider how they run their SEO campaigns. Over the last year, two major algorithm changes have emerged that will have a major impact on website owners.
Google Panda was rolled out last year to help remove low-quality content sites from the front pages of its indexes. This update had a major impact on a number of websites, particularly content farms and low-level affiliates.

The other update has not been put into place yet, and we know even less about it than we did about Google Panda. Google's webspam lead, Matt Cutts, had a discussion with Danny Sullivan of Search Engine Land last month about some of Google's practices. In that discussion, Cutts let it slip that Google was working on a new algorithm change to penalize sites that have engaged in too much SEO.
Both of these topics are expected to have a major impact on search engine rankings in the coming months. Here is the lowdown on what they both mean.


Google Panda Update – Dreaded for SEO

Google Panda has really opened up a can of worms for many marketers. Almost every Internet entrepreneur I know has started panicking about how the Panda update is going to impact them. That is completely unnecessary, though. The Panda update was intended to hit content farms like eHow and many of the affiliate sites that use spun or stolen content.

After Panda, many of the leading content farms lost substantial amounts of traffic. One of the most extreme cases was Acesshowbiz.com, which lost 93% of its SEO traffic. Meanwhile, a number of leading content providers like YouTube increased their search engine traffic by 10%.

Google is clearly looking much more closely at content quality now. Internet entrepreneurs have gotten the message that they are supposed to update their sites with fresh content as regularly as possible. Although fresh content remains a priority, entrepreneurs who have felt the Panda's bite are likely to find they need to put more emphasis on creating insightful, original content than on simply updating regularly.

The Panda Update suggests that you should do the following:
  1. Make sure your site is written for humans more than search engines.
  2. Get rid of duplicate content as much as possible (a detection sketch appears after this list).
  3. Publish content that has never been featured elsewhere, such as in article directories.
  4. Make sure your content is authoritative, rather than just rewrites of other people’s articles or spun content.
These tips could help you considerably as you try to keep your site on the top of the search engines.
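For point 2 above, a minimal detection sketch using Python's standard-library difflib. The page texts are placeholders (in practice you'd load your crawled page copy), and the 0.8 cutoff is a guess to tune, not an official figure:

```python
import difflib
from itertools import combinations

# Placeholder page copy keyed by URL; load real crawled text in practice.
pages = {
    "/widgets": "Our widgets are handmade from recycled aluminium...",
    "/widgets-copy": "Our widgets are handmade from recycled aluminium...",
    "/about": "We started in a garage in 2005 with a single lathe...",
}

for (url_a, text_a), (url_b, text_b) in combinations(pages.items(), 2):
    similarity = difflib.SequenceMatcher(None, text_a, text_b).ratio()
    if similarity > 0.8:  # arbitrary near-duplicate threshold
        print(f"{url_a} and {url_b} are {similarity:.0%} similar -- "
              "consolidate or rewrite one of them")
```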

Next Algorithm Change

The next algorithm change could mean any number of things for your site. Matt Cutts was pretty vague in his statement, but most SEO experts have at least some idea of what he was getting at.

The new algorithm change is supposedly intended to target over-optimized websites. Adam Audette of Rimm-Kaufman Group stated that SEO should be invisible. The sites most likely to get nailed by Google's new update are those without any real business model behind them.

Audette said that SEO should be an "invisible layer" added to a site after it has created value for readers. Too many SEOs think that if they get to the front page of Google, the bucks will start rolling in. That notion is obviously flawed, considering how much readers hate over-optimized content. The new algorithm change is intended to keep such sites from reaching the top of Google at all.

What are some of the changes you may need to make to your SEO model? I would follow these points:

Respect the Panda
 
Whatever new algorithm update Google has in the works isn't meant to replace Google Panda. It targets over-optimized content, while Panda was directed at content that provides no value. However, there is definitely overlap between the two. Appeasing the Panda by creating great content will help steer your efforts away from over-optimizing your site for SEO.

Refrain from Black Hat Strategies
Many Internet marketers shun black hat SEOs like they are the worst kind of sinner. I personally don't have any ethical objection to most black hat SEOs. However, I will say one thing about most black hat SEO tactics: they rarely lead to long-term results.

Google has been waging war on black hat SEOs from day one. However, Matt Cutts' new statement showed that Google is clearly working even harder to boot black hat spammers off the front page. Here are some of the things he specifically mentioned:
  1. “Excessive link exchanges.” The reason Google evaluates backlinks is to assess how authoritative a website is in the eyes of others. Google hates seeing sites that simply exchange links with each other, because those links give no indication of a site's real value. Personally, I don't see anything wrong with sites linking back and forth to network and share resources, but I think Google can tell when sites are clearly trying to manipulate the algorithm, and it wants to ding anyone who does anything unnatural.
  2. “Overuse of keywords.” Keyword stuffing has been a no-no for a long time. Avoid using keywords unnecessarily throughout your content (a quick density check follows this list).
  3. “Beyond what Google would normally expect.” Cutts could mean any number of things with this one. Some SEOs have ventured guesses that I think were right on. One suggestion is that he meant unnecessarily linking to the homepage from the body of an article or from the footer. I have seen plenty of bloggers link to the homepage in the middle of a post, for no reason, with a random string of keywords. It gets really annoying, to be honest, and I am sure it looks weird to Google as well.
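There is no magic density number, but a quick check like the following catches the obvious cases of a phrase repeated far more often than natural writing would produce. The file name and the 3% flag are invented for illustration:

```python
import re
from collections import Counter

with open("post.txt") as f:  # hypothetical page copy
    text = f.read().lower()

words = re.findall(r"[a-z']+", text)
counts = Counter(words)

for word, n in counts.most_common(10):
    density = n / len(words)
    if len(word) > 3 and density > 0.03:  # 3% is an invented red flag
        print(f"'{word}' appears {n} times ({density:.1%} of all words)")
```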
Most of the penalties should be clear to marketers by now. Google has been flagging unnatural use of keywords and links for a long time. We will need to assess what Google is really doing differently this time around. Quite frankly, I think they are implementing some new changes that we may not have considered previously. Matt Cutts said back in 2009 that Google doesn't have any "over-optimization penalty" for websites, yet he specifically used that term in his most recent statement. Has he changed his mind, or just taken a new stance on terminology?

It is much too early to tell. Website owners are going to need to watch how their sites rank now and decide what to do differently in the future. Carefully monitor your website's rankings before and after Google's supposed algorithm change. This will give you some idea of whether or not you have over-optimized your website.

If your site sees a drop in rankings over the next couple of months, you can pretty safely bet that it is considered "over-optimized." Sadly, that doesn't give you any indication as to what you have done to over-optimize it.

I would say you should do a page-by-page analysis of your site. Analyze every single link, block of text, and meta description to see what might look unnatural. You may need to play around a bit to figure it out. Anything that looks unnatural to you should probably be changed.

Just the same, I wouldn’t make any new changes just because your site drops a little for a while. Google often applies a heavy hand to its algorithm changes in the beginning, which harms innocent sites. It could reverse some of those mistakes later, which would help your site regain its ranking. There is no sense undoing a good thing just because Google has inadvertently penalized your site temporarily.

These two new algorithms may be just the beginning of the changes we are going to see with Google over the next few years. They should remind us that Google is constantly working to improve the quality of the user experience. Therefore, we will have to come to terms with the fact that our techniques to enhance search engine rankings may become increasingly obsolete.

As you build your blog, you may need to turn your efforts away from keyword usage and traditional link-building strategies as you attempt to establish yourself on the front page of the world's most popular search engine.

Source : http://www.1stwebdesigner.com/design/google-creates-new-era-seo/