SEO Tips

The five most common errors in SEO, according to Maile Ohye of Google:
1.) No value proposition
2.) Segmented approach
3.) Time-consuming workarounds
4.) Caught in SEO trends
5.) Slow iteration

Six SEO tips to help optimize your website:
1.) Do something cool
2.) Include relevant keywords
3.) Be smart about your tags
4.) Sign up for email forwarding in Google Webmaster Tools
5.) Attract buzz
6.) Stay fresh and relevant

Thank you, Maile!

Mastering Google SEO 2012

This first video recaps SEO in 2011 and covers best practices for SEO in 2012 based on what we learned last year. It gives us really good information about the Panda update; we hope you enjoy it.


Search Evaluation at Google

Search engines and SERPs have become more and more complex. Google handles more searches than any of its competitors, so mastering Google is a huge step toward success in SEO. Your site's volume of search traffic can make or break your bottom line.

Scott Huffman, an engineering director responsible for leading search evaluation at Google, writes:

Evaluating search is difficult for several reasons.

First, understanding what a user really wants when they type a query — the query’s “intent” — can be very difficult. For highly navigational queries like [ebay] or [orbitz], we can guess that most users want to navigate to the respective sites. But how about [olympics]? Does the user want news, medal counts from the recent Beijing games, the IOC’s homepage, historical information about the games, … ? This same exact question, of course, is faced by our ranking and search UI teams. Evaluation is the other side of that coin.

Second, comparing the quality of search engines (whether Google versus our competitors, Google versus Google a month ago, or Google versus Google plus the “letter T” hack) is never black and white. It’s essentially impossible to make a change that is 100% positive in all situations; with any algorithmic change you make to search, many searches will get better and some will get worse.

Third, there are several dimensions to “good” results. Traditional search evaluation has focused on the relevance of the results, and of course that is our highest priority as well. But today’s search-engine users expect more than just relevance. Are the results fresh and timely? Are they from authoritative sources? Are they comprehensive? Are they free of spam? Are their titles and snippets descriptive enough? Do they include additional UI elements a user might find helpful for the query (maps, images, query suggestions, etc.)? Our evaluations attempt to cover each of these dimensions where appropriate.

Fourth, evaluating Google search quality requires covering an enormous breadth. We cover over a hundred locales (country/language pairs) with in-depth evaluation. Beyond locales, we support search quality teams working on many different kinds of queries and features. For example, we explicitly measure the quality of Google’s spelling suggestions, universal search results, image and video searches, related query suggestions, stock oneboxes, and many, many more.

To get at these issues, we employ a variety of evaluation methods and data sources:

Human evaluators. Google makes use of evaluators in many countries and languages. These evaluators are carefully trained and are asked to evaluate the quality of search results in several different ways. We sometimes show evaluators whole result sets by themselves or “side by side” with alternatives; in other cases, we show evaluators a single result at a time for a query and ask them to rate its quality along various dimensions.
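The "side by side" setup Huffman describes boils down to a preference score: each rater compares result set A against result set B for the same query, and the ratings are averaged. Here is a minimal sketch; the rating labels and weights are illustrative assumptions, not Google's actual rubric.

```python
# Hypothetical aggregation of "side by side" ratings, where each rater
# compares result set A against result set B for the same query.
# The labels and weights below are illustrative assumptions.

WEIGHTS = {
    "A_much_better": 1.0,
    "A_better": 0.5,
    "same": 0.0,
    "B_better": -0.5,
    "B_much_better": -1.0,
}

def side_by_side_score(ratings):
    """Mean preference in [-1, 1]: positive favors A, negative favors B."""
    if not ratings:
        return 0.0
    return sum(WEIGHTS[r] for r in ratings) / len(ratings)

# Example: two raters prefer A, one sees no difference.
score = side_by_side_score(["A_better", "A_much_better", "same"])
```

A positive aggregate score across many queries would suggest the A-side approach is the better one on this evaluation.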

Live traffic experiments. We also make use of experiments, in which small fractions of queries are shown results from alternative search approaches. Ben Gomes talked about how we make use of these experiments for testing search UI elements in his previous post. With these experiments, we are able to see real users’ reactions (clicks, etc.) to alternative results.
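The live traffic experiment Huffman describes can be sketched as deterministic bucketing: a small, stable fraction of queries is served by an alternative ranking function while the rest get the baseline, and user reactions to each arm are compared. The function names and the 1% fraction below are illustrative assumptions, not Google's implementation.

```python
import hashlib

# Hypothetical sketch of a live traffic experiment: a small fraction of
# queries is diverted to an experimental ranker. The names and the 1%
# fraction are illustrative assumptions, not Google's implementation.

EXPERIMENT_FRACTION = 0.01  # divert ~1% of live queries

def in_experiment(query_id: str, experiment: str) -> bool:
    """Deterministically bucket a query. Hashing keeps the assignment
    stable per query and independent across differently named experiments."""
    digest = hashlib.sha256(f"{experiment}:{query_id}".encode()).hexdigest()
    return int(digest[:8], 16) / 0xFFFFFFFF < EXPERIMENT_FRACTION

def rank_baseline(query: str) -> list:
    return [f"baseline results for {query}"]      # control arm

def rank_experimental(query: str) -> list:
    return [f"experimental results for {query}"]  # treatment arm

def serve(query_id: str, query: str) -> list:
    # A real system would also log impressions and clicks per arm so the
    # two result sets can be compared on live user behavior.
    if in_experiment(query_id, "alt-ranker"):
        return rank_experimental(query)
    return rank_baseline(query)
```

Hash-based bucketing means the same query ID always lands in the same arm, which keeps a user's experience consistent and the experiment's metrics clean.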

How does Google use human raters in web search?
