By: Keith Neubert – Bayshore Solutions Management Team

“Everything should be made as simple as possible, but not simpler.” – Einstein

That about summed up the goal of the speakers at the pagination and canonicalization session at SMX Advanced, and it was appropriate for these topics. Pagination and canonicalization are especially important after the recent Panda and Penguin updates… now more than ever, usability and content duplication are VERY important. So I’ve tried to summarize and paraphrase the great speakers on this topic, including Adam Audette, President of RKG, and Maile Ohye, Senior Developer Programs Engineer at Google.

Pagination Do’s and Don’ts

The first speaker spent time talking about how ecommerce sites are likely candidates for a look (or re-look) at pagination techniques, and how the newer rel=prev/next method can help. “Much like rel=”canonical” acts as a strong hint for duplicate content, you can now use the HTML link elements rel=”next” and rel=”prev” to indicate the relationship between component URLs in a paginated series” (http://googlewebmastercentral.blogspot.com/2011/09/pagination-with-relnext-and-relprev.html). Think of these tags as telling Google that your page is part of a series or family of content. On a search results page where several pages are returned, you’ll want to use these to communicate that all of the results pages belong to the same family of search results for that product. Search engines will still index the deeper content and serve it up if they find it more relevant. The speaker said that using rel=canonical as a self-reference is fine, but using it to point to page 1 in the series sends a potentially conflicting signal.
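For example, here’s what that markup might look like in the head of page 2 of a hypothetical three-page series (the URLs are placeholders, not from the session):

<!-- on http://www.example.com/results?page=2 -->
<link rel="prev" href="http://www.example.com/results?page=1">
<link rel="next" href="http://www.example.com/results?page=3">

Page 1 would carry only the rel="next" element, and the last page only rel="prev".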

Better than pagination, a “view all” page is actually preferred by the search engines over the individual component pages – so when you’re able, reference that page as the main content page. When it’s not an option, make sure to use the rel=prev/next method. An important consideration with a view-all page is that it can be a big page and the experience slow – think about using the progressive scrolling methods that are now out on the web. Quora and Twitter both use this continuous loading-and-refreshing method. Googlebot will only get the first 500 words, though, so that might be an issue if you want the content crawled.
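Pointing the engines at the view-all version is typically done with rel=canonical on each component page – a minimal sketch, again with placeholder URLs:

<!-- in the <head> of each paginated page, e.g. http://www.example.com/results?page=2 -->
<link rel="canonical" href="http://www.example.com/results?view=all">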

URL Parameters in Webmaster Tools

Maile from Google covered the next topic: leveraging Webmaster Tools for maintenance and support of your URL parameters.

Page-level markup is applied separately, after the page is crawled, and is still taken into consideration – meaning it’s not a direct index factor, just a strong indicator of how and when the page is used. URL parameter settings can be a helpful hint, but they are not directives for the search engines. These are best practices and can help the search index!

It is important to note that this is an advanced feature, and done improperly it can cause more harm than good. In fact, she pointed out that sometimes a site will already have high crawl coverage, as determined by Google on its own, and improper actions within the tool can then actually eliminate pages from showing up in the results.

On the issue of inefficient crawling, an “eligible URL” would look something like this: key=value&key2=value2
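To make that concrete, here’s a hypothetical pair of eligible URLs (placeholder domain and parameters) that return the same content and so waste crawl resources:

http://www.example.com/shoes?style=running&sessionid=123
http://www.example.com/shoes?style=running&sessionid=456

The sessionid parameter changes the URL but not the page, so Googlebot could crawl both without finding anything new.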

How to approach this?

Step 1: Specify parameters that do not change content
1. Do you have parameters that do not affect page content (sessionID, affiliateID, trackingID)?
If so, likely mark them as “does not change content”.
Step 2a: Specify parameters that change content
Step 2b: Specify Googlebot’s preferred behavior

What behaviors can you specify for parameters that do change content?

•    Sorts: changes the order in which content is presented.
•    Narrows: filters the content on the page by showing a subset of the total items.
•    Specifies: determines the content displayed on the page.
•    Paginates: displays a component page of a multi-page sequence. Use “Crawl every URL.”
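As a hypothetical illustration (these parameter names are made up, not from the session), the mapping might look like:

sort=price      → Sorts: same items, reordered
color=red       → Narrows: a subset of the items
productid=42    → Specifies: determines which content appears
page=3          → Paginates: one component page of a sequence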

What about multiple parameters in one URL? “Imagine all URLs begin as eligible for crawling, then apply each setting as a process of elimination, not inclusion.”
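To sketch that elimination with the hypothetical parameters above: for http://www.example.com/shoes?sort=price&sessionid=123, marking sessionid as “does not change content” collapses all the session variants into one eligible URL, and a sort setting such as “Only URLs with value=price” then eliminates the other sort variants – each setting removes URLs from the eligible pool rather than adding new ones.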

Other best practices:
•    Internal links should only include canonical URLs
•    List canonicals in Sitemaps (see the example after this list)
     –    Helps with canonical promotion
     –    Provides more accurate index counts
•    On-page indexing markup is still helpful; rel=canonical and rel=next/prev can be used in tandem
•    Utilize URL parameters for more efficient crawling
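A minimal example of a Sitemap entry listing only the canonical URL (placeholder domain and path, not from the session):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- list the canonical version only, not its parameterized duplicates -->
  <url>
    <loc>http://www.example.com/shoes/running</loc>
  </url>
</urlset>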

In another session I attended, the topic was avoiding PPC data paralysis.

The speaker covered an issue we at Bayshore face every day – how do you present technical information, in this case PPC data, to non-technical people? His recommendation was to think about what it is you really need to communicate:
•    Test Results
•    Performance
•    Opportunities

If you focus on only the specific problem or goal, and show only the necessary data, you’ll avoid the data paralysis that can occur from sharing all the data paid search platforms have to offer.

The speaker’s recommendation was to structure your information using the classic scientific method:
1.    Question
2.    Hypothesis
3.    Data
4.    Analysis
5.    Conclusion

So, for example, your information could be shared as simply as in his example:

Case: Testing which ad copy drives the most conversions at the lowest cost.

Hypothesis: Finding a balance between ad click-through rate and conversion rate will yield peak efficiency and volume.

The Data:

Ad    Conversions    Cost per Conversion
1     187            $16.51
2     263            $18.00
3     248            $17.11
4     205            $17.45

Analysis: Ad 1 yields the cheapest but fewest conversions. Ad 2 yields the most conversions at the highest cost. Ads 3 and 4 find a middle ground, and Ad 3 delivers more conversions at a better cost than Ad 4.

Conclusion: Recommend moving forward with Ad 3.

A second speaker covered the importance of long-tail keywords for PPC. PPC managers often choose to discontinue bidding on long-tail keywords because they don’t see conversions or conversion rates in their reporting. But conversions often just take longer to achieve because of the limited impressions long-tail keywords receive – and when they do occur, they can come at a low cost.

The speaker shared a method of predicting revenue from long-tail keywords that involved calculating weights for each touch point in your sales process, using multivariate regression to best predict conversion events.
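He didn’t walk through the exact model, but a regression of that shape might look something like this (the touch points here are hypothetical stand-ins):

predicted conversions ≈ b0 + b1·(impressions) + b2·(clicks) + b3·(assisted visits)

where the weights b0…b3 are fit against historical keyword data and then used to score long-tail keywords that haven’t yet converted directly.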
