Filter Bubbles; Quality Content; Coding Tag Tips; Ecommerce Marketing; Actionable Insights from Webmaster Tools

One of the things that I truly find amazing about New York City is how it never sleeps. The sounds of people laughing, cars rushing by and sirens going off somewhere in the distance were all things that so kindly kept me awake for a good portion of the night.

I suppose living in the “country” (Tampa, FL) for so long has changed the tolerance I used to have for sleeping through noise. Nevertheless, I was up on time, ready to go, and excited to get SMX day number two under way.

I am beginning to comprehend the theme of this conference: “Understand your user and be sure to provide high quality content that caters to the voice of those that you are trying to reach.” This should be a clear message that is consistent and fluid across all forms of online marketing.

So to recap the second day at SMX New York, I started out the morning with Eli Pariser’s keynote session.

Keynote with Eli Pariser

Eli started with an illustration of relevance, quoting Mark Zuckerberg: “A squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa.”

Eli Pariser is the author of the book “The Filter Bubble”. The concept describes how each of us searchers is placed into a bubble based on numerous variables that tell the search engines what kind of person we are. Just a few takeaways:

  • Even if you are not logged into Google, there are still at least 57 variables that Google uses when you are searching.
    • Google wants to present you with the results it believes most accurately match what you need
  • Everyone’s search results can be very different
  • We are surrounded by filters – the “filter bubble” – and this is what gives us personalized results when we search the internet
  • The personalization algorithm usually starts with your first click
  • When researching your keywords, make sure the results you see are not personalized to just you (see the sketch after this list).
  • Google displays what it thinks we should see, not necessarily what we need to see.
  • All of this boils down to understanding your users and creating your web personas
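
As a footnote on the keyword-research point above: this sketch is my own, not from the keynote, but at the time a commonly cited way to preview less-personalized results was to append pws=0 to a Google search URL.

```python
from urllib.parse import urlencode

# Build a Google search URL with personalization switched off via pws=0,
# a widely shared trick at the time for checking "de-bubbled" rankings.
def depersonalized_search_url(query: str) -> str:
    return "https://www.google.com/search?" + urlencode({"q": query, "pws": "0"})

print(depersonalized_search_url("tampa seo company"))
# -> https://www.google.com/search?q=tampa+seo+company&pws=0
```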

Session 2 – Duplication, Aggregation, Syndication, Affiliates, Scraping and Information Architecture

The topic of this session is a common issue for the many industries that deal with feeds, such as real estate, travel, coupon and deal sites, and product-based companies: how do you make sure that you are not displaying thin, low-quality content? It boils down to focusing on keyword and site structure and combining pages that describe an identical item. For example, if you are selling Brand A Food in a 2lb version as well as a 5lb version, there is no reason to have two separate pages describing the same product with one minor variable changed (see the sketch after the list below). Main points from this session include:

  • Google does not want to display the same product description a bunch of times. This is why it is extremely important to have unique descriptions…you should not always use the manufacturer’s description for your products.
  • When writing the descriptions, be sure that you clearly understand your website’s voice, tone and style. Cater your descriptions to these factors
  • Various tools for success:
    • SEO strategy – have a holistic view – focus on larger category strategy and research as well as competitive analysis
    • Keyword mapping – map URLs to keywords and then keywords to URLs (in a spreadsheet)
    • Content strategy – define the content; it should integrate SEO content needs, quantify the amount of content needed and provide timelines.
      • It is better to conduct the research before the content is written, so keywords can be included
  • Create a content calendar – make sure that deadlines are met at the right pace.
  • Research to see if your site has a duplication issue:
    • site: search – use a site: query to review your duplication issues
      • site:about.com counting calories returned around 60,000 results, which most likely indicates a duplication problem
  • Technical details…
    • The more duplicated and syndicated content you have on your site, the more your original content will be penalized: Panda is a site-wide evaluation
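
To make the Brand A example concrete, here is a minimal sketch (my own, not from the session, and the field names are invented) of collapsing feed variants so that each product gets one page with a size option rather than near-duplicate URLs:

```python
from collections import defaultdict

# Hypothetical feed rows: each size variant arrives as its own record.
feed = [
    {"brand": "Brand A", "product": "Food", "size": "2lb"},
    {"brand": "Brand A", "product": "Food", "size": "5lb"},
]

# Group variants by everything except size, so each group becomes one
# canonical product page with a size option instead of two near-duplicates.
pages = defaultdict(list)
for row in feed:
    pages[(row["brand"], row["product"])].append(row["size"])

for (brand, product), sizes in pages.items():
    slug = f"{brand}-{product}".lower().replace(" ", "-")
    print(f"/products/{slug}/  sizes: {sizes}")
# -> /products/brand-a-food/  sizes: ['2lb', '5lb']
```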

Session 3 – Schema.org, rel=author & Meta Tagging

Simple coding can be an easy way for us to enhance our search results. This session gave a very technical look at the different markup tags we can add to things such as reviews, products, recipes, events, and more. The main takeaway was that this technique is still in the early-adopter phase and is not guaranteed to show up in Google.

Some of the details of these codes and the highlights of this session were:

  • Take advantage of semantic code like schema.org
  • Google utilizes rich snippets, which give users a convenient summary of information about their search results at a glance
    • These are simple to implement and can give you instant gratification
  • rel=author tag: highlights the original creator of the content, allows the author to be indexed and can be a game changer for your search results (see the markup sketch after this list)
    • Example from CNN’s Eatocracy after they added this to their recipe pages:
      • 47% increase in recipes showing in Google
      • 27% increase in search traffic to recipe pages
    • Challenges: it does not support multiple-author blogs, only works within the same site and only works on self-hosted sites
  • When to implement a more aggressive tactic like schema:
    • Moving to a new CMS
    • Launching new templates
    • Launching a new site
    • Site Redesign
    • Slow periods in dev cycles
  • Google says that it will sometimes strip titles.
    • Google does this because it believes it will improve the searcher’s experience. Therefore, your title tag and description may not show up identically in the Google search results
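
For reference, here is a rough sketch of what such markup can look like: schema.org Recipe microdata plus a rel=author link, held as a string so it could be dropped into a page template. This is my own illustrative example with placeholder values, not CNN’s actual code.

```python
# Illustrative only: a schema.org Recipe block and a rel=author link.
# All names, URLs and values below are placeholders.
snippet = """
<div itemscope itemtype="http://schema.org/Recipe">
  <h1 itemprop="name">Placeholder Recipe</h1>
  <span itemprop="author">Jane Doe</span>
  <time itemprop="cookTime" datetime="PT30M">30 minutes</time>
</div>
<a href="http://example.com/author/jane-doe" rel="author">Jane Doe</a>
"""
print(snippet)
```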

Session 4 – Ecommerce & Retail Search Marketing Tactics

Session number four was right up my alley. Since working at Bayshore I have had many ecommerce clients, and I consider this area my subject-matter expertise. This panel gave many examples and tactics that all of us emarketers should be thinking about for our ecommerce clients.

  • Paid search in retail marketing – can be very beneficial, yet challenging
    • Challenges:
      • ROI is under pressure
      • Higher CPCs
      • More price shopping
      • Lower product margins
      • More competition
  • Creating a strong Campaign Structure:
    • One Product – One Ad Group
    • Using Specific Keywords
    • Ads should include offers
  • How to build out keywords – use:
    • Product Data/feed
    • Part numbers
    • Product title
    • Brand/sub brand
      • Brand + Model
      • Title + Type
      • Year+ Make+Model+Part
  • Tip: Optimize initial bids (worked through in the sketch after this list)
    • Average order value × conversion rate ÷ ROI goal = target CPC
    • $100 × 2% ÷ 400% = $0.50 per click
  • Tip: Use Bing
  • Tip: A/B Testing – Promote yourself
    • Your offer to the buyer
    • Your value proposition
      • Shipping rate
      • Stock status
      • Authorized dealer
      • Reputation
      • Years in Business
  • Tip: Advertise on your brand name
    • Overall revenue lift
    • Usually low cost
    • Block competitors
  • Determine the Lifetime value of a conversion
  • Utilize Keyword Expansion, analyze what your competitors are doing through landing pages, ads, images and pricing
  • Submit your product feed to Google Merchant
    • Value of PPC and SEO combo metrics
    • Increase conversion
      • Ensure that there is adequate budget for core terms
      • Take top 25 performing keyword phrases in ppc
        • Make sure the keywords are in all of the metadata
        • Make sure the pages are indexed in Google
        • Make sure that these pages are getting refreshed in search
    • Ways to save
      • Review the search terms report in AdWords
        • Look for terms that never convert
      • Add negatives
        • Add the root of the offending term as a broad match negative
      • Add broad match modifiers (the +broad +match +modifier syntax)
      • Have another person audit your campaigns
    • Use Display and Remarketing
      • The fastest-growing channel, predicted to reach $25.27 billion this year, with 36% growth to $34.4 billion in 2013
      • 90% of the people who visit a website leave without completing any action that online marketers intend them to take. Display and remarketing improve those numbers.
      • Competitive pricing with real-time bidding
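
Here is the initial-bid formula from the “Optimize initial bids” tip above, written out as a tiny sketch using the session’s own example numbers:

```python
def target_cpc(avg_order_value: float, conversion_rate: float, roi_goal: float) -> float:
    """Initial bid estimate: average order value x conversion rate / ROI goal."""
    return avg_order_value * conversion_rate / roi_goal

# The session's example: $100 average order, 2% conversion rate, 400% ROI goal.
print(target_cpc(100.0, 0.02, 4.0))  # -> 0.5, i.e. $0.50 per click
```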

Session 5 – Making Data from Google Webmaster Central and Bing Webmaster Tools Actionable

The final session of the day was an overview and review of Google and Bing Webmaster Tools and the importance of utilizing both of these amazingly informative tools. Key notes from session 5:

Google Webmaster Tools tips…

  • Utilize the message center
    • Check for status, violations, hacked or malware issues, etc…
    • Have these messages emailed to you so you always know the status of your website
  • Analyze your content: does Google think your site is about something that it is not?
    • Make sure that you are giving information that the users are looking for
      • Do the user queries match the content of your site?
  • Analyze the backlinks and anchor text: Who is linking to you? What are they saying?
  • HTML suggestions: improve user experience and click-throughs with informative title tags and descriptions
  • Set your geographic target and preferred domain
    • Set your crawl rate
  • Check crawler access: test your robots.txt, get help generating a robots.txt, and remove URLs (see the sketch after this list)
  • Google Webmaster Tools only displays 30 days’ worth of data; therefore, Google recommends that you download this data every month, even if you do not know what to do with it yet.
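
As one way to act on the robots.txt testing point above, here is a minimal sketch using Python’s standard library (the domain, URL and user agent are placeholders):

```python
from urllib.robotparser import RobotFileParser

# Point the parser at a live robots.txt (placeholder domain).
rp = RobotFileParser()
rp.set_url("http://example.com/robots.txt")
rp.read()

# Ask whether a given crawler may fetch a specific URL under the current rules.
print(rp.can_fetch("Googlebot", "http://example.com/private/page.html"))
```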

Bing Webmaster Tools

  • You can invite other people into Bing Webmaster Tools
  • Bing moves sites around hourly to find the best position for the user per keyword phrase, which is why rank is reported as an average. If the average position is decreasing (i.e., improving), it shows that Bing is trusting your content
  • Bing expects a clean XML sitemap…if there are any errors, they will give you the “hairy eyeball” (see the sketch after this list)
  • You can set the crawl rate – if you know when your site is busiest during the day, you can tell the Bing bot to stay away from your site during that time. This will prevent Bing from taking up your bandwidth during your most active times
  • Submit URLs: if you just wrote a great, brand-new blog post, you can submit individual links and they will go directly into Bing’s index
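
Since Bing expects a clean XML sitemap, here is a minimal sketch of generating one with Python’s standard library (the URLs are placeholders; in practice they would come from your CMS or database):

```python
import xml.etree.ElementTree as ET

# Placeholder URLs standing in for a real list of site pages.
urls = ["http://example.com/", "http://example.com/blog/brand-new-post/"]

# Standard sitemap structure: <urlset><url><loc>...</loc></url></urlset>
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for u in urls:
    ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = u

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```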

Stay tuned for the key points learned on the last day at SMX…
