By: Keith Neubert – Bayshore Solutions Management Team

One of the sessions I attended today at SMX Advanced in Seattle, WA centered on site speed and its importance, not only from the end user’s perspective but also in Google’s search results.  Site speed is one of the 200 or so ‘signals’ that the Google algorithm takes into consideration when developing rankings, and it matters most among the top 1% most competitive keywords.  It was said that most users expect your pages to load in 2 seconds or less, yet interestingly enough, the average Fortune 500 company website takes 7 seconds to load.  Now you may be asking how you’re supposed to pull that off when a Fortune 500 company can’t.  The answer is that there are some quick things you can start looking at today – and they don’t necessarily take a lot of effort!

1.    Assess what your site is doing currently – There are several free tools available on the web that can be used to benchmark your website.  These tools will give you insight into the areas of your site that may be worth spending some time on to improve performance.  Two great tools are Pingdom (http://tools.pingdom.com/fpt/) and WebPageTest (http://www.webpagetest.org/).

They’ll tell you where the bulk of the time is being spent in loading your pages and present this information in some easy-to-understand summaries (examples below).

[Screenshots: example page-load summaries from Pingdom and WebPageTest]

2.    Take advantage of browser caching – When your browser connects to a website, it requests the HTML and all of the different components that combine to create the webpage as a whole – most likely several images, JavaScript files and CSS files.  There is an ‘Expires’ header that can be set to a far-future date, which tells browsers to store these files locally and not ask for them the next time they access your site.  This helps because, while some of these files may not be big in size, the sheer number of requests to the server adds to the page load time (an example follows). https://developers.google.com/speed/docs/best-practices/caching#LeverageBrowserCaching
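
A minimal sketch of how this might look on an Apache server, assuming mod_expires is enabled (the file types and lifetimes here are illustrative):

    # .htaccess – far-future Expires headers for static assets
    <IfModule mod_expires.c>
      ExpiresActive On
      # Browsers may cache these for a year before re-requesting them
      ExpiresByType image/png "access plus 1 year"
      ExpiresByType image/jpeg "access plus 1 year"
      ExpiresByType text/css "access plus 1 year"
      ExpiresByType application/javascript "access plus 1 year"
    </IfModule>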

3.    Link <link> your CSS and avoid @import – The <link> tag allows the CSS to download in parallel with other page elements, while @import can force stylesheets to download one after another, adding delay (a quick comparison follows).
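
For example (base.css and layout.css are placeholder file names):

    <!-- Preferred: the browser can fetch these in parallel with other page elements -->
    <link rel="stylesheet" type="text/css" href="base.css">
    <link rel="stylesheet" type="text/css" href="layout.css">

    <!-- Avoid: @import is only discovered after the enclosing CSS is parsed,
         which can serialize the downloads -->
    <style>
      @import url("base.css");
      @import url("layout.css");
    </style>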

4.    Avoid redirects – While redirects are a VERY important tool and need to be used in some situations, the point today was that they don’t need to be used in ALL situations.  Every redirect you eliminate saves the user an entire request-response cycle.  Talk to your marketers about the redirect strategy on your site and see if there is a case for reducing the number (a sketch of collapsing a redirect chain follows).
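
Here’s a hypothetical Apache example (the paths are placeholders) showing a redirect chain collapsed into a single hop:

    # Before: a chain – the browser makes two extra round trips
    Redirect 301 /old-page /newer-page
    Redirect 301 /newer-page /current-page

    # After: one redirect straight to the final destination
    Redirect 301 /old-page /current-page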

5.    CSS sprites reduce HTTP requests, use them! – Again, reducing total HTTP requests greatly improves site performance.  Combining commonly used images into a single “sprite” reduces requests, latency, overhead and total page file size (see the sketch below).
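
A minimal sketch, assuming a hypothetical sprites.png that holds two 16×16 icons side by side:

    /* One image (and one HTTP request) serves both icons */
    .icon {
      background-image: url("sprites.png");
      background-repeat: no-repeat;
      width: 16px;
      height: 16px;
    }
    .icon-search { background-position: 0 0; }     /* left half of the sprite */
    .icon-cart   { background-position: -16px 0; } /* right half of the sprite */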

6.    Use a Content Delivery Network (CDN) – This is a tool that can be used for a lot of your static site content.  Similar to the browser caching described above, this extends caching out to the cloud.  When a user requests something from your site, the CDN serves it rather than your own server – reducing the number of requests your server handles and most likely delivering the file from a physical location closer to the end user.  http://en.wikipedia.org/wiki/Content_delivery_network
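
In practice this often just means pointing your static assets at the CDN’s hostname; a sketch using a hypothetical cdn.example.com domain:

    <!-- Static assets served by the CDN instead of your own server -->
    <link rel="stylesheet" href="http://cdn.example.com/css/styles.css">
    <script src="http://cdn.example.com/js/site.js"></script>
    <img src="http://cdn.example.com/images/logo.png" alt="Logo">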

7.    Remove what you don’t need – This is tough to do because rarely does anyone want to give up something on the webpage.  One suggestion was to assign each element (including social buttons) a real or synthetic value so you can compare the elements on the page.  That way you can tell which ones provide enough value to justify their cost (speed) and which ones can be removed.

So why should you spend time on this?  Well, besides the potential search engine ranking benefits, there is the user experience to consider!  One study shared at SMX Advanced found that every additional second of page load time correlated with a 7% drop in users.  In another study, there was a 1% decrease in user response for every additional 0.1 second of response time from the website.  And one company that went through most of the steps outlined above cut the time it took Google to crawl a page by 50% and doubled the number of pages Google crawled per day!

 
