4 Keys to Improving Buffer’s Search Engine Discovery with Technical SEO

There are a few key SEO areas that, if improved, can lead Buffer.com (or any website) to better rankings, improved search engine visibility, more website traffic and increased conversions.

1. Website Performance - image size, file size and render-blocking JavaScript and CSS are causing pages on Buffer.com to load more slowly than necessary. Losslessly compressing images using software like File Optimizer (Windows) and ImageOptim (Mac), combining files (JavaScript and CSS respectively) and inlining and deferring both JavaScript and CSS can help immensely.

2. Search Engine Indexation - make sure your website has a published XML sitemap that is also submitted to search engines. Don’t use tracking parameters for internal links; the best way to track internal link clicks is Event Tracking in Google Analytics. If you have duplicate content, utilize the rel=canonical tag. When using redirects, make sure they’re 301s, not 302s. Lastly, don’t use the meta keywords tag; most search engines see it as a spam signal.

3. Search Engine Visibility - whenever possible, use a single domain to house your content; a Moz SEO study found that doing so can improve your blog rankings. Keyword research is the key to a successful organic search campaign, so invest heavily here. Once you’ve identified your target keywords, sparingly place them on the pages you want Google to rank you for.

4. Blog Search Engine Optimization - take your exhaustive keyword research and sparingly place keywords in your blog post titles. Search engine bots look at title tags first. If you’re not using category landing pages, you should be. There’s a big opportunity for targeting keywords there. Lastly, make signing up easy. Don’t make visitors click more CTAs than they have to.

Modern innovations like social media have single-handedly turned the world into a place where nearly everything is shared and little is private.

The public has direct access to our lives in the form of pictures, videos, thoughts, feelings, expertise, hopes, fears and dreams—which is both beautiful and terrifying at the same time.

For better or worse, the internet and social media have fundamentally changed what we share, how we share and who we share things with.

All this talk about sharing has me absolutely AMPED about this month’s Hot Seat because the website I chose is one of my favorite companies, ever.

This month I chose Buffer.


Buffer is a social media management platform that helps individuals and businesses build an audience using their scheduled posts tool, social media image creator, customer service/brand monitoring tool and smartphone apps.

Buffer burst onto the scene in 2010, with its founders Joel Gascoigne and Leo Widrich starting the company from their bedrooms. Buffer began as a web app to manage Twitter profiles, but since then has grown into a full-fledged social media management tool with a remote team of 80+ people working across 4 continents and 30+ cities around the globe.

The company is one of many startups that have chosen the distributed model for work, which allows them to hire extremely talented people regardless of their location or time zone.

Why Buffer?

Buffer Is Transparent

From equity positions to salaries, company revenue, internal emails, how customer money is allocated and even the cultural, gender and age diversity of their employees—nothing is hidden at Buffer.

I absolutely love them for this!

Their example has encouraged me to become less opaque in the way I do business and play the game of life. In my opinion, they’re a great example of what companies can and should be.

Buffer Is Inspiring

The team at Buffer has worked hard to create an awesome culture. A culture that’s not limited by geography, one that pushes its team members to greater heights and encourages colleagues to take risks and move fast. Buffer also provides employees with everything they need to be successful:

  • Buffer pays on average 35% above standard wages for each position.
  • Macbooks and other tech are provided so that everyone can do their best work.
  • A Kindle ebook program to further the education of employees and their families.
  • A free Jawbone Up to encourage healthy movement and sleep.
  • Unlimited vacation policy and $1,000 cash bribe to go on vacation each year.
  • Work remotely from wherever they want.
  • Family leave policy of 1-3 months.
  • Work retreats to exotic locations every 5 months.
  • Equity options of 0.05-0.1%.

Buffer Is Creative and Innovative

Buffer leads innovation and transparency within the tech industry and has the product pedigree to prove it, building and acquiring consumer products and internal tools along the way.

In just five short years, Buffer has become a major player in the social media management space with its unorthodox, refreshing approach to business and life.

They’ve taken their competition by storm and proven themselves to be a strong and steady competitor, which is why I have chosen them for this month’s Hot Seat.

This particular Hot Seat is broken down into four different sections:

  1.  Website Performance
  2. Search Engine Indexation
  3. Search Engine Visibility
  4. Blog SEO

As you can see from the numbered list above, I’ll be specifically diving into facets of Buffer’s website and search engine performance.

Full Disclosure

If you’re a reader of this blog, you may already know that over seven years ago, I jumped ship on the IT industry and busted my backside to earn my salt as a marketer.

What you probably don’t know is that shame and embarrassment drove me into the deep waters of SEO early in my marketing career (a story for another time) and I’ve been doing it ever since.

I’ve optimized the sites and content of recognized brands like Vistaprint, Verisign and The World Bank and have helped small businesses around the world gain visibility in major search engines like Google, Bing and Yahoo!

From website audits to algorithmic penalty recovery, on-page optimization, web analytics, content audits, keyword research, content writing, technical SEO, international SEO and content outreach, my hands have touched nearly every aspect of SEO—which made this particular Hot Seat a lot of fun.

DISCLAIMER: I do not work for or represent Buffer. I’m just a marketing guy who enjoys their products and exploring how to make good things even better.

NOTE: This Hot Seat assumes that you’re already familiar with concepts like how a web page is displayed, how SEO works, how to use WordPress and languages like JavaScript and CSS. If you’re not familiar with these concepts, this post might be a little challenging.

As usual, in each section I’ll grade the company’s overall marketing efforts, discuss why each element is important, highlight the things Buffer is doing well and nitpick at things that could use some work.

Let’s fire this thing up!

Buffer Analysis

Website Performance

Grade: C

In going through Buffer’s website for this article, I came across a job posting on their Journey page and was shocked, but not totally surprised, to learn that:

“Over 50% of the traffic to the Buffer blog comes from searches, last month it was 721,724 sessions according to Google Analytics. And so far, we’ve neglected doing any real work on making sure our blog is setup to really grow in search traffic.”

When it comes to creative, innovative, useful content—Buffer is basically a Wonka factory. Their Content-Oompa-Loompas have created a wonderland full of in-depth articles, widgets, web apps, slick dashboards and charts and graphs of things that ordinary folks like us would never even imagine to measure.

It’s no wonder that Buffer drives boatloads of traffic to its site and blog.

Despite the massive success of its organic content, Buffer mentions key areas where they’d like to see improvement, particularly better search engine discovery for the main website and the blog. And I don’t disagree.

Page Load

The amount of time it takes a page to load is a metric that often gets overlooked by web developers and webmasters.

I once heard a developer say that because we now measure page load in seconds instead of minutes that visitors and users shouldn’t complain.

When you compare today’s speeds with the days of dial-up internet, pages load at warp speed. But as technology progresses, people expect things nearly instantaneously, search engines included.

Page load time is so important to Google that they use site speed as a factor in how they rank websites.

Why is a fast site important?

According to Radware’s 2015 State of the Union on Page Speed & Web Performance …

“Separate studies have found that 57% of consumers will abandon a page that takes longer than 3 seconds to load.”

Additionally, their findings stated …

“A site that loads in 3 seconds experiences 22% fewer page views, a 50% higher bounce rate, and 22% fewer conversions than a site that loads in 1 second, while a site that loads in 5 seconds experiences 35% fewer page views, a 105% higher bounce rate, and 38% fewer conversions.”

Let’s face it, humans have the attention spans of goldfish and it’s probably not going to get better any time soon. If you want to extract the most page views, engagement and conversions from your site, improving load speeds is a must.

So, how did Buffer fare?


As you can see from the Web Page Test results, the main website appears to have performed well in 3 of the 6 areas measured, but the site ran into trouble with first byte, caching and image compression.

NOTE: I ran this test on IE 11 because Chrome ignores certain types of requests in order to load client-side pages faster, whereas IE provides the whole picture.


On the plus side, Keep-alive Enabled, Compress Transfer and Effective use of CDN all scored well.

The downside is Start Render (the time it takes for a visitor to visually see the page loading) for First View is 3 seconds. This isn’t great, as many sites fully load within 2 seconds or less.

Also, a significant number of Buffer’s static assets aren’t being cached correctly because they don’t have an explicit caching policy set. This causes unnecessary round trips between the client and server, adding to the load time.

While they’ve done an outstanding job with the speed of their platform app, Buffer’s website page load times could use a little more love.

Setting Goals

Planning is a lot like house cleaning: thinking about it makes your brain hurt and you complain while doing it, but you can’t argue with the positive results.

So, what does planning have to do with websites? Before setting out to make pages load faster, it’s important to set goals and make plans to reach them.

Right now, the Buffer homepage takes just over eight seconds to fully load on first view. It would be great to see it fully load in four seconds or less.

Utilizing a performance budget can help the Buffer development team identify realistic page load targets and create a solid plan to reach them. How?

One way is collecting page load times of competitors and working to beat the best.


In this chart, we can see that sproutsocial.com has the best document complete and fully loaded times of the five social media marketing tools listed.

Using Sprout Social as a benchmark, Buffer might aim to match their page load times and then take things further by setting a target of reducing load times another 20%.
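To make the budget concrete, here’s a small JavaScript sketch of how target times could be derived from a benchmark. The numbers are placeholders for illustration, not Sprout Social’s actual measured times:

```javascript
// Sketch: derive target page load times from a competitor benchmark.
// A 20% reduction means each target is 80% of the benchmark time.
function targetTimes(benchmark, reduction) {
  const targets = {};
  for (const metric of Object.keys(benchmark)) {
    // Round to two decimal places to keep the targets readable
    targets[metric] = Math.round(benchmark[metric] * (1 - reduction) * 100) / 100;
  }
  return targets;
}

// Placeholder benchmark values in seconds
const benchmark = { documentComplete: 4.0, fullyLoaded: 6.5 };
console.log(targetTimes(benchmark, 0.2)); // { documentComplete: 3.2, fullyLoaded: 5.2 }
```

The same function works for any metric you put in the benchmark object (Start Render, Time To First Byte and so on), so one performance budget can cover the whole waterfall.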

The chart below shows what a 20% reduction of Sprout Social’s load times looks like.


What can Buffer do to achieve the target load times above?

Image Compression

Compressing images can save data (fewer bytes downloaded), improve performance (faster downloading and rendering) and decrease bandwidth (fewer bytes downloaded equals less competition for bandwidth). Some of the images from the PageSpeed audit are below.


Whether it’s the SSL handshake, Time To First Byte or download time, images are by far the biggest issue holding Buffer back from blazing load speeds as highlighted in the waterfall chart below.


The buffer-landing-page-1-light-hero.png took 4.5s to load and weighs in at a hefty 1.2MB.

There are many tools available to Buffer for lossy and lossless compression of JPGs, PNGs, GIFs and SVGs on Mac, Windows and in the browser. Creative Bloq put several tools through the wringer in the article 18 Image File Compression Tools Tested.

Recommended tools include:

  • File Optimizer (PNG, JPG, GIF, PDF) – Windows
  • ImageOptim (PNG, JPG, GIF) – Mac
  • SVGO (SVG only) – Node.js-based tool for optimizing SVG files

However, the best place to start optimizing and compressing images is your favorite editor, whether that’s GIMP, Photoshop or something else. Why? Optimizing within the editor can reduce the need for third-party image compression tools.
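As an example of the command-line workflow, SVGO is installed and run via npm. The filenames below are placeholders:

```shell
# Install SVGO globally, then write an optimized copy of an SVG
npm install -g svgo
svgo logo.svg -o logo.min.svg
```

The `-o` flag keeps the original file intact, which makes it easy to diff the before and after sizes.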

Remove Render-Blocking JavaScript

Unfortunately, none of the initial view content on Buffer.com can be rendered without first waiting for the page’s JavaScript and jQuery to download and execute.

<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.10.1/jquery.min.js"></script>

<script src="https://ajax.googleapis.com/ajax/libs/jqueryui/1.10.3/jquery-ui.min.js"></script>

<script src="https://d389zggrogs7qo.cloudfront.net/js/shared/base.min.a785033f1d469fbce94f3e5cb5f5d4fa.js"></script>

<script src="https://d389zggrogs7qo.cloudfront.net/js/bundles/homepage_bundle.874ef3ef648538d43c7f5df71d8ae6ad.gz.js"></script>

<script src="https://www.google.com/jsapi"></script>

Usually when a browser encounters JavaScript tags like the ones above, it stops until it has evaluated the script. If the script is hosted externally, as these are, the browser first has to download it, which eats up a lot of time, and the script still has to be evaluated, which eats up even more.

Why does the browser stop dead in its tracks to download and evaluate JavaScript? Because JavaScript can change the DOM (Document Object Model) or layout of the document, the browser must execute the script before it can continue parsing the HTML and finish constructing the DOM.

To reduce the amount of time it takes the homepage to load, Buffer has a few options.

Defer Loading is where you move render-blocking JavaScript tags that are non-critical to page render below the content fold. A great place to move these snippets is from the document head to just before the closing </body> tag. This way, the code still fires but doesn’t block rendering of the page (the part of the loading process the visitor sees), preventing visitors from prematurely bouncing.

Asynchronous Loading is another way to prevent blocking the parser. Adding the async attribute to JavaScript tags allows JavaScript to execute without holding up construction of the DOM. The result is that pages will load faster.

Inline Loading is where you insert JavaScript code directly into an HTML document. Inlining JavaScript contents eliminates external requests, avoids network latency and allows the visitor’s browser to deliver a faster time to first render.

Of the options above, the first is likely the easiest and most feasible. Why? Adding the async attribute to these tags (while a good idea in theory) could cause unforeseen issues, while inlining five scripts into the document will almost certainly increase the size of the page making all previous speed optimizations null and void.
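For illustration, here is what the second and third options look like in markup, along with the closely related defer attribute. The script URL is a placeholder, not one of Buffer’s actual bundles:

```html
<!-- Option 2: async — downloads in parallel, executes as soon as it's ready
     (execution order is not guaranteed) -->
<script async src="https://example.com/js/homepage_bundle.js"></script>

<!-- Related: defer — downloads in parallel, executes in document order
     after parsing finishes -->
<script defer src="https://example.com/js/homepage_bundle.js"></script>

<!-- Option 3: inline — no extra request, but the HTML payload grows -->
<script>
  /* contents of the bundle pasted directly into the document */
</script>
```

Note that async and defer only apply to scripts with a src attribute; inline scripts always execute where the parser finds them.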

Compress JavaScript

Construction of the DOM tree pauses while JavaScript is evaluated, which naturally takes time. Add JavaScript files that are too large on top of that, and what should take less than 100ms to download turns into more than 200ms.

The most effective way to make pages more responsive and render faster is to reduce the number of files and the size of the files that need to be downloaded.

In Buffer’s case, some of its JavaScript files suffer from being rather large and take a long time to download.

<script type="text/javascript" src="//s7.addthis.com/js/300/addthis_widget.js#provider=buffer"></script>

<script src="https://d389zggrogs7qo.cloudfront.net/js/bundles/homepage_bundle.1e51cf2fee3e4cb2e25b4afe06333cee.gz.js"></script>

<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.10.1/jquery.min.js"></script>

<script src="https://ajax.googleapis.com/ajax/libs/jqueryui/1.10.3/jquery-ui.min.js"></script>

Fortunately, there are tools available to help reduce the size of JavaScript files like Closure Compiler.

Closure comes in multiple formats, including the Closure Compiler Service UI (best for a few lines of JavaScript or to test output), a downloadable Closure application and the Closure API, which allows you to automate JavaScript optimization or build it into a larger system (e.g., an IDE extension).

Closure works by rewriting your JS into a much smaller form, while ensuring the code still runs correctly. It can also combine files and can drastically reduce the size of your JavaScript files.
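The downloadable application is a Java jar run from the command line. A hypothetical invocation that combines and minifies two bundles (the filenames are placeholders) might look like:

```shell
java -jar closure-compiler.jar \
  --compilation_level SIMPLE_OPTIMIZATIONS \
  --js homepage_bundle.js \
  --js addthis_widget.js \
  --js_output_file bundle.min.js
```

SIMPLE_OPTIMIZATIONS renames local variables and strips whitespace without changing the code’s public interface, which makes it the safe default for existing bundles.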

You can find more information about Closure and how it works its magic, along with finding answers to any other questions you might have in Google’s PageSpeed Insights article Compressing your JavaScript with Closure Compiler.

Tag Management (optional)

A number of the JavaScript tags coded into the Buffer homepage are marketing tags, including ones for Quantcast, Twitter Ads, Pinterest verification, Facebook Conversion Tracking and Google Analytics.

With the exception of the Twitter Ads tag, the others load asynchronously. The Twitter Ads tag is synchronous, sits in the head of the HTML and blocks rendering. Ideally, this tag should be deferred, made asynchronous or added inline so as not to hold up the DOM.

Alternatively, a great solution for nixing the whole render-blocking JavaScript issue and also centrally managing multiple snippets is a tag manager.

So, what is a tag manager, you ask? It’s software that lets you install, modify, manage and remove JavaScript tags intended for traffic analysis and marketing optimization, all from a single, user-friendly tool and without having to write a single line of code.

There are several paid and free options available including Google Tag Manager (free), Tealium (paid) and Qubit OpenTag (free and paid).

Benefits of utilizing tag management include:

  1. No more worrying about render-blocking, synchronous JavaScript tags.
  2. Frees developers from having to install new marketing or analytics software tags every day.
  3. Precise control over when and where tags fire.
  4. Reduces errors by eliminating the need to edit site code.

Optimize CSS

When it comes to optimizing a website for speed there are few areas more important than CSS.


That’s because, like JavaScript, CSS is render blocking by nature. A browser will not render a page before first checking whether there are styles to be applied to the HTML. When a website has issues with load time, it’s often style sheets holding it back from optimal performance.

Because CSS delivery is so crucial to the render path, it’s important to make sure that files are streamlined and delivered in a way that doesn’t block rendering.

Buffer has 4 external CSS files referenced in the homepage HTML.

Homepage – 410.72kb
WebFonts Bundle – 16.68kb
GoogleFont Satisfy – 0.19kb
GoogleFont Open Sans – 1.02kb

The total size of these external CSS files is 429kb. While there’s no hard and fast rule for CSS file size, a reasonable CSS file is somewhere around 75kb.

To help reduce the amount of necessary round trips to the server, Google recommends combining CSS into one file for the best web performance possible.

Alternatively, and preferably, Buffer could optimize its CSS by rewriting its files using an Object Oriented CSS (OOCSS) approach, though that isn’t something I would recommend in the near term, as it requires forethought and planning to select the best OOCSS framework and post-processing tool(s) for your business.

Additionally, Buffer has 24 instances of inline CSS attributes located in their homepage HTML. Google cautions to avoid inline CSS attributes as it leads to unnecessary code duplication and slows rendering.
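The fix is usually mechanical: move each repeated style attribute into a class defined once in the stylesheet. A hypothetical before and after (the copy and values are placeholders):

```html
<!-- Before: the same inline style duplicated on every element -->
<p style="color: #323b43; font-size: 16px;">Plan your posts ahead of time.</p>

<!-- After: one rule in the stylesheet, referenced by class -->
<!-- In the CSS file: .body-copy { color: #323b43; font-size: 16px; } -->
<p class="body-copy">Plan your posts ahead of time.</p>
```

Beyond removing duplication, this lets the rule be cached with the stylesheet instead of re-downloaded with every page’s HTML.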

Lastly, in addition to being rather large, the four external CSS files listed below are also render-blocking, causing multiple round trips by the browser.

<link rel="stylesheet" href="https://d389zggrogs7qo.cloudfront.net/css/homepage.6367e6a7dbb8af13432a891a015afd4c.gz.css" type="text/css" />

<link rel="stylesheet" href="https://d389zggrogs7qo.cloudfront.net/webfonts/webfonts_bundle.0c901bab6211068beeeedc32e2ba57a9.gz.css" type="text/css" />

<link href='//fonts.googleapis.com/css?family=Satisfy' rel='stylesheet' type='text/css'>

<link href='//fonts.googleapis.com/css?family=Open+Sans:400,700,300,600' rel='stylesheet' type='text/css'>

Google and the developer community agree that the best approach to removing render-blocking CSS is to “identify and inline the CSS necessary for rendering the above-the-fold content and defer loading the remaining styles until after the above-the-fold content.”

    <html>
      <head>
        <style>
          .blue { color: blue; }
        </style>
      </head>
      <body>
        <div class="blue">
          Hello, world!
        </div>
        <script>
          var cb = function() {
            var l = document.createElement('link'); l.rel = 'stylesheet';
            l.href = 'small.css';
            var h = document.getElementsByTagName('head')[0]; h.parentNode.insertBefore(l, h);
          };
          var raf = requestAnimationFrame || mozRequestAnimationFrame ||
              webkitRequestAnimationFrame || msRequestAnimationFrame;
          if (raf) raf(cb);
          else window.addEventListener('load', cb);
        </script>
      </body>
    </html>

Yes, it’s true that the inline CSS won’t cache, but it should be small enough so as not to add additional time to page load.

Buffer developers will need to make the call as to what they believe to be critical CSS (initial view, content, layout, typography, etc.) versus non-critical CSS as it pertains to page render.

To help make this process less manual and painful, there are automated tools available to help identify critical CSS, inline critical CSS and asynchronously load CSS.

Critical allows you to both extract and inline critical CSS using Node. You can combine Critical with a JavaScript task runner like Grunt and automatically process your CSS. Alternatively, you can load your CSS asynchronously using LoadCSS.
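To show the idea behind LoadCSS, here’s a minimal vanilla JavaScript sketch. The stylesheet path is a placeholder, and the document object is passed in as a parameter so the function is easy to test:

```javascript
// Minimal sketch of what LoadCSS automates: inject a non-critical
// stylesheet so it never blocks first render.
function loadCSS(href, doc) {
  var link = doc.createElement('link');
  link.rel = 'stylesheet';
  link.href = href;
  doc.getElementsByTagName('head')[0].appendChild(link);
  return link;
}

// In the browser, defer the request until the page has finished loading:
if (typeof window !== 'undefined') {
  window.addEventListener('load', function () {
    loadCSS('non-critical.css', document);
  });
}
```

Because the link element is created after the load event, the browser fetches the file without delaying the critical rendering path.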

Browser Caching

Browser caching is one of those critical client-side functions that can make drastic improvements in page load times. When enabled, the browser caches (i.e., saves) a local copy of assets like HTML, CSS, JavaScript and images directly on the client so it doesn’t have to fetch them each time a page is visited. This puts less strain on the server where the assets are hosted, uses less bandwidth and reduces the number of round trips needed to load the page.

Unfortunately, there are a number of static assets (below) that are not currently being cached because no max-age or Expires headers are set and the If-Modified-Since mechanism isn’t being used. This causes additional round trips and slows down page load.


It looks like Buffer serves most of its content from Amazon CloudFront. By default, each object automatically expires after 24 hours.

To change the cache duration for all objects that match the same path pattern, you can change the CloudFront settings for Minimum TTL, Maximum TTL, and Default TTL for a cache behavior.

To change the cache duration for an individual object, you can configure your origin to add a Cache-Control max-age or Cache-Control s-maxage directive, or an Expires header field to the object.
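For fingerprinted assets like Buffer’s (where the hash in the filename changes whenever the content does), a long max-age is safe because a changed file gets a new URL anyway. A hypothetical origin response for one of these files might look like:

```
HTTP/1.1 200 OK
Content-Type: text/css
Cache-Control: public, max-age=31536000
```

The 31536000-second value is one year, a common ceiling for immutable, versioned static files.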

Content Distribution

While Buffer serves much of its static content (images, CSS and JS files) through CDNs like Amazon CloudFront, Cloudflare and Fastly, other assets and objects are not being served from a CDN, including:


My recommendation is that Buffer serve these static items from their CDN and not their primary hosting service. The main benefits being:

  1. Reduced bandwidth usage — A CDN reduces the burden on your hosting system by serving static files on its behalf, saving you origin bandwidth.
  2. Reduced page load times – Users (regardless of location) are able to download your website’s content from the server nearest them, versus downloading it from the region of the world where your site’s servers are located. This reduces latency and provides faster load times.

Mobile Speed Optimization

A quick check of SimilarWeb estimates that Buffer saw roughly 8M total visits to its website last month. Of those 8M visits, an estimated 83% came from desktop while 17% came from mobile. That’s roughly 1.36M visits from mobile last month alone.

The numbers above are likely far from exact but are a good jumping-off point for this section. Why?

If the mobile traffic number from SimilarWeb is anywhere close to the actual figure, Buffer owes it to themselves and to their visitors to serve the best mobile optimized experience possible.

Buffer passed Google’s Mobile-Friendly Test with flying colors.


A check of Buffer’s mobile performance using Google’s Mobile PageSpeed Insights gives Buffer.com a score anywhere between 48-56/100, which isn’t great.

NOTE: Buffer is currently running a homepage split test and depending upon which cohort the Google crawler lands, you’ll see a different score.


The issues bogging down page performance on the desktop are the same ones impacting Buffer on mobile devices, including: render-blocking JavaScript, render-blocking CSS, image file size, browser caching, file compression and minification.

There’s been a lot of talk in tech and marketing circles about mobile, from mobile searches passing desktop searches to Mobilegeddon—so much in fact that I won’t be rehashing anything here.

Anyone with an online business should be aware of the importance of having a site that performs well on mobile devices. If not, check out 12 Reasons Why You Should Have a Mobile Friendly Website.

What I will point out are the recent tools Google has released to help webmasters make their sites run faster than a blue hedgehog chasing gold rings. Specifically, I’m talking about Accelerated Mobile Pages (AMP).

AMP is Google’s open source project that aims to help developers, webmasters and marketers create optimized content that loads fast on any mobile device, anywhere.


Using an enhanced HTML framework, a custom JavaScript library (AMP JS) and an optional proxy-based CDN (AMP Cache), HTML files are marked up with AMP enhanced tags in the document head and body.

AMP improves resource loading, mobile page rendering, layout control and delivery of AMP HTML pages by:

  • Allowing only asynchronous scripts
  • Sizing all resources statically
  • Not letting extension mechanisms block rendering
  • Keeping all third-party JavaScript out of the critical path
  • Inlining and size-bounding all CSS
  • Efficient font triggering
  • Minimizing style recalculations
  • Only running GPU-accelerated animations
  • Prioritizing resource loading
  • Instantly loading pages

What AMP guidelines don’t allow are traditional ad schemes, forms or other assets that could weigh down mobile page load times. Given that Buffer publishes 8-10 articles a week (between the three blogs), it might make sense for Buffer to have AMP HTML versions of their blog posts.


A recent SOASTA mobile performance case study found that …

“Mobile pages that are 1 second faster experience up to 27% increase in conversion rate.”

Using AMP HTML to boost website performance on mobile devices could help Buffer see improvements in search engine rankings. Additionally, sites using AMP HTML are highlighted with a green lightning bolt, inviting users to click, which could greatly improve organic CTR.

AMP is best suited to pages on the Buffer blog, since the guidelines don’t allow forms or other assets that the main website relies on.

Fortunately, there’s a plugin for that.

Web Performance Summary: This is the biggest area of improvement for Buffer. The most noticeable performance issue appears when trying to immediately scroll down a page after start render, which takes a bit. The good news is that all page speed issues are resolvable. Implementing an explicit browser caching policy, improving image compression and optimizing render-blocking resources should drastically increase performance for Buffer.

Search Engine Indexation

Grade: B

XML Sitemap

Sitemaps are one of the essential things webmasters can provide to help their pages get found by the search engine spiders of Google, Bing and Yahoo! In addition to page discovery, sitemaps help search engines choose canonical URLs on your site.

Unfortunately, I wasn’t able to locate a sitemap for Buffer.com in the usual places, like appending /sitemap.xml to the domain or doing a Google search. I used a few different tools to do the searching for me just in case I missed something, but none of them turned up an XML sitemap.

It would be great to see Buffer create a sitemap. There are multiple tools available that let you both manually and automatically create and update the XML sitemap.
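Whichever tool generates it, the resulting file follows the sitemaps.org protocol. A minimal example with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://buffer.com/</loc>
  </url>
  <url>
    <loc>https://buffer.com/pricing</loc>
  </url>
</urlset>
```

Each page gets its own url element, and optional children like lastmod can tell crawlers when content changed.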

Once the XML sitemap is created, submitting it to Google, Bing and Yahoo! webmaster tools will help with future content discovery and indexation.

PRO TIP: Adding the XML sitemap to the robots.txt file helps search engines locate your sitemap if it lives in an uncommon directory. Simply add the line below, specifying the location of your website’s sitemap, to the bottom of your robots.txt file.


Sitemap: http://www.example.com/sitemap.xml

Duplicate Content (URL Parameters and Hash Fragments)

Tracking parameters are great for understanding where and how many people click links you posted to social media, added to email marketing campaigns, included in guest blog posts or published in press or media articles.

These tagged URLs are sent to Google Analytics where you can identify the links most effective in attracting visitors to your content.

(Google’s URL Builder helps developers, webmasters, marketers and publishers create tagged URLs with ease.)

Going through the website, I noticed Buffer uses parameters to track internal links.

Tracking parameters may seem like a good idea for internal links, but they should only be used for external links pointing at your site.


Google Analytics is great at reporting user activity right out of the box. By default, it reports visits to each page where the tracking code is installed. Using parameters on internal links tends to mess up this reporting.

Instead of identifying a visit as “Organic” or from “Google Search,” it will attribute the traffic to whatever the internal URL’s parameters specify (“homepage,” “about-page” or wherever the tagged link lives). This overwrites the session’s original source data and, over time, corrupts the raw data.

The best way to track internal link clicks is using Google Analytics Event Tracking.

Event Tracking lets you capture link clicks using the onclick event handler. The clicks are recorded by Google Analytics and displayed on the Reporting tab under Behavior > Events.
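With Universal Analytics’ analytics.js, a tagged internal link might look like this. The href, category, action and label values are placeholders that Buffer’s team would define:

```html
<a href="/pricing"
   onclick="ga('send', 'event', 'Internal Link', 'click', 'homepage-to-pricing');">
  See plans and pricing
</a>
```

The event fires on click while the URL itself stays clean, so no campaign parameters ever overwrite the session’s original source data.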

Event Tracking is painless to set up, especially if you’re already using Google Tag Manager (GTM). If you’re not using GTM, you’ll need to find a willing developer to help. In either case, setup is a breeze.


In addition to ruining your data, using parameters on internal links can cause duplicate content issues, which is what has happened on Buffer.com.

Duplicate content saps your website’s crawl budget by wasting search engine resources on pages bots have already crawled. It also dilutes ranking power, which ends up shared between all of your duplicate pages.

As far as getting rid of duplicate content is concerned, the rel=canonical tag is an easy way to tell major search engines which version of your page you prefer they index.

Simply place the code below in the document head of the original resource as well as the duplicate versions of the page with the link pointing to the original resource.

<link rel="canonical" href="https://examplesite.com/example-page/original-resource"/>

When a search engine bot reaches the page and sees the above tag, it will see the link to the original source. All of the duplicate page links will be credited to the original page, which means you won’t lose any ranking power (a.k.a. link juice).

Broken Links

Buffer has nine broken links on its website and one script tag containing a syntax error, all of which should be relatively easy to address.

The spreadsheet below calls out the nine links that are broken on the Buffer website. Column A shows the actual broken link. Column B shows what pages on Buffer.com are pointing to that broken link.

NOTE: Row 85 of the spreadsheet is a jQuery Migrate script that is used to migrate older jQuery code to jQuery 1.9+.

The script tag contains four forward slashes after the colon instead of two. From my end it looks as if this script is not firing, but I can’t 100% confirm this. My research indicates that it may affect caching.


When it comes to search engines and 300 status codes, not all redirects are created equal. The most commonly used 301 and 302 redirects give off very specific and different signals to search engine bots that crawl and index the web. Historically, the 302 redirect has been described as “Moved Temporarily” (HTTP 1.0) but more recently is described as “Found” (HTTP 1.1), while a 301 is described as “Moved Permanently.”

A crawl of Buffer.com revealed hundreds of URLs using the 302 redirect, including:

http://blog.buffer.com (302 redirect to https://blog.bufferapp.com/)
http://buffer.uservoice.com/ (302 redirect to https://buffer.uservoice.com/)
https://buffer.com/app (302 redirect to https://buffer.com/)

See the spreadsheet below for a full list.

Why does this matter?

301 redirects are widely cited as passing between 90% and 99% of ranking power (a.k.a. link juice) to the redirected page, whereas 302 redirects are said to pass none.

The 301 is the best method for implementing a permanent redirect, and in most cases the 302 should not be used. Implementing 301s instead of 302s will let Buffer regain some lost linking power and could give the redirected pages a rankings boost.
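To illustrate the difference, here is a minimal sketch of how the two redirects might be declared on an Apache server (nginx and most app frameworks have equivalents). The /app path is taken from the crawl above, but the configuration itself is hypothetical:

```apache
# Permanent redirect: tells search engines the move is final
# and passes ranking power to the target URL.
Redirect 301 /app https://buffer.com/

# Temporary redirect: what the crawl found; historically treated
# as passing little or no ranking power.
# Redirect 302 /app https://buffer.com/
```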

UPDATE: In February of this year, Google’s John Mueller stated in the comments of a Google+ post that “it’s incorrect that 302 redirects wouldn’t pass Pagerank. That’s a myth.” Other SEOs have found evidence to the contrary (see slides 80-82). This remains a polarizing topic.

Meta Keywords

“A long time ago in a search engine far, far away, the meta keywords tag ruled results pages with an iron fist and brought misery to the internet’s realm.”

During the early years of the internet, search engines relied heavily on the meta keywords tag to serve what they thought were relevant results, but as usual, black hats ruined everything with keyword stuffing and Google bombing, and are why SEOs can’t have nice things! *sigh* I digress.

As a result of such raucous behavior, search engines developed more complex algorithms to serve more relevant results and stick it to the seedy characters within the SEO community, which punishes us all.

Today, major search engines view the meta keywords tag as a spam signal, which makes it important not to include these in HTML documents. Another good reason not to use the meta keywords tag is that because it’s publicly visible in the HTML, it broadcasts your target keywords to competitors.

During my review of Buffer.com, I noticed that the site is using the meta keywords tag. Buffer should consider removing this element from their pages.
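For reference, the element to remove looks like the example below (the keyword list shown is hypothetical); deleting the entire tag from each page template is the fix:

```html
<!-- Meta keywords tag: major search engines treat this as a spam
     signal, and it broadcasts target keywords to competitors. -->
<meta name="keywords" content="example keyword one, example keyword two">
```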

Search Engine Indexation Summary: If there’s one problem Buffer doesn’t have, it’s getting indexed. Buffer.com is one of the top 3,000 most trafficked websites in the world. Adding an XML sitemap, removing duplicate content and replacing 302 redirects with 301 redirects is really just making sure that the site and blog squeeze as much authority and ranking power out of their online presence as possible.

Search Engine Visibility

Grade: B+

Single Domain (optional)

When it comes to content creation, there are millions of articles that recommend blog/article writing as the best way to connect with people and provide the most value to visitors. However, there’s not a lot of advice about where this type of content should live.

Should this content live on a subdomain (e.g., http://blog.buffer.com/)?

Should it live in a subfolder (e.g., http://www.buffer.com/blog/)?

Does it really matter where your content lives?

According to the good people at Moz.com, yes it does. Moz SEOs ran three separate experiments testing whether it’s better to have your content located on a subdomain versus a subfolder.

What were the results?

Moz rankings rose dramatically across the board for every keyword they tracked to the pages. The Moz team has seen this same phenomenon happen with dozens of other websites with similarly positive results (assuming that the content is being moved from a subdomain lacking quality content and authoritative link signals to a root domain that has these signals).

Moz posted a Whiteboard Friday video that explains subdomains vs. subfolders in more detail.

“A blog is far more likely to perform well in the rankings and to help the rest of your site’s content perform well if it’s all together on one sub and root domain.” – Moz.com

Buffer’s work to better target keywords and attract organic visitors could be greatly helped by moving its WordPress instances (blog.buffer.com, open.buffer.com, etc.) and HTML pages (diversity.buffer.com, respond.buffer.com, etc.) into subfolders on buffer.com.

Keyword Gap Analysis

Keyword research is the foundation on which great marketing campaigns are built. Targeting the right phrases at the right time and with the right approach is critical to scaling organic growth.

Buffer doesn’t have a keyword targeting issue, but according to their SEO Specialist job description, they’re looking for “ideas and insights” on new terms for which their website can be found in the future. So I’ve put together an exploratory list of keywords below that could help.

There are a number of phrases in the spreadsheet that Buffer already ranks for but whose positions are on pages 2-10, which see very little traffic compared to page 1 search results. The other keywords are phrases where Buffer doesn’t rank at all.

Some SEOs might be tempted to go after a lot of long tail phrases because they typically convert well.

Given that Buffer is an established website with a large online footprint, focusing its keyword strategy only on long tail terms, for them, would be like trying to get full off sunflower seeds. Instead, in this list you’ll see head terms (phrases containing 1-2 keywords), torso or body terms (phrases containing 2-3 keywords) and a few long tail terms (phrases containing 3+ keywords).

One thing that stood out in my research is that Buffer.com currently ranks for few keywords related to social media, whereas blog.bufferapp.com ranks highly for many social media related terms. This could signal a search engine indexation issue, but further investigation would be required to confirm it.

It would be great to see Buffer begin to incorporate these keywords into the titles, URLs, descriptions, headings and body copy of its corporate website and blogs, as many of its organic search opportunities are low hanging and wouldn’t require a ton of effort to see positive results.

Search Engine Visibility Summary: When it comes to search engine visibility there are always more keywords on which a website can be found—Buffer.com and the Buffer blog are no exception. Improved keyword targeting and increased authority are two areas that can help with this. However, moving an established, authoritative blog from its current subdomain to a folder on the root domain isn’t without risk and could have effects that are hard to foresee, which is why I cautiously recommend this tactic.

Blog SEO

Grade: B

Ask most marketers their advice on ways to drive more traffic to a blog and you’d probably get answers like write better content more often, promote your articles on social media, guest post on other blogs and blast your email subscribers.

While there’s nothing fundamentally wrong with this advice, it’s tired, incomplete, lacks clarity and is missing context.

What rarely gets mentioned in the same breath as the above advice, is making sure that articles are optimized. I mean, honestly, how much of an effect could optimizing old posts have on a blog’s traffic?

   “We’ve increased the number of monthly organic search views of old posts we’ve optimized by an average of 106%.” – Hubspot

That’s right, it can have a huge effect.

Hubspot optimized their old blog posts and increased their organic search views over one hundred percent. They also doubled their number of monthly leads.

Below are a few optimizations that can be easily made to the Buffer blog (and yours) to increase traffic, leads, customers or whatever it is that you’re after.

Blog Post Titles

The title tag (or title element) is the Freddie Mercury of on-page SEO. Yes, all parts of on-page SEO play important roles in the overall strategy, but this critical page tag defines the title of the HTML document, just like Freddie Mercury defined Queen.


Because of the importance placed on the title tag by major search engines like Google, Bing and Yahoo! for indexation and ranking, it’s important to place keywords relevant to your pages in the title tag.

Buffer does a bang-up job writing about interesting topics and creating clickable headlines for their blog posts. Still, it would be great to see keywords better targeted in their titles. Below are a few examples of how one change could help search engines better rank Buffer’s blog posts for relevant keywords.













A small tweak to the title allows Buffer to target keywords like social media ads, new twitter features and snapchat guide.
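To make the pattern concrete, here is a hypothetical before/after (these are illustrative titles, not actual Buffer posts) targeting the first of those phrases:

```html
<!-- Before: a clickable headline, but no phrase people actually search for -->
<title>What Happened When We Tried Promoting Our Posts</title>

<!-- After: the same story, now targeting "social media ads" -->
<title>Social Media Ads: What Happened When We Promoted Our Posts</title>
```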

Why target these specific keywords?

Google’s Keyword Planner Tool tells us that there’s search volume for the “after” keywords but not the “before” keywords.


This is important because you want to use keywords in your titles that visitors are already searching for and that can drive even more traffic to your blog posts.

Blog Category Landing Pages

Blog categories appear to be one of those things that are falling by the wayside in modern CMS themes, which is sad. Categories are perfect for organizing and grouping similar content.

Buffer does a good job using categories next to the author’s name, but it would be nice if they were made to stand out (maybe even color-coded tags) so that they’re more visible.

You may be thinking: “okay, but where’s the practical SEO value in blog categories?”

Well, the practical value lies in category landing pages. When fully utilized, they can provide more opportunities to target relevant keywords that you want to rank for.

Below are a few examples of popular websites that are doing categories and category landing pages well.

Hubspot, Copyblogger and Unbounce show exactly how category landing pages can target keywords in the page title, URL and headings, as well as briefly explain what can be expected from posts in that specific category.

In the case of Copyblogger and Unbounce, they also use that space to link to popular posts and online courses. So, there’s a lot that can be done with categories that still make them practical and highly useful.

It would be good to see Buffer better optimize their blog category landing pages and provide visitors with additional detail and links to some of the great content they’ve created over the years.

User Experience

Though it’s not a direct SEO factor, user experience and usability influence search engine ranking success. Visitors that are happy with the layout, design and flow of a website stay longer, which has a positive impact on engagement metrics and machine learning that search engines use to help surface the most relevant results.

HelloBars, feature boxes and slideups that encourage people to sign up for both Buffer and Respond subscriptions are clearly visible on the blog, which is great. A recent post by Buffer’s Kevan Lee explains why they recently pivoted from collecting email addresses to encouraging signups.

After clicking the CTA of these ads promoting Buffer’s products, I expected to be placed directly into the product signup flow. Instead, I was taken to the homepage for Respond and the homepage for Buffer and had to click a second CTA, which struck me as a little awkward.

My knowledge of the Buffer technology stack is zero, so what I type next could be a much bigger deal than I realize. That being said, I’ve composed my recommendation as an agile user story below.

Given that I am a blog visitor, when I click a product CTA on a HelloBar, feature box or slideup, then a modal window should appear that asks me to complete step-1 of the signup process.

Blog SEO Summary: The Buffer blog is doing so well in search engines that it almost feels wrong to make recommendations. In line with a desire to better optimize their blog, keyword targeting in titles, descriptions, headings and body copy should help, along with targeting keywords and topics on blog category landing pages. Though UX isn’t exactly SEO, removing unnecessary steps in the Buffer signup flow could make an impact on conversions in the future.

Wrap Up

For a site that has, in their own words, “neglected doing any real work on making sure our blog is setup to really grow in search traffic,” Buffer comes out smelling like roses.

Operating without a focus on SEO all these years hasn’t had any negative impact on Buffer’s business—which is great. Yet, the team understands …

“What got you here won’t get you there.”

– Marshall Goldsmith

Buffer is one of those companies that comes along every once in a while that has the potential to help define a new generation of workers and shape the future of work.

No, Buffer isn’t the first company to create fantastic, life-simplifying software with a cult following.

They’re not the first to operate without an office, with no central location and without fixed schedules.

They’re not the first company to pay employees really well and be both open and transparent.

They’re not the first to encourage wholeness and treat their employees like family.

They’re not the first company to operate with a self-defined, self-management style.

They’re not the first company to have an enviable work culture.

But, it’s my opinion that they’re the first to do it all and absolutely nail it.

The future of Buffer looks very bright. I’ll definitely be keeping an eye on them. Keep up the great work, guys!

If you’d like the chance to see your company featured in an upcoming Hot Seat, visit my contact page and fill out the form.
