SEO for Single Page Applications


It has been said a million times before, but the internet is changing. It’s not only changing how people consume content or interact with businesses; the way websites are designed, structured, and built is also evolving. JavaScript has long been an important component of web design, and demand for it from users and web developers continues to grow. While this results in some amazing websites, JavaScript-based pages and sites can cause problems when it comes to search engine optimization.

JavaScript has been a looming giant in web development and search engine optimization for a long time, and its dominance will continue as developers find new ways to build speedier and more interactive sites. 

JavaScript is widely used across web development: 96.7% of all websites use it. Of the 60,000 developers recently surveyed by Stack Overflow, a large majority preferred to work with JavaScript web frameworks. The people that build the web are using more and more JavaScript, and they’re applying JavaScript frameworks to further enhance site speed with single page applications.

Single page applications have obvious benefits. When a user loads a page on your website, they load the entire application at once. This makes navigation around the site as fast as possible; the main hurdle to clear is the JavaScript populating the HTML elements of the page. Under the hood this involves techniques like asynchronous fetching, but to the user, the effect is a snappy experience that allows for fast movement around the site.
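As a rough sketch, that pattern might look like the following: data is fetched asynchronously from an API and rendered into the page after load. The endpoint, element ID, and field names here are invented for illustration, not taken from any particular framework.

```javascript
// Pure render step: turn fetched data into an HTML fragment.
function renderProduct(product) {
  return `<article><h1>${product.name}</h1><p>${product.description}</p></article>`;
}

// Async fetch step: pull JSON from a (hypothetical) endpoint and inject
// it into the page. This markup exists only after the script runs.
async function loadProduct(id) {
  const res = await fetch(`/api/products/${id}`); // invented endpoint
  const product = await res.json();
  document.querySelector('#content').innerHTML = renderProduct(product);
}
```

The key point for SEO: the server never sends that `<article>` markup. It appears in the page only after the browser executes the JavaScript.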

These single page apps can, however, create a problem for people interested in search engine optimization. Some content, or even entire pages, can be locked behind JavaScript, which means the agents Google uses to index and categorize web pages can’t see that content. In single page applications that rely on JavaScript to dynamically populate content across all pages, the problem gets even more complicated. Tools and practitioners are certainly getting better at recognizing and ranking JavaScript-based pages, but those pages can still cause problems for the bots that Google and other search engines use to index and evaluate a site’s quality. As Jamie Alberico points out in a recent post, Googlebot is not as faultless as some may think. It is getting better at reading JavaScript, and the time needed to render a page keeps shrinking, but problems still arise. As websites use more and more JavaScript to display content, the gap between the bones (the source HTML) and the user version (the rendered page) widens. It is in that gap that problems occur.

If something goes wrong during the rendering process, or if the page does not render at all, then Google cannot index your page with the content that is supposed to be on it. If your webpage, in the eyes of Google, doesn’t have the content that you want to be ranked for, you cannot rank for that content. It seems like an obvious statement, but it is important to keep in mind as you implement search engine optimization on your website. Jamie Alberico writes, “If you can’t identify what a page is about and what type of search intent it matches based on the initial HTML, neither can a search engine.” This isn’t strictly true – Googlebot can see content beyond the initial HTML once it renders the page – but adopting this line of thinking can help ensure that the content you want visible, to robots and to humans, will be available when necessary.
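One practical way to apply that mindset is a quick audit: does the raw HTML a crawler receives, before any scripts run, already contain the phrase you want to rank for? A minimal sketch, with a hypothetical helper name and an invented target phrase:

```javascript
// Does this markup already contain the phrase we want to rank for?
function hasVisibleContent(html, phrase) {
  return html.includes(phrase);
}

// Fetch the raw page source, the way a crawler first sees it,
// and check it for the target phrase. (Hypothetical audit helper.)
async function contentInInitialHtml(url, phrase) {
  const res = await fetch(url);
  return hasVisibleContent(await res.text(), phrase);
}
```

An empty SPA shell like `<div id="app"></div>` fails this check even when the fully rendered page is rich with content, which is exactly the gap the quote above warns about.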

Because the HTML of a single page application is only requested once, the skeleton of the website is always the same before rendering. When a user first arrives, the site loads all at once and then renders the relevant elements as the user interacts with it. This creates challenges when trying to track actions on the site or attribute incoming traffic.
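The navigation model behind this is easy to sketch: once the app has loaded, switching “pages” is just swapping views the browser already holds in memory, with no further request to the server. The routes and markup below are invented for illustration.

```javascript
// Views shipped with the first (and only) page load.
const views = {
  '/': '<h1>Home</h1>',
  '/services': '<h1>Our Services</h1>',
};

// Client-side navigation: look the view up locally.
// The web server is never contacted again.
function navigate(path) {
  return views[path] ?? '<h1>Not Found</h1>';
}
```

Any analytics setup that counts pageviews by watching server requests goes blind after that first load, which is what the traffic graph below illustrates.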

When one of our clients first made the switch from a traditional site to a single page application model, misattribution of incoming web traffic led to an extra bump in organic traffic, as shown in the graph below. The first half of the graph shows the traffic before the site relaunch, and the dip in the middle is the day that the site launched. Due to the nature of single page apps, the site traffic became hard to follow.  

It amounts to trying to track behavior across the site without any server calls or responses after the first. Once a visitor entered the site, the web server was no longer needed, because the entire website had already been loaded on the user’s side. This made it tougher to see how visitors behaved when navigating across pages. When you don’t know how users behave on your site, it is hard to implement an SEO strategy or recommend any search engine optimization changes.

Thankfully, this can be fixed with some help from a web developer and adjustments to the JavaScript framework. Even though the entire website loads for the user up front, which, once again, is the point of a single page application, it is possible to fire a request on each navigation. This request, though it doesn’t deliver any necessary website data or content, makes behavior tracking possible, and in turn supports your SEO efforts.
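One common shape for that fix is to report a “virtual pageview” every time the app changes routes through the History API. This is a sketch under assumptions, not any particular framework’s built-in API: `sendPageview` is a hypothetical hook standing in for whatever your analytics call is (for example, a `gtag` configuration call).

```javascript
// Wrap history.pushState so every client-side navigation also reports
// a virtual pageview, even though no real page load happens.
function trackRouteChanges(history, sendPageview) {
  const originalPushState = history.pushState.bind(history);
  history.pushState = function (state, title, url) {
    originalPushState(state, title, url);
    sendPageview(url); // hypothetical analytics hook
  };
}
```

With a wrapper like this in place, route changes show up in analytics as distinct pageviews, so per-page behavior can be attributed again.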

JavaScript websites and single-page applications are only getting more popular, for good reason: they provide a good user experience and allow for easier development, and adoption statistics continue to climb. Google, and its trusty Googlebot, is trying its best to keep up and is getting better every day at evaluating JavaScript-based sites; Googlebot now renders pages with an evergreen version of Chromium, the same engine behind Google Chrome. That does not mean that SEO or tracking for this new generation of sites is ready out-of-the-box or straightforward. Problems crop up along the entire user experience, and fixing them requires collaboration between client, SEO agency, and developer. When all parties work together, single-page applications can be easy to use, easy to develop, and easy to optimize. Without proper guidance, however, they can cause nightmares.


Special thanks to Martin Splitt, developer at Google, for his assistance in writing this piece.

To discuss site analytics (behavior tracking) and SEO for your brand’s website, including the location pages for the businesses that make up your franchise or multi-location system, contact Location3 today!

 
