More and more website developers are using JavaScript frameworks and libraries these days, yet many JavaScript-heavy websites perform poorly in Google search results, even when they are otherwise popular. To avoid this, a true SEO professional has to understand JavaScript and its potential impact on a website’s search performance.
In this post, I will try to explain what JavaScript means in plain English, and tell you about five vital points that every SEO specialist should consider when working with a JavaScript-based website.
What is JavaScript?
Let’s begin by clarifying terms here.
JavaScript is the most popular programming language for developing websites (according to the Business Insider 2018 survey). Together with its frameworks, it creates interactive web pages by controlling the behavior of different elements on the page. In other words, it makes web pages dynamic and interactive.
Initially, JavaScript ran only in the browser, on the client side, but these days the code runs on the server side as well.
JavaScript can be placed within an HTML document, or developers may make a link or reference to it.
HTML stands for Hypertext Markup Language. Its job is to organize content and reflect its structure by marking up text elements such as headings and subheadings (H1 through H6), paragraphs, bullet points, and other formatting details.
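To make both points concrete, here is a minimal sketch of an HTML document (the file name script.js is just a placeholder) that combines structured content with JavaScript included inline and via an external reference:

```html
<!DOCTYPE html>
<html>
  <head>
    <title>Example page</title>
    <!-- External JavaScript: the browser fetches and runs script.js -->
    <script src="script.js"></script>
  </head>
  <body>
    <h1>Main headline</h1>
    <h2>Subheading</h2>
    <p>A paragraph of content.</p>
    <!-- Inline JavaScript placed directly within the HTML document -->
    <script>
      console.log('Inline script executed');
    </script>
  </body>
</html>
```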
Another abbreviation for modern SEO specialists brave enough to learn JavaScript basics is DOM (Document Object Model). After receiving an HTML document, a browser parses it and builds a structured representation of the page; that representation is the DOM, and it is what the browser uses to render everything on screen.
So, when Google receives an HTML document, it identifies whether any JavaScript is present. The page then has to be rendered: the JavaScript runs, the DOM is updated, and that rendered result is what the search engine gets the opportunity to index.
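As a simple, hypothetical illustration of why rendering matters, the snippet below starts with an empty element in the raw HTML and only fills it in once the script runs; a crawler that does not render JavaScript would never see that text:

```html
<div id="product-description"></div>
<script>
  // The empty <div> above is all that exists in the raw HTML.
  // This text only appears once the browser (or a rendering crawler)
  // executes the script and updates the DOM.
  document.getElementById('product-description').textContent =
    'This description is injected by JavaScript at render time.';
</script>
```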
Use Internal Linking Correctly
An internal linking strategy is a sure way to show your website architecture to web crawlers, and to point to the most important web pages.
Be careful not to replace internal links with JavaScript onclick events. Although crawlers are likely to find and crawl the end URLs, they will not associate onclick events with the global navigation of a website.
Internal links should therefore be implemented with regular anchor tags within the HTML or the DOM, so that both users and crawlers can traverse the site.
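For example (the URL is made up), the first link below is a regular anchor tag that crawlers can follow and associate with your navigation, while the second relies on an onclick event that only works once JavaScript runs:

```html
<!-- Crawl-friendly: a regular anchor tag in the HTML/DOM -->
<a href="/category/blue-widgets">Blue widgets</a>

<!-- Risky: navigation that exists only as a JavaScript onclick event -->
<span onclick="window.location.href='/category/blue-widgets'">Blue widgets</span>
```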
HTML Snapshots
HTML snapshots have a long and controversial history with Google.
You should know that Google recommends avoiding HTML snapshots (although search engines still support them).
However, diligent SEO specialists should be familiar with HTML snapshots, as they could be helpful in a couple of situations.
For instance, you may show HTML snapshots to web crawlers when they cannot grasp the JavaScript on your website. This usually happens because of coding mistakes. It is better to provide search engines with HTML snapshots than to leave your content unindexed.
Try to avoid such situations by consulting with the development team and devising backup plans together. Consider implementing user-agent detection on the server side so that, in emergency cases, the HTML snapshot can be shown to both bots and users.
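A rough sketch of what such a backup plan might look like, assuming a plain Node.js server (the file names, port, and bot signatures are illustrative only):

```js
// snapshot-fallback.js - a rough illustration, not production code
const http = require('http');
const fs = require('fs');

// Flip this on when client-side rendering is broken (the "emergency case").
const EMERGENCY_MODE = true;

const BOT_PATTERN = /googlebot|bingbot|yandexbot/i; // example crawler signatures

http.createServer((req, res) => {
  const userAgent = req.headers['user-agent'] || '';

  // Log which crawlers are visiting; in emergency mode serve the same
  // pre-rendered snapshot to bots and users alike to avoid cloaking.
  if (BOT_PATTERN.test(userAgent)) console.log('Crawler detected:', userAgent);
  const file = EMERGENCY_MODE ? 'snapshot.html' : 'app.html';

  fs.readFile(file, (err, html) => {
    if (err) {
      res.writeHead(500);
      return res.end('Server error');
    }
    res.writeHead(200, { 'Content-Type': 'text/html' });
    res.end(html);
  });
}).listen(3000);
```

In normal operation the regular app.html is served to everyone; the emergency flag only exists so that the same snapshot reaches bots and users alike while the JavaScript issue is being fixed.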
Show Your JavaScript to Search Engines
Every SEO specialist knows how important it is to provide search engines with a correct robots.txt file to ensure optimal crawling. The critical point here is to show a web page to crawlers in precisely the same way you show it to users.
In the worst-case scenario, Google will interpret this dissonance as cloaking and will remove the website from its index.
To avoid unpleasant consequences, always provide search engines with access to the resources they need to fully understand the user experience.
If there are pages you want hidden from search engines, always consult with the development team.
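As an illustration (the paths are placeholders), a robots.txt along these lines hides a private section while explicitly leaving the JavaScript and CSS files needed for rendering open to crawlers:

```
User-agent: *
# Keep crawlers out of a genuinely private section
Disallow: /admin/

# Do NOT block the resources search engines need to render the page
Allow: /assets/js/
Allow: /assets/css/
```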
Pay Attention to URL Structure
Traditionally, JavaScript-based websites used hashes (#) and hashbangs (#!) within URLs. Nowadays, these fragment identifiers are not recommended, and here is why:
The lone hash is generally used for anchor links. It allows users to jump to a specific piece of content on a page.
The problem is that the “#” fragment is not crawlable: anything that follows it is never sent to the server; it simply makes the browser scroll to the first element with a matching ID. Google recommends against relying on the lone hash for URLs you want crawled.
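For example (the ID is hypothetical), a URL such as https://example.com/pricing-page#plans simply scrolls the visitor to the element below; the “#plans” part never reaches the server:

```html
<!-- Target of the anchor link: the browser jumps here, but the server
     never sees the "#plans" fragment -->
<h2 id="plans">Our plans</h2>
```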
The hashbang (#!) was once a hack to support crawlers. Under the escaped-fragment scheme, there are two versions of each page.
The first is the original experience, or “pretty URL”, which must either contain #! within the URL to indicate that there is an escaped fragment, or include a meta element indicating that an escaped fragment exists.
The other is the escaped fragment, or “ugly URL”. In it, the #! is replaced with “_escaped_fragment_”, and this version serves the HTML snapshot. It earned the “ugly” name because of its length and appearance.
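To illustrate (the domain and path are made up), the two versions of the same page under this scheme look like this:

```
Pretty URL:  https://example.com/#!/products/blue-widgets
Ugly URL:    https://example.com/?_escaped_fragment_=/products/blue-widgets
```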
A highly recommended alternative is pushState, part of the History API. It is navigation-based and updates the URL in the address bar without reloading the page, so only the piece of content that actually changes needs to be updated.
pushState lets a JavaScript-based website use “clean” URLs and is supported by Google.
A perfect use case for this method is infinite scroll. When implemented correctly, pushState returns the user to the same spot they were viewing before the page refreshed.
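Here is a minimal sketch of that pattern; the endpoint, element ID, and URL scheme are hypothetical. As new items load, they are appended to the page and the address bar is updated with pushState, so every scroll position gets a clean, shareable URL:

```html
<script>
  // Hypothetical infinite-scroll handler: load the next page of items,
  // append them to the list, and record a clean URL for that position.
  let page = 1;

  async function loadNextPage() {
    page += 1;
    const response = await fetch('/products?page=' + page);
    const itemsHtml = await response.text();
    document.getElementById('product-list').insertAdjacentHTML('beforeend', itemsHtml);

    // Update the URL in the address bar without reloading the page.
    history.pushState({ page: page }, '', '/products/page/' + page);
  }

  // Restore the right spot if the user navigates back or forward.
  window.addEventListener('popstate', (event) => {
    if (event.state && event.state.page) {
      // Re-render or scroll to the content for event.state.page here.
    }
  });
</script>
```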
Improve Your Site’s Latency
Remember how the browser creates a DOM after receiving the HTML document? Good. Now let’s go a step further.
A browser loads most resources in the order they appear in the HTML document. If a heavy file sits near the top of the document, the browser loads that huge file first.
The rest of the content appears afterward, possibly with a significant delay. If JavaScript files are what clogs up the page load, you have render-blocking JavaScript, which creates perceived latency.
In other words, your JavaScript code slows down the page load even though the page could appear faster.
Google claims that user experience comes first. Therefore it recommends placing the most valuable content at the top of the page.
Consider reducing perceived latency by inlining critical JavaScript in the HTML, adding the ‘async’ attribute to script tags so the JavaScript loads asynchronously, or reducing the amount of JavaScript referenced in the HTML document.
Be careful, though, because JavaScript has its own pitfalls: scripts must be arranged in order of precedence, and a script that references another file should only run after that file has loaded.
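A short sketch of both points (the file names are placeholders): the independent analytics script can load asynchronously, while the scripts that depend on each other keep their default order so the library is available before the code that references it:

```html
<head>
  <!-- Independent script: 'async' lets the HTML keep parsing while it downloads -->
  <script async src="/js/analytics.js"></script>

  <!-- Dependent scripts: keep them in order of precedence, so the library
       loads before the code that references it -->
  <script src="/js/library.js"></script>
  <script src="/js/uses-library.js"></script>
</head>
```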
For best results, work closely with your development team to ensure a smooth user experience with no interruptions in the code.
Final thoughts
Curiosity for new knowledge is what distinguishes an outstanding SEO specialist from the pack.
Comprehending JavaScript basics is a huge plus today, and will be a must in the near future as search engines continue to evolve.
Now is the time to check your website for site latency, crawlability and obtainability.
I hope this post inspired you to go deeper into JavaScript and its practical meaning for SEO.
Have any questions or suggestions regarding this topic? I’ll be waiting to see your comments below!
If you want to read further into SEO, we have a selection of articles for you to read:
How to outrank competitors and dominate difficult SEO markets in 2018
How PageRank Really Works: Understanding Google
Google’s updated guidelines – what’s changed?