How Does Google Index JavaScript

Web developers have to consider how search engines rank websites. Google’s John Mueller recently shared some tips for web developers, and to save you hunting them down we have organized them into a neat and tidy checklist.

JavaScript is one of the most popular scripting languages developers use to create effects for client websites. Over the years Google has steadily improved its support for JavaScript, and Googlebot now executes and indexes a wide range of scripts.

However, there are still limitations, some technical and some enforced as policy. Ultimately, Google is trying to educate site owners to use best practices that are fair and keep the playing field level.

Here’s what to do and what not to do with JavaScript to ensure your client’s website is best equipped to rank in search engines.

Googlebot does not like cloaking

Cloaking is an SEO term used to describe the technique of presenting one variation of information to search engine crawlers and another to the readers. Google outlawed the practice as a black-hat technique several years ago, but some designers still try it.

Googlebot is pretty sophisticated these days and can identify when cloaking is being used. To avoid penalties:

  • Make your content available to both users and crawlers by using feature detection and progressive enhancement
  • Do not redirect visitors on unsupported browsers to different content
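The first point above can be sketched in a few lines. This is a minimal illustration of feature detection with a progressive-enhancement fallback; the helper and object names are invented for the example, not part of any real library:

```javascript
// Test for an API before using it, and fall back to content already in
// the page, so users and crawlers see the same information.

function supportsFeature(name, scope) {
  // Progressive enhancement: only enhance when the API actually exists.
  return typeof scope === "object" && scope !== null && name in scope;
}

// In a browser, `scope` would be `window`; a plain object stands in here.
const host = { fetch: function () {} };

if (supportsFeature("fetch", host)) {
  // Enhance: load extra content dynamically.
} else {
  // Fall back: rely on the server-rendered HTML already on the page.
}
```

Because the fallback path serves the same content to everyone, there is no separate "crawler version" of the page, which is exactly what keeps you clear of cloaking.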

Avoid AJAX-crawling for new sites

Last year, Google discontinued its support for its AJAX crawling scheme. Although the search engine will still rank old websites built with the scheme, it will no longer crawl new websites that rely on it.

Do not include hash fragments in URLs

The hash symbol (#) may give your social media bulletins some leverage, but search engines do not like it in URLs. Googlebot ignores everything after the fragment, so content reachable only behind a “#” is effectively invisible to the crawler.
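To see why this matters, here is an illustrative helper (not a real API) showing what the crawler keeps from a URL once the fragment is ignored:

```javascript
// The crawler indexes only the part of the URL before the "#" fragment.
function crawlablePart(url) {
  const hashIndex = url.indexOf("#");
  return hashIndex === -1 ? url : url.slice(0, hashIndex);
}

// Two "pages" that look distinct to users but identical to the crawler:
crawlablePart("https://example.com/#/products"); // "https://example.com/"
crawlablePart("https://example.com/#/about");    // "https://example.com/"

// A path-based URL keeps its identity for the crawler:
crawlablePart("https://example.com/products");   // "https://example.com/products"
```

If every section of a site lives behind a fragment, the crawler sees one URL for the whole site, so prefer real path-based URLs for distinct pages.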

Do not block resources in robots.txt

In order for websites to rank, Googlebot has to be given access to your resources. Web designers need to remember not to block the files the page depends on in robots.txt – including JavaScript files and frameworks.
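As a concrete sketch, a robots.txt along these lines leaves script and style assets crawlable (the directory paths here are illustrative, not a recommendation for any particular site layout):

```
User-agent: Googlebot
# Allow the crawler to fetch the assets it needs to render the page.
Allow: /assets/js/
Allow: /assets/css/

# A rule like the following would stop Googlebot from rendering the
# page properly and should be avoided:
# Disallow: /assets/js/
```

If Googlebot cannot fetch the scripts that build the page, it cannot render what your users see, and the page may rank as if that content did not exist.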

Limit the number of embedded resources

When a browser downloads a webpage, it fetches every resource connected with the URL – and that includes embedded resources hosted on off-site URLs. Therefore keep the number of embedded resources to a minimum, otherwise the extra requests slow down the load speed of the page.

Accelerated mobile pages

On the subject of load times, web developers have to bear in mind mobile users. In an effort to speed up load times on mobile devices, Google intends to launch “accelerated mobile pages” (AMP), which strip out unnecessary data that slows down the time it takes webpages to load.

As a result, some design aspects programmed in JavaScript will become defunct. Designers need to plan for this to avoid client websites looking threadbare and unprofessional.

JavaScript is a significant cog in the wheel of a website and has an impact on how well pages rank in search engine results. It is therefore important that web developers understand how Google and other search engines crawl, render, and index it.

Copyright © 2004 - 2018 Designaweb (BSE) Ltd, All rights reserved