Guide for JavaScript SEO

SEO has moved toward working with a range of front-end languages and frameworks.

Using Google Tag Manager (GTM) requires a basic understanding of HTML and Vanilla JavaScript.

Speaking of JavaScript, there has been a recent discussion on why and how Google crawlers read JavaScript.

The relevant questions that arise are:

* How Google reads JavaScript.

* What changes if the site uses a lot of JavaScript.

* Server-side rendering.

* How JavaScript is used by each JavaScript framework on the site.

Modern JavaScript applied to SEO:

JavaScript is a programming language that, in its simplest form, renders and animates page elements via DOM methods.

In its most sophisticated form, it is used to create full web applications with frameworks like React and Vue.

While React and Vue are very popular and many sites are starting to use these frameworks, SEO professionals are more likely to come across legacy architectures like WordPress that run on vanilla JavaScript and jQuery.

To be precise, the application of JavaScript in such CMSs is primarily concerned with slide shows, accordions, and other forms of text animation.
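As an illustration, a minimal vanilla JavaScript accordion of the kind such themes typically ship might look like this (the class names and markup are hypothetical):

```html
<button class="accordion-header">Section 1</button>
<div class="accordion-panel" hidden>Panel content goes here.</div>

<script>
  // Each header toggles the visibility of the panel that follows it
  document.querySelectorAll('.accordion-header').forEach(function (header) {
    header.addEventListener('click', function () {
      var panel = header.nextElementSibling;
      panel.hidden = !panel.hidden;
    });
  });
</script>
```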

Why JavaScript matters for SEO:

Google claims that Googlebot can crawl, index, and rank any content included in any form of JavaScript.

Googlebot can readily crawl static languages (i.e. HTML and CSS), but when it comes to object-oriented programming languages, it acts differently.

In practice, there is a period of time that Googlebot needs before it indexes any form of text included in client-side JavaScript.

JavaScript has become a sore subject for SEOs.

Frameworks and rendering speed:

JavaScript has evolved a lot in the recent past.

The introduction of new ECMAScript standards has led to faster and more efficient frameworks.

React is the most popular JavaScript library for building user interfaces.

It's hard to track which library performs the best in terms of processing speed from Google's perspective.

There are dozens of variables to take into account: the type of CMS used, the concatenation of multiple functions, and other highly technical aspects that really shouldn't be associated with SEO.

That said, the React framework is often regarded as the fastest in terms of rendering due to its simplicity.

Content animations and the like are primarily based on jQuery / Vanilla.

A technical SEO specialist is more likely to come across those libraries than the React architectures more common in UIs and native apps.

Fetching and rendering like Google:

To test the page's JavaScript and its render queue, the best approach would be to analyze the site's access log files.

Log files make it possible to accurately assess how often crawlers hit each page, and thus to understand which pages have a better position in the render queue.
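As a sketch of this kind of analysis, the following Node.js script counts Googlebot hits per URL; it assumes the common Apache/Nginx "combined" log format and a local access.log file:

```javascript
const fs = require('fs');
const readline = require('readline');

const hits = new Map();
const rl = readline.createInterface({ input: fs.createReadStream('access.log') });

rl.on('line', (line) => {
  if (!line.includes('Googlebot')) return;          // keep only Googlebot requests
  const match = line.match(/"(?:GET|POST) (\S+)/);  // extract the requested path
  if (match) hits.set(match[1], (hits.get(match[1]) || 0) + 1);
});

rl.on('close', () => {
  // Print URLs sorted by crawl frequency, most-crawled first
  [...hits.entries()]
    .sort((a, b) => b[1] - a[1])
    .forEach(([url, count]) => console.log(count, url));
});
```

(In practice, one should also verify that the requests really come from Googlebot, e.g. via reverse DNS lookup, since the user agent string can be spoofed.)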

The legacy Fetch as Google feature in Search Console can be useful for checking the page's HTML hierarchy, but it is definitely not accurate when it comes to understanding JavaScript.

Search Console doesn't always recognize individual resources hosted on external CDNs.

The best way to structure content with JavaScript resources:

Separating content from JavaScript is completely doable and can be accomplished in many ways.

If JavaScript is used for entire architectures and engines (i.e. Node, React, Vue), consider using server-side rendering (SSR) libraries, as opposed to client-side rendering (CSR).

This process runs JavaScript and dynamic resources on the server, rather than in the user's browser.

JavaScript is a complicated business from a front-end perspective, and much more complex when it comes to its applications for SEO.

More than 90 percent of the Internet runs on JavaScript or uses at least five scripts within its code.

In an ever-evolving industry like SEO, it is imperative to recognize the importance of being able to analyze and evaluate the site's JavaScript.

The web is in a golden age of front-end development, and JavaScript and technical SEO are experiencing a renaissance.

As a technical SEO specialist and web development enthusiast, one enjoys sharing a perspective on modern JavaScript SEO based on industry best practices and agency experience.

JavaScript SEO is the discipline of technical SEO that focuses on optimizing websites built with JavaScript for search engine visibility.

These mainly involve:

Optimizing content injected via JavaScript for crawling, rendering, and indexing by search engines.

Preventing, diagnosing, and resolving ranking issues for websites and SPAs (Single Page Applications) built on JavaScript frameworks, such as React, Angular, and Vue.

Ensuring web pages are discoverable by search engines through linking best practices.

Improving page load times for pages that parse and execute JS code, for a streamlined user experience (UX).

JavaScript is essential to the modern web and makes website building scalable and easier to maintain.

Implementations of JavaScript can affect search engine visibility.

JavaScript can affect the following on-page and ranking factors that are important for SEO:

Rendered content.

Links.

Lazily loaded images.

Page load time.

Metadata.

A website "using JavaScript" refers to one where the main content is injected into the DOM through JavaScript.

Check if a website is built on a JavaScript framework using a technology research tool such as BuiltWith or Wappalyzer.

Also inspect the element or view the source in the browser to verify the JS code.

Popular JavaScript frameworks that one might find include:

Angular by Google.

React by Facebook.

Vue by Evan You.

Modern web apps are built on JavaScript frameworks, such as Angular, React, and Vue.

JavaScript frameworks allow developers to quickly build and scale interactive web applications.

In these apps, the HTML document is often almost completely devoid of content.

There is only the root of the app and a few script tags in the body of the page.

The main content of this single-page application is dynamically injected into the DOM via JavaScript.

The app depends on JS to load the key content on the page!
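A stripped-down sketch of such a shell might look like this (the element ID and injected markup are illustrative):

```html
<body>
  <!-- The document ships almost empty: just an app root and a script -->
  <div id="root"></div>
  <script>
    // The main content only exists after JavaScript injects it into the DOM
    document.getElementById('root').innerHTML = '<h1>Hello, world!</h1>';
  </script>
</body>
```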

Potential SEO Problems: Any basic content rendered to users but not to search engine bots could be seriously problematic!

JavaScript SEO for internal links:

Besides dynamically injecting content into the DOM, JavaScript can also affect the ability to crawl links.

Google discovers new pages by crawling the links it finds on the pages.

Google specifically recommends linking pages using HTML anchor tags with href attributes, as well as including descriptive anchor text for hyperlinks:
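For example (the URL and anchor text here are illustrative):

```html
<a href="/javascript-seo-guide">Read the JavaScript SEO guide</a>
```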

Google also recommends that developers not rely on other HTML elements, like div or span, or on JS event handlers for links:
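Patterns like the following (with a hypothetical goTo() navigation handler) may not be crawled as links:

```html
<span onclick="goTo('/javascript-seo-guide')">Read the guide</span>
<a onclick="goTo('/javascript-seo-guide')">Read the guide</a>
```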

Despite these guidelines, an independent and third-party study suggested that Googlebot may be able to crawl JavaScript links.

Potential SEO issues: If search engines are unable to crawl and follow links to key pages, the pages may miss out on valuable internal links pointing to them.

Internal links help search engines crawl the website more efficiently and highlight the most important pages.

The worst-case scenario is that if internal links are not properly implemented, Google may have a hard time discovering new pages (outside of the XML sitemap).

JavaScript SEO for lazy-loading images:

JavaScript can also affect the ability to crawl lazily loaded images.

Googlebot doesn't scroll; it simply resizes its virtual viewport to be longer when crawling web content, so lazy loading triggered by scroll events may never fire.

The Intersection Observer API is more flexible and robust than a scroll event listener and is supported by modern Googlebot.
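A minimal lazy-loading sketch using the Intersection Observer API, assuming images store their real source in a data-src attribute:

```javascript
const observer = new IntersectionObserver((entries, obs) => {
  entries.forEach((entry) => {
    if (!entry.isIntersecting) return;
    const img = entry.target;
    img.src = img.dataset.src; // swap in the real image source
    obs.unobserve(img);        // stop watching once the image is loaded
  });
});

document.querySelectorAll('img[data-src]').forEach((img) => observer.observe(img));
```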

JavaScript SEO for page speed:

JavaScript can also affect page load time, which is an official ranking factor in Google's mobile-first index.

A slow page could potentially hurt search rankings.

Ways to minimize JavaScript (a short sketch follows this list):

Defer non-critical JS until the main content is rendered in the DOM.

Inline critical JS.

Serve JS in smaller payloads.
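Two of these tips sketched in code; the script paths and the #chart element are hypothetical:

```html
<!-- Defer non-critical scripts so they execute after the document is parsed -->
<script src="/js/analytics.js" defer></script>

<script type="module">
  // Serve JS in smaller payloads: load heavy code on demand via dynamic import
  document.querySelector('#chart')?.addEventListener('click', async () => {
    const { drawChart } = await import('/js/chart.js');
    drawChart();
  });
</script>
```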

Potential SEO Problems: A slow website creates a bad user experience for everyone, even search engines.

Google itself defers the loading of JavaScript to save resources. 

It is therefore important to ensure that whatever is served to clients is coded and delivered efficiently, to help protect rankings.

SEO JavaScript for metadata:

It's important to note that SPAs that use a router package like react-router or vue-router need to take extra steps to handle things like changing meta tags when navigating between router views.

This is usually handled with a Node.js package like vue-meta or react-meta-tags.

How React works, in five steps:

When a user visits a React website, a GET request is sent to the server for the ./index.html file.

The server then sends the index.html page to the client, containing the scripts to launch React and React Router.

The web application is then loaded on the client side.

If a user clicks on a link to go to a new page (/example), a request is sent to the server for the new URL.

React Router intercepts the request before it reaches the server and handles the page change itself.

This is done by locally updating the rendered React components and changing the URL on the client side.

When users or bots follow links to URLs on a React website, they don't receive multiple static HTML files.

React components (like headers, footers, and body content) hosted on the root ./index.html file are simply rearranged to display different content, which is why these are called Single Page Apps.

Potential SEO issues: It's important to use a package like React Helmet to ensure that users receive unique metadata for every page or view when browsing SPAs.

Otherwise, search engines can crawl the same metadata for every page, or worse, none at all!
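A minimal sketch with React Helmet, where ProductPage and its product prop are hypothetical:

```jsx
import React from 'react';
import { Helmet } from 'react-helmet';

function ProductPage({ product }) {
  return (
    <>
      {/* Unique metadata for this view, injected into the document head */}
      <Helmet>
        <title>{product.name} | Example Store</title>
        <meta name="description" content={product.summary} />
      </Helmet>
      <h1>{product.name}</h1>
    </>
  );
}

export default ProductPage;
```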

In order to understand how JavaScript affects SEO, one needs to understand what exactly happens when Googlebot crawls a web page:

Crawl

Render

Index

Googlebot crawls the URLs of its queue, page by page.

The robot sends a GET request to the server, usually using a mobile user agent, and then the server sends the HTML document.

Google decides what resources are needed to render the main content of the page.

Usually only static HTML is parsed, not linked CSS or JS files.

According to Google Webmasters, Googlebot has discovered approximately 130 trillion web pages.

Rendering JavaScript at scale can be expensive.

The computing power required to download, analyze and execute JavaScript in bulk is enormous.

Any unrendered resources are queued for processing by Google's Web Rendering Service (WRS) as computing resources become available.

Google will index any HTML rendered after running JavaScript.

Google's crawling, rendering, and indexing process:

Google crawls and indexes content in two waves:

The first wave of indexing, or instant crawling of static HTML sent by the web server.

The second wave of indexing, or the deferred crawling of any additional content rendered via JavaScript.

Crawl budget is the concept that Google limits how often it crawls a given website due to limited computing resources.

While the time between crawling and rendering has been reduced, there is no guarantee that Google will actually execute pending JavaScript code in its web rendering services queue.

Here are some reasons why Google might never execute the JavaScript code:

The JavaScript resources are blocked in the robots.txt file.

Timeouts.

Errors.
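For instance, a robots.txt rule like the following (the path is hypothetical) would prevent Google from ever executing the app's bundle:

```text
User-agent: *
# Blocking the directory that serves the JS bundle stops rendering entirely
Disallow: /assets/js/
```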

Therefore, JavaScript can cause SEO issues when the main content relies on JavaScript but is not rendered by Google.

Real World Application: JavaScript SEO for Ecommerce.

Ecommerce websites are a real-life example of dynamic content injected through JavaScript.

For example, online stores typically load products onto category pages through JavaScript.

JavaScript can allow e-commerce websites to dynamically update the products on their category pages.

This makes sense because their inventory is constantly changing due to sales.

For e-commerce sites, which rely on online conversions, not having their products indexed by Google could be disastrous.

Here are the steps to proactively diagnose any potential JavaScript SEO issues:

View the page with Google's webmaster tools.

It helps to see the page from Google's point of view.

Use the site: search operator to check the Google index.

Make sure all JavaScript content is properly indexed by manually checking Google.

Debug using Chrome's built-in developer tools.

Compare and contrast what Google sees (source code) with what users see (rendered code) and make sure they line up in general.

There are also handy third-party tools and plugins that one can use.

Google tools for webmasters:

The best way to determine if Google is having technical difficulties when attempting to render pages is to test the pages using Google Webmaster tools, such as:

URL inspection tool in Search Console.

Google's Mobile-Friendly Test.

The goal is simply to visually compare and contrast the content visible in the browser and look for possible discrepancies in what is displayed in the tools.

Both of these Google Webmaster tools use the same evergreen Chromium rendering engine as Google, which means they can give an accurate visual representation of what Googlebot is actually seeing when crawling the website.

There are also third-party technical SEO tools, such as Merkle's Fetch and Render tool.

Unlike Google's tools, this web application offers users a full-size screenshot of the entire page.

The site: search operator:

Alternatively, if one doesn't know whether the JavaScript content is being indexed by Google, one can perform a quick check using the site: search operator on Google.

Copy and paste, after the site: operator, any content that one is not sure Google has indexed (e.g. site:example.com "snippet of injected content").

Chrome Developer Tools:

Another method that one can use to test and debug JavaScript SEO issues is the built-in functionality of the developer tools available in the Chrome web browser.

Right-click anywhere on a web page to display the options menu, then click "View Page Source" to view the static HTML document in a new tab.

One can also click "Inspect" after right-clicking to view the content that is actually loaded into the DOM, including JavaScript-injected content.


Compare and contrast these two perspectives to see if some basic content is only loaded into the DOM, but not hard-coded into the source. 
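One quick way to grab the rendered version for such a comparison (this works in Chrome's console, where copy() is a built-in utility) is:

```javascript
// Copies the fully rendered DOM to the clipboard for diffing against view-source
copy(document.documentElement.outerHTML);
```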

There are also third-party Chrome extensions that can help one out, like Chris Pederick's Web Developer plugin or Jon Hogg's View Rendered Source plugin.

There are a few different implementations of JavaScript rendering that are more search-friendly than client-side rendering and avoid offloading JS onto both users and crawlers:

Server-side rendering (SSR) means that JS is executed on the server for every request.

One way to implement SSR is to use a Node.js library like Puppeteer.
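A minimal sketch of this approach, using Puppeteer to execute a page's JavaScript on the server and return the rendered HTML (the URL and wait condition are assumptions):

```javascript
const puppeteer = require('puppeteer');

async function renderPage(url) {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'networkidle0' }); // wait for JS to settle
  const html = await page.content();                   // the rendered DOM as HTML
  await browser.close();
  return html;
}

renderPage('https://example.com').then((html) => console.log(html.length));
```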

Hybrid rendering is a combination of server-side and client-side rendering: the main content is rendered on the server side before being sent to the client.

All additional resources are offloaded to the client.

With dynamic rendering, the server detects the user agent of the client making the request and serves pre-rendered static HTML to search engine bots.

All other user agents will need to render their content client-side. 

Google Webmasters recommends a popular open-source solution called Rendertron for implementing dynamic rendering.
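A sketch of dynamic rendering with Express and the rendertron-middleware package, where the Rendertron instance URL is an assumption:

```javascript
const express = require('express');
const rendertron = require('rendertron-middleware');

const app = express();

// Bot user agents are proxied to Rendertron for pre-rendered HTML;
// regular visitors fall through to the normal client-side app.
app.use(rendertron.makeMiddleware({
  proxyUrl: 'https://my-rendertron-instance.example.com/render',
}));

app.use(express.static('dist')); // the SPA bundle for human visitors
app.listen(8080);
```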

Incremental static regeneration (ISR) is the updating of static content after a site has already been deployed.

This can be done with frameworks like Next.js for React or Nuxt.js for Vue.

These frameworks have a build process that will pre-render every page of the JS application into static assets that one can serve from something like an S3 bucket.

The site can benefit from all the SEO benefits of server-side rendering, without server management!
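As a sketch, incremental static regeneration in Next.js looks roughly like this; the page path and getProducts() data fetcher are hypothetical:

```jsx
// pages/category.js — a hypothetical product listing page
async function getProducts() {
  // Placeholder for a real data source (CMS, database, API)
  return [{ id: 1, name: 'Example product' }];
}

export async function getStaticProps() {
  const products = await getProducts();
  return {
    props: { products },
    revalidate: 60, // re-generate this static page at most once per minute
  };
}

export default function CategoryPage({ products }) {
  return (
    <ul>
      {products.map((p) => (
        <li key={p.id}>{p.name}</li>
      ))}
    </ul>
  );
}
```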

Each of these solutions helps ensure that when search engine spiders make requests to crawl HTML documents, they receive fully rendered versions of web pages.
