Google Is Requiring JavaScript To Block SEO Tools

Google has made a change to how its search results are served, which will also help secure them against bots and scrapers. Whether this will have a further effect on SEO tools, or whether they can work around it with a headless Chrome browser that executes JavaScript, remains an open question for now, but it’s likely that Google is using rate limiting to throttle how many pages can be requested within a set period of time.

Google Search Now Requires JavaScript

Google quietly updated its search to require all users, including bots, to have JavaScript turned on when searching.

Browsing Google Search with JavaScript turned off results in the following message:

Turn on JavaScript to keep searching
The browser you’re using has JavaScript turned off. To continue your search, turn it on.

Screenshot Of Google Search JavaScript Message

In an email to TechCrunch, a Google spokesperson shared the following details:

“Enabling JavaScript allows us to better protect our services and users from bots and evolving forms of abuse and spam, …and to provide the most relevant and up-to-date information.”

JavaScript likely enables personalization in the search experience, which may be what the spokesperson means by providing the most relevant information. But JavaScript can also be used to block bots.
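Google hasn’t published details of how its script detects bots, but as a general illustration, client-side JavaScript can read browser signals that tend to differ in automated sessions. The TypeScript sketch below is a hypothetical example of that general technique, not Google’s code; the isLikelyBot helper and its scoring rule are assumptions, while navigator.webdriver, navigator.languages, and navigator.plugins are standard browser APIs.

```typescript
// Hypothetical illustration: signals a page script could check to flag
// automated browsers. This is NOT Google's code; it is a generic sketch.

interface BotSignals {
  webdriver: boolean;   // true in most automation frameworks (Selenium, Puppeteer)
  noLanguages: boolean; // headless setups sometimes report an empty language list
  noPlugins: boolean;   // headless Chrome historically exposed zero plugins
}

function collectSignals(): BotSignals {
  return {
    webdriver: navigator.webdriver === true,
    noLanguages: navigator.languages.length === 0,
    noPlugins: navigator.plugins.length === 0,
  };
}

// Assumed scoring rule: any single strong signal is enough to challenge the client.
function isLikelyBot(signals: BotSignals): boolean {
  return signals.webdriver || (signals.noLanguages && signals.noPlugins);
}

const signals = collectSignals();
if (isLikelyBot(signals)) {
  // A real system might serve a challenge or throttle the session here.
  console.log("Automation signals detected", signals);
}
```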

Using the latest version of Chrome, I copied some of the JavaScript and ran it through ChatGPT to ask what it does. One part of it may relate to restricting abusive requests for documents.

Screenshot Of Chrome Dev Tools

ChatGPT gave me the following feedback:

“Core Functionalities
Randomized Value Generation (rdb)

Generates a random value based on properties (D_d, idc, and p4b) of the input object a, constrained by p7d.
This may be used for rate-limiting, exponential backoff, or similar logic.

Purpose and Context
From its components, the script:

Likely handles request retries or access control for web resources.

Implements a policy enforcement system, where:

Policies determine if requests are valid.

Errors are logged and sometimes retried based on rules.

Randomized delays or limits might control the retry mechanism.

Appears optimized for error handling and resilience in distributed or high-traffic systems, possibly within a Google service or API.”

ChatGPT said that the code may use rate-limiting, which is a way to limit the number of actions a user or a system can take within a specific time period.

Rate-Limiting:

Used to enforce a limit on the number of actions (e.g., API requests) a user or system can perform within a specific time frame.
In this code, the random values generated by rdb could be used to introduce variability in when or how often requests are allowed, helping to manage traffic effectively.
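To make that concrete, here is a minimal sketch of a fixed-window rate limiter with a randomly jittered allowance, written in TypeScript. The window size, base limit, and jitter range are illustrative assumptions, not values taken from Google’s script.

```typescript
// Minimal sketch of a fixed-window rate limiter with random jitter.
// All names and limits are hypothetical; this is not Google's implementation.

const WINDOW_MS = 60_000; // one-minute window
const BASE_LIMIT = 100;   // baseline requests allowed per window

const counts = new Map<string, { windowStart: number; count: number }>();

function allowRequest(clientId: string, now: number = Date.now()): boolean {
  // Jitter the effective limit so the cutoff is harder to probe precisely.
  const jitteredLimit = BASE_LIMIT + Math.floor(Math.random() * 20) - 10;

  const entry = counts.get(clientId);
  if (!entry || now - entry.windowStart >= WINDOW_MS) {
    counts.set(clientId, { windowStart: now, count: 1 });
    return true;
  }

  entry.count += 1;
  return entry.count <= jitteredLimit;
}

// Usage: call allowRequest per incoming request and reject when it returns false.
console.log(allowRequest("client-123")); // true until the jittered limit is hit
```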

Exponential Backoff:

ChatGPT explained that exponential backoff is a way to control how often a user or system can retry a failed action: the waiting period between retries increases exponentially after each failure.
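As a concrete sketch of exponential backoff, the TypeScript below doubles the delay after each failed attempt and adds random jitter so clients don’t all retry in lockstep. The base delay, retry cap, and jitter factor are assumptions for illustration, not values from the script ChatGPT analyzed.

```typescript
// Sketch of exponential backoff with jitter. Base delay, cap, and retry
// count are illustrative assumptions, not values from Google's script.

const BASE_DELAY_MS = 500;
const MAX_RETRIES = 5;

function backoffDelay(attempt: number): number {
  const exponential = BASE_DELAY_MS * 2 ** attempt; // 500, 1000, 2000, ...
  const jitter = Math.random() * exponential * 0.2; // up to 20% random spread
  return exponential + jitter;
}

async function fetchWithBackoff(url: string): Promise<Response> {
  for (let attempt = 0; attempt < MAX_RETRIES; attempt++) {
    try {
      const response = await fetch(url);
      if (response.ok) return response;
    } catch {
      // Network error: fall through to the retry delay below.
    }
    await new Promise((resolve) => setTimeout(resolve, backoffDelay(attempt)));
  }
  throw new Error(`Gave up after ${MAX_RETRIES} attempts: ${url}`);
}
```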

Similar Logic:

ChatGPT explained that random value generation could be used to manage access to resources and prevent abusive requests.
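One common technique along those lines is probabilistic load shedding, where a random draw decides whether a request is admitted once traffic passes a threshold. The sketch below shows that generic pattern; the thresholds and the load measure are assumptions, and nothing here is confirmed about Google’s implementation.

```typescript
// Generic sketch of probabilistic load shedding: above a traffic threshold,
// each request is admitted with a probability that shrinks as load grows.
// Thresholds and the load measure are illustrative assumptions.

const SOFT_LIMIT = 100; // requests/sec where shedding begins
const HARD_LIMIT = 500; // requests/sec where nearly everything is rejected

function shouldAdmit(currentRatePerSec: number): boolean {
  if (currentRatePerSec <= SOFT_LIMIT) return true;
  if (currentRatePerSec >= HARD_LIMIT) return false;

  // Admission probability falls linearly from 1 to 0 between the limits;
  // the random draw makes the cutoff unpredictable to an abusive client.
  const admitProbability =
    (HARD_LIMIT - currentRatePerSec) / (HARD_LIMIT - SOFT_LIMIT);
  return Math.random() < admitProbability;
}

console.log(shouldAdmit(50));  // always true below the soft limit
console.log(shouldAdmit(300)); // admitted with ~50% probability
```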

I don’t know for certain that this is what that specific JavaScript is doing; it’s what ChatGPT explained, and it matches Google’s statement that JavaScript is part of its strategy for blocking bots.
