Post by amirmukaddas on Mar 11, 2024 0:14:56 GMT -5
A few days ago I read a post on LinkedIn from a very talented colleague, who was genuinely surprised that optimizing Core Web Vitals had led to a notable increase in organic impressions. Let's think about that. The first question to ask is why he was surprised at all, since he is very good (seriously) and "should know these things". The truth is that a good Core Web Vitals score does not always translate into better rankings and a consequent increase in traffic. So where is the distinction? What does it actually mean to have good performance? Essentially, you need two things: clean, lightweight code, and adequate server resources. In theory Core Web Vitals only concern the first point, but in practice everything is connected. The real million-dollar question is: if your website ranked really well for the top search terms in its semantic field, how many concurrent hits would it receive? Note, NOT how many visits, but how many simultaneous accesses; there is a difference. Suppose your website deserves maximum organic visibility for a search area destined to concentrate a lot of traffic in a short period of time.
To give an example close to our times, let's pretend an election is about to close and that you have the most authoritative web page, in terms of content and backlinks, for the real-time results. In such cases, visibility is not assigned based only on quality and authority, but also on performance. Google cannot give you visibility for a search area that can bring 50,000 users to the site at the same time if your server can handle a maximum of 1,000. The example is deliberately extreme, but it makes the point: no matter how much you rack your brains over the texts and the architecture of the site, Google knows in advance whether or not you can sustain certain positions, regardless of whether you deserve them. It knows because, based on response speed, it assigns a crawl rate to every website it crawls. Remember that the crawl rate is the number of requests the crawler makes to the site in a given interval, measured in seconds.
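To make the arithmetic concrete, here is a minimal back-of-the-envelope sketch in TypeScript, using the illustrative numbers from the example above. The average response time is an assumption I've added for the sake of the calculation, not a measured figure:

// Rough capacity check: can the server absorb the traffic top visibility would send?
// All figures are illustrative assumptions taken from the example in the post.
const maxConcurrent = 1_000;   // simultaneous accesses the server can hold
const avgResponseSec = 0.5;    // ASSUMED average response time per request

// Little's law: throughput = concurrency / time each request spends in the system
const sustainableRps = maxConcurrent / avgResponseSec; // -> 2,000 requests/second

const peakDemand = 50_000;     // simultaneous users maximum visibility could send
console.log(`server sustains ~${sustainableRps} req/s, peak demand is ${peakDemand} concurrent users`);
// When demand outstrips capacity, response times climb, and Googlebot reacts
// by lowering the crawl rate it assigns to the site.

The exact numbers matter less than the ratio: if peak demand is fifty times what the server can absorb, no amount of on-page quality closes that gap.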
What (and when) do Core Web Vitals have to do with it? They matter to the extent that they reduce the response time per access, by working on: JS and CSS dependencies (merging and deferring); the weight of the images; render-blocking elements that slow down the loading of the first part of the page; the elements that cause layout shift; the size of the DOM. But if your site deals with the topic "sale of haunted castles in Molise" and has pages that open in 3 seconds, on server resources that handle 1,000 simultaneous accesses without problems, bringing the fully loaded time down to a second and a half will not improve your rankings, because Google does not need your site to be any more performant than it already is. In these cases we tend to say that working on performance is useless, a lot of effort for nothing.
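For completeness, here is a minimal sketch of how two of those signals (LCP and layout shift) can be observed in the browser with the standard PerformanceObserver API. The logging and the LayoutShiftEntry interface are mine, but the entry types are the ones Chromium-based browsers actually expose:

// Largest Contentful Paint: render time of the biggest above-the-fold element.
new PerformanceObserver((list) => {
  const entries = list.getEntries();
  const last = entries[entries.length - 1]; // the final candidate is the LCP
  console.log('LCP (ms):', last.startTime);
}).observe({ type: 'largest-contentful-paint', buffered: true });

// Cumulative Layout Shift: running sum of unexpected shift scores.
interface LayoutShiftEntry extends PerformanceEntry {
  value: number;
  hadRecentInput: boolean; // shifts right after user input don't count
}
let cls = 0;
new PerformanceObserver((list) => {
  for (const entry of list.getEntries() as LayoutShiftEntry[]) {
    if (!entry.hadRecentInput) cls += entry.value;
  }
  console.log('CLS so far:', cls);
}).observe({ type: 'layout-shift', buffered: true });

Measuring first tells you whether there is anything worth fixing; in the haunted-castles scenario above, the numbers would already be good enough, and that is exactly the point.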