Today I bring you an article + video that I know will stir up some controversy and ruffle a few feathers, but it is something we have known about for a while, and after trying to find a solution (we are still at it in the Webpositer experimentation department), I wanted to share it with you.
But first, a little theory:
The cache (and not-so-cache) plugins that we use:
There is some controversy here: the name "cache" has become so standardized that nowadays any plugin that improves the WPO gets called a cache plugin. We are no exception, so we fall into the same error and generalize with the names. Let's go through the "cache" plugins we use most on a day-to-day basis:
- W3 Total Cache
- As Álvaro de Raiola says, "this is like killing flies with a bazooka" (it is essential to read it with a Galician accent). This plugin is one of the most complete, but a bad configuration can cause excessive resource consumption that ends up "degrading" the WPO in terms of server performance.
- Autoptimize
- This particular plugin is more of a code minifier and unifier than a cache plugin, but it does also have that function (simple as it is, it caches something) and it is incredibly easy to use, so on with it!
- WP Fastest Cache
- Another plugin that gives good results for how little work it takes to configure. Honestly, if it were not for Autoptimize and WP Fastest Cache, it would be tremendously difficult for some people to have their site cached and minified.
- WP Rocket
- Today the best option, but like any good option it makes you part with some "lereles", specifically 39 of them per site. Quality costs money, and so does the lack of it, but which costs more in the long run? We like WP Rocket so much because, with the four settings it has, it delivers the same effectiveness as W3 Total Cache and WP Fastest Cache. Impressive. They also have great support: they understand the money you leave on this blessed plugin and that it has to work perfectly.
- WP SuperCache
- This plugin is really in the doldrums; it lags far behind its competitors because it is old and has made bad decisions about cache management, but we still hold it in our memory, and that is why we mention it!
What do (or should) cache plugins (or WPO suites) do?
As we mentioned at the beginning, if we stick strictly to the cache concept, we should only look at:
- Server Cache
- The plugin executes the PHP and the database queries and generates static HTML in plain text. In other words, it does the work of building the entire page and stores the result on the server before any user requests it.
- Client-side cache
- This point is one of the most important: it sets how long downloaded files stay stored on the user's machine so they do not have to be downloaded again the next time they are needed. That is, the browser checks whether it already has the information saved and, if so, loads it from the computer itself and displays it, without downloading it again.
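On the client side, these lifetimes are usually set through Apache's mod_expires module in the .htaccess. A generic example of the kind of rules such plugins write (the one-year lifetimes here are illustrative, not any particular plugin's exact output):

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  # Tell browsers to keep static assets cached for a long time
  ExpiresByType text/css "access plus 1 year"
  ExpiresByType application/javascript "access plus 1 year"
  ExpiresByType image/png "access plus 1 year"
</IfModule>
```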
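The server-cache idea above can be sketched in a few lines. This is a minimal illustration in Python rather than WordPress's actual PHP; the `render_page` function, the `cache/` directory, and the one-hour lifetime are all hypothetical stand-ins:

```python
import os
import time

CACHE_DIR = "cache"
TTL = 3600  # hypothetical: keep a cached page for one hour


def render_page(slug):
    # Stand-in for what WordPress really does: run PHP templates
    # and query the database to produce the page's HTML.
    return f"<html><body>Content for {slug}</body></html>"


def get_page(slug):
    """Serve a pre-built static HTML file if a fresh copy exists;
    otherwise build the page once and store it for later requests."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    path = os.path.join(CACHE_DIR, f"{slug}.html")
    if os.path.exists(path) and time.time() - os.path.getmtime(path) < TTL:
        with open(path) as f:      # cache hit: no PHP/DB work at all
            return f.read()
    html = render_page(slug)       # cache miss: build the page
    with open(path, "w") as f:     # store it as plain static HTML
        f.write(html)
    return html
```

The second request for the same page is served straight from the static file, which is exactly the saving these plugins are after.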
And some other functions that we associate with cache plugins (to simplify the term) and that affect the WPO very directly:
- Minify the code
- Remove everything in the code that the browser does not need (or does not want) to process, such as HTML comments, spaces, and line breaks.
- Unify CSS and JS
- This technique is what brings us here today, since, on paper, it is a great option: reduce the number of CSS and JS files as much as possible (always keeping an eye on the weight of the resulting file). This is an improvement not only in weight and page load, but also in file processing for Googlebot (GBOT): the fewer JS files we serve, the less JavaScript GBOT has to try to interpret, and the fewer CSS files, the less PageSpeed has to load to check the accessibility of our website. As we say, on paper this works perfectly, but the vast majority of plugins (at least the ones we have tested) fail dramatically at this point due to sheer nonsense.
- These plugins generate URLs with unique character strings (hashes) every time they minify and unify the code. They do this for a very clear reason: they set rules in the .htaccess telling browsers to keep the files cached for years (just to be safe), and the browser uses the file name to decide whether it has already downloaded a file. If we change a CSS (or JS) file, the browser sees a CSS it has already downloaded, and our .htaccess tells it the file must stay cached for years, so that CSS will not be downloaded again until the cache is cleared or the time set in the .htaccess runs out; some visitors could be seeing a broken site for months. To solve this, these plugins started working with hashes, unique alphanumeric strings, forcing the browser to always download the files again because their names change, so a stale copy is never kept in the browser.
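Minification, unification, and hash naming can be illustrated together. This is a rough Python sketch of the idea, not any plugin's real code; the regexes are deliberately naive and the 10-character MD5 prefix is an arbitrary choice:

```python
import hashlib
import re


def minify_css(css):
    # Strip comments, collapse whitespace, drop spaces around punctuation.
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)
    css = re.sub(r"\s+", " ", css)
    css = re.sub(r"\s*([{}:;,])\s*", r"\1", css)
    return css.strip()


def unify(files):
    # Concatenate the minified sources into a single bundle.
    return "".join(minify_css(c) for c in files)


def hashed_name(content):
    # Name the bundle after a hash of its content, so a changed bundle
    # gets a brand-new URL and the browser is forced to re-download it.
    return f"styles.{hashlib.md5(content.encode()).hexdigest()[:10]}.css"


a = "/* header */\nbody {\n  margin: 0;\n}\n"
b = "h1 { color: red; }"
bundle = unify([a, b])
print(bundle)               # body{margin:0;}h1{color:red;}
print(hashed_name(bundle))  # e.g. styles.<10 hex chars>.css
```

Two source files become one small cached file, and any edit to either source changes the bundle's name.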
The problems with these plugins
The first problem
Since the main characteristic of these plugins is that they clear the cache every X days (or even hours), new files are generated with different URLs. When the browser compares the URL it has stored in its cache with the one the server offers, it finds a totally new file and downloads it again. So if we have the plugin configured to clear the cache, say, once a day (and, as mentioned, it can be set to do this every X hours), the next time a user visits my website they will have to download it completely: the exact opposite of the goal, which is to download as little information as possible.
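Deriving the name from the file's content, rather than from a fresh per-rebuild token, avoids this first problem: the URL only changes when the file itself changes. A quick sketch of that alternative (the path and naming scheme here are made up for illustration):

```python
import hashlib


def bundle_url(css_source):
    # Derive the file name from the *content*, not from a random
    # per-rebuild token: identical content always maps to the same URL.
    digest = hashlib.sha1(css_source.encode()).hexdigest()[:10]
    return f"/wp-content/cache/styles.{digest}.css"


site_css = "body{margin:0}"

# Rebuilding the cache without touching the CSS keeps the URL stable,
# so returning visitors keep reusing their cached copy...
assert bundle_url(site_css) == bundle_url(site_css)

# ...while an actual edit produces a new URL and forces exactly one
# re-download, which is the behavior we wanted from hashes all along.
assert bundle_url("body{margin:1em}") != bundle_url(site_css)
```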
And the second problem:
By cleaning the cache in this indiscriminate way, the plugins generate new files with new URLs, creating a new file for GBOT to crawl. Since we give that file great importance by "linking" to it from every page of our website, it acquires very high relevance (on top of how important CSS and JS already are in themselves for Google to understand our website), and GBOT crawls and processes it very eagerly. As we already know, if we tell GBOT that one file is more important than another, GBOT will, as far as possible, crawl that file more frequently, accumulating a huge number of events on it.
The problem is not that a CSS or JS file accumulates many events (reads by Google); that is what they are there for. The problem is that when the cache is cleared, these files, which were the most important yesterday, cease to exist (in some cases returning a 404 and in others a "false" 200), and Google has to crawl new ones. But since GBOT is stubborn, it will not stop crawling the files that no longer exist while it also starts crawling and prioritizing the new ones. This causes an immense amount of crawling on these broken URLs and sidelines the pages we are actually trying to rank.
This problem mainly affects sites that are new or have a low crawl frequency: if I have few crawl events and they end in 404s or "false" 200s, my pages will take a long time to be indexed and Google will treat them as low priority.
Cache Plugin Analysis – [VIDEO]
So, is there a solution? From the Webpositer experimentation department, and as the marketing professionals we are, we will cover that in a future installment (so stay tuned). For now, the only thing we can tell you is: if your website is relatively new and does not take long to load (less than 3 seconds), forget about unifying CSS and JS for the time being; the cure is worse than the disease, and we tell you this from our own experience.
While we prepare the best WPO guide ever told, we leave you the comment box to drop the topics you would like us to cover and any questions you have about WPO in general.