Every query against the content table forces the database server to scan rows without using an index. These queries are very expensive: any site with over 1,000 articles will be slow, and I have seen sites with 10K articles that perform terribly. We should optimize the queries so that every condition in the WHERE clause references an indexed column.
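To illustrate the point, here is a hedged sketch of the pattern. The query shape and the index name are assumptions for illustration; only the column names follow the Joomla `#__content` schema:

```sql
-- Hypothetical lookup on the content table. Without an index on the
-- filtered columns, the server must scan every row of the table.
SELECT `id`, `title`
FROM `#__content`
WHERE `state` = 1 AND `catid` = 42;

-- Indexing the columns used in the WHERE clause lets the server seek
-- directly to the matching rows instead of scanning:
ALTER TABLE `#__content` ADD INDEX `idx_state_catid` (`state`, `catid`);
```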
A general consideration on these queries:
Because the variable :nowDate changes every second, it prevents such queries from being cached at all. MySQL servers (especially MariaDB) have great caching abilities and return results for repeated queries instantly, but queries like these thrash the cache, which is a pity. In general, such queries should not exist: when an article is published and there is no plan to unpublish it (or vice versa), such conditions should not be needed at all. If you implement this, you will speed up Joomla! dramatically. Now, while you are preparing a new major version, you have a chance to make Joomla! great again :)
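A sketch of the pattern being criticised (the exact Joomla query is not shown here; this is an assumed shape using the `#__content` publish-window columns):

```sql
-- The bound value :nowDate changes every second, so the effective query
-- never repeats. A result cache keyed on the query text could never serve
-- this from cache - every request looks like a brand-new query.
SELECT `id`, `title`
FROM `#__content`
WHERE `state` = 1
  AND (`publish_up`   IS NULL OR `publish_up`   <= :nowDate)
  AND (`publish_down` IS NULL OR `publish_down` >= :nowDate);
```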
Sorry, I was wrong and deleted my comment; let's start again:
I have talked about per-second publishing being a barrier to caching for a decade now. However, MySQL 8 then retired support for the Query Cache, making it a moot point in the end, and MariaDB is not an officially supported database type for Joomla (yes, I know it works; yes, I know it is a fork of MySQL; yes, I know).
This query is run when loading a CATEGORY BLOG menu item
on the frontend of Joomla 4.0-dev.
So create a menu item of the category blog type, pointing to a category with a LOT of articles in it.
With 10,000 (9,164) articles in one category, running EXPLAIN on the query shows what is actually happening: the query has to search through all of them in order to return just 9 rows (the LIMIT). There is no index - you are right.
After applying this crazy index
ALTER TABLE `#__content` ADD INDEX (`state`, `access`, `catid`, `publish_up`, `publish_down`, `created`);
You can see that the number of rows to trawl through is considerably lower and the index is being used. I don't think this is the right way to resolve this, though, because retrieving over 4,000 rows just to display 9 articles still doesn't seem right.
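The before/after comparison above can be reproduced along these lines (a hedged sketch; the literal values and ORDER BY are assumptions, the column names follow the `#__content` schema and the composite index shown above):

```sql
-- Run EXPLAIN on a query shaped like the category-blog lookup:
EXPLAIN SELECT `id`, `title`
FROM `#__content`
WHERE `state` = 1 AND `access` = 1 AND `catid` = 42
ORDER BY `created` DESC
LIMIT 9;

-- Before the index: the plan shows type=ALL and rows close to the full
-- table size (a complete scan just to return 9 articles).
-- After the index:  the plan references the composite index and the
-- estimated rows drop sharply - though, as noted, still thousands of
-- rows examined for 9 results.
```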
(Developers - do you want 10,000 articles? Download Overload, put it in your /cli folder, run composer install in the cli folder, then run php overload.php --articles-count=10000
and go and make a coffee. Once it's done, email @nikosdion and say thank you for his contribution to your workflow.)
@PhilETaylor maybe your composite index is not used for all lookups because it can be used only with a leftmost prefix: https://dev.mysql.com/doc/refman/8.0/en/multiple-column-indexes.html
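The leftmost-prefix rule can be illustrated against the composite index from the ALTER TABLE above (hypothetical lookups, for illustration only):

```sql
-- Index: (`state`, `access`, `catid`, `publish_up`, `publish_down`, `created`)

-- Served by the index: the WHERE columns form a leftmost prefix.
SELECT `id` FROM `#__content`
WHERE `state` = 1;                                   -- prefix (state)

SELECT `id` FROM `#__content`
WHERE `state` = 1 AND `access` = 1 AND `catid` = 5;  -- prefix (state, access, catid)

-- NOT served by the index: the lookup skips the leading columns,
-- so MySQL cannot use this composite index for it.
SELECT `id` FROM `#__content`
WHERE `publish_up` <= NOW();
```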
@richard67 please consider this as a release blocker.
@PhilETaylor I've discussed that with some of the other maintainers, and we agree that this is not a release blocker.
LMAO. Really? wow... I guess that Joomla 4 is only a hobby platform now then, and cannot be used on real sites with large amount of content. I give up.
That doesn't mean it shouldn't be fixed. I have no idea why you always respond like this to any decision that doesn't reflect your opinion.
This is not about opinion. This is about facts.
Unless it's a release blocker, no one will fix it before Joomla 4 is released. Fact.
no idea why you always respond like this to any decision
Because you (you="maintainers") constantly make BAD decisions for the project that I have been contributing to since Mambo 4 days and treat me like my "experience" and "opinion" mean jack shit zero to you, or this project....
So the decision not to treat this as a release blocker means, potentially, that Joomla 4 will be released unable to perform well on large sites... and you maintainers are perfectly happy with that...
Labels | Added: No Code Attached Yet, bug | Removed: ?
It's not scalable. Googlebot will overload this database when it comes to crawl 22,620 items.