User tests: Successful: Unsuccessful:
Smart Search/Finder has a long-standing issue where the #__finder_tokens table can run full and abort the indexing process. That table is of type MEMORY, which can run full; there is an option to reduce the number of tokens stored in that table, but unfortunately it doesn't work in all cases. The MEMORY engine is used because it is a lot faster than a normal InnoDB table, and since the table is only used for temporary storage, write speed is the main factor here. Anyway, enough rambling. ;) This change moves the code of addTokenToDb
into the tokenizeToDbShort()
method and properly counts the number of tokens written to the database. This means that the option mentioned above now works properly.
This removes the addTokenToDb
method, but since it is a protected method in a class of a component, this is still in accordance with our backwards-compatibility promise.
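For illustration, here is a minimal PHP sketch of the idea: count the rows actually written inside the tokenizer loop itself and stop once the limit is reached. This is not the actual Joomla implementation; the function signature, the $memoryTableLimit parameter, and the simplistic whitespace tokenizer are assumptions for this sketch.

```php
<?php
// Minimal sketch, assuming $db behaves like Joomla's DatabaseDriver and
// $memoryTableLimit holds the component's "memory_table_limit" option.
function tokenizeToDbShort(string $text, $db, int $memoryTableLimit): int
{
    $written = 0;

    foreach (preg_split('/\s+/', $text, -1, PREG_SPLIT_NO_EMPTY) as $term) {
        // Stop before the MEMORY table #__finder_tokens can run full.
        if ($written >= $memoryTableLimit) {
            break;
        }

        // Formerly done in the separate addTokenToDb() method; inlining
        // it here lets us count every row that is actually written.
        $db->setQuery(
            'INSERT INTO #__finder_tokens (term, stem) VALUES ('
            . $db->quote($term) . ', ' . $db->quote($term) . ')'
        );
        $db->execute();

        $written++;
    }

    return $written;
}
```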
Actual result before applying this PR: The indexing fails.
Expected result after applying this PR: The indexing works.
Status | New | ⇒ | Pending |
Category | ⇒ | Administration com_finder |
Labels | Added: ? |
Just playing around with this thing: the problem is most likely that the memory_table_limit
parameter is too high, and that the check is in the wrong place.
So first, the checks should be enough when done this way: patch.zip
And second, either set the parameter to a lower number (20,000 works for me) or derive the number from max_heap_table_size: https://dev.mysql.com/doc/refman/8.0/en/server-system-variables.html#sysvar_max_heap_table_size
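A sketch of that second suggestion in PHP: read max_heap_table_size from the MySQL server and derive the token limit from it instead of hard-coding a value. The per-row size estimate below is an assumption for illustration only; the real footprint of a #__finder_tokens row would have to be measured.

```php
<?php
// Assumption: $db behaves like Joomla's DatabaseDriver.
$db->setQuery('SELECT @@max_heap_table_size');
$maxHeapBytes = (int) $db->loadResult();

// Rough, assumed per-row footprint of #__finder_tokens in bytes.
$estimatedBytesPerToken = 400;

// Leave some headroom so the MEMORY table never actually runs full.
$memoryTableLimit = (int) floor($maxHeapBytes / $estimatedBytesPerToken * 0.9);
```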
It seems this will not make it into 4.1.0, so I moved it to 4.1.1 so it isn't forgotten; it should be fixed ASAP.
Status | Pending | ⇒ | Closed |
Closed_Date | 0000-00-00 00:00:00 | ⇒ | 2022-01-20 08:08:08 |
Closed_By | ⇒ | Hackwar |
@Hackwar I have tested this PR, but it doesn't change anything for me. I get the error both with and without the PR applied when trying to save an article containing the text from the site linked in the description of issue #34377 (https://whiletrue.neocities.org/lte.html) repeated some 100 or 200 times.