No Code Attached Yet bug
bato3 - 25 Aug 2020

Steps to reproduce the issue

Default PHP 7.4 configuration from XAMPP,
389597 rows in #__finder_terms (127.9 MB on disk)

php cli\joomla.php -vvv --table=jos37_finder_terms database:export

Actual result

Exporting Database
==================

 Processing the jos37_finder_terms table
PHP Fatal error:  Allowed memory size of 536870912 bytes exhausted (tried to allocate 67108872 bytes) in C:\xampp\htdocs\libraries\vendor\joomla\database\src\DatabaseExporter.php on line 299

Fatal error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 67108872 bytes) in C:\xampp\htdocs\libraries\vendor\joomla\database\src\DatabaseExporter.php on line 299
Symfony\Component\ErrorHandler\Error\OutOfMemoryError^ {#390689
  -error: array:4 [
    "type" => 1
    "message" => "Allowed memory size of 536870912 bytes exhausted (tried to allocate 67108872 bytes)"
    "file" => "C:\xampp\htdocs\libraries\vendor\joomla\database\src\DatabaseExporter.php"
    "line" => 299
  ]
  #message: "Error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 67108872 bytes)"
  #code: 0
  #file: "C:\xampp\htdocs\libraries\vendor\joomla\database\src\DatabaseExporter.php"
  #line: 299
}

System information (as much as possible)

  • PHP 7.4.9 (cli) (built: Aug 4 2020 11:52:41) ( ZTS Visual C++ 2017 x64 )
  • Joomla! 4.0.0-beta3 (debug: No) (not fully upgraded, but I put the files in place via FTP; database version: 4.0.0-2020-05-29)

Additional comments

I know that I can reconfigure the server, but I think we should find a solution that requires less memory.

bato3 - open - 25 Aug 2020
joomla-cms-bot - change - 25 Aug 2020
Labels Added: ?
joomla-cms-bot - labeled - 25 Aug 2020
Hackwar - change - 20 Feb 2023
Labels Added: No Code Attached Yet bug
Removed: ?
Hackwar - labeled - 20 Feb 2023
Hackwar - comment - 29 Aug 2023

I've opened an issue on the database repo: joomla-framework/database#287

Hackwar - comment - 2 Apr 2024

So the solution to this issue would be to stagger reading and writing the table data during export: limit the number of rows read per query, write that batch to a file resource, then read the next batch and write it out again until the whole table is done. This is a bigger refactoring and I'm unsure whether our Filesystem package properly supports this.
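For illustration only, here is a minimal sketch of such a staggered export loop. It assumes a joomla/database DatabaseDriver instance in $db, and it writes one JSON line per row instead of the exporter's real XML format; the batch size, file name and serialisation are hypothetical stand-ins, not the actual DatabaseExporter code.

<?php
// Hypothetical chunked export loop, not the real DatabaseExporter.
// Assumes $db is a joomla/database DatabaseDriver; the output format
// (one JSON-encoded row per line) is a simplified stand-in.

$batchSize = 1000;   // rows read per query (illustrative value)
$offset    = 0;

$handle = fopen('finder_terms_export.jsonl', 'wb');

do {
    $query = $db->getQuery(true)
        ->select('*')
        ->from($db->quoteName('#__finder_terms'));

    // Read only one batch of rows into memory at a time.
    $db->setQuery($query, $offset, $batchSize);
    $rows = (array) $db->loadAssocList();

    foreach ($rows as $row) {
        // Write each row out immediately so peak memory stays
        // proportional to the batch size, not the table size.
        fwrite($handle, json_encode($row) . PHP_EOL);
    }

    $offset += $batchSize;
} while (count($rows) === $batchSize);

fclose($handle);

The point of the loop is that memory use is bounded by $batchSize rows rather than by the 389597 rows of the whole table, which is what currently exhausts the 512 MB limit.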
