Allowed memory size of 134217728 bytes exhausted #1728
Comments
Can anyone help me?
4 days, no answer :-(
Who do you expect an answer from? Our software is free and open source, meaning that the use of our software is optional. We hold no liability and there is no obligation to support. We will provide support on a best-effort basis. If you use the software commercially and need elaborate support or need it urgently, we can offer this on a commercial basis. Please contact info@maatwebsite.nl or via phone +31 (0)10 744 9312.
OK, so please don't close this issue. Maybe someone can fix this. Thanks.
I think it's a PHP memory problem; maybe there is too much data and it overflows the memory. It has no relation to the library.
@jlcarpioe I have almost 200k rows. The problem occurs when appending rows to the sheet.
Did you try to increase memory_limit in php.ini?
@bagana89 That's not a good solution.
I cannot reproduce your problem. I'm able to export a users table of 300K rows using the code you shared. Do note that memory usage will increase in every job, as PhpSpreadsheet has to open the workbook, which is getting bigger every time. There's nothing wrong with assigning some more memory for this process. It seems you don't have a lot of memory assigned; that's why it overflows so quickly. Best to drop the
I have 1 GB of RAM allocated and still get the same result as saeedvaziry.
It seems that chunking doesn't work well when exporting (using FromQuery): it uses a massive amount of memory, up to 3 GB for me for about 200k records. But importing works fine using chunking (memory never exceeds 50 MB).
I only have 15 thousand records and it gave me the same error. What can I do?
This is the error:

```
[2019-11-24 22:39:59] local.ERROR: Allowed memory size of 134217728 bytes exhausted (tried to allocate 18874368 bytes) {"exception":"[object] (Symfony\Component\Debug\Exception\FatalErrorException(code: 1): Allowed memory size of 134217728 bytes exhausted (tried to allocate 18874368 bytes) at C:\wamp64\www\.....\vendor\phpoffice\phpspreadsheet\src\PhpSpreadsheet\Collection\Cells.php:421)
```
You will need to increase the allowed memory limit in your php.ini or set it dynamically using ini_set().
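For what it's worth, the per-script override can be done without touching php.ini. A minimal sketch (the `1G` value is illustrative, not a recommendation):

```php
<?php
// Raise the memory limit for this script/process only; php.ini stays untouched.
ini_set('memory_limit', '1G');

echo ini_get('memory_limit'), PHP_EOL; // "1G"
```

Note that this only moves the ceiling; if the export's memory use grows with row count, a bigger limit merely delays the error.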
I did. I have 1G, but it doesn't work.
When you run the process, how much memory does the php-cli process consume? It must be exceeding 1 GB then.
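One way to answer that from inside PHP, rather than watching the OS process, is to log the peak usage at the end of the run (or at the end of each queued job). A small sketch:

```php
<?php
// memory_get_peak_usage(true) reports the peak amount of memory
// allocated from the system for this process, in bytes.
$peak = memory_get_peak_usage(true);
printf("Peak memory: %.1f MB\n", $peak / 1048576);
```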
The memory limit is definitely not the problem. It's set to 4GB according to phpinfo and I still have this issue. |
I have the same problem.
A 'solution' would be to split your file into multiple ones, releasing memory between them, and then merge all the files and send the merged file as the response.
The same issue: memory limit is 512 MB, 4K rows.
Final solution: in that case, don't implement ToModel or ToCollection; you need to bypass that process and carry out the operation manually by implementing OnEachRow. PS: If you still get the memory limit error, simply add a return statement as the last line. Thank you.
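For reference, an OnEachRow import along those lines might look like the sketch below; the class name, model, and column indexes are made up for illustration:

```php
<?php

namespace App\Imports;

use App\Models\User;
use Maatwebsite\Excel\Concerns\OnEachRow;
use Maatwebsite\Excel\Row;

class UsersImport implements OnEachRow
{
    public function onRow(Row $row): void
    {
        $data = $row->toArray();

        // Handle one row at a time instead of accumulating a full
        // collection, so only the current row is held in memory.
        User::create([
            'name'  => $data[0],
            'email' => $data[1],
        ]);
    }
}
```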
I had the same issue, and with the suggestions from @MoFoLuWaSo I reduced my 128+ MB memory usage to 54 MB.
In the case of @saeedvz it should look like this:

```php
<?php

namespace App\DataTransferObjects;

class OldDepositRow
{
    public int $id;
    public string $created_at;
}
```

and

```php
<?php

namespace App\Exports;

use App\DataTransferObjects\OldDepositRow;
use App\Models\User;
use Illuminate\Contracts\Queue\ShouldQueue;
use Maatwebsite\Excel\Concerns\Exportable;
use Maatwebsite\Excel\Concerns\FromCollection;
use Maatwebsite\Excel\Concerns\WithHeadings;

class OldDepositExport implements FromCollection, ShouldQueue, WithHeadings
{
    use Exportable;

    public function headings(): array
    {
        return [
            'ID',
            'Created at',
        ];
    }

    public function collection()
    {
        $users = User::query()
            ->where('status', '=', 1)
            ->select(['id'])
            ->get();

        return $users->map(
            function ($user) {
                $row = new OldDepositRow();
                $row->id = $user->transaction->id;
                // cast objects like Carbon or BigDecimal to string
                $row->created_at = $user->transaction->created_at->format('d-m-Y');
                return $row;
            }
        );
    }
}
```
2022 – this problem still persists. I have not looked at this package's code, but the way to solve this memory issue is to append to the output file. Otherwise, you're loading the entire file into memory each time, which is not good.
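In plain PHP terms, the append idea looks like this for a CSV target (appending to an existing XLSX is not as simple, since the whole workbook has to be reopened). The path and rows here are illustrative:

```php
<?php
// Appending rows to a file instead of rebuilding the whole
// spreadsheet in memory keeps memory usage flat per batch.
$path = sys_get_temp_dir() . '/export.csv';
@unlink($path);

$batches = [
    [['id', 'created_at']],
    [[1, '2019-11-24']],
    [[2, '2019-11-25']],
];

foreach ($batches as $batch) {
    $fh = fopen($path, 'a'); // 'a' = append; earlier rows are never reloaded
    foreach ($batch as $row) {
        fputcsv($fh, $row);
    }
    fclose($fh);
}

echo count(file($path)), PHP_EOL; // 3 lines written
```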
Since this package does not support large exports without massive memory growth, here is what I've come up with: if you write directly to a file and use Laravel's chunking, you can do millions of rows very fast.
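A sketch of that approach, assuming a `User` model and illustrative columns: `chunkById()` keeps only one chunk of rows in memory while `fputcsv()` streams them to disk.

```php
<?php
// Stream a large table to CSV with Laravel's chunkById(),
// so roughly 1000 rows are in memory at any one time.
use App\Models\User;

$fh = fopen(storage_path('app/users.csv'), 'w');
fputcsv($fh, ['id', 'created_at']); // header row

User::query()
    ->where('status', 1)
    ->select(['id', 'created_at'])
    ->chunkById(1000, function ($users) use ($fh) {
        foreach ($users as $user) {
            fputcsv($fh, [$user->id, (string) $user->created_at]);
        }
    });

fclose($fh);
```

chunkById() is preferable to chunk() here because it paginates by primary key, which stays correct even if rows are inserted while the export runs.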
This problem happens to me too. I am importing a CSV with 27k rows.
I used chunk reading with a chunkSize of 1000, and this problem doesn't happen anymore (but the processing time increased by some minutes).
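A chunk-reading import like the one described can be declared with the WithChunkReading concern; the class name, model, and column indexes below are assumptions for illustration:

```php
<?php

namespace App\Imports;

use App\Models\User;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithChunkReading;

class UsersImport implements ToModel, WithChunkReading
{
    public function model(array $row)
    {
        return new User([
            'name'  => $row[0],
            'email' => $row[1],
        ]);
    }

    // Read the file 1000 rows at a time instead of loading it whole.
    public function chunkSize(): int
    {
        return 1000;
    }
}
```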
If you are ready to use the latest version of the package (3.1), you can try the approach below. Tested with 300K records and 2 GB of memory.
Prerequisites
Versions
Description
I am getting

```
Allowed memory size of 134217728 bytes exhausted
```

when I try to export with the `FromQuery` option.
Steps to Reproduce
Expected behavior:
I want to fix my problem :)
Actual behavior:
Additional Information
See this image https://i.imgur.com/yMgUqXP.jpg