
Allowed memory size of 134217728 bytes exhausted #1728

Closed · saeedvz opened this issue Jul 19, 2018 · 27 comments

@saeedvz

saeedvz commented Jul 19, 2018

Prerequisites

  • Able to reproduce the behaviour outside of your code, the problem is isolated to Laravel Excel.
  • Checked that your issue isn't already filed.
  • Checked if no PR was submitted that fixes this problem.

Versions

  • PHP version: 7.1.13
  • Laravel version: 5.6
  • Package version: ^3.0

Description

I am getting "Allowed memory size of 134217728 bytes exhausted" when I try to export with the FromQuery option.

Steps to Reproduce

Expected behavior:

I want to fix my problem :)

Actual behavior:

Additional Information

namespace App\Exports;

use App\OldTransaction;
use App\User;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Http\Request;
use jDate;
use Maatwebsite\Excel\Concerns\Exportable;
use Maatwebsite\Excel\Concerns\FromQuery;
use Maatwebsite\Excel\Concerns\ShouldAutoSize;
use Maatwebsite\Excel\Concerns\WithHeadings;
use Maatwebsite\Excel\Concerns\WithMapping;

class OldDepositExport implements FromQuery, ShouldQueue, WithMapping, WithHeadings, ShouldAutoSize
{
	use Exportable;
	
	/**
	 * OldDepositExport constructor.
	 * @param Request $request
	 */
	public function __construct(Request $request)
	{
	}

	public function headings(): array
	{
		return [
			'ID',
		];
	}


	public function map($transaction): array
	{
		return [
			$transaction->id,
		];
	}

	public function query()
	{
		return User::query()
			->where('status', '=', 1)
			->select(['id']);
	}
}

See this image https://i.imgur.com/yMgUqXP.jpg

@saeedvz
Author

saeedvz commented Jul 20, 2018

Can anyone help me?

@saeedvz
Author

saeedvz commented Jul 23, 2018

4 days. no answer :-(

@JeroenVossen
Contributor

Who do you expect an answer from?

Our software is free and open source, meaning that the use of our software is optional. We hold no liability and there is no obligation to support. We will provide support on a best effort basis.

If you use the software commercially and need elaborate support or need it urgently, we can offer this on a commercial basis. Please contact info@maatwebsite.nl or via phone +31 (0)10 744 9312.

@saeedvz
Author

saeedvz commented Jul 24, 2018

OK, so please don't close this issue. Maybe someone can fix this.

Thanks

@jlcarpioe

I think it's a PHP memory problem; maybe there's too much data and it overflows the memory. It has no relation to the library.

@saeedvz
Author

saeedvz commented Jul 24, 2018

@jlcarpioe I have almost 200k rows. The problem occurs when appending rows to the sheet.

@bitsoft-maax

Did you try to increase memory_limit in php.ini?

@saeedvz
Author

saeedvz commented Aug 4, 2018

@bagana89 That's not a good solution.

@patrickbrouwers
Member

I cannot reproduce your problem. I'm able to export a users table of 300K rows using the code you shared. Do note that the memory usage will increase in every job, as PhpSpreadsheet has to open a workbook that is getting bigger every time. There's nothing wrong with assigning some more memory to this process. It seems you don't have a lot of memory assigned, which is why it overflows so quickly.

It's best to drop ShouldAutoSize, as that recalculates the workbook's column dimensions in every job. That takes a lot more memory than running without it.
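
For context, a queued export like the class in this issue is typically dispatched through the Exportable trait's queue() method (a minimal sketch, not from this thread; the controller method and file name are illustrative):

// In a controller (illustrative): queue the export so rows are
// appended in background jobs rather than in a single request.
public function export(Request $request)
{
    (new OldDepositExport($request))->queue('old-deposits.xlsx');

    return back()->with('success', 'Export queued');
}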

@DevCentre

I have 1 GB of RAM allocated and still get the same result as saeedvaziry.
I just migrated from v2.1 to v3.1. I was having the same trouble with v2.1, which motivated me to migrate, but migrating did not solve the problem. Styling the output was also much easier with Excel::create in v2.1.

@SuperlativeEntity

It seems that chunking doesn't work well when exporting (using FromQuery); it uses a massive amount of memory, up to 3 GB for me for about 200k records. But importing works fine using chunking (memory never exceeds 50 MB).

@juanmiguel431

I only have 15 thousand records and it gave me the same error. What can I do?

@juanmiguel431

This is the error:

[2019-11-24 22:39:59] local.ERROR: Allowed memory size of 134217728 bytes exhausted (tried to allocate 18874368 bytes) {"exception":"[object] (Symfony\Component\Debug\Exception\FatalErrorException(code: 1): Allowed memory size of 134217728 bytes exhausted (tried to allocate 18874368 bytes) at C:\wamp64\www\.....\vendor\phpoffice\phpspreadsheet\src\PhpSpreadsheet\Collection\Cells.php:421)
[stacktrace]
#0 {main}
"}

@SuperlativeEntity

You will need to increase the allowed memory limit in your php.ini or set it dynamically using ini_set
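
For illustration, the limit can be raised either in php.ini or at runtime before the export/import runs (a small sketch; the 512M value is just an example):

// php.ini:  memory_limit = 512M
// or at runtime, before dispatching the export/import:
ini_set('memory_limit', '512M');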

@juanmiguel431

I did. I have 1 GB but it doesn't work.

@SuperlativeEntity

When you run the process, how much memory does the php-cli process consume? It must be exceeding 1 GB then.

@SwithFr

SwithFr commented Jan 16, 2020

The memory limit is definitely not the problem. It's set to 4GB according to phpinfo and I still have this issue.

@NazarAli

I have the same problem.

@DevCentre

DevCentre commented May 29, 2020

A 'solution' would be to split your export into multiple files, releasing memory between them, and then merge all the files and send the merged file as the response (a rough sketch follows after the pros/cons below).

Cons.:

  • More space for temporary files
  • More time spent (non intelligent loops)
  • More code required (non out-of-the-box)

Pros.:

  • It works
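
A rough sketch of that split-and-merge idea for plain CSV output (not from this thread; the model, chunk size, and file paths are assumptions):

use App\User;

$mergedPath = storage_path('app/export-merged.csv');
$merged = fopen($mergedPath, 'w');
fputcsv($merged, ['ID']); // header row

User::query()
    ->where('status', 1)
    ->select(['id'])
    ->chunkById(10000, function ($users) use ($merged) {
        // Write each chunk to its own temporary file...
        $partPath = tempnam(sys_get_temp_dir(), 'export-part');
        $part = fopen($partPath, 'w');
        foreach ($users as $user) {
            fputcsv($part, [$user->id]);
        }
        fclose($part);

        // ...then append the part to the merged file and delete it,
        // so only one chunk is held in memory at a time.
        fwrite($merged, file_get_contents($partPath));
        unlink($partPath);
    });

fclose($merged);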

@uzwebline

The same issue here: memory limit is 512 MB, 4k rows.

@MoFoLuWaSo

MoFoLuWaSo commented Jul 8, 2020

Final Solution
This is old, but whoever is reading this now should know that
if you are importing or exporting with ToModel or ToCollection, that process requires a huge allocation of memory to convert
the data into usable forms like a collection or an array.

In that case, don't implement ToModel or ToCollection; bypass that process and carry out the operation manually by implementing OnEachRow,
which lets you implement an onRow method that receives an Excel Row object. You can also implement WithHeadingRow to have the row structured as an associative array.
Use $row->toArray() to get your data and process it as you like. This is fast and easy to manipulate.

PS: If you still get the memory limit error, simply add a return statement as the last line, like this:
return;

Thank You
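
A minimal sketch of that OnEachRow approach for an import (the model and the 'name' column heading are illustrative assumptions):

namespace App\Imports;

use App\User;
use Maatwebsite\Excel\Concerns\OnEachRow;
use Maatwebsite\Excel\Concerns\WithHeadingRow;
use Maatwebsite\Excel\Row;

class UsersImport implements OnEachRow, WithHeadingRow
{
    public function onRow(Row $row)
    {
        // With WithHeadingRow, toArray() keys each row by its headings.
        $data = $row->toArray();

        // Process the row as needed; no collection of all rows is
        // built up in memory.
        User::create([
            'name' => $data['name'], // 'name' heading is an assumption
        ]);
    }
}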

@tomasvanrijsse

I had the same issue, and with the suggestions from @MoFoLuWaSo I reduced my 128+ MB memory usage to 54 MB.

  1. Implement a DTO. That reduced the memory usage the most.
  2. Order the properties of the DTO and remove WithMapping.
  3. Remove ShouldAutoSize.

In the case of @saeedvz it should look like this:

namespace App\DataTransferObjects;

class OldDepositRow
{
    public int $id;
    public string $created_at;
}

and

namespace App\Exports;

use App\DataTransferObjects\OldDepositRow;
use App\User;
use Illuminate\Contracts\Queue\ShouldQueue;
use Maatwebsite\Excel\Concerns\Exportable;
use Maatwebsite\Excel\Concerns\FromCollection;
use Maatwebsite\Excel\Concerns\WithHeadings;

class OldDepositExport implements FromCollection, ShouldQueue, WithHeadings
{
    use Exportable;

    public function headings(): array
    {
        return [
            'ID',
            'Created at',
        ];
    }

    public function collection()
    {
        $users = User::query()
            ->where('status', '=', 1)
            ->select(['id'])
            ->get();

        return $users->map(
            function ($user) {
                $row = new OldDepositRow();
                $row->id = $user->transaction->id;

                // cast objects like Carbon or BigDecimal to string
                $row->created_at = $user->transaction->created_at->format('d-m-Y');

                return $row;
            }
        );
    }
}

@timothymarois

timothymarois commented Jul 9, 2022

2022 – This problem still persists.

I have not looked at this package's code, but the way to solve this memory issue is to open the file in append mode:

$file = fopen("file.csv", "a");

Otherwise you're loading the entire file into memory each time, which is not good.

@timothymarois

Since this package does not support large exports without massive memory increases, here is what I've come up with: if you write directly to a file and use Laravel chunking, you can do millions of rows very fast.

$f = fopen($filename, 'a');
fputcsv($f, $headings);

$this->query()->chunkById(1000, function ($items) use ($f) {
    foreach ($items as $item) {
        fputcsv($f, [
            // ... columns here
        ]);
    }
});

fclose($f);

@robertopc

This problem happens to me too. I am importing a CSV with 27k rows.

@robertopc

I used chunk reading with a chunkSize of 1000 and this problem doesn't happen anymore (but processing time increased by a few minutes).

@rintoug

rintoug commented Sep 12, 2022

If you are ready to use the latest version of the package (3.1), you can try the below. Tested with 300K records and 2 GB of memory.


use App\User;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithChunkReading;

class UsersImport implements ToModel, WithChunkReading
{
    public function model(array $row)
    {
        return new User([
            'name' => $row[0],
        ]);
    }
    
    public function chunkSize(): int
    {
        return 1000;
    }
}

This follows the chunk reading section of the official documentation:
https://docs.laravel-excel.com/3.1/imports/chunk-reading.html
