Question

The database

I'm working with a database that has some pretty big tables, and it's causing me problems. One table in particular has more than 120k rows.

What I'm doing with it

I'm looping over this table in a MakeAverage.php file to merge its rows down into about 1k rows in a new table in my database.
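Roughly, the idea is something like this (a simplified sketch; the table and column names table_a, table_b, group_id and value are placeholders, not my real schema):

    <?php
    // Simplified sketch of the kind of loop MakeAverage.php runs.
    use Illuminate\Support\Facades\DB;

    // Loads all 120k rows into memory at once -- this is what blows memory_limit.
    $rows = DB::table('table_a')->get();

    $sums = [];
    $counts = [];
    foreach ($rows as $row) {
        $sums[$row->group_id]   = ($sums[$row->group_id] ?? 0) + $row->value;
        $counts[$row->group_id] = ($counts[$row->group_id] ?? 0) + 1;
    }

    // Writes roughly 1k averaged rows into the new table.
    foreach ($sums as $groupId => $sum) {
        DB::table('table_b')->insert([
            'group_id' => $groupId,
            'average'  => $sum / $counts[$groupId],
        ]);
    }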

What doesn't work

Laravel wouldn't let me process it all at once, even when I tried DB::disableQueryLog() or a take(1000) limit, for example. It returned a blank page every time, even though error reporting was enabled (kind of like this). Also, there was no Laravel log file for it; I had to look in my php_error.log (I'm using MAMP) to realize that it was actually a memory_limit problem.
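For what it's worth, Laravel's query builder also has a chunk() method that walks the table in fixed-size pieces, so only 1,000 rows are hydrated at a time. A minimal sketch, assuming an id column and the same placeholder names as above:

    <?php
    use Illuminate\Support\Facades\DB;

    DB::disableQueryLog(); // still useful: the query log itself grows per query

    $sums = [];
    $counts = [];

    // Only 1,000 rows live in memory per iteration instead of all 120k.
    DB::table('table_a')->orderBy('id')->chunk(1000, function ($rows) use (&$sums, &$counts) {
        foreach ($rows as $row) {
            $sums[$row->group_id]   = ($sums[$row->group_id] ?? 0) + $row->value;
            $counts[$row->group_id] = ($counts[$row->group_id] ?? 0) + 1;
        }
    });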

What I did

I increased the amount of memory available before executing my code by calling ini_set('memory_limit', '512M'). (It's bad practice; I should really do it in php.ini.)
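Concretely, the override goes at the very top of MakeAverage.php, before any heavy work starts:

    <?php
    // Temporary runtime override; the cleaner fix is raising memory_limit
    // in php.ini and restarting MAMP.
    ini_set('memory_limit', '512M');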

What happened?

It worked! However, Laravel then threw an error because the page didn't finish loading within 30 seconds, given the large amount of data.
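That 30-second cap is PHP's max_execution_time, not Laravel itself, and it can be lifted at runtime the same way (again, better done in php.ini):

    <?php
    // Either call lifts the cap for this script only (0 means "no limit").
    set_time_limit(0);
    // or equivalently:
    ini_set('max_execution_time', '0');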

Solution

What I will do

After spending some time on this issue and looking at other people who had similar problems (see: Laravel forum, 19453595, 18775510 and 12443321), I thought that maybe PHP wasn't the solution.

Since I'm only creating Table B from the average values of Table A, I believe SQL will fit my needs best, as it's clearly faster than PHP for that type of operation (see: 6449072), and I can use aggregate functions such as SUM, AVG, COUNT and GROUP BY (Reference).
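As a sketch, the whole job could collapse into a single INSERT ... SELECT statement run through Laravel (placeholder table and column names again: table_a, table_b, group_id, value):

    <?php
    use Illuminate\Support\Facades\DB;

    // The database does the grouping and averaging in one pass.
    DB::statement('
        INSERT INTO table_b (group_id, average, row_count)
        SELECT group_id, AVG(value), COUNT(*)
        FROM table_a
        GROUP BY group_id
    ');

Since the 120k rows never leave the database engine, neither memory_limit nor max_execution_time should come into play on the PHP side.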
