[gollem] Gollem PHP Memory consumption

Olger Diekstra olger.diekstra at cardno.com.au
Wed Nov 21 03:33:33 UTC 2007


Hi Guys/Girls,
 
At my new workplace, they use Horde (Gollem) for various purposes, one of
which is exchanging files through Gollem. The (external) guy who previously
did a lot of the work I now do had a big whinge about Horde gobbling
memory, but not having seen or used Horde before, I didn't go there just
yet.
Today, however, someone needed to download a decent-sized file (360MB) and
it wouldn't go. Upping the memory limit to 400MB wouldn't make it go
either. Having done my share of PHP programming in the past, including
writing an app for distributing files (uploading to a webserver and
downloading from that same webserver), I ran into the very same PHP
memory_limit issue a lot of folks seem to hit with Horde (Gollem).
 
My app has been capable for some time of uploading any sized file and
downloading it again, the only size restriction being the disk space on
the webserver (all through the browser). Its PHP memory_limit is set at
the default (8MB).
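
The upload side works because PHP streams an incoming file to a temporary
file on disk, so memory use doesn't grow with the file size; only
upload_max_filesize and post_max_size in php.ini need raising, not
memory_limit. A minimal sketch (the "userfile" form field and destination
path are just examples):

<?php
// PHP has already written the upload to a temp file on disk, so moving
// it into place stays within the default 8MB memory_limit regardless
// of how large the file is.
$dest = '/srv/files/' . basename($_FILES['userfile']['name']);
if (move_uploaded_file($_FILES['userfile']['tmp_name'], $dest)) {
    echo 'Upload OK';
} else {
    echo 'Upload failed';
}
?>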
 
Getting around the memory_limit without increasing it is actually not that
hard in this case. Increasing it is something that should only be done
when the design of the application genuinely cannot use less (in theory,
an application could have 16MB of code, though that would be a lot of
programming), and even then I'm talking about increasing it to 16 or maybe
32MB. Upping it to 256MB is insane; there's no two ways about it.
 
I haven't had a look at the code, and I probably won't have time to take a
really good look anytime soon either; because Horde is completely new to
me, it will probably take a good chunk of time to get my head around it.
But I am more than willing to help sort this issue out, to explain my code
(which, BTW, finds its origin on the web: I looked at someone else's code,
got the idea and wrote my own), and to help implement it. The idea is
basically to send the file's data to the browser in chunks, as in the
sketch below.
 
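A minimal sketch of the chunked approach (the path and the 64KB chunk
size are just examples; with output buffering off, memory use stays
around the chunk size rather than the file size):

<?php
$path = '/srv/files/example.bin';  // example path
$fh = fopen($path, 'rb');
if ($fh === false) {
    die('Cannot open file');
}
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($path) . '"');
header('Content-Length: ' . filesize($path));
// Read and send the file 64KB at a time instead of loading it whole.
while (!feof($fh)) {
    echo fread($fh, 65536);
    flush();  // push each chunk out to the client immediately
}
fclose($fh);
?>
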
Contact me if you're willing to help fix this.
 
Also, I did a quick grep on the Gollem PHP files and found a lot of
"foreach" used. This is probably the reason why downloading a 16MB file
needs a memory_limit of 64MB or more. foreach iterates over a copy of the
array, so if you construct an array holding the filename and the actual
file data and then use foreach to walk it, you end up with a copy of the
entire array in memory; the script then needs 32MB for a 16MB file. Do
this a couple of times and you find yourself munching memory rapidly.
A better way to walk such large arrays is a "while" loop, which steps
through the original array in memory without making a copy.
 <http://atomized.org/2005/04/php-performance-best-practices/>  
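
To illustrate the difference (a contrived example: $files stands in for an
array of filename => file data, and while/each() was the usual idiom for
in-place iteration at the time):

<?php
$files = array('a.txt' => str_repeat('x', 1024));

// foreach iterates over its own copy of the array:
foreach ($files as $name => $data) {
    // ... handle $name / $data ...
}

// while/each() advances the original array's internal pointer instead:
reset($files);
while (list($name, $data) = each($files)) {
    // ... handle $name / $data ...
}
?>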

Regards,

Olger Diekstra 
