[Tickets #5913] Re: Lower memory usage while downloading files

bugs at bugs.horde.org bugs at bugs.horde.org
Thu Nov 29 23:33:46 UTC 2007


DO NOT REPLY TO THIS MESSAGE. THIS EMAIL ADDRESS IS NOT MONITORED.

Ticket URL: http://bugs.horde.org/ticket/?id=5913
-----------------------------------------------------------------------
 Ticket             | 5913
 Updated By         | olger.diekstra at cardno.com.au
 Summary            | Lower memory usage while downloading files
 Queue              | Gollem
 Version            | HEAD
 Type               | Enhancement
 State              | Feedback
 Priority           | 2. Medium
 Owners             | Chuck Hagenbuch
-----------------------------------------------------------------------


olger.diekstra at cardno.com.au (2007-11-29 15:33) wrote:

Hi Chuck,

Installed the latest and greatest stable of Horde/Gollem, but couldn't
really figure out where to place the file.php/ftp.php. I found similar
files in the Horde VFS directory, but they were of a vastly different
size, so I haven't been able to test your commits yet.
However, having had a spare minute to play with this, I did get it
working with a function of my own: I've just successfully downloaded a
368MB file from Gollem. Here's how:

Modified gollem/view.php:
Added this function:

function ReadFileChunked($filename)
{
    // Read and output the file in 100KB chunks so the whole
    // file never has to sit in memory at once.
    $chunksize = 102400; // bytes per chunk
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        print fread($handle, $chunksize);
    }
    return fclose($handle);
}
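
One caveat worth noting: if PHP output buffering is active, the chunks
just accumulate in the output buffer and the memory saving disappears,
so it may be worth flushing after each chunk. A minimal variation of
the loop (using only stock PHP buffer functions):

while (!feof($handle)) {
    print fread($handle, $chunksize);
    if (ob_get_level() > 0) {
        ob_flush(); // push any PHP output buffer down to the SAPI layer
    }
    flush();        // push the web server's buffer out to the client
}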

Then changed this section:
case 'download_file':
    $browser->downloadHeaders($filename, null, false, strlen($data));
    // Stream the file in chunks instead of echoing $data, which would
    // hold a second full copy of the file in memory.
    ReadFileChunked('/exampledir/home/web/' . $filename);
    /* echo $data; */
    break;

"/exampledir/home" is where all the userdirectories are located, "web" is
the logged in user. I didn't know how to get that information from
Horde/Gollem quickly, so for testing purposes I hardcoded it.
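
For completeness, a rough sketch of how that path might be assembled
instead of hardcoded. I'm guessing at the details here: $vfsroot would
come from the Gollem backend config, and Auth::getAuth() is, I believe,
the Horde 3 call that returns the current user name:

// Hypothetical: derive the path from the VFS root and the
// authenticated user instead of hardcoding "/exampledir/home/web".
$vfsroot = '/exampledir/home';  // would come from the backend config
$user = Auth::getAuth();        // current Horde user, e.g. "web"
ReadFileChunked($vfsroot . '/' . $user . '/' . $filename);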

But it works a treat. The only reason I still have to keep the PHP
memory_limit just over the size of the file I'm trying to download is
this line:
$data = $GLOBALS['gollem_vfs']->read($filedir, $filename);
which reads the entire file into memory.

So as opposed to having to set the memory_limit to just over double the
file size, it now only needs to be just over the size of the file: for
my 368MB test file that means roughly 370MB now instead of roughly
740MB before.
I'm not used to working with objects in PHP, so I haven't been able to
retrieve the directory from the $GLOBALS['gollem_vfs'] object (although
I could print the array and view it).
Now if we can get that one line fixed so that the object doesn't read
the file contents into memory anymore, we'd be laughing.
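
To illustrate what I mean, something along these lines in the file
backend might do it. This is just a sketch: readStream is a name I made
up, and I'm assuming the backend has a _getNativePath()-style helper to
map a VFS path to the path on disk:

// Hypothetical VFS method: hand back an open stream instead of the
// whole file contents, so the caller can pass it through in chunks.
function readStream($path, $name)
{
    $handle = @fopen($this->_getNativePath($path, $name), 'rb');
    if ($handle === false) {
        return PEAR::raiseError(sprintf('Unable to open VFS file "%s".', $name));
    }
    return $handle;
}

Then gollem/view.php could stream the download without ever holding the
file in memory:

$stream = $GLOBALS['gollem_vfs']->readStream($filedir, $filename);
fpassthru($stream);
fclose($stream);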

Cheers! Olger.

> Please give these two commits a shot, assuming that you are using 
> either the file or FTP backend in Gollem:
>
> http://lists.horde.org/archives/cvs/Week-of-Mon-20071126/072746.html
> http://lists.horde.org/archives/cvs/Week-of-Mon-20071126/072745.html
>
> I went ahead and used fpassthru because I didn't see anything that 
> indicated that it read the whole file into memory - just that on some 
> older versions it might leak, which is a problem but a different one. 
> :)



