[Tickets #2315] Uploading attachment uses too much memory

bugs@bugs.horde.org bugs at bugs.horde.org
Fri Jul 22 16:07:39 PDT 2005


DO NOT REPLY TO THIS MESSAGE. THIS EMAIL ADDRESS IS NOT MONITORED.

Ticket URL: http://bugs.horde.org/ticket/?id=2315
-----------------------------------------------------------------------
 Ticket             | 2315
 Updated By         | jigermano at uolsinectis.com.ar
 Summary            | Uploading attachment uses too much memory
 Queue              | IMP
 Version            | 4.0.3
 State              | Unconfirmed
 Priority           | 2. Medium
 Type               | Bug
 Owners             | 
-----------------------------------------------------------------------


jigermano at uolsinectis.com.ar (2005-07-22 16:07) wrote:

I would like to comment on this, since I have been working on this very
issue today. I had the memory limit at 120MB, and it was exhausted while
trying to upload a 10MB attachment. max_allowed_packet was also 10MB,
and the actual query received by MySQL was a little bigger than that;
that is when things go really wrong.
VFS uses DB. Right after mysql_query() (DB/mysql.php) fails because of
the max_allowed_packet limit, DB_mysql::mysqlRaiseError() is called,
which in turn calls DB_common::raiseError(). Now, DB has a property
called last_query, which, as you may guess, holds the last executed
query. Here is a snippet of DB_common::raiseError():

        if ($userinfo === null) {
            $userinfo = $this->last_query;
        }
        // ...
        $tmp = PEAR::raiseError(null, $code, $mode, $options, $userinfo,
                                'DB_Error', true);
        return $tmp;

It should be clear that last_query is more than 10MB here. The returned
error object includes the whole query and is propagated back to compose
(in my case, 80MB was already in use at this point). Now, I had
error_reporting == E_ALL, so the query was passed to the logger (by
value), and that is where the memory limit was hit. I changed
error_reporting to E_WARNING (horde/config/conf.php).
Of course, if I raised max_allowed_packet so that no error occurred,
there would be no problem. But keep in mind that any DB-related error
would trigger this same behavior, so I also changed
DB_common::raiseError() so that the query is not included in the error
report.
The peak memory usage reported by Xdebug with the 10MB attachment at
this point (no DB error, error_reporting == E_WARNING) is 58302568 bytes
(about 55MB), which I still think is a little high, but I haven't
looked into it yet.
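The change I made to raiseError() could look roughly like this. This is
only a sketch: the helper name truncate_query() and the 1024-byte cap
are my own choices for illustration, not PEAR DB's actual code.

```php
<?php
// Sketch (hypothetical helper): cap last_query before it is attached
// to the error object, so a multi-megabyte INSERT is not copied into
// every DB_Error and then again into the logger.
function truncate_query($query, $max = 1024)
{
    if (strlen($query) > $max) {
        return substr($query, 0, $max) . ' ... [truncated]';
    }
    return $query;
}

// Inside DB_common::raiseError() one would then use:
//     if ($userinfo === null) {
//         $userinfo = truncate_query($this->last_query);
//     }

// A query a little over 10MB, like the one that triggered the bug:
$big = 'INSERT INTO vfs VALUES (' . str_repeat('x', 10 * 1024 * 1024) . ')';
echo strlen(truncate_query($big)), "\n"; // prints 1040, not ~10485784
```

With this in place a failed INSERT still reports which query failed,
but the error object stays small no matter how large the attachment was.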

On the issue of the logging level, I completely understand if someone
argues that by using E_ALL you are responsible for having enough memory
to handle it, given that this behavior is coded in VFS and DB.

There are other attachment- and memory-related things to worry about,
though: what happens if I upload N files, each below post_max_size,
max_allowed_packet, upload_max_filesize, and conf.php's
max_attachment_size, and then send the message? Memory exhausted (and
maybe a message from your MTA saying the file is too big). The thing is,
max_attachment_size is enforced per attachment, so you can actually
upload as many files as you want.
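A total-size check before sending would close this hole. A minimal
sketch, assuming a new option (the $max_total_size name is my own, not
an existing IMP setting):

```php
<?php
// Sketch: max_attachment_size is per file, so summing the uploaded
// attachments against a hypothetical total cap is needed to catch
// "N files, each individually acceptable" before the send.
function total_attachment_size(array $attachments)
{
    $total = 0;
    foreach ($attachments as $file) {
        $total += $file['size'];
    }
    return $total;
}

// Three 4MB files, each under a 5MB per-file limit, 12MB together:
$attachments = array(
    array('name' => 'a.pdf', 'size' => 4 * 1024 * 1024),
    array('name' => 'b.pdf', 'size' => 4 * 1024 * 1024),
    array('name' => 'c.pdf', 'size' => 4 * 1024 * 1024),
);
$max_total_size = 10 * 1024 * 1024; // hypothetical total cap

if (total_attachment_size($attachments) > $max_total_size) {
    echo "Total attachment size exceeds the limit; refuse to send.\n";
}
```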

What happens if you are writing an email, have already uploaded some
files (of acceptable size), and then try to upload a file larger than
post_max_size? PHP rejects the whole POST, so $_POST is completely
empty, and so will be your compose window (since the state info was in
the POST).
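The condition can at least be detected instead of silently losing the
compose state: when PHP discards an oversized POST, $_POST and $_FILES
are empty but the Content-Length header is still visible. A sketch (the
function name is my own):

```php
<?php
// Sketch: detect "PHP threw the whole POST away because it exceeded
// post_max_size" by comparing the announced body size with what
// actually survived parsing.
function post_was_truncated(array $server, array $post)
{
    return isset($server['REQUEST_METHOD'])
        && $server['REQUEST_METHOD'] === 'POST'
        && (int)$server['CONTENT_LENGTH'] > 0
        && count($post) === 0;
}

// Simulated request: a 20MB POST that PHP discarded entirely.
$server = array('REQUEST_METHOD' => 'POST',
                'CONTENT_LENGTH' => 20 * 1024 * 1024);
if (post_was_truncated($server, array())) {
    echo "POST exceeded post_max_size; warn instead of showing an empty compose window.\n";
}
```

In real code $server and $post would be $_SERVER and $_POST; with this
check, compose could show a warning rather than a blank form.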

Some, most, or maybe all of this does not fall into the BUG category;
it might be necessary to open an ENHANCEMENT ticket for the number of
attachments, and maybe to try to find a solution to the post_max_size
problem. The POST goes to the filesystem, and only the non-file fields
are actually passed to PHP along with the info on the uploaded files,
so I guess post_max_size could be huge, provided there is good control
afterwards.

I guess this is more dev-list material, but I started commenting on the
bug and got carried away. Really, sorry for the long post.




