[dev] shares performance hit (was Re: disable shares)
Didi Rieder
adrieder at sbox.tugraz.at
Mon Jun 9 16:14:21 UTC 2008
Quoting duck <duck at obala.net>:
> On Mon, 2008-06-09 at 16:42 +0200, Didi Rieder wrote:
>> > Out of memory with only 1000 concurrent users and 130 q/s? This should
>> > never happen. Probably it is a MySQL configuration mistake. How much
>> > RAM do you have? You can control your RAM usage with various mysqld
>> > options or even kernel settings like swappiness, etc.
>>
>> This is what I was also thinking.
>> I have 16GB, but with the 32-bit server only ~3.2GB of it can be used.
>
> uh. 12G wasted.
Not really; we are running more zones on the machine. Two of them are
syncing replicas for the IMAP servers, so in case we have to switch to
the replicas, the RAM will be needed.
> I have only 4G :(
poor you :-), but at least you do not have these problems.
>> Do you use persistent connections from Horde to the SQL server?
> No, they are not really useful in an internal network, and they flood
> you at peak time.
OK, that could already be one of the problems.
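For the record, I will switch our Horde install to non-persistent
connections as well. In our setup that should boil down to the SQL block
in conf.php, roughly like this (an illustrative excerpt based on our
Horde 3 config; the exact key names may differ between versions, but the
'persistent' flag is what ends up deciding between mysql_pconnect() and
mysql_connect() in PHP):

    // conf.php -- illustrative excerpt, not a drop-in config
    $conf['sql']['phptype']    = 'mysql';
    $conf['sql']['persistent'] = false;  // open a fresh connection per request
                                         // and let PHP close it at script end

With one persistent connection kept open per Apache child, peak time is
exactly when they pile up.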
>> What are your settings for:
>>
>> key_buffer_size
> 50M
>
>> max_connections
> 1024
>
>> sort_buffer_size
> 1M
>
>> read_buffer_size
> 256K
>
>> read_rnd_buffer_size
> 768K
>
>> myisam_sort_buffer_size
> 8M
>
> Then some other settings that might help you (see the manual for details):
> thread_concurrency = 8
> tmp_table_size = 200M
> thread_cache_size = 50
> skip-external-locking
> skip-thread-priority
> table_cache = 400
>
> For peak times, so you do not get flooded:
> connect_timeout = 2 (abort a connection attempt after 2 seconds)
> And close connections after 30 seconds of inactivity. Yes, PHP should
> close connections after a script ends... yes, it should :)
> wait_timeout = 30
> interactive_timeout = 30
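For context, the usual back-of-the-envelope estimate of mysqld's
worst-case memory use (ignoring InnoDB buffers, temporary tables and
thread stacks, so only a rough sketch) is:

    worst case ≈ key_buffer_size
                 + max_connections * (sort_buffer_size
                                      + read_buffer_size
                                      + read_rnd_buffer_size)
               ≈ 50M + 1024 * (1M + 256K + 768K)
               ≈ 50M + 1024 * 2M
               ≈ 2.1G

With your values that stays comfortably inside 4G; with much larger
per-connection buffers the same formula sails straight past the ~3.2GB a
32-bit mysqld can address.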
Here we go... It seems that we used a rather memory-oversized config
in my.cnf, which led to memory problems at peak hours. I decreased most
of the relevant values and will report back in a few days; a consolidated
sketch of the settings from this thread is below.
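For anyone finding this thread later, duck's values pulled together into
a single my.cnf block look roughly like this (only the settings mentioned
in this thread, everything else left at its default; treat it as a sketch
to tune against, not a recommendation):

    [mysqld]
    key_buffer_size         = 50M
    max_connections         = 1024
    sort_buffer_size        = 1M
    read_buffer_size        = 256K
    read_rnd_buffer_size    = 768K
    myisam_sort_buffer_size = 8M
    thread_concurrency      = 8
    tmp_table_size          = 200M
    thread_cache_size       = 50
    table_cache             = 400
    skip-external-locking
    skip-thread-priority
    connect_timeout         = 2
    wait_timeout            = 30
    interactive_timeout     = 30

We will start from that region and size the per-connection buffers to
what the 32-bit address space actually allows.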
Thanks a lot
Didi
--
-------------------------
Didi Rieder
adrieder at sbox.tugraz.at
PGPKey ID: 3431D0B0
-------------------------