
Does Unreal/network performance rely on the computer that's gonna run it?

Posted: Sun Nov 13, 2005 4:20 am
by onesikgypo
hey, just need to ask: will an IRC network/Unreal really be affected by the computer that is running it, or is it all internet-connection based? I wanna move my IRC server to a Linux box, but the only computer I wanna turn into a Linux machine is a really, really, really old computer. Will it work the same?

Posted: Sun Nov 13, 2005 8:07 am
by Suchiara
actually, yes. My network is running on a Celeron 450 with 384 MB RAM; when it's loaded, commands such as /lusers, /list etc. work a bit slower, though it is hard to see the difference. My testnet is on an FTP server (Athlon 2400+) and that one is just flying..

Posted: Sun Nov 13, 2005 8:29 am
by w00t
It really depends on what kind of load you'll be experiencing. RAM-wise, I'd suggest leaving at least 100 MB or so free for Unreal to play with; should it need it, it'll be there (and RAM is a good thing to have regardless).

CPU-wise, I ran Unreal on a Celeron 200 MHz for a long while before upgrading that particular machine, with no great problems.
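
If you want to see how much of that the ircd actually holds on your own box, ps can report its resident size (assuming the Unreal binary is named ircd, as in a default install):

Code:

# RSS and VSZ are reported in kilobytes
ps -C ircd -o rss,vsz,cmd

Compare the RSS figure against your free RAM to see how much headroom is left.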

Posted: Sun Nov 13, 2005 11:44 am
by sChutt
Running 80 users (well, around 80 active plus a few dozen idlers) across a dozen channels on an IBM PC Server 330 with a PII 233 and 512 MB RAM, without any problems at all.
Anope and Denora are loaded alongside it, yet the CPU hums along at a modest 20%.

- sChutt

Posted: Sun Nov 13, 2005 11:49 am
by Suchiara
w00t wrote: I'd suggest leaving at least 100 MB or so free for Unreal to play with; should it need it, it'll be there (and RAM is a good thing to have regardless).
What do you mean, 'leave at least xxx MB of RAM'? As far as I know, Linux caches all the memory (almost all), and in `free -m` you see only a few megabytes free even if you have tons of RAM and almost nothing is using it, and this is *normal*. Or maybe I don't understand something.

Posted: Mon Nov 14, 2005 4:48 am
by w00t
No, you don't understand something :).

What happens if you get a sudden influx of users, or whatever? You DON'T want to start paging to disk. This is just normal common sense.

Posted: Mon Nov 14, 2005 5:33 am
by onesikgypo
so the performance would mostly depend (computer-wise) on RAM? are there any other possible factors?

Posted: Mon Nov 14, 2005 2:37 pm
by Syzop
Actually w00t, I think Suchiara understood; you might both mean the same thing ;p.
What he meant is that Linux generally tries to keep as little "really free memory" as possible; for example, my 512 MB box has 10 MB free. BUT it uses 121 MB for cache, which, if it is read cache, can be invalidated immediately and used for programs.

Code:

$ free -m
             total       used       free     shared    buffers     cached
Mem:           502        491         10          0         18        121
-/+ buffers/cache:        351        150
Swap:          988         27        960
So to see how much memory you've actually got available, you should be looking at the -/+ buffers/cache figure, which is, as you can see, 150 MB free (or used minus cached, since I'm not sure buffers can be invalidated that easily, but that hardly matters here ;p).
Some beginners look at the Mem free figure, see 10 MB, and think 'woahhh', as if Linux were a big RAM eater, when in fact Linux (probably most *NIXes) is much better with RAM than e.g. Windows (though that is not hard ;pp).
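
If you just want that one figure, something like this pulls it out (assuming the classic procps free layout shown above, where the fourth field on the buffers/cache line is the free amount):

Code:

# prints the 'really available' memory from the -/+ buffers/cache line
free -m | awk '/buffers\/cache/ { print $4 " MB actually available" }'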

Posted: Mon Nov 14, 2005 2:47 pm
by Syzop
onesikgypo: Actually it depends.

The most CPU-hungry task in Unreal is spamfilter: every filter is a regex that gets evaluated against matching traffic, so if you've got quite a few spamfilters, a low-CPU machine (say, less than 1 GHz) and 1000 users, that's not good :P. It might work fine normally, but in case of a flood or attack you will be in trouble (perhaps even without spamfilter factored in).
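
For reference, a spamfilter block in 3.2-style unrealircd.conf looks something like this (the pattern, targets and reason here are made up for illustration); every message to a matching target is run through each such regex, which is where the CPU time goes:

Code:

spamfilter {
	# this regex is tested against every private and channel message
	regex "want to trade (pics|files)";
	target { private; channel; };
	action block;
	reason "Spam is not allowed here";
};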

Memory is indeed important, you don't want to be short on that at all, but... Unreal isn't that hungry ;). 100 MB should be fine in most cases, but during attacks it might eat up to 300 or 400 MB (it really depends) for buffers (sendq, recvq, kernel socket buffers, etc.). You might well see it using only 30 MB or so all the time, but it's the attack case that matters.

So basically, in many cases an ircd can run fine on low-spec PCs, but when you get an attack you might be in trouble :P.
If you limit the connections to, say, 300 users on a 1 GHz machine with 100 MB free, you'll be fine.
That is probably a good idea anyway, because if you grow, it's recommended to get something better ;p.
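
A sketch of how such a cap might look in a 3.2-style unrealircd.conf (the class name and the exact limits are just example values):

Code:

class clients
{
	pingfreq 90;
	maxclients 300;    # hard cap on clients in this class
	sendq 100000;
};

allow {
	ip *@*;
	hostname *@*;
	class clients;
	maxperip 3;        # also keeps clones per IP down
};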

Oh and yes, internet connection specs are very important.

Posted: Mon Nov 14, 2005 4:20 pm
by Suchiara
btw, onesikgypo, you didn't tell us exactly what machine you have. I started my network on a Pentium 100 MHz with 32 MB RAM (Unreal + Anope services). When my net grew to ~200 users, I added another 32 MB RAM (so I had 64) and overclocked my CPU to 120 MHz :) I also changed the LAN card from 10 to 100 Mbit (a cheap one), and I never had/heard of any problems with the ircd..

Posted: Mon Nov 14, 2005 9:00 pm
by aquanight
Syzop wrote: BUT it uses 121 MB for cache, which, if it is read cache, can be invalidated immediately and used for programs.
Of course, that still depends on the swappiness setting (e.g. given the need to free more RAM, would it prefer to invalidate read cache / flush write buffers, or just swap other things out to disk; naturally this is moot if you have no swap :P ).
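
On a 2.6 kernel that knob is vm.swappiness; you can check and tune it like this (60 is the usual default; lower values make the kernel prefer dropping cache over swapping out process pages):

Code:

# show the current value (default is usually 60)
cat /proc/sys/vm/swappiness
# prefer keeping processes like the ircd in RAM (needs root)
sysctl -w vm.swappiness=10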

Posted: Mon Nov 14, 2005 9:08 pm
by Syzop
Of course, Linux tries to do what's best :P.