Pixel Gaming 900fps TF2DM Server
Created 15th June 2010 @ 22:05
Server’s running 900fps solid. Enjoy. :)
Classlimits:
soldier: 3
scout: 3
sniper: 0 (changes to 1 with 4 players and 2 with 8)
demo: 2
rest = 0 (sorry, no pyro nightbox :P )
connect 94.23.226.193:27015
HP on kill
Ammo on kill
health regen
Scout Banned: Bonk, FAN, Sandman
Sniper Banned: Jarate
Demo Banned: Targe, Scottish Resistance
Soldier Banned: Buff Banner, Direct Hit, Gunboats
Last edited by Skyride,
http://www.gsptruth.com/forum/f79/1000-fps-fairy-tale-125/
“The 1000 FPS Fairy Tale
Friday, 20 April 2007
For a couple of months now, many game server providers have been offering special high-performance servers that run at 1000 FPS. These 1000 FPS supposedly make the server run a lot better and more precisely than standard settings: players hit better, the calculation of player positions is a lot more precise, and if you are a “competitive player” in a “professional gaming league”, you of course need one of these 1000 FPS servers.
If you take a closer look at the details of these 1000 FPS servers, you’ll find that what most providers tell you is vague and confusing. Sometimes they try to give the impression that their servers are “FPS certified” in some way, without saying what this certification really is. We can assume that this certification was invented by the game server providers themselves, or that it only expresses a subjective opinion. Do game servers that run at 1000 FPS really offer smoother gameplay with better hit registration? No, because they simply can’t! Here’s why …
In theory
All events on a CS:S or CS1.6 server are calculated per frame. All positions, directions and speeds at one point in time are summarized in one frame. A frame is similar to a snapshot or a still image of a movie. The more frames a server calculates per second, the more precise its data. At 1000 frames per second (FPS) the server calculates its “world” at a rate of one frame per millisecond. At 100 FPS it calculates the same “world” at a rate of one frame per 10 milliseconds. So far the argument of higher precision is still valid … but only as long as you look at the server in isolation. In practice, none of this is of any use to the client.
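The frame-interval arithmetic in that paragraph can be sketched in a couple of lines of Python (a toy illustration, not engine code; `frame_interval_ms` is a made-up helper):

```python
# Time slice represented by one server frame: the higher the FPS,
# the shorter the interval between calculated frames.
def frame_interval_ms(fps: int) -> float:
    return 1000.0 / fps

print(frame_interval_ms(1000))  # 1.0  -> one frame per millisecond
print(frame_interval_ms(100))   # 10.0 -> one frame per 10 milliseconds
```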
In practice
Here the players (clients) have to be taken into consideration too. The server may calculate its “world” at 1000 FPS, i.e. one millisecond for every calculated frame. But the clients do not get their updates at such a high rate; they get them a lot more slowly. How often they can be updated per second is determined by the server’s tickrate, usually set to 66 or 100 on high quality servers. The server freezes its “world” once per tick and then decides which clients to send data to. It doesn’t send all its information, only the changes since the last update. In effect, the tickrate defines how often the server takes snapshots of its frames to send to the clients, so a client gets only 100 updates per second from a tickrate 100 server. In the other direction, the clients send the server commands, and the tickrate likewise determines how many commands per second the server accepts from a client: again only one command per 10 milliseconds on a tickrate 100 server.
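The bottleneck described above can be sketched as a toy simulation (illustrative Python, not Source engine code; `snapshots_per_second` is a made-up helper). The server iterates its frame loop at `server_fps`, but only takes a snapshot for clients once per tick:

```python
# Simulate one second of a server frame loop: however many frames the
# server calculates, it only snapshots its world once per tick.
def snapshots_per_second(server_fps: int, tickrate: int) -> int:
    frame_dt_us = 1_000_000 // server_fps  # frame interval in microseconds
    tick_us = 1_000_000 // tickrate        # tick interval in microseconds
    snapshots = 0
    next_tick_us = 0
    for frame in range(server_fps):        # one simulated second
        t_us = frame * frame_dt_us
        if t_us >= next_tick_us:           # a tick boundary was reached
            snapshots += 1                 # freeze the world for the clients
            next_tick_us += tick_us
    return snapshots

print(snapshots_per_second(1000, 100))  # 100 -> clients see 100 updates/s
print(snapshots_per_second(333, 100))   # 100 -> identical from the client's view
```

Whether the loop runs 333 or 1000 times per second, a tickrate 100 server hands the clients the same 100 snapshots.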
This is the point where the house of cards collapses. What is the server supposed to calculate its “world” with every millisecond if it only gets one command from each client every 10 milliseconds? All the server can do is calculate its “world” with old data 90% of the time. If you also take into account that varying latencies influence the rates at which the clients send their commands, and that the server buffers these commands in queues, then 500, 600 or even 1000 FPS make no sense at all. The server usually has to work with data that is about 50 milliseconds old or even older. During this time a lot may already have changed and new inputs (mouse movement) may have been given on the client side. So the server has to predict events: it has to guess what the clients will do next. It may calculate a completely different movement from what the player really makes … with a precision of one millisecond at 1000 FPS. It does not matter whether the server runs at 333 FPS or 1000 FPS; a wrong guess stays a wrong guess. At 1000 FPS it is just “more precisely wrong”.
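The “90% of its time” figure follows directly from the rates involved; a quick sketch of that arithmetic (toy Python, `stale_frame_fraction` is a made-up helper):

```python
# With new client commands arriving only once per tick, this is the
# fraction of server frames that must be computed from old input data.
def stale_frame_fraction(server_fps: int, tickrate: int) -> float:
    return 1.0 - tickrate / server_fps

print(stale_frame_fraction(1000, 100))  # 0.9 -> 90% of frames reuse old data
```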
Now you might ask, “But what if the server guesses right?” True, if the server predicts correctly, the position of a player is indeed more precise – on the server! But not on the client. The engines of both CS:S and CS1.6 act on the assumption that the server’s and the client’s time are in sync, and the server’s time is used for all clients. The server saves a so-called frame time for every calculated frame; it does this every millisecond on good Linux servers. The client uses the frame time it receives with every update as its own time. From a 1000 FPS server with a tickrate of 100, the client should receive updates in which the frame time has advanced 10 milliseconds. Even if the server has predicted the player’s actions correctly, 1 or 2 milliseconds of latency in a packet’s journey from server to client are enough to make all the data imprecise again. So again, it does not matter whether the server runs at 333 or 1000 FPS. Not to mention, real latencies are usually a lot higher.
Some might say, “But the server includes latencies in its calculations!” True again, the server does take the clients’ latencies into account. To measure a latency, the server needs command packets from the client: the latency value is determined across a series of command packets and the average is then included in the server’s calculations. But even taking that average makes everything imprecise again, and more importantly, the server uses old data to calculate that average latency. Again, no difference between a 333 and a 1000 FPS server.
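The averaging step can be sketched like this (a hypothetical `LatencyEstimator`, a minimal moving-average sketch and not anything taken from the engine):

```python
from collections import deque

# A moving average over the most recent command-packet round trips.
# By the time the window is full, every sample in it is already old news.
class LatencyEstimator:
    def __init__(self, window: int = 16):
        self.samples = deque(maxlen=window)  # oldest samples fall out

    def add_sample(self, rtt_ms: float) -> None:
        self.samples.append(rtt_ms)

    def average(self) -> float:
        return sum(self.samples) / len(self.samples)

est = LatencyEstimator(window=4)
for rtt in (40.0, 60.0):
    est.add_sample(rtt)
print(est.average())  # 50.0 -> a smoothed value that lags the true latency
```

The point of the sketch: the smoothed value is by construction a summary of past packets, so whatever the server's frame rate, the latency it compensates for is always slightly stale.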
Some players think that with settings of cl_cmdrate 100, cl_updaterate 100 and rate 30000 you get exactly those rates from the server. That is also wrong. In both CS:S and CS1.6 the server is authoritative, which means that under all circumstances it is the server that decides how many updates it sends and how many commands it accepts per second. The client may claim to receive 100 updates per second, but the server sometimes just delivers 90.
Here 1000 FPS are even counterproductive. If the server is under heavy load and can’t calculate its 1000 FPS anymore, it sends the clients fewer updates and processes fewer commands in order to hold its target FPS, because the engine prioritizes reaching the FPS target over processing updates and commands. This matters because the Source engine, and the old Half-Life engine as well, do all their calculations per frame: no frame, no calculation, no update. The engine will always try to reach its pre-set FPS under all circumstances and will simply drop updates and commands to and from the clients if necessary. This is why a server that runs at a constant 333 FPS is a lot more precise than a server that has to switch between 500 and 1000 FPS all the time, which is exactly what most tuned 1000 FPS servers currently do.
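That load-shedding behaviour can be modelled as a simple time budget per frame (a toy Python model, not engine code; `process_frame` and its task names are made up):

```python
# Toy model of an engine that prioritizes its target frame rate: any
# work that does not fit in the frame's time budget is simply dropped.
def process_frame(budget_ms: float, tasks: list) -> list:
    """tasks is a list of (name, cost_ms); returns the tasks completed."""
    done, used = [], 0.0
    for name, cost in tasks:
        if used + cost <= budget_ms:
            done.append(name)
            used += cost
        # else: the task is dropped so the frame still finishes on time

    return done

# A 1000 FPS target leaves a 1 ms budget; under load, sending client
# updates is exactly the kind of work that gets dropped.
print(process_frame(1.0, [("simulate", 0.6),
                          ("send_updates", 0.5),
                          ("process_commands", 0.3)]))
```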
Conclusion:
It does not matter whether a server runs at 333, 500, 600 or even 1000 FPS. Any of these frame rates makes a server fast enough. It is far more important that the server has a high quality internet connection and always reaches its pre-set FPS. Only Valve Software can make hit registration better, by improving the algorithms responsible for prediction, extrapolation and interpolation and thus making them more precise.
Don’t let yourself be fooled by fake “1000 FPS certificates”. Don’t let anyone force you to play on a 1000 FPS server in wars or matches. It is all just a marketing gimmick – nothing more, nothing less.”
EDIT: Valve has also stated this many times.
Example:
“A Valve programmer (Mike Dussault) is quoted as stating that a Source server will sleep on every frame above the tickrate.”
http://forums.srcds.com/viewpost/58322#pid58322
Sorry, that was probably not the point of the thread, but it’s always annoying to see that people still fall for the server FPS scams.
Last edited by AnAkkk,
Anakin, they do make a difference. I’m not claiming it’s night and day or that it’s a million times better, but honestly, you’d have to be completely oblivious not to be able to tell the difference between a 250 and a 500fps server. This 840fps is a “because I can” thing.
p.s. FPS stability is vastly more important than the actual FPS. The fact you didn’t mention that shows you don’t have the slightest clue what you’re talking about.
Last edited by Skyride,
Edited my first post just after you posted.
So as I was saying, even Valve has stated that the server does nothing for all the frames above the tickrate. The calculations are done every tick, not every frame. The FPS just should not drop below the tickrate.
Last edited by AnAkkk,
Ok Anakin, tell me then, why is a server running at 100fps absolutely horrible to play on? :)
You don’t seem to understand the concept of “in theory” and “in practice”. Yes, in theory that should be the case, but it just isn’t.
Here’s my point: the servers I sell are not good BECAUSE they run at a higher framerate, they are just better, plain and simple, and the server’s FPS is a good indicator of that. However, plenty of people do believe regardless that 1000fps servers are better, so I’m happy to sell them. :)
My quality 500fps servers actually cost less than a typical server from a number of well known companies that have been getting a bad rep recently; that alone seems a good enough reason.
I might seem to be on the attack here, but I don’t take kindly to being accused of dishonesty, since that’s something I absolutely am not.
Last edited by Skyride,
Good lord, AnAkIn… thanks, Skyride.
Last edited by george,
Quoted from AnAkkk
[…]
Two more of those, and your transformation to the Dark Side is complete.
Last edited by Netsky,
Quoted from Nigh
he is a tech guy, who are you to judge him?
he’s talking bullshit that he’s read rather than experienced, though.
but you’re right, skyride and I know nothing about computers. we should just go away and leave anakin to report the fact that different graphics cards render particles differently to valve, and then they can just ban everything
Quoted from Fiend
Its just a placebo effect xD
At a certain point (beyond 500, perhaps) it becomes near-placebo. But if you truly can’t feel the difference between a server at a constant 66fps and one at 500fps, then you are massively inattentive :)
Last edited by octochris,
gais the human eye can only see 20fps so more is useless!!1
I trust adamride to provide quality servers, we are getting one for our clan and I’ll post what I think here once we’ve used it for a while. We’ve had n1ping, clanhost and whatever company wireplay uses for their free server, so I’ll be able to make a comparison to those.
Quoted from kuma
gais the human eye can only see 20fps so more is useless!!1
I trust adamride to provide quality servers, we are getting one for our clan and I’ll post what I think here once we’ve used it for a while. We’ve had n1ping, clanhost and whatever company wireplay uses for their free server, so I’ll be able to make a comparison to those.
http://www.tomshardware.co.uk/athlon-ii-x3-440-gaming-performance,review-31906-2.html
have a read of that. but the human eye can see a hell of a lot more than 20 fps; test it yourself: load up tf2, set fps_max to 20, play for 5 min, then set fps_max 60 and play for 5 min. you should be able to tell the difference
Anakin is right there.
Better to get a server which has good routing for most of the players than a server with shitloads of fps. I just hate it when people whine all the time, “omg that server has too low server fps, can we change?”
It is a mind thing, as anakin said; there is no difference at all.