Frame Rates in Counter-Strike Throughout History

While I was watching a Counter-Strike: Global Offensive stream on Twitch, viewers had questions about the streamer’s resolution. A popular resolution for competitive players is 800×600. There are a few reasons for this, mostly revolving around performance. Thinking about this superfluous bit of information sent my brain down a rabbit hole, eventually ending up at “I wonder how each version of CS runs on modern hardware.” I booted up each of the main versions of CS (1.6, Source, and Global Offensive) and got to work.

By default, CS is limited in the number of frames per second it can display, so those limits had to be removed. Next, I set the resolution to 800×600 so we were measuring against some kind of standard. There weren’t many, if any, visual options for CS 1.6, so there wasn’t much to do there. In Source, I turned up all the settings to their maximum values, except for HDR, which I disabled because it’s not something I would have enabled anyway. In Global Offensive I normally run with rather conservative settings; I turned down a few options, like texture filtering, to better match how a competitive player would have their game set up. Here are the results.

Frames per second in each of the three main releases of Counter-Strike

So… wow, these numbers are kind of all over the place. Let’s take a closer look and see what’s going on here.

In 1.6, the oldest of the three versions, we have an average frame time of 3.9 ms (253.5 fps) with a standard deviation of 1.5 ms. Frame times drop as low as 2 ms (500 fps), and the 90th percentile (that is, what 90% of the frame times will be better than) is 5 ms (200 fps). In Source, a version that’s still quite old but much more optimized, we have an average frame time of 3.5 ms (285.7 fps) with a standard deviation of 1.6 ms. The results are, as near as makes no difference, identical between these two games; even the 90th percentile is only 0.8 ms higher, at 5.8 ms. It would appear, then, that these older titles are hitting some kind of limitation in the game engine, rather than the hardware.

Global Offensive had an average frame time of 8.4 ms (119.0 fps) with a standard deviation of 1.9 ms. Why does GO have a higher deviation compared to the other versions when the graph appears much more stable? Part of it is the way the graph displays the relationship between frame time and frame rate, but another reason is that at these longer frame times (8.4 ms compared to 3.5) each millisecond of frame time has less of an impact on the overall frame rate. The difference between 2 ms and 4 ms is 250 fps, but 8 to 10 is only 25.
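
To make that arithmetic concrete, here’s a quick Python sketch of the reciprocal relationship between frame time and frame rate, using the same example values as above:

```python
# Frame time (in milliseconds) and frame rate are reciprocals.
def fps(frame_time_ms: float) -> float:
    return 1000.0 / frame_time_ms

# The same 2 ms swing matters far less at longer frame times:
print(fps(2) - fps(4))    # 500 - 250 = 250 fps difference
print(fps(8) - fps(10))   # 125 - 100 = 25 fps difference
```

Here’s the same chart displaying only the frame rate.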

Same chart as earlier, displaying frame rate linearly.

Here the difference in frame rate looks much more severe in 1.6 and Source. Is there anything we can do about that? We could enable V-sync, which would delay presenting each frame until the monitor finishes its current refresh cycle, but that would introduce a delay between what’s happening in the game and when it’s displayed on the screen. We’ll leave V-sync disabled but tell the game to limit the frame rate using the fps_max command, setting it to 300 for 1.6 and Source, and 130 for GO.
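
For reference, the cap is set from the developer console (or a config file such as autoexec.cfg); the values below are simply the ones I used for these tests, not a recommendation:

```
// In the console, or in a config such as autoexec.cfg
fps_max 300   // CS 1.6 and Source
fps_max 130   // Global Offensive
```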

Limiting the maximum number of frames to make gameplay smoother.

This had two unexpected effects. First, while Source’s average frame time only decreased from 3.5 ms to 3.4, the standard deviation dropped from 1.6 ms to only 0.5! Second, the frame rate didn’t get any more stable in 1.6. Global Offensive is expectedly smoother, though still bumpy, because we’re running into the limits of my CPU and GPU, but it’s odd to me that 1.6 didn’t see any smoothing. In CS 1.6, I set fps_max to 300, 250, and 200 to see if we could get smoother frame times, and…

CS 1.6 with fps_max set to 300, 250, and 200.

Surprisingly, while lower fps_max values did stabilize the frame rate somewhat (standard deviations were 1.2, 1.0, and 1.2 ms respectively), I never saw the same kind of stability as I did with Source capped at 300 fps. It’s unfortunate, but it’s likely due to a lack of modern optimizations in the game engine, like hardware-accelerated smoke and particle effects and multicore rendering. Average frame times were 4.1 ms (241.7 fps), 4.5 ms (221.2 fps), and 5.2 ms (192.8 fps), which is a little disappointing, but at that 200+ fps level you’re going to be hard-pressed to notice a difference anyway.
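
As an aside, the summary statistics quoted throughout this post are straightforward to derive from raw frame time samples. Here’s a minimal Python sketch; the sample list is a made-up placeholder for illustration, not my actual capture data:

```python
import statistics

# Hypothetical frame time samples in milliseconds, not real capture data.
frame_times_ms = [4.0, 4.2, 3.9, 5.1, 4.4, 6.0, 4.1, 4.3, 5.0, 4.6]

mean = statistics.mean(frame_times_ms)
stdev = statistics.stdev(frame_times_ms)
# 90th percentile: the frame time that 90% of frames beat.
p90 = statistics.quantiles(frame_times_ms, n=10)[-1]

print(f"avg {mean:.1f} ms ({1000 / mean:.1f} fps), "
      f"stdev {stdev:.1f} ms, p90 {p90:.1f} ms")
```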

Going back to the original thing that got me thinking about this, how does resolution affect frame rate in the most recent version of Counter-Strike?

800×600 vs 1680×1050 vs 2376×1485

Using Nvidia’s Dynamic Super Resolution, I was able to run the game at 2376×1485, basically the 16:10 equivalent of 1440p. In order to do this, I had to use Nvidia’s optimized presets, so I opted for the performance option and re-ran the benchmarks with the new settings. At 2376×1485 the average frame time was 5.6 ms (179.4 fps) with 7.5 ms (133.3 fps) in the 90th percentile. At 800×600 you would expect to see double or triple that frame rate, but surprisingly I got an average frame time of 6.3 ms (158.7 fps) with 8.4 ms (119.0 fps) in the 90th percentile. 1680×1050 saw similar results. It seems, in this instance, that increasing the resolution actually increased the frame rate.
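
Where does 2376×1485 come from? Assuming the native display is 1680×1050 (an assumption, based on the other 16:10 resolution tested above), it lines up with DSR’s 2.00x factor, which scales the total pixel count, so each axis grows by sqrt(2). A quick sanity check in Python:

```python
# Back-of-the-envelope check on the DSR resolution, assuming a
# native display of 1680x1050 (inferred from the resolutions tested).
native_w, native_h = 1680, 1050

# DSR factors scale total pixel count, so a 2.00x factor
# scales each axis by sqrt(2).
scale = 2.0 ** 0.5
print(round(native_w * scale), round(native_h * scale))  # -> 2376 1485
print(2376 / 1485)  # -> 1.6, i.e. 16:10, the "16:10 equivalent of 1440p"
```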

What did we learn? Well, just because a game is old doesn’t mean it’ll run at a million FPS; the engine needs to be designed to scale with increased CPU and GPU capabilities. As we saw, updated game engines are able to run not only faster, but smoother as well. We also learned that GPUs work better when they’re given a healthy workload.

The benefits of gaming at 800×600 don’t appear to be performance-related. Instead, they’re more likely tied to what old-school players are used to. The only significant change is that when using a stretched 4:3 resolution, the whole game appears wider, including player models and doorways. On the flip side, you’re also limiting your field of view. Some players may even prefer this, as it keeps focus on what’s in front of you. TL;DR: it’s just about preference and what’s comfortable. There’s no performance to be gained either way.
