Input Latency: HD CRT vs HD LCD

Background

A while back my trusty CRT, a Sony KV-32XBR450, started having some weird sync issues where the image would sometimes be too low, then it would start from the middle, then jump around a bunch… it wasn’t great. At first it only happened for a few minutes when it was first turned on, but as time went on, it took longer and longer for the TV to “warm up”. Eventually, it was like this all the time, so it was time to start looking for something new.

While browsing Craigslist for a 1080p flat panel I stumbled across something interesting: an NEC LC5220AV. Knowing that NEC makes some dank shit, I did some research to see if this would be a good display. I was really excited about the inputs: native RGB H/V, VGA, DVI, HDMI, component, composite, s-video; it seemed to go on forever. $200 later it was in my living room and it. was. fucking. awesome. The picture-in-picture options on this 52-inch panel meant that my roommates and I could play multiple consoles on the same TV at the same time (8-player Mario Kart, anyone?). I always wondered what the input latency on this panel was, especially compared to my CRT I had given up for this much larger, more modern display.

Thanks to the 240p Test Suite, I’m now able to answer that question.

Testing


The first test I performed involved hooking my Dreamcast up to each display and running the Manual Lag Test five times, using the median as my final result. I did this with the CRT; then with the LCD using no scaling (so the image was rather small on the screen); then with the LCD scaled to fill the display; and finally with the Framemeister XRGB-Mini upscaling the 240p image and outputting it to the LCD over HDMI.

For the second test, I connected the Dreamcast to the CRT, then used the CRT’s “monitor out” to send the video to the LCD, so the image was displayed on both screens simultaneously. Using the Lag Test and my DSLR’s video recording feature, set to 60 frames per second, I recorded the difference between the two screens. Afterward, I connected the Framemeister between the CRT and LCD to see if it added any additional latency.

Results

From five runs of the manual lag test on each setup, I observed the following median results:
CRT: 1.6 frames (26.67 ms)
LCD (native): 2.1 frames (35 ms)
LCD (scaled): 2.1 frames (35 ms)
LCD w/ Framemeister: 1.4 frames (23.33 ms)
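
For anyone checking my math: at 60 Hz a frame lasts 1000/60, about 16.67 ms, so converting a result in frames to milliseconds is simple multiplication. A quick Python sketch (the labels just mirror the results above):

```python
# Input latency in frames -> milliseconds, assuming a 60 Hz signal.
FRAME_MS = 1000 / 60  # one frame lasts ~16.67 ms

results = {"CRT": 1.6, "LCD (native)": 2.1,
           "LCD (scaled)": 2.1, "LCD w/ Framemeister": 1.4}
for display, frames in results.items():
    print(f"{display}: {frames} frames = {frames * FRAME_MS:.2f} ms")
```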

I was pleasantly surprised that the LCD’s internal scaler only added about half a frame of delay (compared to the 2-3 frames I often hear being added by consumer LCD panels), but I was astonished that the results with the Framemeister were better than the CRT’s. I should point out that the KV-32XBR450 is a high-definition CRT with image processing that cannot be disabled, meaning that it, like modern displays, is going to add some amount of latency compared to a more traditional consumer CRT or a PVM/BVM.

The results from the second test were also surprising. Since the signal being output from the CRT’s “monitor out” shouldn’t be subject to any image processing, this would present a good way to run a “drag race” between the two panels to directly compare display latency.

The image on the left is with the composite signal going directly into the LCD panel, which adds one frame of input latency. The image on the right is using the Framemeister to upscale the image to the LCD’s native 1080p, which results in no additional input latency. This isn’t to say that the Framemeister doesn’t add any input latency at all. What it means is that the internal scaler and image processing of the Sony CRT add the same amount of latency as the XRGB-Mini. If my manual input latency testing is accurate, they both add about a frame and a half of latency; something only the most elite fighting-game players would notice.

Closing

The purpose of this testing wasn’t to determine which display is better, but to test specific differences between the two. Using the 240p Test Suite’s various tools, I was able to see frame stutters on the LCD panel that aren’t present on the CRT. The scaling of 240p content on both displays was between mediocre and above average. Geometry was accurately represented on the LCD, but rather poorly on the CRT due to lack of calibration, age, or whatever lingering issues had caused the sync problems mentioned earlier. All that said, it would appear that I’ve lost nothing by upgrading from this particular CRT to this particular LCD. Which is pretty cool.

Resident Evil 7: Beginning Hour

Resident Evil isn’t a game I ever paid much attention to. It seemed neat, but I just never had access to it as a kid. For whatever reason, I was really excited to try the demo at PAX this year, but it was by appointment only and it was always booked up for the whole day. On the last day of PAX, my friend and I decided to get there well before doors opened to try to secure our spot. Four minutes after the doors opened we finally got to the booth, only to be told that all the appointments had been filled and we would have to wait in the stand-by line, hoping for no-shows.

Needless to say, I didn’t stick around.

Since then I hadn’t given the game much thought outside of the occasional promotional video, but when I saw the demo listed on my Xbox One dashboard I was really interested to see if it was, indeed, a scary game.

The answer is yes, yes it is scary. Penny Arcade coined the term “fear shitting”, which I would also use to describe this game (though they were playing using PSVR). It’s not something I want to explain too much because it really deserves to be experienced with no preconceptions and no biases. It’s truly an amazing experience with lots of replay value.

That said, if you really don’t want to play it yourself, here’s a playthrough of the game with no unnecessary items collected or rooms explored, ending with the ‘bad ending’. This way you can experience what the game is like without ruining the exploration and puzzle-solving elements.

If you’ve already played through the game and discovered the various endings, here’s a speed run where I complete the game and achieve the ‘true ending’ in only three minutes. As an added bonus I even talk you through it while I play!

Anyway, RE7 looks to be a great game and I’ll definitely be doing a let’s play series when it comes out in the beginning of next year.

PC Gaming Under $100

A while back I picked up a Dell Optiplex 755 from RE-PC, a local computer recycler, for $25 with the intention of making it a dedicated retro PC gaming rig that could run DOS-era games without the need for DOSBox. Games like Rainbow Six, Quake III Arena, and Nightmare Creatures run beautifully with all the visual settings maxed out at 1080p, and the system seems to be damn near asleep while doing it.

Fast forward to November 25th when OzTalksHW uploaded this video documenting his build of the “OzBox”, a $160-ish gaming build.

“…if you’re thinking about building your own OzBox then definitely tweet at me and use the hashtag #OzBox because I really want to see you guys’ creations.” Challenge accepted.

Genesis

Our starting points are pretty similar. Both are small form factor computers, originally designed for business tasks and general home computing, with very little in the way of upgradeability. The key is that they both have a single unused PCI-Express x16 slot.

                FakeGamerBox                  OzBox
Base System     Dell Optiplex 755             HP Compaq 6000 Pro
CPU             Core 2 Duo E6550 @ 2.33 GHz   Core 2 Duo E6300 @ 2.8 GHz
RAM             2 GB DDR2 800 MHz             4 GB DDR3 1333 MHz
Storage         80 GB SATA                    250 GB SATA
Price           $20                           $50

The OzBox comes with a newer, faster CPU, double the memory, and three times the storage for an extra $30 (or 2.5x the price, if you want to make it sound more sensational). To be fair, that $30 difference is entirely the shipping cost from the eBay auction, so if you could find a similar deal locally, that would be the way to go.

For $20 you can buy a Core 2 Quad Q6700 at 2.66 GHz. For another $10 you can pick up a 2x2GB kit of RAM to replace the 2x1GB sticks the system comes with, or even supplement the RAM your system already has. I happened to have a 2x2GB kit laying around, so I replaced the old memory with the new stuff.

Our video card of “choice” (that is, out of the very limited selection of low-profile graphics cards available, this is the one I felt like spending my money on) is an nVidia GT 730 by PNY. I picked this 2GB model up for $54, so depending on what upgrades you need, we’re looking at $79-109 before tax.

Falling at the Starting Line


My original goal was to keep Windows XP on this machine, and that’s how I started testing. Unfortunately, running XP meant I was limited to 4 gigs of RAM (the board supports 8), spotty driver support, questionable-at-best security, and, most importantly, most modern games and benchmarking utilities simply won’t run. This limited me to testing older titles that didn’t reflect what a “gaming PC” should be able to play, so after countless software crashes and failed benchmarking attempts I eventually caved and installed Windows 7. Depending on whether your computer came with Windows 7 installed, came with the OEM license stuck to the top or back, or needs a whole new copy purchased, this could add some amount of cost to the build.

Benchmarks

I couldn’t decide if I wanted to test across an array of resolutions or target a common low resolution like 1280×720. Eventually, I settled on 1280×960 for a couple of reasons. First, it seems to be the resolution of choice for pro Counter-Strike players, so that’s the resolution I wanted to test there. Second, I thought it would be ideal to use the same resolution across all the games to get more comparable results. I also did all my benchmarking with four gigs of memory installed. Originally I was going to test with two gigs installed, then four, but it seemed like a huge hassle when the cost to upgrade (assuming you don’t have some laying around) is so little.

Dual Core

For my first round of testing, I went with the PC game I play most often: Counter-Strike. Unsurprisingly, Global Offensive had the lowest average frame rate at 62.4 FPS, with a 90th percentile frame time of 20.8 ms (48 FPS).

Going back to CS 1.6, I saw an expected boost in performance, but also an unexpected boost in erratic frame times. The average frame rate was 168.9 FPS, the 90th percentile frame time was 7.2 ms (139 FPS), and the frame time deviation was 39%. Ideally, we would see the individual frames bunched as close as possible to the average (which we do see with Source and Global Offensive), rather than scattered across the chart.
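
A note on how I’m summarizing these numbers: “frame time deviation” here is the standard deviation expressed as a percentage of the average frame time. Here’s a minimal Python sketch of the stats (my actual logging tool is separate, and the sample values below are made up):

```python
import statistics

def summarize(frame_times_ms):
    """Summarize a run of per-frame render times, in milliseconds."""
    avg = statistics.mean(frame_times_ms)
    p90 = statistics.quantiles(frame_times_ms, n=10)[-1]  # 90th percentile frame time
    deviation = statistics.stdev(frame_times_ms) / avg * 100
    print(f"average: {1000 / avg:.1f} fps")
    print(f"90th percentile: {p90:.1f} ms ({1000 / p90:.1f} fps)")
    print(f"frame time deviation: {deviation:.0f}%")

summarize([5.2, 5.9, 6.1, 5.5, 9.8, 5.7, 6.0, 5.4, 5.8, 7.3])
```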

I was most impressed with CS: Source. Here I saw the highest average FPS of the three games, 212.3, the lowest 90th percentile frame time at 5.9 ms (169.5 FPS), and the lowest frame time deviation at 22%. This gave the best sense of responsiveness and fluid gameplay of the three.

Unreal Tournament has always been a game that combined incredible graphics with fast gameplay and blazing frame rates. I remember being absolutely floored by Unreal Tournament 2003 on my AMD AthlonXP 1000+ and nVidia MX440, then again by Unreal Tournament 3 on the high-end machines at work back in 2007. So how do these titles hold up on our budget hardware?

UT 2004 had an average frame rate of 146.2 FPS and a 90th percentile frame time of 8.6 ms (116.3 FPS). Despite the modest frame rate and low video settings the game still looked great and was an absolute blast to play again.

I had my doubts about how well Unreal Tournament 3 would run, but those were soon laid to rest. With an average frame rate of 74 FPS I was concerned it would dip below 60 FPS, but with a 90th percentile frame time of 15.4 ms (64.9 FPS) it managed to stay consistently playable. Most surprising was the frame time deviation, which was only 14%, meaning the frame times were very consistent.

It shouldn’t have surprised me as much as it did, but Left 4 Dead runs amazingly well on this setup. Since the game has both tight corridors and large outdoor areas flooded with zombies, I figured I should measure the performance of both scenarios, which turned out to be quite similar. Indoors saw an average frame rate of 101.3 FPS with a 90th percentile of 12.9 ms (77.5 FPS). Outdoors, while being swarmed by zombies, the frame rate averaged 100.1 FPS with a 90th percentile of 14.3 ms (69.9 FPS). My playthrough of the first mission was very smooth with no noticeable frame drops, stutters, or other performance issues. Then again, it’s an 8-year-old title at this point, so the impressive performance does make sense.

Quad Core

I replaced the dual core E6550 with a Q6700 quad core processor (a surprisingly simple task in this machine), which is the best CPU this motherboard supports. In addition to the extra cores we also get a 333 MHz clock speed increase, so even single-threaded games should see a performance boost.

CS 1.6 saw a huge boost to average FPS, jumping from 168.9 to 211.0, and more importantly, the average frame time deviation (how far away each frame was compared to the average) dropped from 39% to 12%. That means that, rather than having a frame rate that consistently jumps up and down, it stays stable throughout gameplay.

CS: Source had its average frame rate drop from 212.3 to 181.5 while its frame time deviation swelled from 22% to 35%. I’m not sure why this happened, but it happened consistently.

CS: GO got a modest increase from 62.4 FPS to 75.3 FPS with a minor reduction in frame time deviation.

Both Unreal Tournament games saw virtually no change whatsoever, which leads me to believe that they’re being bottlenecked by the video card.

L4D got a substantial 13.2 FPS gain while dropping its frame time deviation from 27% down to 12%.

This was originally going to be the end of my benchmarking since the selection of modern games that could still run on Windows XP was limited, but I decided to install Windows 7 and see what this hardware was really capable of.

Quad Core on Windows 7

Now that we can install pretty much whatever we want (that will fit on the measly 80 GB drive), it’s time to really put the hardware through its paces.

When I loaded 3DMark, the recommended benchmark was Fire Strike Extreme, which made me audibly laugh. After selecting the standard Fire Strike test, which I already knew would be too much for the system to handle, it came back with a score of 639.

A fairer test would be Sky Diver, which came back with a score of 2,639, exactly 2,000 points higher. Still not great, but at least it’s a real score this time.

CS 1.6 saw another boost, hitting an average of 232.9 FPS. CS: Source continued to drop, this time hitting 153.0, down from the 212.3 it managed with a slower dual core CPU on Windows XP. I just can’t wrap my mind around this. Maybe it’s the servers, maybe it’s something hardware or operating system related; I have no idea. CS: GO managed nearly identical results, with a 115.4 average FPS and a nearly identical 90th percentile frame time.

UT 2004 also saw a drop in average frame rate, down from 142.6 to 109.2 with the 64-bit patch. Without the patch, the average frame rate was 100.1. This might be something OS-related, but considering how rarely I play UT 2004 and how little that extra 40 FPS actually matters, I’m just going to leave it alone. UT3 saw no notable change.

L4D, unsurprisingly, was nearly identical to the previous results under Windows XP. Average frame rate grew from 113.3 to 115.4 which is well within the margin of error.

Finally, we get to look at some results from new games that didn’t run on XP.

I wasn’t expecting much out of DiRT 3. It’s a great-looking modern racing title heavy on physics. I lowered the resolution to 1280×720 and ran three benchmarks. The first was with all visual options turned to their lowest settings or completely disabled. The second was the “medium” preset, and the last was with the “high” preset.

This result floored me. I didn’t know the puny hardware inside this little case was capable of playing modern titles like this. Granted, it is at a low resolution and moderate graphics settings, but for $100, that’s not too shabby.

Rocket League is another game that shocked me with how well it performed. With all the visual options low or disabled I saw a respectable 58.6 FPS average, with 90% of frame times at or below 20.2 ms (49.5 FPS or better). Leaving the Render Detail on “High Performance” while turning the Render Quality to “High Quality” resulted in a pleasing image that ran at 43.5 frames per second. While mid-to-low 40s might not usually be an ideal frame rate, I found that, with Rocket League, it was plenty for knocking the ball around in the standard 3v3 game type, and I didn’t feel limited by the computer’s performance at all.

The last game I tested was 2013’s Tomb Raider, which… it ran, and it seems playable, but only with the “Low” graphics preset at 720p and with motion blur and screen effects disabled. It’s possible to play the game at the “Normal” preset, but with frame rates down in the 20s it makes for an unpleasant experience.

Conclusion

Did we accomplish our goal of spending about $100 to play PC games? Yes, absolutely. Is it a good experience? No, not really. You’re better off buying an Xbox 360 or PS3 than trying to build an ultra-budget gaming PC, but if money is really tight and you just need to play those PC-only titles like Counter-Strike: Global Offensive, League of Legends, or DOTA 2, this is certainly a possible solution.

Compared to the OzBox this build is based on, how did we do? Well, it’s hard to compare directly. We benchmarked different games, so our numbers are going to differ. The OzBox hardware is better, there’s no doubt about that; the GTX 750 Ti he picked out for his build costs as much as our whole system did, if not more. Comparing a roughly $100 PC to a $180 PC doesn’t exactly make sense, so I would say this version of the build is for people who want to play older titles, or some newer titles if budgets are limited. Ozi’s original version aims more toward the casual gamer who wants the option to play modern titles either at a low resolution with the pretty visuals turned up or at a high resolution with lower graphics settings, while still maintaining a 60+ FPS target.

I’ve ordered some parts for a follow-up article, seeing just how far we can push the limits of this compact gaming rig (possibly making it not so compact), so check back for updates.

Drift King: Shutoku Battle ’97 Review

Holy shit, I was so hyped to play this game when I first saw it. Sega Saturn, Genki, Keiichi Tsuchiya, freeway racing; a perfect storm of nostalgia that I had to have.

This game does a lot of things right. The opening intro is clip after clip of Tsuchiya drifting with hair metal blasting in the background. The car selection, while initially small, is on point. Blasting through the highways of Tokyo? Awesome! Until you realize how much faster your opponent is, that traffic is actually out to kill you, and your car seems to drive exactly the same regardless of how many upgrades you buy. Welcome to Shutoku Battle ’97.

That’s a lot to take in all at once, so I’ll break it down. The graphics are pretty good, with some nice touches here and there, like pseudo-dynamic time-of-day changes and your dash lighting up when you drive through a dark tunnel. The controls take a while to get used to, but once you master the “drift” button and learn to throttle the gas to adjust your angle and grip, the game plays like a dream. The Saturn version of the game has a different soundtrack than its PlayStation counterpart and suffers because of it. The PlayStation version gets full Red Book audio while the Saturn version is limited to synthesized audio. The difference is pretty severe and takes the soundtrack down from “badass” to “completely forgettable”.

Unfortunately, the negativity doesn’t stop there. Because this game takes place on public highways there will, of course, be traffic. In titles like Wangan Midnight: Maximum Tune the traffic largely stays in its lane, moving over when it’s reasonable to do so. Here, the buses, cars, and semi-trucks all change lanes just as you’re coming up behind them. These same vehicles will do this on two-lane roads to create rolling barriers that keep ramming into you until you slow down enough to drive around them. Then, they’ll swerve back into their original lane to bash into you all over again.

With the excellent controls, impressive visuals, and wonderful aesthetic, you might be willing to look past the weak music selection and insane traffic to enjoy this otherwise great game. Until you get about a third of the way through the campaign and the rival cars completely outclass you. It’s not like “Oh, the enemies are harder now, I guess I need to try harder”. More like “Holy shit, after the first lap he’s already 30 seconds per lap faster than me”, which is a lot when each lap is only a minute and a half. You can upgrade your car, and even switch to a more powerful car and upgrade that one, but with how expensive upgrades are compared to how little you make after each loss, you’re essentially going to spend hours and hours losing in the hope of maybe, eventually, being as fast in a straight line before getting killed by a bus.

The game shows a lot of promise, and some of these issues may have been fixed in the Playstation release. Unfortunately, I don’t have that version, so I’m stuck with a semi-playable disappointment.

Insane Game Prices at Half Price Books

Classic and retro game prices are on the rise. $30 for Pokemon Red and Blue is pretty normal despite being 20 years old and selling over 45 million copies. Super Mario World? $20-25 despite being the pack-in game for the Super Nintendo. It’s no wonder gamers want to know where they can get the best prices for (legitimate) games. Without fail, whenever I hear feedback from friends or YouTubers about where to find cheap games, I always hear Half Price Books come up. It’s not much of a surprise; Half Price Books sells more than just books and many items can be found for an incredible bargain. Just yesterday I paid about $12 for a handful of laserdisc movies, including Blazing Saddles and a sealed copy of The Birdcage (yes, I’m secretly an old man).

Last year I stopped by my local Half Price Books to see what all the noise was about. There were a few rows of modern and last-gen games, but most of it was the kind of stuff you probably don’t want to buy, at prices that reinforce that feeling. All of the good stuff was, of course, behind lock and key. I was pretty outraged by the prices, went home, and quickly forgot about it. Recently, I happened to be nearby and decided to see if anything had changed.


Let’s start with the elephant in the cupboard, the $150 Xbox 360. I need to say that again, slowly. One-hundred-fifty-dollar Xbox 360. I can’t even begin to guess where they got a price like that or how long it’s been sitting in there. You can walk into a GameStop and purchase a Halo limited edition Xbox 360, a Modern Warfare limited edition Xbox 360, or a generic white Xbox 360 that comes bundled with Battlefield, Modern Warfare, and Assassin’s Creed, all with cables, controllers, and a warranty, for the same price as this one console. I’m just… beyond words.

Then there’s $50 for a PlayStation 2, $40 for a Wii, and $100 for a Kinect (which are available en masse from GameStop for $20-25, depending on whether it’s the original or S version)… I’ve been tempted to ask someone if these prices are accurate, but I also have no intention of buying them, so I haven’t bothered.


Then there are the games themselves. $20 for Tetris? One of the best-selling games of all time? $25 for Super Mario Bros. 3? $75 for Legend of Zelda and Super Mario World?! No, no no no no no. No. Even Hogan’s Alley is 2-5 times as expensive as the current eBay Buy-It-Now prices.

Maybe this is just the result of a rogue employee trying to get every penny possible out of game trade-ins, or maybe someone was looking up complete-in-box pricing when coming up with these prices. Who knows. If anyone has had similar or different experiences at their local Half Price Books I’d love to hear about it.

Let’s Complain about the Nintendo Switch

Note: While I was writing about the potential Wii U backwards compatibility on the Switch I neglected to take into consideration that there is an option to play Wii U titles exclusively on the Wii U gamepad, which resolves one issue I brought up. That said, I still don’t believe we’ll be seeing backward compatibility on the Switch.


Recently, Nintendo announced their new… console? Portable gaming platform? Whatever it is, there isn’t a whole lot known about it outside of what we saw in the promotional video.

Google search results for backward compatibility on the Switch.

Let’s address the biggest “issue” I’ve seen popping up lately: The Switch is not backward compatible with the 3DS or Wii U. In a nutshell, no shit. The 3DS and Wii U are both dual-screen systems, so where this expectation came from that a single-screen system would support dual-screen games is beyond me.

Granted, Nintendo does have a history of supporting old games and hardware on new systems. The Super Nintendo could play Game Boy games and the GameCube could play Game Boy, Game Boy Color, and Game Boy Advance games, though both had their own issues and required extra hardware. The Wii supported GameCube games and controllers natively due to the Wii simply being a faster GameCube, and the Wii U supports Wii games, Wii controllers, and even GameCube controllers with a USB breakout box. While not officially supported, it’s even possible to play GameCube games on the Wii U through some software hacking. Even the Super Nintendo was going to have backward compatibility with the original NES, at first natively, then through a hardware add-on, but it proved cost-prohibitive.

So why not support backward compatibility on the Switch? Let’s start with 3DS compatibility. The Switch has enough buttons to properly replicate the 3DS controller, but still only has one screen. In theory, it could be possible to use the Switch’s screen as the lower screen and your TV as the upper screen, but Nintendo has dispelled that already by designing a docking station (which is how you get video to your TV) that completely obscures the Switch’s screen. Additionally, the idea that 3DS games would be supported, but only in very specific circumstances, doesn’t make much sense. There’s also the issue of hardware differences between the Switch and 3DS. It hasn’t been confirmed that the Switch even has a touch screen, something an overwhelming majority of 3DS games require, or at least make use of in some way, which puts a huge limit on the number of games that could be played.

Traditional backward compatibility comes from similarities in CPU architecture. Like I stated above, the Wii uses the GameCube’s CPU design to achieve perfect compatibility. The PlayStation 2’s CPU is vastly different from the original PlayStation’s, so Sony included a PlayStation CPU inside the PlayStation 2 for backward compatibility. Due to CPU changes between the original Xbox, Xbox 360, and Xbox One, backward compatibility between those generations was achieved through software emulation (that is, software pretending to be hardware, allowing a game to run on hardware it wasn’t designed for). Emulation was not available when each console launched, not all games were supported, and many that were supported exhibited graphical and performance issues. So if we apply that logic to the 3DS and Switch: sure, Nintendo could possibly get software emulation working on the Switch, but with potentially iffy results, a poor user experience, and no additional benefit over just using a 3DS, there is literally no reason Nintendo should support this.

Analogy of backward compatibility where it doesn’t belong.

So what about the Wii U? All the CPU architecture and software emulation stuff still applies, so I’m going to skip over that part. The biggest reasons I could see this not happening are the storage medium (Wii U uses a proprietary disc format) and, again, lack of a second screen.

The Wii U stores its games on a 25-gigabyte disc that is similar to, but not the same as, Sony’s Blu-ray discs. The Switch doesn’t have an optical drive. See a problem? It’s not like Nintendo would let you rip your games from your Wii U and transfer them to your Switch. Not to mention that we don’t know if there is a touch screen or motion controls (though the announcement of Just Dance suggests the latter might be happening). Lacking either of these features would break plenty of games. Speaking of motion controls, the system is supposed to be portable; who in their right mind is going to be using motion controls on an airplane, at the park, or even in their own living room? How about four- and five-player games? On a 6-inch touch screen? No chance.

All of this isn’t to say that there won’t be any backward compatibility. Nintendo could breathe new life into their Virtual Console service, allowing players to play older portable games… portably. Kirby’s Dream Land on Game Boy? Pokemon LeafGreen and FireRed on Game Boy Advance? There’s even the potential for Nintendo 64 and GameCube titles to be played on the go, not to mention non-Nintendo systems that are already supported, like the TurboGrafx-16 and Neo Geo.

The most important point to consider, I think, is that we already have devices that perfectly play 3DS and Wii U games: they’re called the 3DS and Wii U. If those are the systems you want to play, just buy those systems. You’ll save money and have a much better experience.

Okay, so what else are people complaining about? Battery life. In this article from Forbes, contributor David Thier says “It would appear to be a pretty powerful machine for the size, and that doesn’t come cheap power-wise. So we’re going to need a machine that gives us 5+ hours of playtime — if we’re short of that, we’re going to have a problem.”

Hello? We’re going to need 5+ hours of play time? Okay, hang on. We need to talk about use cases. Computers and game systems don’t use a constant amount of electricity; it varies depending on what you’re using the system for. This Gizmodo article tests Apple’s claims of 10+ hours of battery life on the original iPad. With approximately 50% video watching and 50% gaming, they got just under 6 hours of battery life. What’s important to note here is that the iPad’s twin-cell battery is massive. Yes, processors have become more efficient and battery capacities have grown since then, but it gives you a realistic expectation. CNET claims 3-5 hours of game time from both the PlayStation Vita and the 3DS. The Switch looks to be quite a bit more powerful, with a bigger screen, which means more power consumption. I think David Thier is going to have a problem, but only because of unrealistic expectations, bordering on entitlement.

The last issue I see people complaining about is price. We don’t know all the details about the Switch; we don’t even know everything it can do. Is the screen 720p? Does it output native 4K? Upscaled 4K? 1080p? What is the quality of the graphics? Last-gen console? Top-of-the-line tablet? We don’t know anything about what we’ll actually be looking at come March, so making blanket statements like “it can’t cost more than…” really doesn’t make any sense.

Let’s all just take a long breath, exhale, and just take what we know at face value.

Of course, the most important thing we know is that a new GameFreak-made Pokemon game is coming to the Switch.

Retro Gaming in the Modern World, part 1

You know that feeling when you start to look something up on WebMD and you start to panic because you think you have some terrible disease? That’s kind of what happened to me when I started looking into retro gaming video quality. My WebMD, in this case, was the My Life in Gaming RGB Master Class. I had been doing a lot of research on playing retro games on modern displays, but my only modern display was a Panasonic plasma TV, which is not ideal for retro games due to the risk of image retention and burn-in. As luck would have it, my large CRT had started to act really strange when it first turned on, and it was only getting worse. The replacement for my failing CRT handles retro games with surprising grace but still falls flat in a few areas. To address those issues I’ve purchased a video upscaler. Why not just plug in my consoles and let the TV do its thing? Well, that takes a lot of explaining. In part 1 we’ll cover some of the technical background we need before diving head-first into what the scaler does.

Pixels, Sub-pixels, and Resolution

An example 4-pixel by 3-pixel display with each red, green, and blue sub-pixel shown.

When an image is displayed on a screen you’re actually looking at small squares called pixels (short for ‘picture elements’) that, when viewed from a distance, make up an image. On top of that, each pixel is made up of three sub-pixels, each one displaying either red, green, or blue (RGB). Colors are created by changing the brightness of each red, green, and blue sub-pixel individually. For example, if red and green are at full brightness and blue is completely darkened you get a bright yellow.
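
As a rough sketch of that idea (ignoring display gamma and other real-world details), here’s that bright-yellow example expressed as 8-bit sub-pixel values:

```python
# Each sub-pixel brightness ranges from 0 (fully dark) to 255 (full brightness).
red, green, blue = 255, 255, 0  # full red + full green, no blue
print(f"RGB({red}, {green}, {blue}) = #{red:02X}{green:02X}{blue:02X}")  # bright yellow (#FFFF00)
```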

Standard definition is 480i: 480 lines (rows) of vertical resolution with interlaced video. Interlacing displays only the odd lines of one video frame (1, 3, 5…), then the even lines of the next (2, 4, 6…). Modern displays are typically 1080p, with 1,080 lines of vertical resolution and progressive scan. Progressive scan means the whole image is drawn in a single pass, on every line, rather than alternating lines. The result is much better video quality when there’s fast motion or scrolling text.
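
To make that concrete, here’s a toy sketch of how an interlaced signal splits a frame into alternating fields (a simplification; real video timing is more involved):

```python
def split_fields(scanlines):
    """Split a frame (a list of scanlines) into its two interlaced fields.
    Counting from line 1 like TV signals do: the odd field is lines
    1, 3, 5..., and the even field is lines 2, 4, 6..."""
    odd_field = scanlines[0::2]
    even_field = scanlines[1::2]
    return odd_field, even_field

# An interlaced display alternates: odd field of one frame, even field of
# the next. A progressive display draws every line of every frame.
odd, even = split_fields([f"line {n}" for n in range(1, 481)])  # a 480-line frame
print(len(odd), len(even))  # 240 240
```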

It should be noted that I’m only talking about vertical resolution, the number of rows that make up the image. This is because the horizontal resolution, the number of pixels in each row, could vary wildly. Even the true horizontal resolution of standard definition was often wider than what the TV was able to display, and some games ran at wider resolutions than others, even though the line count was the same.

Most retro game consoles only had the processing capability to generate 240p video, which, despite being a non-standard resolution, TVs of the time were able to display without issue. It wasn’t until the Sega Dreamcast that consoles commonly output 480i and 480p images. Most modern TVs are able to accept and display a 240p image, but they see this non-standard resolution as 480i and attempt to deinterlace an image that was never interlaced to begin with, ironically making the image appear interlaced and introducing other potential issues. This can be as minimal as a blurry image, but it can also interfere with flickering transparency effects, effectively making some sprites and characters disappear when taking damage. The process of upscaling this “480i” signal to 1080p can also introduce input lag, making time-sensitive games like Mega Man or Beatmania nearly impossible to play.

Connection Types

So now we understand what makes up a picture, but how does that picture get from the console to the TV? When the console generates each frame of video it leaves the image processor and enters a digital-to-analog converter (DAC), which turns the video into a signal that the TV can display. The quality of the video that gets sent to your TV depends largely on two things: the quality of the DAC, which you can’t change, and the connection type used, which you usually can.

RF adapters

RF adapter for the Nintendo Entertainment System.

There was a time when many consumer TVs in the United States had only a single video input: the coaxial connection, also called the antenna connection. This was used for both over-the-air and cable TV signals, and it was often the only way to plug in your video games. Internally, the game system generates the video signal digitally, converts it to an analog signal, then sends it to an RF (radio frequency) adapter, which converts that analog signal into another kind of analog signal that, to the TV, looks just like a TV broadcast. If you remember having to use radio adapters to listen to your iPhone in your car, it’s the exact same thing but with a physical connection. The signal was also susceptible to interference from other devices and actual TV broadcasts, which would create distortions and ghost images. All this, combined with cramming all the audio and video information into a single cable, really took a toll on image quality.

As a side note, even if you wanted to connect your console to your modern high-definition TV this way, many no longer include analog TV tuners (since analog broadcasting is no longer used in the US), so this may not work at all.

Composite

Typical composite cables: red and white for audio, yellow for video.

Where RF combines audio and video data into a single connection, composite only transmits video data; audio is transmitted over one or two separate RCA cables (white and red). Picture quality is greatly improved because there’s less information to transfer over a single connection, there’s one less signal conversion, and the connection is not susceptible to the same interference as RF. A lot of newer TVs support composite but not s-video, so in some situations this may be the only connection type you can use.

This connection is also referred to as “AV” or “RCA”, though RCA is the physical connector type and doesn’t refer specifically to composite video.

S-video

S-video cable, carrying separate chroma and luma.

S-video, short for ‘separate video’, splits the video signal into two connections: one for color (chroma) information and one for brightness (luma) information. Composite video carries both of these signals over a single wire on two separate frequencies, where they can interfere with each other, causing blurriness in the image. Separating them into their own connections means they cannot interfere with each other, providing a higher quality image.

If your TV supports it, S-video is typically the way to go. Most consoles support it and it’s typically the best video quality you can get with a very minimal investment.

Component

Component cables for YPbPr video. Audio cables not shown.

The correct name for this connection is YPbPr, but it’s largely known as ‘component’. It carries video over three separate RCA cables: one for luma (which is basically a weighted combination of the red, green, and blue color information), one for red minus luma, and one for blue minus luma. Green is recreated by subtracting the red and blue difference signals from the luma information. It’s also possible to carry an RGB signal over this connection, which the PlayStation 2 has the option to do, but most TVs don’t support it.
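
For the curious, here’s roughly the math behind that split. This sketch uses the BT.601 coefficients that standard-definition video is based on (actual console DACs vary in scaling and precision):

```python
def rgb_to_ypbpr(r, g, b):
    """Convert RGB (each 0.0-1.0) to YPbPr using BT.601 coefficients."""
    y = 0.299 * r + 0.587 * g + 0.114 * b  # luma: a weighted mix of R, G, B
    pb = (b - y) / 1.772                   # blue-difference signal
    pr = (r - y) / 1.402                   # red-difference signal
    return y, pb, pr

# The TV recovers green from what's left over:
#   g = (y - 0.299 * r - 0.114 * b) / 0.587
print(rgb_to_ypbpr(1.0, 1.0, 0.0))  # bright yellow: high luma, strongly negative Pb
```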

For consoles with AV multi-out ports it should be possible to get YPbPr video by using a SCART cable with a SCART-to-component converter, though your results may vary depending on the console and TV used. You’ll also still be getting 240p output, so you’ll end up with blurring, deinterlacing, and input lag issues similar to what you’d get with composite and s-video.

What’s the Result?

I took a screenshot of Super Mario World and did some Photoshop work on it to give you an example of the kinds of image quality differences you can expect from each connection. For a more real-life comparison, check out the RGB Master Class series.

So What’s the Solution?

There’s a group of video products called scalers that take standard-definition signals and output them at 720p or 1080p. Most of these devices expect a 480i signal, so while you might have less input lag and fewer of the issues caused by the TV’s misinterpretation of the 240p signal, you might still end up with some distortion. Common issues are halos around sprites from heavy-handed sharpening, and image stretching to fill the TV screen. While there are plenty of options out there, the best so far seems to be the Micomsoft XRGB-Mini, also known as the Framemeister. This piece of hardware was designed specifically for 240p video, allowing for proper, distortion-free scaling. Mine was just delivered today, and I’ll be documenting my experience with it as soon as I’m back from Korea.

Another solution is console-style emulators like the RetroN, but I’ve never liked that option. Yes, it uses cartridges, but there’s nothing authentic about the feel of it, the controller is garbage, there’s apparently some amount of input lag, and I already have a PC to connect to the TV, so why pay for an emulator when you could legitimately download one for free?

There’s also official emulation from Nintendo, Sony, and Microsoft, as well as backward compatibility on newer consoles with higher quality output. Some consoles offer perfect compatibility, like playing PlayStation games on a PlayStation 2, but the Xbox 360’s emulation of original Xbox games is hit-or-miss, and usually ‘miss’. Having a single solution that solves all my video issues, rather than a dozen band-aid solutions, is the better option for me, and the HDMI output from the XRGB-Mini also allows for easy capture of extremely high quality video for streaming or recording gameplay videos.

Frame Rates in Counter-Strike Throughout History

While I was watching a Counter-Strike: Global Offensive stream on Twitch, questions came up about the streamer’s resolution. A popular resolution for competitive players is 800×600, for a few reasons mostly revolving around performance. Thinking about this superfluous bit of information sent my brain down a rabbit hole, eventually ending up at “I wonder how each version of CS runs on modern hardware.” I booted up each of the main versions of CS (1.6, Source, and Global Offensive) and got to work.

By default, CS is limited in the number of frames per second it can display, so those limits had to be removed. Next, I set the resolution to 800×600 so we were measuring against some kind of standard. There weren’t many, if any, visual options in CS 1.6, so there wasn’t much to do there. In Source, I turned all the settings up to their maximum values, except for HDR, which I disabled because it’s not something I would have enabled anyway. In Global Offensive I normally run rather conservative settings; I turned down a few options, like texture filtering, to better match how a competitive player would have their game set up. Here are the results.

Frames per second in each of the three main releases of Counter-Strike.

So… wow, these numbers are kind of all over the place. Let’s take a closer look and see what’s going on here.

In 1.6, the oldest of the three versions, we have an average frame time of 3.9 ms (253.5 fps) with a standard deviation of 1.5 ms. Frame times drop as low as 2 ms (500 fps), and the 90th percentile (that is, what 90% of the frame times will be better than) is 5 ms (200 fps). In Source, a version that’s still quite old but much more optimized, we have an average frame time of 3.5 ms (285.7 fps) with a standard deviation of 1.6 ms. The results are, as near as makes no difference, identical between these two games. Even the 90th percentile is only 0.8 ms higher at 5.8 ms. It would appear, then, that these older titles are hitting some kind of limitation in the game engine rather than the hardware.

Global Offensive had an average frame time of 8.4 ms (119.0 fps) with a standard deviation of 1.9 ms. Why does GO have a higher deviation than the other versions when its graph appears much more stable? Part of it is the way the graph displays the relationship between frame time and frame rate, but another reason is that at these longer frame times (8.4 ms compared to 3.5) each millisecond of frame time has less of an impact on the overall frame rate. The difference between 2 ms and 4 ms is 250 fps, but 8 to 10 is only 25. Here’s the same chart displaying only the frame rate.

Same chart as earlier, displaying frame rate linearly.
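
That inverse relationship is easy to verify yourself; here’s a quick Python sketch of the arithmetic (illustrative only, not part of my benchmarking setup):

```python
# fps = 1000 / frame_time_ms, so the same 2 ms change in frame time
# costs wildly different amounts of frame rate at different speeds.
for fast_ms, slow_ms in [(2, 4), (8, 10)]:
    fps_lost = 1000 / fast_ms - 1000 / slow_ms
    print(f"{fast_ms} ms -> {slow_ms} ms: lose {fps_lost:.0f} fps")

# 2 ms -> 4 ms: lose 250 fps
# 8 ms -> 10 ms: lose 25 fps
```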

Here the difference in frame rate looks much more severe for 1.6 and Source. Is there anything we can do about that? We could enable V-sync, which would delay the presentation of each frame to the monitor until the previous frame has finished drawing, but that would introduce a delay between what’s happening in the game and when it’s displayed on the screen. Instead, we’ll leave V-sync disabled and tell the game to limit the frame rate using the fps_max command, setting it to 300 for 1.6 and Source and 130 for GO.


Limiting the maximum number of frames to make gameplay smoother.

This had two unexpected effects. First, while Source’s average frame time only decreased from 3.5 ms to 3.4, the standard deviation dropped from 1.6 ms to only 0.5! Second, the frame rate didn’t get any more stable in 1.6. Global Offensive was expectedly smoother, but still bumpy, because we’re running into the limits of my CPU and GPU, but it’s odd to me that 1.6 didn’t see any smoothing. I set fps_max to 300, 250, and 200 to see if we could get smoother frame times, and…

CS 1.6 with fps_max set to 300, 250, and 200.

Surprisingly, while lower fps_max values did stabilize the frame rate some amount (standard deviation was 1.2, 1.0, and 1.2 ms respectively), I never saw the same kind of stability as I did with Source capped at 300 fps. It’s unfortunate, but it’s likely due to a lack of modern optimizations in the game engine, like hardware-based smoke and particle effects and multicore rendering. Average frame times were 4.1 ms (241.7 fps), 4.5 ms (221.2 fps), and 5.2 ms (192.8 fps), which is a little disappointing, but at that 200+ fps level, you’re going to be hard-pressed to notice a difference anyway.

Going back to the original thing that got me thinking about all this: how does resolution affect frame rate in the most recent version of Counter-Strike?

800×600 vs 1680×1050 vs 2376×1485

Using nVidia’s Dynamic Super Resolution I was able to run the game at 2376×1485, basically the 16:10 equivalent of 1440p. In order to do this, I had to use nVidia’s optimized presets, so I opted for the performance option and ran new benchmarks with the new settings. At 2376×1485 the average frame time was 5.6 ms (179.4 fps) with 7.5 ms (133.3 FPS) in the 90th percentile. At 800×600 you would expect to see double or triple that frame rate, but surprisingly I got an average frame time of 6.3 ms (158.7 fps) with 8.4 ms (119.0 fps) in the 90th percentile. 1680×1050 saw similar results. It seems, in this instance, that increasing the resolution actually increased frame rate as well.
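
If you’re wondering where 2376×1485 comes from: the DSR factor multiplies the total pixel count, so each axis scales by the square root of the factor. A quick sanity check in Python (my own back-of-the-envelope math, not an official nVidia formula):

```python
import math

def dsr_resolution(width, height, factor):
    """DSR factors multiply total pixels, so each axis scales by sqrt(factor)."""
    scale = math.sqrt(factor)
    return round(width * scale), round(height * scale)

print(dsr_resolution(1680, 1050, 2.0))  # -> (2376, 1485) on a 1680x1050 panel
```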

What did we learn? Well, just because a game is old doesn’t mean it’ll run at a million FPS; the engine needs to be designed to scale with increased CPU and GPU capabilities. As we saw, updated game engines are able to run not only faster, but smoother as well. We also learned that GPUs work better when they’re given a healthy workload.

The benefits of gaming at 800×600 don’t appear to be performance-related. Instead, it’s more likely tied to what old-school players are used to. The only significant change is that when using a stretched 4:3 resolution the whole game appears wider, including player models and doorways. On the flip side, you’re also limiting your field of view. Some players may even prefer this, as it keeps the focus on what’s in front of you. TL;DR: it’s just about preference and what’s comfortable. There’s no performance to be gained either way.

Let’s Clear the Air: Decoding Technical Jargon

With Sony revealing the PlayStation 4 Slim and PlayStation 4 Pro yesterday, along with Microsoft’s reveal of the Xbox One S at E3, there have been a lot of terms used that the average consumer may not be familiar with. Terms like “teraFLOPS of compute performance” and HDR make consoles and video cards sound impressive, but what do they actually mean?

Let’s start with FLOPS, or floating-point operations per second. Computers can basically do two kinds of math: with decimal places (floating-point) and without decimal places (integer). Floating-point math is critical for scientific computation, including simulating 3D objects and environments, which is basically what a game is. Processor speed is measured in hertz (Hz), or cycles per second. If a processor runs 100-million cycles per second, its speed is rated at 100,000,000 Hz, or 100 megahertz (MHz). Modern processors are typically in the range of 3 gigahertz (GHz), or 3-billion cycles per second. Depending on the processor, it might be able to perform a single floating-point operation per clock cycle, or maybe it can do 4, 6, or 8 (which also depends on how much precision, i.e. how many decimal places, is used in the math). This is one of the reasons that CPUs rated at the same speed can produce different results. So if our 3 GHz processor can perform 8 FLOPs per cycle, that’s 3-billion times 8, or 24-billion FLOPS (24 gigaFLOPS). Of course, modern processors might have 4, 6, or 8 cores, so if we assume we’re looking at a 4-core CPU we need to multiply that number by 4, giving us 96 gigaFLOPS.
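
Here’s that back-of-the-envelope calculation in one place (toy numbers from the example above, not any real CPU’s spec sheet):

```python
def peak_gflops(clock_ghz, flops_per_cycle, cores):
    """Theoretical peak throughput: clock speed x FLOPs per cycle x core count."""
    return clock_ghz * flops_per_cycle * cores

print(peak_gflops(3.0, 8, 1))  # one core: 24.0 gigaFLOPS
print(peak_gflops(3.0, 8, 4))  # four cores: 96.0 gigaFLOPS
```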

CPUs have to perform a wide variety of computational tasks. Being a jack-of-all-trades means they aren’t quite as fast as a processor dedicated specifically to floating-point calculations. This is where video cards (GPUs) come in. GPUs are purpose-built for doing as much floating-point math as possible. That means that, while a typical desktop CPU might perform somewhere in the 50-100 gigaFLOP range, mid-range GPUs can perform in the 3,000-5,000 gigaFLOP (3-5 teraFLOP) range.

Now that we know what a FLOP is and how it’s calculated, we can look at what Sony claims the PlayStation 4 is capable of. This chart from AnandTech puts the GPU performance of the original and Slim PlayStation 4 models at 1.84 teraFLOPS (1,840 gigaFLOPS) and the PlayStation 4 Pro at 4.2 teraFLOPS (4,200 gigaFLOPS). That 2.3x performance jump means that games can run at higher resolutions, with more texture and model detail, at higher and smoother frame rates, or any combination thereof. Is it enough for native 4K? Probably not. Comparing that 4.2 teraFLOP number to desktop GPUs, it’s right in between a GTX 970 and 980, meaning it’s closer to a 1440p or 2K performer unless you really dial down the rest of the graphics settings.

“But the PS4 Pro supports 4K. How is that possible if it isn’t powerful enough for 4K games?” It might not be able to render typical games at 4K resolution, but it might be possible to render them at 2K resolution and upscale the images to 4K. It’s basically like resizing an image in Photoshop, but with a little bit of sharpening and other effects to make it look kinda like it maybe was originally rendered in 4K. Similarly, if you have a 1080p display, the system could still render a game at 2K and downsample that image to 1080p, resulting in a sharper, more natural image. This technology is already available on the desktop with nVidia’s Dynamic Super Resolution. The PS4 Pro is, however, capable of playing 4K video through services like Netflix and YouTube, though it looks like the Pro does not include Ultra HD Blu-ray support, so you won’t be able to watch your 4K movies on your new PlayStation.

One of the other new features announced is HDR, or High Dynamic Range. Imagine you’re indoors on a bright, sunny day, taking a picture of your friend, who is standing in front of an open window. One of two things is likely to happen: your camera may expose for your friend, leaving the background “blown out” to virtually pure white, or it may expose for the outdoors, leaving your friend as a blacked-out silhouette. This is an example of a low dynamic range. When you look at your friend with your own eyes, though, you can see them clearly as well as what’s outside, with no trouble at all. That’s high dynamic range. HDR video aims to provide a more lifelike range of color and brightness than a typical TV or computer monitor is capable of.

That’s all I can think of for now. If you have any questions about fancy-pants words companies are throwing around in their press announcements leave a comment below.

Pre-PAX Prime PC Preparations

A little over a year ago I decided to undertake a somewhat unique approach to water cooling: using surface area and evaporation to silently remove heat from the computer. If you want details, the build thread is here on the Linus Tech Tips forums. The long and short of it is that, while it did work, summer temperatures made the water evaporate at an annoying rate and the cheap pump I was using generated more noise than I was happy with. That, and I couldn’t move my computer downstairs if I wanted to set up an HTC Vive.

I pieced together a massively overkill water cooling loop with the idea that excessive cooling meant less noise. The end result was this:


I tried to make the diagonal lines work, but what I really wanted was something with a lot more 90-degree bends. The lines running into and out of the CPU cooler weren’t the same length, they ran across the case at slightly different angles, and when combined with the mostly-horizontal line running from the radiator back to the reservoir, I was pretty unhappy with how it all looked. The performance was fine, and I achieved a perfectly stable 4.4 GHz overclock with relative silence. The NZXT Hue+ also provided nice, ever-changing mood lighting which really set the build off.

This year I decided to bring my PC to PAX. There’s a LAN across the street every year, and I figured I could edit and post content without relying on my laptop, play some games, and just have a nice space to myself when I needed a break from the madness of PAX. This was the perfect opportunity to address some of the issues I had with the system.


First up was the tubing. One of the issues I had with the tubing was how it ran all over the case and was visually too messy for my taste. To resolve this I wanted to move the radiator to the front of the case. Despite there seeming to be enough room, I just couldn’t manage to cram the radiator, fans, and reservoir all in the hard drive bay. My backup plan was to keep the radiator in the front, but rotate it so the inlet and outlet were now in the front of the case. This allowed me to make shorter, more direct lines between components and get those parallel 90-degree bends I wanted so badly.

Swapping the radiator back and forth was a massive pain in the ass: every time, I’d have to remove the bolts holding the fans on, reattach them, realize I’d put them on the wrong way, and do it all over again. Eventually, I got it all sorted and everything was fine.

Except when I realized I ran the water loop backward through the water block. Luckily that was an easy fix; I just had to flip the block upside down.

Now that the radiator lines are in the front of the case, there’s no room for a hard drive there. The 3 TB hard drive now lives in the ever-cramped basement with all the power cables. It seems happy enough, but in the future I’d really like to mount it at the bottom of the hard drive cage, under the reservoir.

The last thing I changed, which isn’t pictured, is swapping the OCZ SSD and the Hue+ controller. I had originally put the SSD in first and didn’t think about the aesthetics when I put the Hue+ controller in. The position of the massive black box started to wear on me over time, so I figured now would be the best time to swap all that stuff around. Now it looks much, much better.

I ordered a set of clear acrylic cable combs for the GPU wiring. It’s not supposed to arrive until Saturday, so if it doesn’t show up early I’ll have to install them at PAX. The last thing I ordered, which should be here the day before PAX, is a plastic scratch remover kit. My plastic case window has been through a lot, and I’d like it to look new before putting it on display for all to see (the scratches are rather prominent in the video below). Hopefully it gets here, hopefully it works, and hopefully my loop doesn’t have a meltdown like in the dream I had last night.

Because pictures are kind of boring these days, here’s a short build video I took and edited in a hurry. There are chunks missing because the battery in my camera died, but all you’re missing is me struggling with the radiator. When I do my next rebuild I’ll tear the system down to bare components and do a complete build video from the ground up.