Everywhere I read online, people said custom HUDs made essentially no difference in FPS, and that transparent viewmodels would decrease your FPS. However, I struggled to find real (or up-to-date) benchmarks for it. So, I decided to test it!
The Test Setup
I run a fairly unusual setup, but I imagine most people will see similar results. I’m on Pop!_OS 22.04 installed on an SSD (though the game is on an HDD), running the mastercomfig medium-high preset (with a few overrides: sprays on, HUD player model on, and outlines on high) on a 1440p monitor, with an RTX 2070 Super, a Ryzen 5 3600, and 16GB of 3200MHz DDR4 RAM. I benchmarked using the benchmark guide in the mastercomfig docs. (Note that all timedemos created before 7/25/2023 were broken by the TF2 update released that same day, so you have to revert to a beta version of TF2 to run the demo currently linked on mastercomfig’s site.) Every time I changed a setting, I did an initial warm-up run of the demo, then recorded the results of the next three runs. After installing a HUD, I never changed any of its settings.
The Results
On the default HUD:
- Run 1: 187.11 fps ( 5.34 ms/f) 31.199 fps variability
- Run 2: 187.78 fps ( 5.33 ms/f) 30.349 fps variability
- Run 3: 184.47 fps ( 5.42 ms/f) 29.962 fps variability
- Average: 186.45 fps with 30.503 fps variability
With FlawHUD:
- Run 1: 188.19 fps ( 5.31 ms/f) 30.886 fps variability
- Run 2: 189.17 fps ( 5.29 ms/f) 30.780 fps variability
- Run 3: 188.29 fps ( 5.31 ms/f) 30.392 fps variability
- Average: 188.55 fps (+1.13%) with 30.679 fps (+0.58%) variability
With BudHud:
- Run 1: 185.13 fps ( 5.40 ms/f) 31.469 fps variability
- Run 2: 183.33 fps ( 5.45 ms/f) 30.028 fps variability
- Run 3: 186.13 fps ( 5.37 ms/f) 30.122 fps variability
- Average: 184.86 fps (-0.85%) with 30.540 fps (+0.12%) variability
I also tested FlawHUD and BudHud with their built-in transparent viewmodel features to see how they performed (this time the percentages are relative to the same custom HUD without transparent viewmodels enabled).
FlawHUD with transparent viewmodels:
- Run 1: 188.87 fps ( 5.29 ms/f) 30.523 fps variability
- Run 2: 189.13 fps ( 5.29 ms/f) 30.972 fps variability
- Run 3: 186.36 fps ( 5.37 ms/f) 31.342 fps variability
- Average: 188.12 fps (-0.23%) with 30.946 fps (+0.87%) variability
BudHud with transparent viewmodels:
- Run 1: 186.83 fps ( 5.35 ms/f) 30.397 fps variability
- Run 2: 183.73 fps ( 5.44 ms/f) 30.183 fps variability
- Run 3: 186.33 fps ( 5.37 ms/f) 30.290 fps variability
- Average: 185.63 fps (+0.40%) with 30.290 fps (-0.82%) variability
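For anyone checking my math, the averages and percent deltas above can be reproduced with a short script like this (a sketch; the run values are copied from the default HUD and FlawHUD tables, and averages are rounded to two decimals before comparing, matching the tables):

```python
def average(runs):
    """Mean of a list of per-run FPS (or variability) values."""
    return sum(runs) / len(runs)

def pct_change(new, base):
    """Percent change of `new` relative to `base`."""
    return (new - base) / base * 100

# Per-run average FPS, copied from the tables above
default_fps = [187.11, 187.78, 184.47]
flawhud_fps = [188.19, 189.17, 188.29]

base = round(average(default_fps), 2)  # 186.45
flaw = round(average(flawhud_fps), 2)  # 188.55
print(f"FlawHUD: {flaw} fps ({pct_change(flaw, base):+.2f}%)")
# → FlawHUD: 188.55 fps (+1.13%)
```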
Lastly, I tested how much my two overrides (HUD player model and outlines) affected FPS. I unfortunately didn’t record the detailed results, but outlines cost around 1%, and the HUD player model was a surprising 3.6% decrease (194.5 fps vs 187.5 fps).
I use those two settings because they give a slight competitive advantage: outlines make it a bit easier to spot spies while respawning, and the HUD player model lets you see which weapon you’re holding while disguised.
My Takeaways
Anything less than 1% is probably just typical system variation. To my surprise, neither custom HUDs nor transparent viewmodels caused much more than a 1% difference in baseline FPS. This is consistent with what others have said about custom HUDs, but inconsistent with what they’ve said about transparent viewmodels. It’s possible transparent viewmodels only affect GPU load, and since I’m likely CPU-bottlenecked, that could explain why I didn’t see a difference. If you’re GPU-bottlenecked, your mileage may vary.
Additionally, the 3.7% FPS decrease with the HUD player model enabled is certainly making me question if it’s worth the extremely small competitive advantage it provides when playing spy…
If you notice any errors or have any suggestions, please let me know!