Blair Witch is the latest first-person psychological thriller game from Bloober Team. Now that the game has received its first performance patch, it’s time to benchmark it and see how it performs on the PC platform.

For this PC Performance Analysis, we used an Intel i7 4930K (overclocked to 4.2GHz) with 16GB of DDR3 RAM at 2133MHz, AMD’s Radeon RX580 and RX Vega 64, NVIDIA’s RTX 2080Ti, GTX980Ti and GTX690, Windows 10 64-bit, GeForce driver 436.15 and the Radeon Software Adrenalin 2019 Edition 19.8.2. NVIDIA has not included any SLI profile for this title, meaning that our GTX690 performed similarly to a single GTX680.

Bloober Team has implemented a few graphics settings to tweak. PC gamers can adjust the quality of Anti-Aliasing, Shadows, Textures, SSS, Resolution Scaling and Lens Flares. There are also options for Motion Blur, SSAO, SSR and Separate Translucency. Moreover, there is an FPS cap option (though the game also supports unlocked framerates).

In order to find out how the game scales across multiple CPU threads, we simulated a dual-core and a quad-core CPU. Blair Witch uses DX11 and appears to be mainly single-threaded. Moreover, the game relies heavily on memory frequency, something that may bottleneck older PC configurations. Still, and despite these shortcomings, even our simulated dual-core system was able to run the game at a constant 60fps. Therefore, in order to enjoy this game at 60fps, PC gamers will not need a high-end CPU. On the other hand, those targeting 120fps will have to use a modern-day CPU.

Despite its somewhat “okay-ish” CPU requirements, Blair Witch requires a high-end GPU in order to be enjoyed, even at 1080p. Not only that, but the game under-performs on AMD’s hardware. We know that some Unreal Engine 4 games perform horribly on AMD’s hardware; however, in this particular game the GTX980Ti was able to match the performance of the AMD Radeon RX Vega 64.

At 2560×1440, the only GPU that was able to provide a smooth gaming experience was the NVIDIA GeForce RTX2080Ti. As for 4K, NVIDIA’s most powerful graphics card was unable to offer a smooth gaming experience.

Even with lowered custom settings, we were unable to hit 60fps at 4K. The only way we could achieve a 60fps experience in 4K was by lowering the Resolution Scaling to “Half”.

We wouldn’t mind such high GPU requirements if the game justified them. However, while the game looks great for the most part, it does not. Bloober has used high-resolution textures and there are some cool environmental effects (like God Rays). Still, there is nothing here we haven’t seen before in other triple-A games. We know it’s not fair to compare a small team with a triple-A studio; however, Blair Witch currently suffers from major optimization issues. Not only that, but due to the lack of meaningful graphics settings, the game cannot properly scale on older GPUs.

It’s also worth noting that the game uses an awful Depth of Field filter. Thankfully, PC gamers can disable it (as well as Chromatic Aberration) by editing the engine.ini file. All you have to do is open that file and add the following lines.

[SystemSettings]
r.DepthOfFieldQuality=0
r.SceneColorFringeQuality=0
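If you prefer to script the tweak, the edit above can be sketched in a few lines of Python. This is a minimal, hedged sketch: for Unreal Engine 4 titles the user-editable file typically lives under the game’s `Saved\Config\WindowsNoEditor` folder, but the exact folder name for Blair Witch is an assumption, so the demo below writes to a local dummy file instead of the real path.

```python
from pathlib import Path

# The two console variables from the article that disable
# Depth of Field and Chromatic Aberration.
SETTINGS = (
    "[SystemSettings]\n"
    "r.DepthOfFieldQuality=0\n"
    "r.SceneColorFringeQuality=0\n"
)

def append_settings(ini_path: Path) -> None:
    """Append the tweaks to the given engine.ini, skipping if already present."""
    existing = ini_path.read_text() if ini_path.exists() else ""
    if "r.DepthOfFieldQuality" in existing:
        return  # already patched; avoid duplicate entries
    with ini_path.open("a") as f:
        f.write("\n" + SETTINGS)

# Demo against a local dummy file so the sketch is safe to run anywhere.
# (The real file would be something like
#  %LOCALAPPDATA%\<GameFolder>\Saved\Config\WindowsNoEditor\Engine.ini --
#  the folder name is game-specific and not confirmed here.)
demo = Path("engine_demo.ini")
demo.write_text("[Core.System]\nPaths=../../../Engine/Content\n")
append_settings(demo)
print(demo.read_text())
```

The presence check makes the script safe to run more than once, since duplicated `[SystemSettings]` entries would be redundant.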

All in all, Blair Witch needs major optimization and performance patches. Thankfully, the game does not require a high-end CPU for gaming at 60fps, though it’s still limited by the DX11 API. As noted, the game requires a high-end GPU, does not justify its high GPU requirements, and does not scale on older graphics cards. Here is hoping that Bloober Team will be able to resolve the game’s performance issues via post-launch updates.

Enjoy!