Indiana Jones and the Great Circle PC: impressive performance – but care is needed with 8GB graphics cards

On first boot, the PC version of Indiana Jones and the Great Circle presents as a simply great PC release. Getting into the action, there's no shader compilation stutter and no obtrusive traversal stutter. It does require a graphics card with hardware-accelerated ray tracing and there is no fallback to a software alternative, but that's OK – performance is not a problem in this title. Machine Games has gone one step further, embracing future tech with 'full ray tracing', which renders all lighting via RT, but unfortunately we can't talk about this today as it's only enabled from December 9th… a bit of a disappointment for high-end PC users who bought in via early access. Still, what you get now is an excellent PC release, limited not so much by graphics power as by the VRAM allocation of your GPU.

Worries about running Indiana Jones were somewhat blown out of proportion, bearing in mind that Machine Games got the game running on Xbox Series X and even Series S, so there is scalability here. That said, some of that scalability is not available to PC users. The game's key RT technology is ray-traced global illumination, handling indirect lighting and shadowing. Current-gen consoles have very mediocre RT performance, so Machine Games' solution is simply to degrade GI to lower-than-low settings, removing many objects from the GI pass. Critically, in the jungle, much of the vegetation is missing from that pass, leading to bounce light from the sun manifesting where it shouldn't, so objects glow and don't look properly situated in the environment.

PC may lack those lower-than-low settings, but the minimum preset still looks a lot more natural, with occlusion and darkness where it makes sense and bounce lighting working as it should. Even so, the console 'lower-than-low' option could have helped less capable RT hardware, including handhelds.

Our video review of Indiana Jones and the Great Circle on PC, focusing mainly on performance optimisation and running the game well on GPUs with varying levels of VRAM.

Xbox also compromises on anisotropic filtering, which is closest to PC's medium setting but actually looks worse due to a disappointing implementation of hardware variable rate shading (VRS). Volumetrics on Series X look degraded to PC's low, with less detailed light beams and more flicker in motion. Shadows are closest to the medium setting, which reduces the amount of geometry added to the shadow maps, so entire objects lose their shadows, particularly indoors. Basically, there are a number of shortcuts and quality degradations used to get the game running well on Series X at very high resolutions of around 1800p. In theory, you could adopt these compromises to boost performance on PC, but in practice the game already runs fast enough, even on RTX 4060-class hardware, and that's while using RTGI in excess of console quality.

Indiana Jones and the Great Circle isn't really compute-limited then, but the VRAM allocation of your GPU is crucially important, as the game looks built around a 12GB baseline. Users with 8GB or 10GB cards may run into issues with the texture cache setting. At 1440p, for example, a 10GB RTX 3080 can't use the ultra texture cache option without massively degraded performance, so it stands to reason that 8GB users will have an even worse time of it. However, context is king: you need to understand how the texture cache works. Dropping from, say, high to medium doesn't uniformly degrade texture quality – it simply shrinks the cache, and the smaller the cache, the more aggressively the game shunts texture data in and out of it.
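To make that idea concrete, here's a toy sketch of a fixed-budget texture pool with least-recently-used eviction. It's not Machine Games' actual streaming code, and the texture sizes and request pattern are invented, but it shows the principle: the same stream of texture requests forces far more evictions, and therefore far more re-streaming, when the budget is small.

```python
from collections import OrderedDict
import random

class TexturePool:
    """Toy model of a fixed-budget texture cache with LRU eviction.

    Purely illustrative: the class, sizes and request pattern are invented,
    not the engine's real streaming system.
    """

    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()  # texture id -> size in MB, oldest first
        self.used_mb = 0
        self.evictions = 0

    def request(self, tex_id, size_mb):
        # A texture already resident just gets marked as recently used.
        if tex_id in self.resident:
            self.resident.move_to_end(tex_id)
            return
        # Otherwise evict least-recently-used textures until the new one fits.
        while self.used_mb + size_mb > self.budget_mb and self.resident:
            _, evicted_size = self.resident.popitem(last=False)
            self.used_mb -= evicted_size
            self.evictions += 1
        self.resident[tex_id] = size_mb
        self.used_mb += size_mb

# Replay the same request stream against a small and a large budget:
# the small pool has to evict (and later re-stream) far more often.
random.seed(1)
requests = [(f"tex_{random.randint(0, 300)}", random.choice([8, 16, 32]))
            for _ in range(5000)]

for budget_mb in (2048, 6144):  # stand-ins for a low vs high texture cache setting
    pool = TexturePool(budget_mb)
    for tex_id, size_mb in requests:
        pool.request(tex_id, size_mb)
    print(f"{budget_mb} MB budget -> {pool.evictions} evictions")
```

The real engine is vastly more sophisticated, of course, but the upshot is the same: a bigger pool keeps more textures resident for longer, while a smaller one churns.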

Recommended texture cache setting:

Resolution    8GB VRAM*    10GB VRAM    12GB VRAM
1080p         Medium       Ultra        Very Ultra
1440p         Low          High         Supreme
2160p/4K      Nope         Medium       Ultra

* Optimised settings are max across the board, texture cache apart. 8GB GPU users are also advised to use medium shadow quality and low hair quality to maintain optimal performance.

There's a basic rule of thumb here: the game prioritises textures close to the player, and the more VRAM you have, the higher the texture cache setting you can feasibly use. The higher the setting, the further from the camera high-detail art appears in any given scene. I've put together a table on this page that gives realistic texture cache settings for the VRAM your card has, but 8GB really is the minimum and you also need to be careful with the shadow setting. I don't recommend going higher than medium shadows at any resolution, as this setting also taps into VRAM. Low shadows look awful, but medium is fine. Running at 1440p with an 8GB GPU, I'd also recommend the low quality hair setting, as anything higher will cause issues in certain scenes.
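If you'd rather not think about it, the sketch below simply encodes the table and the shadow/hair advice as a lookup. The helper function and its structure are mine, purely for illustration; the recommendations themselves come from the testing above.

```python
# Encodes the texture cache table plus the shadow/hair advice above as a simple
# lookup. The helper itself is illustrative; the recommendations are the article's.

RECOMMENDED_TEXTURE_CACHE = {
    # (VRAM tier in GB, output resolution): texture cache setting
    (8,  "1080p"): "Medium",
    (8,  "1440p"): "Low",
    (8,  "2160p"): None,           # not recommended at 4K on an 8GB card
    (10, "1080p"): "Ultra",
    (10, "1440p"): "High",
    (10, "2160p"): "Medium",
    (12, "1080p"): "Very Ultra",
    (12, "1440p"): "Supreme",
    (12, "2160p"): "Ultra",
}

def suggest_settings(vram_gb: int, resolution: str) -> dict:
    """Rough settings suggestion for a given VRAM size and output resolution."""
    # Clamp to the nearest tier the table covers (8, 10 or 12GB-and-up).
    tier = max(t for t in (8, 10, 12) if t <= max(vram_gb, 8))
    return {
        "texture_cache": RECOMMENDED_TEXTURE_CACHE[(tier, resolution)],
        "shadows": "Medium",                                 # higher shadow settings also eat VRAM
        "hair": "Low" if vram_gb <= 8 else "as preferred",   # low hair advised on 8GB cards
        "everything_else": "Max",                            # per the optimised settings note
    }

print(suggest_settings(8, "1440p"))   # -> low texture cache, medium shadows, low hair
print(suggest_settings(12, "2160p"))  # -> ultra texture cache
```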

Ultimately, follow the guidelines in the table and you should be OK. To put this into perspective, an RTX 4060 only has 8GB of VRAM, but running at 1440p with DLSS in balanced mode and with the texture cache, shadow and hair quality settings suitably adjusted, I can keep the game locked to 60 frames per second very easily. Still worried about running compromised texture settings? Check out the video, where I go into depth on how the cache works – for much of the time, even the lowest texture cache setting can still look fine, and at high and above, textures look pretty much the same most of the time. And that is really all there is to optimisation here: the key to good performance is staying within your VRAM budget, and those are the optimised settings.

This is a great PC release, but there are some issues and suggestions I'd like to talk about. The game's level of detail is somewhat low out of the box, even on max settings, so perhaps it's no wonder that the game runs so fast, but the pop-in is noticeable. The developer console lets you change this: typing in 'r_lodscale 5' removes pop-in and LOD transitions almost entirely. It comes with a cost in GPU performance and memory, but I think it would be a great option for the developers to expose in the menu for high-end GPUs.
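To show what a setting like this typically does, here's a toy sketch. It assumes the cvar scales the distances at which meshes drop to lower detail levels, which matches the pop-in reduction described above, but the numbers and the function are invented for illustration rather than lifted from idTech.

```python
# Toy illustration of an LOD scale multiplier: it pushes the distance at which
# each mesh drops to a lower level of detail further out, trading memory and
# GPU time for less visible pop-in. Distances here are invented examples.

LOD_SWITCH_DISTANCES_M = [20.0, 50.0, 120.0]  # hypothetical LOD0->1, 1->2, 2->3 switch points

def pick_lod(distance_m: float, lod_scale: float = 1.0) -> int:
    """Return the LOD index (0 = full detail) used at a given camera distance."""
    for lod, switch_dist in enumerate(LOD_SWITCH_DISTANCES_M):
        if distance_m < switch_dist * lod_scale:
            return lod
    return len(LOD_SWITCH_DISTANCES_M)  # lowest-detail model

for dist in (30, 100, 300):
    print(f"{dist}m: default -> LOD {pick_lod(dist)}, scale 5 -> LOD {pick_lod(dist, 5.0)}")
```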

Shadow quality also needs higher levels of fidelity, as even the ultra setting presents with low-resolution shadows that simply look bad. A 'supreme' setting for shadow maps would be helpful, though in all likelihood the upcoming 'full ray tracing' option will solve this, albeit with a huge GPU requirement. The third issue is that the game has dynamic resolution scaling, but it only works with TAA and not with DLSS. DRS in idTech has worked with DLAA in the past, so I'd like to see it working here too. Additionally, there's only DLSS or TAAU upscaling – FSR and XeSS should be supported as well.

A few more bugbears: while there's no shader compilation or traversal stutter, checkpointing causes its own kind of stutter, and in areas like the Vatican it happens consistently. Honestly, I wish the game had an option to reduce the frequency of checkpoint saves, as these little blips become annoying. Lastly, cutscenes need some work: animations periodically run at 30fps for some reason, while camera cuts produce extended frame-time spikes of up to 100ms. Generally, cutscenes should run much more smoothly than they do now.
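If you want to check these blips on your own system, here's a minimal sketch of the idea: given a per-frame frame-time log in milliseconds (exported from whatever capture tool you use; the format and threshold here are just assumptions), flag any frame that blows well past the 60fps budget.

```python
# Flags stutter spikes in a frame-time log. Assumes a plain list of per-frame
# times in milliseconds; the threshold and example data are placeholders.

def find_spikes(frame_times_ms, target_ms=16.7, factor=3.0):
    """Return (frame index, frame time) for frames far above the target frame time."""
    threshold = target_ms * factor  # ~50ms when targeting 60fps
    return [(i, ft) for i, ft in enumerate(frame_times_ms) if ft >= threshold]

# Hypothetical capture: a steady 60fps run with a checkpoint blip and a ~100ms
# camera-cut spike like the ones described above.
capture = [16.7] * 200 + [55.0] + [16.7] * 300 + [100.0] + [16.7] * 100

for frame, ft in find_spikes(capture):
    print(f"frame {frame}: {ft:.1f}ms ({ft / 16.7:.1f}x the 60fps frame budget)")
```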
