Veyrdite

Frame stutter issues

TL;DR: this game is really nasty to play on my computer.  Vanilla rene and IA (scripts 4.x) don't seem to have this issue.

I played APB online this morning and suffered horrible frame-stutter issues across the few maps I was there for.  These are the first proper (player-filled) APB matches I've ever played, so I'm not sure if this is a new problem (due to scripts 5 overhauling the renderer) or not.

I'm running my games on Linux using Wine, so I'd like to see if any Windows players are suffering similar issues.  Please comment if you think you are seeing the same thing in the new RA:APB or if you have a better/perfect experience instead.

Overview

Even when the game is reporting 90FPS and higher my game "feels" like it is around 30FPS.  Stuttered camera movement and frames, input delay when typing, etc.

Settings played with

All of the W3D quality sliders at minimum, vsync on or off (my graphics drivers triple-buffer regardless), fullscreen vs borderless, extra shader toggle boxes (do these do anything?) and shadow toggle boxes turned off, MSAA at 0x.

Background information: the difference between frames per second (FPS) and frametimes (FT)

I'm going to assume you know what frames are and how they are used to fake motion on your computer screen.  It's the same concept as frames in videos/film, so have a read about animation if you are not familiar.

I'll simplify things down into two types of frame:

  1. Rendered frames.  These are the frames your game makes.
  2. Displayed frames.  These are the frames that display on your monitor every time it refreshes.

They are not the same thing.

Most monitors refresh exactly 60 frames per second (this timing is typically accurate and reliable to better than 0.01%).  This means that once every 16.667 milliseconds they can 'refresh' to display a new frame, or keep displaying the old frame for longer, but that's all they can do.

Your game and computer render frames.  Each frame takes a different amount of time to render, depending on the complexity of what is being drawn and the algorithms (games & graphics drivers) being used.  If you are facing a wall then a frame might only take 0.53 ms to render, but if you are looking at a battlefield full of 100 players and tanks then a frame may take 80ms or more to render.
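The relationship between a single frame's render time and the "instantaneous" framerate it implies can be sketched in a few lines (the numbers below are the hypothetical ones from the paragraph above, not measurements):

```python
# Hypothetical frame times in milliseconds: facing a wall, one refresh
# budget exactly, and a busy battlefield.
frame_times_ms = [0.53, 16.667, 80.0]

for ft in frame_times_ms:
    # Instantaneous FPS implied by a single frame's render time.
    fps = 1000.0 / ft
    print(f"{ft:7.3f} ms/frame  ->  {fps:7.1f} FPS")
```

The 80 ms frame works out to 12.5 FPS for that instant, even if the rest of the second renders quickly.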

For good quality "motion" to be perceived by players on their screens, each refreshed frame must be:

  1. Unique. ie no frame lasts more than one refresh cycle.
  2. Equally spaced in the game-world's timing.  Ie each frame should show the same amount of time elapsing in the game, not one frame where things move further/faster than in the others.

Bad quality motion makes players feel uncomfortable (headaches are common, motion sickness less so), makes it harder to aim and generally makes it much more difficult to enjoy the experience.

 

There are two common ways of measuring motion quality in games: Frames Per Second (FPS) and Frame Time (FT).

FPS counts how many frames are rendered every second.  Typically you want 60FPS or higher, so that (ideally) your monitor has a nice new unique frame to display every refresh cycle.  Higher than 60FPS "feels" better most of the time (due to better physics & input timesteps -- outside the scope of this intro), but can sometimes lead to a 'worse feel'.  Many setups cap your framerate to a maximum of 60FPS due to a feature called vsync (again outside the scope of this intro).

FPS measurements are fundamentally flawed in many situations: the frames being refreshed on a monitor are not always the same frames being rendered, and FPS only reports an average, which is not what humans experience.

Take for example:

  • One frame is rendered and then displayed for 0.1 seconds.
  • In the remaining 0.9 seconds 80 frames are rendered (and some of them are displayed).

Your game will report this as 81FPS, but it will look and feel awful to play. Every second the game will stutter noticeably, which is severely uncomfortable if you are rotating your camera with the mouse (ie aiming a gun).
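The example above can be checked with a few lines of arithmetic. This is a sketch using the made-up numbers from the bullet points: one 100 ms frame, then 80 evenly spaced frames in the remaining 0.9 seconds.

```python
# One 100 ms frame plus 80 frames in the remaining 0.9 s (~11.25 ms each).
frame_times = [0.100] + [0.9 / 80] * 80   # seconds per frame

total_time = sum(frame_times)             # 1.0 second in total
avg_fps = len(frame_times) / total_time   # what the FPS counter reports
worst_ms = max(frame_times) * 1000        # the frame you actually notice

print(f"average: {avg_fps:.0f} FPS")      # 81 FPS -- looks healthy
print(f"worst frame: {worst_ms:.0f} ms")  # 100 ms -- a visible stutter
```

The average looks perfectly healthy; only the worst-case frame time reveals the once-a-second hitch.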

Now imagine a second scenario:

  • During monitor refresh 1/60: (ODD) two game frames are rendered, the second one gets chosen to display because it's newest.
  • During monitor refresh 2/60: (EVEN) no frames are rendered, the old frame still displays.
  • During monitor refresh 3/60: (ODD) two game frames are rendered, the second one gets chosen to display because it's newest.
  • During monitor refresh 4/60: (EVEN) no frames are rendered, the old frame still displays.
  • During monitor refresh 5/60: (ODD) ...
  • During monitor refresh 6/60: (EVEN) ...
  • ...

Here the game will report 60FPS, but you will be seeing 30FPS of frames in real life.  Half of the frames the game creates are thrown away, the other half display for two monitor refresh cycles each.  I've had some games (eg NFS:MW 2005) that do this almost like clockwork on my system.
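The odd/even scenario above can be simulated directly. This sketch steps through one second of 60 refresh cycles, rendering two frames on "odd" cycles and none on "even" ones:

```python
# Simulate one second (60 refresh cycles) of the odd/even pattern:
# 2 frames rendered on odd cycles, 0 on even cycles.
rendered = 0
displayed_unique = 0
for cycle in range(60):
    new_frames = 2 if cycle % 2 == 0 else 0
    rendered += new_frames
    if new_frames:
        # Only the newest frame gets shown; the other is thrown away.
        displayed_unique += 1

print(rendered)          # 60 -- the "FPS" the game reports
print(displayed_unique)  # 30 -- unique frames you actually see
```

Half the rendered frames are discarded and each displayed frame lingers for two refresh cycles, exactly the 60-reported/30-perceived split described above.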

 

A better measure of animation is frame timing (FT), or "how long did each frame take to make?".  Ideally you want all of your frames to take 16.667 milliseconds (1/60 of a second) to render so that every monitor refresh cycle shows a nice, new, evenly spaced frame.  If the frames take too long then your monitor will display the old frame for another refresh cycle (bad).  If the frames take a very variable amount of time to create then the physics (eg player/tank movement) on your screen will appear to "jitter".  A mixture of both of these problems occurs in every game you play, in different amounts.
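A hedged sketch of judging motion quality from frame times rather than average FPS: count how many frames blew the 16.667 ms refresh budget, and how spread out the timings are. The frame-time log here is made-up data, not a measurement from APB:

```python
# Made-up frame-time log in milliseconds, with two stutter spikes.
frame_times_ms = [16.5, 16.8, 16.6, 33.4, 16.7, 16.5, 50.1, 16.6]

budget = 1000.0 / 60                      # 16.667 ms per 60 Hz refresh
missed = sum(1 for ft in frame_times_ms if ft > budget)
mean = sum(frame_times_ms) / len(frame_times_ms)
spread = max(frame_times_ms) - min(frame_times_ms)

print(f"frames over budget: {missed}/{len(frame_times_ms)}")
print(f"mean FT: {mean:.1f} ms, spread: {spread:.1f} ms")
```

A mean near 16.7 ms can still hide a 50 ms spike; the over-budget count and the spread are what correlate with visible stutter.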

Shorter frame times than 16.667ms will increase jitter, but this tends to be one of the smallest sources of jitter (or stutter) in games and is mostly not noticeable.  Have a read of my analysis of stutter problems in "The Crew" here (inline pictures have been moved to the bottom of the post) to get an idea of how bad other sources of the problem can get.

Vsync (outside the scope of this intro) can help enforce strict and even frame timings, but under certain situations it makes the play experience worse.  I believe Rene defaults to triple-buffering, which fixes tearing but doesn't clamp timings as strictly as vsync does.  Some(?) W3Dhub launcher games appear to have vsync turned on by default.

 

Footage of the problem

Note the FPS and FT graphs at the top-left:

There is a lot of FT variation -- you can see even the 30FPS video has notable stutters. 

Note the 'odd frame, even frame' up-down timing patterns that appear occasionally, similar to the second bad example of frame timings I describe in the section above.  This typically means that perfectly good rendered frames are being thrown out and the remaining frames are being displayed for multiple refresh cycles.

Also note that the variations in timings appear to settle down in the last few seconds of the video when I face away from the bases.  This makes me think that view-based culling is taking over and that VIS might not be working.

(N.B. I have full-match videos showing similar graphs, but they're at half-resolution and too big to upload here.  Toggling my video recording on/off did not noticeably affect the patterns in the graphs.)

 

 

For a comparison: here is a video of me playing Interim Apex (scripts 4.x) on a map that I don't think has any VIS.  Overall the FT stability is better, but you can see it gets worse when I look toward the enemy base (past the airstrip and into the fog):

 

Misc: system specs

  • Processor: i5-4460
  • Graphics: Radeon HD 6850
  • Graphics driver: in-kernel radeon (default option for Linux users)
  • OS: Linux 4.19.30_1 SMP PREEMPT + wine-4.6

Other games like IA get an FPS from 50 to 300, depending on the map and what is going on.  99% of the time it's above 60 and comfortable, dramatically more so than RA:APB was this morning.

Questions

  1. VIS issue in scripts 5?
  2. Problem faced by Windows users too, or just my Linux/wine setup + the new d3d11 scripts 5 renderer?

 

Edited by Veyrdite


Minimum graphics settings can't help you because your CPU is too weak for APB, and Windows inside Linux doesn't help either. Renegade needs very high CPU single-core performance. On the most complex map in IA, Winter Assault, with ~30 players and an ion storm near the shield generator (that bubble increases frametime), I have low FPS on a Ryzen 7 2700X lol
I'm sure even an Intel i9 9900K@5GHz can't handle this map at a stable 144FPS


Correction: all my maps have VIS, or else FPS would be terrible for the majority of Renegade players whose PC specs I know of (some even as old as a Pentium 3) :v

Based on what Dblaney said, Renegade is capable of going beyond 60fps, but it's best not to: if your FPS is higher than the server's it may cause some de-sync with the server (server fixed at 60?), or at least that's how I interpret Dblaney's message.

APB uses DirectX 11.1, I think.

Renegade/IA uses DirectX 9.0c.

In addition, IA texture sizes are restricted to 512x512, and I refuse to go beyond that, e.g. to 1024x1024.

Yes, there is a HUGE difference in DirectX versions, which may be game-breaking since Renegade is single-threaded. We'd need a processor that can make a single-threaded program run across multiple cores.

 

 

Try to watch all the way.

 

Won't be surprised if Microsoft's ray-tracing API ends up using processor cores instead of GPU ones :P


> Based on what Dblaney said Renegade is capable of going beyond 60fps but is best not to as if your fps is higher than the server it may cause some de-sync with the server ( server fixed at 60? ) or at least how I interpret Dblaney message.

I regularly play in the 100-200 FPS range, never thought it was an issue.

Is DB talking about physics timesteps here?  The contributions of these (1/60th of a second vs an arbitrary 1/framerate of a second) are probably minimal compared to the effects of ping.  Being 250 ms behind in your simulation and having different inputs from the server (your player's input) will create a lot more divergence than a smaller physics timestep will.  Shared "lockstep 60FPS" physics is only achievable over LANs, not the internet.
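A rough back-of-envelope comparison (my own made-up numbers, not anything from DB): the position error from seeing the world 250 ms stale versus the worst-case drift from running a 1/144 s physics step instead of the server's 1/60 s, for a tank moving at an assumed 10 m/s:

```python
# Assumed tank speed; purely illustrative.
speed = 10.0  # m/s

# Error from latency: you see the world 250 ms in the past.
ping_error = speed * 0.250                 # metres

# Error from mismatched timesteps: worst-case drift over one step.
timestep_gap = abs(1 / 60 - 1 / 144)       # seconds
timestep_error = speed * timestep_gap      # metres

print(f"ping error:     {ping_error * 100:.1f} cm")
print(f"timestep error: {timestep_error * 100:.2f} cm per step")
```

Under these assumptions the latency error (2.5 m) dwarfs the per-step timestep discrepancy (under 10 cm), which is the point being made above.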

 

> In addition, IA texture size are restrict to 512x512 and I refuse to go beyond it, like 1024x1024.

Thank you :)  This will mainly affect laptop/integrated vs dedicated users: ie people who have limited VRAM or bad RAM<->card transfer techniques.  I appreciate being able to play games on my laptop.

 

> We need the processor that will make single process program to be able to run by multiple cores.

We're unlikely to ever see a processor that turns a single-threaded job into a multi-threaded one; it's too hard a problem to solve on paper, let alone in hardware. Maybe if quantum computing can quantum-teleport/synchronise states between multiple cores and mirror memory banks in some weird ways that might perhaps help, but that's a dream.

 

> Yes HUGE difference in directX which may be game breaking since Renegade is single threaded

I don't follow here.  Is there some inherent performance loss from using a newer DirectX that can never be worked around, due to threading?  Or is this just a general statement (unrelated to the DX implementation change)?

 

 

 


