Hi guys,

So for the last few weeks I've been working on a platformer that relies heavily on having precise control over movement.

My problem is that if the game runs at a lower framerate (say 30fps) instead of the normal 60fps, which can happen for performance reasons, the character jumps higher.

From what I understand I need to "normalize" the movement with delta, so I created a var NORMALIZE = 60 (so it mirrors the jump values I've been tuning) that I multiply by delta. I then apply it to the gravity (var gravity = gravity_acceleration * NORMALIZE * delta) and add that as an acceleration to the velocity each frame.

For the jump I just have: if is_jump and can_jump: velocity.y = jump_acceleration, which overrides everything at the end. However it's still not working properly, as I mentioned above. Any ideas why? Thanks in advance.
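
To make it a bit more concrete, here is roughly what that part of my script looks like (simplified; the exact constants and the move_and_slide call are just stand-ins for my real code):

extends KinematicBody2D

const NORMALIZE = 60                # mirrors the 60fps values I tuned by hand

var velocity = Vector2.ZERO
var gravity_acceleration = 200.0    # change in velocity.y per frame at 60fps
var jump_acceleration = -3500.0

func _physics_process(delta):
	var is_jump = Input.is_action_just_pressed("jump")
	var can_jump = is_on_floor()

	# gravity, scaled so it adds the same amount per second at any fps
	velocity.y += gravity_acceleration * NORMALIZE * delta

	# the jump overrides everything at the end
	if is_jump and can_jump:
		velocity.y = jump_acceleration

	velocity = move_and_slide(velocity, Vector2.UP)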

After some debugging, printing the velocity and position each frame, I think I know where the problem is but have no idea how to solve it:

At 30fps the code runs fewer times, so the acceleration changes the velocity fewer times and the velocity changes the position fewer times. At 60fps the change in position per frame is not constant, so over 2 frames the position changes differently than at 30fps, where it changes only once by a constant amount.

Here is a doc with the values of position and velocity printed each frame at 60 and 30fps, since my explanation was a little weird:

https://docs.google.com/document/d/1_NfgNd98pzk0VptK60uI1q2a_3tRsikeAvBWw1QUPZI/edit?usp=sharing

Welcome to the forums @Odion!

I'll admit that I'm not totally sure I am understanding the issue correctly, but I think I can help a bit.

delta is the time it takes to render a single frame, in real-world time. In Godot, delta is in seconds, where 1 is a second and 0.01 is one hundredth of a second. If you multiply a value by delta, you are, essentially, making it bound to real-world seconds. For example, if you have the following code:

var tmp = 0
const VALUE = 20

func _process(delta):
	tmp += VALUE * delta

Then tmp will increase by 20 every second, regardless of the frame-rate, when normalized over a period of time (more about that later). So, if you are trying to make a game frame-rate independent, you multiply by delta because the value then becomes linked to the actual time that passes between frames: it goes from "X per frame" to "X per second".

Where the precision problem comes in is that delta is the time (in seconds) between frames (for _process). If you are running at a constant 60 FPS, then delta will be 1/60, or about 0.0167, each time _process is called. If you are running at a constant 30 FPS, then delta will be 1/30, or about 0.033.

Side note: Keep in mind that FPS is not always constant, so the actual numbers will vary a bit. However, the deltas over one second of real time should add up to a number very close to 1.

When multiplied by a value, it may seem like 30 FPS is going to add more to the value, and that is technically correct: each individual _process call adds more "stuff" than a 60 FPS _process call does. But if you take all 30 _process calls at 30 FPS and compare them to all 60 _process calls at 60 FPS, you should find that the totals are exactly the same, as a single second of real-world time has passed in both cases.

0.0167 * 60 = 1.0 # roughly
0.033 * 30 = 1.0 # roughly

# in reality, it's a bit more like this:
# (13 frames at ~30 FPS + 7 frames at ~28 FPS + 10 frames at ~32 FPS)
(0.033 * 13) + (0.036 * 7) + (0.031 * 10) = 1.0 # roughly, over time

For time-dependent actions, like player input, the difference in frame time can cause precision issues, which is what I think you are seeing and trying to work around. While over the course of a single second it's all the same, if you have a sub-second action that the player needs to react to, the FPS can make a difference.

To fix this, you will likely need to add "coyote time" for jumps and similar actions. "Coyote time" is where the character can still jump in mid air, as long as they left the ground only a fraction of a second ago. Something like this, for example:

var timer = 0
var can_jump = false

# they can jump 0.1 seconds after leaving the ground.
# this should allow for jumping with relatively good
# precision even at 10 FPS (1/10 = 0.1)
# You can adjust this value to be relative to your minimum
# target frame rate.
const MAX_TIME = 0.1

func _process(delta):
	# We'll assume is_grounded is a boolean
	# that is true when the player is on the ground
	if (is_grounded == true):
		timer = 0
		can_jump = true
	else:
		# instead of just setting can_jump to false,
		# we use a small timer, giving "coyote time"
		if (timer <= MAX_TIME):
			timer += delta
			if (timer >= MAX_TIME):
				can_jump = false
	
	if (Input.is_action_just_pressed("jump")):
		if (can_jump == true):
			# Jump code here!
			
			# Then, set can_jump to false and the timer to MAX_TIME
			# to avoid double jumps
			can_jump = false
			timer = MAX_TIME

The timer compensates for the possible differences in delta, allowing certain time-sensitive inputs (like jumping) to happen even if the FPS is dramatically different. It's a bit of a cheat, because you are technically jumping in the air, but it helps give the exact same game feel regardless of the FPS of the computer it is running on. Someone playing at 10 FPS should be able to make all the same jumps that someone playing at 60 FPS makes (at least in theory).

For things that are not as time sensitive, you can probably get away with taking the input as it happens and not using any sort of timer. This is why movement, for example, often does not use a "coyote time" timer and is instead processed as soon as the input comes in: for the most part, the difference in movement over 1/10 of a second compared to 1/60 of a second is minor enough that the player will not suffer in-game consequences if they get the timing slightly wrong. A minimal example is sketched below.
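
For instance, basic horizontal movement usually just reads the input every frame and lets move_and_slide handle the delta scaling. This is a generic sketch, not from your project; SPEED and the "move_left"/"move_right" actions are made-up names:

extends KinematicBody2D

const SPEED = 300.0 # pixels per second

var velocity = Vector2.ZERO

func _physics_process(_delta):
	# read the input as it comes, every frame; no timer involved
	var direction = Input.get_action_strength("move_right") - Input.get_action_strength("move_left")
	velocity.x = direction * SPEED

	# move_and_slide expects a velocity in pixels per second and
	# applies the physics delta internally
	velocity = move_and_slide(velocity, Vector2.UP)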

Hopefully what I wrote above helps explain a bit! I wrote it all in one pass, so there may be mistakes and the like. (Also, apologies for writing a book :lol: )

Thank you for your answer! Firstly, I just want to clarify that my code runs in _physics_process and not _process. Secondly, I've implemented coyote jump, but I think in a wrong way, so I'll try it your way. Also, I'm not sure that's the problem, because my issue shows up every time a velocity is going "against" an acceleration (gravity). Let me try to explain it better:

If I'm running at 60fps, the math to calculate the velocity and position changes is done more times than at 30fps; however, as you stated, at 30fps each change is larger (that's why the delta is there), to compensate for the fact that it happens fewer times.

However, let's say at a certain point in time you have position.y = 779.9 and velocity.y = -3500 (I'm using values from a test in my game). Both the 30fps and the 60fps runs start at those values, and both finish with velocity.y = -3100. In the 30fps case gravity changes the velocity by 400 each frame, and in the 60fps case by 200, but it runs twice as many times.

So in the 60fps case there is one middle frame between the two I gave the velocity values for, the one with velocity.y = -3300. Because the velocity has that middle "step" at 60fps, the total change in position.y at the end of those two frames (which corresponds to the end of one frame at 30fps) doesn't quite match:

60fps --> vel.y = -3100 (after 2 frames) --> position.y = 666.5
30fps --> vel.y = -3100 (after 1 frame) --> position.y = 663.2

It seems like an insignificant amount, but over an entire jump it adds up to a difference of almost 40 pixels at the top (take into account that this is roughly the size of the character), making some jumps possible at 30fps and not at 60fps.
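
To make the arithmetic concrete, here is a tiny standalone script that reproduces those numbers (the 12000 px/s^2 gravity value and the position-then-velocity update order are my reconstruction from the values above, not exact code from my project):

extends Node

# Gravity of 12000 px/s^2 gives a change of 400 per frame at 30fps
# and 200 per frame at 60fps, like in the numbers above.
const GRAVITY = 12000.0

func simulate(fps):
	var delta = 1.0 / fps
	var pos_y = 779.9
	var vel_y = -3500.0
	# same amount of real time in both cases: 1 frame at 30fps, 2 at 60fps
	for _i in range(fps / 30):
		pos_y += vel_y * delta      # position uses the velocity from before...
		vel_y += GRAVITY * delta    # ...then gravity is applied
	print("%d fps -> vel.y = %.1f, position.y = %.1f" % [fps, vel_y, pos_y])

func _ready():
	simulate(60) # vel.y = -3100.0, position.y = 666.6 (roughly)
	simulate(30) # vel.y = -3100.0, position.y = 663.2 (roughly)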

I know it's probably a weird issue, but I'm really not seeing a way around it.

Oh, I think I see what you are saying now. Yeah, that is strange.

You might be able to get around it by multiplying by the difference in FPS:

var current_fps = Engine.get_frames_per_second()
# your target FPS (60) divided by the current FPS
var fps_factor = 60.0 / current_fps
# then multiply with velocity (maybe?)
vel *= fps_factor

But I'm not positive that would help, to be honest. It might fix the 30 FPS issue, but I worry it would have the opposite effect when over 60 FPS.


Though if you are using _physics_process, I think it should run at a constant rate regardless of the rendering frame rate. The "Physics process callback" section of the docs mentions:

In order to avoid this inaccuracy, any code that needs to access a body's properties should be run in the Node._physics_process() callback, which is called before each physics step at a constant frame rate (60 times per second by default).

But if that is the case, then the FPS of the display (60 vs. 30) shouldn't matter. Are you retrieving the values in _physics_process?
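
If it helps, one quick way to double-check what rate the physics loop is actually running at (assuming Godot 3.x, where I believe the setting is exposed as Engine.iterations_per_second) would be something like:

func _physics_process(delta):
	# delta here should always be 1.0 / Engine.iterations_per_second,
	# no matter what the rendering FPS is doing
	print("physics fps: ", Engine.iterations_per_second, ", delta: ", delta)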

Yes, I'm using _physics_process(). When I say I set it to 30fps I mean I go to Project Settings --> Physics --> Common --> Physics Fps and set it to 30. I do this because, in my head, a PC that can't run the game at 60fps would run it at a lower frame rate. So basically I'm not changing the display rate but the physics rate. I guess my question is: if the game can't run at 60fps because of performance issues, will that affect the physics fps?

Btw, thanks for taking the time to explain; I'm still pretty new to the engine.
