- In Remote Space
Contrary to popular belief, I am not dead and neither is this project. I just didn't have much time.
To compensate for the lack of updates, here's a pretty video:
It's a brand new region, inspired by the Dallol system.
Lots of work went into the assets, but it was worth it (or at least I hope so). Everything made in Blender.
Now I need to get back to working on the mechanics and reasons for actually playing the game.
If you want to read a bit more about this update, you can do so at my blog!

I might be wrong, but I'm pretty sure what you're asking for is logically impossible.
In the second gif in your original post, you can see that the position of the cameras changes when you rotate. Because of that, the image that they see is different, so you can't combine those images.
Here's a quick example of what I mean:
On the left image, the green point will be obstructed by the red point, but once the head turns (the right image), the green point will no longer be obstructed.
You would need to have a set of images for each head angle. Maybe you could get away with a finite number of them and do some blending/interpolation, but I'm not sure how well it would work.
All particles will be processed, not only those on the screen. Those not on screen might be culled from drawing, but processing them can still have a performance impact.
If you are worried about performance, it might be best to split the particles into chunks, maybe roughly screen-sized. So instead of having one box that covers the whole map, you would have many smaller boxes. Then, you can enable them based on proximity to the player (by manually checking the distance or by using an Area2D node).
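The proximity check described above could be sketched like this in GDScript (the "particle_chunks" group, the chunk node type, and the player path are assumptions, not part of any real project):

```gdscript
# Sketch: enable particle chunks near the player, disable the rest.
# Assumes chunks are GPUParticles2D nodes in a "particle_chunks" group.
extends Node2D

@export var activation_radius := 1500.0  # roughly screen-sized chunks
@onready var player := get_node("../Player")  # hypothetical path

func _physics_process(_delta: float) -> void:
	for chunk in get_tree().get_nodes_in_group("particle_chunks"):
		# Toggling `emitting` keeps far-away chunks from being processed.
		var near := chunk.global_position.distance_to(player.global_position) < activation_radius
		chunk.emitting = near
```

The Area2D approach mentioned above would replace the manual distance check with `body_entered`/`body_exited` signals on each chunk.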
Just gave it a try. Really cool concept and nice execution. A few things were quite confusing though:
- muck: when is it created, what does it do, how do you clean it up? From what I've seen, it doesn't really do anything, and while the $100 pods do clean up the sprites, the number on the right doesn't go down. For the most part I just ignored it.
- heaters: I've tried to set them up the way the description mentioned, but they didn't seem effective. At least not as effective as standard towers.
- does each drain produce money? Each of them displays a yellow +30, but it didn't feel like I got more money by placing more drains.
- what is the purpose of the traffic cone and the RimWorld guy?
Overall, the game could really use a decent tutorial. Right now you have to buy things and experiment with them in order to know what they do.
But it's pretty fun, once you get the hang of it.

@Wulfara
My bad, I misremembered some things.
To have viewports share the world, you need to do it manually.
Here's a basic scene that has that set up: viewports.zip (800 B)

We have a viewport 2DWorld that is not being rendered (because it's not in any SubViewportContainer) and one SubViewportContainer for each player. The players' viewports need to set their 2D worlds to that of the 2DWorld viewport. Unfortunately, this can only be done through code.

And because the actual nodes are in a separate viewport, a player's viewport can be hidden without hiding the world for the other player.

Here's a tutorial. It was made for Godot 3, but the general principles still hold.
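The world-sharing step could be sketched like this (node names follow the example scene described above; adjust them to your own tree):

```gdscript
# Sketch: make both players' SubViewports render the shared 2DWorld viewport.
extends Node

func _ready() -> void:
	# The 2DWorld viewport owns the actual World2D with all the nodes.
	var shared_world: World2D = $"2DWorld".world_2d
	# Point each player's SubViewport at that same world.
	$Player1Container/SubViewport.world_2d = shared_world
	$Player2Container/SubViewport.world_2d = shared_world
```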
I hope this helps!
- In Remote Space
I did some UI stuff, changed the physics engine to Jolt (because the built-in one had some issues) and finally implemented saving and loading the game.
But most importantly, a new drone:
Took some tweaking to get the physics right, but now it's really fun to fly around! You'll use it to scout new areas, chart maps and probably something else as well ;]
Went a bit more in-depth in the blog: https://blog.loipesmas.net/en/remote_space_devlog/6/
I think the best solution to this problem is to use Viewports.
You would put 2D world in one viewport and 3D world in another viewport and change the visibility of the viewports based on the current mode.
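That mode switch could be sketched like this (the container node names are assumptions for illustration):

```gdscript
# Sketch: show either the 2D or the 3D viewport depending on the current mode.
func set_mode_3d(enabled: bool) -> void:
	$Container2D.visible = not enabled
	$Container3D.visible = enabled
```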
And for split screen you would put just the cameras in viewports, because the world (i.e., nodes, sprites, etc.) will be shared between cameras of the same mode.

@papalagi
Glad I could help!

"add a Directional Light with 0 energy, otherwise Godot adds directional light by default"
I'm pretty sure that the added Directional Light is only there in preview, not when the project is running or during baking. You can toggle it in the topbar:
"in LightmapGI set Environment/Mode to Disabled (this one is quite mysterious to me)"
If you have a WorldEnvironment node in the scene, LightmapGI can use its properties when baking (sky color, etc.). Setting it to Disabled makes it ignore the WorldEnvironment and only use light from static lights.

Weird.
I tried using mobile renderer and it works as well.
Maybe it's a hardware issue. What GPU are you using for baking?

This seems like a bug, so you should probably file a GitHub issue.
Lousifr:
var dampening = clamp(linear_velocity.length() * dampening_sensitivity, 0, dampening_max)
Why do you damp the strength of the suspension based on the total velocity of the car? I haven't watched the tutorial, so maybe it's explained there, but it doesn't seem right.
Also, that damp is added to the force multiplier, which is also counterintuitive to me.
For dist = 0.4 and suspension_rest_length = 0.5, the result would be a scalar of -0.1 (ignoring stiffness). But if you add dampening (which is in the range 0..50), the scalar can change direction, i.e., become positive in this case.
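The sign flip can be illustrated with the numbers from the example (the dampening value here is made up; clamp can return anything in 0..50):

```gdscript
# Sketch of the force scalar from the quoted code, with example values.
func _example() -> void:
	var dist := 0.4
	var suspension_rest_length := 0.5
	var dampening := 5.0  # hypothetical result of the clamp(...) above

	var scalar := dist - suspension_rest_length  # about -0.1, ignoring stiffness
	var scalar_with_damp := scalar + dampening   # about 4.9 -- direction flipped
	print(scalar, " ", scalar_with_damp)
```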
Again, maybe it's explained in the tutorial, but it feels off.

RigidBodies have axis lock properties, but I'm not sure if they will work in this case. If they don't, you can try using one of the available joint nodes.
There's also the built-in VehicleBody3D node, which might suit your needs without you having to implement those things.

You can check the UV2 map by selecting the MeshInstance3D node and clicking the Mesh menu on the topbar:
(but I see that you're using UV2 in your shader, so it's just a bonus tip)

You could also try starting with a simpler scene (one simple mesh and one light) to narrow down the problem space and work from there (if it works).
Baking the lightmaps in Blender is a valid solution (I have done it myself), but it is indeed not practical.
If you figure out what the problem with Godot's lightmaps was, let us know.

If you have set up the LightmapGI properly, then you should see baked lightmaps in the scene.
Double check that everything is correct: lights visible and set to bake mode Static, meshes set to global illumination Static with UV2 (you can check that from Godot), and then bake the lightmap to a file. If all this is right, then I'm out of ideas.

If you want to disable the dynamic realtime lights, you can set them to editor-only. Then they will not be enabled when running the app, but will still affect lightmap baking.
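The editor-only trick is normally a per-light checkbox in the inspector, but it could also be done in bulk like this (the "realtime_lights" group is an assumption for this sketch):

```gdscript
# Sketch: mark all realtime lights as editor-only so they still affect
# lightmap baking in the editor but are disabled when the app runs.
func _make_lights_editor_only() -> void:
	for light in get_tree().get_nodes_in_group("realtime_lights"):
		light.editor_only = true
```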
You can add varyings from fragment shader to light shader. I don't think going the other way is possible, because (IIRC) light shader runs after fragment shader.
Also, both fragment shader and light shader have access to ScreenUV input.
Generally glTF is the recommended format. Godot supports importing FBX files by automatically converting them through an external program called FBX2glTF.
Here is more info: https://docs.godotengine.org/en/stable/tutorials/assets_pipeline/importing_scenes.html#importing-3d-scenes
So you can either add that program and continue using FBX files, or manually convert FBX files to other formats (glTF, DAE or OBJ).

Lightsheik: Is there a way to select which UV map to use in the inspector?
I don't think there is an option for that.