Bevy and Path Traced lighting soon enough!!


I decided to rewrite the project with the Bevy engine.

Reasons:

To have a renderer that I can combine with raytraced voxels for UI, non-voxel entities, particles, skyboxes, etc.

Why is this difficult? Because the Bevy rendering ecosystem is not very well explored; I will get back to you as soon as I figure out how to incorporate the system. Why Bevy? Because it's a Rust game engine, and because it uses wgpu, which is what I'm using for rendering. Why wgpu? Because it supports Vulkan, OpenGL, and more out of the box.

Does this make the project less interesting because it's built on a "game engine"? No, not in the least; it is just as difficult.

Comments


(4 edits) (+1)

Interesting, I've heard of Bevy but never looked at it before.

There's nothing wrong with building on top of an existing game engine.  My first voxel game was written on top of Urho3D.  I never finished it because I decided to learn Vulkan and write my own game engine on top of it.

I don't know Rust, but I've read good things about it.

I look forward to seeing how your game progresses!

(2 edits) (+1)

Hey, thanks! I won't be using all of Bevy; I'm going to write my own voxel renderer, since I'm not a fan of how Bevy does its default rendering. So I'll just be writing my own and integrating it with the rest of the system. It's a neat game engine; this project was made using only the ECS side of Bevy, with no default renderer or other libs at all: https://store.steampowered.com/app/2198150/Tiny_Glade/ Kinda similar to what I'm doing, but with voxels.

(2 edits) (+1)

Hopefully you won't run into the problem I did with getting the raymarched voxel rendering to line up with raster rendering (triangles, lines, points).  The raymarched voxel rendering also has to write to and/or read from the z-buffer.  I eventually figured it all out.

In short: the raymarched voxel rendering shader has to receive, via a uniform buffer object, the same view-projection matrix and inverse view-projection matrix that raster rendering does. The inverse view-projection matrix is used to calculate the ray origin and direction vectors for the raymarched voxel rendering, and the view-projection matrix is used to calculate the z-buffer writes and/or reads based on the ray-to-voxel hit.
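Roughly, the math looks like this (a minimal CPU-side sketch in Rust using glam, the math crate Bevy uses; the function names are just for illustration, and in practice this lives in the shader):

```rust
use glam::{Mat4, Vec3, Vec4};

/// Reconstruct a world-space ray for one pixel from the inverse
/// view-projection matrix (what the raymarch shader does per pixel).
fn pixel_ray(inv_view_proj: Mat4, pixel: (f32, f32), resolution: (f32, f32)) -> (Vec3, Vec3) {
    // Pixel centre -> normalized device coordinates in [-1, 1].
    let ndc_x = (pixel.0 + 0.5) / resolution.0 * 2.0 - 1.0;
    let ndc_y = 1.0 - (pixel.1 + 0.5) / resolution.1 * 2.0; // flip Y

    // Unproject the same NDC point on the near (z = 0) and far (z = 1)
    // planes (0..1 depth range, as in wgpu/Vulkan).
    let near = inv_view_proj * Vec4::new(ndc_x, ndc_y, 0.0, 1.0);
    let far = inv_view_proj * Vec4::new(ndc_x, ndc_y, 1.0, 1.0);
    let near = near.truncate() / near.w;
    let far = far.truncate() / far.w;

    (near, (far - near).normalize()) // (ray origin, ray direction)
}

/// Project a ray-to-voxel hit point back through the view-projection matrix
/// to get the depth value that the raster passes compare against.
fn hit_depth(view_proj: Mat4, hit_world: Vec3) -> f32 {
    let clip = view_proj * hit_world.extend(1.0);
    clip.z / clip.w // NDC depth
}
```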

If you run into any problems in this area, let me know because I may be able to help you resolve this issue.

Tiny Glade is cool looking and definitely shows what Bevy is capable of!

(1 edit) (+1)

P.S. Technically, fragment shaders cannot read the z-buffer due to limitations imposed by GLSL, but fragment shaders can write to the z-buffer. Given that detail, the raymarched voxel rendering pass should be done as the first rendering pass, so that triangles, lines, points, etc. are z-buffer clipped against the rendered voxels.

At some point I'm going to have to implement multiple raymarched voxel rendering passes for rendering moving voxel objects (falling trees, rotating things), so I will have to introduce a programmer-defined z-buffer into my code to allow multiple voxel rendering passes to clip against each other.
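Conceptually, that programmer-defined z-buffer is just a per-pixel depth array with a compare-and-write test that every voxel pass runs before storing its result. A minimal CPU-side Rust sketch (names made up; on the GPU this would be a storage buffer, and the compare-and-write would have to be atomic):

```rust
/// A hand-rolled depth buffer that several voxel rendering passes
/// can clip against each other with.
struct DepthBuffer {
    depths: Vec<f32>,
    width: usize,
}

impl DepthBuffer {
    fn new(width: usize, height: usize) -> Self {
        // Initialise to the far plane (1.0 in a 0..1 depth range).
        Self { depths: vec![1.0; width * height], width }
    }

    /// Standard "less-than" depth test: keep the fragment only if it is
    /// closer than what is already stored, and record its depth if so.
    fn test_and_write(&mut self, x: usize, y: usize, depth: f32) -> bool {
        let stored = &mut self.depths[y * self.width + x];
        if depth < *stored {
            *stored = depth;
            true
        } else {
            false
        }
    }
}
```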

(+1)

Indeed, if I run into any trouble I will ask :D I think I'll probably do the voxel raytracing first and the meshing later. I'll have the voxel raytracer write to the depth buffer, and then have the rasterizer compare the depth-buffer distance to decide whether it should render that particular pixel or not. But I'll be doing all the raytracing in a compute shader, which will fill a texture that is then read by the fragment shader.
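On the wgpu side that would look roughly like this (just a sketch, assuming a recent wgpu; the helper name is made up): the compute pass writes into a texture created with both storage and sampled usage, so the later fullscreen fragment pass can read it.

```rust
use wgpu::{
    Device, Extent3d, Texture, TextureDescriptor, TextureDimension, TextureFormat, TextureUsages,
};

/// Create the colour target that the voxel raytracing compute shader writes
/// into and that the fullscreen fragment pass samples from afterwards.
fn create_raytrace_target(device: &Device, width: u32, height: u32) -> Texture {
    device.create_texture(&TextureDescriptor {
        label: Some("voxel raytrace output"),
        size: Extent3d { width, height, depth_or_array_layers: 1 },
        mip_level_count: 1,
        sample_count: 1,
        dimension: TextureDimension::D2,
        // rgba8unorm is a valid write-only storage texture format.
        format: TextureFormat::Rgba8Unorm,
        // STORAGE_BINDING: written by the compute shader;
        // TEXTURE_BINDING: sampled by the fragment shader later.
        usage: TextureUsages::STORAGE_BINDING | TextureUsages::TEXTURE_BINDING,
        view_formats: &[],
    })
}
```

The depth output would need its own texture (e.g. an R32Float storage texture, since depth formats can't be written from a compute shader) for the rasterizer to compare against.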

(+1)

PS: That's how I plan to get GI. I'll just have all the rays check if they hit a voxel and, if so, add it to a buffer of hit voxels, then have yet another compute shader shade all the hit voxels, the result of which will be per-voxel GI. (Not complete GI, just area-wide, not full-world GI.)
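Conceptually the two passes look like this (a CPU-side Rust sketch with made-up names; on the GPU the hit list would be a storage buffer appended to with an atomic counter):

```rust
use std::collections::{HashMap, HashSet};

type VoxelCoord = [i32; 3];

/// First pass (conceptually the primary-ray compute shader): every ray that
/// hits a voxel records that voxel's coordinate, deduplicated here so each
/// voxel is shaded only once.
fn collect_hits(primary_hits: impl Iterator<Item = VoxelCoord>) -> Vec<VoxelCoord> {
    let mut seen = HashSet::new();
    primary_hits.filter(|v| seen.insert(*v)).collect()
}

/// Second pass (conceptually a separate compute shader): shade only the
/// voxels that were actually hit, producing a per-voxel GI cache.
fn shade_hit_voxels(
    hits: &[VoxelCoord],
    shade: impl Fn(VoxelCoord) -> [f32; 3], // e.g. gather bounce lighting
) -> HashMap<VoxelCoord, [f32; 3]> {
    hits.iter().map(|&v| (v, shade(v))).collect()
}
```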

That sounds like a good optimization technique!