Emissive voxels and fancy lighting [Voxel Devlog #19]
Games
Try CodeCrafters for free today: app.codecrafters.io/join?via=...
Scratchapixel's path tracing resource: tinyurl.com/y9e7wn5w
Online demo: github.com/DouglasDwyer/octo-...
Additional voxel models: tinyurl.com/mtk5mj7f
Path tracing looks amazing for voxel scenes! This devlog showcases my game engine's lighting system, which uses path-traced global illumination to simulate ambient occlusion and emissive materials. I discuss the fundamental rendering equation and how I chose a radiance function. Then, I talk about my unique approach to denoising using a GPU-side hashmap.
Music used in the video:
Rosentwig - On My Way
C418 - This Doesn't Work
C418 - Divide by Four Add Seven
Comments: 135
Looks amazing!
@TheDroidsb
7 days ago
Ayyy awesome to see you here 😂
@aspidinton
6 days ago
Yooo it's the vroom vroom guy, pog
@AndrewTSq
4 days ago
And if my ears caught that right, it's running on a GTX 1660 Ti :O
0:36 First things first: *the unfettered power of the sun*
@DouglasDwyer
7 days ago
Get yourself some sunglasses 😎
That's so cool that you can do temporal smoothing without smearing the screen because it's per-voxel not per-pixel!
For SVGF denoising, but retaining the per-voxel or per-voxel-face look, you can just remove the depth based weighting function of SVGF and instead weight based on voxel world-space position and voxel normal. Do this weighting both temporally and spatially and you get a neat pixelated lighting look! I'm doing that in my own project and it's actually quite simple to implement, as with voxels you deal with very simple shapes compared to triangles :)
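The weighting this commenter describes might look something like the sketch below. It is illustrative only: the function name and the `sigma` tuning constants are made up for the example, not taken from SVGF or from any engine mentioned here. The idea is simply to replace SVGF's depth term with a world-space voxel-position term and keep a normal term, so the filter never blurs across voxel faces.

```rust
/// Edge-stopping weight for an SVGF-style filter, adapted to voxels:
/// instead of SVGF's depth term, compare voxel world-space positions
/// and normals so lighting never blurs across voxel faces.
/// `sigma_pos` and `sigma_normal` are illustrative tuning values.
fn voxel_weight(
    pos_a: [f32; 3],
    pos_b: [f32; 3],
    normal_a: [f32; 3],
    normal_b: [f32; 3],
) -> f32 {
    let sigma_pos = 1.0_f32;     // positional falloff, in voxel units
    let sigma_normal = 32.0_f32; // sharpness of the normal test

    // Squared world-space distance between the two voxel centers.
    let dist_sq: f32 = pos_a
        .iter()
        .zip(pos_b.iter())
        .map(|(a, b)| (a - b) * (a - b))
        .sum();

    // Cosine of the angle between the two normals, clamped to [0, 1].
    let cos_n: f32 = normal_a
        .iter()
        .zip(normal_b.iter())
        .map(|(a, b)| a * b)
        .sum::<f32>()
        .max(0.0);

    // Gaussian falloff on position, power falloff on normal (as in SVGF).
    (-dist_sq / (2.0 * sigma_pos * sigma_pos)).exp() * cos_n.powf(sigma_normal)
}
```

Applied both spatially and temporally, a weight like this keeps samples from neighboring voxels on the same flat surface while rejecting samples from across an edge, which is what produces the "pixelated lighting" look described above.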
Tech nit: what you refer to as "indirect lighting" is actually still "direct lighting". You are sampling rays to global light + random ray based on your BRDF, but both are still just the "1st hit" contribution, which is still called "direct lighting". "Indirect lighting" is when you continue bouncing the ray based on BSDF, which is 2nd and more bounces. So more proper name of the passes would be "GlobalLightSamplingPass" and "BSDFSamplingPass".
@DouglasDwyer
7 days ago
Thanks for clarifying! You are absolutely right - I have not yet studied computer graphics formally, so it's easy to mix up the terms :)
@lunatikreal1384
6 days ago
BDSMSamplingPass
The higher the resolution, the more bitrate KZread grants a video. If you can't record above 1080p, then upscale the video to at least 1440p so that KZread retains most of the quality of the original upload.
The optimization is so good that the game runs at a better FPS than a real-life camera, wow
Wow, really impressive work! Interestingly, a few weeks ago I started work on implementing a per-voxel hashmap into my own engine for lighting. My whole pipeline was basically identical to yours, great minds think alike haha. I have since changed my technique slightly though and have thus far only used it for soft shadows. Excited to see what you do next!
@DouglasDwyer
6 days ago
I'm not surprised to hear that. When I was exploring this idea, your original Doon Engine was one place that I looked for inspiration (since it also leverages per-voxel calculations).
@frozein
5 days ago
@@DouglasDwyer Glad my old engine is still useful! The hashmap approach seems very promising, basically a denoiser+stylizer in one. I think DoonEngine had the right idea but doing the lighting in a separate compute shader was just too slow and memory-intensive.
@frozein
5 days ago
@@DouglasDwyer Congrats on 10k by the way!
This is amazing. It's turning into exactly what I picture when I think "voxel engine"! The fact that this is all running on a 1660ti is just mind boggling, too. One minor nitpick, as a french speaker: "À-trous" is french for "at/toward holes" and sounds a little more like "ah true", not "a truce". Not a big deal though, especially compared to the insane amount of code work you've put in 😂
There is a fourth option: bake lighting values in real time, with unlimited samples per voxel. That is, pre-calculate the voxel radiance: you don't actually sample any light in real time, but instead bake the voxel lighting as fast as possible in some priority order. Fully baked scene lighting is an example of this, and it's the best. Dynamic updates to the lighting then make it work like Minecraft RTX's lighting updates: semi-realtime, but the baked lighting is always correct. No denoising required, a higher quality image, and actual renders. Very cool.
@theneonbop
7 days ago
"dynamic updates to the lighting" "bake the voxel lighting as fast as possible in some priority order" Isn't this just effectively the same as what he's doing, per voxel lighting with temporal denoising, where closer voxels (covering more of the screen) get more samples?
@gsestream
7 days ago
@@theneonbop Most of it is very close; the emphasis here is on separating the lighting updates from the final render completely. I.e., the lighting can remain the same over all rendered views, i.e., fully pre-baked voxel lighting. Zero denoising blurring.
Man you should be proud of this project, you've come a long way and it's looking really good!
I'm not kidding when I say this is among my favorite series. It's amazing what you're able to accomplish
Congrats on 10k!
@DouglasDwyer
7 days ago
Thank you :)
@mohammadazad8350
7 days ago
Don't lie, there is no way he has [insert 10,000! here] subscribers! _This joke failed due to technical limitations: KZread doesn't seem to like this 35,660-digit number!_
@Zevest
7 days ago
@@mohammadazad8350 I’m not gonna lie I had to read it 4 times to understand it lol.
As soon as I see who this video is from, nothing else matters; I've already clicked to watch it
@DeGandalf
7 days ago
Same. Also one of the few channels which get an instant like, because I already know that the quality of the content and editing will be super good.
Cool video! If anyone wants to learn more about the rendering equation by the way, Acerola made an entire video on it. I believe it's his most recent video about how realistic and stylized graphics all work under the same premises.
I've been watching this come along for years. Your work has been very inspiring. Thank you!
This is awesome! Your work is always inspiring. I had a similar idea for spatial ray reuse that I wanted to leverage for my own voxel engine, though it was texel based since my engine is designed to be more Minecraft like. The fact you got this working is very cool and I can't wait to see where you go with it! You are truly innovating :)
Congrats on 10k! Keep up the great work!
Would love to see some BRDF magic next. Iirc you've got unique per-voxel normals, I'm curious to see how that would look with even some basic fresnel-driven gloss. Also HDR colors + basic tone mapping would make emissive voxels/skyboxes look insane
This really has come so far! Keep up the great work!
Well done, this is really good stuff.
Looking good! Exciting stuff
You could draw the material at blank locations as if rasterizing, and then blend it, and draw at the final bounce as if rasterizing.
Keep it up man! You are reaching something truly awesome!
Awesome! I really love the look of the per voxel path tracing
Congrats on 10k❤
I can't wait to see where this project goes next!
Congrats on 10k subs!
Your perseverance and skills are very inspiring. Keep up the great work!
Very impressive progress over the time you've spent working on this. Keep it up!
LETS GO looks amazing
@ThePC007
7 days ago
Oh, you’re here, too. ^^
This is really amazing !!
this looks phenomenal
you are doing god's work here.
Amazing stuff!
Keep up the good work, it really looks amazing!! The lighting works very fast and stably in my testing after some small settings tweaks
looks incredible!
Amazing work! Great solutions for denoising, the shot of the tree at night really sells the value of the emissive voxels!
Very nice, lets see paul allens voxel rendering pipeline
So cool! I always look forward to ur videos
@DouglasDwyer
6 days ago
Glad you like them!
Beautiful project
Killer work, man. Your vids are something I always look forward to, keep up the good work!
@DouglasDwyer
7 days ago
Thanks!
Yes! Amazing work!
I've watched most of your videos, you're doing a really good job! I myself have been working on a voxel game for about 2.5 years; funny that we started around the same time. Though in my case I'm not necessarily focused on the graphics engine. I went from using polygons to raymarching a few months ago, but there's still a lot of work to do on various details of it. My pipeline is quite different from yours. I render the scene by raymarching from each pixel in the direction of the sun; if I hit an object, I consider the pixel to be in shadow, else I calculate the intensity of light based on the normal and the direction of the sun. During this pass I create an image with the unlit colors, the intensity of sunlight, the normals and the positions. Then, I render a sphere around each light source, and within this sphere, I read the position and normal from the previously-generated images and cast another ray - towards the light source - and add the intensity to the light image. I finally combine the light image with the unlit color image. It performs surprisingly well even with thousands of light sources, but it does lag on older GPUs when lights are present. Your videos did give me some ideas during the process. Your technique might inspire some more improvements to my lighting system. Keep up the good work and congratulations on everything you've done!
That denoising method is awesome 😁
Just subbed. Amazing stuff!
Not sure I understood half of the things you talked about, but love the look of the game!
Wow. That's insane!
This is awesome 😮
Your engine continues to impress, but your devlog skill continues to improve, as well.
Amazing Douglas, I'm just a novice game dev youtuber, and I aspire to be like you!!!! Keep up the amazing work on this
In case you haven't watched it yet, you might be interested in the 2021 SIGGRAPH presentation called "Global Illumination based on Surfels"
This is good stuff
Voxel de_nuke looks awesome
I'm amazed by every one of your videos; great work, there simply are no alternatives! I wish you success in this difficult endeavor, and I really hope that someday people will start creating games on your engine!
AMAZING !!!!!
mad respect, u program in light mode
Tbh I don't really understand what you explain in your videos, but I still watch them and kinda just sit here in pure amazement; it's so cool. I honestly can't wait to see the finished product
damn voxel pathtracer in rust, this is what I call the CG chad stack. I want too :D
I always wondered if rendering each voxel as a 2D point in 3D space might look or perform any better than rendering cubes. For example something where distant points that would otherwise occupy one pixel are averaged together into a larger, lower density grid, and where points that are close enough to the camera to where you might see between them get expanded to fill the space, maybe with some additional noise or filler texture applied to fake even smaller detail. No idea how well it would work in practice, but it would be interesting to see something like it tested.
Cool!
One thing I noticed is that the time it takes for light to affect the surrounding area is kinda slow; great engine and video though.
Genius
I wonder if you'd be able to easily extend the algorithm to do bounce lighting with a second ray from the bounce. With all the techniques you cooked up, I think it would be smooth and look great. You could even have a setting to determine n bounces for computers with very high specs.
Could you combine the real-time path tracing with some baked lighting to make it even more smoove?
❤
Very impressive work! I'm wondering how well this translates to directional effects like specular reflection or temporally inconsistent effects like moving objects. When I was working on my SSAO shader for OpenMW, I found I could get pretty beautiful SSAO for static scenes with just reprojection and temporal AA, but moving objects caused a lot of problems.
@DouglasDwyer
5 days ago
That's a great question. For specular, I think temporal accumulation wouldn't work, so I would have to cast specular rays during the "direct" (non-path traced) lighting pass. I already do this for sunlight shadows, and the hashmap is still handy as I can cast just one ray per-voxel. I haven't looked at AO or emissive materials with moving objects yet. Maybe it will look weird lol. In that case, I can try to reduce the amount of temporal accumulation that is used.
This is so sick!! I am just itching to get my hands on this engine and play around with it! Speaking of, what do you think you have to get done before you release it?
@DouglasDwyer
7 days ago
I'm glad you like it! Although I will continue to release demos, there's so much to do before the engine is complete. Physics, audio, entities, gameplay, etc. My next goal for the summer is to implement a new physics engine with a proper iterative collision solver.
Basically saying "I've achieved the holy grail of computer graphics" feels *a little* conceited, but that aside, I really like your engine. A lot of these small-voxel engines seem to always try to copy John Lin's work, and concern themselves with outdoor sunlit scenes primarily (which are easier to light), but I appreciate how well yours handles indoor scenes with more varied lighting. Really good work!
And using this, make the ultimate Minecraft :D How big a world can be displayed with LOD? Could it be large enough to have a similar scope to MC?
This looks amazing! What is the performance like?
@DouglasDwyer
6 days ago
You can try the demo for yourself to find out! But on my NVIDIA 1660 TI, I am able to run all scenes at 100 FPS at 1080p. On my Intel UHD 750H iGPU, I need to use a resolution of only 360p, but it runs at 60 FPS.
Have you got a road map? Quo vadis, Douglas? At what point does the engine become a finished product? I've been following you for a while and, yes, the progress is impressive, and kudos to you for not giving up
@DouglasDwyer
6 days ago
Great question! I do have a general idea of what I want to build. The goal is to create a platform where users can create different games, and play games written by other users. The underlying voxel technology should allow these games to leverage building/destruction mechanics in unique ways. The engine becomes a finished product when:
- All core game engine functionality (physics, proper building system, input handling, and third-party plugins) is fully integrated and supported.
- I have built an example game with the engine.
I am taking this summer vacation to just work on the engine. Afterward, I'm going back to university for my senior year. My hope is to spend this summer, and the next year at university, finishing the core functionality. Once it's close to completion, I'll build an example game in it and then release the engine for others to build games with as well. In terms of business plan, I haven't made any concrete plans yet. But I am most interested in revenue-sharing models, where developers can create games in the engine for free, and then publish them on my platform.
So to simulate the "shininess" of a surface, would you just give the Monte Carlo rays a weighting function toward the angle opposite the camera?
@DouglasDwyer
6 days ago
I believe that's correct - that's the "BRDF" part of the rendering equation :)
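As a toy illustration of this exchange: a Phong-style lobe can weight Monte Carlo rays toward the mirror direction. This is a sketch under simple assumptions; the function names are hypothetical, and the engine's actual BRDF may differ.

```rust
/// Reflect an incoming direction about a surface normal
/// (all vectors assumed to be unit length; same convention as GLSL's
/// built-in `reflect`, where `incident` points toward the surface).
fn reflect(incident: [f32; 3], normal: [f32; 3]) -> [f32; 3] {
    let d = 2.0 * (incident[0] * normal[0] + incident[1] * normal[1] + incident[2] * normal[2]);
    [
        incident[0] - d * normal[0],
        incident[1] - d * normal[1],
        incident[2] - d * normal[2],
    ]
}

/// Phong-style lobe weight: sample rays aligned with the mirror direction
/// count for more, with `shininess` controlling how tight the lobe is
/// (higher = glossier surface).
fn specular_weight(view: [f32; 3], normal: [f32; 3], sample_dir: [f32; 3], shininess: f32) -> f32 {
    let r = reflect(view, normal);
    let cos = (r[0] * sample_dir[0] + r[1] * sample_dir[1] + r[2] * sample_dir[2]).max(0.0);
    cos.powf(shininess)
}
```

A ray exactly along the mirror direction gets weight 1, and rays far from it fall off rapidly; that falloff is the "weighting function toward the angle opposite the camera" asked about above.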
brush your hair between your fingers instead of using a comb; it works better and makes your hair look less flat. also, comb your fingers through your hair upwards, not sideways
Next video: Hello guys, this is Douglas. Today I re-created the Matrix, but with better graphics.
I was wondering if it would be possible to make lower resolution scales have sharp edges. You should already know where the edges of all visible voxels are, so instead of simply rendering the pixelized output, you could take the pixels corresponding to each individual face of a voxel, upscale them as their own image (only blending pixels from the same face), and then apply a sort of "mask" following the face's edges so there is a clear cutoff instead of a blurry one. Also, for some reason, if I set the resolution scale below 1x, the frame time stays constant at 6ms no matter if I use a 2x or 5x scale.
@DouglasDwyer
6 days ago
Thanks for trying the demo! You can make the edges appear sharp by turning off antialiasing (which has a very particular appearance at lower resolutions). Maybe that will make things sharp in the way that you describe. As for things staying at 6 ms, you were probably hitting the limit of your display's refresh rate.
Excellent video! It looks much better. And I'm surprised by the very good performance even on my potato pc! But walls still tend to look kind of flat. Is that just a consequence of the voxel engine? Or can it be fixed?
@DouglasDwyer
7 days ago
I think this might just be a consequence of the models that I'm showing off. Adding textures and bumpy surfaces (which I will do when I add proper world editing tools) should hopefully fix it :)
Sounds like temporal ReSTIR
Amazing, if I didn't know any better, I'd think I'm looking at regular 3D models instead of voxels. But please make a game out of it or release it to the public; otherwise we'd end up with another Euclideon or Atomontage that only a few cool videos ever came out of.
@DouglasDwyer
6 days ago
My hope is to do both - make an example game, then release the engine for public use under a revenue-sharing model.
You really should have nearest neighbour upscaled this video to 4k and uploaded it at 4k. KZread compression completely butchered parts of this video.
I'm just wondering - how do you find the normal vector of a voxel when the ray (from eye to fragment) hits it? When I raymarch from eye to fragment, I keep track of which axis the closest plane is on, and use that to decide the normal. The problem is that when the ray hits surfaces at certain angles, the roundoff errors cause the wrong normal to be used, so for example if you are really close to a wall, you see a "grid" around the voxels: this is because around the edges of voxels, the normal reported is the one for the internal (invisible) side. Since I'm not seeing you or any other raymarching YouTubers struggle with this problem, I presume there must be some better way to figure out the normal of the voxel I hit. Do you have suggestions? Thank you in advance if you choose to answer.
@DouglasDwyer
1 day ago
Hey there! There are two main approaches for this. One is to "bake" the normal into the voxel data. This is what I do, and it allows for the smooth per-voxel lighting shown in the video. If you want face normals, you should be able to extract the face normal index from your ray traversal algorithm. If you are using DDA, then the axis of the normal is determined by whichever direction the DDA algorithm last stepped.
@MaddGameMaker
1 day ago
@@DouglasDwyer I'm storing my voxels in an SVO so they don't really have defined 'faces', and as such I don't think I can bake the normals in. At the moment I am extracting the normals from the raymarching algorithm (I don't use DDA at the moment), but I think the problem is that if a voxel gets hit exactly on an edge, the algorithm can't decide which side it technically hit, and so sometimes it decides that it hit the invisible side. I do suspect that it might just be a bug in my implementation. Thank you for your advice!
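The DDA approach described in this exchange can be sketched as follows. This is illustrative only: `is_solid` stands in for whatever world lookup an engine actually uses, and a real traversal would also handle the ray starting inside a solid voxel. The key point is that the face normal falls out of the traversal for free, with no roundoff-prone reconstruction from the hit position.

```rust
/// Minimal 3D DDA (Amanatides & Woo style) over a unit voxel grid.
/// Returns the face normal of the first solid voxel the ray enters.
/// The normal is simply the negated sign of the last step taken, which
/// sidesteps the precision problems of deriving it from the hit point.
fn dda_hit_normal(
    origin: [f32; 3],
    dir: [f32; 3],
    is_solid: impl Fn([i32; 3]) -> bool,
    max_steps: u32,
) -> Option<[i32; 3]> {
    let mut voxel = [
        origin[0].floor() as i32,
        origin[1].floor() as i32,
        origin[2].floor() as i32,
    ];
    let mut step = [0i32; 3];
    let mut t_max = [f32::INFINITY; 3];   // ray param at the next boundary crossing
    let mut t_delta = [f32::INFINITY; 3]; // ray param per full voxel traversed
    for i in 0..3 {
        if dir[i] > 0.0 {
            step[i] = 1;
            t_delta[i] = 1.0 / dir[i];
            t_max[i] = (voxel[i] as f32 + 1.0 - origin[i]) / dir[i];
        } else if dir[i] < 0.0 {
            step[i] = -1;
            t_delta[i] = -1.0 / dir[i];
            t_max[i] = (origin[i] - voxel[i] as f32) / -dir[i];
        }
    }
    for _ in 0..max_steps {
        // Step along whichever axis crosses its next boundary first.
        let axis = if t_max[0] < t_max[1] && t_max[0] < t_max[2] {
            0
        } else if t_max[1] < t_max[2] {
            1
        } else {
            2
        };
        voxel[axis] += step[axis];
        t_max[axis] += t_delta[axis];
        if is_solid(voxel) {
            // The face we entered through is perpendicular to the axis we
            // just stepped along, facing back toward the ray.
            let mut normal = [0i32; 3];
            normal[axis] = -step[axis];
            return Some(normal);
        }
    }
    None
}
```

Because the normal comes from the integer stepping decision rather than from comparing floating-point plane distances, an edge hit still picks exactly one axis, which avoids the "grid around the voxels" artifact the commenter describes.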
Tried running it in the browser and yeah... it's a bit laggy (like 2-5 FPS I'd guess?), but personally I wouldn't be able to do this at even 1 FPS, so great job, considering how great it's looking.
@DouglasDwyer
6 days ago
Thanks for trying out the demo! For full 60 FPS at 1080p, a discrete graphics card is required. If you want to run the game on an iGPU (which I'm guessing is what you're doing), you can turn down the screen resolution using the in-game settings menu. This should help it run at a playable speed.
@sgmvideos5175
6 days ago
Well... maybe that's what happened, I'll try to see later if I can make sure it's running on the stronger one.
Would bounce lighting be too expensive? I think at least a single bounce might be good, as an option I guess you could use the hit voxel's color and shadowmap to do basic lighting for the first bounce, and you could also get lighting from the last frame's hashmap to approximate more bounces (I'm pretty sure this is similar to what Godot's SDFGI and HDDAGI do for traces from the probes). This sounds pretty cheap, because you don't have to do any more traces. (I guess this would replace the ambient occlusion function)
@DouglasDwyer
7 days ago
That's a great question. I haven't really explored this option yet, because the hashmap only stores visible voxels (so stuff offscreen wouldn't be stored). But it's something to try in the future!
@Stowy
7 days ago
@@DouglasDwyer does that mean that you only have indirect lighting from objects on screen ?
@DouglasDwyer
7 days ago
@@Stowy nope, indirect lighting can come from offscreen objects! The hashmap only stores onscreen voxels' lighting, but to calculate their lighting, I cast rays through the world that can hit anything. What the other commenter is suggesting is that I use multiple bounces on my indirect lighting rays. These multiple bounces could take the additional lighting of offscreen voxels into account. But I don't store the lighting of offscreen voxels at present :)
@theneonbop
7 days ago
@@DouglasDwyer Yes, I was thinking you could use the basic lighting model from previous videos to cover offscreen voxels, or voxels that weren't visible in the previous frame. Probably with only sun light and not ambient to avoid light leaks. It wouldn't cover everything, but I think it would be good for most of the common cases, such as sunlight coming through a window and bouncing off of the floor. It would still involve casting extra shadow rays I guess, if the shadowmap only covers screen space voxels.
Did you come up with this on your own (7:02)? I couldn't quite understand it. Impressive how you managed to run path tracing in real time AND without needing a PC from NASA.
@DouglasDwyer
7 days ago
Yes, this is my own original idea. The closest thing that I know about is NVIDIA's SHARC (github.com/NVIDIAGameWorks/SHARC), but I think that technique is slightly more complicated. Sorry that the explanation didn't fully make sense. The core idea is that if you average all of the path-tracing samples over an entire voxel (so the voxel just has one color, from all averaged samples), then things look pretty smooth. So, I create a hashmap or dictionary (which stores one single value per voxel) and as my shaders run, I add each lighting sample to the voxel's key in the dictionary. Then, I am able to use that final result for lighting.
@stickguy9109
7 days ago
@@DouglasDwyer So it's like any path-tracing light accumulation, but instead of doing it on a per-pixel basis you do it per voxel, and since you don't know which voxel is being processed right now, you store them in a dictionary to access them later. Did I get that right? If so, doesn't that mean the noise will come back when you move too much?
@DouglasDwyer
7 days ago
@@stickguy9109 Pretty much. Each pixel on the screen still casts a ray (this means that voxels close to the screen get more samples quickly), but they accumulate the results into the per-voxel map. Yes, when you move there might be noise - that's where the temporal accumulation comes into play. The averaged results from previous frames are re-used by looking up each voxel's entry in the hashmap from the last frame. This eliminates most flickering during motion. There may still be some in certain situations - you can play around on the demo to see how it works :)
@stickguy9109
7 days ago
@@DouglasDwyer Very clever stuff. I'll try the demo now (hope it will run on a potato uhd too)
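The accumulation scheme described in this thread can be sketched on the CPU with a standard dictionary. This is only an illustration of the idea: the names are hypothetical, and the actual implementation discussed in the video is a GPU-side hashmap written in shader code, with temporal reuse of the previous frame's entries on top.

```rust
use std::collections::HashMap;

/// Per-voxel lighting accumulator: every path-traced sample, regardless of
/// which pixel cast it, is added under its voxel's key, and the voxel is
/// shaded with the running average. Voxels covering more pixels naturally
/// converge faster because they receive more samples per frame.
#[derive(Default)]
struct VoxelLightCache {
    // voxel coordinate -> (summed RGB radiance, sample count)
    samples: HashMap<[i32; 3], ([f32; 3], u32)>,
}

impl VoxelLightCache {
    /// Add one path-tracing sample for the given voxel.
    fn add_sample(&mut self, voxel: [i32; 3], radiance: [f32; 3]) {
        let entry = self.samples.entry(voxel).or_insert(([0.0; 3], 0));
        for i in 0..3 {
            entry.0[i] += radiance[i];
        }
        entry.1 += 1;
    }

    /// Average of all samples accumulated for this voxel so far,
    /// or `None` if the voxel has never been sampled.
    fn resolve(&self, voxel: [i32; 3]) -> Option<[f32; 3]> {
        self.samples.get(&voxel).map(|(sum, count)| {
            let n = *count as f32;
            [sum[0] / n, sum[1] / n, sum[2] / n]
        })
    }
}
```

Averaging over the whole voxel rather than per pixel is what makes this a denoiser and stylizer in one, as noted earlier in the comments: the output has exactly one lighting value per voxel, so there is no per-pixel noise left to filter.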
Shame about it only being single-bounce - did you try doing secondary/tertiary bounces too or did that end up being too intensive to run? IMO global illumination really starts looking amazing once you can see the colour of brightly coloured objects bleed onto other objects.
@DouglasDwyer
6 days ago
Agree, multiple bounces would be really cool. I just haven't had time to explore it yet, but I do have some performance concerns. I definitely want to try it at some point :)
Very low FPS in your last demo. I get 15 FPS on an RTX 3050 and a Ryzen 5
@DouglasDwyer
1 day ago
Thanks for trying the demo! If you are using the web version, make sure that it is actually running on your discrete GPU (as opposed to an integrated GPU). You can also try downloading the native demo instead, or decreasing the screen resolution for better framerates. But you should definitely be able to achieve 60 FPS at 1080p; I am able to do so on my 1660 TI.
@Yagir
1 day ago
@@DouglasDwyer I used the .exe file from GitHub and I don't get 60 FPS. Performance is very bad, and there's a delay when I try to remove something.
:)
yes another video also im first lol
i swear this is just Teardown
Amazing stuff!
Congrats on 10k subs !
@DouglasDwyer
6 days ago
Thanks!