this post was submitted on 10 Aug 2023
22 points (100.0% liked)
Godot
you are viewing a single comment's thread
Good answer. Especially if it could degrade gracefully at low performance settings, without temporal artifacts. E.g., have ray-surface hits act like point projectors with approximate visibility, so indirect lighting splashes brightness across a soft region instead of creating a hotspot.
I think there's a general solution to noise that's gone unused.
Okay, Metropolis light transport jiggles steps in each bright path to find more low-probability / high-intensity paths. Great for caustics. But it's only used on individual pixels: it samples a lot of light coming into one spot. We want the inverse of that.
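For intuition, here's the Metropolis acceptance rule in one dimension. This is only a toy sketch: real MLT mutates entire light paths and tracks their full measurement contribution, but the accept/reject step is the same idea, so samples pile up where the "brightness" function is high.

```python
import random

def metropolis_samples(f, x0, n, step=0.05, seed=1):
    # Metropolis mutation loop: jitter the current sample and accept the
    # mutation with probability min(1, f(y)/f(x)). MLT applies this to
    # whole light paths; here f is a toy 1D stand-in for path brightness.
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    out = []
    for _ in range(n):
        y = x + rng.uniform(-step, step)
        if 0.0 <= y <= 1.0:              # stay on the toy domain
            fy = f(y)
            if rng.random() < min(1.0, fy / fx):
                x, fx = y, fy            # accept the jittered "path"
        out.append(x)
    return out

# a narrow bright spike (caustic-like) on a dim background
f = lambda x: 1.0 if 0.4 < x < 0.5 else 0.01
xs = metropolis_samples(f, 0.45, 20000)
frac_in_peak = sum(0.4 < x < 0.5 for x in xs) / len(xs)
```

Most samples end up inside the bright 10% of the domain, which is exactly why MLT is good at caustics: once one bright path is found, its mutations keep exploring the bright neighborhood.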
When light hits a point, it can scatter off in any direction, with its brightness weighted by how probable that direction is (i.e., by the BRDF). So... every point is a light source. It's not uniform, but it is real light. You could test visibility from every hit along any light path to every point onscreen, and it would remain a true unbiased render that would eventually converge.
The sensible reduction of that is to test visibility in a cone from the first bounce offscreen. Like if you're looking at a wall lit by the moon, it goes eye, wall, moon, sun. Metropolis would jitter to different points on the moon to light the bejeezus out of that one spot on the wall. I'm proposing to instead check moon-to-wall visibility, for that exact point on the moon, but nearby spots on that wall. (Deferred rendering can skip testing between the wall and your eye. Pick visible spots.)
One spot on the moon would not accurately recreate soft moonlight, but Blender's newer Eevee renderer proves that a dozen can suffice.
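The reuse step could be sketched like this, assuming a deferred pipeline where visibility can be queried per shaded point. Everything here is a stand-in: the disc-shaped screen-space footprint, the tent falloff, and the `visible` callback into a hypothetical G-buffer shadow query.

```python
import math

def splat_from_vertex(vertex_brightness, center_px, radius_px, visible):
    # Reuse one light-path vertex (e.g. the sampled point on the moon) for
    # a neighborhood of pixels: instead of re-sampling the moon per pixel,
    # test visibility from that single vertex to nearby shaded points and
    # splat its contribution over a disc around the original hit.
    cx, cy = center_px
    out = {}
    for dy in range(-radius_px, radius_px + 1):
        for dx in range(-radius_px, radius_px + 1):
            if dx * dx + dy * dy > radius_px * radius_px:
                continue                     # outside the disc
            px = (cx + dx, cy + dy)
            if visible(px):
                # cheap tent falloff so the splat fades at the disc edge
                w = 1.0 - math.hypot(dx, dy) / (radius_px + 1)
                out[px] = vertex_brightness * w
    return out

# toy visibility: pretend every other pixel column is shadowed
img = splat_from_vertex(1.0, (8, 8), 3, lambda px: px[0] % 2 == 0)
```

The per-pixel visibility test is what keeps this from bleeding light through walls; with it, one moon sample brightens a whole soft patch of wall instead of a single noisy pixel.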
One sample per pixel could mean a million of them.
You don't need to go from all of them to every point onscreen to get some pretty smooth results. Basically it's "instant radiosity" but with anisotropic virtual point sources. It's just a shame shadowing still needs to be done (or faked); otherwise applying them would be doable entirely in screen space.
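The gather step might look like this, with a handful of hand-placed VPLs standing in for a real light-path tracer and the shadow test omitted (assuming full visibility). The min(1/r², clamp) term is the standard instant-radiosity guard against exactly the hotspots mentioned at the top of this thread.

```python
import math

# VPLs as (position, normal, flux): pretend a light-path tracer deposited
# these along a strip of floor; a dozen suffice, per the Eevee observation.
vpls = [((x * 0.1, 0.0, 0.5), (0.0, 1.0, 0.0), 0.1) for x in range(12)]

def gather(recv_pos, recv_normal, albedo=0.5, clamp=4.0):
    # Sum clamped VPL contributions at one receiver point. Clamping 1/r^2
    # trades a little darkening for freedom from bright splotches when a
    # VPL sits right next to the receiver.
    total = 0.0
    for p, n, flux in vpls:
        d = [r - q for r, q in zip(recv_pos, p)]
        r2 = sum(c * c for c in d)
        r = math.sqrt(r2)
        w = [c / r for c in d]                   # unit vector VPL -> receiver
        cos_v = max(0.0, sum(a * b for a, b in zip(n, w)))
        cos_r = max(0.0, -sum(a * b for a, b in zip(recv_normal, w)))
        total += flux * (albedo / math.pi) * cos_v * cos_r * min(1.0 / r2, clamp)
    return total

# shade a row of points on a wall facing the VPL strip
wall = [gather((x * 0.1, 1.0, 0.5), (0.0, -1.0, 0.0)) for x in range(12)]
```

The result varies smoothly along the wall with no per-pixel noise at all, which is the appeal: the only remaining cost is the visibility term this comment laments.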