
unrealengineskyatmosphere's People

Contributors

sebh, tcantenot


unrealengineskyatmosphere's Issues

Ellipsoid scaling

Hi Sébastien,
Thank you for sharing this awesome technique. I have already implemented the approach in a toy project, a VR space flight simulator: https://www.youtube.com/watch?v=LGN-KqRX9Z4
Since the simulation is rather realistic, I have a non-spherical Earth.
Where would be the best place to skew the calculations so that the horizon (EarthBottomRadius) starts 21 km lower at the north/south poles?
I assume replacing all ray-to-sphere intersection calculations with non-spherical ones might be too costly on the Oculus Quest.
Thanks in advance.
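In case it helps others with the same question: one cheap option I have been considering (my own sketch, not code from this repo; all names are mine) is to keep the existing quadratic intersection but scale the ray into a space where the ellipsoid becomes a unit sphere. Because the scaled direction is not renormalized, the solved t is also valid in world space, so the rest of the pipeline can stay unchanged. The radii below are illustrative (6360 km equatorial, 21 km polar flattening):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { double x, y, z; };

static double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Intersect a ray with an axis-aligned ellipsoid centered at the origin with
// semi-axes (rx, ry, rz). We rescale into the space where the ellipsoid is a
// unit sphere; because the direction is NOT renormalized, the returned t is
// valid in world space too. Returns the nearest positive hit, or -1 on miss.
double rayEllipsoid(Vec3 o, Vec3 d, double rx, double ry, double rz)
{
    Vec3 os = { o.x / rx, o.y / ry, o.z / rz };
    Vec3 ds = { d.x / rx, d.y / ry, d.z / rz };
    double a = dot(ds, ds);
    double b = 2.0 * dot(os, ds);
    double c = dot(os, os) - 1.0;
    double disc = b * b - 4.0 * a * c;
    if (disc < 0.0) return -1.0;
    double t0 = (-b - std::sqrt(disc)) / (2.0 * a);
    double t1 = (-b + std::sqrt(disc)) / (2.0 * a);
    if (t0 > 0.0) return t0;
    if (t1 > 0.0) return t1;
    return -1.0;
}
```

This is only slightly more expensive than the sphere test (a few extra divides), which might be acceptable even on the Quest.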

Dither to avoid the Banding

Hello Sébastien,

Great contribution, and thanks for sharing production-ready code. One thing I noticed is that no matter the sky LUT format (F16, F32) or resolution, there is visible banding:

  1. Near the horizon, due to texel compression: the top of the sky gets a low texel count.
  2. In the smooth gradients of the sky itself.

I was able to fix this by applying a jitter at the end, after tonemapping. I tried it pre-tonemapping and it did not work as well, due to the range of F16/F32.
I think Unreal defaults to TAA, which inherently eats up these banding artifacts.

For anyone interested, here is sample code. It works well for an F16 sky LUT at 256 x 144:

// Interleaved gradient noise, used as a per-pixel dither pattern.
inline float sampleInterleavedGradientNoise(float2 pixelPos)
{
    const float3 magic = float3(0.06711056f, 0.00583715f, 52.9829189f);
    return frac(magic.z * frac(dot(pixelPos, magic.xy)));
}

// Apply a +/- 0.5/255 dither to break up 8-bit quantization banding.
inline float3 applyDitherToPixelColor(float3 pixelColor, float2 pixelPos)
{
    const float2 scaleBias = float2(1.f / 255.f, -0.5f / 255.f);
    float noiseDither = sampleInterleavedGradientNoise(pixelPos) * scaleBias.x + scaleBias.y;
    return pixelColor + noiseDither;
}
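As a quick sanity check, here is a standalone C++ port of the snippet above (my own port, not code from the repo), verifying that the noise stays in [0, 1) and that the final offset stays within the intended +/- 0.5/255 range:

```cpp
#include <cassert>
#include <cmath>

// C++ port of the HLSL interleaved gradient noise (frac -> value - floor).
double sampleInterleavedGradientNoise(double px, double py)
{
    const double mx = 0.06711056, my = 0.00583715, mz = 52.9829189;
    double f = px * mx + py * my;
    f -= std::floor(f);            // frac(dot(pixelPos, magic.xy))
    double v = mz * f;
    return v - std::floor(v);      // frac(magic.z * ...)
}

// The dither offset added to each channel: noise remapped to [-0.5/255, +0.5/255).
double ditherOffset(double px, double py)
{
    const double scale = 1.0 / 255.0, bias = -0.5 / 255.0;
    return sampleInterleavedGradientNoise(px, py) * scale + bias;
}
```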

Possible small issue

In order to build the project with SDK v10 and toolset v142, I had to add #include <string> to Dx11Device.h.

Biased spherical integration in NewMultiScattCS

The multi-scattering precompute shader numerically integrates incoming light using sample rays distributed uniformly along polar coordinates:

// Reference. Since there are many samples, it requires MULTI_SCATTERING_POWER_SERIE to be true
// for accuracy and to avoid divergences (see declaration for explanations)
#define SQRTSAMPLECOUNT 8
const float sqrtSample = float(SQRTSAMPLECOUNT);
float i = 0.5f + float(ThreadId.z / SQRTSAMPLECOUNT);
float j = 0.5f + float(ThreadId.z - float((ThreadId.z / SQRTSAMPLECOUNT) * SQRTSAMPLECOUNT));
{
    float randA = i / sqrtSample;
    float randB = j / sqrtSample;
    float theta = 2.0f * PI * randA;
    float phi = PI * randB;
    float cosPhi = cos(phi);
    float sinPhi = sin(phi);
    float cosTheta = cos(theta);
    float sinTheta = sin(theta);
    WorldDir.x = cosTheta * sinPhi;
    WorldDir.y = sinTheta * sinPhi;
    WorldDir.z = cosPhi;
    SingleScatteringResult result = IntegrateScatteredLuminance(pixPos, WorldPos, WorldDir, sunDir, Atmosphere, ground, SampleCountIni, DepthBufferValue, VariableSampleCount, MieRayPhase);
    MultiScatAs1SharedMem[ThreadId.z] = result.MultiScatAs1 * SphereSolidAngle / (sqrtSample * sqrtSample);
    LSharedMem[ThreadId.z] = result.L * SphereSolidAngle / (sqrtSample * sqrtSample);
}
#undef SQRTSAMPLECOUNT

Uniformly distributed polar coordinates concentrate samples around the poles (phi close to 0 or PI) and leave them sparse around the equator (phi = PI/2). Traditionally this is compensated for by weighting each sample by a sin(phi) factor, but I don't see that happening here. Does NewMultiScattCS have an unintended bias, or is this compensated for elsewhere? If so, where?

As always, thanks for publishing this excellent paper and the very helpful example code regardless.
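To quantify the concern: summing per-sample solid angles over an N x N (theta, phi) grid only reproduces the full 4*pi steradians when each sample carries a sin(phi) Jacobian; with equal weights, which is what SphereSolidAngle / (sqrtSample * sqrtSample) implies, the polar samples are over-represented. A small numerical check (my own sketch, not repo code):

```cpp
#include <cassert>
#include <cmath>

// Sum the solid angle covered by an n x n grid of (theta, phi) samples.
// With the sin(phi) Jacobian the total converges to 4*pi; without it,
// every sample is weighted equally and the sum comes out to 2*pi^2 instead.
double sphereIntegralWeight(int n, bool useJacobian)
{
    const double PI = 3.14159265358979323846;
    double sum = 0.0;
    for (int i = 0; i < n; ++i)
    {
        for (int j = 0; j < n; ++j)
        {
            double phi = PI * (j + 0.5) / n;   // polar angle in [0, PI]
            double dTheta = 2.0 * PI / n;
            double dPhi = PI / n;
            sum += (useJacobian ? std::sin(phi) : 1.0) * dTheta * dPhi;
        }
    }
    return sum;
}
```

The equal-weight total (2*pi^2, about 19.7 sr) overshoots the sphere's 4*pi (about 12.6 sr), which is the bias the question is about.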

Shouldn't aerial perspective froxels respect terrain?

I am a little bit confused about the way aerial perspective froxels are supposed to work. I understand that they basically store the integrated (in-)scattering and transmittance of a slice at a certain depth. However, this also implies that if the sun is obstructed by terrain, all scattering effects are still applied on top of the geometry, even where one would not expect them. I realized this was an issue when I moved the sun behind a mountain and could still see the Mie halo. I naively disabled Mie scattering for AP, but then realized that this makes no sense in situations where Mie scattering is expected, i.e. when the sun is (partially) visible.

The following image demonstrates what I mean. Right is Mie on, left is off. Top is with the sun obstructed, bottom is with the sun visible. Basically, I want to get rid of the Mie effect in the top row while retaining it in the bottom row.

[image: four-way comparison of Mie scattering on/off with the sun obstructed/visible]

I also tried pre-multiplying the luminance by the transmittance when applying the AP, which reduced the effect but did not completely eliminate it, especially when the obstruction comes from geometry far away from the camera. This makes sense given the way I understand this implementation of aerial perspective. So... shouldn't the terrain already be considered when computing the froxels? Do you have any advice on how to solve this issue in an adapted implementation?
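For context, this is the per-pixel compositing I am doing (a minimal sketch with my own names, not this repo's code). The blend itself cannot remove in-scattering that was already baked into the froxel rgb, so any sun occlusion by terrain would have to be handled earlier, during froxel integration (e.g. by sampling a shadow map per froxel sample):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { double x, y, z; };

// One sample fetched from the AP froxel volume at the scene depth:
// rgb = in-scattered luminance up to that depth, a = transmittance.
struct ApSample { Vec3 inscatter; double transmittance; };

// Standard aerial-perspective compositing: attenuate the scene radiance by
// the stored transmittance and add the stored in-scattering. If the froxel
// integration ignored sun occlusion, the inscatter term still carries the
// (unwanted) Mie halo and this blend cannot remove it.
Vec3 applyAerialPerspective(Vec3 scene, ApSample ap)
{
    return { scene.x * ap.transmittance + ap.inscatter.x,
             scene.y * ap.transmittance + ap.inscatter.y,
             scene.z * ap.transmittance + ap.inscatter.z };
}
```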

Luminance values

Hello,
First of all, thank you very much for providing the implementation example for your excellent document!
I tried to implement it in my own project, but now everything that is not the sky is far too bright with the 10.0 exposure used in this paper and in Bruneton's paper.

If I understand the equations and the code, the HDR buffer contains luminance values.
According to Wikipedia (https://en.wikipedia.org/wiki/Orders_of_magnitude_(luminance)), an average cloudy sky has a luminance of about 2 kcd/m², but in this implementation the bright blue sky seems to be on the order of 10^-1 cd/m².

Is a 10.0 exposure appropriate for outdoor scenes? Am I missing something with the values in the HDR buffer?
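For comparison, here is the standard photographic mapping from luminance to exposure that I used as a sanity check (a sketch using the common EV100 relations with calibration constant K = 12.5 and ISO 100; these constants and names are my own choices, not values from this repo). For a 2 kcd/m² sky it yields an exposure around 5e-5, far below 10.0, which is consistent with the HDR buffer here not being in cd/m²:

```cpp
#include <cassert>
#include <cmath>

// EV100 for a given average luminance (cd/m^2), using the common
// calibration constant K = 12.5 at ISO S = 100: EV100 = log2(L * S / K).
double ev100FromLuminance(double luminance)
{
    return std::log2(luminance * 100.0 / 12.5);
}

// Linear exposure multiplier from EV100; the 1.2 factor is a commonly used
// allowance for lens transmittance/vignetting in this approximation.
double exposureFromEv100(double ev100)
{
    return 1.0 / (1.2 * std::pow(2.0, ev100));
}
```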

Throughput is not updated with uniform sampling

Hello,
Thank you for sharing this code. It helped me a lot in understanding volume rendering algorithms.
I'm reading the path tracing implementation, and I'm a little perplexed by the following code. With uniform sampling, phaseValue / phasePdf is not a constant 1, so why is the throughput not updated?

#if MIE_PHASE_IMPORTANCE_SAMPLING
    // This code is also valid for the uniform sampling, but we optimise it out
    // if we do not use anisotropic phase importance sampling.
    float phaseValue, phasePdf;
    phaseGenerateSample(ptc, nextRay.d, ScatteringType, phaseValue, phasePdf);
    throughput *= phaseValue / phasePdf;
#else
    nextRay.d = getUniformSphereSample(random01(ptc), random01(ptc)); // Simple uniform distribution.
#endif

With MieScattScale larger than 1.0 (this requires changing the SliderFloat v_max) and the sun behind the camera, the sky is brighter when MIE_PHASE_IMPORTANCE_SAMPLING is disabled than when it is enabled.
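To illustrate why the update matters, here is a small Monte Carlo check (my own sketch, using Henyey-Greenstein as a stand-in phase function; all names are mine). With uniform sphere sampling the correct weight is phaseValue / phasePdf with phasePdf = 1/(4*pi); its expectation is 1 because the phase function is normalized over the sphere, but any individual sample's weight is not 1 unless the phase is isotropic:

```cpp
#include <cassert>
#include <cmath>
#include <random>

// Henyey-Greenstein phase function, normalized so it integrates to 1
// over the sphere of directions.
double hgPhase(double cosTheta, double g)
{
    const double PI = 3.14159265358979323846;
    double denom = 1.0 + g * g - 2.0 * g * cosTheta;
    return (1.0 - g * g) / (4.0 * PI * denom * std::sqrt(denom));
}

// Monte Carlo estimate of E[phase / pdf] under uniform sphere sampling,
// where pdf = 1/(4*pi). This converges to 1 for any normalized phase, which
// is exactly the per-sample factor the uniform branch omits.
double estimatePhaseIntegral(double g, int n, unsigned seed)
{
    const double PI = 3.14159265358979323846;
    std::mt19937 rng(seed);
    std::uniform_real_distribution<double> u(0.0, 1.0);
    double sum = 0.0;
    for (int i = 0; i < n; ++i)
    {
        double cosTheta = 1.0 - 2.0 * u(rng);  // uniform on the sphere in cosTheta
        sum += hgPhase(cosTheta, g) / (1.0 / (4.0 * PI));
    }
    return sum / n;
}
```

Since the estimator only averages to 1 (with per-sample variance growing with |g|), dropping the phase/pdf factor in the uniform branch biases individual paths, even though an isotropic phase (g = 0) would make the factor exactly 1.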

Role of SampleSegmentT in IntegrateScatteredLuminance

First off, thanks for publishing this code and the corresponding paper! Very cool work.

I'm working on an implementation, and I'm a bit perplexed by the use of SampleSegmentT in the core integration loop. It looks like it has the effect of shifting the sample positions forward by 0.3 times the interval between samples. I also didn't see any trapezoidal weighting as used in Bruneton's method.

Could you clarify the reasoning behind the integration method you went with? Thanks!
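To make the question concrete, here is how I read the scheme (my own sketch, not repo code): with N segments over [0, tMax], each sample sits at t = (i + SampleSegmentT) * tMax / N, i.e. 30% into its segment instead of at the midpoint. A quick check against an analytic transmittance integral shows both converge, with the midpoint (0.5) slightly more accurate:

```cpp
#include <cassert>
#include <cmath>

// Riemann-style integration of exp(-sigma * t) over [0, tMax] with n segments,
// sampling each segment at fraction sampleSegmentT of its length
// (0.5 = midpoint rule; the shader uses 0.3).
double integrateTransmittance(double sigma, double tMax, int n, double sampleSegmentT)
{
    double dt = tMax / n;
    double sum = 0.0;
    for (int i = 0; i < n; ++i)
    {
        double t = (i + sampleSegmentT) * dt;
        sum += std::exp(-sigma * t) * dt;
    }
    return sum;
}
```

The analytic result is (1 - exp(-sigma * tMax)) / sigma, so the residual error of the 0.3 offset is easy to measure directly.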
