$$S(x_i, \vec{\omega}_i, x_o, \vec{\omega}_o) = S_d(x_i, \vec{\omega}_i, x_o, \vec{\omega}_o) + S^{(1)}(x_i, \vec{\omega}_i, x_o, \vec{\omega}_o)$$
The previous few posts all revolved around the questions "how do we get a $S_d(x_i, \vec{\omega}_i, x_o, \vec{\omega}_o)$ sample?" and "how do we distribute those samples well for importance sampling?" Now we can put all these parts together :) One last look (for now) at the integration equation:
$$L(x_o, \vec{\omega}_o) = \int_A \int_\Omega L_i(x_i, \vec{\omega}_i)\, S_d(x_i, \vec{\omega}_i, x_o, \vec{\omega}_o) \cos\theta \, d\omega \, dA$$
The basic flowchart of this part will be:
1. when we intersect a BSSRDF material, start by figuring out the probe radius $R_{max}$ (this could actually be precomputed, but since $\sigma_a$ and $\sigma_{tr}$ may vary in texture space, I compute it on the fly for now; a short derivation follows this list)
2. cast out probe rays for the scene intersection test (we could actually restrict the intersection test to the surfaces carrying this particular BSSRDF material instead of intersecting the whole scene; this is a BIG TODO in my mind for efficiency reasons)
3. for each probe ray intersection sample point, calculate the irradiance (the current method picks a light based on the power distribution and casts only one lighting sample per probe ray intersection; we can definitely do splitting here (multiple lighting samples per BSSRDF sample), another BIG TODO item)
4. evaluate $S_d(x_i, \vec{\omega}_i, x_o, \vec{\omega}_o) = \frac{1}{\pi} F_r(\eta, \vec{\omega}_i)\, R_d(\|x_i - x_o\|)\, F_r(\eta, \vec{\omega}_o)$; the two Fresnel terms can be calculated from the ray direction, the intersection normal, and the refraction index
5. add up the results of steps 3 and 4 with the appropriate MIS weight we derived in the previous post, and divide by the corresponding pdf (picking the light, picking the geometry point on the light surface, picking the axis, picking the disc sample, and the ray/intersection-normal dot product all need to be taken into account)
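A quick note on the probe radius in step 1: following the skipRatio comment in the code below, the probe disc pdf falls off as a Gaussian in $r$, so cutting it off once it drops below a small fraction $\epsilon$ of its peak value gives

$$\frac{pdf(r)}{pdf(0)} = e^{-\sigma_{tr} r^2} = \epsilon \quad\Longrightarrow\quad R_{max} = \sqrt{\frac{\ln(1/\epsilon)}{\sigma_{tr}}}$$

which is exactly the $R_{max} = \sqrt{\log(\text{skipRatio}) / -\sigma_{tr}}$ line in the implementation ($\epsilon = 0.01$ there).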
and here's the naive implementation of the above flowchart:
Color Renderer::LbssrdfDiffusion(const ScenePtr& scene,
    const Fragment& fragment, const BSSRDF* bssrdf, const Vector3& wo,
    const Sample& sample,
    const BSSRDFSampleIndex* bssrdfSampleIndex,
    WorldDebugData* debugData) const {
    const Vector3& pwo = fragment.getPosition();
    float coso = absdot(wo, fragment.getNormal());
    float eta = bssrdf->getEta();
    float Ft = 1.0f - Material::fresnelDieletric(coso, 1.0f, eta);
    float sigmaTr = bssrdf->getSigmaTr(fragment).luminance();
    const vector<Light*>& lights = scene->getLights();
    // figure out the sample probe radius, we ignore the integration
    // of areas where the pdf is too small compared to the center,
    // yes...this introduces bias...but keeps the probe ray as short
    // as possible
    // skipRatio = pdf(r) / pdf(0, 0) = exp(-sigmaTr * r^2)
    float skipRatio = 0.01f;
    float Rmax = sqrt(log(skipRatio) / -sigmaTr);
    Color Lmultiscatter(0.0f);
    for(uint32_t i = 0; i < bssrdfSampleIndex->samplesNum; ++i) {
        const BSSRDFSample bssrdfSample(sample, *bssrdfSampleIndex, i);
        // sample a probe ray with gaussian falloff pdf from intersection
        Ray probeRay;
        float discPdf;
        BSSRDFSampleAxis axis = bssrdf->sampleProbeRay(fragment,
            bssrdfSample, sigmaTr, Rmax, &probeRay, &discPdf);
        Intersection probeIntersect;
        float epsilon;
        // TODO we can use a material indexed lookup table to query
        // surfaces with this particular BSSRDF instead of doing a whole
        // scene intersection test
        if(scene->intersect(probeRay, &epsilon, &probeIntersect)) {
            if(probeIntersect.getMaterial()->getBSSRDF() == bssrdf) {
                const Fragment& probeFragment = probeIntersect.fragment;
                const Vector3& pProbe = probeFragment.getPosition();
                Color Rd = bssrdf->Rd(probeFragment,
                    squaredLength(pProbe - pwo));
                // calculate the irradiance on the sample point
                float pickLightPdf;
                int lightIndex = mPowerDistribution->sampleDiscrete(
                    bssrdfSample.uPickLight, &pickLightPdf);
                const Light* light = lights[lightIndex];
                Vector3 wi;
                float lightPdf;
                Ray shadowRay;
                const Vector3& ni = probeFragment.getNormal();
                Color L = light->sampleL(pProbe, epsilon, bssrdfSample.ls,
                    &wi, &lightPdf, &shadowRay);
                if(L == Color::Black || lightPdf == 0.0f ||
                    scene->intersect(shadowRay)) {
                    continue;
                }
                float cosi = absdot(ni, wi);
                Color irradiance = L * cosi / (lightPdf * pickLightPdf);
                float Fti = 1.0f -
                    Material::fresnelDieletric(cosi, 1.0f, eta);
                // evaluate the MIS weight
                float pdf = discPdf * absdot(probeRay.d, ni);
                float w = bssrdf->MISWeight(fragment, probeFragment, axis,
                    pdf, sigmaTr, Rmax);
                Lmultiscatter +=
                    (w * INV_PI * Ft * Fti * Rd * irradiance) / pdf;
            }
        }
    }
    Lmultiscatter /= (float)bssrdfSampleIndex->samplesNum;
    return Lmultiscatter;
}
For comparison, here is a plain Lambert-shaded version with the same camera/lighting settings and the same number of samples:
We can see that the dark edges (the head and tail parts) in the BRDF shading are softened a lot in the BSSRDF version, since nearby areas contribute lighting; the bumpy body also loses its hard-lined look in the BSSRDF version.
Now moving on to the single scattering part. As the term suggests, this part simulates the case where light refracts specularly into the medium, undergoes one scattering event that changes its direction, then refracts out of the medium and reaches the camera. (Or, thinking of it the other way around: the camera shoots eye rays carrying importance, a ray refracts into the medium, and while traveling through the medium it picks up photons that were emitted from the light and refracted into the medium.) The following images illustrate the single scattering idea:
Image taken from Hery 03; the total radiance received by this eye ray is the integration along the refracted ray →pTo inside the medium
Taking Fresnel, the scattering event, and the phase function into account, Jensen 01 derives the following equation for each sample point along the refracted eye ray:
$$L^{(1)}(x_o, \vec{\omega}_o) = \frac{\sigma_s(x_o)\, F\, p(\vec{\omega}_i', \vec{\omega}_o')}{\sigma_{tc}}\, e^{-s_i' \sigma_t(x_i)}\, e^{-s_o' \sigma_t(x_o)}\, L_i(x_i, \vec{\omega}_i)$$
another round of scary symbol sorting:
$F = F_r(x_o, \eta)\, F_r(x_i, \eta)$: the Fresnel factor
$\sigma_a$: the absorption coefficient
$\sigma_s$: the scattering coefficient
$p$: the Henyey-Greenstein phase function; similar to a BRDF for surfaces, this describes the scattering property of the volume
$\vec{\omega}_i'$: the refracted direction of the incoming photon $L_i$
$\vec{\omega}_o'$: the refracted eye ray direction
$G = |\vec{n}_{x_i} \cdot \vec{\omega}_o'| / |\vec{n}_{x_i} \cdot \vec{\omega}_i'|$: the geometry factor
$\sigma_t = \sigma_a + \sigma_s$: the extinction coefficient
$\sigma_{tc} = \sigma_t(x_o) + G\, \sigma_t(x_i)$: the combined extinction coefficient
$s_i'$: the refracted photon traveling distance
$s_o'$: the refracted eye ray traveling distance
As for the refracted photon distance $s_i'$, it is actually pretty difficult to sample correctly, since there is a specular refraction direction change between the sample point inside the medium and the light sample point. Jensen proposed an approximation of the true refracted distance using Snell's law:
$$s_i' = \frac{s_i\, |\vec{n}_{x_i} \cdot \vec{\omega}_i|}{\sqrt{1 - \left(\frac{1}{\eta}\right)^2 \left(1 - |\vec{n}_{x_i} \cdot \vec{\omega}_i|^2\right)}}$$
where $s_i$ is the distance from the sample point inside the medium to the light sample point, minus the part of this ray that is not inside the medium.
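For reference, here is what that approximation boils down to as a small standalone helper (a hypothetical sketch with a made-up name; in the actual implementation below this shows up inline as the diPrime computation):

// Hypothetical helper illustrating Jensen's Snell's-law approximation of
// the refracted photon distance; the real renderer computes this inline
// (the diPrime variable in LbssrdfSingle below).
// si      : in-medium distance from the sample point toward the light
// cosWiNi : |n_xi . wi|, the unrefracted light direction against the normal
// eta     : refraction index of the medium
float approximateRefractedDistance(float si, float cosWiNi, float eta) {
    float invEta = 1.0f / eta;
    return si * cosWiNi /
        sqrt(1.0f - invEta * invEta * (1.0f - cosWiNi * cosWiNi));
}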
Jensen also suggested importance sampling along the eye ray with $pdf(d) = \sigma_t e^{-\sigma_t d}$, since the attenuation along the medium is exponential. This 1D sampling was covered in a previous post, so we are pretty much ready to go!
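The exponentialSample / exponentialPdf helpers called in the code below presumably follow the standard inverse-CDF form of that pdf; a minimal sketch under that assumption (the actual implementations live in the sampling code from the earlier post):

// Sketch of 1D exponential-falloff sampling (assumed form of the helpers
// referenced below, not necessarily the exact renderer code).
// pdf(d) = sigma * exp(-sigma * d); inverting the CDF
// u = 1 - exp(-sigma * d) gives d = -log(1 - u) / sigma.
float exponentialSample(float u, float sigma) {
    return -log(1.0f - u) / sigma;
}

float exponentialPdf(float d, float sigma) {
    return sigma * exp(-sigma * d);
}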
The flowchart for single scattering will be:
1. when we intersect a BSSRDF material, change the ray direction with specular refraction (a sketch of what such a refraction helper can look like follows this list)
2. sample points along the refracted ray with a 1D exponential falloff
3. for each sample point from step 2, sample a light source and cast a shadow ray to see whether any geometry other than the current medium blocks the light (we need to be a little careful with the shadow ray length in this part)
4. approximate $s_i'$ with Snell's law, and evaluate the Fresnel terms and the phase function
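Goblin::specularRefract in step 1 is part of the existing renderer; for readers following along, a refraction helper in that spirit could look roughly like the sketch below (refractDir is a hypothetical name, reusing the post's Vector3/dot types; not the actual implementation):

// Hypothetical sketch of a specular refraction helper in the spirit of
// Goblin::specularRefract. wo points away from the surface; etaI/etaT are
// the refraction indices on the incident and transmitted sides.
Vector3 refractDir(const Vector3& wo, const Vector3& n,
        float etaI, float etaT) {
    float eta = etaI / etaT;
    float cosI = dot(wo, n);
    float sin2T = eta * eta * std::max(0.0f, 1.0f - cosI * cosI);
    if(sin2T >= 1.0f) {
        // total internal reflection, no refracted direction
        return Vector3(0.0f, 0.0f, 0.0f);
    }
    float cosT = sqrt(1.0f - sin2T);
    // Snell's law: the transmitted direction bends relative to the normal
    // according to the index ratio
    return -eta * wo + (eta * cosI - cosT) * n;
}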
Putting it into code:
Color Renderer::LbssrdfSingle(const ScenePtr& scene,
    const Fragment& fragment, const BSSRDF* bssrdf, const Vector3& wo,
    const Sample& sample,
    const BSSRDFSampleIndex* bssrdfSampleIndex,
    WorldDebugData* debugData) const {
    const Vector3& pwo = fragment.getPosition();
    const Vector3& no = fragment.getNormal();
    float coso = absdot(wo, fragment.getNormal());
    float eta = bssrdf->getEta();
    float Ft = 1.0f - Material::fresnelDieletric(coso, 1.0f, eta);
    float sigmaTr = bssrdf->getSigmaTr(fragment).luminance();
    Color scatter = bssrdf->getScatter(fragment);
    Color sigmaT = bssrdf->getAttenuation(fragment);
    Vector3 woRefract = Goblin::specularRefract(wo, no, 1.0f, eta);
    const vector<Light*>& lights = scene->getLights();
    Color Lsinglescatter(0.0f);
    // single scattering part
    for(uint32_t i = 0; i < bssrdfSampleIndex->samplesNum; ++i) {
        const BSSRDFSample bssrdfSample(sample, *bssrdfSampleIndex, i);
        // sample a distance with exponential falloff
        float d = exponentialSample(bssrdfSample.uSingleScatter, sigmaTr);
        Vector3 pSample = pwo + d * woRefract;
        float samplePdf = exponentialPdf(d, sigmaTr);
        // sample a light from pSample
        float pickLightPdf;
        int lightIndex = mPowerDistribution->sampleDiscrete(
            bssrdfSample.uPickLight, &pickLightPdf);
        const Light* light = lights[lightIndex];
        Vector3 wi;
        float epsilon, lightPdf;
        Ray shadowRay;
        Color L = light->sampleL(pSample, 1e-5f, bssrdfSample.ls,
            &wi, &lightPdf, &shadowRay);
        if(L == Color::Black || lightPdf == 0.0f) {
            continue;
        }
        // preserve the maxt since the next intersection test will modify
        // it to the closest intersection, but we still need this maxt to
        // cast the shadow ray from that intersection to the light
        float maxt = shadowRay.maxt;
        Intersection wiIntersect;
        // TODO we can use a material indexed lookup table to query
        // surfaces with this particular BSSRDF instead of doing a whole
        // scene intersection test
        if(scene->intersect(shadowRay, &epsilon, &wiIntersect)) {
            // if not, the pSample is out of the BSSRDF geometry already
            if(wiIntersect.getMaterial()->getBSSRDF() == bssrdf) {
                // update shadow ray to start from pwi
                const Fragment& fwi = wiIntersect.fragment;
                const Vector3& pwi = fwi.getPosition();
                const Vector3& ni = fwi.getNormal();
                shadowRay.mint = shadowRay.maxt + epsilon;
                shadowRay.maxt = maxt;
                if(!scene->intersect(shadowRay)) {
                    float p = bssrdf->phase(wi, woRefract);
                    float cosi = absdot(ni, wi);
                    float Fti = 1.0f -
                        Material::fresnelDieletric(cosi, 1.0f, eta);
                    Color sigmaTi = bssrdf->getAttenuation(fwi);
                    float G = absdot(ni, woRefract) / cosi;
                    Color sigmaTC = sigmaT + G * sigmaTi;
                    float di = length(pwi - pSample);
                    float et = 1.0f / eta;
                    float diPrime = di * absdot(wi, ni) /
                        sqrt(1.0f - et * et * (1.0f - cosi * cosi));
                    Lsinglescatter += (Ft * Fti * p * scatter / sigmaTC) *
                        expColor(-diPrime * sigmaTi) *
                        expColor(-d * sigmaT) * L /
                        (lightPdf * pickLightPdf * samplePdf);
                }
            }
        }
    }
    Lsinglescatter /= (float)bssrdfSampleIndex->samplesNum;
    return Lsinglescatter;
}
One more! In Jensen 01's final demo image, the marble bust, he added a Fresnel reflection pass. Though it's not strictly necessary, I feel that shiny sparkling highlight is pretty charming. Plus, we already have that in our BRDF, so I just plugged it directly into the new Subsurface material:
All the jigsaw pieces are in place now. Finally! Putting them all together, we have this shiny marble cutie :)
So how much benefit do we get from multiple importance sampling? That's actually the most exciting part! For comparison, I rendered another image sampling along the normal axis only, without multiple importance sampling, with the same 25 samples.
And the side by side comparison (left side no MIS, right side with MIS):
On the highly curvy surfaces, the MIS version does a pretty good job of removing the most noticeable variance noise, while on the relatively flat surfaces (the right side of the dragon's body) projecting all samples along the normal gives a better result. Kulla 13 did mention this: there is a performance loss on mostly flat surfaces, since the extra samples projected along dpdu/dpdv are wasted and intersect nothing. The algorithm generally removes the spiky high variance but introduces some relatively more uniform variance, which is easier to converge once the sample count gets cranked up.
Image taken from Kulla 13, explaining why MIS doesn't work particularly well on mostly flat surfaces