Thursday, January 22, 2015

BSSRDF Importance Sampling 2 - Dipole Diffusion Approximation

Super quick review of the dipole diffusion approximation model.
Compare it to the classic BRDF integration:

L(x_o, \omega_o) = L_e(x_o, \omega_o) + \int_\Omega L_i(x_o, \omega_i)\, f_r(\omega_i, \omega_o)\, \cos\theta\, d\omega

This is the BSSRDF integration:

L(x_o, \omega_o) = \int_A \int_\Omega L_i(x_i, \omega_i)\, S(x_i, \omega_i, x_o, \omega_o)\, \cos\theta\, d\omega\, dA


In short, to evaluate the radiance coming out of surface point x_o in direction ω_o, we need to integrate over the whole surface area (the ∫_A … dA part); at each surface point x_i, we integrate over the hemisphere (the ∫_Ω … dω part) to determine, for each incoming radiance L_i(x_i, ω_i), how much of it arrives at point x_o in direction ω_o (the S(x_i, ω_i, x_o, ω_o) part).
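To make that double integral a bit more concrete, here is the textbook Monte Carlo estimator for it (a generic sketch, not code from this project), assuming we can draw surface points x_j with an area pdf p_A and directions ω_j with a solid-angle pdf p_ω:

L(x_o, \omega_o) \approx \frac{1}{N} \sum_{j=1}^{N} \frac{L_i(x_j, \omega_j)\, S(x_j, \omega_j, x_o, \omega_o)\, \cos\theta_j}{p_A(x_j)\, p_\omega(\omega_j)}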

The BSSRDF S(x_i, ω_i, x_o, ω_o) is a function of four arguments (two surface points and two directions), but most of the time it gets simplified to a radially symmetric form and becomes a 1D distance function R_d(d), where d is the distance between x_o and x_i.

The dipole diffusion approximation makes the assumptions that:
-the light distribution in highly scattering media tends to become isotropic. (My understanding of this part is that when light enters a medium with a really high albedo, after multiple scattering events happen, the scattered light distribution becomes more and more uniform, approaching 1/4π in the end)
-the surface is a semi-infinite flat slab, since for finite media there is no general way to solve the diffusion equation analytically.

By putting two point light sources, one above and one below the medium boundary (that's why it's called the dipole method), we can get an analytic solution for the fluence on the semi-infinite flat surface (OK...."we" does not include "me" at this moment......I confess...... Donner's 2006 thesis, Chapter 5, contains a detailed derivation of this, while I just grab the resulting equation directly...)

R_d(r) = \frac{\alpha'}{4\pi} \left[ z_r\,(\sigma_{tr} d_r + 1)\,\frac{e^{-\sigma_{tr} d_r}}{d_r^3} + z_v\,(\sigma_{tr} d_v + 1)\,\frac{e^{-\sigma_{tr} d_v}}{d_v^3} \right]


After a topological sort on those scary symbols:
σ_a: the absorption coefficient, describing how much radiance gets absorbed in the medium
σ_s: the scattering coefficient, describing how much radiance gets scattered in the medium
g: the anisotropy parameter (ranging from -1 to 1; -1 for fully backward scattering, 1 for fully forward scattering, 0 for isotropic scattering)
σ'_s = σ_s(1 - g): the reduced scattering coefficient
σ'_t = σ'_s + σ_a: the reduced extinction coefficient
α' = σ'_s / σ'_t: the reduced albedo
σ_tr = sqrt(3 σ_a σ'_t): the effective transport coefficient
F_r: the Fresnel formula for the reflection at a dielectric surface
F_dr = (1/π) ∫_Ω F_r(η, n·ω)(n·ω) dω: the average diffuse Fresnel reflectance
A = (1 + F_dr) / (1 - F_dr): the coefficient that takes into account the effect of internal diffuse reflection
D = 1/(3σ'_t): the diffusion constant
z_r = 1/σ'_t: the distance beneath the surface where we put the positive dipole light
z_v = z_r + 4AD: the distance above the surface where we put the negative dipole light
d_r = ‖x - x_r‖: the distance from x to the positive dipole light
d_v = ‖x - x_v‖: the distance from x to the negative dipole light
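One small algebra step worth noting, since it shows up in the code below: with D = 1/(3σ'_t) = z_r/3, the virtual source depth can be rewritten as z_v = z_r + 4AD = z_r(1 + 4A/3).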

Hopefully this dipole image illustrates the idea a little more clearly...

At last, we take into account the Fresnel transmission at the boundary for both the incoming light and the outgoing light (the Fresnel transmittance F_t = 1 - F_r), and we get:

S_d(x_i, \omega_i, x_o, \omega_o) = \frac{1}{\pi}\, F_t(\eta, \omega_i)\, R_d(\lVert x_i - x_o \rVert)\, F_t(\eta, \omega_o)
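As a minimal sketch of how the pieces plug together (the names fresnelDielectric and Sd here are made up for illustration, not from this code base, and the profile Rd is passed in as a plain callback):

#include <algorithm>
#include <cmath>

// Hypothetical helper: unpolarized Fresnel reflectance for a dielectric
// boundary, with cosThetaI measured on the incident side and
// eta = etaTransmitted / etaIncident.
inline float fresnelDielectric(float cosThetaI, float eta) {
    cosThetaI = std::min(std::max(cosThetaI, 0.0f), 1.0f);
    float sinThetaT2 = (1.0f - cosThetaI * cosThetaI) / (eta * eta);
    if (sinThetaT2 >= 1.0f) {
        return 1.0f; // total internal reflection
    }
    float cosThetaT = std::sqrt(1.0f - sinThetaT2);
    float rParl = (eta * cosThetaI - cosThetaT) / (eta * cosThetaI + cosThetaT);
    float rPerp = (cosThetaI - eta * cosThetaT) / (cosThetaI + eta * cosThetaT);
    return 0.5f * (rParl * rParl + rPerp * rPerp);
}

// Sketch of the full Sd: Fresnel transmittance in, diffusion profile Rd,
// Fresnel transmittance out, with the 1/pi normalization from the equation
// above. Rd stands in for whatever profile evaluation you have.
inline float Sd(float cosThetaI, float cosThetaO, float distance, float eta,
                float (*Rd)(float)) {
    float Fti = 1.0f - fresnelDielectric(cosThetaI, eta);
    float Fto = 1.0f - fresnelDielectric(cosThetaO, eta);
    return (1.0f / 3.14159265f) * Fti * Rd(distance) * Fto;
}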

Image taken from GPU Gems 3, showing that the BSSRDF falls off pretty quickly as the distance between x_o and x_i increases. We can also see that the red channel tends to persist longer before fully dying out, which sort of explains why human skin has that reddish tint underneath.

The diffuse Fresnel reflectance F_dr is an integral that can be approximated with the following code snippet. The polynomial fit was taken from Donner 06 too:

inline float BSSRDF::Fdr(float eta) {
    // see Donner. C 2006 Chapter 5
    // the internal Fresnel reflectivity
    // approximated with a simple polynomial expansion
    if(eta < 1.0f) {
        return -0.4399f + 0.7099f / eta - 0.3319f / (eta * eta) +
            0.0636f / (eta * eta * eta);
    } else {
        return -1.4399f / (eta * eta) + 0.7099f / eta + 0.6681f +
            0.0636f * eta;
    }
}
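As a quick sanity check (just plugging numbers into the polynomial above): for a skin-like index of refraction η = 1.3, the η > 1 branch gives F_dr ≈ -1.4399/1.69 + 0.7099/1.3 + 0.6681 + 0.0636·1.3 ≈ 0.44, so A = (1 + F_dr)/(1 - F_dr) ≈ 2.6.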
YUUUP.....we've got the diffusion approximation part of the BSSRDF, S_d(x_i, ω_i, x_o, ω_o), now, and it looks pretty mathematically daunting (again, for me....). So I encapsulate all this stuff into the BSSRDF black box, and hopefully I don't need to see it most of the time :P The constructor takes σ_a, σ'_s, the refraction index η, and the anisotropy parameter g; the rest of the stuff it can cook up in-house. Something like this:

BSSRDF::BSSRDF(const ColorTexturePtr& absorb,
    const ColorTexturePtr& scatterPrime, float eta, float g):
    mAbsorb(absorb), mScatterPrime(scatterPrime),
    mEta(eta), mG(g) {
    float fdr = Fdr(eta);
    mA = (1.0f + fdr) / (1.0f - fdr);
}

Color BSSRDF::Rd(const Fragment& fragment, float d2) const {
    // see Donner. C 2006 Chapter 5 for the full derivation
    // of the following diffusion dipole approximation equation
    Color sigmaA = mAbsorb->lookup(fragment);
    Color sigmaSPrime = mScatterPrime->lookup(fragment);
    Color sigmaTPrime = sigmaA + sigmaSPrime;
    Color sigmaTr = sqrtColor(3.0f * sigmaA * sigmaTPrime);
    Color one(1.0f);
    Color zr = one / sigmaTPrime;
    // zv = zr + 4AD where D = 1/(3 * sigmaT') = zr / 3
    Color zv = zr * (1.0f + 4.0f / 3.0f * mA);
    Color dr = sqrtColor(zr * zr + Color(d2));
    Color dv = sqrtColor(zv * zv + Color(d2));
    Color alphaPrime = sigmaSPrime / sigmaTPrime;
    Color sTrDr = sigmaTr * dr;
    Color sTrDv = sigmaTr * dv;
    Color rd = 0.25f * INV_PI * alphaPrime * (
        (zr * (one + sTrDr) * expColor(-sTrDr) / (dr * dr * dr)) +
        (zv * (one + sTrDv) * expColor(-sTrDv) / (dv * dv * dv)));
    return clampColor(rd);
}
OK! That pretty much sums up TODO list item 1; now let's move on to item 2: a separate integration entry point for the BSSRDF. This just opens up another cut-in point for some future implementation, which is pretty straightforward for a small side-project-scale code base:

Color Renderer::Lsubsurface(const ScenePtr& scene,
    const Intersection& intersection, const Vector3& wo,
    const Sample& sample,
    const BSSRDFSampleIndex* bssrdfSampleIndex) const {
    const MaterialPtr& material =
        intersection.primitive->getMaterial();
    const BSSRDF* bssrdf = material->getBSSRDF();
    const vector<Light*>& lights = scene->getLights();
    if(bssrdf == NULL || lights.size() == 0) {
        return Color(0.0f);
    }
    const Fragment& fragment = intersection.fragment;
    Color Lsinglescatter = LbssrdfSingle(scene, fragment, bssrdf,
        wo, sample, bssrdfSampleIndex);
    // multiple scattering part with diffusion approximation
    Color Lmultiscatter = LbssrdfDiffusion(scene, fragment, bssrdf,
        wo, sample, bssrdfSampleIndex);
    return Lsinglescatter + Lmultiscatter;
}
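LbssrdfSingle and LbssrdfDiffusion themselves are material for later posts. Just to give a rough idea of the shape of the diffusion term, here is a heavily simplified, hypothetical sketch (made-up types and names, single channel, plain Monte Carlo over a disc around x_o, with the Fresnel terms and the 1/π from S_d omitted):

#include <cmath>
#include <vector>

// One gathered sample around xo: its distance to xo and the irradiance
// collected there (single channel to keep the sketch short).
struct AreaSample {
    float distance;
    float irradiance;
};

// Estimate the area integral  integral over A of Rd(||x - xo||) * E(x) dA
// with uniform samples on a disc of radius maxRadius centered at xo.
// Rd stands in for whatever diffusion profile evaluation you have.
float estimateMultipleScatter(const std::vector<AreaSample>& samples,
                              float maxRadius, float (*Rd)(float)) {
    if (samples.empty()) {
        return 0.0f;
    }
    // pdf of picking a point uniformly on the disc
    const float pdf = 1.0f / (3.14159265f * maxRadius * maxRadius);
    float sum = 0.0f;
    for (const AreaSample& s : samples) {
        sum += Rd(s.distance) * s.irradiance / pdf;
    }
    return sum / static_cast<float>(samples.size());
}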
Done!

One particular thought popped into my mind while reading Jensen01: "For the sake of god!!! To get something that just falls off radially symmetrically, you really need to be that hardcore? I mean....there should be some much easier symmetric falloff model you can use, right? Like a Gaussian distribution maybe?" The fun fact is: yes! There is already research on using a weighted sum of Gaussians to approximate the BSSRDF, and it can get really close to the dipole model. This GPU Gems 3 article mentions the idea and gives you the weighted-sum parameters directly for a skin BSSRDF. If you are interested in the full story under the hood, d'Eon11 "A Quantized-Diffusion Model for Rendering Translucent Materials" is a cool ride on it.

Image taken from GPU Gems 3; we can see that the 4-Gaussian weighted sum is already really close to the dipole model.
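As a rough sketch of that idea (the weights and variances are up to whatever fit you use; nothing below is the actual fitted skin data from GPU Gems 3 or d'Eon's paper):

#include <cmath>
#include <vector>

// A normalized 2D Gaussian profile with variance v:
// G(v, r) = exp(-r^2 / (2v)) / (2 * pi * v), which integrates to 1 over the plane.
inline float gaussian2D(float v, float r) {
    return std::exp(-r * r / (2.0f * v)) / (2.0f * 3.14159265f * v);
}

// Sum-of-Gaussians approximation of a diffusion profile:
// Rd(r) ~= sum_i w[i] * G(v[i], r).
// The weights w and variances v come from a separate fitting step (see the
// GPU Gems 3 article for actual skin values); placeholders are fine for testing.
float RdGaussianSum(float r, const std::vector<float>& w,
                    const std::vector<float>& v) {
    float rd = 0.0f;
    for (size_t i = 0; i < w.size() && i < v.size(); ++i) {
        rd += w[i] * gaussian2D(v[i], r);
    }
    return rd;
}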


2 comments:

  1. Hi Wei-Feng,

    I am a Visual Basic programmer, I don't know much C++.

    I have written a graphics application in VB. It does all the basic stuff and global illumination with photon mapping and final gathering.

    I am trying to implement subsurface scattering now. I have read the Jensen and Donner papers, but I am stuck at how to implement Jensen's diffusion DIPOLE in code.

    I have coded the Rd(r) expression, but I am at a loss on how to integrate the area around the point I am rendering.

    I looked at your code in C++ to get some ideas, but I still need help. I am rendering a point, how to go about integrating the area around it to get sss with Jensen's dipole? Christensen suggests going inside the surface and spherically shoot sampling rays and look at the illumination at the ray intersections. So I have a bunch of points with illumination close to the point I am rendering, how to go from there and apply Jensen's diffuse dipole? Can you help me? I hope you will have time for that, I can be reached at my email radiomilo@sodetel.net.lb. Hope to hear from you! Thank you. Camil.

  2. Hi, to integrate samples around the area, there are two mainstream methods: Monte Carlo or density estimation. If you have a bunch of points with illumination close to the point you are rendering, you can probably try density estimation. The rough idea is just to divide the sum of the illumination points (each multiplied by Rd(r)) by pi*r^2 to get the result. The dipole integrator in pbrt-v2 is a good example of this kind of density estimation method (you probably need a spatial accelerator like a kd-tree for fast neighbor queries during render time though):
    https://github.com/mmp/pbrt-v2/blob/master/src/integrators/dipolesubsurface.cpp
