## Assignment 2 - Stochastic Raytracing

## General Sampling Approach

Central to stochastic rendering is creating good random samples. As with other Monte Carlo methods, stratification greatly reduces variance. For each pixel I generate a set of jittered samples in the domain [0,1] x [0,1]. I use the samples directly as subpixel sample locations for eye rays. I reuse the samples for other dimensions, using a permutation table (also generated once per pixel) to eliminate correlation between samples.

I do not branch the ray tree. That means that if I want 64 shadow rays
I need to send 64 eye rays. This greatly simplifies the task of stratifying
samples. It also means that I can sample many different effects in the
same scene with very little increase in computation time. All of these
scenes use soft shadows and anti-aliasing. For toy scenes like those generated
for this assignment, a lot of rays are wasted sampling smooth areas. It
might be worth trying a multistage approach to stratification. For instance,
I could sample the image plane with 4 rays and then cast 16 more rays from
a single intersection point to evaluate soft shadows, glossy reflections,
etc. After the first intersection with the scene I would not branch the ray
tree and could continue to sample the higher dimensions as I do now.
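The sampling approach above can be sketched roughly as follows. This is a minimal illustration, not the actual raytracer code; the grid resolution and function names are my own:

```python
import random

def jittered_samples(n):
    """Generate n*n stratified ("jittered") samples in [0,1] x [0,1]:
    one uniform random point inside each cell of an n-by-n grid."""
    samples = []
    for i in range(n):
        for j in range(n):
            sx = (i + random.random()) / n
            sy = (j + random.random()) / n
            samples.append((sx, sy))
    return samples

def permuted(samples):
    """Return the same samples in shuffled order, so that sample k of one
    dimension (e.g. the light) is not always paired with sample k of
    another (e.g. the subpixel position)."""
    perm = list(range(len(samples)))
    random.shuffle(perm)
    return [samples[k] for k in perm]

# 64 eye rays per pixel; the same stratified set is reused (shuffled)
# for the shadow-ray dimension.
pixel_samples = jittered_samples(8)
light_samples = permuted(jittered_samples(8))
```

Because the ray tree is not branched, each of the 64 eye rays simply consumes one entry from each shuffled set, keeping every dimension stratified.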
## Soft Shadows

To generate soft shadows I create uniform samples on a convex polygonal light source, as described by Greg Turk in "Generating Random Points in Triangles" in *Graphics Gems*.
64 samples / pixel - 5 minutes
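A sketch of the sampling technique cited above, as I understand it: fan the convex polygon into triangles, pick one with probability proportional to its area, then map a unit-square sample onto it by folding. The 2D vertex representation here is a simplification for illustration:

```python
import random

def triangle_area(a, b, c):
    # Half the magnitude of the 2D cross product.
    return abs((b[0]-a[0]) * (c[1]-a[1]) - (c[0]-a[0]) * (b[1]-a[1])) / 2.0

def sample_triangle(a, b, c, u, v):
    """Map (u, v) in [0,1)^2 uniformly onto triangle abc."""
    if u + v > 1.0:
        # Fold the square's upper half back onto the lower triangle.
        u, v = 1.0 - u, 1.0 - v
    return (a[0] + u*(b[0]-a[0]) + v*(c[0]-a[0]),
            a[1] + u*(b[1]-a[1]) + v*(c[1]-a[1]))

def sample_convex_polygon(verts):
    """Uniformly sample a convex polygon by fanning it into triangles."""
    tris = [(verts[0], verts[i], verts[i+1]) for i in range(1, len(verts)-1)]
    areas = [triangle_area(*t) for t in tris]
    tri = random.choices(tris, weights=areas)[0]
    return sample_triangle(*tri, random.random(), random.random())
```

One shadow ray per eye ray is then traced toward each sampled point on the light.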
## Glossy Reflections

For glossy reflections I transform uniform random samples into a cosine lobe distribution centered around the reflection direction. The image shows a glossy plate split down the middle: the left side is glossy while the right side is perfectly reflective.
64 samples / pixel - 6 minutes
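A sketch of one common way to do this warp, assuming a Phong-style cosine-power lobe (the exponent controlling glossiness is my own parameterization, not necessarily what the renderer uses):

```python
import math

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def normalize(a):
    length = math.sqrt(a[0]**2 + a[1]**2 + a[2]**2)
    return (a[0]/length, a[1]/length, a[2]/length)

def sample_cosine_lobe(axis, exponent, u1, u2):
    """Warp (u1, u2) in [0,1)^2 to a unit vector distributed
    proportionally to cos(theta)^exponent about 'axis'."""
    cos_t = u1 ** (1.0 / (exponent + 1.0))
    sin_t = math.sqrt(max(0.0, 1.0 - cos_t * cos_t))
    phi = 2.0 * math.pi * u2
    local = (sin_t * math.cos(phi), sin_t * math.sin(phi), cos_t)
    # Build an orthonormal frame whose z-axis is the lobe axis.
    w = axis
    helper = (1.0, 0.0, 0.0) if abs(w[0]) < 0.9 else (0.0, 1.0, 0.0)
    u = normalize(cross(helper, w))
    v = cross(w, u)
    return tuple(local[0]*u[i] + local[1]*v[i] + local[2]*w[i]
                 for i in range(3))
```

For a glossy surface the axis is the mirror reflection direction; a high exponent concentrates the lobe and approaches a perfect mirror.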
## Translucency

This effect is generated using the same cosine lobe distribution as I used for reflection, except it is centered around the transmission direction. Once again, the left half of the plate is translucent while the right side is perfectly transmissive.
64 samples / pixel - 7 minutes
## Depth of Field

To create depth of field I choose a position on a virtual lens. I perturb the origin of the eye ray while constraining the direction such that the ray still passes through the same point at the focal length as the original ray. To generate samples on the lens I warp a uniform 2D random variable to a uniform disc distribution.
64 samples / pixel - 7 minutes
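The perturbation described above can be sketched as follows. This assumes, for illustration only, a camera looking down the z-axis with the lens in the xy-plane; the polar disc warp shown is one standard choice:

```python
import math

def uniform_disc(u1, u2):
    """Polar warp of [0,1)^2 onto the unit disc (area-uniform)."""
    r = math.sqrt(u1)
    theta = 2.0 * math.pi * u2
    return (r * math.cos(theta), r * math.sin(theta))

def thin_lens_ray(origin, direction, focal_dist, lens_radius, u1, u2):
    """Offset the eye-ray origin on the lens, then re-aim the ray at the
    point the original ray would hit on the focal plane."""
    focus = tuple(origin[i] + focal_dist * direction[i] for i in range(3))
    dx, dy = uniform_disc(u1, u2)
    new_origin = (origin[0] + lens_radius * dx,
                  origin[1] + lens_radius * dy,
                  origin[2])
    d = tuple(focus[i] - new_origin[i] for i in range(3))
    length = math.sqrt(sum(c * c for c in d))
    return new_origin, tuple(c / length for c in d)
```

Points at the focal distance receive all samples at the same location and stay sharp; points nearer or farther are smeared over the lens aperture.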
## Texture Mapping

Texture mapping significantly enhances the realism of a scene. The image shows a texture-mapped sphere on a plane textured with a procedural checker/marble shader. I do not filter the texture look-ups at all, so some pixelation is noticeable, especially near the crown of the parrot's head. The procedural texture is fairly expensive to evaluate; while geometrically very simple, this scene took the longest to generate.
64 samples / pixel - 19 minutes
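The checker component of a shader like the one described above is simple to sketch; the colors, scale, and function signature here are illustrative (the marble component, typically noise-based, is omitted):

```python
import math

def checker(u, v, scale=1.0, c0=(0.9, 0.9, 0.9), c1=(0.1, 0.1, 0.1)):
    """Return color c0 or c1 depending on which square of a checkerboard
    the (u, v) coordinate falls in. 'scale' sets squares per unit."""
    parity = (math.floor(u * scale) + math.floor(v * scale)) % 2
    return c0 if parity == 0 else c1
```

Because look-ups like this are point-sampled with no filtering, distant or oblique squares alias; the per-pixel jittered eye rays reduce but do not eliminate that.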
## Stochastic Cow Effect

This image shows cow particles arranged stochastically around a teapot primitive. The models are triangle meshes, and I use grids with uniform hierarchical voxels to accelerate intersection tests. Notice the subtle shadowing at the base of the objects, which is due to a very large area light source used to simulate skylight.
36 samples / pixel - 12 minutes
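The construction step of such a grid can be sketched as follows: each triangle is inserted into every voxel its bounding box overlaps, so a ray later needs to test only the triangles in the voxels it crosses. This is a simplified illustration, not the actual accelerator, and the resolution parameter is hypothetical:

```python
from collections import defaultdict

def build_grid(triangles, bounds_min, bounds_max, res):
    """Map each triangle into the voxels overlapped by its bounding box.
    'triangles' is a list of 3-vertex tuples; returns voxel -> triangle ids."""
    cells = defaultdict(list)
    ext = [bounds_max[i] - bounds_min[i] for i in range(3)]

    def cell_of(p):
        # Clamp to the grid so boundary points map to valid voxels.
        return tuple(min(res - 1, max(0, int((p[i] - bounds_min[i]) / ext[i] * res)))
                     for i in range(3))

    for tri_id, tri in enumerate(triangles):
        lo = cell_of(tuple(min(v[i] for v in tri) for i in range(3)))
        hi = cell_of(tuple(max(v[i] for v in tri) for i in range(3)))
        for x in range(lo[0], hi[0] + 1):
            for y in range(lo[1], hi[1] + 1):
                for z in range(lo[2], hi[2] + 1):
                    cells[(x, y, z)].append(tri_id)
    return cells
```

Traversal then walks the ray voxel by voxel (e.g. with a 3D DDA), stopping at the first voxel containing a confirmed hit.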
## Code

This code is in constant evolution; I have been playing with it for some time now, seeking a satisfactory design. I recently encountered an unpublished manuscript for a book called *A Literate Raytracer* by Matt Pharr, Greg Humphreys, and Pat Hanrahan that has a design I really like, and I have adopted portions of it. I will continue refining the code over the next few projects.