Project 5

Anti-aliasing through Signal Filters

Required Images:


This is a sinc function in two dimensions, sampled regularly at the middle of each pixel and not filtered in any way. It represents the baseline for this assignment. Note the graphical artifacts in the image.

Same image, with 9 jittered samples per pixel and a triangle filter with support of two pixels.

Same image again with 9 jittered samples per pixel and a Hanning windowed sinc filter.

From left to right, the following images represent a scene rendered with the same filtering/sampling options used in the above three images.

The aliasing is painfully visible in the leftmost image. The two on the right take care of the problem somewhat.

Performance evaluation here

Design choices:

A new abstract base class appears this week: Filter. The filter interface is very simple. It consists of a constructor that takes an integer specifying the pixel support; a weight method that, given two doubles, returns the filter's weight at a point whose x and y distances from the filter's center are the values passed in; and a selector method that returns the filter's support width.
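Concretely, the interface might look like the sketch below. Everything beyond the names Filter, weight, and the notion of a support selector is my own guess at the details, with the triangle filter from the second image included as a concrete subclass:

```cpp
#include <algorithm>
#include <cmath>

// Sketch of the Filter interface described above; member names other
// than weight() are assumptions, not the project's actual code.
class Filter {
public:
    explicit Filter(int support) : support_(support) {}
    virtual ~Filter() = default;

    // Weight of the filter at a point whose x and y distances from
    // the filter's center are dx and dy.
    virtual double weight(double dx, double dy) const = 0;

    // Selector for the support width, in pixels.
    int support() const { return support_; }

private:
    int support_;
};

// A separable triangle (tent) filter, as used for the second image:
// weight falls off linearly to zero at half the support radius.
class TriangleFilter : public Filter {
public:
    explicit TriangleFilter(int support) : Filter(support) {}

    double weight(double dx, double dy) const override {
        double r = support() / 2.0;
        double wx = std::max(0.0, 1.0 - std::fabs(dx) / r);
        double wy = std::max(0.0, 1.0 - std::fabs(dy) / r);
        return wx * wy;
    }
};
```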

The other design choice this week was how to generate subpixel samples when they are needed. I chose a generator design pattern for this task. Generator is an abstract base class with two methods, next and reset. The constructor takes a parameter specifying how many samples to generate, while the concrete subclass of Generator determines what sort of samples are generated. My code currently contains three types of generators: Regular, Jittered, and Random. The regular generator produces uniformly spaced samples in a square configuration (the constructor parameter is the side length of this square). The jittered generator works similarly, except that each sample is randomly placed within the confines of its particular cell of the pixel. The random generator simply generates a number of samples anywhere within the pixel, without regard for which cell each one winds up in.
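A sketch of the generator hierarchy, showing the jittered variant; the Vector and Stop types and all member names besides next and reset are my assumptions:

```cpp
#include <cstdlib>

// Minimal stand-in for the project's Vector type.
struct Vector { double x, y; };

// Thrown by next() when the samples are exhausted.
struct Stop {};

class Generator {
public:
    explicit Generator(int n) : n_(n), i_(0) {}
    virtual ~Generator() = default;
    virtual Vector next() = 0;
    virtual void reset() { i_ = 0; }

protected:
    int n_;  // total number of samples
    int i_;  // index of the next sample
};

// Jittered sampling: one random sample per cell of a side x side grid
// covering the unit pixel.
class Jittered : public Generator {
public:
    explicit Jittered(int side) : Generator(side * side), side_(side) {}

    Vector next() override {
        if (i_ >= n_) throw Stop();
        int row = i_ / side_, col = i_ % side_;
        ++i_;
        // Random offsets in [0, 1) within the current cell.
        double jx = std::rand() / (RAND_MAX + 1.0);
        double jy = std::rand() / (RAND_MAX + 1.0);
        return Vector{ (col + jx) / side_, (row + jy) / side_ };
    }

private:
    int side_;
};
```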

The generators return a Vector object with each call to next(). When the last sample has been generated and the user calls next() again, a Stop exception is thrown. To use a generator properly, then, the user embeds the call to next(), together with the code that processes the sample, in an infinite loop that is itself wrapped in a try block. The corresponding catch block simply catches the Stop exception and resets the generator. The infinite loop has by that point been broken out of, so processing can continue.
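The loop idiom above might look like the following self-contained sketch; the tiny Regular-style generator and the collect helper are mine, here only to demonstrate the try/catch pattern:

```cpp
#include <vector>

struct Vector { double x, y; };
struct Stop {};

// Just enough of a regular generator to drive the loop idiom:
// samples at the center of each cell of a side x side grid.
class Regular {
public:
    explicit Regular(int side) : side_(side), i_(0) {}

    Vector next() {
        if (i_ >= side_ * side_) throw Stop();
        int row = i_ / side_, col = i_ % side_;
        ++i_;
        return Vector{ (col + 0.5) / side_, (row + 0.5) / side_ };
    }

    void reset() { i_ = 0; }

private:
    int side_, i_;
};

// The usage idiom from the text: an infinite loop inside a try block,
// with the catch clause resetting the generator for later reuse.
std::vector<Vector> collect(Regular& gen) {
    std::vector<Vector> samples;
    try {
        for (;;) {
            samples.push_back(gen.next());  // process each sample
        }
    } catch (const Stop&) {
        gen.reset();  // exhausted: put the generator back in its start state
    }
    return samples;
}
```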

I also begat a disgusting hack which is nonetheless kind of cool. The images of the sinc function I generated for this assignment came from a new material and object I created specifically for this task. The object is called Function and is actually an abstract base class that is itself descended from the Plane object. Its interface contains an eval method taking two parameters, which evaluates the eponymous function at that point in the coordinate plane. Intersection with a Function object works the same as for a plane. Applying a function material to this object, however, yields a different result. When the image ray strikes a Function object before any other object in the scene, the function material accomplishes its shading simply by calling the Function object's eval method, then interpolating between two colors (specified at construction time) based on how far the function value lies between two extreme values (also specified at construction time). The result is a ray traced plane with a function image pasted on it.
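The shading step can be sketched like this. The Color type, the shade helper, and the lo/hi parameter names are my inventions; only the interpolate-between-two-extremes idea and the 2-D sinc come from the description above:

```cpp
#include <cmath>

struct Color { double r, g, b; };

// Linear interpolation between two colors by t in [0, 1].
Color lerp(const Color& a, const Color& b, double t) {
    return Color{ a.r + t * (b.r - a.r),
                  a.g + t * (b.g - a.g),
                  a.b + t * (b.b - a.b) };
}

// Sketch of the function-material shading: evaluate f at the hit
// point, map the value between the extremes lo and hi onto [0, 1],
// and interpolate between the two construction-time colors.
Color shade(double (*f)(double, double), double x, double y,
            double lo, double hi, const Color& c0, const Color& c1) {
    double t = (f(x, y) - lo) / (hi - lo);
    if (t < 0.0) t = 0.0;
    if (t > 1.0) t = 1.0;
    return lerp(c0, c1, t);
}

// The 2-D sinc used for the images above: sin(r)/r, with r the
// distance from the origin.
double sinc2d(double x, double y) {
    double r = std::sqrt(x * x + y * y);
    return r == 0.0 ? 1.0 : std::sin(r) / r;
}
```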

I realized only later that I don't need a separate Function object type, and that having the function material is sufficient. I could (probably) apply the function material to other objects and achieve something akin to texture mapping. The reason I call this a hack is that the design is (probably) wrong; but since it actually works for now, I'll leave well enough alone.

The reason I implemented this is that, after I integrated filtering and sampling into my scene's rendering code, I wanted a simple way to generate the images for the assignment. I'm rather pleased with my experiment's results.