## COMP 236 Programming Assignment 2

Implement a triangle scan-conversion routine and a Z-buffer hidden-surface algorithm (the assignment).

### Algorithm

I implemented the algorithm in Matlab.

For "large" triangles (with all sides larger than 1 pixel), I interpolate all the values along all the scanlines as follows:

• sort the vertices by their Y coordinate, breaking ties by their X coordinate => p1, p2, p3
• generate the scanlines, i.e., the integer Y values along each edge p1p2, p1p3, p2p3
• compute the endpoints of the scanline spans for each Y, i.e., the corresponding X values
• interpolate the values of 1/Z, R, G, B for each (X, Y) pair on each of the three edges (here X is a floating-point value and Y an integer, so we are interpolating the endpoints of the spans)
• split the triangle into two triangles with one horizontal side each
• in each triangle, interpolate the values for 1/Z, R, G, B across scanlines and their corresponding spans (here both X and Y are integers, so we are sampling at the center of the pixels)
• clamp the color values to the valid color range
• find the indices where the interpolated 1/Z values are greater than those already stored in the Z-buffer passed in the input
• replace the 1/Z, R, G, B values at those pixels
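The steps above can be sketched as follows. This is an illustrative NumPy translation, not the original Matlab code: the function name and array layout are my own, and the explicit per-scanline loop here is exactly what the vectorized Matlab version avoids.

```python
import numpy as np

def rasterize_triangle(verts, attrs, invz_buf, color_buf):
    """Scanline-rasterize one "large" triangle with a Z-buffer test.
    verts: (3, 2) float array of (x, y) screen coordinates.
    attrs: (3, 4) per-vertex values [1/Z, R, G, B]."""
    H, W = invz_buf.shape
    # sort the vertices by Y, breaking ties by X => p1, p2, p3
    order = np.lexsort((verts[:, 0], verts[:, 1]))
    v, a = verts[order], attrs[order]

    def edge_at(p, q, ap, aq, y):
        # interpolate X and the attributes along edge p->q at scanline y
        t = (y - p[1]) / (q[1] - p[1])
        return p[0] + t * (q[0] - p[0]), ap + t * (aq - ap)

    y_top, y_bot = int(np.ceil(v[0, 1])), int(np.floor(v[2, 1]))
    for y in range(max(y_top, 0), min(y_bot, H - 1) + 1):
        # the triangle is split at the middle vertex into two flat-sided halves
        if v[1, 1] > v[0, 1] and y < v[1, 1]:
            xa, aa = edge_at(v[0], v[1], a[0], a[1], y)   # upper half
        elif v[2, 1] > v[1, 1]:
            xa, aa = edge_at(v[1], v[2], a[1], a[2], y)   # lower half
        else:
            continue                                      # degenerate edge
        xb, ab = edge_at(v[0], v[2], a[0], a[2], y)       # long edge p1p3
        (xl, al), (xr, ar) = sorted([(xa, aa), (xb, ab)], key=lambda e: e[0])
        xs = np.arange(max(int(np.ceil(xl)), 0),
                       min(int(np.floor(xr)), W - 1) + 1)
        if xs.size == 0:
            continue
        denom = xr - xl
        t = (xs - xl) / denom if denom > 0 else np.zeros(xs.size)
        span = al[None, :] + t[:, None] * (ar - al)[None, :]
        invz, rgb = span[:, 0], np.clip(span[:, 1:], 0.0, 1.0)  # clamp colors
        closer = invz > invz_buf[y, xs]       # strictly nearer 1/Z wins
        invz_buf[y, xs[closer]] = invz[closer]
        color_buf[y, xs[closer]] = rgb[closer]
```

Note that the Z-test uses a strict comparison; as discussed under Results, switching it to greater-or-equal changes which of two coincident surfaces wins.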

Note: the final two steps operate on the whole image. This is arguably slow for large images, yet in Matlab it is still faster than writing explicit loops over each scanline.
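As a toy sketch of that whole-image update (NumPy rather than Matlab, with made-up buffer sizes and a square "triangle" footprint purely for illustration): the Z-test and the write are just two array operations, with no per-scanline loop.

```python
import numpy as np

H, W = 4, 4
invz_buf = np.full((H, W), 0.5)        # existing Z-buffer, storing 1/Z
color_buf = np.zeros((H, W, 3))

# full-image arrays of the new triangle's interpolated values
# (zero 1/Z outside its footprint, so those pixels never pass the test)
tri_invz = np.zeros((H, W))
tri_invz[1:3, 1:3] = 0.8
tri_color = np.zeros((H, W, 3))
tri_color[1:3, 1:3] = [1.0, 0.0, 0.0]

mask = tri_invz > invz_buf             # find indices where 1/Z is greater
invz_buf[mask] = tri_invz[mask]        # replace 1/Z at those pixels
color_buf[mask] = tri_color[mask]      # replace R, G, B at those pixels
```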

### "Optimizations"

For "small" triangles (with at least one side shorter than 1 pixel), I simply compute the mean of all the values and use the resulting point as a "representative" for the triangle.

"Tiny" triangles ("small" triangles with area less than one pixel) are ignored.
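Both shortcuts together can be sketched like this (again an illustrative NumPy version with names of my own choosing, not the author's Matlab code): tiny triangles return immediately, and small ones write at most one representative pixel, still subject to the Z-test.

```python
import numpy as np

def rasterize_small_triangle(verts, attrs, invz_buf, color_buf):
    """Averaging shortcut for "small" triangles; "tiny" ones are skipped.
    verts: (3, 2) float (x, y); attrs: (3, 4) values [1/Z, R, G, B]."""
    (x1, y1), (x2, y2), (x3, y3) = verts
    area = 0.5 * abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1))
    if area < 1.0:
        return                            # "tiny": ignore entirely
    x, y = np.mean(verts, axis=0)         # representative point
    invz, *rgb = np.mean(attrs, axis=0)   # representative values
    r, c = int(round(y)), int(round(x))
    H, W = invz_buf.shape
    if 0 <= r < H and 0 <= c < W and invz > invz_buf[r, c]:
        invz_buf[r, c] = invz
        color_buf[r, c] = np.clip(rgb, 0.0, 1.0)
```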

### Results

Below are a few screenshots produced by my program. Besides generating random triangles, I wrote a program to read the models posted for a similar assignment at MIT. The models have no depth information, so I assigned the same depth to all points. This leads to odd results depending on whether replacement is also done for 1/Z values that are equal to the stored value, not only for those strictly greater (see the pictures of the weird cow and teapot below).

Screenshots (captions only; images not reproduced here):

• 10 random triangles
• the hex cone, rendered from model
• 100 random triangles, with R, G, B corners
• 100 random triangles with random colors
• the cow, rendered from model
• the same cow, now looking weird
• the teapot, rendered from model, with its spout looking weird
• the teapot, with its lid looking weird this time, but with the correct spout

Here are three movies showing the construction of the random, cow, and teapot models. Be sure to have your speakers on ;).

### Time model

At first I imagined a complicated time model, timing the subintervals for the various disjoint phases of the algorithm, but the regression did not work very well. Then I came back to the simple model:

#### Ttotal = Tsetup + Nlines · Tline + Npixels · Tpixel

This worked out well for the random model, which contains mostly large triangles (random vertices are usually far apart). However, for the cow and teapot models most triangles are small, and I got a negative coefficient for Tpixel. This is probably due to the large difference in running times between very small "large" triangles, which still get scan-converted, and "small" triangles, which are "optimized" by averaging. That difference introduces a non-linearity in the data that the regression wrongly approximates with a negative coefficient.
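Fitting the three coefficients is an ordinary linear least-squares problem. The sketch below uses synthetic timings generated from made-up "true" coefficients (roughly the order of magnitude of the random-model row below), purely so the fit can be checked; it is not the author's data or code.

```python
import numpy as np

# Fit Ttotal = Tsetup + Nlines*Tline + Npixels*Tpixel by least squares.
# The "measurements" are synthetic, generated from known coefficients.
true_coef = np.array([0.1, 5e-6, 3e-6])     # Tsetup, Tline, Tpixel (made up)
rng = np.random.default_rng(0)
n_lines = rng.integers(10, 500, size=50)    # scanlines per frame
n_pixels = rng.integers(100, 50_000, size=50)

# design matrix: one column per coefficient
A = np.column_stack([np.ones(50), n_lines, n_pixels])
t_total = A @ true_coef                     # noise-free synthetic timings

coef, *_ = np.linalg.lstsq(A, t_total, rcond=None)
t_setup, t_line, t_pixel = coef
max_err = np.max(np.abs(A @ coef - t_total))   # analogue of maxErr below
```

With real, noisy timings the same fit can produce physically meaningless values such as a negative Tpixel when the model is misspecified, which is exactly the failure mode described above.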

| Model  | Tsetup    | Tline     | Tpixel     | maxErr    |
|--------|-----------|-----------|------------|-----------|
| random | 1.054e-1  | 4.9241e-6 | 3.1194e-6  | 8.1845e-1 |
| cow    | 6.2452e-1 | 1.3882e-2 | -7.4002e-4 | 1.0893e0  |
| teapot | 6.0369e-1 | 8.4540e-3 | -2.5475e-4 | 9.2657e-1 |