I don't think so. From the lecture notes, it sounds like Square's renderer is more than plain-vanilla raytracing.
Raytracing is a technique that involves shooting one ray for each pixel (or more, for smoother output) from the camera into the scene. If the ray hits an object, the renderer determines, from surface properties and the locations of lights and other objects, what color that pixel should be.
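To make the "one ray per pixel" idea concrete, here's a minimal sketch of that loop in Python. Everything in it (the single sphere, the camera at the origin, the light direction, Lambert shading) is invented for illustration, not any particular renderer's code:

```python
import math

WIDTH, HEIGHT = 8, 8
CENTER, RADIUS = (0.0, 0.0, 3.0), 1.0      # one sphere in front of the camera
LIGHT = (0.577, 0.577, -0.577)             # normalized direction toward the light

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def hit_sphere(origin, d):
    # Solve |origin + t*d - CENTER|^2 = RADIUS^2 for the nearest t > 0.
    oc = tuple(origin[i] - CENTER[i] for i in range(3))
    b = 2.0 * dot(oc, d)
    c = dot(oc, oc) - RADIUS * RADIUS
    disc = b * b - 4.0 * dot(d, d) * c
    if disc < 0:
        return None                         # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2.0 * dot(d, d))
    return t if t > 0 else None

def render():
    image = []
    for y in range(HEIGHT):
        row = []
        for x in range(WIDTH):
            # One ray per pixel, from the camera through the image plane at z=1.
            px = (x + 0.5) / WIDTH * 2.0 - 1.0
            py = 1.0 - (y + 0.5) / HEIGHT * 2.0
            d = (px, py, 1.0)
            t = hit_sphere((0.0, 0.0, 0.0), d)
            if t is None:
                row.append(0.0)             # background: nothing hit
            else:
                hit = tuple(t * d[i] for i in range(3))
                n = tuple((hit[i] - CENTER[i]) / RADIUS for i in range(3))
                row.append(max(0.0, dot(n, LIGHT)))  # simple Lambert shading
        image.append(row)
    return image
```

"More rays for smoother output" is just supersampling: shoot several jittered rays per pixel and average the results.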
Radiosity is another beast entirely, and takes reflection into account. If you have a white tabletop, place a bright green box on it, and shine a bright light on them, the tabletop gets a "green" patch where the light reflected off of the box falls. Plain raytracing won't capture this. Most big commercial applications offer radiosity, often in combination with raytracing algorithms, and some are dedicated to the task.
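That "green patch" is diffuse interreflection, and the arithmetic behind it is simple: light that bounces off the box is tinted by the box's reflectance before it reaches the table. A toy one-bounce calculation, with all reflectance numbers invented for illustration:

```python
WHITE_LIGHT = (1.0, 1.0, 1.0)
GREEN_BOX = (0.1, 0.9, 0.1)    # the box reflects mostly green
TABLE = (0.9, 0.9, 0.9)        # near-white diffuse tabletop

def mul(a, b):
    return tuple(x * y for x, y in zip(a, b))

# Light that bounced off the box before reaching the table is green-tinted.
bounced = mul(WHITE_LIGHT, GREEN_BOX)

# A table patch near the box sees direct light plus the bounced contribution
# (scaled by 0.5 here as a stand-in for the geometric form factor).
direct = mul(WHITE_LIGHT, TABLE)
indirect = tuple(0.5 * c for c in mul(bounced, TABLE))
patch_near_box = tuple(d + i for d, i in zip(direct, indirect))
```

A real radiosity solver does exactly this, but for every pair of surface patches in the scene at once, which is why it's so expensive.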
Now Square's renderer works with photon mapping, which is essentially raytracing with an extra rendering pass that computes how much each surface is illuminated by reflection from other surfaces. This method is parallel-friendly, is much more time-efficient than radiosity, and the results are comparable.
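The two-pass structure is the key idea: first scatter photons from the lights and record where they land, then, while raytracing, estimate indirect illumination by gathering stored photons near each shading point. A heavily simplified 1D sketch of just that idea (the Gaussian photon distribution, the gather radius, and all constants are made up, and a real photon map uses a kd-tree rather than a linear scan):

```python
import random

def emit_photons(n, light_x=0.0, spread=1.0, seed=42):
    """Pass 1: trace photons from the light; store (landing position, power)."""
    rng = random.Random(seed)
    return [(rng.gauss(light_x, spread), 1.0 / n) for _ in range(n)]

def radiance_estimate(photon_map, x, radius=0.5):
    """Pass 2: during raytracing, sum the power of photons within `radius`
    of the shading point and divide by the gathered area."""
    power = sum(p for pos, p in photon_map if abs(pos - x) <= radius)
    return power / (2.0 * radius)

photons = emit_photons(10_000)
near = radiance_estimate(photons, 0.0)   # directly under the light
far = radiance_estimate(photons, 3.0)    # well away from it
```

Because pass 1 is just tracing independent photons and pass 2 is just tracing independent camera rays, both distribute across machines easily, which fits the parallel-friendly claim.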
Kilauea is NOT, however, just plain old raytracing. :)