QCgravlens is a gravitational lens simulation based on Apple's Quartz Composer environment. The software warps the image from your Mac's webcam in real time, analogously to the lens RXJ1131-1231, discovered by Sluse et al. (2003). In this particular lens system, a galaxy deflects the light of a background quasar and its host galaxy. This leads to the formation of four separate images of the quasar, and a spectacular Einstein ring.
Requirements: an Intel-based Mac running Mac OS X 10.5/10.6/10.7, and its internal iSight or an external webcam.
If an external iSight or webcam is connected, it is automatically selected as the video source. For public outreach events I use a Unibrain FireWire camera resting on a table, to make a clear distinction between observer, deflector, and source.
The idea of such a webcam simulation is not mine! Thanks to a team from the University of Tübingen (I think... they wrote an article about a very similar black hole simulation, maybe around 2000, in Sterne und Weltraum or Astronomie Heute, but to my shame I didn't manage to find it again!), to Matthew Turk (author of the similar and better-packaged GLOYD, which uses simply parametrized lenses directly in QC), and to Ute Kraus & Corvin Zahn from Uni Hildesheim, authors of spacetimetravel.org.
Special thanks to Dominique Sluse for providing me with the lens model of RXJ1131, and to Chuck Keeton et al. for releasing gravlens, the software used to compute the deflection map. The standalone app above is built using QuartzBuilder (I first thought I would build something custom, but let's face it, no time for that...).
Here is the "source": QCgravlens_source.tgz (1.5 MB). To play with the QC part, simply open QCgravlens.qtz (you'll need the developer tools installed).
The deflection map is precalculated with gravlens. The plotgrid command saves the mapping between the source and image planes, on a specific polar grid, to a text file. This can of course be done with lens models of arbitrary complexity. Sample input file for gravlens: startup_RXJ1131.in .
The next step is to interpolate this mapping to generate a pixel-to-pixel correspondence (on a cartesian grid) from the image plane to the source plane. This is done in Python (see interpdef.py) with matplotlib's griddata (natural-neighbor interpolation). The script writes the deflection maps into two png files, xdef.png and ydef.png, as image files are well suited to being handled in Quartz Composer. For each pixel location in the image plane, the pixel values in xdef.png and ydef.png give the x and y location of the source pixel to use. As a png channel can hold only 256 different values (0-255), the red and green channels are combined: x = G*100 + R.
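The channel encoding can be sketched as follows (a toy pure-Python version; the actual interpdef.py works on whole arrays, and these function names are made up for the sketch):

```python
def encode_coord(x):
    """Split a source-pixel coordinate into (red, green) channel values,
    each fitting in one 8-bit png channel."""
    if not (0 <= x < 100 * 256):
        raise ValueError("coordinate out of range for this encoding")
    return x % 100, x // 100  # red, green

def decode_coord(red, green):
    """Recover the coordinate as read back in Quartz Composer:
    x = G*100 + R."""
    return green * 100 + red
```

So a coordinate of 1234 is stored as R = 34, G = 12, and G*100 + R restores 1234.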
Finally, in Quartz Composer, we load these png deflection maps and read the pixels of the webcam stream accordingly.
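Conceptually, the per-pixel lookup performed in Quartz Composer amounts to the following (an illustrative pure-Python version on small arrays; xmap and ymap stand for the already decoded deflection maps, and all names are invented for this sketch):

```python
def lens_warp(frame, xmap, ymap):
    """Build the lensed image: for each image-plane pixel (i, j), fetch
    the source-plane pixel whose coordinates are stored in the decoded
    deflection maps xmap and ymap."""
    h, w = len(frame), len(frame[0])
    return [[frame[ymap[i][j]][xmap[i][j]] for j in range(w)]
            for i in range(h)]
```

With identity maps (xmap[i][j] = j, ymap[i][j] = i) the output is simply the input frame; any real deflection map redistributes the source pixels into the multiple images and the Einstein ring.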
It turns out that this is rather easy to implement! There are Quartz Composer modules that store the incoming video stream in a buffer, for instance the Video Delay plugin.
Unfortunately this takes a lot more memory... You can nevertheless try it by running QCgravlens_delays.qtz, provided in the source tar (install the Video Delay plugin first).
As for the deflections, a Python script generates a map of the delay for each image position, in the form of a png file. This is not simply the usual "virtual photon" arrival-time surface for one given source position, but the superposition of all arrival-time surfaces, each evaluated only at the positions of the actual (not virtual) image pixels. The calculation is done with many calls to findimg3 in gravlens. Here is what such an unusual map of time delays looks like for RXJ1131 (red == image close to the center of the lens == high delay; same orientation as all the images shown by QCgravlens):
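For reference, each individual arrival-time surface entering this superposition is the standard Fermat potential of lensing. For a source at angular position $\boldsymbol{\beta}$ and image-plane position $\boldsymbol{\theta}$:

```latex
\tau(\boldsymbol{\theta};\boldsymbol{\beta})
  = \frac{1+z_\mathrm{l}}{c}\,
    \frac{D_\mathrm{l} D_\mathrm{s}}{D_\mathrm{ls}}
    \left[ \tfrac{1}{2}\,\lvert\boldsymbol{\theta}-\boldsymbol{\beta}\rvert^{2}
           - \psi(\boldsymbol{\theta}) \right]
```

where $\psi$ is the lens potential and the $D$'s are the angular-diameter distances to the lens, to the source, and from lens to source. Images form at the stationary points of this surface, and the relative time delays between them are the differences of $\tau$ evaluated at the image positions, which is what the map above collects pixel by pixel.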