Tuesday, August 26, 2008

Telescope Simulator

I created a telescope simulator! WOOOT

WTF (Why The Heck) would I do such a thing? I don't know. But I did it. Now bow before me. Bwahahahh.....

Ok, silliness aside, here's the basic idea. Telescopes (and cameras, and imaging systems of any sort) act as spatial filters. The aperture's diameter, its shape, and the nature of any obstructions fundamentally limit the sharpness of the image that can be focused onto the image sensor. This phenomenon is known as the diffraction limit.

I created a software simulation of this effect, using Python and the following Python libraries: Matplotlib (PyLab), SciPy and NumPy. I could have done this all in MatLab, but I wanted to give Python a shot, for fun. I made the "Fourier Assumption", i.e., that the 2D Fourier transform of the aperture function gives the point spread function in the focal plane (which is a good assumption for imaging systems with high F-numbers), and then solved the Fourier transform for a circular aperture with a circular central obstruction. I used wxMaxima to crunch the 2D integral for me, which was a good thing, since the solution involved Bessel functions.
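For the curious, here's roughly what that solution looks like in code. This is just a minimal sketch of the obstructed-aperture PSF formula, not the exact code I wrote, and the function and variable names are illustrative:

```python
# Sketch: intensity PSF of a circular aperture with a circular central
# obstruction, from the analytic (Bessel function) Fourier solution.
# The Fourier transform gives the amplitude; intensity is its square.
import numpy as np
from scipy.special import j1

def annular_psf(r, wavelength, D, f, eps):
    """Normalized intensity at radial distance r in the focal plane.

    r, wavelength, D (aperture diameter), f (focal length): same units.
    eps: obstruction diameter / aperture diameter (0 < eps < 1).
    """
    x = np.pi * D * r / (wavelength * f)
    x = np.where(x == 0, 1e-12, x)   # dodge the 0/0 at the exact center
    amp = (2 * j1(x) / x - eps**2 * 2 * j1(eps * x) / (eps * x)) / (1 - eps**2)
    return amp**2
```

In the limit eps -> 0 this reduces to the familiar Airy pattern, [2*J1(x)/x]^2.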

My telescope simulator can simulate an imaging system with the following customizable parameters:
  1. Aperture diameter (circular)
  2. Aperture obstruction diameter (circular, centered)
  3. Light wavelength range (optical filter bandwidth)
  4. Focal length
  5. Image sensor pixel X and Y dimensions
The following images simulate a known image sensor with 4.64um (square) pixels. The aperture diameter is ~32", with an ~8" obstruction. The focal length is 342", and the imaging wavelength is 500nm (green). To simulate a wider bandwidth, I would run multiple simulations, stepped across the bandwidth range, and average the images together, but I didn't do that this time.
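As a quick sanity check on those numbers (this ignores the central obstruction, which actually narrows the Airy core slightly), the radius of the Airy disk's first dark ring at the focal plane is 1.22 * lambda * f / D:

```python
wavelength = 500e-9      # 500 nm (green)
D = 32 * 0.0254          # ~32" aperture, in meters
f = 342 * 0.0254         # 342" focal length, in meters
pixel = 4.64e-6          # 4.64 um pixels

airy_radius = 1.22 * wavelength * f / D
print(airy_radius / pixel)   # ~1.4 pixels
```

So the diffraction blur is right at the scale of the pixels themselves.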

I begin with a very large image of Mars, probably taken by the Hubble, or an imaging satellite nearer to Mars. I then select out the green channel (since it is the highest-quality channel) and re-sample it (using perfect "sinc" re-sampling) to simulate it being sampled by my simulated image sensor, at the simulated focal length, and with Mars at its closest to Earth. This is what I call the "perfect" image. It is the upper bound for what the telescope could ever hope to achieve. I then compute the point spread function of the aperture, and convolve it with the "perfect" image, which has the effect of smearing it. This simulates the effect of a real-world, finite, imperfect aperture. It shows, quite clearly, the diffraction limit's effect on image quality.
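Here's the shape of that pipeline, heavily condensed. This sketch assumes the PSF has already been sampled onto a 2D grid at the sensor's pixel pitch, and it glosses over the sinc re-sampling step; the names are illustrative, not from my actual code:

```python
# Sketch: smear the "perfect" image with the aperture's PSF.
import numpy as np
from scipy.signal import fftconvolve

def simulate_telescope(perfect_image, psf_grid):
    psf_grid = psf_grid / psf_grid.sum()     # normalize to conserve flux
    return fftconvolve(perfect_image, psf_grid, mode='same')
```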

Original Image of Mars
(simulation only uses the green channel)

"Perfect" Image
(as if sampled by a telescope with an infinite aperture)

Simulated Image
(smeared by the effects of a non-infinite aperture)

You can click on the images for the full-res versions, although the gray-scale images may not be any bigger than displayed here. There aren't very many pixels in the simulated image sensor, and using more pixels doesn't help! That's why it's called the diffraction limit. It fundamentally limits an imaging system's resolving power, and partially explains why the ongoing consumer-digital-camera "mega-pixel race" is a gigantic in-your-face scam. Image noise is the other side of the issue.

I will upload my Python code to my Junk Shed sometime in the near future, once I clean it up a bit, to make it fit for public consumption.

Sunday, August 24, 2008

Beware the CMOS Battery

The computer saga continues.... after much sleuthing, and rearranging of hardware between my various frankenputers, I am 99% certain that my original problem began with a dying CMOS battery. The CMOS settings changed back to the factory default, which told the motherboard to use the built-in graphics card, instead of my nVidia 7900 GS. Thus, when I turned it on, no picture! (but it did boot, because it played the Ubuntu login sound). So the lesson is: When things go awry, especially with the video card, check your CMOS settings, make sure they stick, and keep some CR2032 coin-cells on hand!

Sunday, August 17, 2008

Fun With MatLab

I like to goof around with MatLab, and create little simulations, just for the fun of it. I call them "brain candy". In a few minutes, I'll be uploading a zip file to my Junk Shed containing the MatLab scripts that created the following screenshots. But first, I'll take a moment to briefly describe them.

food_chain.m : This is an ecological simulation, which is seeded with a single alga. The alga gains energy over time (from sunlight), and reproduces once it has enough energy. As the algae reproduce, they sometimes mutate into fish. The fish only eat the algae (not each other). Once a fish has gained enough energy, it can reproduce. When the fish reproduce, they occasionally mutate into sharks. Sharks only eat fish. Mutation can happen in reverse, too. Sharks can mutate into fish, and fish into algae. I was interested in exploring the dynamics of a food chain, where A eats B and B eats C but A does not eat C and C does not eat A. I wanted to see how stable such a system is, and if there are any recurring trends.
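To give a flavor of the rules, here's a heavily stripped-down Python sketch of the same idea. The real thing is a MatLab script and differs in detail; the specific numbers and mutation odds below are mine, for illustration only:

```python
# Food chain sketch: 0=empty, 1=alga, 2=fish, 3=shark, on a wraparound grid.
import numpy as np

rng = np.random.default_rng(0)
N = 50
kind = np.zeros((N, N), dtype=int)
energy = np.zeros((N, N))
kind[N // 2, N // 2] = 1                        # seed with a single alga

def step():
    for i, j in zip(*np.nonzero(kind)):
        k = kind[i, j]
        if k == 0:
            continue                            # was eaten earlier this step
        ni = (i + rng.integers(-1, 2)) % N      # pick a random neighbor
        nj = (j + rng.integers(-1, 2)) % N
        if k == 1:
            energy[i, j] += 1.0                 # algae gain energy from sunlight
        elif kind[ni, nj] == k - 1:             # fish eat algae, sharks eat fish
            energy[i, j] += energy[ni, nj] + 1.0
            kind[ni, nj] = 0
            energy[ni, nj] = 0.0
        else:
            energy[i, j] -= 0.1                 # metabolic cost
            if energy[i, j] < 0:
                kind[i, j] = 0                  # starvation
                continue
        if energy[i, j] > 10 and kind[ni, nj] == 0:
            # reproduce into an empty neighbor; offspring occasionally
            # mutate one level up or down the chain
            kind[ni, nj] = min(3, max(1, k + rng.choice([0, 0, 0, 1, -1])))
            energy[i, j] /= 2
            energy[ni, nj] = energy[i, j]

for _ in range(1000):
    step()
```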

Food Chain: Algae, Fish and Sharks
(grid showing simulation-in-progress)

Food Chain: Algae, Fish and Sharks
(Results after simulation finished)

logistic.m : This little script merely plots the logistic recursive equation, which I find to be pretty nifty. However, this guy is a total stud. He built an analog circuit which computes the logistic, and displays it on an oscilloscope! Is that not frigg'n awesome, or what?!?
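If you haven't met it before, the recursion is just x -> r*x*(1-x). Here's a quick Python equivalent that sweeps r to draw the classic bifurcation diagram (my MatLab script may slice it differently, but the recursion is the whole show):

```python
import numpy as np
import matplotlib.pyplot as plt

rs, xs = [], []
for r in np.linspace(2.5, 4.0, 800):
    x = 0.5
    for _ in range(200):            # let the transient die out
        x = r * x * (1 - x)
    for _ in range(100):            # then record where the orbit settles
        x = r * x * (1 - x)
        rs.append(r)
        xs.append(x)

plt.plot(rs, xs, ',k')
plt.xlabel('r'); plt.ylabel('x')
plt.show()
```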

The Logistic

n_body.m : I made a very crude n-body solver. I would really love, someday, to code up a much higher-fidelity one in C++, using OpenGL to render it. It simulates a small star cluster, and does a very poor job of it. I haven't bothered to code in any of the tricks-of-the-trade to ensure stability (or even to improve it a little). Numerical inaccuracies cause lots of stars to get ejected pretty quickly... but many of them don't get ejected, and you can see all sorts of interesting, very complex orbits develop.
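For reference, the guts of a solver this crude fit in a few lines. This is a Python sketch in the same spirit (not a port of my script): all-pairs gravity, a softening fudge term, and a naive first-order time step, hence the ejections.

```python
import numpy as np

rng = np.random.default_rng(1)
n, G, dt, soft = 50, 1.0, 1e-3, 0.05
pos = rng.normal(size=(n, 3))                 # star positions
vel = rng.normal(scale=0.1, size=(n, 3))      # star velocities
mass = np.ones(n)

for _ in range(10000):
    diff = pos[None, :, :] - pos[:, None, :]          # pairwise separations
    dist = np.sqrt((diff**2).sum(-1) + soft**2)       # softened distances
    acc = (G * mass[None, :, None] * diff
           / dist[:, :, None]**3).sum(axis=1)         # sum over the other stars
    vel += acc * dt                                   # crude first-order step
    pos += vel * dt
```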

N-Body: Star Simulator

alife1.m : I've titled this script "alife1.m" because it was my first success at achieving emergence in an artificial life simulation, and I intend to have many more. It uses a grid of independent, interacting, finite state machines to implement a cellular automaton. Life-like behavior emerges from total chaos. Some might argue it is more like an "artificial chemistry", and I can't disagree. Artificial chemistries are one of my current research interests.

ALife: Emergence

brownian_gravity.m : I created a discrete Brownian motion model, and gave the randomness a bias towards downward motion. I then created a pile of particles in the middle of the grid, and set them loose. It's pretty fun to watch what happens.
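The core of a model like this is tiny. A minimal Python sketch of the idea (this one skips any grid boundaries, and the step probabilities are just illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n_particles, n_steps = 500, 1000
steps = np.array([(0, -1), (0, 1), (-1, 0), (1, 0)])  # down, up, left, right
probs = [0.4, 0.2, 0.2, 0.2]                          # 'down' is favored

xy = np.zeros((n_particles, 2), dtype=int)            # the initial pile
for _ in range(n_steps):
    xy += steps[rng.choice(4, size=n_particles, p=probs)]
```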

Brownian Gravity

diffusion1.m : This one is similar to "brownian_gravity.m", but with no downward bias. It simulates a "sugar cube" dissolving in water (if you use your imagination).

Diffusion: Discrete

diffusion2.m : This model is very different in structure from the above two models, but the results are similar. It simulates a cloud of particles in two-dimensional space, and at each simulation step, randomly moves each particle. This results in the expansion of the particle cloud, as each particle follows a random walk. The cross section takes on a Gaussian distribution.
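The claim about the cross section is easy to verify numerically. Sums of many small random steps go Gaussian courtesy of the central limit theorem, even if the individual steps aren't Gaussian (the step sizes here are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
cloud = np.zeros((10000, 2))              # all particles start at the origin
for _ in range(500):
    cloud += rng.uniform(-0.1, 0.1, size=cloud.shape)  # uniform steps, yet...

# ...the spread grows like sqrt(n_steps), and a histogram of cloud[:, 0]
# is a textbook bell curve.
print(cloud[:, 0].std())   # ~ sqrt(500) * 0.1/sqrt(3) ~ 1.3
```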

Diffusion: Continuous

I hope you have time to download them and play with them. I apologize to the die-hard Octave users out there. These scripts use MatLab's handle graphics, which is not yet supported by Octave. Someday, when I have a job that doesn't require me to use MatLab, I'll do things like this in Python. Then everyone can join in the fun.

Acoustic Imaging and The Dark Knight

I'm going to do my best to prevent this from being a spoiler, but if you don't trust me, then stop reading. I saw The Dark Knight in an IMAX last night. I really enjoyed it (for its genre). Movies have a tendency to bestow embarrassingly unrealistic capabilities on common technological devices. Computers, in general, are notoriously abused (Tron, Ghost in the Machine, any modern detective mystery, etc...). Another commonly abused electronic device is the surveillance camera. A stereotypical example would be where a car drives by, and the forensics team is able to use advanced image "enhancement" to zoom in further and further, and eventually resolve the license plate number. They are violating so many laws (of physics, and math), I don't even know where to start with that one. However, I'd like to preemptively defend The Dark Knight on one seemingly ridiculous misuse of technology: the humble cell phone. Yes, what they attempted to do with cell phones isn't quite possible... but it's almost possible, so I was somewhat impressed.

In the movie, a system is developed for analyzing the sounds recorded by cellphones to create a three-dimensional representation of the room, or environment, surrounding the cell phone. This is actually sort-of possible via modern acoustic imaging techniques. Radio astronomy, passive sonar and bistatic radars all use a related set of techniques. Here's the basic idea: if you have a suite of microphones recording the ambient sound in the environment, you can cross-correlate the signals recorded by each microphone to estimate the locations of reflective surfaces (and sound sources) in the environment (there's a toy demo of this after the list below). Now, I said it's almost possible, but not actually possible. You could do it if you carefully controlled the conditions, but there are just too many deal-breakers in the real world to actually pull it off. Even if the NSA wanted to do this sort of thing, and was prepared to pay big bucks to try, here are the problems they'd have to contend with:

  1. The processing is vastly simpler when using an array of microphones vs. a single microphone (cellphone). I think it might be possible, on paper, to do it with a single microphone, assuming it is moving in roughly random directions over the duration of the recording, you've calibrated the living hell out of the thing, and you know its precise location at each instant in time. My gut feeling is that even under these conditions, you would need hours of data to be able to construct even the crudest of 3D images.
  2. Let's give them the benefit of the doubt, and assume they only attempt this method when they have multiple people in the same room, all with cell phones. This provides a spatially diverse array of microphones, perfect for acoustic imaging, right? Well, they are all likely oriented in different directions. Microphone frequency response is a function of AOA (angle of arrival). Even assuming that we have somehow calibrated (equalized) these microphones using some magical equalization technique that is capable of equalizing over all of AOA space, we still have to contend with the fact that the person holding each cell phone forms part of the beam pattern, and this is too big of an unknown (and a dynamic unknown, at that) to be calibrated away.
  3. Microphones are hideously noisy devices. Don't believe me? Make some recordings using a normal microphone on your computer, and analyze the data in MatLab (or Python, if you aren't an evil software thief). This places severe limits on the rate of convergence to a solution.
  4. There's just no good way to know the exact 3D location of a cellphone vs. time. Yes, there are cellphones with built-in GPS, and there are geo-location techniques which may be combined with GPS data, but the acoustic wavelengths of interest are on the order of centimeters. Therefore, we need to know the location of the cell phone to better than the smallest wavelength; 1/10th of the smallest wavelength would be nice. This means we'd need to know the cellphone's location down to about 1mm accuracy, for each sample of the recording. This is not, and will not be, possible (in my lifetime). Please, prove me wrong.
  5. And finally, the most significant deal-breaker of all: data compression. Cellphones use extremely high data compression ratios. The compression scheme is optimized for the human voice and the human ear, and nothing else. It is horribly destructive to all other acoustic information, and most importantly, destroys the phase relationships between the spectral components in the data. The phase relationships are absolutely the worst components of the data to degrade, if you are interested in acoustic imaging. The phase information is where the time-delay information exists. If you throw that away, or significantly degrade it, there's just no hope of being able to form an image by correlating the data against data from other microphones. It would be a little like eliminating all of the consonants in a sentence. Example: o e i i ei a u i io ae. Good luck figuring that out.
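As promised, a toy Python demo of the cross-correlation idea: recover the time delay between two noisy recordings of the same broadband sound. A delay, times the speed of sound, is a path-length difference, and enough path-length differences constrain the geometry. The numbers here are made up for illustration:

```python
import numpy as np
from scipy.signal import correlate

fs = 48000                        # sample rate (Hz)
rng = np.random.default_rng(4)
src = rng.normal(size=fs)         # 1 second of broadband ambient 'sound'
true_delay = 37                   # samples (~0.26 m of extra path length)

mic1 = src + 0.01 * rng.normal(size=fs)
mic2 = np.roll(src, true_delay) + 0.01 * rng.normal(size=fs)

xc = correlate(mic2, mic1, mode='full')
est = np.argmax(xc) - (len(mic1) - 1)
print(est)                        # 37: the delay pops right out
```

Note everything this toy has that the movie's scenario lacks: identical calibrated channels, uncompressed data with intact phase, and sample-accurate timing.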
So, it was neat to see modern DSP techniques get a little public exposure in a mainstream movie. Acoustic imaging is fun stuff, and is out there in the real world today, but it requires a lot of careful setup, calibration, and fancy algorithms (not to mention, massive computers), to make it work.

Junk Shed

I've created a Google Pages site called Photonymous's Junk Shed. I haven't posted anything there, as of yet, but the intent is to upload random stuff, which would be cumbersome or impossible via my blog. For instance, I'll zip up and upload any interesting little coding projects (MatLab, Python, C++) where I'd like to provide the source code for other folks to mess with. I'll have a page dedicated to each significant project, and links to downloadable files. Hopefully, it works out. Any blog posting referring to something with supplemental material found in my Junk Shed will be labeled "junk shed" and perhaps "projects", if it qualifies as a "project".

Wednesday, August 13, 2008

The computer hating continues....

I've tried 3 motherboards, which all work to varying degrees. I'm currently conducting long-term stability testing on one of the motherboards, to ensure there isn't a problem with my new graphics card (nVidia 9800 GTX), or perhaps the nVidia drivers. If it behaves itself for a few days, then the problem definitely was the original motherboard's PCI Express slot, or its supporting electronics. The current motherboard I'm using is low on features (no 1394!!), and seems a bit sluggish. I have one last motherboard, installed in another computer, that I'm going to try (in a few days), which is identical to the one that failed. If it works well, then I won't need to buy anything. If not, I'll need to shell out $500, which would make me angry.

I've pursued the "frankenputer" strategy for a while, to avoid spending lots of money when a single part fails. I just swap in a new part, and continue along my merry way. It really sucks when a single part failure forces you to buy MANY new parts. Oh well....

Monday, August 11, 2008

I hate computers

I had been hoping to wait to post on this subject until the exciting conclusion to my latest computer-related saga, but the conclusion never came. Here's a brief, day-by-day intro to set the stage:

Day 1) Got home from work, pushed power button on computer, no response. SHIT!
Day 2) Debugged, thought it was ($300, 2yr old) graphics card, bought new card.
Day 3) Tried to install card... doesn't fit. SHIT!
Day 4) Calmed down, hacked away the plastic on the interfering SATA connector. Whew, it fit. Installed card. Booted. Worked fine. Crashed. SHIT!
Day 5) Next day, booted, worked fine. No problems.
Day 6) Got home from work, pushed power button on computer, no response. SHIT!

Ok, so in about a week, I've come full circle, having spent $230 on a new card. My next best guess is that the motherboard (PCI Express slot) has some sort of problem. I'll be able to test this in another motherboard I have lying around. If that is stable over the course of a few days, I'll need to spend about $500 on a new MB+CPU+RAM, since my spare motherboard is too under-featured, and a little sluggish (hard disk controller? buggy chipset drivers?).

Ugh, I hate computers.

PS: It's a damned good thing I have lots of them, though, or I wouldn't have been able to type this post.

Saturday, August 2, 2008

Pharyngufest


Photographic evidence that yours truly attended the 2008 Pharyngufest, at the Wynkoop brewpub, in downtown Denver! PZ (right) looks like he was caught a little off guard by the spur-of-the-moment photo-op, but that's ok... isn't the giant tapestry in the background the coolest? You should go to the Wynkoop to check it out in person. Oh, and the girl who took this photo is unbelievably hot. Really, she is, but I don't have her picture, so you'll just have to accept my word on blind faith.

Vegan French Onion Dip

I'm not a vegan, but I know lots of them, and vegan cooking interests me just like any other engineering challenge: "Hmmmm..... how can I make something similar to X, which tastes good, and uses no animal products?" I actually think of vegan food as just another nationality of cuisine (yes, I'm aware, there's no country called Veganistan, but damn it, there should be!). Anyway, the following is my second attempt at vegan French onion dip, and it turned out wonderfully.

Vegan French Onion Dip

Ingredients:
  • 1 box of silken soft tofu (12oz). It must say "silken" and it must say "soft". These are two very important properties. I prefer the unrefrigerated Mori-Nu brand, from Cost Plus World Market.
  • 1 heaping half-cup of raw (unroasted) unsalted cashews. I prefer the ones from the Whole Foods bulk bin.
  • 3 tablespoons of freshly squeezed lemon juice.
  • 2 tablespoons of dried onion slivers.
  • 1 pinch of salt (to taste). This can be tweaked at the end.
Instructions:
  1. In a blender (not a food processor), add the tofu, lemon juice, and cashews.
  2. Blend the living hell out of it (you want it to be creamy).
  3. Dump it into a small container, and fold in the salt and the onions.
  4. Let sit overnight, and give it a stir before serving.
Looking at some other French onion dip recipes on the web, it appears there are many ways to spice up this dip. Feel free to experiment with some or all of these:
  • Worcestershire sauce (if you want it to be purely vegan, you'll have to find a bottle with no anchovies)
  • Soy sauce
  • Miso
  • Garlic
  • Black pepper
  • Chives
It is great on unflavored, unsalted crackers (Triscuits especially), potato chips, and carrots.

Friday, August 1, 2008

Sheesh... I can't believe I started a blog

I am 6-sigma certain that I will not be updating this blog regularly. I don't even know what I'd blog about, or who the heck would read it if I did. My interests are just too damned varied. It couldn't be classified as a science blog, programming blog, DSP blog, philosophy blog, computational biology blog, or cooking blog, since I'd be writing about all of these things, and more. The topics would vary so widely that it could never be filed under any one heading, and therefore, would probably never attract attention. And yet, here I am, blogging.