Saturday, December 20, 2008

Evolution

A recent TED talk by Susan Blackmore discusses the idea of memetics. She offers the best explanation of memetics that I've heard yet, and goes on to make the claim that the idea of evolution (Darwin's) is the best idea anyone ever had. I really have to agree with her: it is a beautifully elegant way to describe an incredibly powerful process. She also lays out a very concise description of the requirements for evolution, which I will paraphrase:

"A system in which information is copied, with variance, followed by selection, must produce evolution."

For clarity, a definition of terms is warranted. Evolution is a universal process, and is not specific to biology, so I'll use the word "unit" rather than "organism" to describe a packet of information that participates in the evolutionary process. I use the word "population" to describe a collection of units (similar to the idea of "species").

Evolution is the process of adaptation to a selection process. Multiple diverse units, where each is equally adapted to a static selection process, are no more or less evolved with respect to each other. Winners and losers emerge only when the selection process changes, thus "evolving" the population.

Variance means that the information is not copied perfectly: there is some (low) probability of a small copying error during each copy operation. If the copying process leads to too high an error rate, then the new units will be so different from their parents that they will be poorly adapted to the subsequent selection process.

Selection is the process by which units are selected for inclusion in the subsequent population, as a function of their characteristics. The selection process must be slowly time-varying with respect to the copying rate of the units in order for them to adapt. Otherwise, a new set of copies may be adapted to a very different (older) selection process than the one currently in place.

A perfectly static selection process (one which does not change over time) will not result in evolution. The units will adapt to it, and then will remain equally adapted from then on. Their information may appear to change over time, due to the gradual accumulation of copying errors, but they will remain equally adapted to the selection process. A biologist might say that this is an example where the genotype varies but produces a static phenotype. Of course, there may be instances where multiple phenotypes are equally adapted to a given selection process, but that's more detail than I want to get into.

Periods of selection stasis lead to the accumulation of variance. If the selection process suddenly changes, a population's variance gives it the resources to quickly adapt to the new selection process, thus squeezing it through the "evolutionary bottleneck". The absence of sufficient variance, in such a scenario, results in extinction.
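If you like to see ideas as code, here's a minimal sketch of that loop in Python. The numbers, the mutation rate, and the "fitness" function are all arbitrary placeholders I made up (not anything from the talk): copy the units, occasionally introduce a small error, then select against a slowly changing criterion.

import random

def fitness(unit, target):
    # A unit is just a number; selection favors units near a (moving) target.
    return -abs(unit - target)

population = [0.0] * 100          # start with 100 identical units
mutation_rate = 0.05              # probability of a copying error

for generation in range(200):
    target = generation * 0.1     # the selection process slowly changes
    copies = []
    for unit in population:
        # Copying with variance: most copies are perfect, a few are not.
        if random.random() < mutation_rate:
            unit += random.gauss(0.0, 0.5)
        copies.append(unit)
    # Selection: the best-adapted half gets copied into the next population.
    copies.sort(key=lambda u: fitness(u, target), reverse=True)
    population = copies[:50] * 2

print("mean unit value after evolving:", sum(population) / len(population))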

Friday, December 12, 2008

P vs NP for dummies

One of the "great" outstanding problems in computer science is the question of P vs NP. In layman's terms, P represents all problems whose solution can be found in polynomial time. NP represents all problems whose solutions can be verified in polynomial time. Polynomial time just means that the time to find the solution has a polynomial relationship with the size of the problem. For instance, a particularly terrible sorting algorithm may take n^2 seconds to sort an arbitrary list of names, where n is the number of names in the list. This is a second-order polynomial of the form a+b*n+c*n^2, where a = b = 0, and c = 1.

The question "does P = NP?" asks whether every problem whose solution can be verified in polynomial time can also be solved in polynomial time. Back to the list-sorting example: verifying that a list is indeed sorted properly may only take n seconds, so verification is also polynomial. In this single example, the problem was solvable in polynomial time, and its solution was verified in polynomial time, so sorting sits comfortably inside both P and NP.
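To make the distinction concrete, here's a little Python sketch of the sorting example: a deliberately terrible, roughly n^2 "solver" next to a roughly n "verifier". (Sorting itself is firmly inside P, so this only illustrates the two kinds of polynomial cost; it says nothing about the genuinely hard NP problems.)

def terrible_sort(names):
    # Selection sort: roughly n^2 comparisons -- polynomial-time solving.
    names = list(names)
    for i in range(len(names)):
        smallest = min(range(i, len(names)), key=lambda j: names[j])
        names[i], names[smallest] = names[smallest], names[i]
    return names

def is_sorted(names):
    # One pass, roughly n comparisons -- polynomial-time verifying.
    return all(names[i] <= names[i + 1] for i in range(len(names) - 1))

names = ["zoe", "alice", "mallory", "bob"]
solution = terrible_sort(names)
print(solution, is_sorted(solution))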

For further reading, take a look at Wikipedia's entry on this topic. It also addresses the topics of NP-hard and NP-complete.

Tuesday, December 9, 2008

Crap is King

I've heard this song many times, but tonight was the first time I truly listened to it. I never realized it was a critique of mass media, specifically journalism. Due to mass-media consolidation, it isn't too often that you hear lyrics these days which criticize such powerful institutions. I hope that the internet can one day support a flourishing "ecosystem" of musical expression, where critical messages can be conveyed unmolested by the powers that be. Recent legislation has been preventing this from happening, and I'm uncertain as to the potential of current movements to enact any meaningful change.

Saturday, November 22, 2008

Bash Tab Completion

For all you Linux users out there, I just discovered something wonderfully useful at the Bash command line: tab completion suggestions. See this pic to get a better idea of what I'm talking about:


For the above example, I typed "to", then hit the "tab" key. As expected, my computer beeped, indicating "hey, dummy, there are a bunch of matches, how am I supposed to figure out which one you want?". To get a list of the possible matches, hit the "tab" key again, and it prints out all of the available choices. Why hadn't I discovered this sooner?

Oh, and if you think my custom prompt is nifty, here's the line of code to put in your ~/.bashrc file:

PS1='\n================================\n\[\033[01;32m\]\h\[\033[00m\]:\n \[\033[01;34m\]\w\[\033[00m\]\n > '

I like having a nice separation between command outputs. It also gives me a double-clickable path (for quick copying), means I never have to type "pwd" to remind myself where I am, and lets me quickly see which machine I'm logged into. Uber handy.

Tuesday, November 11, 2008

Election 2008 Probability Analysis

Ok, the results are in. I finally gave up on Missouri, and called it for McCain, since the returns are leaning in that direction. As an interesting little side project, I analyzed the prediction accuracy of www.fivethirtyeight.com vs www.intrade.com by computing the probability of the observed outcome based on the predictions by the two websites. Intrade predicted a 0.19% probability of this election outcome, while 538 predicted a 2.4% probability of this outcome. Note that by "this outcome" I am referring to the state-by-state results. There are a very large number of possible outcomes, and some of them have infinitesimally small probabilities. For instance, all states voting for either candidate would have an extremely low probability. The small probability from Intrade is due to the cascaded probabilities of many estimates which were significantly smaller than 100% (lots of ~90% predictions). Things would look very different if a single one of 538's 100% predictions had gone the other way. The cascaded probabilities would then be 0%, while Intrade's would still be much greater than 0% (Intrade predicted no 0% or 100% outcomes, while 538 predicted many).

Another way to look at the prediction accuracy is simply as a function of the number of correct state outcome predictions. Intrade only missed one (Indiana), while 538 missed two (Indiana and Missouri).

I have not analyzed this from an electoral-vote-weighted perspective. I took the perspective that each state behaves like a weighted coin toss, and each website has its own method for estimating the odds of the result. I simply wanted to understand the accuracy of that prediction, not the accuracy of the composite prediction (the overall winner). The sites themselves gave their predictions for the winner: 538 predicted a 98% probability of an Obama win, while Intrade predicted a 90% probability. If I were to construct a Monte Carlo simulation based on the state-by-state predictions, I'd get overall win probabilities very similar to those. However, a single election isn't enough data to assess the accuracy of such composite predictions; we'd need many more trials.
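If you'd like to reproduce the flavor of this calculation, here's a rough Python sketch. The per-state numbers below are made-up placeholders (not the actual 538 or Intrade predictions): the probability of the exact state-by-state outcome is the product of each state's predicted probability for the candidate who actually won there, and a quick Monte Carlo over the same per-state predictions gives an overall win probability.

import random

# Hypothetical per-state predictions (NOT the real 538 or Intrade numbers):
# P(Obama wins the state), the state's electoral votes, and whether he actually won.
states = {
    "StateA": (0.95, 20, True),
    "StateB": (0.60,  9, True),
    "StateC": (0.30, 11, False),
}

# Cascaded probability of this exact state-by-state outcome.
p_outcome = 1.0
for p_obama, ev, obama_won in states.values():
    p_outcome *= p_obama if obama_won else (1.0 - p_obama)
print("probability of this exact outcome:", p_outcome)

# Monte Carlo estimate of the overall (electoral-vote) win probability.
total_ev = sum(ev for p, ev, won in states.values())
trials, wins = 100000, 0
for _ in range(trials):
    votes = sum(ev for p, ev, won in states.values() if random.random() < p)
    if votes > total_ev / 2.0:
        wins += 1
print("estimated P(Obama wins overall):", wins / float(trials))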

In conclusion, although Intrade more accurately predicted the results, it appears that 538 is a more accurate estimator of outcome probabilities.

Tuesday, October 7, 2008

When Bad Things Happen to Good Oranges

Check it out, I dun blowed me up some oranges!

How did I do it? I waited till dusk, wrapped a firecracker in saran wrap, then poked a hole in an orange and inserted the firecracker so just the fuse was sticking out. Then, with my camera on a tripod (bulb mode, F8.0, ISO 100), I lit the fuse, opened the shutter, waited for the bang, then closed the shutter. TADA!... a cool photo, using only the light from the firecracker to expose the photograph. I did it three times, and each came out looking cool, but different:



Saturday, September 27, 2008

The Second Great Depression

To give you an idea of what it would take for us to experience anything similar to the first Great Depression:

August 30, 1929 : Dow @ 382
July 8, 1932 : Dow @ 42

So after ~3 years, the Dow was at ~11% of its peak value.

What would this look like in modern times? The Dow peaked at 14093 on October 12th, 2007. If we experience a similar plummet, then sometime in September of 2010, the Dow would bottom out around 1550.

This gives you an idea of just how bad a "real" depression is. That is an average decline of 6.1% per month. The worst we've seen, since our recent peak of 14093, is about a 3% decline per month, over a 9 month period. Yes, this is quite bad, but it is (so far) only half as bad as the slide experienced during the Great Depression.
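For anyone who wants to check the arithmetic, the average monthly decline is just the compound rate that takes the Dow from its peak to its bottom over the number of months in between; depending on exactly how you count the months, you land right around 6% per month. A quick Python version:

# Great Depression: Dow ~382 (Aug 1929) down to ~42 (Jul 1932), roughly 34 months.
months = 34.0
monthly_decline = 1.0 - (42.0 / 382.0) ** (1.0 / months)
print("Great Depression: %.1f%% decline per month" % (monthly_decline * 100))

# The recent slide, using the numbers above: 3% per month for 9 months.
print("Dow after 9 months at -3%%/month: %.0f" % (14093 * 0.97 ** 9))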

It could be worse. A lot worse.

Sunday, September 21, 2008

Clusters are Beautiful

I have a computer cluster living in my living room. Ain't she puurrrrdee?

Saturday, September 13, 2008

MatLab vs Octave Benchmark

I just thought I'd share some informal benchmarking results with the world. This is with Octave 3.0.0 and MatLab 2008a. I'm running on Ubuntu Linux, 8.04, 64-bit, with an AMD Athlon X2, 2.2 GHz processor, and 2 GB of RAM. I wrote a little modulator-demodulator thingy, which is not very fancy at all. It's just a couple of for-loops, some non-vectorized Euclidean-distance calculations, and some comparisons.

Octave : 7.5 Seconds
MatLab : 1.5 Seconds

That's a 5x speed advantage for MatLab. I like open-source stuff, and Octave is lighter-weight, so I'll still use it, but MatLab sure shows off the fact that it has been low-level optimized. When I need a 5x speed boost, I'll certainly power up MatLab, and when I need way more than that, well, then it's time to go to C++, and move things over to my cluster :)

Friday, September 12, 2008

Egg Trick

Hate peeling hard-boiled eggs? Try this!

I tried it, and it (pretty much) worked. I think the best part of the advice is to add baking soda to the water before you boil the eggs. Two tablespoons per quart is probably about right. This reduces adhesion, and makes them really easy to peel.

Sunday, September 7, 2008

Pseudo-random Number Generators

Fascinating. I just learned that one can construct a pseudo-random number generator out of any one-way function. Then I tried it out in MatLab with a few simple experiments, and yup, it works. That's cool.
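I won't reproduce my MatLab experiments here, but the general recipe is easy to sketch in Python, with a cryptographic hash standing in as the one-way function (this is the simple hash-in-counter-mode flavor of the idea, not the formal textbook construction): keep a secret seed, hash it together with a counter to produce output blocks, and never reveal the seed.

import hashlib

def prng_bytes(seed, nbytes):
    # Output block i is hash(seed || i). Recovering the seed from the output
    # stream would require inverting the one-way function, which is assumed
    # to be infeasible, so the stream looks random to anyone without the seed.
    out = b""
    counter = 0
    while len(out) < nbytes:
        out += hashlib.sha256(seed + str(counter).encode()).digest()
        counter += 1
    return out[:nbytes]

print(prng_bytes(b"my secret seed", 32).hex())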

Gold ion beam-beam collisions

Science sure is purrrdee, ain't it? This image was stolen right off of Wikipedia, because it was so pretty, I just couldn't help myself.

Hadronized Charged Particle Debris

...and here's another Wikipedia article on a different topic, related to some of the research I did as a graduate student. The "Pareto perspective" is a pretty helpful one, in many areas of life.

Saturday, September 6, 2008

ERROR 21

Well, that sucked.

I just spent about 1 hour breaking my laptop, and another 3 fixing it. To save you the pain, here's how I broke it, and how I fixed it.

As part of debugging another computer, I installed Ubuntu to an external USB drive, using my laptop. This went fine, and I was able to boot the laptop from the USB drive. I unplugged the drive, and tried to boot my laptop the usual way, which gave rise to the error message:

Grub loading, please wait...
Error 21

Oh well hell.

Anyway, like I said above, to "save you the pain", here's the solution:

1) Boot from an Ubuntu live CD
2) Open up a terminal
3) Type the following:
>> sudo grub
>> find /boot/grub/stage1
4) This will tell you something like "(hd0,2)". Use this information for the next step:
>> root (hd0,2) <------ there is a space between "root" and "(hd0,2)"
>> setup (hd0) <------ again, don't forget the space
>> quit
5) Now reboot. It should be fixed.

Tuesday, September 2, 2008

Slow SSH

Is SSH being slow? Are you running Ubuntu (8.04 at the time of this writing)? I've found the following solution fixes slow SSH and Telnet logins. Specifically, the symptom is an annoying ~10 second delay between attempting to log into a remote terminal, and when it prompts you for a password. If this is happening to you, do the following (to the remote machine):

sudo nano /etc/hosts

Then, just add a line with the IP address and host name of the machine you are logging in from. Apparently, the remote machine is trying to do a reverse DNS lookup on the connecting machine's address, and timing out. If it already has your machine in its list of hosts, the lookup succeeds immediately and you can get in instantly. I've found a few other folks discussing this in the forums, but none of their suggestions worked for me.
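For example, the added line would look something like this (the address and hostname here are made up; substitute your own machine's):

192.168.1.50    mylaptop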

Happy computing.

Tuesday, August 26, 2008

Telescope Simulator

I created a telescope simulator! WOOOT

WTF (Why The Heck) would I do such a thing? I don't know. But I did it. Now bow before me. Bwahahahh.....

Ok, silliness aside, here's the basic idea. Telescopes (and cameras, and imaging systems of any sort) act as spatial filters. The aperture's diameter, its shape, and the nature of any obstructions fundamentally limit the sharpness of the image that can be focused onto the image sensor. This phenomenon is known as the diffraction limit.

I created a software simulation of this effect, using Python and the following Python libraries: Matplotlib (PyLab), SciPy and NumPy. I could have done this all in MatLab, but I wanted to give Python a shot, for fun. I made the "Fourier assumption", i.e., that the point-spread function in the focal plane is the squared magnitude of the 2D Fourier Transform of the aperture function (which is a good assumption for imaging systems with high F-numbers), and then solved the Fourier Transform for a circular aperture with a circular central obstruction. I used wxMaxima to crunch the 2D integral for me, which was a good thing, since the solution involved Bessel functions.

My telescope simulator can simulate an imaging system with the following customizable parameters:
  1. Aperture diameter (circular)
  2. Aperture obstruction diameter (circular, centered)
  3. Light wavelength range (optical filter bandwidth)
  4. Focal length
  5. Image sensor pixel X and Y dimensions
The following images simulate a known image sensor with 4.64um (square) pixels. The aperture diameter is ~32", with an ~8" obstruction. The focal length is 342", and the imaging wavelength is 500nm (green). To simulate a wider bandwidth, I would just run multiple simulations, stepped over a bandwidth range, and average the images together, but I didn't do that this time.

I begin with a very large image of Mars, probably taken by the Hubble, or an imaging satellite nearer to Mars. I then select out the green channel (since it is the highest-quality channel) and re-sample it (using perfect "sinc" re-sampling), to simulate it being sampled by my simulated image sensor, at the simulated focal length, and with Mars at its closest to Earth. This is what I call the "perfect" image. It is the upper bound for what the telescope could ever hope to achieve. I then compute the point-spread function of the aperture, and convolve it with the "perfect" image, which has the effect of smearing it. This simulates the effect of a real-world, finite, imperfect aperture. It shows, quite clearly, the diffraction limit's effect on image quality.
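My actual simulator isn't posted yet, but here's a stripped-down Python/NumPy sketch of the core recipe (placeholder numbers, working in normalized pixel units rather than real focal lengths and wavelengths): build the annular aperture mask, Fourier transform it, square the magnitude to get the point-spread function, and convolve that with the "perfect" image.

import numpy as np
from scipy.signal import fftconvolve

N = 256                                    # simulation grid size (placeholder)
y, x = np.mgrid[-N // 2:N // 2, -N // 2:N // 2]
r = np.hypot(x, y)

# Annular aperture: an open disk with a circular central obstruction.
aperture = ((r <= 60) & (r > 15)).astype(float)   # radii in samples (placeholders)

# "Fourier assumption": the PSF is the squared magnitude of the aperture's 2D FT.
psf = np.abs(np.fft.fftshift(np.fft.fft2(aperture))) ** 2
psf /= psf.sum()                           # normalize so total flux is preserved

# Stand-in for the "perfect" image (the resampled Mars image would go here).
perfect = np.zeros((N, N))
perfect[N // 2 - 20:N // 2 + 20, N // 2 - 20:N // 2 + 20] = 1.0

# The diffraction-limited image is the perfect image smeared by the PSF.
blurred = fftconvolve(perfect, psf, mode="same")
print(perfect.max(), blurred.max())        # sharp edges get smeared, peak drops a bit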

Original Image of Mars
(simulation only uses the green channel)

"Perfect" Image
(as if sampled by a telescope with an infinite aperture)

Simulated Image
(smeared by the effects of a non-infinite aperture)

You can click on the images for the full-res versions, although the gray-scale images may not be any bigger than displayed here. There aren't very many pixels in the simulated image sensor, and using more pixels doesn't help! That's why it's called the diffraction limit. It fundamentally limits an imaging system's resolving power, and partially explains why the ongoing consumer-digital-camera "mega-pixel race" is a gigantic in-your-face scam. Image noise is the other side of the issue.

I will upload my Python code to my Junk Shed sometime in the near future, once I clean it up a bit, to make it fit for public consumption.

Sunday, August 24, 2008

Beware the CMOS Battery

The computer saga continues.... after much sleuthing, and rearranging of hardware between my various frankenputers, I am 99% certain that my original problem began with a dying CMOS battery. The CMOS settings changed back to the factory default, which told the motherboard to use the built-in graphics card, instead of my nVidia 7900 GS. Thus, when I turned it on, no picture! (but it did boot, because it played the Ubuntu login sound). So the lesson is: When things go awry, especially with the video card, check your CMOS settings, make sure they stick, and keep some CR2032 coin-cells on hand!

Sunday, August 17, 2008

Fun With MatLab

I like to goof around with MatLab, and create little simulations, just for the fun of it. I call them "brain candy". In a few minutes, I'll be uploading a zip file containing the MatLab scripts, which created the following screen-shots, to my Junk Shed. But first, I'll take a moment to briefly describe them.

food_chain.m : This is an ecological simulation, which is seeded with a single alga. The alga gains energy over time (from sunlight), and reproduces once it has enough energy. As the algae reproduce, they sometimes mutate into fish. The fish only eat the algae (not each other). Once a fish has gained enough energy, it can reproduce. When the fish reproduce, they occasionally mutate into sharks. Sharks only eat fish. Mutation can happen in reverse, too. Sharks can mutate into fish, and fish into algae. I was interested in exploring the dynamics of a food chain, where A eats B and B eats C but A does not eat C and C does not eat A. I wanted to see how stable such a system is, and if there are any recurring trends.

Food Chain: Algae, Fish and Sharks
(grid showing simulation-in-progress)

Food Chain: Algae, Fish and Sharks
(Results after simulation finished)

logistic.m : This little script merely plots the recursion of the logistic map, which I find to be pretty nifty. However, this guy is a total stud. He built an analog circuit which computes the logistic, and displays it on an oscilloscope! Is that not frigg'n awesome, or what?!?

The Logistic
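In case anyone wants to play along without MatLab, the recursion itself is one line: x(n+1) = r*x(n)*(1 - x(n)). A minimal Python version (no plotting):

def logistic_orbit(r, x0=0.5, n=50):
    # Iterate the logistic map x -> r*x*(1 - x) and return the trajectory.
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

print(logistic_orbit(2.8)[-5:])   # settles onto a fixed point
print(logistic_orbit(3.9)[-5:])   # bounces around chaotically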

n_body.m : I made a very crude n-body solver. I would really love, some day, to code up a much higher fidelity one in C++, using OpenGL to render it. It simulates a small star cluster, and does a very poor job of it. I haven't bothered to code in some of the tricks-of-the-trade, to ensure stability (or to even improve it a little). Numerical inaccuracies cause lots of stars to get ejected pretty quickly... but many of them don't get ejected, and you can see all sorts of interesting, very complex orbits develop.

N-Body: Star Simulator
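Here's roughly what a crude solver like n_body.m boils down to, sketched in Python rather than MatLab (not my actual script: plain Euler-style integration, softened gravity, arbitrary units). This is exactly the sort of naive integrator whose numerical errors eject stars.

import random

G, dt, soft = 1.0, 0.01, 0.05      # arbitrary units; "soft" avoids divide-by-zero
stars = [{"pos": [random.uniform(-1, 1), random.uniform(-1, 1)],
          "vel": [0.0, 0.0], "mass": 1.0} for _ in range(20)]

for step in range(1000):
    # Accumulate the gravitational acceleration on each star from all the others.
    for a in stars:
        ax = ay = 0.0
        for b in stars:
            if a is b:
                continue
            dx = b["pos"][0] - a["pos"][0]
            dy = b["pos"][1] - a["pos"][1]
            d3 = (dx * dx + dy * dy + soft * soft) ** 1.5
            ax += G * b["mass"] * dx / d3
            ay += G * b["mass"] * dy / d3
        a["vel"][0] += ax * dt      # cheap, low-fidelity integration step
        a["vel"][1] += ay * dt
    for a in stars:
        a["pos"][0] += a["vel"][0] * dt
        a["pos"][1] += a["vel"][1] * dt

print(stars[0]["pos"])             # where the first star ended up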

alife1.m : I've titled this script "alife1.m" because it was my first success at achieving emergence in an artificial life simulation, and I intend to have many more. It uses a grid of independent, interacting, finite state machines to implement a cellular automaton. Life-like behavior emerges from total chaos. Some might argue it is more like an "artificial chemistry", and I can't disagree. Artificial chemistries are one of my current research interests.

ALife: Emergence

brownian_gravity.m : I created a discrete Brownian motion model, and gave the randomness a bias towards downward motion. I then created a pile of particles in the middle of the grid, and set them loose. It's pretty fun to watch what happens.

Brownian Gravity

diffusion1.m : This one is similar to "brownian_gravity.m", but with no downward bias. It simulates a "sugar cube" dissolving in water (if you use your imagination).

Diffusion: Discrete

diffusion2.m : This model is very different in structure from the above two models, but the results are similar. It simulates a cloud of particles in two-dimensional space, and at each simulation step, randomly moves each particle. This results in the expansion of the particle cloud, as each particle follows a random walk. The cross section takes on a Gaussian distribution.

Diffusion: Continuous
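Here's the same idea sketched in Python (again, not my actual script): start all the particles at the origin, give each one a small random kick every step, and the cloud spreads out. The standard deviation grows like the square root of the number of steps, which is why the cross section ends up Gaussian; adding a constant offset to one axis turns this into the "brownian_gravity.m" case.

import random

particles = [[0.0, 0.0] for _ in range(2000)]   # everyone starts at the origin
steps, kick = 400, 0.1

for _ in range(steps):
    for p in particles:
        p[0] += random.gauss(0.0, kick)         # random walk in x...
        p[1] += random.gauss(0.0, kick)         # ...and in y

xs = [p[0] for p in particles]
spread = (sum(x * x for x in xs) / len(xs)) ** 0.5
print("x std dev:", spread, "expected:", kick * steps ** 0.5)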

I hope you have time to download them and play with them. I apologize to the die-hard Octave users out there. These scripts use MatLab's handle graphics, which is not yet supported by Octave. Someday, when I have a job that doesn't require me to use MatLab, I'll do things like this in Python. Then everyone can join in the fun.

Acoustic Imaging and The Dark Knight

I'm going to do my best to prevent this from being a spoiler, but if you don't trust me, then stop reading. I saw The Dark Knight in an IMAX last night. I really enjoyed it (for its genre). Movies have a tendency to bestow embarrassingly unrealistic capabilities on common technological devices. Computers, in general, are notoriously abused (Tron, The Ghost In The Machine, any modern detective mystery, etc...). Another commonly abused electronic device is the surveillance camera. A stereotypical example would be where a car drives by, and the forensics team is able to use advanced image "enhancement" to zoom in further and further, and eventually resolve the license plate number. They are violating so many laws (of physics, and math), I don't even know where to start with that one. However, I'd like to preemptively defend The Dark Knight on one seemingly ridiculous misuse of technology: the humble cell phone. Yes, what they attempted to do with cell phones isn't quite possible... but it's almost possible, so I was somewhat impressed.

In the movie, a system is developed for analyzing the sounds recorded by cellphones to create a three-dimensional representation of the room, or environment, surrounding the cell phone. This is actually sort-of possible via modern acoustic imaging techniques. Radio astronomy, passive sonar and bi-static radars all use a related set of techniques. Here's the basic idea: if you have a suite of microphones recording the ambient sound in the environment, you can cross-correlate the signals recorded by each microphone to estimate the locations of reflective surfaces (and sound sources) in the environment. Now, I said it's almost possible, but not actually possible. You could do it if you carefully controlled the conditions, but there are just too many deal-breakers in the real world to actually pull it off. Even if the NSA wanted to do this sort of thing, and was prepared to pay big bucks to try, here are the problems they'd have to contend with:

  1. The processing is vastly simpler when using an array of microphones vs a single microphone (cellphone). I think it might be possible, on paper, to do it with a single microphone, assuming it is moving in roughly random directions over the duration of the recording, you've calibrated the living hell out of the thing, and you know its precise location at each instant in time. My gut feel is that even under these conditions, you would need hours of data to be able to construct even the crudest of 3D images.
  2. Let's give them the benefit of the doubt, and assume they only attempt this method when they have multiple people in the same room, all with cell phones. This provides a spatially diverse array of microphones, perfect for acoustic imaging, right? Well, they are all likely oriented in different directions. Microphone frequency response is a function of AOA (angle of arrival). Even assuming that we have somehow calibrated (equalized) these microphones using some magical equalization technique which is capable of equalizing over all AOA space, we still have to contend with the fact that the person using each cell phone forms part of the beam pattern, and this is too big of an unknown (and a dynamic unknown, at that) to be calibrated away.
  3. Microphones are hideously noisy devices. Don't believe me? Make some recordings using a normal microphone on your computer, and analyze the data in MatLab (or Python, if you aren't an evil software thief). This places severe limits on the rate of convergence to a solution.
  4. There's just no good way to know the exact 3D location of a cellphone, vs time. Yes, there are cellphones with built-in GPS, and there are geo-location techniques, which may be combined with GPS data, but the acoustic wavelengths of interest are on the order of centimeters. Therefore, we need to know the location of the cell phone to less than the smallest wavelength. 1/10th of the smallest wavelength would be nice. This means we'd need to know the cellphone's location down to about 1mm accuracy, for each sample of the recording. This is not, and will not be possible (in my lifetime). Please, prove me wrong.
  5. And finally, the most significant deal-breaker of all: data compression. Cellphones use extremely high data compression ratios. The compression scheme is optimized for the human voice and the human ear, and nothing else. It is horribly destructive to all other acoustic information, and most importantly, destroys the phase relationships between the spectral components in the data. The phase relationships are absolutely the worst components of the data to degrade, if you are interested in acoustic imaging. The phase information is where the time-delay information exists. If you throw that away, or significantly degrade it, there's just no hope of being able to form an image by correlating the data against data from other microphones. It would be a little like eliminating all of the consonants in a sentence. Example: o e i i ei a u i io ae. Good luck figuring that out.
So, it was neat to see modern DSP techniques get a little public exposure in a mainstream movie. Acoustic imaging is fun stuff, and is out there in the real world today, but it requires a lot of careful setup, calibration, and fancy algorithms (not to mention, massive computers), to make it work.
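For the curious, the cross-correlation trick at the heart of all this is easy to sketch in Python/NumPy. Record the same source on two microphones with an unknown relative delay, cross-correlate, and the peak of the correlation gives the time difference of arrival, which (times the speed of sound, ~343 m/s) becomes a path-length difference. Real systems bury this inside a mountain of calibration and geometry, but the core is just this (made-up numbers, telephone-ish sample rate):

import numpy as np

fs = 8000                                    # sample rate (Hz)
true_delay = 37                              # delay in samples (unknown in practice)

np.random.seed(0)
source = np.random.randn(fs)                 # one second of wideband "ambient" sound
mic1 = source + 0.05 * np.random.randn(fs)
mic2 = np.roll(source, true_delay) + 0.05 * np.random.randn(fs)

# Cross-correlate the two recordings and find the lag with the biggest peak.
corr = np.correlate(mic2, mic1, mode="full")
lag = np.argmax(corr) - (len(mic1) - 1)

print("estimated delay: %d samples (%.2f ms)" % (lag, 1000.0 * lag / fs))
print("path-length difference: %.2f m" % (343.0 * lag / fs))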

Junk Shed

I've created a Google Pages site called Photonymous's Junk Shed. I haven't posted anything there, as of yet, but the intent is to upload random stuff, which would be cumbersome or impossible via my blog. For instance, I'll zip up and upload any interesting little coding projects (MatLab, Python, C++) where I'd like to provide the source code for other folks to mess with. I'll have a page dedicated to each significant project, and links to downloadable files. Hopefully, it works out. Any blog posting referring to something with supplemental material found in my Junk Shed will be labeled "junk shed" and perhaps "projects", if it qualifies as a "project".

Wednesday, August 13, 2008

The computer hating continues....

I've tried 3 motherboards, which all work to varying degrees. I'm currently conducting long-term stability testing on one of the motherboards, to ensure there isn't a problem with my new graphics card (nVidia 9800 GTX), or perhaps the nVidia drivers. If it behaves itself for a few days, then the problem definitely was the original motherboard's PCI Express slot, or supporting electronics. The current motherboard I'm using is low on features (no 1394!!), and seems a bit sluggish. I have one last motherboard installed in another computer that I'm going to try (in a few days), which is identical to the one that failed. If it works well, then I won't need to buy anything. If not, I'll need to shell out $500, which would make me angry.

I've pursued the "frankenputer" strategy for a while, to avoid spending lots of money when a single part fails. I just swap in a new part, and continue along my merry way. It really sucks when a single part failure forces you to buy MANY new parts. Oh well....

Monday, August 11, 2008

I hate computers

I had been hoping to wait to post on this subject until the exciting conclusion to my latest computer-related saga, but the conclusion never came. Here's a brief, day-by-day, intro, to set the stage:

Day 1) Got home from work, pushed power button on computer, no response. SHIT!
Day 2) Debugged, thought it was ($300, 2yr old) graphics card, bought new card.
Day 3) Tried to install card... doesn't fit. SHIT!
Day 4) Calmed down, hack away plastic on interfering SATA connector. Whew, it fit. Installed card. Booted. Worked fine. Crashed. SHIT!
Day 5) Next day, booted, worked fine. No problems.
Day 6) Got home from work, pushed power button on computer, no response. SHIT!

Ok, so in about a week, I've come full circle, having spent $230 on a new card. My next best guess is that the motherboard (PCI express slot) has some sort of problem. I'll be able to test this in another motherboard I have lying around. If that is stable over the course of a few days, I'll need to spend about $500 on a new MB+CPU+RAM, since my spare motherboard is too under-featured, and a little sluggish (hard disk controller? buggy chipset drivers?).

Ugh, I hate computers.

PS It's a damned good thing I have lots of them, though, or I wouldn't have been able to type this post.

Saturday, August 2, 2008

Phyrangufest


Photographic evidence that yours truly attended the 2008 Phyrangufest, at the Wynkoop brewpub, in downtown Denver! PZ (right) looks like he was caught a little off guard by the spur-of-the-moment photo-op, but that's ok... isn't the giant tapestry in the background the coolest? You should go to the Wynkoop to check it out in person. Oh, and the girl who took this photo is unbelievably hot. Really, she is, but I don't have her picture, so you'll just have to accept my word on blind faith.

Vegan French Onion Dip

I'm not a vegan, but I know lots of them, and vegan cooking interests me just like any other engineering challenge: "Hmmmm..... how can I make something similar to X, which tastes good, and uses no animal products?" I actually think of vegan food as just another nationality of cuisine (yes, I'm aware, there's no country called Veganistan, but damn it, there should be!). Anyway, the following is my second attempt at vegan French onion dip, and it turned out wonderfully.

Vegan French Onion Dip

Ingredients:
  • 1 box of silken soft tofu (12oz). It must say "silken" and it must say "soft". These are two very important properties. I prefer the unrefrigerated Mori-Nu brand, from Cost Plus World Market.
  • 1 heaping half-cup of raw (unroasted) unsalted cashews. I prefer the ones from the Whole Foods bulk bin.
  • 3 tablespoons of freshly squeezed lemon juice.
  • 2 tablespoons of dried onion slivers.
  • 1 pinch of salt (to taste). This can be tweaked at the end.
Instructions:
  1. In a blender (not a food processor), add the tofu, lemon juice, and cashews.
  2. Blend the living hell out of it (you want it to be creamy).
  3. Dump it into a small container, and fold in the salt and the onions.
  4. Let it sit overnight, and give it a stir before serving.
Looking at some other French onion dip recipes on the web, it appears there are many ways to spice up this dip. Feel free to experiment with some or all of these:
  • Worcestershire sauce (if you want it to be purely vegan, you'll have to find a bottle with no anchovies)
  • Soy sauce
  • Miso
  • Garlic
  • Black pepper
  • Chives
It is great on unflavored, unsalted crackers (Triscuits especially), potato chips, and carrots.

Friday, August 1, 2008

Sheesh... I can't believe I started a blog

I am 6-sigma certain that I will not be updating this blog regularly. I don't even know what I'd blog about, or who the heck would read it if I did. My interests are just too damned varied. It couldn't be classified as a science blog, programming blog, DSP blog, philosophy blog, computational biology blog, or cooking blog, since I'd be writing about all of these things, and more. The topics would be so widely varying that it could never be classified under any one topic, and therefore, would probably never attract attention. And yet, here I am, blogging.