
Tuesday, April 14, 2009

Molecular computers -- A historical perspective. Part 1

I've been having discussions lately with Andy regarding biological/molecular computers and these discussions have frequently turned to the history of analog and digital computers as a reference -- a history not well-known by biologists and chemists. I find writing blog entries to be a convenient way to develop bite-sized pieces of big ideas and therefore what follows is the first (of many?) entries on this topic.


In order to understand molecular computers -- be they biological or engineered -- it is valuable to understand the history of human-built computers. We begin with analog computers -- devices that are in many ways directly analogous to most biological processes.

Analog computers are ancient. The oldest surviving example is the astonishing Antikythera Mechanism (watch this excellent Nature video about it). Probably built by the descendants of Archimedes' school, this device is a marvel of engineering that computed astronomical values such as the phase of the moon. It predated equivalent devices by at least a thousand years -- further burnishing Archimedes' already incredible reputation. Mechanical analog computers all work by the now familiar idea of inter-meshed gear-work: input dials are turned and the whirring gears compute the output function by mechanical transformation.


(The Antikythera Mechanism via WikiCommons.)

Mechanical analog computers are particularly fiddly to "program", and especially to "re-program". Each program -- as we would call it now -- is hard-coded into the mechanism; indeed, it is the mechanism. Rearranging the gear-work to represent a new function requires retooling each gear, not only to change the gears' relative sizes but also to keep the wheels from colliding with one another -- everything must be arranged just so.

Despite these problems, mechanical analog computers advanced significantly over the centuries, and by the 1930s sophisticated devices were in use. For example, shown below is the Cambridge Differential Analyzer, which had eight integrators and appears to be easily programmable by nerds with appropriately bad hair and inappropriately clean desks. (See this page for more differential analyzers, including modern reconstructions.)


(The Cambridge differential analyzer. Image from University of Cambridge via WikiCommons).

There's nothing special about using mechanical devices as a means of analog computation; other sorts of energy transfer are equally well suited to building such computers. For example, MONIAC, built in 1949, was a hydraulic analog computer that simulated an economy by moving water from container to container through carefully calibrated valves.


(MONIAC. Image by Paul Downey via WikiCommons)


By the 1930s, electrical amplifiers were being used for such analog computations. An example is the 1933 Mallock machine, which solved simultaneous linear equations.


(Image by University of Cambridge via WikiCommons)

Electronics have several advantages over mechanical implementations: speed, precision, and ease of arrangement. For example, unlike gear-work, electrical computers can have easily re-configurable functional components. Because the interconnecting wires have small capacitance and resistance compared to the functional parts, the operational components can be conveniently rewired without redesigning the physical layout of the mechanism -- that is, unlike gears, wires can easily avoid colliding with one another.

Analog computers are defined by the fact that variables are encoded by the position or energy level of something -- be it the rotation of a gear, the amount of water in a reservoir, or the charge across a capacitor. Such simple analog encoding is very intuitive: more of the "stuff" (rotation, water, charge, etc.) encodes more of the represented variable. For all its simplicity, however, such analog encoding has serious limitations: range, precision, and serial amplification.

All real analog devices have limited range. For example, a water-encoded variable will overflow when the volume of its container is exceeded.



(An overflowing water-encoded analog variable. Image from Flickr user jordandouglas.)

In order to expand the range of variables encoded by such means, all of the containers -- be they cups, gears, or electrical capacitors -- must be enlarged. Building every variable for the worst-case scenario has obvious cost and size implications. Furthermore, such simple-minded containers only encode positive numbers. Encoding negative values requires a sign flag or a second, complementary container; either way, negative numbers significantly reduce the elegance of such methods.
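To make that bookkeeping concrete, here is a toy Python sketch of my own (not a description of any real analog machine): a signed value is stored as a pair of non-negative "containers", each of which saturates at a fixed capacity.

```python
class SignedAnalog:
    """Toy model of a signed analog variable stored as two
    non-negative 'containers': value = positive - negative.
    Each container saturates (overflows) at `capacity`."""

    def __init__(self, capacity=10.0):
        self.capacity = capacity
        self.positive = 0.0
        self.negative = 0.0

    def add(self, amount):
        # Pour into one container or the other depending on sign,
        # clipping at the container's capacity (overflow).
        if amount >= 0:
            self.positive = min(self.capacity, self.positive + amount)
        else:
            self.negative = min(self.capacity, self.negative - amount)

    def value(self):
        return self.positive - self.negative


x = SignedAnalog(capacity=10.0)
x.add(3.0)
x.add(-5.0)       # negative amounts need the complementary container
print(x.value())  # -2.0
x.add(25.0)       # more than the positive container can hold...
print(x.value())  # 5.0, not 23.0 -- the positive container saturated at 10
```

The first result shows why the complementary container is needed at all; the last shows the answer going silently wrong once a container overflows.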

Analog variables also suffer from hard-to-control precision problems. It might seem that an analog encoding is nearly perfect -- for example, the water level in a container varies with exquisite precision, right? While it is true that the molecular resolution of the water in the cup is incredibly precise, an encoding is only as good as its decoding. For example, a water-encoded variable might use a small pipe to feed the next computational stage, and as the last drop leaves the source reservoir a meniscus will form due to water's surface tension; the quantity of water passed to the next stage will therefore differ from what was stored in the prior stage. This is but one example of many such real-world complications. Electrical devices, for instance, suffer from thermal effects that limit precision by adding noise. Indeed, the faster one runs an electrical analog computer, the more heat is generated and the more noise pollutes the variables.


(The meniscus of water in a container -- one example of the complications that limit the precision of real-world analog devices. Image via WikiCommons).

Owing to such effects, the precision of all analog devices is usually much less than one might intuit. The theoretical limit on precision is given by Shannon's formula: the precision (the amount of information encoded by the variable, measured in bits) is log2(1 + S/N), where S is the signal power and N is the noise power. It is worth understanding this formula in detail, as it applies to any sort of information storage and is therefore just as relevant to a molecular biologist studying a kinase as it is to an electrical engineer studying a telephone.
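To make the formula concrete, here is a minimal Python sketch (my own illustration; the signal-to-noise figures are invented) that converts a signal-to-noise power ratio into the number of usable bits:

```python
import math

def analog_precision_bits(signal_power, noise_power):
    """Shannon's limit on the bits of information an analog variable
    can carry, given signal and noise power in the same units."""
    return math.log2(1 + signal_power / noise_power)

# A rather good analog device: signal 1000x stronger than its noise floor.
print(analog_precision_bits(1000, 1))  # ~9.97 bits -- about 3 decimal digits

# A noisy device: signal only 10x the noise.
print(analog_precision_bits(10, 1))    # ~3.46 bits -- barely one decimal digit
```

Even a respectable 1000:1 signal-to-noise ratio buys only about ten bits of precision -- far less than intuition about "continuous" quantities suggests.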

.... to be continued.

Tuesday, March 10, 2009

Workshop demo, Kitchenette framing, fence



"Yeah, those aren't necessary", says Bruce as he cuts them out. (He's the one who put them there in the first place. :-)



After duct and electrical rearrangements, leaving only the pipe to be re-routed.



Having two excellent professional carpenters around sure boosts the productivity. We began the morning by demolishing part of the chase that is adjacent to the workshop wall, where we are opening space for a new set of tool drawers and storage. This required moving around a few supports, ducts, and electrical boxes (my job). There's a big sewage pipe in the middle of this which will be re-routed when the plumber comes on Thursday. Then Bruce and Kurt framed out the wall where the kitchenette will go, and we went over to a discount appliance store and purchased the microwave/vent hood, half-size dishwasher, and gas cook top, which look pretty nice for a reasonable price of about $1100. Then I worked for a few hours on revisions to Andy's paper, and after an hour at the gym I managed to get 4 of the 6 stringers up on the back fence.

Monday, March 9, 2009

Fence line, Bed spreads, Screening, Sophia Collier, and a paper for the Royal Society


Today was an oddly productive day. In the morning I dug post holes for a last bit of fence line that will separate my utility yard from my back yard. Then Bruce came over to take measurements for the screening for the upstairs porch. Then Amberlee came over and we had a little Christmas where we opened all the packages that she had ordered for me -- a new bed headboard, duvet, sheets, and pillows! For lunch I had a marvelous time meeting with Sophia Collier, who was introduced to me by my attorney. Sophia and I were apparently born with the same mutant genes; we both left school (I in 11th grade and she in 12th) and went off to various other endeavors (although hers have been generally more profitable than mine!) After a career in such things as soda and mutual fund management, she's now into CNC artwork. She has a CNC mill, 3D software, and a lot of fun ideas. We geeked out for hours on art and science projects of all kinds, and she gave me much valuable feedback on my various forthcoming enterprises. After lunch I set the fence posts and poured the footer concrete, and then started on the rewrite of the paper I've been writing with Andy Ellington for the Royal Society journal Interface, which came back with deservedly so-so reviews and which, as so often seems to be the case with peer-reviewed journals, is forcing a rewrite that will no doubt result in a better paper.

Tuesday, January 27, 2009

Protocells Book


The book "Protocells" came out recently in which my friends Jeff Tabor, Matt Levy, Andy Ellington and myself have an paper entitled "Tragedy of the Molecular Commons". Check out the high-quality binding from MIT press. Angel pointed out that it's a convertible book: either hardback or paperback! There's many interesting articles in the book and therefore provides further evidence that you shouldn't judge a book by its (unglued) cover.

Thursday, January 1, 2009

Paper


Been working all week with Andy, Xi Chen, and Nam on a paper. Using the kinetics from Jongmin Kim's bi-stable switch paper, Nam produced a nice simulation of the amorphous ring oscillator. Happily, these images look much like my earlier, cruder simulation but now have dimensions: features are measured in mm and time in hours. I think that's pretty cool -- a molecular-scale device producing features at the mm scale. It would be great if it actually works when we try it someday!



Also for this paper, Andy, Xi Chen, and I came up with a hopefully plausible complementary transcriptional NAND gate. The idea is that all signals are encoded by the sense and anti-sense complements of an RNA sequence. For example, signal "A" is high when some specific RNA sequence is high, and it is low when the anti-sense of that sequence is high. The hypothetical gate is made from two complementary promoters on opposite sides of a double-stranded DNA. On the left side, two molecular-beacon-like devices sequester half of a promoter that activates only when both inputs are high. On the right side, a single hairpin is folded such that a promoter is normally active but is deactivated when A and B invade (thanks Xi Chen). To work, the kinetics will have to be very delicately balanced, so maybe it won't work well, but at least it's a conceptual step in the right direction; we've been talking about a CMOS analog for years now and this is the first time we've made any conceptual progress.
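For readers more comfortable with logic than with RNA, here is a toy Python sketch of the encoding idea only -- my own illustration, saying nothing about the actual transcriptional kinetics. Each signal is a complementary pair (sense, anti-sense), exactly one of which is high, and the gate outputs the NAND of its two inputs in the same dual-rail form so it can feed further gates:

```python
def encode(bit):
    """Dual-rail encoding: a logical bit becomes (sense, anti_sense),
    exactly one of which is high (True)."""
    return (bit, not bit)

def decode(rail):
    sense, anti_sense = rail
    assert sense != anti_sense, "exactly one strand should be high"
    return sense

def nand_gate(a_rail, b_rail):
    """Logical behavior of the proposed gate: the output goes low only
    when both A and B are high, and is itself dual-rail so it can drive
    more gates of the same kind."""
    a, b = decode(a_rail), decode(b_rail)
    return encode(not (a and b))

# Print the NAND truth table:
for a in (False, True):
    for b in (False, True):
        print(a, b, "->", decode(nand_gate(encode(a), encode(b))))
# False False -> True
# False True -> True
# True False -> True
# True True -> False
```

One nice property of the complementary encoding is that logical NOT comes for free: it is just swapping the sense and anti-sense strands of a signal.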

Tuesday, December 30, 2008

Finished door panel prototype


Actually I spent most of the day working on a paper with Andy but there's no cool picture for that. Afterwards I finished the door panel prototype. I think they look pretty good but they were a real pain in the ass. I'd like to do it all over the house but I think I'll wait until I have access to a large mill.

Friday, December 19, 2008

Science article



The journal Science wrote an article about me (free registration required) in this week's edition. It's not bad -- at least it gets all the facts right, which, judging by numerous previous experiences, is a real accomplishment in journalism; kudos to the author, Mitch Leslie. The article is a "Curious Character" kind of story, which is a relief as I feared it would be a "Man loses legs, runs marathon" story. Unfortunately it has only a touch of what I hoped for, which was a "Want to get into science from the outside? You can! This guy did" story.

I get asked about my odd, non-academic entry into the world of science all the time. And I very much hope that my example demonstrates that if you dream of playing the ultimate nerd sport of pure science research, then just do it. Not only is it possible to enter the so-called ivory towers from the outside, it was easier than I ever imagined. My outsider’s knowledge base was both sufficient and valuable. When I got into science I thought it would take a long time before I could contribute anything. I was pleased to quickly realize that I had wildly underestimated what my contributions could be.

My entry story boils down to this. I went down to UT, talked to a graduate adviser who gave me the party-line ("first get a GED, then get an undergraduate degree, then ... "). As I left that adviser's office, discouraged, I asked for a name of a professor who might be into certain subjects and he mentioned Edward Marcotte's name. I took Edward to lunch and we became instant friends because we share a huge enthusiasm for all things nerdy. After hours of geeking-out together he asked: "So what do you want to do?" and I said, "I don't know, just hang out and learn stuff." "Cool," he replied, "there's a desk. Meetings are on Fridays".

That's really all there was to it. I started hanging out in his lab and everybody seemed to assume I was a postdoc. Before long I had met several other professors, and within weeks I was working on more projects than I will be able to finish in my lifetime. It wasn’t long before people were making job offers. While this episode might be a rare event based on the meeting of two like minds, I think it says something about the refreshingly open culture of science. Don't get me wrong, academic science is a human endeavor with human feelings of territorialism, etc., but in comparison to many other fields it deserves credit for being fairly open-minded and meritocratic. After all, science is the ultimate nerd pursuit -- and nerds, as a stereotype, value technical achievement over prestige (not all, but many). Still, contrast it to walking into the similarly nerdy engineering department of a major corporation, say Boeing or GM, and telling someone that you just wanted to "hang out". Even if you found a friend in the company, it wouldn't be long before a higher-up manager suspected you of being a corporate spy and wanted you either to join the company or get out, all while threatening your friend with NDA violations.

Part of the openness of academia lies in the simple fact that a university is not a chartered feudal hierarchy but rather a coalition of independent lords with a governing body. (I suspect this is not so much an analogy as it is that English academia was actually modeled after the post-Magna Carta arrangement of free, independent lords under royal patronage.) Thus, a tenured professor or "principal investigator" (PI) such as Edward runs his lab however he sees fit -- constrained only by safety, morality, and money. That said, there are standard working procedures: undergraduates become graduate students become doctors become post-docs become professors. So, while it is very abnormal for an outsider like me to just show up out of nowhere, the system is refreshingly tolerant of such an entry.

While writing this story, the author, Mitch, called my friend Professor John Davis of the EE Department. John told me that Mitch asked: “So should we be looking for more Zacks, or is he totally unique?” I said to John, “I hope you replied that there are lots more Zacks in the world!” John fell silent. “Oh no!” I exclaimed. Just among my own friends I’ve already gotten three people to come into the system in ways somewhat analogous to my own entry. Thomas -- game programmer now working on molecular simulators for two labs. Mark -- game programmer and self-taught organic chemist working in another lab. Steve -- playwright turned biotech entrepreneur, about to be employed by the Center for Systems and Synthetic Biology. If three of my small circle of friends have been inspired to get into science in just five years, then there must be tens of thousands of other outsider-nerds waiting to be recruited! It’s a vast would-be nerd conspiracy! The only thing I hoped for from this article was that such people would be inspired to action if they so choose, and I'm not sure that came across.

I’ve made this argument about my entry and non-uniqueness to several “insider” friends and I keep getting the same response: “But Zack, you’re so smart.” I find this response psychologically interesting. I can’t help but think that my insider friends find it easier to explain me as a freak of nature than to admit that all the expense and work they went through to get into their positions could be so easily bypassed. Of course, they know well that I studied just as hard as they did to get where I am. I wasn’t born knowing things any more than they were. But there is a difference in our paths -- I never did even one second of work I didn’t want to do, while many of my grad-student friends frequently (and somewhat hyperbolically) complain of being treated like slaves. So, yes, I’m smart; but I’m no smarter than my PI friends such as Edward, John, or Andy.

Indeed, Edward and I form an almost perfect experiment and control. Edward and I are freakishly similar. We are both high-functioning mildly autistic. We have eerily similar responses to many stimuli and have very similar temperaments. We both hate being told what to do. The only really significant difference in our skills is that I have dyslexia and he has whatever the opposite of that would be called (“superlexia”?). He can read 20 papers in the time it takes me to read 1. We both went to bad public high-schools although his was slightly better than mine. Had my school been a little bit better or his a little bit worse, we could easily have ended up on the other one’s trajectory. What’s different about Edward and me is mostly the path we took, not our natures. And it is why we work so well together – because we have different points of view but backed with the same intelligence and enthusiasm.

People (such as my own family) often frame my story as success “despite” dropping out of school. I find this highly prejudiced. Nobody ever seems to consider that I succeeded “because” I dropped out of school. It seems to me that our society treats school as a kind of magical elixir -- a cure for whatever ails ‘ya. Poor and disadvantaged? School! Rich and spoiled? School! Curious? School! Bored? School! Let me be clear -- universal access to school is one of the greatest and most important accomplishments of our civilization. I am not dismissing the wonderful contribution of formal education to the world. That said, school is not a cure-all. It is not the perfect path for everyone’s journey. To make my case, let me point out some of the advantages of my path.

First, my natural temperament is to resist doing anything I’m told to do. My mother claims I’ve been like this since I was born and that parenting me was an exercise in making me believe that things in need of doing were my idea. So getting out of school took away all of this unnecessary friction. (One can argue that I should have “just gotten over” that stubborn streak and I’d counter that if school cures whatever ails ‘ya then why didn’t it “fix” that?)

Second, by entering the workforce at 17, I started saving money very early and the compound interest on that savings is significant. While my friends went into debt to educate themselves (some are still paying those debts), I was being *paid* to educate myself. At 38 I’m in a much better financial position than my friends who went through school and that affords a lot more options such as, but not limited to, hanging out in labs, making artwork, and building pretty houses.

Third, I arguably have a superior education -- after all, I had a student-to-teacher ratio of one to one! While they sat in big anonymous classes, I sat on the porches and couches of those same professors’ homes. All my teachers were my friends; they didn’t teach me because it was part of an institutional compact, but rather because that’s what friends do -- they hang out, they share ideas, the older ones impart knowledge to the younger while the younger impart enthusiasm to the older. That bond of friendship is much stronger than the one between a professor and a student, and thus the two-way street of care and respect that is the magic of education is more robust when it is spontaneous and voluntary.

Fourth, I never did anything I didn’t want to do. I never did someone else’s dirty work. I didn’t take any retrospectively useless classes. I didn’t worry about my grades. I didn’t suck up to any professors. I didn’t have to prove myself to arbitrary gatekeepers. I wasn’t told what to learn and, more importantly, I wasn’t told what not to learn. Someone once told me that I “owned” my knowledge while others seemed to “borrow” theirs; while I think that is overstating it, whatever truth there is in that statement comes from my having constructed the learning path myself.

Fifth, I ended up with a broad knowledge base. My knowledge of any one field is certainly shallower than my friends’ knowledge of their respective fields, but I have a passing knowledge of many more fields. Grad school is very narrowly focused, and consequently it seems to me that it is as much about indoctrination as it is about education.

The world needs lots of people who have deep, penetrating knowledge of their subjects. The world also needs people who have broad but consequently shallower views of many subjects, so that they can help bridge those subjects. The educational system produces many of the first type but few, if any, of the second. Indeed, this gets me back to my thesis: I think my utility, my success, is *because* I didn’t go to school, not despite it. Outsider opinions are necessary and valuable; they, ipso facto, don’t come from inside the system.