
Thursday, June 17, 2010

Video-wiki documentaries



Although I haven't played with it yet, now that Youtube has cloud editing, I predict that video-based "wiki" documentaries will become a very cool new form of media.

I propose that a particularly good genre to start with is History. For example, start with a film of a lecture by an amateur but good historian (I was just talking to my 7th grade history teacher Jerry Buttrey about this, this morning). Others could later contribute source material as it becomes available. For example, someone might live near a battle site and have footage of it. Someone else might live near a library where they can get images of documents and interviews with associated scholars. Someone else might have artifacts handed down from family members. It's easy to see how a straightforward talking-head lecture could be edited over time with more and more cuts to such external video shots with the lecture as voice-over, and from there the narrative might be interrupted with other interviews -- mimicking the life-cycle of a typical Wikipedia article.

A particularly good company to sponsor such activity would be the exceptionally high quality "The Teaching Company", whose lectures I've enjoyed for a long time. They might be tempted to view such amateur media as competition to their products, but I think the opposite is true. If they sponsored such endeavors (for example, by making a call for participation via their existing client base), I bet that they could increase their sales on related subjects, as they'd tap into the social network of each project, and with some clever marketing they could push their associated wares to a very receptive narrow market.

Finally, the very act of contributing to such a documentary, even if it's just going to a field and shooting a few seconds of video, would be a great way to engage pupils of all ages in history classes. I, for one, enjoyed our field trips much more than sitting in class, and had I had an active reason to collect documentation, the trips would have been even more memorable.

Although I'm probably not going to make any of these forthcoming video-wikimentaries, I look forward to watching them.

Sunday, January 10, 2010

Camera ethernet


One of the drugs I'm on, or the combination of them (bupropion and Celexa), induces very vivid dreams. The other night I dreamed about allowing laptop computers, which now often have built-in cameras, to communicate with each other by flashing their screens at each other. If the 640x480 cameras ran at 30 fps at, say, 50% efficiency then you might be able to achieve 30*(640*480)/4*8/2 ≈ 9 Mbits/sec, which is about the bandwidth of first-generation Ethernet. (Although realistically I'd be impressed if you got 1 Mbps.) Implementing this might be a fun student programming assignment.
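
In case anyone wants to try the student assignment, here is a minimal sketch of the framing half of the problem, assuming the interpretation behind the arithmetic above: each 2x2 block of pixels carries one byte per frame (that's the /4 and *8), and half the raw rate is lost to overhead (the /2). There's no camera in this toy -- the "captured" frame is just the transmitted frame -- so all the hard parts (synchronization, focus, lighting, error correction) are left out.

```python
# Toy framing for a screen-to-camera link: each 2x2 pixel block carries one
# byte per frame as a gray level. There is no real camera here -- the
# "captured" frame is just the transmitted frame -- so this only shows the
# encoding and the throughput arithmetic, not sync, focus, or error correction.

WIDTH, HEIGHT, FPS, EFFICIENCY = 640, 480, 30, 0.5
BLOCKS = (WIDTH // 2) * (HEIGHT // 2)            # one byte per 2x2 block

def encode_frame(payload):
    """Paint each payload byte into a 2x2 block of identical pixels."""
    frame = [[0] * WIDTH for _ in range(HEIGHT)]
    for idx, byte in enumerate(payload[:BLOCKS]):
        bx, by = (idx % (WIDTH // 2)) * 2, (idx // (WIDTH // 2)) * 2
        for dy in (0, 1):
            for dx in (0, 1):
                frame[by + dy][bx + dx] = byte
    return frame

def decode_frame(frame, nbytes):
    """Recover bytes by averaging each 2x2 block (trivial here, noisy in real life)."""
    out = []
    for idx in range(min(nbytes, BLOCKS)):
        bx, by = (idx % (WIDTH // 2)) * 2, (idx // (WIDTH // 2)) * 2
        block = [frame[by + dy][bx + dx] for dy in (0, 1) for dx in (0, 1)]
        out.append(sum(block) // 4)
    return bytes(out)

message = b"super nerdy dreams " * 100
assert decode_frame(encode_frame(message), len(message)) == message
print(f"theoretical throughput: {FPS * BLOCKS * 8 * EFFICIENCY / 1e6:.1f} Mbit/s")  # ~9.2
```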

(Yes, it's true, I have super nerdy dreams! What did you expect?)

Saturday, January 2, 2010

Full scale automaton


British automaton, from Ray Bate's reconstruction

Torre dell’Orologio, Venice, from flickr user kukudrulu

Last night I had a beautiful dream where I was in a huge building that was filled with an elaborate automaton -- gears, levers, etc. Part of the machine had real people acting like the characters of a traditional automaton, in old costumes such as might have been worn by the little figurines decorating a medieval automaton church clock. I think it would be a beautiful piece of theater to make a set like this where the participants come in, explore the space, and flip levers and knobs causing the actors to animate -- perhaps interacting in a full-scale puzzle game where there's some sort of order-of-operations problem to be solved by the group.

Friday, May 15, 2009

Idea: Cut healthcare costs? Reduce the patent duration.



Brooks has a good essay today about the underwhelming proposed health care cost-cutting measures. I agree that none of the proposed changes sound like enough to take a reasonable bite out of our growing health care costs, and I doubt that for such a big problem there exist many easy fixes. But there is one very easy fix that would have a huge impact -- cut patent durations from 20 years to, say, 10. Of course, innovating companies will hate the idea of shortening their patents and boring-old manufacturers will love it, but I guarantee that 10 years from now there will be an incredible drop in drug prices.

We have a fundamental problem that no one wants to admit: until some revolution in drug development takes place (e.g., if it turns out that siRNAs are a magic bullet), we simply cannot have guns, butter, and bandages -- at least we can't have every newfangled "bandage" being made at such an incredible pace.

We have an impossible expectation for our health care that we don't have for any other sector of our economy. We simultaneously want the free market to invent new treatments on a for-profit motive and then we want everyone to have access to the result. In contrast, we don't expect every driver in the country to have access to a Lamborghini just because Lamborghinis exist. We don't expect everyone to have access to the latest iPhone gadget just because it exists. But we do expect -- for good ethical and moral reasons -- that everyone should have access to whatever the latest, best treatments are. While this expectation is understandable, it's nevertheless schizophrenic: "Pharma: go be innovative, invest a lot of money to make amazing drugs! Oh my god, why are they so expensive?" We don't say: "Apple: go be innovative, invest a lot of money to make amazing phones! Oh my god, why are they so expensive?" (Actually some people do, but most just recognize that if the phone is too expensive they'll just do without.)

Health care is always going to involve an insurance middleman, be it private, public, or all-messed-up-in-between as it is now. So health care will always be a collective venture. It is simply irrational to expect that we can collectively afford every possible innovation, just as it would be irrational to expect that we could all collectively own the latest iPhone gadgets. Thus, the systemic way to change the collective system is to simply lower the profit bar. And this can be done by changing one simple variable: the duration of patents. Make patents last 10 years and drug companies won't build as many expensive drugs and, yes, more people will die of things that could have been prevented. But recognize that this is already the case! The 20-year limit is totally arbitrary. Had it been set at, say, 30 years, then there would exist, right now, more amazing but even more expensive drugs, and therefore, because the number is set at 20 and not 30, we are already "heartlessly" letting people go untreated because of an arbitrary number. The number has been changed before (upwards) and we can change it again, downwards -- at least for drugs -- if we collectively choose to. It's the only "easy" fix.

Saturday, May 2, 2009

Vaccinate your child or gramps gets it in the stomach!

There seems to be a growing ignorance about vaccination. From my informal queries of friends and acquaintances who have chosen not to immunize their children or who do not get flu vaccines, I have found that few people understand that vaccination is part of a greater social compact, not merely a personal cost/benefit analysis. The effect is called Herd Immunity. When we vaccinate ourselves, and especially our children, we are adding to the communal common defenses. Obviously, everyone would like to have a defensive wall built to protect a community, yet everyone would prefer not to contribute. But that's not the way a good society works; we share the costs of doing things that benefit the common good.
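
For the quantitatively inclined, the usual back-of-the-envelope way to state herd immunity is this: once more than roughly 1 - 1/R0 of the herd is immune (where R0 is the number of people each case would otherwise infect), an outbreak can't sustain itself. The R0 values below are commonly cited ballpark figures, used only for illustration.

```python
# Back-of-the-envelope herd immunity: an outbreak stalls once the immune
# fraction exceeds roughly 1 - 1/R0, where R0 is how many people each case
# would infect in a fully susceptible population.
def herd_immunity_threshold(r0):
    return 1.0 - 1.0 / r0

# Ballpark R0 values, for illustration only.
for disease, r0 in [("seasonal flu", 1.3), ("mumps", 5.0), ("measles", 15.0)]:
    print(f"{disease:12s} R0 ~ {r0:4.1f} -> ~{herd_immunity_threshold(r0):.0%} must be immune")
```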

Immunization of children is particularly important for two reasons: 1) Children's immune systems respond to vaccination much more effectively than those of others, especially the elderly, who are the most likely to die of viral diseases such as influenza. 2) Children are responsible for much of the transport of viruses throughout a community, owing to their mobility and lack of hygiene.

For example, a controlled experiment conducted in 1968 by the University of Michigan demonstrated that large-scale vaccination of children conferred a 2/3 reduction in influenza illnesses to all age groups. For a nice article on the subject, here's a Slate article from 2008.

I propose an old-fashioned poster campaign to inform people about the social benefits of vaccination. Here are a couple of prototype posters I photoshopped up this afternoon. (Apologies to Norman Rockwell!)






(Original photo Adam Quartarolo via WikiCommons)

Thursday, April 23, 2009

BSTQ - Bull Shit Tolerability Quotient

There are many traits that determine someone's performance in various social settings such as school, work, the military, etc. A popular metric correlated with "success" in such social systems is the "Intelligence Quotient" (IQ), which purports to measure elements of abstract intelligence. Another metric that has gained popularity is the "Emotional Intelligence Quotient" (EIQ), which purports to measure the ability to perceive and manage emotions in oneself and others. Both of these metrics claim a high correlation with success in the aforementioned social institutions.

I submit that success in roles within said social systems -- student, factory worker, warrior, etc. -- requires a high tolerance of activities such as: implementing poorly articulated tasks, engaging in inane conversations, attending pointless engagements, and other time-wasting activities known informally as "Bull Shit" (BS). The ability to tolerate such BS is a very important trait that is not normally rigorously evaluated.

I propose a simple test to measure an individual's tolerance for BS: a list of increasingly inane questions and pointless tasks is given to the test taker. For example, the test might begin with questions like: "Fill in the blank: Apples are __ed" and end with stupendously pointless tasks such as "Sort these numbers from least to greatest" followed by several hundred ~20 digit numbers, with the next task saying: "Now randomize those same numbers". The Bull Shit Tolerability Quotient (BSTQ) would ignore the given answers and simply count the number of questions the test taker was willing to consider before handing the test back in frustration and declaring: "This is Bull Shit!"

If a formal BSTQ test is not available, most standardized academic tests can be used as a reasonable substitute. However, the dynamic range of such generic academic tests to measure BSTQ is low. In other words, only extreme low-scorers of a proper BSTQ test will be measurable via the number of unanswered questions on a standard academic test used as a BSTQ surrogate. Extreme caution must be used when interpreting an academic test as a BSTQ analog -- the test giver may misinterpret the number of unanswered questions as the result of the test taker's low knowledge of the test's subject matter instead of as a spectacularly low BSTQ score.

BSTQ tests can easily be made age independent. For pre-verbal children the test would involve increasingly inane tasks such as matching sets of colored blocks to colored holes and so forth. The test would simply measure how many of these tasks the pre-verbal child could engage in before he or she became irritated or upset with the examiner.

Like the IQ and EIQ, I suspect that the BSTQ will be correlated with the degree of success within many social endeavors, school in particular; however, I also suspect that there is a substantial fraction of the population with an inverse correlation between their IQ and their BSTQ scores. Of these, of particular interest are those with high IQ and low BSTQ. I would not be surprised if the population of people rated by their co-workers as "indispensable" is significantly enriched for individuals with a high IQ / low BSTQ score. Finally, I submit that these individuals are severely under-served by the educational system, which demands -- indeed glorifies -- extremely high BSTQ, especially among those with high IQ.

Adding a BSTQ evaluation for pre-academic children might suggest which students would excel in a non-traditional educational environment where students are allowed to select their own agendas and tasks. A very low BSTQ coupled with a very high IQ would seem to almost guarantee rebellion if a traditional educational approach is applied. Identifying individuals with exceptionally high IQ scores and exceptionally low BSTQ scores may be a valuable tool to prevent the mis-classification of such students as "trouble makers" and instead correctly classify them as "potential indispensable iconoclasts".


(This idea evolved from lunch discussion with Marvin today, so thanks Marvin!)

Tuesday, April 7, 2009

The 21st Century Chemical / Biological Lab.

White Paper: The 21st Century Chemical / Biological Lab.

Electronic and computer engineering professionals take for granted that circuits can be designed, built, tested, and improved in a very cheap and efficient manner. Today, the electrical engineer or computer scientist can write a script in a domain specific language, use a compiler to create the circuit, use layout tools to generate the masks, simulate it, fabricate it, and characterize it all without picking up a soldering iron. This was not always the case. The phenomenal tool-stack that permits these high-throughput experiments is fundamental to the remarkable improvements of the electronics industry: from 50-pound AM tube-radios to iPhones in less than 100 years!

Many have observed that chemical (i.e. nanotech) and biological engineering are to the 21st century what electronics was to the 20th. That said, chem/bio labs – be they in academia or industry – are still in their “soldering iron” epoch. Walk into any lab and one will see every experiment conducted by hand, transferring micro-liter volumes of fluid in and out of thousands of small ad-hoc containers using pipettes. This sight is analogous to what one would have seen in electronics labs in the 1930s – engineers sitting at benches with soldering iron in hand. For the 21st century promise of chem/nano/bio engineering to manifest itself, the automation that made large-scale electronics possible must similarly occur in chem/bio labs.

The optimization of basic lab techniques is critical to every related larger-scale goal, be it curing cancer or developing bio-fuels. All such application-specific research depends on experiments, and therefore reducing the price and duration of such experiments by large factors will not only improve efficiency but also make possible work that previously was not feasible. While such core tool paths are not necessarily “sexy”, they are critical. Furthermore, a grand vision of chem/bio automation is one that no single commercial company can tackle, as such a vision requires both a very long time commitment and a very wide view of technology. It is uniquely suited to the academic environment as it both depends upon and affords cross-disciplinary research towards a common, if loosely defined, goal.

Let me elucidate this vision with a science-fiction narrative:

Mary has a theory about the effect of a certain nucleic acid on a cancer cell line. Her latest experiment involves transforming a previously created cell line by adding newly purchased reagents, an experiment that involves numerous controlled mixing steps and several purifications. In the old days, she would have begun her experiment by pulling out a pipette, obtaining reagents from the freezer, off of her bench, and from her friend's lab, and then performing her experiment in an ad hoc series of pipette operations. But today, all that is irrelevant; today, she never leaves her computer.

She begins the experiment by writing a protocol in a chemical programming language. Like the high-level languages used by electrical and software engineers for decades, this language has variables and routines that allow her to easily and systematically describe the set of chemical transformations (i.e. “chemical algorithms”) that will transpire during the experiment. Many of the subroutines of this experiment are well established protocols such as PCR or antibody separation, and for those Mary need not rewrite the code but merely link in the subroutines for these procedures, just as a software engineer would. When Mary is finished writing her script, she compiles it. The compiler generates a set of fluidic gates that are then laid out using algorithms borrowed from integrated circuit design. Before realizing the chip, she runs a simulator and validates the design before any reagents are wasted – just as her friends in EE would do before they sent their designs to “tape out.” Because she can print the chip on a local printer for pennies, she is able to print many identical copies for replicate experiments. Furthermore, because the design is entirely in a script, it can be reproduced next week, next year, or by someone in another lab. The detailed script means that Mary’s successors won’t have to interpret a 10-page hand-waving explanation of her protocol translated from her messy lab notes in the supplementary methods section of the paper she publishes – her script *is* the experimental protocol. Indeed, this abstraction means that, unlike in the past, her experiments can be copyrighted or published under an open source license just as code from software or chip design can be.
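
To make this concrete, here is roughly what I imagine Mary's script looking like. Everything in this sketch is hypothetical -- the Protocol class, the pcr subroutine, the toy "simulator" -- it is meant to suggest the flavor of such an embedded language, not describe any real tool. (The trace call is the chemical analog of the software "debug/trace" statement mentioned again below.)

```python
# A hypothetical "chemical programming language" embedded in Python. A Protocol
# records abstract fluidic operations; "compiling" here just means flattening
# them into gate-level steps, and "simulating" just checks that chamber volumes
# balance before any real reagent is committed.

class Protocol:
    def __init__(self, name):
        self.name, self.steps = name, []

    def mix(self, dest, sources_ul):              # sources_ul: {reagent: microliters}
        self.steps.append(("mix", dest, dict(sources_ul)))
        return dest

    def incubate(self, sample, minutes, celsius):
        self.steps.append(("incubate", sample, minutes, celsius))
        return sample

    def trace(self, sample, channel):             # the "debug/trace statement" analog
        self.steps.append(("readout", sample, channel))

def pcr(protocol, template, primers, cycles=30):
    """A linked-in library subroutine, analogous to reusing proven code."""
    rxn = protocol.mix("pcr_reaction", {template: 2, primers: 1, "master_mix": 22})
    for _ in range(cycles):
        protocol.incubate(rxn, 0.5, 95)           # denature
        protocol.incubate(rxn, 0.5, 55)           # anneal
        protocol.incubate(rxn, 1.0, 72)           # extend
    return rxn

def simulate(protocol, max_chamber_ul=50):
    """Toy validation pass: catch chamber overflows before 'tape out'."""
    for step in protocol.steps:
        if step[0] == "mix" and sum(step[2].values()) > max_chamber_ul:
            raise ValueError(f"{step[1]}: chamber overflow in {protocol.name}")
    return True

p = Protocol("transform_cell_line")
product = pcr(p, template="construct_17", primers="primer_set_A")
p.mix("transformation", {product: 5, "cell_line_3": 20, "buffer": 10})
p.incubate("transformation", 45, 37)
p.trace("transformation", channel="fluorescence")
simulate(p)                                       # validate before printing the chip
print(len(p.steps), "gate-level steps ready for layout")
```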

Designing and printing the chip is only the first step. Tiny quantities of specific fluids need to be moved into and out of this chip – the “I/O” problem. But Mary’s lab, like any, both requires and generates thousands of isolated chemical and biological reagents, each of which has to be stored separately in a controlled environment and must be manipulated without risking cross-contamination. In the old days, Mary would have used hundreds of costly sterilized pipette tips as she laboriously transferred tiny quantities of fluid from container to container. Each tip would be wastefully disposed of despite the fact that only a tiny portion of it was actually contaminated – such was the cost when everything had to be large enough to be manipulated by hand. In the old days, each of the target containers – from large flasks to tiny plastic vials – would have had to be hand-labeled, resulting in benches piled with tiny cryptic scribbled notes and all of the confusion and inefficiency that results from such clutter. Fortunately for Mary, today all of the stored fluids for her entire lab are maintained in a single fluidic database; she never touches any of them. In this fluidic database, a robotic pipette machine addresses thousands of individual fluids. These fluids are stored inside tubes that are spooled off of a single supply and cut to length and end-welded by the machine as needed. Essentially, this fluidic database has merged the concepts of “container” and “pipette” – it simply partitions out a perfectly sized container on demand, and therefore the consumables are cheaper and less wasteful. Also, the storage of these tube-containers is extremely compact in comparison to the endless bottles (mostly filled with air) that one would have seen in the old days. The fluid-filled tubes can be simply wrapped around temperature-controlled spindles and, just like an electronic database or disk drive, the system can optimize itself by “defragmenting” its storage spindles, ensuring there’s always efficient usage of the space. Furthermore, because the fluidic database knows the manifest of its contents, all reagent accounting can be automated and optimized.
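
Here is the kind of bookkeeping I imagine behind that fluidic database -- again entirely hypothetical, just one way the container-equals-pipette idea and the "defragmenting" could be tracked in software:

```python
# A toy model of the "fluidic database": reagents live as cut-to-length tube
# segments on temperature-controlled spindles, and the database allocates,
# retires, and defragments segments much like a disk drive manages blocks.

class Spindle:
    def __init__(self, name, capacity_mm):
        self.name, self.capacity_mm = name, capacity_mm
        self.segments = []                          # list of (reagent, length_mm, live)

    def free_mm(self):
        return self.capacity_mm - sum(length for _, length, _ in self.segments)

class FluidDatabase:
    MM_PER_UL = 2.0                                 # assumed tube geometry

    def __init__(self, spindles):
        self.spindles = spindles

    def store(self, reagent, volume_ul):
        """Cut and weld a fresh segment just big enough for this reagent."""
        need = volume_ul * self.MM_PER_UL
        for s in self.spindles:
            if s.free_mm() >= need:
                s.segments.append((reagent, need, True))
                return (s.name, len(s.segments) - 1)
        raise RuntimeError("all spindles full -- defragment or add capacity")

    def dispose(self, address):
        spindle_name, idx = address
        s = next(sp for sp in self.spindles if sp.name == spindle_name)
        reagent, length, _ = s.segments[idx]
        s.segments[idx] = (reagent, length, False)  # dead segment, space not yet reclaimed

    def defragment(self):
        """Reclaim dead segments, like compacting a disk."""
        for s in self.spindles:
            s.segments = [seg for seg in s.segments if seg[2]]

db = FluidDatabase([Spindle("4C_spindle", 5000), Spindle("-20C_spindle", 5000)])
addr = db.store("anti-GFP antibody", 120)
db.dispose(addr)
db.defragment()
print(db.spindles[0].free_mm(), "mm free after compaction")
```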

Mary has her experiment running. But moving all these fluids around is just a means to an end. Ultimately she needs to collect data about the performance of her new reagent on the cancer line in question. In the old days, she would have run a gel, used a fluorescent microscope, or applied any number of other visualization techniques to quantify her results – and any of these measurements would have required a large and expensive machine. But today, most of these measurements are either printed directly on the same chip as the fluidics using printable chemical / electronic sensors, or, for those that can’t be printed, interfaced to a standardized re-usable sensor array. The development of those standards was crucial to the low capital cost of her equipment. Before far-sighted university engineering departments set those standards, each diagnostic had its own proprietary interface and therefore the industry was dominated by an oligopoly of several companies. But now, the standards have promoted competition and thus the price and capabilities of all the diagnostics have improved.
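
That "standardized re-usable sensor array" might amount to nothing more exotic than an agreed-upon interface that any diagnostic vendor implements. A hypothetical sketch (all of the names below are invented):

```python
# A hypothetical standard interface that any diagnostic vendor could implement,
# so a printed chip can talk to whichever fluorescence or antibody reader
# happens to be on the bench.

from abc import ABC, abstractmethod

class ChipSensor(ABC):
    channel: str                                  # e.g. "fluorescence", "antibody"

    @abstractmethod
    def read(self, well_id: str) -> float:
        """Return one calibrated measurement for the named well."""

class FluorescenceReader(ChipSensor):
    channel = "fluorescence"
    def read(self, well_id: str) -> float:
        return 0.42                               # stand-in for real hardware I/O

class AntibodyDetector(ChipSensor):
    channel = "antibody"
    def read(self, well_id: str) -> float:
        return 0.91                               # stand-in for real hardware I/O

def collect(sensors, well_id):
    """Mary's terminal-side view: one dict of channel -> value per well."""
    return {s.channel: s.read(well_id) for s in sensors}

print(collect([FluorescenceReader(), AntibodyDetector()], well_id="A3"))
```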

As Mary’s chemical program executes on her newly minted chip, she gets fluorescent read-outs on one channel and antibody detection on another – all such diagnostics were written into her experimental program in the same way that a “debug” or “trace” statement is placed into a software program. After her experiment runs, the raw sensor data is uploaded to the same terminal where she wrote the program and she begins her analysis without getting out of her chair.

After the experiment, the disposable chip and the temporary plumbing that connected to it are all safely incinerated to avoid any external contamination. In the old days, such safety protocols would have had to be known by every lab member and this would have required a time-consuming certification process. But today, all of these safety requirements are enforced by the equipment itself and therefore there’s much less risk of human mistake. Furthermore, because of the enhanced safety and lower volumes, some procedures that were once classified as bio-safety level 3 are now BSL 2 and some that were 2 are now 1, meaning that more labs are available to work on important problems.

Mary’s entire experiment, from design to data acquisition, took her under 1 hour – compared to a week using the old manual techniques. Thanks to all of this automation, Mary has evaluated her experiment and moved on to her next great discovery much faster than would have been possible before. Moreover, because so little fluid was used in these experiments, her reagents last longer and therefore the cost has also fallen. Mary can contemplate larger-scale experiments than anybody dreamed of just a decade ago. Mary also makes far fewer costly mistakes because of the rigor imposed by writing and validating the entire experimental script instead of relying on ad hoc procedures. Finally, the capital cost of the equipment itself has fallen due to standardization, competition, and economies of scale. The combined result of these effects is to make the acquisition of chemical and biological knowledge orders of magnitude faster than was possible just decades ago.

Monday, March 30, 2009

An Idea: Federal Reserve Random Moves


Reference historical DJIA (log scale)

Self Organized Criticality (SOC) is a model to describe the dynamics of certain kinds of systems built out of many interacting non-linear actors. The "sand pile" model relates the frequency of avalanches to their magnitude by 1/f (i.e., avalanches happen with frequency inversely proportional to their size).
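
For concreteness, here is a minimal sandpile simulation one could play with -- a toy sketch of the Bak-Tang-Wiesenfeld model, not a claim about markets. The jiggle_every knob is there for the "add noise" experiment discussed further down.

```python
import random

def sandpile(n=30, grains=20000, jiggle_every=0, jiggle_amount=20, seed=0):
    """Bak-Tang-Wiesenfeld sandpile: drop grains one at a time onto an n x n
    grid, toppling any site that reaches 4 grains, and record the size
    (number of topplings) of the avalanche each drop triggers. If
    jiggle_every > 0, periodically remove a few grains at random sites --
    a crude stand-in for "adding noise" to the system."""
    rng = random.Random(seed)
    grid = [[0] * n for _ in range(n)]
    sizes = []

    def relax(i, j):
        count, stack = 0, [(i, j)]
        while stack:
            x, y = stack.pop()
            if grid[x][y] < 4:
                continue
            grid[x][y] -= 4
            count += 1
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nx, ny = x + dx, y + dy
                if 0 <= nx < n and 0 <= ny < n:      # grains at the edge fall off
                    grid[nx][ny] += 1
                    if grid[nx][ny] >= 4:
                        stack.append((nx, ny))
        return count

    for t in range(grains):
        if jiggle_every and t % jiggle_every == 0:   # the optional noise
            for _ in range(jiggle_amount):
                x, y = rng.randrange(n), rng.randrange(n)
                grid[x][y] = max(0, grid[x][y] - 1)
        x, y = rng.randrange(n), rng.randrange(n)
        grid[x][y] += 1
        sizes.append(relax(x, y))
    return sizes

if __name__ == "__main__":
    sizes = [s for s in sandpile() if s > 0]
    # Crude frequency-vs-size table: counts per order of magnitude of avalanche size.
    for digits in range(1, len(str(max(sizes))) + 1):
        count = sum(1 for s in sizes if len(str(s)) == digits)
        print(f"avalanches of size ~10^{digits - 1}: {count}")
```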

It seems intuitive that economic systems should also show this "sand pile" behavior, and this paper claims that stock markets do indeed show "near-self organized critical" behavior. (The exact function is not relevant to my argument.) The intuition for this comes from the fact that each economic actor relies on others in a complicated web of interactions. The values of assets in the system are subjective and are strongly biased by the perception of other actors' subjectively valued assets. Moreover, the perceived future value of those assets is a strong function of the cultural perception of the unknown future. In other words, the macroeconomic system is in a strong, multi-scale, positive feedback loop.

In the sandpile model, a few grains of sand will end up holding an enormous load of upstream stress and therefore their perturbation will create large avalanches. Analogously, a few economic actors (insurance companies, banks, hedge funds, etc) will end up with an enormous load of upstream dependencies that will similarly cause avalanches if they are disrupted.

In the sand pile model one can imagine a large conical basin of uphill dependencies resting on a few critical grains -- those critical bits are the ones that are "too big to be allowed to fail". Playing very loosely with the analogy, the stress on a grain from its uphill neighbors is analogous to the balance sheet of an economic actor. But not exactly. In the sand pile, all potential energy is explicitly accounted for -- there's no hiding the cumulative stresses due to the weight of each particle. This is not true in the economic analog. Real balance sheets do not account for total stresses because complicated financial transactions (like mortgages and insurance contracts) contain off-balance-sheet information that usually flows only one way. For example, when a bank realizes that there is risk in a mortgage it will pass on this cost to the uphill actor, but when a debtor realizes that there's more risk (for example, they might know that their financial situation is not as stable as it appears on paper) they will not pass along this information. In other words, there will tend to be even more uphill stress than is accounted for by the balance sheets of each downhill actor.

Now the point.

If you wanted to reduce the number of large-scale catastrophic avalanches in the sand pile model, the method for doing so is easy: add noise. The vibration of the sand pile would ensure that potential energy in excess of the noise energy would not be allowed to build up. It's the same idea as in forest management -- lots of small fires prevent larger ones. Therefore, by analogy, a good strategy for the Federal Reserve might be to similarly add noise. Conveniently, this "add noise" strategy is inherently simpler to execute than their current strategy -- they would simply roll a die every few months and change the discount rate by some number between zero and ten percent.
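
With the sandpile sketch above, the "roll a die" policy becomes a two-line experiment. I'm not claiming this toy settles anything about the Fed -- only that the intuition is cheap to test:

```python
# Assumes the sandpile() function from the sketch above is in scope.
quiet = sandpile(jiggle_every=0)      # leave the pile alone
noisy = sandpile(jiggle_every=50)     # periodically shake a little stress out
print("largest avalanche, no noise:  ", max(quiet))
print("largest avalanche, with noise:", max(noisy))
```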

Crazy? Well, as it stands now, the Federal Reserve operates under the belief that it can act as a negative-feedback regulator of the macroeconomic system. The idea is sound, but based on my experience attempting to control even very simple systems, I'm skeptical of the reality. To begin with the obvious, the economy is anything but simple. Furthermore, the Fed does not have, never has had, and never will have an accurate measurement of the economy. To wit: it neglected the huge volume of CDOs built up over the last 10 years, the S&L stress of the 80s, the tech bubble of the 90s, etc., etc. History shows that there have always been, and will always be, bubbles and newfangled leveraging instruments, so anything short of draconian regulation that stopped all financial innovation (which would be worse) will never be anything but reactive. But it gets worse. There are also large and unpredictable latencies in both the measurements and the results of the Fed's actions. Even in simple linear systems, such latencies can have destabilizing effects, and since the macroeconomic system is highly non-linear and constantly evolving, the effects are essentially unknowable a priori.
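
The latency point is easy to demonstrate even on a laughably simple system. In the toy below (the model and gain are invented purely for illustration), the same proportional "policy" that smoothly damps a one-variable system when measurements are current starts to oscillate and diverge once the controller only sees data that is two steps old.

```python
# Toy illustration of delay-induced instability: a one-variable system
# x[t+1] = x[t] + u[t], regulated by u[t] = -gain * x[t - delay].
def run(gain=0.9, delay=0, steps=60):
    x = [1.0] * (delay + 1)                  # start displaced from the target of 0
    for t in range(delay, delay + steps):
        u = -gain * x[t - delay]             # policy based on (possibly stale) data
        x.append(x[t] + u)
    return x

print("no delay, peak |x|:    ", max(abs(v) for v in run(delay=0)))  # never exceeds 1.0
print("2-step delay, peak |x|:", max(abs(v) for v in run(delay=2)))  # grows by orders of magnitude
```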

In summary, I suspect that the macroeconomic system is not directly controllable in the way that is envisioned by the creators of the Federal Reserve due to non-linearity, poor measurability, and latency. Therefore, given that the economy probably has some SOC like organization, I suspect that random Fed moves would probably be no worse than the current strategy and would probably be better.

Saturday, March 28, 2009

An Idea: Internet Security Through Random Compilation

This morning an idea occurred to me -- a way to stop malware, viruses, and worms. When someone wishes to crack an internet protocol for nefarious purposes, one way to do so is to exploit bugs in buffer handling. For example, some specific implementation of the email protocol might have a bug whereby if certain characters are passed in the address field then it causes a buffer overflow that could permit writing onto the stack. By sending the right set of characters, the overflow might be directed to upload and execute arbitrary instructions. Similar exploits have existed/still exist in many systems, such as the image handlers for Microsoft Outlook and countless other programs.

As clever as it is, exploiting such a bug requires having a copy of the code locally during development so that the attacker can step through it and figure out exactly how to exploit the overflow. Thus, a way to defeat this is to ensure that every single instance of that code running on every machine is unique. Therefore the solution is simple: write a compiler that generates randomized code that performs the same task but with different execution paths. Such a compiler would stop all such exploits by effectively giving each installation its own locally unique "encryption" of the code. A random compiler would be easy to write, and something similar already exists for Java in the form of "code obfuscators" intended to deter reverse engineering. The only difficulty in deploying such a system is that the relevant software could no longer be distributed on mass-produced media such as CDs, since each instance has to be different. But this is a declining issue as more and more software is delivered online, where each instance could be different. Furthermore, many of the main internet protocols have open source implementations where local compilation is already possible or, in many cases, already occurring. Therefore adding this feature to Gnu C would be a big step in the right direction.
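
A toy demonstration of the idea -- in Python rather than C, with a made-up checksum routine, and standing in for what a real compiler would do by randomizing instruction order and stack layout: the two "builds" below compute exactly the same thing but have different internals, so knowledge of one copy's layout doesn't transfer to another.

```python
import random
import textwrap

def build_variant(seed):
    """Emit a functionally identical 'checksum' routine whose internal layout
    (statement order, padding locals) differs from build to build."""
    rng = random.Random(seed)
    # These sub-computations are independent, so they can be emitted in any order.
    parts = [
        "a = sum(data[0::2])",
        "b = sum(data[1::2])",
        "c = len(data)",
    ]
    rng.shuffle(parts)
    # Padding locals change the layout of the function without changing behavior.
    padding = [f"_pad{i} = {rng.randint(0, 255)}" for i in range(rng.randint(1, 4))]
    body = parts + padding + ["return (a + 3 * b + c) % 65521"]
    src = "def checksum(data):\n" + textwrap.indent("\n".join(body), "    ")
    namespace = {}
    exec(src, namespace)               # "compile" this one-of-a-kind build
    return namespace["checksum"], src

data = list(range(100))
f1, src1 = build_variant(seed=1)
f2, src2 = build_variant(seed=2)
print("same result:", f1(data) == f2(data))   # identical behavior
print("same source:", src1 == src2)           # (almost certainly) different layout
```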