It has now been over a year since my move to Boston from Palo Alto, which seems like a fitting time to take a retrospective look at the two places. My sampling will be far from unbiased, having lived close to 20 years in the Bay Area. As a result, this will be more like “Boston through the eyes of a Northern Californian”. There is no specific order to the comparisons below; I will vacillate between the substantive and the frivolous. And there will be no declared winner; the two places are far too different, and offer far too much, for one to dominate the other in the Pareto sense. At times, this will be more about Stanford vs. Harvard than the Bay Area vs. Boston, as much of my experience is ultimately grounded in my local environment.
There has been much hand-wringing about the apparent uselessness of the federal government. While I am no political pundit, I can speak about my little corner of the universe. The US federal government includes something called the National Institutes of Health, or NIH, which happens to be the largest scientific research organization in the world. With a budget of over $30 billion, it spends more on research than Microsoft, IBM, Intel, Google, and Apple combined, supporting over 300,000 researchers nationwide. It also employs 6,000 scientists internally, who collectively produce more biomedical research than any other organization in the United States. What does it mean for the NIH staff to be furloughed? It means that every single day, 16.4 research-years are wasted, or about three Ph.D. theses. This is likely an underestimate, because the scientists employed by the NIH are professionals whose scientific output exceeds that of graduate students, and the quality of NIH-produced research backs this up. What kind of research will be delayed every day? You can read the list yourself, but it includes things like deciphering the genetic code, inventing MRI, and sequencing the human genome. This is not hyperbole; all these discoveries were made by NIH-supported researchers, who have received 83 Nobel prizes in total.
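The 16.4 figure is simple division: 6,000 scientists each lose 1/365 of a research-year per furlough day. A back-of-the-envelope sketch (the ~5.5-year average Ph.D. duration is my own assumption for illustration):

```python
# Back-of-the-envelope check of the furlough figures above.
# The 6,000 intramural scientists come from the post; the ~5.5-year
# average Ph.D. duration is an assumed figure for illustration.
NIH_SCIENTISTS = 6_000
DAYS_PER_YEAR = 365
PHD_YEARS = 5.5  # assumed average time to complete a Ph.D.

research_years_lost_per_day = NIH_SCIENTISTS / DAYS_PER_YEAR
theses_lost_per_day = research_years_lost_per_day / PHD_YEARS

print(f"{research_years_lost_per_day:.1f} research-years lost per furlough day")
print(f"~{theses_lost_per_day:.1f} Ph.D. theses' worth of work per furlough day")
```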
The US is the world’s preeminent scientific superpower, “a player without peer” as Nature recently put it. Only through profound and self-inflicted displays of stupidity such as we have witnessed during the past 24 hours will this cease to be the case.
I just came back from ICSB 2013, the leading international conference on systems biology (short write-up here). During the conference Bernhard Palsson gave a great talk, which he ended by promoting a view that (I suspect) is widely held among computational and theoretical biologists but rarely vocalized: most high-impact journals require that novel predictions be experimentally validated before they are deemed worthy of publication, by which point they cease to be novel predictions. Why not allow scientists to publish predictions by themselves?
I recently had the pleasure of attending the 14th International Conference on Systems Biology in Copenhagen. It was a five-day, multi-track bonanza, a strong sign of the field’s continued vibrancy. The keynotes were generally excellent, and while I cannot help but feel a little dismayed by the incrementalism that is inherent in scientific research and on display at conferences, the forest view was encouraging and hopeful. This is one of the most exciting fields of science today.
I will soon reach the one-year mark of my fellowship at HMS, which seems like a fitting time to examine how effectively I have spent my time here so far. I had been a practitioner of self-quantification since long before the movement acquired its name, having tracked some aspect of my life since I was 16. Given the movement’s growing popularity, I thought it appropriate to share some of my life-hacking experiments. My approach has cyclically peaked and waned in sophistication, something that I will expound upon later in the post, but I believe that the overall trajectory of my effort has been one of increasing usefulness. Any lifestyle change, particularly one that involves compulsive tracking of one’s behavior, ought to result in actionable information that is demonstrably useful, and not merely be a quantitative exercise in vanity. In this post I hope to show that this can in fact be the case for self-quantification.
I recently had the pleasure of visiting Berlin for the first time. Prior to my arrival I had heard a lot about the city, and had many expectations. The real Berlin turned out to be very different from the one of my imagination, in more ways than one.
I occasionally engage in a somewhat macabre exercise: lost in thought, I begin to imagine hypothetical reactions from everyone I know to the news of my sudden death, usually due to an unexpected event like a car accident. I don’t do this very often, maybe once every few months. And there is no specific recurrent trigger for it. The last time I did it, a little over a month ago, was soon after hearing about the death of Roger Ebert. I, like many others who followed his life and writings over the years, felt saddened by this loss, and that sadness prompted me to consider my own mortality. At first, it was the “typical” story. I thought about my girlfriend, my parents, and the people closest to me. I thought about all the years invested, the memories formed and the futures planned; about all that was worthy and hard won. Then I thought about the sadness that would engulf all these people, about their sense of loss and the emptiness that they would experience. And, as such exercises typically end for me, I began to experience a deep sense of sorrow. I felt saddened by the inevitability of my death, by the eventual destruction of all that I have built, by the wasted memories, meticulously acquired then blown away as if they were never experienced. Perhaps I even felt angered by the seeming meaninglessness of life, by an existence in which we strive to live great lives, only to have them yanked away from us by the fragility of an aging and imperfectly evolved biological machine. It is typically at this point that, feeling hopelessly defeated, I turn my attention to something else, get distracted, and go on merrily living a life of ignorant bliss. For some reason, however, this time my thoughts took a different turn.
… For although in a certain sense and for light-minded persons non-existent things can be more easily and irresponsibly represented in words than existing things, for the serious and conscientious historian it is just the reverse. Nothing is harder, yet nothing is more necessary, than to speak of certain things whose existence is neither demonstrable nor probable. The very fact that serious and conscientious men treat them as existing things brings them a step closer to existence and to the possibility of being born.
A few weeks ago a paper titled Life Before Earth was posted on the arXiv preprint repository. It came to my attention by way of this MIT Technology Review article and this blog post. The paper, using a rather simple extrapolation, argues that the apparent rate at which the complexity of terrestrial life increases suggests that its birth occurred approximately 9.7 billion years ago. Earth, in contrast, is around 4.5 billion years old. If their extrapolation is to be believed, then this discrepancy can only be resolved if terrestrial life is in fact of extraterrestrial origin. I will briefly summarize their argument, but I will not attempt to justify its validity. The original paper can be read here and is fairly accessible. The paper’s conclusions are consistent with a fact that has always puzzled me: the surprising complexity and maturity of what is known as the Last Universal Common Ancestor. It is this topic that I wish to focus on in this post.
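To make the shape of the extrapolation concrete, here is a rough sketch of the kind of calculation the paper performs, not their actual fit. The paper reports that functional genome complexity appears to double roughly every 376 million years; the present-day complexity of ~10^8 base pairs is my own illustrative assumption. Extrapolating back from that figure to a single base pair lands in the right ballpark of their ~9.7-billion-year estimate:

```python
import math

# Illustrative sketch of a log-linear extrapolation of genome complexity.
# DOUBLING_TIME_MY (~376 million years per doubling) is the figure reported
# in the paper; COMPLEXITY_NOW_BP is an assumed present-day functional,
# non-redundant genome size used only for illustration.
DOUBLING_TIME_MY = 376
COMPLEXITY_NOW_BP = 1e8

# Number of doublings needed to grow from 1 base pair to today's complexity,
# then the time that many doublings would have taken.
doublings = math.log2(COMPLEXITY_NOW_BP)
origin_mya = doublings * DOUBLING_TIME_MY

print(f"{doublings:.1f} doublings -> origin ~{origin_mya / 1000:.1f} billion years ago")
```

With these numbers the origin falls around 10 billion years ago, well before the formation of Earth, which is the discrepancy the paper’s panspermia conclusion is meant to resolve.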
This is the fourth and final post in a series about community and social cohesion in the United States. In the preceding entries (I, II, III) I put forth the thesis that American culture lacks a strong sense of community, and outlined some of the reasons I believe are responsible for this state of affairs. In this post, I will propose some ideas to counteract the problem, although my ideas do not yet constitute a comprehensive solution. I am at an early enough stage in my thinking to only begin to grasp the scope of this problem, let alone devise serious and credible solutions. What follows are shots in the dark; the first steps in what is bound to be a long journey.