I recently had the pleasure of attending the 14th International Conference on Systems Biology in Copenhagen. It was a five-day, multi-track bonanza, a strong sign of the field’s continued vibrancy. The keynotes were generally excellent, and while I cannot help but feel a little dismayed by the incrementalism that is inherent to scientific research and that is on display at conferences, the forest view was encouraging and hopeful. This is one of the most exciting fields of science today.
The focus of the talks and sessions parallels the broader shifts the field has been witnessing in the past decade. The dominant topic was the intersection of biomedicine and systems biology, a field that seems to change names every few years: translational medicine, personalized medicine, systems pharmacology, precision medicine, you get the picture. A full 7 out of the 21 sessions were dedicated to this intersection, and a number of other sessions had an ostensibly different focus but ended up veering into the topic nonetheless. Everyone’s favorite model system, it seems, is none other than Homo sapiens. Synthetic biology and metabolic engineering made a strong appearance as well, and there were some emerging subfields that garnered a lot of attention, most notably the microbiome. Microbial research, a favorite subject of mine and one that seems to teeter on the brink of extinction once every generation, is clearly making a comeback with the explosion of interest in (gut) microbiota. One speaker quipped that there is a monthly Nature article dedicated to the microbiome.
Copenhagen is charming but a bit on the quiet side. The food is fantastic, making it one of the best European foodie cities in my opinion. The hotel’s location was a bit unfortunate though, in a pseudo-industrial area a good 20-30 minute walk from anything interesting. The city itself however is small and extremely walkable.
Below is a selection of talks that stuck out for me, in order of appearance:
Stuart Kauffman, one of the fathers of systems biology and the inventor of Boolean networks, made an impassioned call for reforming and rethinking the way we do medicine. The thrust of his argument is that medicine and pharmacology, as practiced today, are no longer working. The low-hanging fruit was picked long ago, and ignoring the systems nature of drug-body interactions is an exercise in futility. In particular, he singled out the idea of random sampling of patients for clinical trials as especially ludicrous, given the high-dimensionality of the problem and its concomitant curse. His vision of the way forward appears to revolve around some sort of guided search of the space, perhaps an adaptive search approach where one begins with random sampling but adjusts the pool as a trial goes on. He summed up his points with “we’re killing people”, and made repeated references to this paper.
Marc Vidal highlighted a tour de force effort to obtain the binary protein-protein interaction network of the human proteome, and in so doing coined a new term, the “densome”. He described an analysis of the existing protein-protein interaction literature and noted that it has a very noticeable density gradient. Better-studied and better-understood proteins have more interactions between them than proteins that are not as well studied (this region of the interaction space he half-jokingly called the densome). He wondered whether this reflects biological or sociological reality, and proceeded to answer the question by analyzing the density of protein-protein interactions as obtained from their high-throughput effort. Unsurprisingly, the density there appears far more uniform, with well-studied and not so well-studied proteins being equally likely to have interactions between them.
Morten Sommer described some really interesting bacterial work on developing “cycled” therapies where instead of giving one drug or a cocktail of drugs, the therapy cycles drugs, i.e. give A, then B, then C, then A again. The idea is that when a bug develops resistance to a given antibiotic, the adaptation often renders it more sensitive to other drugs. His lab has experimentally and systematically mapped this by comparing the effects of drug treatments on resistance/sensitivity to other drugs in microbes. The end result is a nice heatmap highlighting drugs that go together (evolution of resistance to one confers resistance to the other) and drugs that go in opposite directions (evolution of resistance to one leads to sensitivity to the other). With this they generate a network of hops where treatment can move from one drug to another. The idea is that as an infection becomes resistant to one drug, the patient is switched to another one, and the process can conceivably keep on cycling like this, so that the infection never develops resistance and is perhaps eliminated at some point. The work reminded me of Franziska Michor’s research on pulsed treatments.
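To make the “network of hops” idea concrete, here is a minimal sketch of my own (the drug names and collateral-sensitivity values are made up, not the lab’s data): build a directed graph where an edge A→B means resistance to A sensitizes the bug to B, then search for a cycle the therapy could traverse indefinitely.

```python
# Toy collateral matrix: entry (a, b) < 0 means evolving resistance to a
# *increases* sensitivity to b; values are illustrative, not measured data.
collateral = {
    ("A", "B"): -1.0, ("A", "C"): +0.5,
    ("B", "C"): -0.8, ("B", "A"): +0.3,
    ("C", "A"): -0.6, ("C", "B"): +0.2,
}

# Allowed hops: switch from drug a to drug b only if resistance to a
# confers collateral sensitivity to b (negative entry).
hops = {}
for (a, b), effect in collateral.items():
    if effect < 0:
        hops.setdefault(a, []).append(b)

def find_cycle(start, graph):
    """Depth-first search for a cycle of hops returning to `start`."""
    path = [start]
    def dfs(node):
        for nxt in graph.get(node, []):
            if nxt == start:
                return path + [start]
            if nxt not in path:
                path.append(nxt)
                found = dfs(nxt)
                if found:
                    return found
                path.pop()
        return None
    return dfs(start)

print(find_cycle("A", hops))  # prints ['A', 'B', 'C', 'A']
```

In this toy instance the therapy can cycle A → B → C → A forever, each switch exploiting the collateral sensitivity induced by the previous drug.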
Bence Mélykúti presented some nice theoretical work on the statistical mechanics of transcription factor (TF) binding. Typically TF-DNA binding is modeled as a single molecular event involving one protein molecule and one DNA molecule. Instead he considers alternative scenarios, such as when two TFs must first dimerize before binding DNA, or when one TF must first bind before another TF binds. He works out the equilibrium statistical mechanics of many such scenarios.
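To give a flavor of that kind of calculation (my own textbook sketch, not his specific results): for a single TF at concentration $[T]$ binding one site with dissociation constant $K$, versus a scenario where two monomers must first dimerize, the equilibrium occupancies differ qualitatively.

```latex
% Single TF binding one site:
P_{\mathrm{bound}} = \frac{[T]/K}{1 + [T]/K}

% If two monomers must first dimerize (dimerization dissociation
% constant K_2), the dimer concentration is [T_2] = [T]^2 / K_2, so
P_{\mathrm{bound}} = \frac{[T]^2 / (K_2 K)}{1 + [T]^2 / (K_2 K)}
```

The dimer-first scenario gives a sigmoidal, Hill-coefficient-2 response rather than a hyperbolic one, which is why these mechanistic details matter for the shape of gene regulation functions.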
Luca Albergante has done some interesting work on quantifying robustness and evolvability in genetic regulatory networks (GRNs). The main idea rests on a graph theoretic result which shows that long feedback loops (basically just loops in a directed graph) lead to instability. He formalizes this notion in two ways: (i) shorter loops are better because they favor robustness, and (ii) fewer potential loops, i.e. topologies that would close into a loop by adding a single edge, are better because they favor evolvability (i.e. if adding edges is likely to form loops then the organism is not very evolvable). He looks at the GRNs of E. coli, yeast, and humans (the latter based on ChIP-seq data from ENCODE), and shows that in all of them the number of loops and potential loops is far lower than would be expected by chance. On its own this wasn’t necessarily convincing, because one can imagine many reasons why an organism would want to evolve in this way, beyond his formalization of robustness and evolvability. But what’s interesting is that when he compares wildtype human cells to cancer cell lines, the picture changes dramatically, with cancer cells having much longer loops and potential loops.
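As an illustration of the “potential loop” notion (my own sketch, not the speaker’s code): a potential loop is a missing edge whose addition would close a cycle, i.e. a non-edge (u, v) where u is already reachable from v.

```python
def reachable(graph, src, dst):
    """True if dst is reachable from src along directed edges."""
    stack, seen = [src], set()
    while stack:
        node = stack.pop()
        if node == dst:
            return True
        if node in seen:
            continue
        seen.add(node)
        stack.extend(graph.get(node, []))
    return False

def potential_loops(graph, nodes):
    """Non-edges (u, v) whose addition closes a cycle: v already reaches u."""
    edges = {(u, v) for u, vs in graph.items() for v in vs}
    return [(u, v) for u in nodes for v in nodes
            if u != v and (u, v) not in edges and reachable(graph, v, u)]

# Toy acyclic GRN: A -> B -> C.
grn = {"A": ["B"], "B": ["C"]}
nodes = ["A", "B", "C"]
print(potential_loops(grn, nodes))  # prints [('B', 'A'), ('C', 'A'), ('C', 'B')]
```

Even this tiny linear cascade has three potential loops; the claim is that real GRNs have far fewer such near-cycles than random topologies of the same size.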
Bernhard Palsson began with the statement that something is seriously afoot in the field of systems biology. After a decade of new ideas and concepts, of new mathematical techniques and experimental technologies, the field is finally starting to show real results. He then proceeded to highlight over a dozen papers published in the last two years, almost all in very high-profile journals, that are genuinely systems in nature and that make biological discoveries that could not have been made without the systems perspective. These papers are also the subject of this recent review. On the whole his grand tour of the last two years of systems biology was convincing, optimistic, and inspiring. He ended his talk with a provocative suggestion, one that I expound upon here.
Rama Ranganathan described his lab’s fantastic work on protein sectors, internal networks of residues within proteins that appear to act as the proteins’ functional cores. The way he views it, the globular structure of the protein is effectively a localized liquid solvent in which the protein sector resides. The protein sector in turn acts as a constraint force to define the basic function of the protein. This was at least his original view, which he says he has now “upgraded” to something more complex. Namely, that evolutionary forces act on multiple timescales. Protein sectors are the most dynamic and adaptable, able for example to switch the protein’s binding specificity using a handful of residue changes. The globular soup on the other hand is slower, and is sculpted over longer eons to define the general function of the protein (e.g. as a phospho-signaling protein). Presumably even slower forces act to evolve entire signaling pathways and circuits.
Dana Pe’er gave a great technical talk highlighting her lab’s use of non-linear dimensionality reduction and manifold reconstruction methods like Laurens van der Maaten’s tSNE to analyze flow cytometry data (they rebranded it viSNE). I found this interesting because almost all biological data is still analyzed using PCA, even when it makes absolutely no sense to do so. Her talk was the first where I saw well-reasoned applications of more advanced dimensionality reduction methods. Perhaps biology is finally growing up! (To be fair, I think PCA is still the best off-the-shelf scheme for seeing global structure. The application of local methods is non-trivial and requires a great deal of care. PCA is a “technology”. All the other fancy methods are still firmly in the realm of science.)
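For readers who want to try this, a minimal off-the-shelf analogue of the idea (not viSNE itself) using scikit-learn, on synthetic “cytometry-like” data:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
# Two synthetic cell populations in a 20-dimensional marker space.
pop1 = rng.normal(loc=0.0, scale=1.0, size=(100, 20))
pop2 = rng.normal(loc=3.0, scale=1.0, size=(100, 20))
data = np.vstack([pop1, pop2])

# Linear, global view vs. non-linear, local-structure-preserving view.
pca_coords = PCA(n_components=2).fit_transform(data)
tsne_coords = TSNE(n_components=2, perplexity=30,
                   init="pca", random_state=0).fit_transform(data)

print(pca_coords.shape, tsne_coords.shape)  # (200, 2) (200, 2)
```

The care I mentioned applies here: tSNE’s output depends on perplexity and initialization, and inter-cluster distances in the embedding are not directly interpretable, whereas PCA’s axes are.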
Domitilla Del Vecchio described a combination of theoretical and experimental work on designing synthetic circuits with a load driver. The basic problem as she described it is that engineered transcriptional circuits, such as Michael Elowitz’s famous oscillator, are useless in driving the oscillation of other systems because once the oscillating TF begins to bind other genes to drive their expression, the oscillations start to dampen until they eventually disappear. To get around this problem, they integrate a phospho-signaling network into the transcriptional oscillator, which, because it is much faster acting, is able to transmit the signal without dampening the transcriptional oscillations. They model this and experimentally test it, and the agreement looked phenomenal.
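The damping effect can be caricatured with a toy model of my own (not Del Vecchio’s actual biochemical model): treat the downstream binding sites as an effective drag term on an oscillator and watch the amplitude decay as the load grows.

```python
# Cartoon: Euler-integrate x'' + load*x' + x = 0, where `load` stands in
# for the effective drag imposed by downstream binding sites.
def simulate(load, steps=20000, dt=0.001):
    """Return the peak |x| in the first and second halves of the run."""
    x, v = 1.0, 0.0
    first_half, second_half = 0.0, 0.0
    for i in range(steps):
        x, v = x + dt * v, v + dt * (-x - load * v)
        if i < steps // 2:
            first_half = max(first_half, abs(x))
        else:
            second_half = max(second_half, abs(x))
    return first_half, second_half

print("no load:  ", simulate(load=0.0))  # amplitude roughly preserved
print("with load:", simulate(load=0.5))  # amplitude decays markedly
```

A fast insulating layer, in this caricature, is anything that drives the load without feeding the drag back into the oscillator; the real mechanism (retroactivity and phospho-signaling insulation) is of course richer than a damping coefficient.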