Mechanistic computational modeling of brain metabolism 

Extensive manual curation of the biochemical literature is combined with transcriptomic and proteomic data and novel algorithms to reconstruct metabolic models specific to particular neuronal subtypes, e.g., dopaminergic neurons. Constraint-based computational modeling is used to model metabolism at genome scale. Iterative rounds of reconstruction, model prediction and reconciliation with existing experimental data are used to develop a computational model that is a formal synthesis of current knowledge on neuronal metabolic function. The accuracy of the neuronal metabolic model is tested by comparison with independent experimental data. Computational models of neuronal metabolism are used to aid the interpretation of experimental data, to optimally design in vitro experiments with stem-cell-derived neurons, to understand the aetiopathogenesis of neurodegenerative disease, e.g., Parkinson’s disease, and to develop new approaches for early diagnosis and treatment of disease.
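To illustrate the constraint-based workflow, the minimal sketch below uses the open-source COBRApy package to load a small published reconstruction, impose an uptake constraint and predict an optimal flux distribution. The model identifier and bound value are illustrative placeholders, not our neuronal reconstruction.

```python
# Minimal constraint-based simulation with COBRApy (https://opencobra.github.io/cobrapy/).
# The model and bound below are illustrative placeholders, not the neuronal
# reconstruction described in the text.
import cobra

# Load a small published reconstruction from the COBRApy model repository
# (downloaded over the network on first use).
model = cobra.io.load_model("e_coli_core")

# Impose an experimentally motivated constraint, e.g., cap glucose uptake.
model.reactions.get_by_id("EX_glc__D_e").lower_bound = -10.0  # mmol/gDW/h

# Predict the optimal flux distribution for the model's objective (biomass).
solution = model.optimize()
print(f"Optimal objective value: {solution.objective_value:.3f}")
print(solution.fluxes.head())
```

In our own work the same pattern applies, with the generic reconstruction replaced by a neuronal-subtype-specific model and the constraints derived from experimental data.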

Automated microfluidic cell culture 

Automated microfluidic cell culture has various advantages over manual, macroscopic cell culture, including increased precision, ease of conducting perfusion culture, direct coupling to downstream analysis systems, and the potential for real-time on-chip analyses. In collaboration with Prof. Thomas Hankemeier, Leiden University, we are working toward the establishment of automated microfluidic culture of stem-cell-derived cells, coupled to analytical chemical measurement of metabolite concentrations. These quantitative data are subsequently integrated with corresponding mechanistic models of neuronal metabolism.
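As a hedged sketch of this integration step, suppose the microfluidic platform yields measured exchange rates for a few metabolites; one simple way to couple such measurements to a constraint-based model is to bound the corresponding exchange fluxes by the measured mean plus or minus a multiple of its standard deviation. All reaction identifiers, rates and uncertainties below are hypothetical.

```python
# Sketch: constrain a genome-scale model with measured exchange rates.
# Reaction IDs, rates and uncertainties are hypothetical illustrations,
# not actual measurements from the microfluidic platform.
import cobra

measured_exchange = {
    # reaction id: (mean rate, standard deviation) in mmol/gDW/h;
    # negative values denote uptake.
    "EX_glc__D_e": (-8.2, 0.6),
    "EX_lac__D_e": (5.0, 1.0),
}

def apply_measurements(model, measurements, n_sd=2.0):
    """Bound each measured flux to mean +/- n_sd standard deviations."""
    for rxn_id, (mean, sd) in measurements.items():
        reaction = model.reactions.get_by_id(rxn_id)
        reaction.bounds = (mean - n_sd * sd, mean + n_sd * sd)

model = cobra.io.load_model("e_coli_core")  # placeholder reconstruction
apply_measurements(model, measured_exchange)
print(model.optimize().objective_value)
```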

Algorithms and software for mechanistic modeling of biochemical networks 

In experimental systems biology, the majority of high-throughput experimental data concern molecular abundances; only a minority concern reaction rates. We seek a modeling framework flexible enough to integrate experimental data on both rates and abundances. Established genome-scale modeling methods do not explicitly represent the abundance of each molecule, so the incorporation of experimental constraints from measurements of molecular abundance is always an approximation. Such approximations may be useful in the short to medium term, but ultimately we seek a computational model of biochemical reaction networks that explicitly represents the abundance of each molecule and the rate of each reaction. We are developing a novel approach to model stationary-state reaction kinetics for large systems of reactions, based on nonlinear continuous optimization algorithms. These variational kinetic models aim to preserve the computational tractability associated with numerical optimization while adding the biochemical realism typical of kinetic models. Our general approach is to focus on the development of biochemically tailored polynomial-time optimization algorithms with guaranteed convergence properties. This effort requires the development and application of mathematical concepts at the intersection of Variational Analysis, Continuous Nonlinear Optimization, Generalised Convexity and Numerical Analysis. In tandem with mathematical algorithm development, we prototype numerical analysis software and then test its performance on a set of low-, medium- and high-dimensional, biochemically relevant modeling problems.
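The following toy example conveys the flavour of the stationary-state formulation, though not our algorithms, which are under development: for a small mass-action network, a steady state is a zero of the nonlinear map x ↦ Sv(x) over log-concentrations, which can be sought with an off-the-shelf nonlinear least-squares solver. The network, rate constants and solver choice are illustrative assumptions.

```python
# Sketch: stationary-state kinetics as a nonlinear least-squares problem.
# Toy open network (invented for illustration):  -> A -> B ->
# with mass-action rates v = [k1, k2*[A], k3*[B]].
import numpy as np
from scipy.optimize import least_squares

# Stoichiometric matrix S (rows: A, B; columns: the three reactions).
S = np.array([[ 1.0, -1.0,  0.0],
              [ 0.0,  1.0, -1.0]])
k = np.array([2.0, 1.0, 0.5])  # illustrative rate constants

def rates(x):
    """Mass-action rates as a function of log-concentrations x = log([A], [B])."""
    a, b = np.exp(x)
    return np.array([k[0], k[1] * a, k[2] * b])

# A steady state satisfies S v(x) = 0; minimizing the residual in log space
# keeps concentrations strictly positive.
sol = least_squares(lambda x: S @ rates(x), x0=np.zeros(2))
print("Steady-state concentrations:", np.exp(sol.x))  # expect [2.0, 4.0]
```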


Figure legend: a, A genome-scale metabolic reconstruction is a structured knowledge base that abstracts pertinent information on the biochemical transformations taking place within a chosen biochemical system, e.g., the human gut microbiome. Genome-scale metabolic reconstructions are built in two steps. First, a draft metabolic reconstruction based on genome annotations is generated using one of several platforms. Second, the draft reconstruction is refined on the basis of known experimental and biochemical data from the literature [6]. Novel experiments can be performed on the organism and the reconstruction can be refined accordingly. b, A phenotypically feasible solution space is defined by specifying certain assumptions, e.g., a steady-state assumption, and then converting the reconstruction into a computational model that eliminates physicochemically or biochemically infeasible network states. Various methods are used to interrogate the solution space. For example, optimization for a biologically motivated objective function (e.g., biomass production) identifies a single optimal flux vector (v), whereas uniform sampling provides an unbiased characterization via flux vectors uniformly distributed in the solution space. c, Flux balance analysis is an optimization method that maximizes a linear objective function, ψ(v) = c^T v, formed by multiplying each reaction flux v_j by a predetermined coefficient, c_j, subject to a steady-state assumption, Sv = 0, as well as lower and upper bounds on each reaction flux (lb_j and ub_j, respectively).
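Since the flux balance analysis problem in panel c is a linear program, it can be solved directly with a generic solver; the sketch below uses scipy.optimize.linprog on a toy three-reaction network (all numbers invented for illustration).

```python
# Sketch: flux balance analysis, maximize c^T v subject to S v = 0
# and lb <= v <= ub. The toy network and bounds are invented.
import numpy as np
from scipy.optimize import linprog

# Toy network: v1 takes up metabolite M, v2 converts M -> P, v3 secretes P.
S = np.array([[1.0, -1.0,  0.0],       # metabolite M
              [0.0,  1.0, -1.0]])      # metabolite P
c = np.array([0.0, 1.0, 0.0])          # objective coefficients: maximize v2
lb = np.array([0.0, 0.0, 0.0])         # lower bounds lb_j
ub = np.array([10.0, 1000.0, 1000.0])  # upper bounds ub_j; uptake capped at 10

# linprog minimizes by convention, so the objective is negated.
result = linprog(-c, A_eq=S, b_eq=np.zeros(S.shape[0]),
                 bounds=list(zip(lb, ub)))
print("Optimal flux vector v:", result.x)        # expect [10, 10, 10]
print("Optimal objective c^T v:", c @ result.x)  # expect 10.0
```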

Please see here for research publications.

Funding

  • European Union Horizon 2020