Collaborations with investigators at Harvard Medical School, the University of Minnesota Masonic Cancer Center, and the Environmental Protection Agency provide the opportunity to demonstrate our technology's efficacy across a range of biomedical topics, including cancer research, toxicology, and cancer education. Our approach to modeling is outlined below.
Utility of Modeling
Model organisms are widely used by biologists because they are tractable systems amenable to biochemical, genetic, and physiological manipulation. The utility of this approach is evidenced by the wealth of biological knowledge derived from studies of model organisms such as bacteria, yeast, worms, flies, and mice. These organisms allow scientists to develop and test hypotheses in a system of reduced complexity that shares a set of fundamental cellular processes with more complex species, allowing findings to be translated. However, new technologies have generated a flood of biological data at the molecular level, which has emphasized the gap between the scale at which data are collected and the higher scales at which we seek to understand biological processes.
A
computer model can help bridge this gap by accurately representing
known data and by predicting the outcome of wet bench experiments.
The process of constructing computer models is itself an informative
exercise as it uncovers knowledge gaps, biases, and inconsistencies
within the knowledge framework. Models that are rooted in the
language of the cell are especially useful as a vehicle for
collaboration and debate, ultimately driving the scientific process
forward.
We have attempted to combine the tractability of model organisms with the utility of computer modeling, creating computer-modeling techniques that enable scientists to integrate their wet-bench biology and modeling efforts.
Our Approach to Modeling
As Sydney Brenner noted, "a proper simulation must be couched in the machine language of the object: in genes, proteins, and cells". Accordingly, we have focused on simulating key functions of cells as basic units of computation, modeling the physiological processes that build, organize, and maintain tissue integrity.
The philosophy of our modeling approach shares a number of ideas with agent-based modeling, in which complex behavior emerges through the interaction of many relatively simple, heterogeneous components. Models are created by manipulating a number of different types of basic elements (e.g., virtual genes or molecular resources) through a graphical user interface. Each component operates by its own simple set of rules or functions that define its responses, given its internal state and inputs from its local environment and neighboring elements.
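The component-and-rules idea can be illustrated with a minimal sketch. This is not our actual implementation; the class name, the rule, and the rate constant are illustrative assumptions chosen only to show a component responding to its internal state plus a local input.

```python
from dataclasses import dataclass

@dataclass
class VirtualGene:
    """A base component: it sees only its own state and its local inputs."""
    name: str
    expression: float = 0.0

    def step(self, inputs: dict) -> None:
        # Illustrative rule: expression relaxes toward the local
        # transcription-factor level (rate 0.5 is an arbitrary choice).
        tf = inputs.get("tf_level", 0.0)
        self.expression += 0.5 * (tf - self.expression)

# The component is driven only by a local input, tick by tick.
gene = VirtualGene("geneA")
for tick in range(10):
    gene.step({"tf_level": 1.0})

print(round(gene.expression, 3))  # expression approaches the local TF level
```

The point of the sketch is that nothing outside the component's `step` rule determines its behavior; any larger pattern must arise from many such components interacting.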
To develop enabling methods for improved mammalian tissue modeling, we started with a single cell and proceeded to simulate the essential aspects of fundamental processes (cell growth, division, differentiation, and response) necessary for the development and organization of virtual tissue. “Be the cell” is a simple reminder that living cells are not aware of anything beyond their current state, and that interactions among cells and their components are the basis for higher-order behavior. We avoid overarching equations that govern system-wide behavior, both because they are antithetical to our modeling philosophy and because we simply do not think that living systems work that way.
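The "be the cell" idea can be sketched as follows. This toy model is an assumption-laden illustration, not our engine: the energy threshold, uptake, and maintenance cost are invented values. Each cell applies only its own rule to its own state and a local nutrient reading; population growth emerges without any system-wide growth equation.

```python
class Cell:
    """A cell acts only on its internal state and a local nutrient reading."""
    def __init__(self, energy: float = 1.0):
        self.energy = energy

    def step(self, local_nutrient: float):
        # Local rule: net energy change is uptake minus a maintenance cost.
        self.energy += local_nutrient - 0.2
        if self.energy >= 2.0:          # divide when energy suffices
            self.energy /= 2.0
            return Cell(self.energy)    # return a daughter cell
        return None

cells = [Cell()]
for tick in range(20):
    nutrient = 0.5                      # uniform environment, for simplicity
    daughters = [d for c in cells if (d := c.step(nutrient)) is not None]
    cells.extend(daughters)

print(len(cells))  # exponential growth emerges from the per-cell rule alone
```

No equation in the script refers to the population as a whole; doubling behavior is an emergent consequence of each cell's local division rule.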
Where is the math?
The question often arises, “Where is the math?” Any math in our models defines interactions between components at the lowest levels of the simulation. For example, transcription of a gene (a base component) involves a user-specified algebraic function to determine the level of expression caused by a transcription factor (another base component). Likewise, metabolic processes such as enzymatic reactions, translocation and secretion of molecules, or protein interactions are represented as simple algebraic expressions. In contrast to top-down approaches that describe aggregate behavior in mathematical terms, the aggregate behavior of our models emerges solely from the local interaction of base components.
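A user-specified algebraic rule of this kind might look like the following. The Hill function is one common choice for relating transcription-factor level to expression; it is offered here as an illustrative example, and the parameters k and n are placeholder values, not values drawn from our models.

```python
def hill(tf_level: float, k: float = 0.5, n: int = 2) -> float:
    """Illustrative algebraic rule: expression driven by a local TF level.

    k is the half-maximal TF level and n the cooperativity; both are
    assumed values for this sketch.
    """
    return tf_level ** n / (k ** n + tf_level ** n)

# Expression rises sigmoidally with the local transcription-factor level.
for tf in (0.0, 0.5, 1.0, 2.0):
    print(f"tf={tf:.1f}  expression={hill(tf):.3f}")
```

The math stays at this level: a scalar function evaluated between two base components, with no equation spanning the system as a whole.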
In
constructing a virtual tissue model, a modeler works directly with
these base components, setting parameters and interactions to build
models with incrementally increasing complexity and fidelity.
Initially, the modeler may know little about the details of some
underlying process, but can still construct a simple model that
abstracts much of the supporting detail yet captures the essential
behavior. From this, the modeler can quickly explore feasible
pathways and interactions that generate reasonable organization and
behaviors. As more data and greater understanding become available,
the model can be improved through a process of iterative refinement.
We agree that, rather than simply reproducing data that are already known, one important goal of biological modeling is to predict the outcome of novel wet bench experiments. However, even if this goal is not achieved in a particular instance, great scientific benefit comes from the effort of constructing and refining a model.
These activities require modelers to formalize their understanding of underlying processes, which often leads to important new questions and avenues of research. As the fidelity of the model improves, so does its ability to predict and guide wet bench research, focusing experiments on the hypotheses most likely to prove fruitful. Modeling is thus complementary to, and synergistic with, wet bench research, each guiding and informing the other.