“Since the early days of molecular biology, the search for the minimal genome has been the ‘Holy Grail’..”
“The search for the ‘smallest autonomous self-replicating entity,’ which subsequently became the search for the smallest cell genome, was begun in the late 1950s by Harold Morowitz
… This led to studies of the mycoplasmas, showing that these microorganisms have the smallest reported cell and genome sizes [as of 1996]. The DNA sequence of the smallest known mycoplasma genome, that of Mycoplasma genitalium, recently was determined… The nature of selective pressure for repeated genome reductions..is not known..[but] have been suggested to be due to selection for faster (hence, smaller) replicating genomes to produce greater progeny..yields, selection for smaller genomes to reduce the energy burden..in limiting nutrient environments, and loss..[due to] deleterious mutations… Since the early days of molecular biology, the search for the minimal genome has been the ‘Holy Grail’… subtracting the minimal gene set from an organism’s total gene inventory should reveal genes for the phenotype characters that make each organism unique.” http://www.sbs.utexas.edu/psaxena/bio226r/articles/minimum_cell_genome.pdf
Harold J. Morowitz:
Oral History Interview, March 16, 2005 — “This interview chronicles Morowitz’s scientific career in detail, beginning with his education in physics, his transition to biophysics, to his ongoing attempt to apply..information theory to..biological problems. His description..gives a valuable insight of how and why physicists after WWII came to be interested in..biological problems…[and] illustrates the reciprocal interaction between scientific projects..in Yale’s biophysics program..and the Atomic Energy Commission’s promotion of the peaceful use of atomic energy (e.g. nuclear fallout debate)..”
Remember SARS in 2003? Looking back on SARS led me to something dubbed the Minimal Genome Project, for which I found a lot of experimental documentation dated between 1998 and 2000, including the creation of completely new amino acids –until then, every biological entity that had ever existed used the same available 20 amino acids. SARS patients were found to be infected with a novel coronavirus and, coincidentally, a novel coronavirus was created in a lab with a new amino acid in 2001. Since the original outbreak, the SARS creature seems to have disappeared. Was it part of an experiment to create a new “minimal genome organism”? Indirect evidence is tantalizingly suggestive that it could be the case. First, consider this research news in the right time window, and I’ll add more to this post supporting a proof-of-concept. http://jenniferlake.wordpress.com/2009/12/01/sweating-the-small-stuff/

This link, http://jvi.asm.org/cgi/content/abstract/74/22/10600 within Sweating the Small Stuff, states that coronaviruses have the largest genome of any known RNA virus –ripe for a take-away project, I suppose. Since the 2003 outbreak, new information about coronavirus has been published, such as this statement from August 2004: “The coronavirus replicase-transcriptase was recently predicted to contain RNA-processing enzymes that are extremely rare or absent in other RNA viruses” –in other words, the researchers have made a novel discovery about their specimen. In this case, the studied coronavirus generates a unique protein with an ability to preferentially cleave double-stranded RNA (dsRNA): http://www.ncbi.nlm.nih.gov/pmc/articles/PMC514660/. This feature of the SARS virus provides a handy new enzyme tool for genetic engineers.
Minimal genomes are a boon to living DNA computers: “Human cells and computers process and store information in much the same way… ‘If you look inside the cell, you find a bunch of amazing little tools,’ said Adleman, who made the first DNA-based computation in 1994. ‘The cell is a treasure chest.’ ” http://www.cbsnews.com/stories/2003/08/18/tech/main568893.shtml

Leonard Adleman is employed by George Mason University and the MITRE Corporation, producer of the JASON reports for the DoE, DoD, etc. Adleman says: “Late one evening, while lying in bed reading Watson’s text, I came to a description of DNA polymerase. This is the king of enzymes – the maker of life. Under appropriate conditions, given a strand of DNA, DNA polymerase produces a second ‘Watson-Crick’ complementary strand, in which every C is replaced by a G, every G by a C, every A by a T and every T by an A. For example, given a molecule with the sequence CATGTC, DNA polymerase will produce a new molecule with the sequence GTACAG. The polymerase enables DNA to reproduce, which in turn allows cells to reproduce and ultimately allows you to reproduce. For a strict reductionist, the replication of DNA by DNA polymerase is what life is all about… DNA polymerase is an amazing little nanomachine, a single molecule that ‘hops’ onto a strand of DNA and slides along it, ‘reading’ each base it passes and ‘writing’ its complement onto a new, growing DNA strand.” http://www-gap.dcs.st-and.ac.uk/~history/Biographies/Adleman.html; http://www.genealogy.math.ndsu.nodak.edu/id.php?id=62298. Simpler genomes are helping this process along.
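Adleman’s description of the polymerase as a read/write nanomachine maps directly onto a few lines of code. A minimal sketch in Python (the function name and lookup table are mine, for illustration only):

```python
# Watson-Crick complementation as Adleman describes it:
# each base is replaced by its partner (A<->T, C<->G).
PAIRS = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complement(strand: str) -> str:
    """Return the base-by-base complement of a DNA strand."""
    return "".join(PAIRS[base] for base in strand)

print(complement("CATGTC"))  # -> GTACAG, matching Adleman's example
```

One caveat: a real polymerase synthesizes the new strand antiparallel to the template, so read end-to-end it would be the reverse of this string; the quote gives the simpler base-by-base form, which is what the sketch reproduces.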
Adleman’s counterpart in Israel, Ehud Shapiro, works on a project at the Weizmann Institute called “The Human Cell Lineage Tree Flagship Initiative” http://www.wisdom.weizmann.ac.il/~udi/. An accomplishment of Shapiro and colleagues in 2004 illustrates a DNA computer at work: “Recently, simple molecular-scale autonomous programmable computers were demonstrated..allowing both input and output… Such computers, using biological molecules as input data and biologically active molecules as outputs, could produce a system for ‘logical’ control of biological processes… As proof of principle we programmed the computer to identify and analyse mRNA of disease-related genes associated with models of..cancer, and to produce a single-stranded DNA molecule [modelled] after an anticancer drug.” http://www.nature.com/nature/journal/v429/n6990/abs/nature02551.html
There it is: cell surveillance and cancer ‘self’ control.
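The logic of that ‘proof of principle’ can be caricatured in software: many identical stochastic automata each check a series of disease markers, and the population’s majority answer is the readout. This is only an illustrative sketch; the states, probabilities and marker list below are invented, not taken from the Nature paper.

```python
import random

def diagnose(markers, p_correct=0.9):
    """Two-state automaton: it stays in 'yes' only while every disease
    marker is present AND the (stochastic) transition succeeds; any
    failure drops it permanently into 'no'."""
    state = "yes"
    for present in markers:
        if state == "yes" and present and random.random() < p_correct:
            state = "yes"
        else:
            state = "no"
    return state  # 'yes' -> release the drug-like output molecule

# Thousands of molecules compute in parallel; the vote is the diagnosis.
random.seed(0)
runs = [diagnose([True, True, True]) for _ in range(10_000)]
print(runs.count("yes") / len(runs))  # close to 0.9**3 = 0.729
```

The point of the stochastic transitions is that no single molecule is reliable; confidence comes from the ensemble, which is how the molecular version reportedly works as well.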
With DNA polymerase to “process”, restriction enzymes to “cut” and ribozymes to “paste”, the working parts of DNA computers appear to be on hand, with the possible exception of genetically stable, high-fidelity replicators. http://www.jci.org/articles/view/19386
I speculate that cancer cells, with their immortal qualities, are very useful for this purpose. In so many cases the engineered recombinant microbes (i.e. viruses) now used in therapy end up causing cancers that generating cancer cells, and controlling those cells, in the first place looks like the most useful approach. Minimal genomes, real and artificial, look like other lines of pursuit running apace toward the ultimate creation of DNA computer life. This general field of study is called bioinformatics http://www.med.nyu.edu/rcr/Fordham/ and another of its Holy Grails is finding perfect cell-based delivery systems. The DNA computer ‘proof of principle’ by the Israelis (above) used a “stochastic molecular automaton” to provide the computer input, meaning a precise-targeting agent –I’m perusing the literature to find the exact agents– but this fundamentally defines the action of viruses. And the genetically small ones, like the SV40 monkey virus and the polioviruses, are already human-adapted and under intense ‘quantitative’ experimentation to see how well they deliver genetic information.
Chalk this up to spooky things scientists say.
November 22, 2002 — “Craig Venter’s ‘minimal genome’ project announced Wednesday is not about creating a new life form…[that comes later with Synthia]. The high profile project was just funded by the U.S. Department of Energy (DoE) with $3 million going to the Institute for Biological Energy Alternatives (IBEA), one of the nonprofit research institutes Venter founded after leaving..Celera Genomics early this year.
“The question of the minimal genome for an organism [–the base gene set required for life–] is always ‘minimal in which environment,’ said Francisco J. Silva of the University of Valencia in Spain. Silva and colleagues..are studying the genome of the insect endosymbiont Buchnera aphidicola, which appears to have an even smaller genome than that of the parasite Mycoplasma genitalium, Venter’s organism of choice.
“..pathogenic bacteria usually have more genes than their harmless relatives do..but the disease-related genes are not essential to life, so they could be the first category of genes a scientist would jettison from a minimal genome organism… ‘A minimal genome organism,’ [Silva] said, ‘will be a prisoner of the laboratory dish where it lives, and will be unable to compete with the outer world.’
“The minimal genome organism now being planned..will be further crippled so that it cannot survive without laboratory coddling.
The strategy is to synthesize an artificial chromosome containing the presumptive minimal gene set, remove all existing genetic material.. and then insert the synthetic chromosome into the vacant cell… The long-term goal..will be to help the world solve some of its environmental problems. That’s why the funding is coming from DoE, where recent genetics projects have focused on bioremediation…” http://www.the-scientist.com/article/display/20885/
>>>In June 2010, Venter announced “Synthia” –“a synthetic cell from scratch” and “a hitherto unseen lifeform…To do this, he read the DNA of Mycoplasma mycoides, a bug that infects goats, and recreated it piece by piece. The fragments were then ‘stitched together’ and inserted into a bacterium of a different species… he claimed the breakthrough had changed his views on the definition of life. ‘We have..the first synthetic cell powered and controlled by a synthetic chromosome and made from four bottles of chemicals,’ he said..” http://www.dailymail.co.uk/sciencetech/article-1279988/Artificial-life-created-Craig-Venter–wipe-humanity.html
“The constraints of the genetic code are history”
Feb. 14, 2002 – ” ‘The constraints of the genetic code are history,’ proclaims Peter Schultz of the Scripps Research Institute in La Jolla, California…’At least in bacteria, the genetic constraints..are gone.’ Dr. Schultz’s statement is no idle boast…[His] laboratory, along with another team of chemists based in Japan, are introducing completely new strands [of DNA]…If successful, they will fabricate..a new sort of living thing… [C]olleagues..in..Japan..have developed unnatural versions of all the ingredients in this process: the bases, the DNA, the RNAs and the amino acids. The result is a protein that could never have existed in nature… By exposing..unnatural bacteria to the sort of conditions believed to have prevailed on the early earth..it might be possible to observe whether bacteria with 21, 22, or even more amino acids might win out over those with the standard complement… Back in the present, the next order of business is to apply the research to mammals. Dr. Wang says that a certain strain of monkey cells has shown a promising tendency to incorporate unnatural amino acids. Within a year, he thinks, the group should have made mammalian cells equipped with 21 amino acids.” http://www.economist.com/node/987697?story_id=987697
>>>Dr. Lei Wang is on the faculty of the Salk Institute http://www.salk.edu/faculty/wang.html; it certainly seems like the Schultz team was eager to know if their creations would ‘compete’ outside the lab.
In July 2002, Eckard Wimmer of SUNY, in conjunction with the Pentagon, announced the result of his DARPA-funded project to create viruses from scratch:
“This is the first time that a working biological entity has been made using chemicals alone…But poliovirus is easier to build than many others. It has a short and simple genome and assembles itself directly from a DNA template… More complex viruses could be synthesized, Wimmer believes, by additional chemical steps or by putting synthetic genes into living cells..” http://www.nature.com/news/1998/020708/full/news020708-17.html
Commenting on Wimmer’s achievement: ” ‘This work should never have been done, funded, or published,’ said J. Craig Venter…’Somehow the whole system broke down here.’ ..The work, [Wimmer] said, was intended to serve as a warning of what is now possible.”
2004– “Several attempts have been made to identify the minimal set of genes that is required for life using computational approaches… These experiments resemble those already performed by nature; a few hundred million years ago an ancestor of E. coli was domesticated by aphids which resulted in the elimination of 70-75% of the original bacterial genome. Amazingly, the small genomes of the imprisoned bacteria are more stable than those of their free-living relatives.
..”Obligate host-associated bacteria have among the smallest genomes known in nature… Two different scenarios have been proposed to explain the process of genome shrinkage in B. aphidicola… that the minimization..was continuous, with genes being lost individually through a large number of small deletion events..[or] that many genes were lost simultaneously at an early stage of the internalization process..through the elimination of large blocks of DNA spanning multiple genes…
…”as computational analysis becomes more refined and the data sets grow larger, fewer genes remain that are conserved among all taxa…[and] the few remaining genes are organized in a surprisingly similar manner… The lack of..sequences..reduces rearrangement possibilities…
“The next few years will tell whether..the..endosymbiont genomes are self-sustainable… An interesting question for the future is to determine whether the dependent partner will be ‘allowed’ to stay as a ‘silent parasite’ if the need for its functions is lost during evolution, or if such a loss will cause it to deteriorate and collapse.”
Too many mutations, too rapid a pace and/or too few genes lead to a potential species collapse in a nose-diving degenerative process that microbe researchers in the lab call “error catastrophe”; virus experimenters document the fundamental principle, noting, “There is an intrinsic limit to the maximum variability of viral genetic information before it loses meaning and if an RNA virus quasispecies goes beyond that mutation limit, the population will no longer be viable. The phenomenon that occurs when the loss of genetic fidelity results in a lethal accumulation of errors has been termed ‘error catastrophe’. Most cellular organisms have evolved a number of sophisticated processes to maintain their genetic information with high fidelity and stay far away from the threshold of error catastrophe. In contrast, it has been predicted that RNA viruses with high mutation frequencies exist close to the edge of error catastrophe and can be forced into error catastrophe by a moderate increase in mutation rate… We also describe direct evidence that the error catastrophe theory applies to poliovirus.” http://www.pnas.org/content/98/12/6895.full
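The quantitative side of “error catastrophe” is usually stated via Eigen’s error-threshold relation (a standard quasispecies result, not a formula taken from the PNAS paper): a genome of length L copied with per-base error rate u stays viable roughly while L x u < ln(sigma), where sigma is the selective superiority of the fittest “master” sequence. A back-of-envelope sketch, with assumed order-of-magnitude rates:

```python
import math

def max_genome_length(error_rate, sigma=2.0):
    """Eigen error threshold: longest genome maintainable at a given
    per-base copying error rate and master-sequence superiority sigma."""
    return math.log(sigma) / error_rate

u_rna = 1e-4   # assumed typical RNA-virus error rate (per base, per copy)
u_dna = 1e-9   # assumed error rate with DNA proofreading and repair

print(f"RNA-virus limit:    ~{max_genome_length(u_rna):,.0f} bases")
print(f"DNA-organism limit: ~{max_genome_length(u_dna):,.0f} bases")
```

With these illustrative numbers, a poliovirus genome (roughly 7,500 bases) sits right at the RNA limit (about 6,900 bases for sigma = 2), which is one way to picture “existing close to the edge of error catastrophe”: even a modest increase in u pushes the genome past the threshold.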
“One very important fact to remember is that radiation increases the spontaneous mutation rate” — Nuclear Regulatory Commission (NRC) power plant training manual on the effects of ionizing radiation
Another important fact is that viruses are not organisms or cells and have no strategies for survival! The authors of the ‘error catastrophe’ paper are really telling us that the “edge of viability” on which polioviruses (and others) exist needs maintenance to prevent their extinction. Circulating polioviruses were determined in 2000 to be all vaccine-derived, and the buzz around the conference tables of the WHO and other agencies raised the issue of stopping the inoculation programs. Very soon thereafter, a complete scratch poliovirus made from mail-order chemicals blitzed the science headlines, invoking a new threat from terrorists. By way of comparison, organisms do have an array of survival strategies.