DNA computing is a form of computing that uses DNA and molecular biology instead of traditional silicon-based computer technologies.
A changing view of biology has given rise to the new field of "DNA computers". This field seeks to use biological molecules such as DNA and RNA to solve basic mathematical problems. Fundamentally, many of these experiments recapitulate natural evolutionary processes that take place in biology, especially during the early evolution of life and the creation of genes. The field explores not only new uses of nucleic acids as the building blocks of life but also complex architectural structures and even biomolecular-computing "hardware". It also examines information processing in cells, which have had billions of years to evolve and perfect a "molecular computer".
This field was initially developed by Leonard Adleman of the University of Southern California. In 1994, Adleman demonstrated a proof-of-concept use of DNA as a computational medium, solving a seven-node Hamiltonian path problem. Since Adleman's initial experiments, advances have been made, and various Turing machines have been shown to be constructible.
Work has since been done on one-dimensional strands, two-dimensional tiles, and even three-dimensional DNA graph processing.
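Adleman's experiment can be mirrored in software. The sketch below, a hypothetical brute-force search over a toy graph (not Adleman's original seven-node instance), enumerates candidate orderings one at a time; this is exactly the work that a test tube of DNA strands performs all at once in parallel.

```python
# Brute-force Hamiltonian path search: the computation that Adleman's
# DNA experiment massively parallelised. A silicon machine must test
# candidate paths one by one; in the test tube, all random paths form
# simultaneously and wrong ones are filtered out chemically.
from itertools import permutations

def hamiltonian_paths(vertices, edges, start, end):
    """Return every ordering that visits each vertex exactly once,
    starts at `start`, ends at `end`, and follows directed edges."""
    found = []
    for order in permutations(vertices):
        if order[0] != start or order[-1] != end:
            continue
        if all((a, b) in edges for a, b in zip(order, order[1:])):
            found.append(order)
    return found

# Toy 4-node directed graph, chosen only for illustration.
vertices = {0, 1, 2, 3}
edges = {(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)}
print(hamiltonian_paths(vertices, edges, 0, 3))  # → [(0, 1, 2, 3)]
```

The `permutations` loop makes the exponential cost explicit: the number of candidate orderings grows factorially with the number of vertices, which is why the parallelism of molecules looked so attractive.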
DNA computers have been coupled with input and output modules, making them capable of diagnosing cancerous activity within a cell and then releasing an anti-cancer drug upon diagnosis. DNA computing is fundamentally similar to parallel computing in that it takes advantage of the many different molecules of DNA to try many different possibilities at once.
For certain specialized problems, DNA computers are faster and smaller than any other computer built so far. But DNA computing does not provide any new capabilities from the standpoint of computational complexity theory, the study of which computational problems are difficult. For example, problems whose resource requirements grow exponentially with the size of the problem (EXPSPACE problems) on von Neumann machines still grow exponentially with the size of the problem on DNA machines. For very large EXPSPACE problems, the amount of DNA required is too large to be practical.
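A rough back-of-envelope sketch makes this concrete. The figures below (strand length per city, nucleotide mass) are assumptions chosen for illustration only, not values from the text, but any reasonable choices give the same conclusion: the mass of DNA needed for an exhaustive search grows factorially.

```python
# Back-of-envelope sketch of why brute-force DNA search does not scale.
# All physical figures here are rough assumptions for illustration.
import math

NUCLEOTIDE_MASS_G = 5.5e-22  # ~330 g/mol divided by Avogadro's number
BASES_PER_CITY = 20          # assumed encoding: 20 bases per city

def dna_mass_grams(n_cities):
    """Mass of DNA holding one strand per candidate city ordering."""
    strands = math.factorial(n_cities)               # candidate orderings
    strand_mass = BASES_PER_CITY * n_cities * NUCLEOTIDE_MASS_G
    return strands * strand_mass

for n in (7, 20, 40, 60):
    print(n, f"{dna_mass_grams(n):.2e} g")
```

At 7 cities the DNA fits in a fraction of a drop; by around 60 cities the required mass exceeds that of the Earth, which is the practical barrier the paragraph above describes.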
It was in 1994 that Adleman first used DNA, the molecule our genes are made of, to solve a simple version of the "traveling salesman" problem. In this classic conundrum, the task is to find the most efficient path through several cities; given enough cities, the problem can challenge even a supercomputer. Adleman demonstrated that the billions of molecules in a drop of DNA contained raw computational power that might, just might, overwhelm silicon. But since then, scientists have run into tough practical and theoretical barriers. As Adleman and others in the field have come to realize, there may never be a computer made from DNA that directly rivals today's silicon-based microelectronics.
Main Features:
1. High-precision, high-speed, low-cost gene expression profiling.
Through the complete automation of all processes, from sample injection to the completion of quantitative reactions, the time required has been greatly shortened, to just six hours. By comparison, the conventional manual process based on microarrays typically takes three days. Fitted with a maximum of 96 wells, each of which can contain up to 100 different DNA strands able to bind to specific DNA sequences, the computer enables a large number of high-precision calculations to be performed simultaneously.
2. Versatile artificial DNA fragments for reactions designed using special software.
The artificial DNA fragments that make up the memory of the DNA computer (that is, the medium that allows computational calculations to be performed) are designed using special proprietary software so that they have regular physicochemical properties and do not interfere with each other. Since this makes the so-called hybridization process extremely precise, the program being computed can be changed by altering the composition of the DNA fragments chosen. This makes the computer highly versatile, and capable of being optimized for a wide range of analytical techniques applicable to different diseases, or for the analysis of the genomes of various organisms.
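As a minimal sketch of that design constraint, the hypothetical check below flags strand libraries whose members could cross-hybridize. The real proprietary software is far more sophisticated, accounting for properties such as melting temperature and secondary structure; this sketch only tests for shared complementary runs.

```python
# Sketch of the strand-design constraint: no strand's Watson-Crick
# complement should bind (share a long complementary run with) any
# other strand in the library, or hybridization becomes ambiguous.
COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def watson_crick_complement(strand):
    """Reverse complement: the sequence this strand hybridizes to."""
    return "".join(COMPLEMENT[b] for b in reversed(strand))

def cross_hybridizes(library, min_overlap=4):
    """True if any strand's complement shares a run of at least
    `min_overlap` bases with a different strand in the library."""
    for i, a in enumerate(library):
        comp = watson_crick_complement(a)
        for j, b in enumerate(library):
            if i == j:
                continue
            for k in range(len(comp) - min_overlap + 1):
                if comp[k:k + min_overlap] in b:
                    return True
    return False

# Toy library: the second strand contains part of the first strand's
# complement, so this library would be rejected.
print(cross_hybridizes(["ATGGTC", "CCATAG", "GTTGCA"]))  # → True
```

Design software iterates this kind of check (with much richer thermodynamic models) over candidate sequences until every strand binds only its intended partner.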
3. Reliability of reactions improved through introduction of Magtration technology.
The reliability of the reactions that underpin the computational abilities of the computer has been greatly improved through the system's use of Magtration technology, which enables precise control of magnetic beads using specially designed disposable tips and magnets.