Paper published in a book (Scientific congresses, symposiums and conference proceedings)
An Analog Chemical Circuit with Parallel-Accessible Delay Line for Learning Temporal Tasks
Banda, Peter; Teuscher, Christof
2014 • In Sayama, Hiroki; Rieffel, John; Risi, Sebastian et al. (Eds.) Artificial Life 14: Proceedings of the Fourteenth International Conference on the Synthesis and Simulation of Living Systems
chemical delay line; chemical perceptron; chemical reaction network; analog asymmetric signal perceptron; temporal learning; chemical computing
Abstract :
[en] Current synthetic chemical systems lack the ability to self-modify and learn to solve desired tasks. In this paper we introduce a new parallel model of a chemical delay line, which stores past concentrations over time with minimal latency. To enable temporal processing, we integrate the delay line with our previously proposed analog chemical perceptron. We show that we can successfully train our new memory-enabled chemical learner on four non-trivial temporal tasks: the linear moving weighted average, the moving maximum, and two variants of the Nonlinear AutoRegressive Moving Average (NARMA). Our implementation is based on chemical reaction networks and follows mass-action and Michaelis-Menten kinetics. We show that despite a simple design and limited resources, a single chemical perceptron extended with memory of variable size achieves 93-99% accuracy on the above tasks. Our results present an important step toward actual biochemical systems that can learn and adapt. Such systems have applications in biomedical diagnosis and smart drug delivery.
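The temporal targets named in the abstract can be illustrated with their standard benchmark definitions. The sketch below (plain Python, no chemistry) generates a linearly weighted moving average and a second-order NARMA sequence in the style of Atiya and Parlos (2000); the exact window sizes, coefficients, and NARMA variants used in the paper may differ, so treat this as an illustrative assumption rather than the authors' specification.

```python
import random

def weighted_moving_average(u, window=3):
    # Linearly weighted average of the last `window` inputs;
    # more recent samples receive proportionally larger weights.
    weights = list(range(1, window + 1))
    out = []
    for t in range(len(u)):
        past = u[max(0, t - window + 1): t + 1]
        w = weights[-len(past):]
        out.append(sum(wi * ui for wi, ui in zip(w, past)) / sum(w))
    return out

def narma2(u):
    # Second-order NARMA recurrence (a common benchmark form);
    # the paper's two NARMA variants may use other coefficients.
    y = [0.0, 0.0]
    for t in range(1, len(u) - 1):
        y.append(0.4 * y[t] + 0.4 * y[t] * y[t - 1]
                 + 0.6 * u[t] ** 3 + 0.1)
    return y

random.seed(0)
u = [random.uniform(0.0, 0.5) for _ in range(20)]  # random input stream
wma = weighted_moving_average(u)
nar = narma2(u)
```

Because every target value depends on a short history of past inputs, a memoryless perceptron cannot compute it; this is precisely the gap the parallel-accessible chemical delay line is meant to close.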
Disciplines :
Computer science; Chemistry
Author, co-author :
Banda, Peter ; Portland State University > Department of Computer Science
Teuscher, Christof
External co-authors :
no
Language :
English
Title :
An Analog Chemical Circuit with Parallel-Accessible Delay Line for Learning Temporal Tasks
Publication date :
2014
Event name :
ALIFE 14: The Fourteenth Conference on the Synthesis and Simulation of Living Systems
Event place :
New York, United States
Event date :
from 30-07-2014 to 02-08-2014
Audience :
International
Main work title :
Artificial Life 14: Proceedings of the Fourteenth International Conference on the Synthesis and Simulation of Living Systems