Extended Self-Replication
Reference
- Gabor, T., Illium, S., Zorn, M., et al. 2022. Self-replication in neural networks. Artificial Life 28, 2, 205–223.

This journal article extends earlier foundational work on self-replicating neural networks (Gabor et al., 2019) with a more in-depth exploration. It investigates backpropagation-like mechanisms used not for typical supervised learning but as a means of achieving non-trivial self-replication, in which networks learn to reproduce their own connection weights.
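
As a concrete illustration, the following minimal sketch (a hypothetical setup, not the paper's exact architecture or weight encoding) trains a small PyTorch network by gradient descent to output the value of each of its own weights from a simple positional code; a loss of zero would mean the network reproduces its own connection weights exactly.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Small network; the input is a 3-number positional code for one of its own
# weights, and the output is the predicted value of that weight.
net = nn.Sequential(nn.Linear(3, 16), nn.Tanh(), nn.Linear(16, 1))
opt = torch.optim.SGD(net.parameters(), lr=0.01)

def positions_and_targets(model):
    """Enumerate every scalar parameter as (positional code, current value)."""
    codes, targets = [], []
    for p_idx, p in enumerate(model.parameters()):
        flat = p.detach().flatten()
        n = flat.numel()
        for w_idx in range(n):
            # Hypothetical encoding (an assumption of this sketch): which
            # parameter tensor, normalized position inside it, and its
            # tensor rank (distinguishing weight matrices from biases).
            codes.append([float(p_idx), w_idx / max(n - 1, 1), float(p.dim())])
            targets.append(flat[w_idx])
    return torch.tensor(codes), torch.stack(targets)

for step in range(2000):
    x, y = positions_and_targets(net)   # targets are detached snapshots of
    pred = net(x).squeeze(-1)           # the weights currently being trained
    loss = ((pred - y) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

print(f"self-replication error: {loss.item():.6e}")
```

Because the regression targets are snapshots of the very weights being updated, each gradient step chases a moving target; whether such a process settles into a network that emits its own weights exactly is the fixpoint question the paper analyzes.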
Key extensions and analyses presented in this work include:
- Robustness Analysis: A systematic evaluation of the self-replicating networks’ resilience and stability when subjected to various levels of noise during the replication process.
- Artificial Chemistry Environments: Further development and analysis of simulated environments where populations of self-replicating networks interact, leading to observable emergent collective behaviors and ecosystem dynamics.
- Dynamical Systems Perspective: A detailed theoretical analysis of self-replication viewed as a dynamical system, identifying fixpoint weight configurations (networks that replicate themselves perfectly) and characterizing their attractor basins (the regions of weight space from which networks converge to a given fixpoint); see the sketch after this list.
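
To make the dynamical-systems reading concrete, here is a toy sketch in which an invented contractive map F stands in for one self-training pass; in the paper the map is the actual training procedure, which, unlike this toy, also admits non-trivial fixpoints. Iterating F locates a fixpoint, and injecting noise into the map gives a crude probe of the robustness question from the first bullet.

```python
import numpy as np

rng = np.random.default_rng(0)

def F(theta, noise_std=0.0):
    """Toy stand-in for one self-training pass: a contractive map on
    weight space, optionally perturbed by replication noise. Its only
    fixpoint is theta = 0 (a trivial replicator); the paper's real
    training map also admits non-trivial fixpoints."""
    return 0.5 * np.tanh(theta) + noise_std * rng.standard_normal(theta.shape)

def iterate_to_fixpoint(theta, noise_std=0.0, tol=1e-10, max_iter=1000):
    """Iterate F until consecutive weight vectors stop changing."""
    for t in range(max_iter):
        nxt = F(theta, noise_std)
        if np.linalg.norm(nxt - theta) < tol:
            return nxt, t
        theta = nxt
    return theta, max_iter

# Noise-free: any initialization in the attractor basin converges.
theta_star, steps = iterate_to_fixpoint(rng.standard_normal(8))
print(f"fixpoint residual after {steps} steps: "
      f"{np.linalg.norm(F(theta_star) - theta_star):.2e}")

# With replication noise: trajectories hover near the fixpoint instead of
# reaching it, which is one simple way to quantify robustness.
theta_noisy, _ = iterate_to_fixpoint(rng.standard_normal(8), noise_std=0.01)
print(f"distance from fixpoint under noise: {np.linalg.norm(theta_noisy):.3f}")
```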

By delving deeper into the mechanisms, robustness, emergent properties, and underlying dynamics of self-replication, this study advances the understanding of how self-replication can be achieved and analyzed in neural network models, contributing insights to artificial life and complex systems research. [Gabor et al. 2022]