Thread: Differentiable Self-organizing Systems

How can we construct robust, general-purpose self-organizing systems?

Self-organization is omnipresent at all scales of biological life: complex interactions between molecules form structures such as proteins; cell colonies achieve global goals, like exploration, through individual cells collaborating and communicating; and humans form collectives in society, such as tribes, governments, and countries. The old adage “the whole is greater than the sum of its parts”, often ascribed to Aristotle, rings true everywhere we look.

The articles in this thread focus on practical ways of designing self-organizing systems. In particular, we use differentiable programming (optimization) to learn agent-level policies that satisfy system-level objectives. The cross-disciplinary nature of this thread aims to facilitate the exchange of ideas between the machine learning and developmental biology communities.
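
The shared recipe can be summarized in a few lines. Below is a minimal sketch in PyTorch (names such as update_rule are illustrative assumptions, not code from any of the articles): a learned local update rule is rolled out for many steps, a system-level loss is evaluated on the outcome, and gradients flow back through the entire rollout.

import torch

def train_step(update_rule, optimizer, seed, target, n_steps=64):
    """One optimization step: rollout -> system-level loss -> backprop."""
    state = seed
    for _ in range(n_steps):
        state = update_rule(state)          # purely local, differentiable
    loss = ((state - target) ** 2).mean()   # global objective on the outcome
    optimizer.zero_grad()
    loss.backward()                         # credit assignment through time
    optimizer.step()
    return loss.item()

Backpropagating through a long rollout is the expensive part; much of the craft in the articles below lies in making this kind of training stable and affordable.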

Articles & Comments

Distill has invited several researchers to publish a “thread” of short articles exploring differentiable
self-organizing systems,
interspersed with critical commentary from several experts in adjacent fields.
The thread will be a living document, with new articles added over time.
Articles and comments are presented below in chronological order:

Growing Neural Cellular Automata

Building their own bodies is the very first skill all living creatures possess. How can we design systems that grow, maintain, and repair themselves by regenerating damage? This work investigates morphogenesis, the process by which living creatures self-assemble their bodies. It proposes a differentiable cellular automata model of morphogenesis and shows how such a model learns a robust, persistent set of dynamics that can grow an arbitrary structure starting from a single cell.
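
As a concrete illustration, here is a rough PyTorch sketch of a single neural CA update step in the spirit of the article: each cell perceives its neighbourhood through fixed convolution filters, a small learned network proposes a state change, and cells update stochastically. The channel count and layer sizes are assumptions for illustration, and details such as the article’s alive-masking are omitted.

import torch
import torch.nn.functional as F

CHANNELS = 16  # e.g. RGB + alpha + hidden channels per cell

# Fixed perception filters: identity plus Sobel gradients in x and y.
sobel_x = torch.tensor([[-1., 0., 1.], [-2., 0., 2.], [-1., 0., 1.]]) / 8.0
identity = torch.zeros(3, 3)
identity[1, 1] = 1.0
kernels = torch.stack([identity, sobel_x, sobel_x.T])   # (3, 3, 3)
kernels = kernels.repeat(CHANNELS, 1, 1).unsqueeze(1)   # (3*C, 1, 3, 3)

# The learned part: a tiny per-cell network, shared by every cell.
update_net = torch.nn.Sequential(
    torch.nn.Conv2d(3 * CHANNELS, 128, kernel_size=1), torch.nn.ReLU(),
    torch.nn.Conv2d(128, CHANNELS, kernel_size=1),
)

def ca_step(x, fire_rate=0.5):
    """One update of the grid; x has shape (batch, CHANNELS, H, W)."""
    perception = F.conv2d(x, kernels, padding=1, groups=CHANNELS)
    dx = update_net(perception)
    # Stochastic updates: each cell fires independently, pushing the
    # learned dynamics to tolerate asynchrony.
    fire = (torch.rand_like(x[:, :1]) < fire_rate).float()
    return x + dx * fire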

Read Full Article

Self-classifying MNIST Digits

This work presents a follow-up to Growing Neural Cellular Automata, using a similar computational model for the task of digit “self-classification”. The authors show how neural CAs can self-classify the MNIST digit they form. The resulting CAs can be interacted with by dynamically changing the underlying digit, and they respond to such perturbations with a learned, self-correcting classification behaviour.
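
One way to picture the objective (a hedged sketch, not the authors’ code; the channel layout here is an assumption): every cell reserves part of its state vector for class logits, and each cell on the digit is trained to predict the label independently, so agreement must emerge from local communication alone.

import torch
import torch.nn.functional as F

N_CLASSES = 10

def self_classification_loss(state, digit_mask, label):
    """state: (CHANNELS, H, W) cell states; digit_mask: (H, W) bool mask of
    cells belonging to the digit; label: the ground-truth digit (int)."""
    logits = state[-N_CLASSES:]                    # last channels hold logits
    logits = logits.permute(1, 2, 0)[digit_mask]   # (n_cells, N_CLASSES)
    labels = torch.full((logits.shape[0],), label, dtype=torch.long)
    # Every living cell is asked for the answer, not just a readout node.
    return F.cross_entropy(logits, labels)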

Read Full Article

Self-Organising Textures

Here the authors apply neural cellular automata to a new domain: texture synthesis. They begin by training NCA to mimic a series of textures taken from template images. Then, taking inspiration from the adversarial camouflage that appears in nature, they use NCA to create textures that maximally excite neurons in a pretrained vision model. These results show that a simple model combined with well-known objectives can lead to robust and unexpected behaviors.
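
The texture-mimicking half of the objective is commonly implemented as a Gram-matrix style loss on pretrained VGG features; the sketch below assumes that formulation (the layer indices, network choice, and normalization are illustrative, and the neuron-excitation objective is not shown).

import torch
import torchvision

# Frozen feature extractor; inputs are assumed ImageNet-normalized RGB.
vgg = torchvision.models.vgg16(weights="IMAGENET1K_V1").features.eval()
for p in vgg.parameters():
    p.requires_grad_(False)

def gram(feats):
    b, c, h, w = feats.shape
    f = feats.reshape(b, c, h * w)
    return f @ f.transpose(1, 2) / (h * w)   # (b, c, c) feature statistics

def texture_loss(nca_image, template, layers=(3, 8, 15, 22)):
    """Match second-order VGG feature statistics between the NCA's
    rendered image and the template texture."""
    loss, x, y = 0.0, nca_image, template
    for i, layer in enumerate(vgg):
        x, y = layer(x), layer(y)
        if i in layers:
            loss = loss + ((gram(x) - gram(y)) ** 2).mean()
        if i == max(layers):
            break
    return loss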

Read Full Article

This is a living document

Expect more articles on this topic, along with critical comments from
experts.

Get Involved

The Self-Organizing Systems thread is open to articles exploring differentiable self-organizing systems. Critical commentary and discussion of existing articles are also welcome. The thread is organized through the open #selforg channel on the Distill Slack. Articles can be suggested there, and will be included at the discretion of previous authors in the thread, or, in the case of disagreement, by an uninvolved editor.

If you would like to get involved but don’t know where to start, ask in the channel; small projects may be available.

About the Thread Format

Part of Distill’s mandate is to experiment with new forms of scientific publishing. We believe that reconciling faster, more continuous approaches to publication with review and discussion is an important open problem in scientific publishing.

Threads are collections of short articles, experiments, and critical commentary around a narrow or unusual research topic, along with a Slack channel for real-time discussion and collaboration. They are intended to be earlier-stage than a full Distill paper, and to allow for more fluid publishing, feedback, and discussion. We also hope they’ll allow for wider participation. Think of a cross between a Twitter thread, an academic workshop, and a book of collected essays.

Threads are very much an experiment. We think it’s possible they’re a great format, and also possible they’re terrible. We plan to trial two such threads and then re-evaluate our thinking on the format.


Citation Information

If you wish to cite this thread as a whole, citation information can be found below. Authors are listed in alphabetical order and include all participants in the thread. Since this is a living document, additional authors may be added as the thread evolves. You can also cite individual articles using the citation information provided at the bottom of the corresponding article.



Updates and Corrections

If you see mistakes or want to suggest changes, please create an issue on GitHub.

Reuse

Diagrams and text are licensed under Creative Commons Attribution CC-BY 4.0 with the source available on GitHub, unless noted otherwise. The figures that have been reused from other sources don’t fall under this license and can be recognized by a note in their caption: “Figure from …”.

Citation

For attribution in academic contexts, please cite this work as

Mordvintsev, et al., "Thread: Differentiable Self-organizing Systems", Distill, 2020.

BibTeX citation

@article{mordvintsev2020thread,
  author = {Mordvintsev, Alexander and Randazzo, Ettore and Niklasson, Eyvind and Levin, Michael and Greydanus, Sam},
  title = {Thread: Differentiable Self-organizing Systems},
  journal = {Distill},
  year = {2020},
  note = {https://distill.pub/2020/selforg},
  doi = {10.23915/distill.00027}
}




