# Effects of topology on network evolution


## Revision as of 15:23, 16 March 2009

## Effects of topology on network evolution

Panos Oikonomou and Philippe Cluzel

Nature Physics | vol. 2 | August 2006

**Keywords:** networks

### Summary

At a fundamental level, physical systems can be characterized by the network of interactions between the components, or nodes. Many natural systems are classified as "scale-free" networks, in which there is a power-law distribution in the number of connections per node. This differs from a homogeneous random network, in which there is a Poisson distribution in the number of connections.
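The contrast between the two degree distributions can be illustrated with a short simulation. This is a minimal sketch using only the standard library; the function names, network sizes, and parameter values are illustrative choices, not taken from the paper:

```python
import random

def poisson_like_degrees(n, k_avg, seed=0):
    """Degrees of an Erdos-Renyi random graph: each possible edge exists
    with probability p = k_avg / (n - 1), giving a binomial (approximately
    Poisson for large n) degree distribution."""
    rng = random.Random(seed)
    p = k_avg / (n - 1)
    deg = [0] * n
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                deg[i] += 1
                deg[j] += 1
    return deg

def power_law_degrees(n, gamma, k_max, seed=0):
    """Sample node degrees from P(k) ~ k^(-gamma), k = 1..k_max,
    via weighted sampling on the discrete distribution."""
    rng = random.Random(seed)
    weights = [k ** (-gamma) for k in range(1, k_max + 1)]
    return rng.choices(range(1, k_max + 1), weights=weights, k=n)

random_deg = poisson_like_degrees(500, k_avg=4)
sf_deg = power_law_degrees(500, gamma=2.5, k_max=100)
# The random network clusters tightly around the mean degree; the
# scale-free sample has many low-degree nodes plus a few hubs.
print("random net: mean degree ~", sum(random_deg) / len(random_deg))
print("scale-free: median degree =", sorted(sf_deg)[250], ", max =", max(sf_deg))
```

Plotting a histogram of `sf_deg` on log-log axes would show the approximately straight line characteristic of a power law, versus the peaked distribution of `random_deg`.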

The authors of this paper use a simple mathematical model to demonstrate how such systems may have evolved through natural selection. Specifically, they create a neural network whose nodes are "neurons" that can be either on or off. Each node may be connected to any of the other nodes, with each connection having its own strength (weight). Each node sums the products of its inputs and the corresponding connection weights to determine whether to turn on or off (**a**). This update is repeated over many cycles to study the dynamics of the network. The "fitness" of each network is evaluated by how well its output neuron replicates a target sequence (**b**). Just as in Darwinian evolution, a collection of these neural networks undergoes random mutations, and the most successful (i.e., highest-fitness) offspring are selected for the next generation (**c**).
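The update and scoring rules described above can be sketched in a few lines. This is a hedged reconstruction, not the authors' exact model: the weight convention, threshold values, and the single-`out_node` scoring rule are assumptions made for illustration.

```python
def step(state, weights, thresholds):
    """One synchronous update of a threshold network: node i turns on
    iff the weighted sum of its inputs exceeds its threshold.
    weights[j][i] is the (assumed) weight of the connection j -> i."""
    n = len(state)
    return [1 if sum(weights[j][i] * state[j] for j in range(n)) > thresholds[i]
            else 0
            for i in range(n)]

def fitness(weights, thresholds, init_state, target, out_node=0):
    """Fraction of time steps on which a designated output node matches
    the target on/off sequence (a hypothetical scoring rule)."""
    state, hits = list(init_state), 0
    for t_bit in target:
        state = step(state, weights, thresholds)
        hits += (state[out_node] == t_bit)
    return hits / len(target)

# Two mutually exciting nodes form an oscillator that tracks an
# alternating target sequence perfectly.
w = [[0.0, 1.0],
     [1.0, 0.0]]
th = [0.5, 0.5]
print(fitness(w, th, init_state=[1, 0], target=[0, 1, 0, 1]))  # → 1.0
```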

By studying the fitness of the system over the course of hundreds of generations, the researchers observed a clear difference between the random (**a**) and scale-free (**b**) networks: the random networks improved their fitness in large, infrequent jumps, whereas the scale-free networks approached optimal fitness faster, through many small steps. For the random networks, the researchers varied the average number of connections per node (K); for the scale-free networks, they examined the effect of the exponent γ of the power-law degree distribution.
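The mutation–selection loop behind these fitness trajectories can be sketched as a greedy elitist (1+λ) scheme. This simplifies the paper's population-based evolution, and the mutation rule here (randomly redrawing individual weights) is an assumption for illustration:

```python
import random

def mutate(weights, rate, rng):
    """Replace each weight with a fresh random value with probability
    `rate` (a simplified stand-in for the paper's mutations)."""
    n = len(weights)
    return [[rng.uniform(-1, 1) if rng.random() < rate else weights[j][i]
             for i in range(n)] for j in range(n)]

def evolve(fitness_fn, w0, generations, n_offspring, rate, seed=0):
    """Each generation, keep the current best ("elitism") alongside its
    mutated offspring and select the fittest candidate, so recorded
    fitness never decreases."""
    rng = random.Random(seed)
    best = w0
    history = [fitness_fn(best)]
    for _ in range(generations):
        candidates = [best] + [mutate(best, rate, rng) for _ in range(n_offspring)]
        best = max(candidates, key=fitness_fn)
        history.append(fitness_fn(best))
    return best, history

# Toy fitness for demonstration: closeness of the weight matrix to an
# (arbitrary) target matrix of 0.5s, instead of the paper's sequence-matching.
target_fit = lambda w: -sum((w[j][i] - 0.5) ** 2 for j in range(3) for i in range(3))
w0 = [[0.0] * 3 for _ in range(3)]
best, history = evolve(target_fit, w0, generations=30, n_offspring=10, rate=0.2)
```

Running the same loop with the threshold-network fitness on random versus scale-free wiring diagrams is, in spirit, the paper's experiment; plotting `history` for the two topologies would reproduce the jump-like versus gradual trajectories described above.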

### Connection to soft matter

This paper suggests viewing complex matter on a continuum between non-living "hard" matter (such as crystals) and biological systems. The long chains of polymers allow interactions between more widely separated components, leading to more complex behavior. It may be productive to apply concepts from network and graph theory to polymers and other soft condensed-matter systems.

written by: Naveen Sinha