Network Robustness and the Next Net

I am posting this because I’ve recently been asked about it in conversations around building the “#nextnet”.

There is a plethora of work on network robustness and resilience; here are a few starting points:

Error and attack tolerance of complex networks
Réka Albert, Hawoong Jeong & Albert-László Barabási
NATURE | VOL 406 | 27 JULY 2000

Many complex systems display a surprising degree of tolerance against errors. For example, relatively simple organisms grow, persist and reproduce despite drastic pharmaceutical or environmental interventions, an error tolerance attributed to the robustness of the underlying metabolic network. Complex communication networks display a surprising degree of robustness: although key components regularly malfunction, local failures rarely lead to the loss of the global information-carrying ability of the network. The stability of these and other complex systems is often attributed to the redundant wiring of the functional web defined by the systems’ components. Here we demonstrate that error tolerance is not shared by all redundant systems: it is displayed only by a class of inhomogeneously wired networks, called scale-free networks, which include the World-Wide Web, the Internet, social networks and cells. We find that such networks display an unexpected degree of robustness, the ability of their nodes to communicate being unaffected even by unrealistically high failure rates. However, error tolerance comes at a high price in that these networks are extremely vulnerable to attacks (that is, to the selection and removal of a few nodes that play a vital role in maintaining the network’s connectivity). Such error tolerance and attack vulnerability are generic properties of communication networks.

Network Robustness and Fragility: Percolation on Random Graphs
D. S. Callaway, M. E. J. Newman, S. H. Strogatz, D. J. Watts

Opinions appear to differ over whether networks such as this are robust or fragile to this selective removal of vertices. Albert et al. point out that only a small fraction of the highest-degree vertices need be removed to destroy the giant component in the network and hence remove all long-range connectivity. Conversely, Broder et al. point out that one can remove all vertices with degree greater than kmax and still have a giant component even for surprisingly small values of kmax. As we show in Fig. 2, both viewpoints are correct: they are merely different representations of the same data…. Among other results, we find that a distribution network such as the Internet, which has an approximately power-law vertex degree distribution, should be highly robust against random removal of nodes (for example, random failure of routers), but is relatively fragile, at least in terms of fraction of nodes removed, to the specific removal of the most highly connected nodes.

To summarize with an example: networks built around a few hubs are robust to random failures but not to targeted attacks. Why? Because a random failure is unlikely to hit a hub, whereas an attacker can target the hubs first.
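The contrast can be sketched with a small simulation. This is a minimal illustration, not code from either paper: the graph size, attachment parameter, and 5% removal fraction below are arbitrary choices, and the preferential-attachment generator is a simplified stand-in for a proper scale-free model.

```python
import random
from collections import defaultdict, deque

def barabasi_albert(n, m, rng):
    """Grow a rough scale-free graph: each new node links to m existing
    nodes, chosen with probability proportional to their degree."""
    adj = defaultdict(set)
    targets = list(range(m))      # the first new node links to the seed nodes
    repeated = []                 # node list where each node appears once per edge end
    for new in range(m, n):
        for t in targets:
            adj[new].add(t)
            adj[t].add(new)
        repeated.extend(targets)
        repeated.extend([new] * m)
        chosen = set()
        while len(chosen) < m:    # degree-proportional sampling via the repeated list
            chosen.add(rng.choice(repeated))
        targets = list(chosen)
    return adj

def giant_component(adj, removed):
    """Size of the largest connected component after deleting `removed`."""
    seen, best = set(), 0
    for start in adj:
        if start in removed or start in seen:
            continue
        queue, size = deque([start]), 0
        seen.add(start)
        while queue:
            v = queue.popleft()
            size += 1
            for w in adj[v]:
                if w not in removed and w not in seen:
                    seen.add(w)
                    queue.append(w)
        best = max(best, size)
    return best

rng = random.Random(42)
adj = barabasi_albert(1000, 2, rng)
k = int(len(adj) * 0.05)          # remove 5% of nodes either way

random_removed = set(rng.sample(sorted(adj), k))          # random failures
hubs = sorted(adj, key=lambda v: len(adj[v]), reverse=True)
targeted_removed = set(hubs[:k])                          # attack the hubs

print("giant component after random failures:", giant_component(adj, random_removed))
print("giant component after targeted attack:", giant_component(adj, targeted_removed))
```

Running this shows the asymmetry the papers describe: deleting 5% of nodes at random barely dents the giant component, while deleting the 5% highest-degree nodes fragments the network far more. A real study would use a library such as networkx and average over many graph realizations.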

The bottom line is that “robustness” is relational: robust to what? In the case of networks, different topologies fail in different ways. As a result, only a network of networks, i.e. a network that contains different kinds of edges (connections) operating simultaneously, exhibits multiple topologies and is therefore maximally robust to known kinds of damage.

