Talk:Complexity


    Reviewer A

    The article contains some interesting material. As it stands, however, it covers primarily topics that are related in one way or another to the algorithmic and information-theoretic aspects of complexity. There is practically no reference to dynamical aspects, which are of special relevance for physical systems and for the foundations of complexity theory. "Statistical complexity", mentioned in section 5, constitutes an exception in this respect. The term "Physical complexity" in the same section is misleading since, as the author promptly points out, it is specifically designed to address biological systems.

    In view of the foregoing, in this reviewer's opinion it would be more appropriate to entitle the article, in its present conception and form, "Complexity, Randomness and Information".

    As a minor comment, the term "nontrivial structure" in the opening sentence is not well defined. Furthermore, the graph in figure 1 seems to imply that there exists a generally accepted quantitative expression linking complexity and disorder, which is not the case.

    Reviewer B

    This is a nice overview of some aspects of complexity notions. There is of course no way to be complete in such a short article. Nevertheless, I would suggest adding a few more references. For example, when discussing algorithmic methods, it would be nice to include Grassberger's effective measure complexity. It has the same properties as the others (namely, it basically quantifies irregularity), but it also has an information-theoretic interpretation, in terms of the conditional entropy of a particular symbol in the sequence given the n preceding symbols. Also, Crutchfield's statistical complexity essentially falls into the same category. Furthermore, Gell-Mann and Lloyd's effective complexity is defined as a mutual Kolmogorov complexity. Adami's physical complexity is defined on the same level and is not at all restricted to biological objects. Instead, it is designed to estimate the complexity of any sequence that is about a physical world (as opposed to the rules of mathematics). Biological sequence complexity is just one application. Of course, it is the most immediate one, because evolution produces the ensemble of sequences necessary to estimate the mutual entropy. But it is general enough that the complexity of any sequence can be estimated, provided an ensemble of similar sequences exists.
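    To make the block-entropy interpretation concrete, here is a minimal sketch (not from the article or the reviews) of how the effective measure complexity could be estimated from a finite symbol sequence. The function names, the plug-in entropy estimator, and the cutoff max_n are illustrative assumptions; plug-in estimates of this kind are biased for short sequences.

        import math
        from collections import Counter

        def block_entropy(seq, n):
            # Plug-in Shannon entropy H(n) of length-n blocks, in bits.
            blocks = [tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)]
            total = len(blocks)
            counts = Counter(blocks)
            return -sum((c / total) * math.log2(c / total) for c in counts.values())

        def effective_measure_complexity(seq, max_n=6):
            # EMC ~ sum over n of (h_n - h), where h_n = H(n) - H(n-1) is the
            # conditional entropy of a symbol given the n-1 preceding symbols,
            # and the entropy rate h is approximated here by h_{max_n}.
            H = [0.0] + [block_entropy(seq, n) for n in range(1, max_n + 1)]
            h = [H[n] - H[n - 1] for n in range(1, max_n + 1)]
            return sum(hn - h[-1] for hn in h)

    On a period-two sequence such as 0101... this estimate comes out at about 1 bit (the memory needed to fix the phase), while for an i.i.d. coin-flip sequence every conditional entropy h_n is already close to the entropy rate and the estimate is near zero.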

    Physical complexity has been used to estimate the complexity of biomolecules. The group of Jack Szostak has evolved ribozymes whose functional activity can be measured, along with their structural complexity. They find that the physical (or informational) complexity goes hand in hand with the measured functional and structural complexity of these molecules. The reference is Carothers et al., Journal of the American Chemical Society 126 (2004) 5130.

    Finally, the last paragraph on "Why complexity" fails to mention evolution. Complexity can be due to intelligent design or to evolution, but in the case of evolution the "Why" is clear: an organism's survival depends on its ability to make predictions about its environment that are better than random. The ability to predict the environment better than a competitor leads directly to differential survival, and making better-than-random predictions requires information about the environment. That information, in turn, is complexity. Thus, complexity in biology, if interpreted as "information about the environment", simply has survival value.
