Wednesday, July 25, 2012

Three philosophical essays

From Algorithmic Information Theory:

Charles Bennett has discovered an objective measure of sophistication. An example of sophistication is the structure of an airplane. We couldn't just throw parts together into a vat, shake them up, and hope thereby to assemble a flying airplane. A flying structure is vastly improbable; it is far outnumbered by the wide variety of non-flying structures. The same would be true if we tried to design a flying plane by throwing a bunch of part templates down on a table and making a blueprint out of the resulting overlays.

On the other hand, an object can be considered superficial when it is not very difficult to recreate another object to perform its function. For example, a garbage pit can be created by a wide variety of random sequences of truckfuls of garbage; it doesn't matter much in which order the trucks come.

More examples of sophistication are provided by the highly evolved structures of living things, such as wings, eyes, brains, and so on. These could not have been thrown together by chance; they must be the result of an adaptive algorithm such as Darwin's algorithm of variation and selection. If we lost the genetic code for vertebrate eyes in a mass extinction, it would take nature a vast number of animal lifetimes to re-evolve them. A sophisticated structure has a high replacement cost.
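The essay gives no code, but Dawkins' familiar "weasel" demonstration makes the same point concretely (the demonstration is my illustrative addition, not part of the original text; the target phrase, mutation rate, and population size below are arbitrary choices): cumulative variation and selection typically reaches the target in a few hundred generations, while blind chance would need on the order of 27**28 random draws.

```python
import random
import string

TARGET = "METHINKS IT IS LIKE A WEASEL"   # Dawkins' example phrase
ALPHABET = string.ascii_uppercase + " "

def score(phrase):
    """Count the positions that already match the target."""
    return sum(a == b for a, b in zip(phrase, TARGET))

def mutate(phrase, rate=0.05):
    """Copy the phrase, occasionally altering a character."""
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in phrase)

def evolve(pop_size=100):
    """Variation plus selection: keep the best of each generation's offspring."""
    best = "".join(random.choice(ALPHABET) for _ in range(len(TARGET)))
    generations = 0
    while score(best) < len(TARGET):
        offspring = [mutate(best) for _ in range(pop_size)] + [best]
        best = max(offspring, key=score)
        generations += 1
    return generations

if __name__ == "__main__":
    print("reached the target in", evolve(), "generations")
    print("random assembly would expect roughly 27**28 =", 27**28, "attempts")
```

The contrast is the one the essay draws: the structure is reachable by an adaptive algorithm of variation and selection, but not by shaking parts together in a vat.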

Bennett calls the computational replacement cost of an object its logical depth. Loosely speaking, depth is the necessary number of steps in the causal path linking an object with its plausible origin. Formally, it is the time required by a universal Turing machine to compute the object from its most compressed description.
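Logical depth proper is defined over a universal Turing machine and is not computable in practice, but the intuition can be sketched in code (my illustration, not Bennett's formalism; the choice of the digits of e and the digit counts are arbitrary): a "deep" string has a tiny description yet a long regeneration computation, while a "shallow" random string is regenerated by merely copying its own, incompressible description.

```python
import random
import time
from decimal import Decimal, getcontext

def deep_object(digits=2000):
    """Stand-in for a deep object: many digits of e.  The generating
    description (this short function) is tiny, but reproducing the object
    takes a long chain of computation steps."""
    getcontext().prec = digits + 10
    total, term, k = Decimal(1), Decimal(1), 1
    while term > Decimal(10) ** (-(digits + 5)):
        term /= k          # next term of the series e = sum 1/k!
        total += term
        k += 1
    return str(total)[:digits]

def shallow_object(digits=2000):
    """Stand-in for a shallow object: a random digit string.  Its shortest
    description is essentially the string itself, and "computing" it from
    that description is mere copying, a causally short path."""
    return "".join(random.choice("0123456789") for _ in range(digits))

if __name__ == "__main__":
    t0 = time.perf_counter()
    deep_object()
    t_deep = time.perf_counter() - t0
    t0 = time.perf_counter()
    shallow_object()
    t_copy = time.perf_counter() - t0
    print(f"deep object:    short description, {t_deep:.4f} s to regenerate")
    print(f"shallow object: long description,  {t_copy:.4f} s to produce")
```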


From Objective versus Intersubjective Truth:

Post-Hayek and algorithmic information theory, we recognize that information-bearing codes can be computed (and in particular, that ideas can evolve from the interaction of people with each other over many lifetimes) which are:

(a) not feasibly rederivable from first principles,

(b) not feasibly and accurately refutable (given the existence of the code to be refuted), and

(c) not even feasibly and accurately justifiable (given the existence of the code to justify).

("Feasibility" is a measure of cost, especially the costs of computation and empircal experiment. "Not feasibly" means "cost not within the order of magnitude of being economically efficient": for example, not solvable within a single human lifetime. Usually the constraints are empirical rather than merely computational).

(a) and (b) are ubiquitous among highly evolved systems of interactions among richly encoded entities (whether that information is genetic or memetic). (c) is rarer, since many of these interpersonal games are likely no more difficult than NP-complete: solutions cannot be feasibly derived from scratch, but known solutions can be verified in feasible time. However, there are many problems, especially empirical problems requiring a "medical trial" over one or more full lifetimes, that don't even meet (c): it's infeasible to create a scientifically repeatable experiment. For the same reason, a scientific experiment cannot refute _any_ tradition dealing with interpersonal problems (b), because the experiment may not have run over enough lifetimes, and we don't know which computational or empirical class the interpersonal problem solved by the tradition falls into. One can scientifically refute traditional claims of a non-interpersonal nature, e.g. "God created the world in 4004 B.C.", but one cannot accurately refute metaphorical interpretations or imperative statements which apply to interpersonal relationships.
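A toy sketch of the asymmetry the NP-complete comparison relies on (subset-sum is my stock example here, not the essay's): verifying a proposed solution is cheap, while deriving one from scratch by brute force is, in the worst case, exponential in the size of the problem.

```python
from itertools import combinations

def verify(numbers, target, indices):
    """Checking a proposed solution is cheap: linear in the subset size."""
    return sum(numbers[i] for i in indices) == target

def search(numbers, target):
    """Deriving a solution from scratch by brute force examines up to 2**n
    subsets, which quickly becomes infeasible as n grows."""
    n = len(numbers)
    for r in range(n + 1):
        for indices in combinations(range(n), r):
            if sum(numbers[i] for i in indices) == target:
                return indices
    return None

if __name__ == "__main__":
    numbers = [267, 493, 869, 961, 1000, 1153, 1246, 1598, 1766, 1922]
    target = 493 + 961 + 1246 + 1922   # chosen so a solution is known to exist
    found = search(numbers, target)    # the expensive direction
    print("found:", found, "verified:", verify(numbers, target, found))
```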

As Dawkins has observed, death is vastly more probable than life. Cultural parts randomly thrown together, or thrown together by some computationally shallow line of reasoning, most likely result in a big mess rather than well-functioning relationships between people. The cultural beliefs which give rise to civilization are, like the genes which specify an organism, a highly improbable structure, surrounded in "meme space" primarily by structures which are far more dysfunctional. Most small deviations, and practically all "radical" deviations, result in the equivalent of death for the organism: a mass breakdown of civilization which can include genocide, mass poverty, starvation, plagues, and, perhaps most commonly and importantly, highly unsatisfying, painful, or self-destructive individual life choices.


From Hermeneutics: An Introduction to the Interpretation of Tradition:

Hermeneutics derives from the Greek hermeneutika, "message analysis", or "things for interpreting": the interpretation of tradition, the messages we receive from the past... Natural law theorists are trying to do a Heideggerian deconstruction when they try to find the original meaning and intent of the documents deemed to express natural law, such as codifications of English common law, the U.S. Bill of Rights, etc. For example, the question "would the Founding Fathers have intended the 1st Amendment to cover cyberspace?" is a paradigmatic hermeneutical question... [Hans-Georg] Gadamer saw the value of his teacher [Martin] Heidegger's dynamic analysis, and put it in the service of studying living traditions, that is to say traditions with useful applications, such as the law. Gadamer discussed the classical as a broad normative concept denoting that which is the basis of a liberal education. He discussed the historical process of Bewahrung, cumulative preservation, which, through constantly improving itself, allows something true to come into being. In the terms of evolutionary hermeneutics, a tradition is used and propagated because of its useful application, and its useful application constitutes its truth. Gadamer also discusses value in terms of the duration of a work's power to speak directly.




2 comments:

fortunato said...

What's a good starting point to learn more about Hayek?

nick said...

The Use of Knowledge in Society