Intelligence — Natural and Artificial

An American court has just granted two chimpanzees a writ of habeas corpus.  Legally, this constitutes them as persons.  For the first time in history, a human legal system has recognized that creatures other than ourselves can be “cognitively complex” enough to merit the status of legal personhood.

And this just in: the Defense Advanced Research Projects Agency (which, around fifty years ago, initiated the project that eventually yielded us the Internet) is inviting proposals for “Building Resource Adaptive Software Systems”.  By this they appear to mean systems intelligent enough to update themselves, and even to adapt to new hardware designs and platforms, without human intervention.


Clearly, the concept of intelligence is undergoing some powerful transformational pressures.  Remember when there were “IQ tests”?  Remember Mensa?  Does anybody still believe in IQ?

With regard to animal intelligence, we’ve come a long way.

Walk with me for a few minutes through an A.I. fantasy.  After a lifetime of waffling back and forth about the matter, I’m now convinced that some level of Artificial Intelligence will be active within the lifetime of my grandsons (who are now 19, 18, and 10 years old respectively).

What sort of behavior would we accept as “Artificial Intelligence”?  The best-known characterization of such behavior, for the past sixty-five years, has been the Turing Test.  But for riffing on the DARPA proposal, the Turing Test is too broad: it stipulates a system whose verbal behavior — say, on a chat line — is indistinguishable from that of a human.  (Think of the movie “Her”.)  We don’t need that for the DARPA proposal.

What we would need is a system that — whatever its primary purpose is — could

  • update itself
  • update any of its components, including not only current ones but also…
    • replacements for current ones, written in new computer languages
    • new ones, unforeseen at its creation
    • interfaces to newly installed hardware devices, and the drivers to operate them

OK, for a simple use-case, let’s imagine a system whose purpose is — oh, not military, for God’s sake —  maintaining archives.  Over the hundred years stipulated in the DARPA proposal, the technological infrastructure of the archives will likely go through several changes.  Just in the past fifty years, data memory has gone from ferrite-core to a bewildering variety of solid-state fabrications, while storage technologies have evolved from magnetic drums to tape to multi-terabyte hard drives.

A couple of propositions seem obvious:

  1. The system acts on its own initiative.  It does not wait for someone to press the “update” button.
  2. If it acts on its own initiative, it must have the capacity to “decide” when to do so.
  3. Items 1 and 2 can properly be identified, analogically, as a capacity for reflection.
  4. Such a capacity is necessarily independent of the other tasks assigned to the system.  It is executed by a subsystem to which all of the other system tasks appear as objects or processes.
  5. There will be some entirely practical boundaries to the system’s mutability.  It is not required to be able to transform itself, for example, from an archive maintainer into, say, the control system of an orbiting satellite.  Whatever its stipulated role, it maintains that role throughout its centuries-long lifetime.  It is not a shmoo.
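Propositions 1 through 5 can be sketched in code.  What follows is a minimal illustration, not anything from the DARPA proposal itself: every name in it (`Component`, `Reflector`, the archive example) is hypothetical, invented here to show how a reflective subsystem might treat the system’s other tasks as objects, decide on its own initiative when to swap one out, and stay bounded to its stipulated role.

```python
# A hypothetical sketch of propositions 1-4: a "reflective" subsystem
# to which the system's other tasks appear as objects it can inspect
# and replace.  All names here are invented for illustration.

from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class Component:
    """A task component the reflective layer can inspect and replace."""
    name: str
    version: int
    run: Callable[[], str]


class Reflector:
    """Acts on its own initiative (proposition 1): each cycle it decides
    (proposition 2) whether any component needs replacing, without an
    external 'update' button.  Its role is fixed (proposition 5)."""

    def __init__(self, role: str):
        self.role = role                      # role boundary: never changes
        self.components: Dict[str, Component] = {}

    def register(self, comp: Component) -> None:
        self.components[comp.name] = comp

    def decide_and_update(self, available: Dict[str, Component]) -> List[str]:
        """Compare installed components against newer available ones,
        swapping in replacements; returns the names that were updated."""
        updated = []
        for name, newer in available.items():
            current = self.components.get(name)
            if current is None or newer.version > current.version:
                self.components[name] = newer   # replacement installed
                updated.append(name)
        return updated


# Usage: an archive maintainer whose storage driver is superseded,
# as in the tape-to-hard-drive history above.
archive = Reflector(role="archive-maintainer")
archive.register(Component("storage-driver", 1, lambda: "magnetic tape"))
available = {"storage-driver": Component("storage-driver", 2, lambda: "SSD")}
print(archive.decide_and_update(available))        # ['storage-driver']
print(archive.components["storage-driver"].run())  # SSD
```

The point of the sketch is proposition 4: the `Reflector` never performs archival work itself; the archive’s tasks are merely data it examines and replaces, while its `role` stays constant across every update.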

To be continued . . .

The Seventy-Fifth Year

…just ended yesterday.  I’m entering my 76th.

Here’s what I can report, more or less long-term, compared to the state of my heart, soul, mind, and strength as of, let’s say, ten years ago:

  • Time — or rather, the passage of time — really does speed up.
  • But nothing really settles down either, or comes to a steady state.  I have, if anything, changed more in the past ten years, since 65, than in any previous decade except perhaps my first, from 1940 to 1950.
  • I know I’ve changed, because the world looks different from the way it looked to me ten years ago.  Call this a shift in perspectives.
  • The language of elder generations is very different from the language of their juniors.  Though it’s composed of the same words, it does not carry the same meanings.  Not even close.  Maintaining decent communication with younger generations requires more energy, every year, than the year before.


I have a useless, and not quite honorable, interest in comparing my longevity with that of people I have admired, or emulated, or been in some way close to.

My father died in 1977.  He was only 67; I’m eight years older than that.

His father was 92 when he died in 1973.  It’ll take me seventeen years to reach that from here.

J.S. Bach only made it to 66.  I have nine years on him.

Socrates was executed at 70.

Ludwig Wittgenstein was 62 at his death in 1951.

The philosopher J.L. Austin was only 48 when he died in 1960.

Olivier Messiaen, the French composer, died in 1992 at the age of 84.

Charles Taylor, the Canadian philosopher, is 83.  God bless him.

Variations on a theme by Wittgenstein


From Philosophical Investigations, §2

“Imagine a language . . . to serve for communication between a builder A and an assistant B.  A is building with building-stones: there are blocks, pillars, slabs and beams.  B has to pass the stones, and that in the order in which A needs them.  For this purpose they use a language consisting of the words ‘block’, ‘pillar’, ‘slab’, ‘beam’.  A calls them out; — B brings the stone which he has learnt to bring at such-and-such a call. — Conceive this as a complete primitive language.”

Variation 1

After a day of hard labor with builder A, assistant B has eaten — wordlessly — the dinner his wife has — wordlessly — prepared.  He sits crosslegged before the fire, satisfied.  By degrees, a distant look comes into his eyes.

“Block”, he says softly, staring into the fire.  The wife glances quizzically in his direction.

He looks down at his own hands.  “Block”, he says again.  The woman, who knows the language from observing him at work with his colleagues but has never participated in it, looks around to see the block he must be referring to.

Once again, he speaks, slowly, softly, wonderingly: “Pillar”.  The woman grows distraught and finally frightened.

Variation 2

Assistant B has lunched upon strange mushrooms.  The afternoon’s work begins ordinarily enough; but at some point he responds to builder A’s call “beam”, not by tossing a beam, but with a suggestive pelvic thrust.  Builder A, attending to the work in front of him, does not see this, but calls out again “beam”.  Assistant B dances from side to side, erotically, giggling.

Assistant D, from the adjacent worksite, whose mate gathers from the same fields as assistant B’s, joins the dance.  Several other assistants do the same.

The afternoon grows chaotic, as the workers’ conduct deteriorates to a rhythmic bump-and-grind, shouting each of the four words of their language in turn, circling the stock of parts.  Some begin clapping at each bump.  Others join in on the off-beats.

The masons, bewildered, abandon their work but do not join in.  They go home.

Variation 3

Assistant B and his mate luxuriate — wordlessly — under their furs, after a capacious meal.  He holds up his hand, palm upwards.  “Slab”, he says.  She does not understand.  “Slab”, he says again, and claps his hand to his chest.  Tentatively, she presses a hand to her own breast; he grasps it, transfers her touch to himself, and presses his own hand to her breast.  “Slab,” he says, softly.

She smiles.  He chuckles.  She puts a hand to his crotch.  “Pillar”, she says.  He laughs uproariously.

The rest of the evening proceeds wordlessly.