“Peer-to-peer distributed computing and clusters are two
recurring hot topics in the Linux world. What’ll it be like,
though, when those technologies truly take root, and we each have
not two or ten external processors working for us, but a thousand,
or a million?”
“That’s the sort of question Harold Abelson, Gerald Sussman,
and eight more junior coauthors (listed alphabetically) address in
their 1999 MIT memorandum, “Amorphous Computing” (reprinted in the
Communications of the Association for Computing Machinery earlier
this year). Among their conclusions: we’ll need new programming
models to exploit processors that are individually unreliable and
communicate over unreliable channels. It’ll be worth it, though,
because the marginal cost of each additional processor will be
under a penny, and the right kind of design and engineering will
give us unprecedented computational power.”
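As a thought experiment (not drawn from the memo, and with every name and
number below invented purely for illustration), a short Python sketch can
convey the flavor of such a programming model: a “gradient” spreads through
hundreds of randomly scattered units, each talking only to nearby neighbors
over a channel that drops nearly a third of its messages, yet the collective
still learns each unit’s hop distance from a single seed.

    # Toy amorphous-style gradient: many simple, anonymous processors
    # scattered at random, each gossiping only with nearby neighbors over
    # a lossy channel, collectively computing hop distance from a seed.
    # All parameters here are illustrative, not from the memo.
    import random
    import math

    N = 500            # number of tiny processors
    RADIUS = 0.08      # local communication range in the unit square
    DROP_RATE = 0.3    # probability any single message is lost
    ROUNDS = 40        # synchronous rounds of local gossip

    random.seed(1)
    pos = [(random.random(), random.random()) for _ in range(N)]

    # Each unit can see only its neighbors within communication range.
    def near(a, b):
        return math.dist(pos[a], pos[b]) <= RADIUS

    neighbors = [[j for j in range(N) if j != i and near(i, j)]
                 for i in range(N)]

    # Every unit starts knowing nothing; unit 0 is the gradient source.
    INF = float("inf")
    hops = [INF] * N
    hops[0] = 0

    for _ in range(ROUNDS):
        inbox = [[] for _ in range(N)]
        # Each unit broadcasts its estimate; messages drop at random.
        for i in range(N):
            if hops[i] == INF:
                continue
            for j in neighbors[i]:
                if random.random() > DROP_RATE:
                    inbox[j].append(hops[i] + 1)
        # Each unit keeps the best estimate it has heard so far.
        for j in range(N):
            if inbox[j]:
                hops[j] = min(hops[j], min(inbox[j]))

    reached = sum(1 for h in hops if h != INF)
    print(f"{reached}/{N} units learned a hop count despite 30% message loss")

No single unit or message is dependable, but the repeated local gossip makes
the global result robust, which is exactly the kind of behavior the new
programming models are meant to deliver.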
“Amorphous computing is sometimes called swarm computing to
emphasize that a collective result emerges from individual
microlevel behaviors, with the surprising coordination of a
relocating bee or ant colony. This form of computing is also important for
controlling the devices created by nanotechnology. Amorphous
computing builds on research into distributed computing models like
Jini. It presumably will be fueled by nanotechnology research and
will ultimately provide the intelligence for nanotechnology
products. And it might well be built with calculating biological
molecules like those proposed by “Amorphous Computing” coauthor Tom
Knight.”