Chapter 6


THE SCIENTIFIC-TECHNICAL EXPLOSION

A school assembly that made a strong impression on me as a high school student in Denver almost sixty years ago involved a demonstration of how nuclear fission works. The demonstration now seems elementary, but it illustrates a point that will be important to us here. Dr. Orr Roberts of Boulder, Colorado, had placed a large cage in the middle of the stage. In it were hundreds of mousetraps, each set with a ping-pong ball balanced where the bait would ordinarily go. The demonstration consisted of dropping a single ball into the cage. The ball set off one trap, which sent its ball flying along with the first one. As one would expect, these set off more, and within literally a split second the chain reaction thus set in motion had filled the entire cage with flying ping-pong balls.
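For readers who want to see the arithmetic of a chain reaction, here is a minimal simulation of the mousetrap demonstration. The trap count and hit probability are purely illustrative choices, not figures from Roberts' actual setup:

```python
import random

def chain_reaction(num_traps=500, hit_chance=0.8, seed=1):
    """Toy model of the mousetrap demonstration: on each 'tick' every
    flying ball has a chance of springing one more trap, and every
    sprung trap adds its own ball to the air."""
    random.seed(seed)
    unsprung, flying, tick = num_traps, 1, 0
    while unsprung:
        sprung = sum(random.random() < hit_chance for _ in range(flying))
        sprung = min(sprung, unsprung)   # can't spring more than remain
        unsprung -= sprung
        flying += sprung                 # each sprung trap launches its ball
        tick += 1
        print(f"tick {tick:2d}: {flying:4d} balls flying, "
              f"{unsprung:4d} traps still set")
        if sprung == 0:                  # the chain fizzled out
            break

chain_reaction()
```

Run it and the cage empties within a handful of ticks; the number of flying balls grows geometrically, which is the whole point of the demonstration.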

Roberts’ demonstration aptly illustrates a similar chain reaction we are living through today. This is the exponential increase of science and technology, except that there is no reason to expect it will be over in a flash.  The beginnings occurred millennia ago, starting no doubt well before the harnessing of fire and the invention of the wheel.  The accretion of knowledge and technique was already immense by the end of the nineteenth century, but it was still possible to trace the history of most subjects by focusing on a series of outstanding inventors or discoverers, each adding something to what was already known (or, as sometimes happens, going down a blind alley).

The account given by L. T. Woodward in his The History of Surgery (1963) provides an example. Woodward was able to follow the contributions of individual surgeons, the superstars of their profession, until World War I. When that war brought together unspeakable carnage and modern medical technique, surgery took a giant leap forward. After that, with new work done by many surgeons across a broad spectrum, and increasingly within specialties and sub-specialties, Woodward was forced to tell the history mostly by referring to whole schools of surgery and major areas of development. Complex history swamped the individual superstars.

Imagine what a history written today has to contend with: microsurgery, computerized prosthetics, organ transplants, joint replacements, laser surgery, genetic manipulation, the burgeoning uses of stem cell research, and other innovations so numerous and so startling that most people will not even have heard of them. Imagine, too, what is in the offing for the near future, much less for the medium- and long-term futures. Even writing more than forty-five years ago, Woodward said that “surgery moves swiftly to transform one generation’s miracle into another generation’s commonplace event” and that “the word ‘impossible’ is never spoken aloud any more.”

Although this exponential increase in knowledge and technique is possible because of the foundation laid over many hundreds or even thousands of years, the most remarkable progress has come in the most recent five centuries. In the sixteenth century, Andreas Vesalius revolutionized medicine with his clandestinely conducted empirical studies of anatomy. Nitrous oxide, the “laughing gas” with which modern anesthesia began, was discovered in 1772; ether anesthesia followed in 1842. A surgeon performed the first appendectomy as recently as 1880. It was no longer ago than 1901 that Karl Landsteiner “discovered that there were different types of human blood, and blood of one type was incompatible with blood of another” – a discovery that made transfusions possible. These details do no more than hint at the fascinating content of Woodward’s account. But he says the pace picked up most after 1880 (i.e., even before the impetus given by World War I). “In retrospect, the story of surgery from about 1880 to the present date seems like a chronicle of an age of miracles.” In the twentieth century “the story becomes formidably complex. The incredible forward march of surgical ability in the past several generations is without parallel in history….”

Surgery is just one part of the general scientific and technological expansion.  The late economist Milton Friedman spoke of “a major industrial revolution comparable to the one that occurred two hundred years ago,” but as knowledge and technique multiply even this seems a considerable understatement.  In his book The Twilight of Sovereignty, Walter Wriston says that “scientific knowledge is currently doubling about every fifteen years.”  This is an apt description (for which we should allow him poetic license even if it is a fanciful quantification of something that isn’t genuinely quantifiable).  He says “at least 80 percent of all the scientists who have ever lived are now alive. In our country at least half of all scientific research done since the United States was founded has been conducted in the last decade.”

Projects in scientific and technical research range from the very large to the very small. Considerable investment is needed to develop something like a new memory chip for computers or to fashion a new automobile platform. One effect of a global market, with its expanded demand, is to make such expenditure feasible.

An implication from our look at the history of surgery is that science, not computers, is fundamentally the cause of the accelerating increase in knowledge.  It doesn’t diminish this truth to point out that a particular scientific subset, computers and other information technology such as satellites and fiber optics, is nevertheless pivotal to where the world economy is headed in almost all areas.  Knowledge-intensive industry has become the norm.  Not only are the new technologies knowledge-based, but they are also built into organizations that are structured around information.

Recent developments include such things as “just-in-time inventory management,” “statistical quality control,” expert systems, artificial intelligence (AI), computer-aided design (CAD), and computer-aided manufacturing (CAM). Almost all American products now carry the Universal Product Code (UPC) bar code. Electronic data interchange (EDI) is allowing suppliers to “network” with customers, making possible the “agile manufacturing” of customized products for individual customers. In textiles there are now “microcomputerized sewing systems.” And the machine-tool industry operates through computerized “numerical controls.”

Programmable automation draws on both computer science and manufacturing engineering. Machines can be redirected easily from task to task, giving more flexibility for customized batch production than “dedicated machinery” that performs a single task can offer. Because the process is automated, it operates with little human involvement. The result is “computer-integrated manufacturing” (CIM).
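A small sketch can make the contrast with dedicated machinery concrete. The machine and task names below are invented for illustration; the point is that “retooling” becomes a software change:

```python
def drill(part):
    return f"drilled {part}"

def polish(part):
    return f"polished {part}"

class ProgrammableMachine:
    """Unlike dedicated machinery, the task is data, not hardware."""
    def __init__(self, task):
        self.task = task

    def reprogram(self, new_task):
        # Switching to a new batch is a software change, not a rebuild.
        self.task = new_task

    def run(self, part):
        return self.task(part)

machine = ProgrammableMachine(drill)
print(machine.run("bracket A"))   # drilled bracket A
machine.reprogram(polish)         # redirect the same machine
print(machine.run("bracket A"))   # polished bracket A
```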

Worldwide information technology has brought the installation of millions of miles of optical fiber. Each fiber carries thousands of times more telephone conversations than the copper cables that preceded it. The content of thousands of pages of written work can be transmitted almost instantly by a single fiber so small it is difficult to see. The uses of fiber optics seem almost without limit: in general medicine, in remote, robotically aided surgery, even in such things as finding flaws in bridges.

Telemedicine is linking rural hospitals and major medical centers, allowing long-distance medical testing and examination.

High-speed Internet access is available across the world, using hundreds of satellites. A bandwidth explosion has occurred in the communications industry, where again fiber optics plays a role: in 1985 it took six fibers to carry a single televised football game, but by 1998 one line could transmit 700 such broadcasts. A “convergence” that merges cable and telephone traffic is lowering costs and allowing a hundred-fold increase in modem speeds. The change from the original analog TV to digital television was mandated in the United States by legislation in 1996, with the transition completed in 2009.

The 1958 invention of the integrated circuit, the “silicon chip,” started the reduction in computer size, but it wasn’t until Intel’s 1971 introduction of the microprocessor that society began to organize so thoroughly around the computer. Personal computers may seem to have been around forever, but they did not become commonplace until the early 1980s. The cost of chips has fallen rapidly, while computer power has multiplied many times over. Several startling scientific approaches, such as the use of quantum mechanics, point toward new foundations for computers.

Parallel processing combines anywhere from two to hundreds of computers to work together, offering to take the place of supercomputers and mainframes. It is expected that eventually many linked processors will be combined on a single chip.
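The divide-and-combine idea can be sketched with Python’s standard multiprocessing module, with worker processes standing in for the separate linked computers; the job (summing squares) and the chunk sizes are arbitrary:

```python
from multiprocessing import Pool

def partial_sum(bounds):
    """Sum the squares of the integers in [start, stop)."""
    start, stop = bounds
    return sum(n * n for n in range(start, stop))

if __name__ == "__main__":
    # Split one large job into four chunks, farm them out to four
    # worker processes, and combine the partial results at the end.
    chunks = [(i * 250_000, (i + 1) * 250_000) for i in range(4)]
    with Pool(processes=4) as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(total)   # the same answer one processor would produce, sooner
```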

Object technology was first developed in the 1980s. It breaks computer programs into building blocks called “objects,” which can then be combined to make larger programs. Programmers don’t need to recreate all elements of a program each time, since they can reuse objects from other programs. This makes programming much easier and faster.
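A toy example suggests the flavor. The class names are invented for illustration; the point is that an object written once (here, Address) is reused unchanged by later programs:

```python
class Address:
    """A reusable building block, written once."""
    def __init__(self, street, city):
        self.street, self.city = street, city

    def label(self):
        return f"{self.street}, {self.city}"

class Customer:
    """Built by combining existing objects, not rewritten from scratch."""
    def __init__(self, name, address):
        self.name, self.address = name, address

class Invoice:
    """A later program reusing the very same building blocks."""
    def __init__(self, customer, amount):
        self.customer, self.amount = customer, amount

    def mailing_label(self):
        return f"{self.customer.name}\n{self.customer.address.label()}"

order = Invoice(Customer("A. Reader", Address("12 Elm St.", "Denver")), 19.95)
print(order.mailing_label())
```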

We have just reviewed several developments.  Here are others:

The miniaturization of technology is well underway.  The public has experienced this with computers, some of which are so small that they can be hand-held.  In fact, pinhead-size “nanocomputers” are being developed, and these are expected to become embedded in virtually all commodities at low expense. The prefix “nano” means one-billionth, so that a nanometer is one-billionth of a meter.  The technology works at the atomic level, so the literature talks of using nanotechnology recipes to build tools and products starting from single atoms.

Photovoltaics have experienced global expansion.  Semiconductors called photovoltaic cells are embedded into building materials to turn sunlight into electricity.  As the price has fallen, such things as solar roof shingles and opaque glass facades have become affordable.  And solar panels have become more attractive, fitting better into building design.  

Nonlinear equations, which deal with complex and often unpredictable behavior, are used in areas such as aerospace engineering, where they allow the simulation of the aerodynamics not just of the wings but of the full airplane. In biotechnology such equations make possible a better understanding of DNA; in the automotive industry, they help design safer cars. The financial world uses “nonlinear optimization equations.” And manufacturing brings them to bear on both products and production processes.
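As a small illustration of the “nonlinear optimization equations” just mentioned, the sketch below uses the open-source SciPy library to minimize the Rosenbrock function, a standard nonlinear test case; the starting point and solver are arbitrary choices:

```python
import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):
    """A classic nonlinear function whose curved valley makes its
    minimum at (1, 1) hard for simple methods to find."""
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

# Start from a deliberately poor guess and let the optimizer search.
result = minimize(rosenbrock, x0=np.array([-1.2, 1.0]), method="Nelder-Mead")
print(result.x)   # converges near the true minimum at (1, 1)
```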

Materials science has resulted in a “materials revolution.”  The time-honored way to obtain resources was to dig them out of the ground, but they can now be brought into being (as we saw with nanotechnology) “one atom at a time,” fashioned into forms for particular needs. The result is radically altering industrial technology.  The late Clyde Sluhan of Master Chemical Corporation wrote me that “in manufacturing, many products are being made of plastics instead of metals.  Consequently, machining is being replaced.  Also, what metals are [still] being machined are being done on far faster machines and on automatic systems….”

Speech-recognition software may eventually make the computer keyboard obsolete. Many more people will use computers once typing skill is no longer needed.

With the virtual office, employees in more and more companies operate out of their homes or out of office space that is reserved as needed (a process called “hoteling”).  They network with others through laptops, faxes and cell phones. 

Electronic money, called “E-cash,” created by companies on their own, makes possible direct and instantaneous payments by computer without going through a bank.  

Electronic books allow the downloading of thousands of pages from the Internet. Books can now “stay in print” as long as a digital copy exists. “Print on demand” systems make a hardcopy available for those who want one, custom-printed within minutes. Printed books are not expected to vanish completely, but the world is being turned upside down for authors, publishers and readers. Newspapers and magazines are moving to Internet formats under pressure to survive.

Robotics

The International Organization for Standardization (ISO), a non-governmental organization with members in 157 countries, defines an industrial robot as “an automatically controlled, reprogrammable, multipurpose manipulator programmable in three or more axes.” This draws attention to the attributes of automation, flexibility and multitasking. New technologies have rapidly come into existence. Some robots use cameras to allow human guidance or to let the robot itself see products and reposition itself. A new field called mechatronics combines mechanical and electrical engineering with computer science.

Japan and South Korea have given special emphasis to robotics.  Japan plans to join in building a robotic lunar base, projected for 2020, in preparation for the exploration of Mars. As of March 2006, Japan was far ahead in the worldwide use of robotics, with 46 percent; but South Korea was working to be not far behind, with plans soon to have domestic-service robots in each home.   A major step forward came when Microsoft released “a new Windows-based development environment for creating robotic software,” which would overcome the lack of a “common development platform.”

An Internet search under topics such as “mechanical harvesting,” “construction automation,” “manufacturing robotics,” and the like reveals a rapidly proliferating use of robots in automobile manufacturing, agriculture, construction, meat processing, medicine, undersea exploration, the military – and even in such things as sheep shearing, snowplowing, disaster relief and road repair. Robots are used in welding and painting; in deploying cameras and mechanical claws to do repairs at offshore oil drilling sites; in harvesting a wide variety of farm products that include, among many others, cotton, wine grapes, almonds, cherries and oranges; in fine-tuning television sets before they are sent to buyers; in assisting banks with data storage; in finding buried land mines; in drilling cavities in femurs as part of hip-replacement surgery; in digging mine shafts; in being the first to reach a wounded soldier on the battlefield….

Biotechnology

Selective breeding of animals and use of organisms to make food and drink such as wine, cheese and bread have been done for centuries.  What we call “biotechnology,” however, came into being after recombinant DNA and monoclonal antibody technology were discovered in the 1970s.  Watson and Crick had discovered the DNA double-helix twenty years earlier.

The many biotech companies have created a major industry. Hesitation in Europe and Japan slowed development there, but the United States pressed forward. The immense number of patents makes biotechnology rank with computers as an explosive area of development. To illustrate how far it has come, we will look primarily at medicine, cloning and agriculture.

In medicine:

Work with monoclonal antibodies has led to improved diagnostic tools and to the selective targeting of cells to receive agents such as radionuclides, chemotherapeutic drugs and toxins.

Gene therapy involves the insertion of beneficial genes into the cells of patients. It holds out the prospect of curing several diseases by addressing their root causes. Thousands of genes either cause disease or predispose people to it. As these genes are identified, pre-natal and even pre-implantation screening and correction become possible. Pre-natal screening has been so successful with Tay-Sachs disease, common among Jews of East European descent, that the incidence among such Jews born in the United States has been greatly lowered. With conditions such as spina bifida, surgery can repair the defect when a fetus is about seven months old. A recent development is PGD (“pre-implantation genetic diagnosis”), in which the DNA of embryos created through in-vitro fertilization is analyzed before implantation. The most obvious goal is to eliminate genetic diseases, but the possibilities even include “designer babies,” whose parents choose in advance traits such as height, eye color and intelligence. This raises ethical, social and legal issues that will no doubt divide coming generations. Jeremy Rifkin’s book The Biotech Century goes into detail about the prospect of developing a “super-race” through genetic manipulation and electronic implants, and about the social issues that will raise.

New vaccines are directed toward such afflictions as herpes, multiple sclerosis, anemia, hepatitis, diabetes and Lou Gehrig’s disease.  Ways are being developed to deliver vaccines without needles.  One of these is a genetically-engineered potato that carries a low-cost oral vaccine against hepatitis B.  Several pharmaceutical companies have done years of work to develop a vaccine against Alzheimer’s, which has become a major scourge for an aging population.

Autism and schizophrenia may someday be helped by behavioral genetics, although progress is slow and has for several years been centered on modeling autism in mice. 

Work has long been underway to map the genetics behind the bacterial mutations that threaten to make antibiotics ineffective.

Digital hearing aids bring the computer age to the hard-of-hearing as tiny chips perform tens of thousands of calculations per second.     

Stem cell research, with or without the use of embryonic stem cells, has made great strides and promises by itself to be a major revolution in medicine. Replacement body parts are in the offing, and the injection of neural stem cells into the brain is expected eventually to repair the defects that underlie multiple sclerosis, Alzheimer’s and Parkinson’s. A master stem cell that can grow muscle, fat, bone, tendons and cartilage has been isolated in adult bone marrow. Some forms of blindness are now being cured by transplanting tissue-making cells into the eye.

The medical developments just mentioned will look primitive compared to what is coming, and societies will long be occupied with the very difficult moral and political decisions their implementation is bound to involve. All sorts of biological manipulation are possible: e.g., the growing of headless clones from which body parts can be transplanted, or the growth of human embryos possessing no head or central nervous system (so that they have no feeling), with only the body parts that are wanted for harvest. As with the long-standing abortion issue, the argument will be between those who stress the medical, humanitarian uses of such “organ farms” and those who find them a violation of other values, religious, cultural or moral.

The Human Genome Project began in 1988 to map the human genome. This has involved a multi-billion-dollar collaboration between private efforts such as Craig Venter’s laboratory and governmental agencies in the United States such as the National Institutes of Health and the Department of Energy. At the outset there were thought to be 80,000 to 100,000 human genes; the completed sequence put the number closer to 20,000 to 25,000. Gene mapping, gene sequencing and the differences in genes from one person to another are all subjects of research. The brain’s tens of billions of neurons can likewise be mapped as the science develops. And, of course, the research doesn’t stop with human beings, since much is to be gained by extending it to other species, to plants, and to microorganisms such as various forms of bacteria.

Throughout history, aging and mortality have been “part of the human condition,” not problems to be solved. Medical research within the past few years has taken slow but meaningful steps toward changing this. As long ago as 1991, it was discovered that every time a cell replicates, the tips of its chromosomes (the telomeres) become shorter, limiting the number of replications possible. In 1998, an enzyme, telomerase, was brought into play to lengthen the tips in cultured human cells. Research has continued since that time, and it is possible that at some point a therapy to stop or greatly slow aging will result. The American Federation for Aging Research (AFAR) has been one of the leading actors in the field.
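A toy calculation, with wholly illustrative numbers, shows why shortening tips impose a limit on cell division and why a tip-lengthening enzyme would remove it:

```python
def divisions_until_senescence(tip_length=10_000, loss_per_division=100,
                               critical_length=4_000, telomerase_gain=0):
    """Each division trims the chromosome tips; below a critical length
    the cell stops dividing. telomerase_gain models the enzyme's repair.
    All numbers are illustrative, not measured biological values."""
    divisions = 0
    while tip_length > critical_length:
        if telomerase_gain >= loss_per_division:
            return float("inf")   # tips never shorten: no division limit
        tip_length -= loss_per_division - telomerase_gain
        divisions += 1
    return divisions

print(divisions_until_senescence())                      # a finite limit (60)
print(divisions_until_senescence(telomerase_gain=100))   # limit removed (inf)
```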

In cloning:

The cloned sheep Dolly was born in 1996 (the announcement came in 1997), and the first cloned calves in 1998. They led on to the new area of “pharming,” which customizes animals to create medicines in their milk, blood or urine. The first such drug was made from goat’s milk. If the customizing is done with plants (called “biopharming”), the medicine can be received by eating the plant, or it can be processed from the plant. Although pharming is extremely promising for the inexpensive production of medicines and vaccines to satisfy worldwide demand, its development has been slowed by regulatory concerns about the possible contamination of other crops. As that concern is met, pharming offers to become a major industry.

In agriculture:

Productivity in farming has been rising rapidly through a number of technologies that include new types of seeds, growth hormones, animal genetics, improved fermentation, and the like.   After 125 years of planting hard red wheat, the American wheat industry may gradually shift to white wheat, marking the culmination of fifteen years of research at Kansas State University.  The white wheat has as much nutrition and fiber as red wheat, but is more desirable in color and taste.

DNA-coated pellets are fired into plant cells to accomplish what is by now very extensive genetic engineering. This makes possible cotton with implanted insecticides, tomatoes that stay fresh longer or that are made to resist freezing by the introduction of a gene from the winter flounder, strawberries with less sugar, and even coffee beans that grow without caffeine. Corn is engineered genetically to resist the very damaging European corn borer. Because strawberries and ornamental plants are attacked in Florida by the spider mite, biotechnologists at the University of Florida have genetically altered a cousin of the spider mite so that the cousin preys upon it.

A development that saves farmers money and is at the same time environmentally helpful is the use of “global positioning satellites” for precision farming. Here, satellite positioning divides farm fields into square-foot grids and then, using computers, informs the farmer which parts of the field need more fertilizer, pesticide, herbicide or water.
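A sketch of the idea with made-up soil readings: the computer holds one measurement per grid square and prescribes fertilizer square by square, rather than one blanket rate for the whole field:

```python
# Hypothetical soil-nitrogen readings, one per grid square of the field.
soil_nitrogen = [
    [42, 38, 55, 61],
    [35, 47, 52, 58],
    [29, 33, 44, 50],
]

TARGET = 50   # desired nitrogen level, in illustrative units

# Prescribe only what each square lacks; 0 means the square needs nothing.
prescription = [[max(0, TARGET - reading) for reading in row]
                for row in soil_nitrogen]

for row in prescription:
    print(row)
```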

Biotechnology seeks not just increased yields, but also improved food desirability and nutritional value. Work has been done on new lines of tomatoes that carry 10 to 25 times more of the beta carotene that provides vitamin A. Other work has continued for several years on developing a type of “transgenic super cassava,” funded by the U.S. government and the Bill and Melinda Gates Foundation. Cassava is “the primary source of nutrition for 800 million people worldwide.”

Farming and ranching have moved toward an industrialized agriculture. A sizeable environmentalist and animal-rights literature has come to use the name “factory farming” pejoratively and has opposed it vehemently. The criticism may result in extensive regulation and a slowing of the transition, but in one form or another it is likely that agriculture will continue to be radically reshaped by science and large-scale organization. The result could be, as the author Jeremy Rifkin has speculated, an eventual supplanting of outdoor farming. Rifkin, writing in the mid-1990s, cited the example of the 70,000 vanilla farmers on Madagascar who may be displaced by vanilla produced in “a bacterial bath”; and an Internet search today shows how “synthetic vanilla” has indeed taken over a large part of the vanilla market. Humanity has been moving away from the pre-Neolithic “hunter/gatherer” model for ten thousand years, and a further step away from it has occurred recently in fishing, where Norway, Chile and Japan have gone heavily into farm-raised salmon.

The new technology also promises a much smaller environmental footprint. It is leading to the use of smaller amounts of materials, fewer resources and less transportation. In countless ways, it promises to be more “environmentally friendly” than the old smokestack industries. The Economist says “enthusiasts for IT [information technology] cite another point in its favor: that it makes fewer claims on resources… Whereas cars, railways and steam engines were heavy users of raw materials and energy, IT is speeding up the shift towards a so-called ‘weightless’ economy.”

“Hybrid passenger cars” are no longer experimental oddities, but occupy a growing market share. Not only does the hybrid get much higher mileage than an entirely combustion-engine automobile, but it “cuts carbon dioxide emissions by half and other emissions about 90 percent.” 

Some of the technology is directed toward cleaning up past pollution.  Ned Hettinger talks in a law review article about “genetically-altered microbes that eat toxic wastes and degrade synthetic compounds.”  Utility companies have been studying a possible use of halophytes to clean up wastewater from coal-fired generating plants.  Business Week explains that halophytes are “a group of salt-tolerant plants ranging from cacti to sea grass [that] can absorb salt and heavy metals….”

Environmental problems will increasingly come not from the advanced economies but from the less developed countries, where populations are bulging at the same time that a great many primitive techniques remain in wide use.

This review of recent developments has, of course, just scratched the surface.  Even as impressive as the new developments are, readers in the future will no doubt find them archaic, perhaps amusingly rudimentary.
