For now, he continued to build a reputation as a skillful and reliable chemist. Manufacturers and food producers had begun to recognize the need for precise chemical analyses, but there were very few people in the country able to carry them out—British universities didn't begin to teach practical chemistry until many years later. So Faraday, in his splendidly equipped laboratory, found himself in great demand from commercial companies and government departments. For example, he measured the water content in consignments of sodium nitrate supplied to gunpowder manufacturers, analyzed the gases emitted by aging eggs, and tested methods of drying various kinds of meat and fish to be used as food for sailors—a service performed for the Admiralty. Such business brought in much-needed income to the Royal Institution.
So did acting as expert witnesses in court cases. Faraday's senior colleagues generally took on the legal work, but in 1820 he was hired by a group of insurance companies to help defend their case, only to find that both Davy and Brande had been hired by the other party, a sugar-refining company. The insurers had refused to pay out after a fire, claiming that the refiner had invalidated the policy by using oil during refining. Faraday gave compelling evidence on the flammable nature of the oil but lost the case; the court somehow decided that the insurers should pay because the refiner had no intention to defraud. Perhaps the case turned on a fine point of law but, whatever the legal niceties, pocketing three fees from one case was good business for the Royal Institution.
On one occasion, Faraday let his standards slip. Davy, ever combative, published a paper on phosphorus compounds in which he challenged the findings of the Swedish chemist Jöns Jacob Berzelius. Included in the paper were some results from Faraday's work; Berzelius checked Faraday's results, found errors, and let fly:
If M. Davy would be so kind as to take the pains of repeating these experiments himself he should be convinced of the fact that when it comes to exact analysis, one should never entrust them into the care of another person; and this is above all a necessary rule to observe when it comes to refuting the works of other chemists who have not shown themselves ignorant of the art of making exact experiments.12
This was utter humiliation, and a lesson. Never again did Faraday publish anything before doing his utmost to eliminate all possible sources of error.
Although chemistry is the word we use for the kind of work that Davy, Brande, and Faraday were doing, they didn't think of themselves as specialists but simply as men of science, or natural philosophers. They worked at chemistry because that is where the frontier of science was. Progress lay in discovering more about the composition of substances, and how they reacted when mixed or subjected to an electric current. Davy himself had broken fresh ground by isolating seven new elements. Scientists, as ever, were driven by a thirst for knowledge, and there was further motivation from industry. Manufacturers wanted to take advantage of the latest findings in chemistry—to make new products or to make old ones more economically—and were prepared to pay for research that could give them an edge: Faraday found himself visiting ironworks and being called in to try to improve the quality of steel used for surgical instruments. All in all, the way ahead seemed clear: more of the same was a thoroughly satisfactory strategy for anyone with intent to push back the frontier of science. But nobody could see what lay just over the horizon.
By the time of his twenty-ninth birthday in September 1820, Michael Faraday had established himself in the middle ranks of British scientists. He was a first-rate chemical analyst—set, it seemed, for an honorable, if unspectacular, career as a stalwart of the Royal Institution. Nothing he had done so far seemed to signal momentous feats to come. Yet everything he had done to date turned out to be the perfect preparation for some of the greatest scientific achievements of all time. All of his faculties of observation, exploration, imagination, and contemplation, together with his experimental skill, meticulous record keeping, and sheer determination, would be tested to the full and not found wanting.
His call to arms came on October 1, 1820. Sir Humphry Davy arrived at the Royal Institution with some astonishing news from Denmark. Hans Christian Oersted had put a magnetic compass near a current-carrying electric wire and had seen the needle move to a position at right angles to the wire. In the twenty years since Volta gave them electric currents, scientists had been scrabbling in the undergrowth for scraps of knowledge while a discovery of the first magnitude lay on the path at their feet. The spirit of exploration was strong, so why had nobody else thought of placing a compass near an electric circuit to see if anything would happen? Strange though it seems to us, none but a few scientists thought there could be any connection between the forces of electricity and magnetism, and these few were regarded by the others as airy-fairy metaphysicians. The majority held firmly to what is generally called the Newtonian model, though Newton himself probably would have disowned some of it: material bodies inflicted forces by acting on one another instantaneously at a distance along straight lines. This model made no attempt to explain how one of nature's forces, like electricity, could interact with another, like magnetism, but, as we'll see, it had such a hold on scientific opinion that it clung on for many decades—even in the face of mounting evidence to the contrary, first from Oersted and then from Faraday's work and, later, Maxwell's.
Shocked and fascinated by the news of Oersted's discovery, Davy and Faraday naturally began to experiment with currents and magnets. And before long, Faraday would be combing the Royal Institution library, and other libraries, to see what could be gleaned from the history of electricity and magnetism.
Since ancient times, electricity and magnetism had been obscured by a fog of superstition, mysticism, and quackery. The man who began to dispel the fog was William Gilbert. Born in 1544 in Colchester, he trained as a physician and became a very good one, rising to be president of the College of Physicians and personal doctor to Queen Elizabeth. But we still have more reason than his patients did to be grateful to him. He was the first to study electricity and magnetism experimentally, and his careful observation and scientific reasoning cleared the way for those who took up the work later.
Why did a suspended magnetic needle always align itself north–south? Why did amber attract pieces of paper and fluff after being rubbed with fur? Fascinated by such questions, Gilbert looked for enlightenment to works of scholarship, both ancient and contemporary, but found nothing that shed any light on the subject. In his book De Magnete, published in 1600, he reports:
Many modern authors have written about amber and jet attracting chaff and other facts unknown to the generality: with the results of their labors booksellers’ shops are crammed full. Our generation has produced many volumes about recondite, abstruse and occult causes and wonders…but never a proof from experiment, never a demonstration do you find in them. The writers…treat the subject esoterically; miracle-mongeringly, abstrusely, reconditely, mystically. Hence such philosophy bears no fruit; for it rests simply on a few Greek or unusual terms—just as our barbers toss off a few Latin words in the hearing of the ignorant rabble in token of their learning, and thus win reputation…few of the philosophers are investigators, or have any first-hand acquaintance with things.1
The “miracle-mongerers,” by implication, included the Church. It says much for English tolerance that Gilbert was able to publish his views without fear for his life or liberty, especially as he emphatically supported the view of Copernicus that the earth was not the center of the universe. Things were different closer to Rome, where others who advanced Copernican views were brutally dealt with: Giordano Bruno was burned at the stake and Galileo Galilei was kept under house arrest for life.
Spurning the scholars, Gilbert talked to people who used magnets: compass makers, navigators, and ship captains. No doubt he heard all the popular theories many times—that magnets were attracted by the North Star or by a huge arctic mountain that would pull out all the ship's iron nails if one got too close to it, and that garlic interfered with compass readings. But an idea came to him that was consistent with all he had heard about how compasses actually behaved: Earth could be a giant magnet. To test the idea, he made what he called a “terrella”—a model Earth formed from a naturally magnetic iron ore called lodestone—and moved a compass around it. Interpreting the findings as though he were a traveler on the surface of his surrogate Earth, he found that a compass needle behaved in every way as it did on the real Earth.
To investigate electrical forces, which were much weaker than magnetic ones, he needed a sensitive detector and so made the world's first electroscope. He called it a “versorium”—a light metal needle balanced on a pinhead, rather like a compass except that the needle was not magnetized. When brought toward an electrified object, the needle would move so as to point toward it. Using the versorium, he produced a great list of materials that became electrified when you rubbed them. But this device, wonderful as it was, didn't distinguish positive from negative electricity, so Gilbert failed to discover that some substances became positively electrified and others negatively so. Nor did he notice that two similarly charged objects repelled one another. He also failed to see the symmetry of magnetic attraction and repulsion—to him the mutual repulsion exerted by like poles was just a preliminary shuffle to get the unlike poles together.
One can hardly fault him for these failures. Gilbert had made giant strides in the understanding of electricity and magnetism, breaking away from mediaeval thinking and clearing a path to modern science. Among his near-contemporaries were Francis Bacon and Galileo Galilei, even more powerful advocates of what we now call the scientific method. It took Galileo, some twenty years later, to show the full relationship between observation, hypotheses, mathematical deduction, and confirmatory experimentation, but in electricity and magnetism it was Gilbert who showed the way. By reporting exactly what he had done in his experiments, he made it possible for others to repeat them, verify the results, and, perhaps, extend them. Others were led to study the subject, and in the 1620s, Niccolo Cabeo, an Italian teacher of theology and mathematics, found what had eluded Gilbert. He noticed that iron filings seemed to jump away from a piece of electrified amber as soon as they touched it. Like magnetism, electricity could push as well as pull.
Scientific knowledge was advancing, but much of it was still contained in statements of the “if you do so-and-so, then such-and-such will happen” variety. The notion that everything that happened in the physical world might be governed by universal laws in mathematical form seemed as fanciful as the magic mountain. But everything changed in 1687 when Isaac Newton published his Principia Mathematica. He showed that three simple laws were sufficient to describe how any material object moved under the action of forces, and that any two objects attracted one another with a force that was proportional to the product of their masses and inversely proportional to the square of the distance between them. Everything from an apple's fall to a planet's orbit could now be described by precise equations. It is hard to find words adequate to describe Newton's achievement. Perhaps Alexander Pope did it best when he wrote an epitaph for Newton in 1727: “Nature and Nature's Laws lay hid in night: God said, Let Newton be! And all was light.”
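In modern notation, which is not Newton's own, his law of universal gravitation is usually written as

\[
F = G\,\frac{m_1 m_2}{r^2},
\]

where F is the attractive force between two bodies of masses m_1 and m_2, r is the distance between them, and G is the gravitational constant.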
Science had entered a new era and was set on the path that it still follows today. The aim was to bring everything within universal laws—the fewer and the simpler, the better—and to do this by employing both experiment and mathematics. Newton himself did some spectacularly successful experiments on light, showing that what we perceive as white light was actually a mixture of all the colors in the visible spectrum. He never turned his hand to electricity and magnetism, but, as we'll see, others came to use his law of gravitation as a model for both.
Meanwhile, scientists were slowly improving their acquaintance with the ways of electricity. In the 1730s, the French army officer turned chemist Charles du Fay discovered that glass, when rubbed with silk, acquired a different kind of electricity from that acquired by amber when rubbed with fur. Moreover, while electrified glass attracted electrified amber, two pieces of electrified glass repelled one another, as did two pieces of electrified amber. Du Fay thought that the amber and the glass might each have become imbued with a distinct type of electrical fluid and introduced what became known as the “two-fluid” theory. Across the Atlantic, an American was working along different lines. Benjamin Franklin was someone who, it seemed, could do anything. Already a brilliantly successful printer, publisher, and journalist, he went on to become a distinguished, if rather raffish, politician and statesman. He was also a great scientist. In 1747 he put forward the idea of electrical charge, which could be positive, as with glass, or negative, as with amber. Franklin's charge came in the form of a single hypothetical electrical fluid. By his “one-fluid” theory, a body with the normal amount of fluid would have no charge, but one with a surplus of fluid was positively charged and one with a dearth of fluid was negatively charged. To interpret du Fay's experimental findings, Franklin assumed: (1) that rubbing transferred electrical fluid from the silk to the glass but took it from the amber to give to the fur; and (2) that, by analogy with magnetic poles, unlike charges attracted one another while like ones repelled each other. By the same token, this explained why Cabeo's iron filings jumped away from the electrified amber—when they touched it, they acquired the same (negative) charge and were repelled.
According to a popular story, Franklin flew a kite into a thundercloud to prove that lightning was electrical and so establish a sound theoretical base for installing lightning rods on buildings. The experiment was successful and lightning rods became widely used. Franklin, if indeed it was he, had saved many lives by risking his own—several others were killed trying similar experiments. For the kite experiment something was needed to collect the electricity from the lightning and store it for later examination. Such a device had been invented a few years earlier by Pieter van Musschenbroek, a professor at Leyden in Holland. He tried storing electricity in a water-filled jar and was more successful than he had thought possible. While he turned the handle of a machine that generated electricity by mechanical rubbing, his student and assistant Andreas Cunaeus picked up the jar to try to draw a spark from it to a gun barrel held in his other hand. They had seen sparks before, but not like this one. There was a great flash, and the shock that passed through Cunaeus's body almost killed him. Their device, improved by coating the jar, inside and outside, with foil and dispensing with the water, became the Leyden jar, the first capacitor. It quickly became the standard equipment for storing electricity, and many people, including young Faraday, later made their own versions of it to use in home experiments.
Gilbert and others had long ago observed that electric and magnetic forces became weaker as one moved away from an electrified body or a magnetic pole. As Newton's inverse-square law worked so well for gravity, it seemed likely that similar laws applied to electricity and magnetism. John Michell demonstrated in 1750 that this was so for magnetism, and Joseph Priestley did the same for electricity in 1766. But it was the French physicist Charles Augustin Coulomb who carried out a definitive set of experiments in 1785 and gave his name to the law. For the purpose, Coulomb had independently reinvented a wonderfully precise instrument—the torsion balance.2 John Michell had begun to construct one thirty years earlier but had died before putting it to use.
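In present-day symbols, again not Coulomb's own, the law that bears his name takes the same inverse-square form as Newton's:

\[
F = k\,\frac{q_1 q_2}{r^2},
\]

where q_1 and q_2 are the two charges, r is the distance between them, and k is a constant of proportionality; the force attracts when the charges are unlike and repels when they are alike.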
Coulomb didn't hold with Franklin's one-fluid hypothesis. Since du Fay's time, French scientists had come to believe in the two-fluid theory of electricity—that there was one fluid for the amberlike substances and another for those like glass—and, when Coulomb endorsed it, this model became ingrained in French scientists’ thinking. On the other side of the Channel, Franklin was very popular and British scientists became ardent supporters of his single-fluid version. The debate went on for many years, and both sides had passionate adherents. It all seems rather silly and irrelevant now, like an argument about whether flying pigs have one pair of wings or two, but in the late 1700s, so-called imponderable fluids—hypothetical substances undetectable by the senses—were serious components of scientific thinking; for example, Antoine Laurent Lavoisier, the father of chemistry, believed that heat was a fluid called “caloric.”
Magnetism similarly had its fluids, two for both the British and the French. But, whether with one fluid or two, the theories of electricity and magnetism now followed the pattern of Newton's law of gravitation, with the difference that electricity and magnetism both attracted and repelled while gravity only attracted. The equations looked just like Newton's and gave exact values for the forces. It all looked so right. But, buried deep, there was a flaw. Newton had seen it. He had written to his friend Richard Bentley:
That gravity should be innate, inherent, and essential to matter, so that one body can act on another at a distance, through a vacuum, without the mediation of anything else, by and through which their action and force may be conveyed from one to another, is to me so great an absurdity that I believe no man who has in philosophical matters a competent faculty of thinking, can ever fall into it.3
Newton knew that his equations were not the last word on the matter. No force could act instantaneously across a distance. Something had to exist in the intervening space to transmit the force, even though he carefully avoided making any hypothesis about what it was. But in France, mathematically inclined physicists pushed thoughts about a transmission medium to the back of the mind while confidently building on Newton's foundation. They did not feel the need to agonize about the ultimate meaning of gravity or other forces and how they were transmitted; it was enough to present the mathematical equations describing the universe, and, voilà, it all became comprehensible. They believed, as Newton did, that the whole physical world behaved as though composed of point masses that obeyed precise laws (as did the planets) and so reduced the reality of ponderable matter in the universe down to a series of equations. Their mathematics was elegant, elaborate, and all-encompassing, and the results were beautiful.