6.8. Electrification and communications


Electricity is the purest form of energy, since it can be converted to any other form, from heat to light to mechanical motion, with practically no loss. However, apart from the occasional lightning strike, there is no natural source of electric power as such. Electric power is derived today mostly from kinetic energy, originally from reciprocating engines based on Newcomen's pumping scheme and, since 1900, in the form of a turbine wheel. Early contributions to the science that enabled the development of an electrical industry included discoveries by André-Marie Ampère, Charles-Augustin Coulomb, Humphry Davy, Michael Faraday, Benjamin Franklin, Luigi Galvani, Joseph Henry, Hans Christian Oersted, Georg Simon Ohm, Alessandro Volta, and many others.

The first two practical applications of electricity were the electric telegraph and the lighthouse (using arc-lights). The other major component of an electric power system is the generator, or dynamo. The key early invention was the efficient DC dynamo (and its alter ego, the DC motor). The dynamo and motor evolved in parallel over a period of many years, following Michael Faraday's initial discovery of electromagnetic induction in 1831. The major milestones were European, with significant contributions by Woolrich (1842), Holmes (1857), Wheatstone (1845, 1857), Siemens (1856), Pacinotti (1860), Siemens (1866), Gramme (1870) and von Hefner-Alteneck (1872) {Sharlin, 1961 #4592}. The most widely used commercial dynamo before Edison's (by Zénobe Gramme) achieved an efficiency of barely 40% in converting mechanical energy to electrical energy. Gramme's machine was the first capable of producing a relatively continuous current.

The prevailing engineering practice was to produce large DC currents for arc-lights at low voltages. Edison reversed this practice. Edison's breakthrough, the “Jumbo” generator (1879), came late in the game because he started from a “systems” perspective: he intended to make and sell electric power systems supplying whole regions from central generating plants. Edison's generator took advantage of the fact that higher voltages greatly reduced resistance losses in the wiring, thereby effectively doubling the efficiency of the generator. Edison was also the first to build an integrated system of electric power, as applied originally to incandescent lighting in buildings. The efficiency progression is shown in Figure 29.

Figure 29: Generator efficiency: electric power output per unit of mechanical power input



Once electric power became available in the 1850s and 1860s, new applications – and new industries – quickly emerged. The first application of DC power was electric lighting. Arc lights were already known, and a number of practical arc-lighting systems for public purposes were developed in the late 1870s, pre-eminently those of Charles Brush (1876) and of Elihu Thomson and Edwin Houston (1878). These systems were suitable for outdoor use and large public rooms, though not for more intimate indoor quarters. But soon they were being produced in significant numbers. By 1881 an estimated 6,000 arc lights were in service in the U.S., each supplied by its own dynamo. See Figure 30.



Thomas Edison's decision in 1877 to develop a practical incandescent lamp suitable for household and office use (in competition with the gas-light) was historically momentous, albeit not especially interesting, or original, technologically. In fact, Edison was by no means the first to work on the idea of incandescent light. At least 22 inventors, starting with Humphry Davy (1802), demonstrated incandescent electric light. The first to be patented, in England in 1841, was a design based on a coiled platinum filament in an evacuated glass bulb. The first patent on a carbon filament in a glass bulb was granted to an American in 1845. Neither was actually produced commercially.

Nor was Edison alone in the field when he began his campaign; he had several active competitors, some with important patents on features of the incandescent light. Joseph Swan, in England, was actually the first to light his house by electricity, and the first to go into commercial production. (Swan's lightbulbs lit the Savoy Theatre in 1881.)

However Edison was the first to install a complete electrical lighting system in a commercial building, in the fall of 1880. It, too, went into production in 1881, both by Edison's company and several competitors, including Swan in England. Both incandescent and arc-lighting systems met rapidly growing demand. By 1885 the number of arc lights in service in the US was 96,000 and the number of incandescent lights had already reached 250,000. Five years later those numbers had risen to 235,000 and 3 million, respectively.15 By 1891 there were 1300 electric light generating stations in the USA.

Figure 30: Rapid growth of electrification of factories and households, USA



The main difference between Edison and his competitors was that Edison was business-oriented, and had the patience and the financial backing (from J. P. Morgan) to make it work. (It was during the development phase that he remarked to someone that “genius is one percent inspiration and ninety-nine percent perspiration.”) The first problem he faced was to develop a filament and a way of mass producing it. The constraints were that the bulb would last a reasonable number of hours before burning up, and that it could be manufactured in quantity. His carbon filament, in an evacuated bulb (an idea borrowed from several predecessors) solved that problem temporarily, though the carbon filament was eventually replaced by tungsten filaments. The key was to get as much as possible of the air out of the light-bulb. Sprengel’s improved vacuum pump was an important prerequisite of Edison’s success.

It is of some interest to mention why electric light was so revolutionary. When the sun is shining, the light level outdoors on a clear day is about ten thousand lux (lumens per square meter). The light level in a contemporary home is only around a hundred lux. (The human eye adjusts for the difference automatically by opening the pupil, so the difference doesn't seem so great.) But a wax candle, for centuries the principal source of indoor light before gas lighting and then Edison and Swan, produces only about 13 lumens. A 100-watt incandescent lamp produces about 1200 lumens, or nearly 100 times as much. And recent developments have gone much further.
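A rough calculation connects these numbers (the room area used here is an arbitrary illustrative assumption). Illuminance is luminous flux divided by the area it falls on:

\[
E = \frac{\Phi}{A}, \qquad
E_{\text{candle}} \approx \frac{13\ \text{lm}}{10\ \text{m}^2} \approx 1.3\ \text{lux}, \qquad
E_{\text{100 W lamp}} \approx \frac{1200\ \text{lm}}{10\ \text{m}^2} = 120\ \text{lux}.
\]

Even ignoring losses, a single candle cannot bring a small room anywhere near the hundred lux of a modern interior, while a single incandescent bulb can.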

William Nordhaus has calculated the price of illumination in terms of labor-hours {Nordhaus, 1998 #7666}. In 1850, when people still used candles, the price of light was 3 labor hours per 1000 lumen hours, the equivalent of about 10 candles.16 By 1890 the price of lighting was down to 0.133 labor hours per 1000 lumen hours (a factor of 21), and by 1992 (the last year of Nordhaus' calculations) it was down to 0.00012 labor hours per 1000 lumen hours, a factor of about 24,000. Today we take light for granted and barely give a thought to its price. But in terms of human well-being, the difference is as between night and day.

Non-lighting applications of DC power soon followed. The most notable in terms of immediate impact was the electric street-railway (or trolley). Street railways sprang up more or less independently in a number of cities. The contributions of Van Depoele (especially the carbon brush, 1888) and Sprague deserve particular mention. The building of urban transit systems not only employed a good many people, but also permanently influenced urban geography, permitting much higher density development than had previously been possible, as well as creating many new “suburbs” along the trolley lines. A remarkable number of street railways (trolleys) were built in the following decade. By 1910 it was said to be possible to travel by trolley – albeit with many interchanges – from southern Maine to Milwaukee, Wisconsin.

To supply electricity in large amounts economically it was (and is) necessary to take advantage of economies of scale in production, whether from available hydro-power sources or from large steam-powered generating plants. In the eastern USA the obvious initial location for a large generating plant was Niagara Falls. It was some distance from the major centers of industry (although some energy-intensive industry soon moved to the neighborhood). In Europe, the first hydroelectric generating plants were naturally located on tributaries of the Rhone, Rhine and Danube rivers, mostly in remote Alpine Valleys also far from industrial towns.

To serve a dispersed population of users from a central source, an efficient means of long-distance transmission was needed. But efficient (i.e. low-loss) transmission inherently requires much higher voltages than lighting circuits use. DC power cannot easily be converted from low to high or high to low voltages. By contrast, AC power can be transformed between low and high voltages easily – that is what electrical transformers do – as the simple calculation below illustrates.
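A back-of-the-envelope calculation (the numbers are illustrative, not historical data) shows why transmission voltage matters so much. For a line of resistance \(R\) delivering power \(P\) at voltage \(V\), the current is \(I = P/V\) and the resistive loss is

\[
P_{\text{loss}} = I^{2}R = \frac{P^{2}R}{V^{2}},
\]

so raising the transmission voltage by a factor of ten cuts the line loss by a factor of one hundred for the same delivered power. An (ideal) transformer accomplishes the voltage change with nothing more than two windings on a shared iron core, the voltages being related by the turns ratio \(V_{s}/V_{p} = N_{s}/N_{p}\).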

This simple technical fact dictated the eventual outcome. All that was needed was the requisite technology for generating and transmitting AC power at high voltages. Elihu Thomson (inventor of the volt-meter) had already developed a practical AC generator (1881), as the arc-light business of Thomson-Houston went into high gear. Gaulard and Gibbs in Europe, and Stanley in the U.S., developed prototype transmission and distribution systems for AC incandescent light by 1885. Nikola Tesla introduced the poly-phase system and the AC induction (“squirrel cage”) motor by 1888. Tesla was one of the great inventive geniuses of all time.

George Westinghouse was the entrepreneur who saw the potential of AC power, acquired licenses to all the patents of Gaulard, Gibbs, Stanley and (most of all) Tesla, and “put it all together”. His original agreement with Tesla called for a royalty payment of $2.50 for every horsepower of AC electrical equipment based on his patents. Had those royalties been paid, Tesla would soon have been a billionaire, but the royalty was wildly unrealistic: it would have made the equipment so costly that it could never have competed with Edison's DC equipment. (Tesla generously tore up his contract, but failed to negotiate a more realistic one.)

Sadly, Thomas Edison is regarded as a great genius whereas Tesla, probably the greater inventor, is now regarded as a failure and mostly forgotten. Westinghouse's success was assured by a successful bid for the Niagara Falls generating plant (1891) and, subsequently, the great Chicago Exhibition (1893). It was Nikola Tesla, incidentally, who persuaded Westinghouse to establish the 60-cycle standard for AC, which ultimately became universal in the USA.

Yet Edison, the “father” of DC power, strongly resisted the development and use of AC systems (as Watt, before him, had resisted the use of high-pressure steam). During the 1880s both the Edison companies and the Thomson-Houston companies had been growing rapidly and swallowing up competing (or complementary) smaller firms, such as Brush Electric, Van Depoele and Sprague Electric. In 1892 the two electric lighting and traction conglomerates merged – thanks to J. P. Morgan's influence – to form the General Electric Co. (GE), with combined sales of $21 million, as compared to only $5 million for competitor Westinghouse. Yet, in the end, AC displaced DC for most purposes, as it was bound to do.17

The conversion from reciprocating to turbine steam engines was remarkably fast. In fact, the last and biggest reciprocating steam engine, which produced 10,000 hp, was completed in 1899 to power the growing New York City subway (metro) system. Its pistons needed a tower 40 feet high. But by 1902, three years later, it was obsolete and had to be junked and replaced by a steam turbine only one tenth of its size {Forbes, 1963 #1859 p. 453}.

Turbines powered by steam or by high-temperature combustion products (gas turbines) deserve special mention because they have to operate at very high speeds and very high temperatures, constituting a major technological challenge. The first effective steam turbines for electric power generation were built in 1884 by Sir Charles Parsons. Parsons' turbines were later adopted by the British Navy, which enabled the British fleet to maintain its global dominance until after WW I. Steam turbine wheels utilize high-quality carbon steel with some alloying elements, such as molybdenum, for machinability, and they need to operate at steam temperatures of up to about 600 degrees Celsius.

The first effective gas turbines (for aircraft) were built in 1937 by Frank Whittle, and were adapted to electric power generation independently by Siemens in Germany and GE in the US during the 1950s and 1960s. Gas turbine wheels must withstand much higher temperatures, as high as 1500 degrees Celsius. To do so they must be made from so-called “superalloys”, usually based on nickel, cobalt, molybdenum, chromium and various minor components. These metals are very hard to machine or mold accurately, as well as being costly to begin with, which is why gas turbines are used in high-performance aircraft and power plants but have never been successfully used for motorcars, buses or trucks.

The steam for a turbo-generator is generated by heat, either from nuclear (fission) reactions or by the combustion of fossil fuels, usually coal. Steam turbines supply 90% of the electric power in the US today. Most of the rest is from hydraulic turbines. Steam turbine technology today is extremely sophisticated because steam moves faster and is much “slipperier” than liquid water. Combustion products from a gas turbine move even faster. Therefore steam or gas turbine wheels need to turn much faster, requiring very good bearings and very precise machining. This need had a great influence on metallurgy, machining technology and ball bearing design and production in the late 19th century.

Two new and important applications of electric power also appeared during the 1890s, viz. electro-metallurgy and electro-chemistry. The high-temperature electric arc furnace, invented by William Siemens (1878), was first used commercially to manufacture aluminum alloys (Cowles 1886). In 1892 Moissan proposed the use of the high-energy gas acetylene (C2H2) as an illuminating gas. In the same year Willson demonstrated a method of producing acetylene from calcium carbide, made from limestone and coke in an electric furnace. The carbide is a solid that can be transported and stored safely, and acetylene can be generated from it simply by adding water, as sketched below.
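The underlying chemistry (a simplified sketch; the furnace temperature given is approximate and indicative only) is that lime made from limestone is reduced by coke at very high temperature, and the resulting carbide releases acetylene on contact with water:

\[
\mathrm{CaO + 3\,C \;\rightarrow\; CaC_2 + CO} \quad (\text{electric furnace, roughly } 2000\,^{\circ}\mathrm{C})
\]
\[
\mathrm{CaC_2 + 2\,H_2O \;\rightarrow\; C_2H_2 + Ca(OH)_2}
\]

The second reaction needs no external energy input at all, which is what made carbide such a convenient portable store of illuminating gas.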

This made the use of acetylene practical for many purposes, from welding to illumination, and quickly led to the creation of a large industry, of which Union Carbide was the eventual survivor. Acetylene was rapidly adopted as an illuminant for farms and homes in towns without electricity. By 1899 there were a million acetylene gas jets, fed by 8000 acetylene plants, in Germany alone {Burke, 1978 #1052 p. 209}. The acetylene boom collapsed almost as rapidly as it grew, due to the impact of the Welsbach mantle on the one hand and cheaper electricity on the other. Acetylene continued as a very important industrial chemical feedstock, however, until its displacement by ethylene (made from natural gas) in the 1930's.

Another early application of electric furnaces was the discovery and production of synthetic silicon carbide ("carborundum") (Acheson, 1891). This was the hardest material known at the time, apart from diamonds. It found immediate use as an abrasive used by the rapidly growing metal-working industry, especially in so-called production grinding machines introduced by Charles Norton (1900). High-speed grinding machines turned out to be essential to the mass production of automobile engines, especially the complex shape of the crankshaft.

The research of Henri Moissan in the late 1890's also led to the use of the electric furnace for melting metals with high melting points, such as chromium, molybdenum, nickel, platinum, tungsten and vanadium. Heroult further developed the electric furnace for industrial purposes and his work was instrumental in permitting its use for the production of ferroalloys and special steels, beginning in Canada after 1903. The importance of abrasives (for grinding) and special tool steels and "hard" materials, such as tungsten carbide, is illustrated by Figure 31 in terms of metalworking rates.

Figure 31: Measures of precision in metalworking



Electrochemistry – the application of electrolysis – also became industrially practical in the 1880s. The first, and most important, industrial electrolytic process was discovered in 1887 independently by Paul Heroult in France and Charles M. Hall in the U.S. It was a practical way of producing aluminum from aluminum oxide (alumina). To avoid the classic difficulty that defeated earlier efforts (electrolysis of aluminum salts dissolved in water produces aluminum hydroxide, not aluminum metal), both inventors came up with the same solution: they dissolved aluminum oxide in molten cryolite, a mineral (sodium-aluminum fluoride) found originally in Greenland and later made synthetically. This process was commercially exploited in both countries within two years. The price of metallic aluminum dropped rapidly, from $2/lb in the late 1880s to $0.30/lb by 1897. Not surprisingly, many new applications of aluminum emerged, starting with pots and pans for the kitchen, which had previously been made from cast iron and did not conduct heat very well.
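In outline (a simplified sketch of the cell chemistry, with modern typical figures rather than historical ones), alumina dissolved in molten cryolite at roughly 950–980 °C is electrolyzed between consumable carbon anodes and a carbon-lined cathode, the overall reaction being

\[
\mathrm{2\,Al_2O_3 + 3\,C \;\rightarrow\; 4\,Al + 3\,CO_2}.
\]

The cryolite merely acts as the solvent and is not consumed, which is why cheap electricity, rather than ore, is the limiting resource for aluminum smelting.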

Curiously, the availability of metallic aluminum had no real impact on the infant aircraft industry in its first three decades, during which airframes were made from wood and wire and surfaces were often made of fabric. It was 1926 when the first all-metal American airliner (the Ford Tri-motor) was built. Needless to say, the commercial airline industry in its present form could not exist without aluminum. The next-generation Boeing 787 “Dreamliner” will be the first to utilize significant quantities of carbon-fiber composite materials in place of aluminum.

The second important electrolytic process (1888) was Castner's system for manufacturing metallic sodium or sodium hydroxide (and chlorine) from fused sodium chloride, better known as common salt. At first it was sodium hydroxide that was the important product: it was used in soap-making, for “whitening” illuminating oil for kerosene lamps, and later for converting bauxite to alumina. Chlorine was initially a cheap by-product, used – then as now – as a disinfectant for swimming pools and municipal water treatment. However, chlorine was soon in demand for a variety of chemical purposes, as well as for bleaching paper.
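The underlying electrochemistry (a simplified sketch; the cell designs of the period differed considerably in detail) depends on whether the salt is electrolyzed molten or in solution:

\[
\mathrm{2\,NaCl_{(molten)} \;\xrightarrow{\;electrolysis\;}\; 2\,Na + Cl_2},
\]
\[
\mathrm{2\,NaCl + 2\,H_2O \;\xrightarrow{\;electrolysis\;}\; 2\,NaOH + Cl_2 + H_2}.
\]

The first route yields sodium metal, the second (brine electrolysis) yields caustic soda and hydrogen; both release chlorine, which is why chlorine began as an unavoidable by-product.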

But the German discovery of a host of uses for chlorine in organic chemistry was the game-changer. Today, chlorine is one of the most important basic industrial materials, with a host of important uses from plastics (e.g. polyvinyl chloride) to insecticides (beginning with DDT). The cumulative economic impact of electro-metallurgy and electrochemistry has clearly been enormous, although most of it was not felt until many years after the key manufacturing innovations.

Another by-product of electrification was the telephone. The chief precursor of the telephone was the telegraph, commercially developed and patented in 1837 by Wheatstone (England) and Morse (USA), together with its various later improvements. The key invention, by some accounts the most valuable of all time, was to some extent accidental. Alexander Graham Bell's backers were merely seeking to increase the capacity of the telegraph system, which was having difficulty expanding its network fast enough to accommodate growing demand. The original invention was the practical implementation of the notion that speech could be transmitted and reproduced as an “undulatory electric current of varying frequency”.

To be sure, the telegraph was already a well-established technology by the 1860s, having grown up with the railroads. But the telephone was disconnected from the railroads from the beginning, and it needed its own power supply. The entire telephone system, as well as radio, TV and now the Internet, are all essentially developments for the transmission of electromagnetic signals as “undulatory currents of varying frequency”. (Wireless transmission came later.) Bell's first telephone in 1876 actually preceded Edison's breakthrough in DC power, and utilized battery power. Nevertheless, the telephone system in its present form is entirely dependent on the availability of inexpensive, reliable electric power in large amounts.

In any case, Bell's invention was soon followed by the creation of a business enterprise (American Bell Telephone Co.) which grew with incredible rapidity. Manufacturing began under license within a year and 5600 phones had been produced by the end of 1877. In 1878 the first commercial switchboard was placed in operation in New Haven, Connecticut, with 21 subscribers. Telephone companies sprang up in almost every town and city, not only in the U.S. but also Western Europe. The annual rate of U.S. telephone production rose to 67,000 in the year 1880 (of which 16,000 were exported), and the number of units in the hands of licensed Bell agents in the U.S. alone reached 132,692 as of Feb. 20 1881 {Smith, 1985 #4716 p. 161}. The number in service nearly doubled two years later.18

Bell's original patent was the subject of a long drawn-out lawsuit, due to the near-simultaneous patent submission of another inventor, Elisha Gray, a co-founder of Western Electric Co.19 The dispute turned on whether or not Bell had stolen Gray's idea of a liquid transmitter, and it was not decided until 1888. While the most essential, Bell's patent was only the first step in a massive technological enterprise. Telephony has spawned literally tens of thousands of inventions, some of them very important in their own right. One of the first and most important was the carbon microphone, invented by Edison and Berliner (1877). Gray himself was a lifelong inventor with 70 patents, including a method of transmitting musical notes; he is credited with inventing the music synthesizer.

Many of the key inventions in sound reproduction, electronic circuitry and radio were by-products of an intensive exploration of ways to reduce costs and improve the effectiveness of telephone service. There were 47,900 phones actually in service at the end of 1880, 155,800 by the end of 1885, 227,900 by the end of 1890, and 339,500 by the end of 1895. The original Bell patent expired in 1894, releasing a burst of activity. U.S. demand skyrocketed: a million phones were added to the system in the next 5 years, 2.8 million in the period 1905-10, and 3.5 million in the period 1910-15. The U.S. industry grew much faster than its European counterpart after 1895 (having lagged somewhat behind Sweden and Switzerland previously).

Employment generated by the telephone system in all its ramifications has not been estimated, but was probably not less than one job per hundred telephones in service. By 1900 the manufacturer, Western Electric Co. (later acquired by AT&T), alone employed 8,500 people and had sales of $16 million; by 1912 it was the third largest manufacturing firm in the world, with annual sales of $66 million. It need scarcely be pointed out that the telephone system could not have grown nearly so fast or so large without the concomitant availability of electric power.

By 1920 all US cities and towns also had electric power generating and distribution systems, as well as telephone exchanges. Growth was facilitated thanks to the introduction of regulated public utilities with local monopoly over distribution. This innovation facilitated large-scale central power generating plants capable of exploiting economies of scale. In the 1920s and 1930s the electrical industry expanded to embrace a variety of electrical appliances and "white goods" such as refrigerators, vacuum cleaners and washing machines.

From 1910 to 1960 the average efficiency of electric power generation in the industrialized countries increased from under 10% to over 30%, resulting in sharp cost reductions and price cuts. Today the electric power generating and distribution industry is, by capital invested, the biggest industry in the world.

An offshoot of the telephone (and another application of electricity) was radio-telegraphy, which has subsequently morphed into radio, TV and the Internet. It all started in 1867 when James Clerk Maxwell published his comprehensive theory of electromagnetism, which predicted the existence of electromagnetic waves. This was one of the greatest intellectual achievements in human history, although it was barely noticed at the time, except by a few physicists. Twenty years later Heinrich Hertz demonstrated the existence of radio waves. The unit of frequency, the hertz, is named for him.

The man who saw a business opportunity for radio-telegraphy and developed the technology was Guglielmo Marconi, starting in 1896. However, the technology was very limited at first, and its applications remained few and specialized – mainly to ships – for the next two decades. But electronic technology progressed very rapidly, thanks to John Ambrose Fleming's invention of the thermionic diode, or “valve”, in 1904. It got that name because, like a valve, it allows current to flow in one direction only. (Fleming was working for Marconi at the time, taking time off from his professorship.) This “valve” was the predecessor of the ubiquitous “vacuum tube”, which has by no means been completely replaced by solid-state transistors.

Fleming’s invention was followed by a clever ‘add on” two years later by the American inventor Lee De Forest (no slouch – he had 180 patents). De Forest added a third filament – creating a triode – and observed that the “valve” had become an amplifier of sorts. De Forest and Fleming fought for years over patent rights, while a young genius named Edwin Armstrong proceeded to invent several of the most important circuitry innovations in the radio business, starting with the regenerative amplifier, the super-regenerative circuit for transmitters, and the super-heterodyne circuit for receivers. (Armstrong had 70 patents.)

Broadcast radio took off in the 1920s. By 1929 the Radio Corporation of America (RCA) was the hottest stock on Wall Street. In the 1930s Armstrong pioneered frequency modulation (FM) – analyzed mathematically as early as 1922 by John Renshaw Carson at Bell Laboratories – as a substitute for amplitude modulation (AM) for radio broadcasting. The superiority of FM is perfectly evident today. But it was fought tooth and nail, in the courts and at the Federal Communications Commission (FCC), by RCA under its unscrupulous chief, David Sarnoff (himself a former Marconi inspector). RCA's legal victories impoverished Armstrong – who had been a wealthy man in the 1930s thanks to his radio patents – and he committed suicide in 1954. Armstrong's widow finally succeeded in reversing RCA's legal victories, but too late for him.
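For readers unfamiliar with the distinction, the two schemes can be written compactly (a standard textbook formulation, not specific to Armstrong's own circuits). With carrier amplitude \(A_c\), carrier frequency \(f_c\) and message signal \(x(t)\),

\[
s_{\mathrm{AM}}(t) = A_c\,[1 + m\,x(t)]\cos(2\pi f_c t), \qquad
s_{\mathrm{FM}}(t) = A_c\cos\!\Big(2\pi f_c t + 2\pi k_f \int_0^t x(\tau)\,d\tau\Big),
\]

where \(m\) and \(k_f\) set the modulation depth. AM carries the message in the amplitude envelope, which is easily corrupted by lightning and electrical interference; FM keeps the amplitude constant and carries the message in the instantaneous frequency, which is why FM reception is so much quieter.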

The underlying ideas for TV were developed gradually, in several countries, over the decades after 1897, when the British physicist J. J. Thomson demonstrated the ability to control cathode rays in a cathode ray tube (CRT). The first to patent a complete system, including “charge storage” for scanning and displaying an image, was a Hungarian engineer, Kalman Tihanyi, in 1926. Tihanyi's patents (including patents not yet issued) were bought by RCA. RCA also bought Vladimir Zworykin's 1923 “imaging tube” patent from Westinghouse; Zworykin developed it further into a color version in 1928, and it was incorporated in RCA's “iconoscope” camera in 1933.

Meanwhile, Dieckmann and Hell had patented an “image dissector” camera in Germany in 1925, which later became the basic design for European TV systems. Philo Farnsworth in the US independently invented his own “image dissector” camera, which he patented (and demonstrated in 1927). (RCA sued Farnsworth for infringement of Zworykin's 1923 patent, but lost and had to pay him $1 million starting in 1939.) Farnsworth demonstrated his (improved) system at the Franklin Institute in 1934. During the early 1930s other imaging systems for CRTs were being developed and demonstrated in Berlin (Manfred von Ardenne), Mexico (González Camarena) and London (Isaac Schoenberg at EMI). However, the signal-to-noise ratios (sensitivity) of the RCA iconoscope and the first EMI “emitron” were both still much too low. Under Schoenberg's supervision the signal-to-noise ratio of the emitron was radically improved (by a factor of 10-15 or more) in the “super-emitron”. That breakthrough became the basis for the world's first 405-line “high-definition” TV broadcasting service, by the BBC, in 1937.

Television using vacuum tubes and cathode ray tubes (CRTs) for pictures was further developed in the 1940s and spread rapidly in the 1950s. Bell Telephone Laboratories had demonstrated a mechanically scanned color TV as early as 1929, and Camarena's 1940 patent on a “trichromatic field sequential system” was one of the enablers of color TV. But there were a number of problems, and a great many firms in different countries were working on them. The first nationwide color TV broadcast in the US was the Tournament of Roses parade in 1954. The “color transition” in the US started in 1965, when all the TV networks began transmitting more than half of their programming in color; by 1972 the transition was complete. But by 1970, thanks to transistors, RCA had lost its technological edge and the TV industry had moved to Japan (Sony, Matsushita).

High-definition TV (HDTV) was pioneered in Japan and began to penetrate the global market in 1990. The digital TV transition started in 2007 or so; analog broadcasting was rapidly phased out, and was expected to be finished by 2015 or so.

The flat-screen “plasma display” was first proposed in 1936 by the same Hungarian engineer, Kalman Tihanyi, whose TV patents from 1926 had been acquired by RCA. Flat-screen systems – mostly liquid crystal displays (LCDs), later backlit by solid-state light-emitting diodes (LEDs) – took a long time to become commercially viable in competition with CRTs. But by the 1990s flat screens were replacing CRTs, and in 2007 LCD displays outsold CRTs for the first time.

By 2013, 79 percent of the households in the world had TV sets.

After WW II electricity demand was driven by the spread of telephones, radios, TV, freezers, air-conditioning and – of course – more lighting. The rising electrical demands of the rapidly expanding telephone system in the US, which depended entirely on electromechanical switching systems, prompted the executives of Bell Telephone Laboratories to initiate a new R&D project in 1945. It was aimed at finding more economical solid-state switching devices to replace vacuum-tube diodes and triodes.

The outcome of that project was the transistor, a solid-state electronic device based on semiconductors, a newly appreciated class of materials, and on the field effect – altering the conductivity of a material by means of an external electric field – proposed earlier by Shockley. The Bell Labs project stalled at first, when the experiments got nowhere. Bardeen found (and published) the explanation, in terms of quantum-mechanical surface effects. That insight led to the first point-contact transistors, which were demonstrated in December 1947 at Bell Telephone Laboratories and announced publicly in June 1948. This breakthrough by three physicists (John Bardeen, Walter Brattain and William Shockley) earned a Nobel Prize in 1956.

The problem of patenting the transistor was complicated by the fact that the first patents for a solid-state triode were filed by Julius Edgar Lilienfeld, in Canada and the US in 1925 and 1926. In 1934 Oskar Heil patented such a device in Germany, but neither Lilienfeld nor Heil published research articles or demonstrated actual prototypes. However the lawyers decided to patent “around” the Lilienfeld patents. They submitted four narrow patent applications, none of which contained Shockley’s name. This omission angered him, and broke up the group. In 1951 Shockley – working in secret – independently invented the bipolar junction transistor, which is the basis of most later devices. Publicity for this infuriated Bardeen and Brattain. Bardeen left Bell Labs and moved to the University of Illinois in 1951 to work on the theory of superconductivity (for which he received another Nobel Prize in 1972).

In 1956 Shockley (still angry) moved to California to run Shockley Semiconductor Lab, a division of Beckman Instruments. But because of his decision to terminate R&D on silicon (and his unilateral management style) eight of his top employees left as a group to form Fairchild Semiconductor. A few years later, several of those men departed again, to form Intel.

Because of a “consent agreement” between AT&T and the Anti-Trust Division of the US Department of Justice, the fundamental transistor patent was licensed by AT&T to all comers for a modest fee ($50,000). One of the earliest licensees was Sony, which used transistors in its successful portable “transistor radios”, starting in 1957. The original black-and-white TV sets, with vacuum tubes in their innards, were replaced by color sets as the vacuum tubes were replaced by transistors during the 1960s and 70s. By the 1970s and 1980s the TV manufacturing industry had moved to Japan and South Korea.

The early digital computers, starting with ENIAC (1946), used vacuum tubes for switching; the first commercial “main-frame” computers from IBM and UNIVAC followed in the early 1950s. (I worked on one in the summer of 1953.) At that time one of the chief executives of IBM opined that there was room in the world for a dozen or so of those “main-frame” computers. Other entrepreneurs saw it differently. The transistor replaced the vacuum tube in the next generation of computers in the late 1950s. The next major breakthrough was the “integrated circuit”, invented independently by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor in 1958-59. The circuit-on-a-silicon-chip technology took a decade to scale up, but the dynamic random-access memory (D-RAM) chip arrived in 1970, followed by the first microprocessor (1971), both from Intel.

Those technologies, in turn, enabled computers to become faster and more powerful as they shrank in size. The first personal computer, by Olivetti (1965), cost $3200. Hewlett-Packard began manufacturing small computers soon after that. To make a long and interesting story short, the merger of telecommunications and computer technology resulted in the internet, the mobile phone and the digital revolution. Since 1980 information technology devices, starting with personal computers (PCs) and more recently cellphones and so-called tablets, have spread rapidly.

In retrospect, it can be argued that the transistor and the integrated circuit did open up a new resource, albeit one that is hard to compare with natural resources like coal or copper. The new resource can be called “bandwidth”. It revolutionized the radio and TV industries and led to the ubiquitous digital information and information technology that dominates the world economy today.

