Tuesday, 30 November 2021
Electronics media entertainment support AuVid by Agustinus Manguntam Siboro / WIPER / GLock
Change is something that runs continuously through this life. In mathematics, change is usually expressed as a delta or a derivative. Life keeps changing because life is, in essence, change itself; without change there is no basis for moving forward and progressing. Change makes us more dignified; there is no life without change. y' = dy/dx (delta force orbitory).
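To make the derivative concrete, here is a minimal numeric sketch in Python that approximates y' = dy/dx with a finite difference; the test function and step size are illustrative choices, not anything prescribed above.

# Minimal sketch: approximating the derivative y' = dy/dx with a
# central finite difference. The function f and step h are illustrative.
def derivative(f, x, h=1e-6):
    """Central-difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

# Example: y = x**2 has dy/dx = 2x, so at x = 3 we expect about 6.
f = lambda x: x ** 2
print(derivative(f, 3.0))  # ~6.0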
change.over
Gen. Strobe light
$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$
Integrated electronics engineering as a force
in remote and close control
$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$
RENE_WINGS
(Mount Agung and Lake Batur in the schematic networking job, joined to the Dewa Pasupati Link, where the Naga Naga is turbulent)
------------------------------------------
Decima machine learning used to transmit
an electromagnetic catapult
__________________________________________
In Roman mythology, Decima was one of the Parcae, or the Fates. She measured the thread of life with her rod. She was also revered as the goddess of childbirth. Her Greek equivalent was Lachesis.
We need change management to improve science
_____________________________________________
Change management draws on theories from many disciplines, including psychology, behavioral science, engineering, and systems thinking.
Successful change management relies on four core principles:
Understand Change.
Plan Change.
Implement Change.
Communicate Change.
The ADKAR model outlines five things you should address in your communications:
Awareness (of the need for change).
Desire (to participate in and support it).
Knowledge (of how to change).
Ability (to change).
Reinforcement (to sustain the change in the long term).
In more detail, the four key principles of change management are:
1. Understand Change: for change to be effective, you need to understand all the "ins and outs" of the change. For example, what it is, how it will be achieved, and why it needs to happen.
2. Plan Change: this can include achieving high-level sponsorship of the change project, as well as identifying wider involvement and buy-in opportunities.
3. Implement Change: when you come to carry out your plan, you need to ensure that everyone involved knows what they're doing. This may encompass addressing training needs, appointing "change agents," providing support for people across the organization, and setting specific success criteria.
4. Communicate Change: everyone needs to know why the change is happening, feel positive about it, and understand how they can achieve success.
The evolution of knowledge within and across fields in modern electronics drives advances in physics
========================
The exchange of knowledge across different areas and disciplines plays a key role in the process of knowledge creation, and can stimulate innovation and the emergence of new fields. We develop here a quantitative framework to extract significant dependencies among scientific disciplines and turn them into a time-varying network whose nodes are the different fields, while the weighted links represent the flow of knowledge from one field to another at a given period of time.
The analysis of knowledge flows internal to each field displays a remarkable variety of temporal behaviours, with some fields of physics shown to be more self-referential than others. The temporal networks of knowledge exchanges across fields reveal cases of one field continuously absorbing knowledge from another field over the entire observed period, pairs of fields mutually influencing each other, and also cases of evolution from absorbing to mutual or even to back-nurture behaviour.
Knowledge creation and knowledge sharing go hand in hand. Knowledge is in fact created through the combination and integration of different concepts, and can benefit from social interactions and interdisciplinary collaborations. Recent works have explored from many angles how knowledge flows across scholars, institutions, and disciplines. In particular, it has been shown that knowledge exchange across fields can influence the evolution of culture and language, strengthen multi-faceted cooperation, and drive the innovation and development of science. Research publications are one of the primary channels of communication for the exchange and spreading of knowledge in science.
Interactions among scientific fields can be better characterized by making use of scientific citations.
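As a rough sketch of the framework described above, the Python snippet below turns per-period citation counts between fields into a time-varying weighted directed network. The field names, periods, and counts are invented purely for illustration.

# Sketch: a time-varying knowledge-flow network built from citation
# counts between fields. All names, periods, and counts are invented.
from collections import defaultdict

# citations[period][(citing_field, cited_field)] = citation count
citations = {
    "1990-1999": {("optics", "condensed matter"): 120,
                  ("condensed matter", "optics"): 45},
    "2000-2009": {("optics", "condensed matter"): 310,
                  ("condensed matter", "optics"): 290},
}

def knowledge_flow_networks(citations):
    """One weighted directed graph per period: an edge u -> v with
    weight w means field u cited (absorbed knowledge from) field v
    w times in that period."""
    networks = {}
    for period, counts in citations.items():
        graph = defaultdict(dict)
        for (citing, cited), n in counts.items():
            graph[citing][cited] = n
        networks[period] = dict(graph)
    return networks

for period, graph in sorted(knowledge_flow_networks(citations).items()):
    print(period, graph)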
electronics
===========
Electronics is the branch of physics and electrical engineering that deals with the emission, behaviour, and effects of electrons and with electronic devices. Today, however, virtually all fields depend on electronics for instrumentation and for ever-smarter control.
Electronics encompasses an exceptionally broad range of technology. The term originally was applied to the study of electron behaviour and movement, particularly as observed in the first electron tubes. It came to be used in its broader sense with advances in knowledge about the fundamental nature of electrons and about the way in which the motion of these particles could be utilized. Today many scientific and technical disciplines deal with different aspects of electronics. Research in these fields has led to the development of such key devices as transistors, integrated circuits, lasers, and optical fibres. These in turn have made it possible to manufacture a wide array of electronic consumer, industrial, and military products. Indeed, it can be said that the world is in the midst of an electronic revolution at least as significant as the industrial revolution of the 19th century.
physics industry electronic system
The history of electronics
The vacuum tube era
Theoretical and experimental studies of electricity during the 18th and 19th centuries led to the development of the first electrical machines and the beginning of the widespread use of electricity. The history of electronics began to evolve separately from that of electricity late in the 19th century with the identification of the electron by the English physicist Sir Joseph John Thomson and the measurement of its electric charge by the American physicist Robert A. Millikan in 1909.
At the time of Thomson’s work, the American inventor Thomas A. Edison had observed a bluish glow in some of his early lightbulbs under certain conditions and found that a current would flow from one electrode in the lamp to another if the second one (anode) were made positively charged with respect to the first (cathode). Work by Thomson and his students and by the English engineer John Ambrose Fleming revealed that this so-called Edison effect was the result of the emission of electrons from the cathode, the hot filament in the lamp. The motion of the electrons to the anode, a metal plate, constituted an electric current that would not exist if the anode were negatively charged.
This discovery provided impetus for the development of electron tubes, including an improved X-ray tube by the American engineer William D. Coolidge and Fleming’s thermionic valve (a two-electrode vacuum tube) for use in radio receivers. The detection of a radio signal, which is a very high-frequency alternating current (AC), requires that the signal be rectified; i.e., the alternating current must be converted into a direct current (DC) by a device that conducts only when the signal has one polarity but not when it has the other—precisely what Fleming’s valve (patented in 1904) did. Previously, radio signals were detected by various empirically developed devices such as the “cat whisker” detector, which was composed of a fine wire (the whisker) in delicate contact with the surface of a natural crystal of lead sulfide (galena) or some other semiconductor material. These devices were undependable, lacked sufficient sensitivity, and required constant adjustment of the whisker-to-crystal contact to produce the desired result. Yet these were the forerunners of today’s solid-state devices. The fact that crystal rectifiers worked at all encouraged scientists to continue studying them and gradually to obtain the fundamental understanding of the electrical properties of semiconducting materials necessary to permit the invention of the transistor.
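A small numeric sketch of what rectification means here: an ideal diode, like Fleming's valve or the crystal detector, conducts for only one polarity, turning an alternating signal into a pulsating direct one. The waveform below is illustrative.

# Sketch: ideal half-wave rectification of an AC signal, the job done
# by Fleming's valve or a crystal detector. The waveform is illustrative.
import math

samples = [math.sin(2 * math.pi * t / 20) for t in range(40)]  # AC input
rectified = [max(0.0, s) for s in samples]  # conducts for one polarity only

print([round(s, 2) for s in rectified[:10]])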
In 1906 Lee De Forest, an American engineer, developed a type of vacuum tube that was capable of amplifying radio signals. De Forest added a grid of fine wire between the cathode and anode of the two-electrode thermionic valve constructed by Fleming. The new device, which De Forest dubbed the Audion (patented in 1907), was thus a three-electrode vacuum tube. In operation, the anode in such a vacuum tube is given a positive potential (positively biased) with respect to the cathode, while the grid is negatively biased. A large negative bias on the grid prevents any electrons emitted from the cathode from reaching the anode; however, because the grid is largely open space, a less negative bias permits some electrons to pass through it and reach the anode. Small variations in the grid potential can thus control large amounts of anode current.
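The following toy model illustrates the triode's defining property: a small change in grid voltage produces a large change in anode current. The cutoff voltage and gain are made-up parameters, not data for any real tube.

# Toy model of a triode: anode current versus grid bias.
# Cutoff voltage and gain are invented parameters for illustration.
def anode_current_ma(grid_volts, cutoff=-10.0, gain=5.0):
    """Zero current below cutoff; above it, current rises with grid voltage."""
    if grid_volts <= cutoff:
        return 0.0  # a large negative bias blocks all electrons
    return gain * (grid_volts - cutoff)  # roughly linear operating region

for vg in (-12.0, -8.0, -7.5, -7.0):
    print(f"grid {vg:6.1f} V -> anode {anode_current_ma(vg):5.1f} mA")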
The vacuum tube permitted the development of radio broadcasting, long-distance telephony, television, and the first electronic digital computers. These early electronic computers were, in fact, the largest vacuum-tube systems ever built. Perhaps the best-known representative is the ENIAC (Electronic Numerical Integrator and Computer), completed in 1946.
The special requirements of the many different applications of vacuum tubes led to numerous improvements, enabling them to handle large amounts of power, operate at very high frequencies, have greater than average reliability, or be made very compact (the size of a thimble). The cathode-ray tube, originally developed for displaying electrical waveforms on a screen for engineering measurements, evolved into the television picture tube. Such tubes operate by forming the electrons emitted from the cathode into a thin beam that impinges on a fluorescent screen at the end of the tube. The screen emits light that can be viewed from outside the tube. Deflecting the electron beam causes patterns of light to be produced on the screen, creating the desired optical images.
Notwithstanding the remarkable success of solid-state devices in most electronic applications, there are certain specialized functions that only vacuum tubes can perform. These usually involve operation at extremes of power or frequency.
Vacuum tubes are fragile and ultimately wear out in service. Failure occurs in normal usage either from the effects of repeated heating and cooling as equipment is switched on and off (thermal fatigue), which ultimately causes a physical fracture in some part of the interior structure of the tube, or from degradation of the properties of the cathode by residual gases in the tube. Vacuum tubes also take time (from a few seconds to several minutes) to “warm up” to operating temperature—an inconvenience at best and in some cases a serious limitation to their use. These shortcomings motivated scientists at Bell Laboratories to seek an alternative to the vacuum tube and led to the development of the transistor.
The semiconductor revolution
Invention of the transistor
The invention of the transistor in 1947 by John Bardeen, Walter H. Brattain, and William B. Shockley of the Bell research staff provided the first of a series of new devices with remarkable potential for expanding the utility of electronic equipment (see photograph). Transistors, along with such subsequent developments as integrated circuits, are made of crystalline solid materials called semiconductors, which have electrical properties that can be varied over an extremely wide range by the addition of minuscule quantities of other elements. The electric current in semiconductors is carried by electrons, which have a negative charge, and also by “holes,” analogous entities that carry a positive charge. The availability of two kinds of charge carriers in semiconductors is a valuable property exploited in many electronic devices made of such materials.
Early transistors were produced using germanium as the semiconductor material, because methods of purifying it to the required degree had been developed during and shortly after World War II. Because the electrical properties of semiconductors are extremely sensitive to the slightest trace of certain other elements, only about one part per billion of such elements can be tolerated in material to be used for making semiconductor devices.
During the late 1950s, research on the purification of silicon succeeded in producing material suitable for semiconductor devices, and new devices made of silicon were manufactured from about 1960. Silicon quickly became the preferred raw material, because it is much more abundant than germanium and thus less expensive. In addition, silicon retains its semiconducting properties at higher temperatures than does germanium. Silicon diodes can be operated at temperatures up to 200 °C (400 °F), whereas germanium diodes cannot be operated above 85 °C (185 °F). There was one other important property of silicon, not appreciated at the time but crucial to the development of low-cost transistors and integrated circuits: silicon, unlike germanium, forms a tenaciously adhering oxide film with excellent electrical insulating properties when it is heated to high temperatures in the presence of oxygen. This film is utilized as a mask to permit the desired impurities that modify the electrical properties of silicon to be introduced into it during manufacture of semiconductor devices. The mask pattern, formed by a photolithographic process, permits the creation of tiny transistors and other electronic components in the silicon.
Integrated circuits
By 1960 vacuum tubes were rapidly being supplanted by transistors, because the latter had become less expensive, did not burn out in service, and were much smaller and more reliable. Computers employed hundreds of thousands of transistors each. This fact, together with the need for compact, lightweight electronic missile-guidance systems, led to the invention of the integrated circuit (IC) independently by Jack Kilby of Texas Instruments Incorporated in 1958 and by Jean Hoerni and Robert Noyce of Fairchild Semiconductor Corporation in 1959. Kilby is usually credited with having developed the concept of integrating device and circuit elements onto a single silicon chip, while Noyce is given credit for having conceived the method for integrating the separate elements.
Early ICs contained about 10 individual components on a silicon chip 3 mm (0.12 inch) square. By 1970 the number was up to 1,000 on a chip of the same size at no increase in cost. Late in the following year the first microprocessor was introduced. The device contained all the arithmetic, logic, and control circuitry required to perform the functions of a computer’s central processing unit (CPU). This type of large-scale IC was developed by a team at Intel Corporation, the same company that also introduced the memory IC in 1971. The stage was now set for the computerization of small electronic equipment.
Until the microprocessor appeared on the scene, computers were essentially discrete pieces of equipment used primarily for data processing and scientific calculations. They ranged in size from minicomputers, comparable in dimensions to a small filing cabinet, to mainframe systems that could fill a large room. The microprocessor enabled computer engineers to develop microcomputers—systems about the size of a lunch box or smaller but with enough computing power to perform many kinds of business, industrial, and scientific tasks. Such systems made it possible to control a host of small instruments or devices (e.g., numerically controlled lathes and one-armed robotic devices for spot welding) by using standard components programmed to do a specific job. The very existence of computer hardware inside such devices is not apparent to the user.
The large demand for microprocessors generated by these initial applications led to high-volume production and a dramatic reduction in cost. This in turn promoted the use of the devices in many other applications—for example, in household appliances and automobiles, for which electronic controls had previously been too expensive to consider. Continued advances in IC technology gave rise to very large-scale integration (VLSI), which substantially increased the circuit density of microprocessors. These technological advances, coupled with further cost reductions stemming from improved manufacturing methods, made feasible the mass production of personal computers for use in offices, schools, and homes.
By the mid-1980s inexpensive microprocessors had stimulated computerization of an enormous variety of consumer products. Common examples included programmable microwave ovens and thermostats, clothes washers and dryers, self-tuning television sets and self-focusing cameras, videocassette recorders and video games, telephones and answering machines, musical instruments, watches, and security systems. Microelectronics also came to the fore in business, industry, government, and other sectors. Microprocessor-based equipment proliferated, ranging from automatic teller machines (ATMs) and point-of-sale terminals in retail stores to automated factory assembly systems and office workstations.
By mid-1986 memory ICs with a capacity of 262,144 bits (binary digits) were available. In fact, Gordon E. Moore, one of the founders of Intel, observed as early as 1965 that the complexity of ICs was approximately doubling every 18–24 months, which was still the case in 2000. This empirical “Moore’s law” is widely used in forecasting the technological requirements for manufacturing future ICs.
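A back-of-the-envelope sketch of that doubling rule: starting from the 262,144-bit memory IC of mid-1986 and assuming one doubling every 18 or 24 months (the range quoted above), the snippet projects capacity forward to 2000. The arithmetic is illustrative, not a manufacturing forecast.

# Illustrative Moore's-law arithmetic: capacity doubling every 18-24
# months, starting from the 262,144-bit memory IC of mid-1986.
def project_bits(start_bits, years, months_per_doubling):
    doublings = years * 12 / months_per_doubling
    return start_bits * 2 ** doublings

START_BITS = 262_144  # mid-1986 figure from the text
for months in (18, 24):
    bits = project_bits(START_BITS, years=14, months_per_doubling=months)
    print(f"doubling every {months} months: ~{bits:,.0f} bits by 2000")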
Compound semiconductor materials
Many semiconductor materials other than silicon and germanium exist, and they have different useful properties. Silicon carbide is a compound semiconductor, the only one composed of two elements from column IV of the periodic table. It is particularly suited for making devices for specialized high-temperature applications. Other compounds formed by combining elements from column III of the periodic table—such as aluminum, gallium, and indium—with elements from column V—such as phosphorus, arsenic, and antimony—are of particular interest. These so-called III-V compounds are used to make semiconductor devices that emit light efficiently or that operate at exceptionally high frequencies.
A remarkable characteristic of these compounds is that they can, in effect, be mixed together. One can produce gallium arsenide or substitute aluminum for some of the gallium or also substitute phosphorus for some of the arsenic. When this is done, the electrical and optical properties of the material are subtly changed in a continuous fashion in proportion to the amount of aluminum or phosphorus used.
Except for silicon carbide, these compounds have the same crystal structure. This makes possible the gradation of composition, and thus the properties, of the semiconductor material within one continuous crystalline body. Modern material-processing techniques allow these compositional changes to be controlled accurately on an atomic scale.
These characteristics are exploited in making semiconductor lasers that produce light of any given wavelength within a considerable range. Such lasers are used, for example, in compact disc players and as light sources for optical fibre communication.
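To see how "properties change in proportion to composition" plays out, here is a hedged first-order sketch: the band gap of the alloy AlxGa(1-x)As estimated by linear interpolation between the end compounds, and the corresponding emission wavelength via lambda ~ 1240/Eg nm. The endpoint gaps are approximate literature values, and real alloys deviate from a straight line, so treat this as illustrative only.

# Sketch: estimating the band gap of the alloy Al(x)Ga(1-x)As by linear
# interpolation between the end compounds, then the corresponding laser
# emission wavelength via lambda ~ 1240 / Eg (nm, with Eg in eV).
# The endpoint gaps (~1.42 eV GaAs, ~2.16 eV AlAs) are approximate
# literature values; treat the whole model as illustrative.
def bandgap_ev(x, eg_gaas=1.42, eg_alas=2.16):
    """First-order (Vegard-like) estimate of the alloy band gap."""
    return (1 - x) * eg_gaas + x * eg_alas

for x in (0.0, 0.2, 0.4):
    eg = bandgap_ev(x)
    print(f"x = {x:.1f}: Eg ~ {eg:.2f} eV, lambda ~ {1240 / eg:.0f} nm")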
Digital electronics
Computers understand only two numbers, 0 and 1, and do all their arithmetic operations in this binary mode. Many electrical and electronic devices have two states: they are either off or on. A light switch is a familiar example, as are vacuum tubes and transistors. Because computers have been a major application for integrated circuits from their beginning, digital integrated circuits have become commonplace. It has thus become easy to design electronic systems that use digital language to control their functions and to communicate with other systems.
A major advantage in using digital methods is that the accuracy of a stream of digital signals can be verified, and, if necessary, errors can be corrected. In contrast, signals that vary in proportion to, say, the sound of an orchestra can be corrupted by “noise,” which once present cannot be removed. An example is the sound from a phonograph record, which always contains some extraneous sound from the surface of the recording groove even when the record is new. The noise becomes more pronounced with wear. Contrast this with the sound from a digital compact disc recording. No sound is heard that was not present in the recording studio. The disc and the player contain error-correcting features that remove any incorrect pulses (perhaps arising from dust on the disc) from the information as it is read from the disc.
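A minimal sketch of the verification idea: append a parity bit to each byte so that any single flipped bit can be detected. Real compact discs use far stronger error-correcting codes (cross-interleaved Reed–Solomon); this parity check only illustrates the principle of a verifiable digital stream.

# Minimal sketch of digital error detection: one parity bit per byte
# lets the receiver detect any single flipped bit. Real systems such
# as CDs use much stronger error-correcting codes.
def add_parity(byte):
    return byte, bin(byte).count("1") % 2

def check(byte, parity):
    return bin(byte).count("1") % 2 == parity

byte, p = add_parity(0b10110010)
print(check(byte, p))            # True: stream verified
corrupted = byte ^ 0b00000100    # one bit flipped by "noise"
print(check(corrupted, p))       # False: error detected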
As electronic systems become more complex, it is essential that errors produced by noise be removed; otherwise, the systems may malfunction. Many electronic systems are required to operate in electrically noisy environments, such as in an automobile. The only practical way to assure immunity from noise is to make such a system operate digitally. In principle it is possible to correct for any arbitrary number of errors, but in practice this may not be possible. The amount of extra information that must be handled to correct for large rates of error reduces the capacity of the system to handle the desired information, and so trade-offs are necessary.
A consequence of the veritable explosion in the number and kinds of electronic systems has been a sharp growth in the electrical noise level of the environment. Any electrical system generates some noise, and all electronic systems are to some degree susceptible to disturbance from noise. The noise may be conducted along wires connected to the system, or it may be radiated through the air. Care is necessary in the design of systems to limit the amount of noise that is generated and to shield the system properly to protect it from external noise sources.
Optoelectronics
A new direction in electronics employs photons (packets of light) instead of electrons. By common consent these new approaches are included in electronics, because the functions that are performed are, at least for the present, the same as those performed by electronic systems and because these functions usually are embedded in a largely electronic environment. This new direction is called optical electronics or optoelectronics.
In 1966 it was proposed on theoretical grounds that glass fibres could be made with such high purity that light could travel through them for great distances. Such fibres were produced during the early 1970s. They contain a central core in which the light travels. The outer cladding is made of glass of a different chemical formulation and has a lower optical index of refraction. This difference in refractive index means that light travels faster in the cladding than it does in the core. Thus, if the light beam begins to move from the core into the cladding, its path is bent so as to move it back into the core. The light is constrained within the core even if the fibre is bent into a circle.
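The guiding condition described above can be made quantitative: rays striking the core–cladding boundary beyond the critical angle, sin(theta_c) = n_cladding / n_core, are totally internally reflected, and the numerical aperture follows as NA = sqrt(n_core^2 - n_cladding^2). The index values below are typical of silica fibre but are assumed here for illustration.

# Sketch: critical angle and numerical aperture of a step-index fibre.
# n_core and n_clad are typical silica values, assumed for illustration.
import math

n_core, n_clad = 1.48, 1.46
theta_c = math.degrees(math.asin(n_clad / n_core))  # total internal reflection beyond this
na = math.sqrt(n_core**2 - n_clad**2)               # sine of the acceptance half-angle

print(f"critical angle ~ {theta_c:.1f} deg")
print(f"numerical aperture ~ {na:.3f} "
      f"(acceptance half-angle ~ {math.degrees(math.asin(na)):.1f} deg)")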
electronics Home Science Physics electronics
××××××××××××××××××××××××××××××××××××××××××××
physics industry electronic system
----------------------------------
Electronics encompasses an exceptionally broad range of technology. The term originally was applied to the study of electron behaviour and movement, particularly as observed in the first electron tubes. It came to be used in its broader sense with advances in knowledge about the fundamental nature of electrons and about the way in which the motion of these particles could be utilized. Today many scientific and technical disciplines deal with different aspects of electronics. Research in these fields has led to the development of such key devices as transistors, integrated circuits, lasers, and optical fibres. These in turn have made it possible to manufacture a wide array of electronic consumer, industrial, and military products. Indeed, it can be said that the world is in the midst of an electronic revolution at least as significant as the industrial revolution of the 19th century.
flexible electronics
flexible electronics
The development of screen-printable electronic ink for flexible electronics.
© American Chemical Society
The history of electronics
--------------------------
The vacuum tube era
Theoretical and experimental studies of electricity during the 18th and 19th centuries led to the development of the first electrical machines and the beginning of the widespread use of electricity. The history of electronics began to evolve separately from that of electricity late in the 19th century with the identification of the electron by the English physicist Sir Joseph John Thomson and the measurement of its electric charge by the American physicist Robert A. Millikan in 1909.
At the time of Thomson’s work, the American inventor Thomas A. Edison had observed a bluish glow in some of his early lightbulbs under certain conditions and found that a current would flow from one electrode in the lamp to another if the second one (anode) were made positively charged with respect to the first (cathode). Work by Thomson and his students and by the English engineer John Ambrose Fleming revealed that this so-called Edison effect was the result of the emission of electrons from the cathode, the hot filament in the lamp. The motion of the electrons to the anode, a metal plate, constituted an electric current that would not exist if the anode were negatively charged.
This discovery provided impetus for the development of electron tubes, including an improved X-ray tube by the American engineer William D. Coolidge and Fleming’s thermionic valve (a two-electrode vacuum tube) for use in radio receivers. The detection of a radio signal, which is a very high-frequency alternating current (AC), requires that the signal be rectified; i.e., the alternating current must be converted into a direct current (DC) by a device that conducts only when the signal has one polarity but not when it has the other—precisely what Fleming’s valve (patented in 1904) did. Previously, radio signals were detected by various empirically developed devices such as the “cat whisker” detector, which was composed of a fine wire (the whisker) in delicate contact with the surface of a natural crystal of lead sulfide (galena) or some other semiconductor material. These devices were undependable, lacked sufficient sensitivity, and required constant adjustment of the whisker-to-crystal contact to produce the desired result. Yet these were the forerunners of today’s solid-state devices. The fact that crystal rectifiers worked at all encouraged scientists to continue studying them and gradually to obtain the fundamental understanding of the electrical properties of semiconducting materials necessary to permit the invention of the transistor.
In 1906 Lee De Forest, an American engineer, developed a type of vacuum tube that was capable of amplifying radio signals. De Forest added a grid of fine wire between the cathode and anode of the two-electrode thermionic valve constructed by Fleming. The new device, which De Forest dubbed the Audion (patented in 1907), was thus a three-electrode vacuum tube. In operation, the anode in such a vacuum tube is given a positive potential (positively biased) with respect to the cathode, while the grid is negatively biased. A large negative bias on the grid prevents any electrons emitted from the cathode from reaching the anode; however, because the grid is largely open space, a less negative bias permits some electrons to pass through it and reach the anode. Small variations in the grid potential can thus control large amounts of anode current.
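The grid's leverage over the anode current can be caricatured with a toy transfer model in Python (the cutoff and gain numbers below are invented for illustration and stand in for no particular tube):

def anode_current_ma(grid_volts, cutoff=-10.0, gain=2.0):
    # Below the cutoff bias the grid repels all electrons and no
    # current flows; above it, current grows as the grid becomes
    # less negative. Values are illustrative, not measured.
    if grid_volts <= cutoff:
        return 0.0
    return gain * (grid_volts - cutoff)

for vg in (-12, -10, -8, -4, 0):
    print(vg, "V on the grid ->", anode_current_ma(vg), "mA at the anode")

A swing of a few volts on the grid moves the anode current by several milliamperes, which is the amplification that made De Forest's Audion so consequential.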
The vacuum tube permitted the development of radio broadcasting, long-distance telephony, television, and the first electronic digital computers. These early electronic computers were, in fact, the largest vacuum-tube systems ever built. Perhaps the best-known representative is the ENIAC (Electronic Numerical Integrator and Computer), completed in 1946.
The special requirements of the many different applications of vacuum tubes led to numerous improvements, enabling them to handle large amounts of power, operate at very high frequencies, have greater than average reliability, or be made very compact (the size of a thimble). The cathode-ray tube, originally developed for displaying electrical waveforms on a screen for engineering measurements, evolved into the television picture tube. Such tubes operate by forming the electrons emitted from the cathode into a thin beam that impinges on a fluorescent screen at the end of the tube. The screen emits light that can be viewed from outside the tube. Deflecting the electron beam causes patterns of light to be produced on the screen, creating the desired optical images.
Notwithstanding the remarkable success of solid-state devices in most electronic applications, there are certain specialized functions that only vacuum tubes can perform. These usually involve operation at extremes of power or frequency.
Vacuum tubes are fragile and ultimately wear out in service. Failure occurs in normal usage either from the effects of repeated heating and cooling as equipment is switched on and off (thermal fatigue), which ultimately causes a physical fracture in some part of the interior structure of the tube, or from degradation of the properties of the cathode by residual gases in the tube. Vacuum tubes also take time (from a few seconds to several minutes) to “warm up” to operating temperature—an inconvenience at best and in some cases a serious limitation to their use. These shortcomings motivated scientists at Bell Laboratories to seek an alternative to the vacuum tube and led to the development of the transistor.
The semiconductor revolution
----------------------------
Invention of the transistor
The invention of the transistor in 1947 by John Bardeen, Walter H. Brattain, and William B. Shockley of the Bell research staff provided the first of a series of new devices with remarkable potential for expanding the utility of electronic equipment (see photograph). Transistors, along with such subsequent developments as integrated circuits, are made of crystalline solid materials called semiconductors, which have electrical properties that can be varied over an extremely wide range by the addition of minuscule quantities of other elements. The electric current in semiconductors is carried by electrons, which have a negative charge, and also by “holes,” analogous entities that carry a positive charge. The availability of two kinds of charge carriers in semiconductors is a valuable property exploited in many electronic devices made of such materials.
Figure: the first transistor, invented by American physicists John Bardeen, Walter H. Brattain, and William B. Shockley. (© Windell Oskay, www.evilmadscientist.com, CC BY 2.0)
Early transistors were produced using germanium as the semiconductor material, because methods of purifying it to the required degree had been developed during and shortly after World War II. Because the electrical properties of semiconductors are extremely sensitive to the slightest trace of certain other elements, only about one part per billion of such elements can be tolerated in material to be used for making semiconductor devices.
During the late 1950s, research on the purification of silicon succeeded in producing material suitable for semiconductor devices, and new devices made of silicon were manufactured from about 1960. Silicon quickly became the preferred raw material, because it is much more abundant than germanium and thus less expensive. In addition, silicon retains its semiconducting properties at higher temperatures than does germanium. Silicon diodes can be operated at temperatures up to 200 °C (392 °F), whereas germanium diodes cannot be operated above 85 °C (185 °F). There was one other important property of silicon, not appreciated at the time but crucial to the development of low-cost transistors and integrated circuits: silicon, unlike germanium, forms a tenaciously adhering oxide film with excellent electrical insulating properties when it is heated to high temperatures in the presence of oxygen. This film is utilized as a mask to permit the desired impurities that modify the electrical properties of silicon to be introduced into it during manufacture of semiconductor devices. The mask pattern, formed by a photolithographic process, permits the creation of tiny transistors and other electronic components in the silicon.
Integrated circuits
-------------------
By 1960 vacuum tubes were rapidly being supplanted by transistors, because the latter had become less expensive, did not burn out in service, and were much smaller and more reliable. Computers employed hundreds of thousands of transistors each. This fact, together with the need for compact, lightweight electronic missile-guidance systems, led to the invention of the integrated circuit (IC) independently by Jack Kilby of Texas Instruments Incorporated in 1958 and by Jean Hoerni and Robert Noyce of Fairchild Semiconductor Corporation in 1959. Kilby is usually credited with having developed the concept of integrating device and circuit elements onto a single silicon chip, while Noyce is given credit for having conceived the method for integrating the separate elements.
Early ICs contained about 10 individual components on a silicon chip 3 mm (0.12 inch) square. By 1970 the number was up to 1,000 on a chip of the same size at no increase in cost. Late in the following year the first microprocessor was introduced. The device contained all the arithmetic, logic, and control circuitry required to perform the functions of a computer’s central processing unit (CPU). This type of large-scale IC was developed by a team at Intel Corporation, the same company that also introduced the memory IC in 1971. The stage was now set for the computerization of small electronic equipment.
Until the microprocessor appeared on the scene, computers were essentially discrete pieces of equipment used primarily for data processing and scientific calculations. They ranged in size from minicomputers, comparable in dimensions to a small filing cabinet, to mainframe systems that could fill a large room. The microprocessor enabled computer engineers to develop microcomputers—systems about the size of a lunch box or smaller but with enough computing power to perform many kinds of business, industrial, and scientific tasks. Such systems made it possible to control a host of small instruments or devices (e.g., numerically controlled lathes and one-armed robotic devices for spot welding) by using standard components programmed to do a specific job. The very existence of computer hardware inside such devices is not apparent to the user.
The large demand for microprocessors generated by these initial applications led to high-volume production and a dramatic reduction in cost. This in turn promoted the use of the devices in many other applications—for example, in household appliances and automobiles, for which electronic controls had previously been too expensive to consider. Continued advances in IC technology gave rise to very large-scale integration (VLSI), which substantially increased the circuit density of microprocessors. These technological advances, coupled with further cost reductions stemming from improved manufacturing methods, made feasible the mass production of personal computers for use in offices, schools, and homes.
By the mid-1980s inexpensive microprocessors had stimulated computerization of an enormous variety of consumer products. Common examples included programmable microwave ovens and thermostats, clothes washers and dryers, self-tuning television sets and self-focusing cameras, videocassette recorders and video games, telephones and answering machines, musical instruments, watches, and security systems. Microelectronics also came to the fore in business, industry, government, and other sectors. Microprocessor-based equipment proliferated, ranging from automatic teller machines (ATMs) and point-of-sale terminals in retail stores to automated factory assembly systems and office workstations.
By mid-1986 memory ICs with a capacity of 262,144 bits (binary digits) were available. In fact, Gordon E. Moore, one of the founders of Intel, observed as early as 1965 that the complexity of ICs was approximately doubling every 18–24 months, which was still the case in 2000. This empirical “Moore’s law” is widely used in forecasting the technological requirements for manufacturing future ICs.
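A back-of-the-envelope projection shows what such doubling implies (a sketch assuming the mid-1986 figure above and the faster, 18-month end of the quoted range):

bits_1986 = 262_144          # 2**18 bits, the mid-1986 capacity
months_per_doubling = 18     # the faster end of the 18-24 month range

def projected_bits(year):
    # Project memory-IC capacity under a simple doubling model.
    doublings = (year - 1986) * 12 / months_per_doubling
    return bits_1986 * 2 ** doublings

for year in (1986, 1992, 2000):
    print(year, f"{projected_bits(year):,.0f} bits")

Under these assumptions, capacity grows by a factor of several hundred between 1986 and 2000, broadly in line with the memory chips actually shipped over that period.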
Compound semiconductor materials
________________________________
Many semiconductor materials other than silicon and germanium exist, and they have different useful properties. Silicon carbide is a compound semiconductor, the only one composed of two elements from column IV of the periodic table. It is particularly suited for making devices for specialized high-temperature applications. Other compounds formed by combining elements from column III of the periodic table—such as aluminum, gallium, and indium—with elements from column V—such as phosphorus, arsenic, and antimony—are of particular interest. These so-called III-V compounds are used to make semiconductor devices that emit light efficiently or that operate at exceptionally high frequencies.
A remarkable characteristic of these compounds is that they can, in effect, be mixed together. One can produce gallium arsenide, substitute aluminum for some of the gallium, or substitute phosphorus for some of the arsenic. When this is done, the electrical and optical properties of the material change subtly and continuously in proportion to the amount of aluminum or phosphorus used.
Except for silicon carbide, these compounds have the same crystal structure. This makes possible the gradation of composition, and thus the properties, of the semiconductor material within one continuous crystalline body. Modern material-processing techniques allow these compositional changes to be controlled accurately on an atomic scale.
These characteristics are exploited in making semiconductor lasers that produce light of any given wavelength within a considerable range. Such lasers are used, for example, in compact disc players and as light sources for optical fibre communication.
Digital electronics
-------------------
Computers understand only two numbers, 0 and 1, and do all their arithmetic operations in this binary mode. Many electrical and electronic devices have two states: they are either off or on. A light switch is a familiar example, as are vacuum tubes and transistors. Because computers have been a major application for integrated circuits from their beginning, digital integrated circuits have become commonplace. It has thus become easy to design electronic systems that use digital language to control their functions and to communicate with other systems.
A major advantage in using digital methods is that the accuracy of a stream of digital signals can be verified, and, if necessary, errors can be corrected. In contrast, signals that vary in proportion to, say, the sound of an orchestra can be corrupted by “noise,” which once present cannot be removed. An example is the sound from a phonograph record, which always contains some extraneous sound from the surface of the recording groove even when the record is new. The noise becomes more pronounced with wear. Contrast this with the sound from a digital compact disc recording. No sound is heard that was not present in the recording studio. The disc and the player contain error-correcting features that remove any incorrect pulses (perhaps arising from dust on the disc) from the information as it is read from the disc.
As electronic systems become more complex, it is essential that errors produced by noise be removed; otherwise, the systems may malfunction. Many electronic systems are required to operate in electrically noisy environments, such as in an automobile. The only practical way to assure immunity from noise is to make such a system operate digitally. In principle it is possible to correct for any arbitrary number of errors, but in practice this may not be possible. The amount of extra information that must be handled to correct for large rates of error reduces the capacity of the system to handle the desired information, and so trade-offs are necessary.
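A concrete miniature of this trade-off is the classic Hamming(7,4) code, sketched below in Python (a standard textbook scheme, not one tied to any particular system described here): every 4 data bits carry 3 extra parity bits, and in exchange any single corrupted bit can be located and flipped back.

def hamming_encode(d):
    # Encode 4 data bits as a 7-bit codeword: three parity bits of
    # overhead buy single-bit error correction.
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming_correct(c):
    # Recompute the parity checks; together they point at the
    # position of a single flipped bit (0 means no error found).
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3
    if pos:
        c[pos - 1] ^= 1
    return c

word = hamming_encode([1, 0, 1, 1])
word[4] ^= 1                          # simulate one bit corrupted by noise
print(hamming_correct(word))          # the original codeword is recovered

Note the cost: nearly half the transmitted bits are overhead, which is precisely the reduction in capacity that the paragraph above describes.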
A consequence of the veritable explosion in the number and kinds of electronic systems has been a sharp growth in the electrical noise level of the environment. Any electrical system generates some noise, and all electronic systems are to some degree susceptible to disturbance from noise. The noise may be conducted along wires connected to the system, or it may be radiated through the air. Care is necessary in the design of systems to limit the amount of noise that is generated and to shield the system properly to protect it from external noise sources.
Optoelectronics
_______________
A new direction in electronics employs photons (packets of light) instead of electrons. By common consent these new approaches are included in electronics, because the functions that are performed are, at least for the present, the same as those performed by electronic systems and because these functions usually are embedded in a largely electronic environment. This new direction is called optical electronics or optoelectronics.
In 1966 it was proposed on theoretical grounds that glass fibres could be made with such high purity that light could travel through them for great distances. Such fibres were produced during the early 1970s. They contain a central core in which the light travels. The outer cladding is made of glass of a different chemical formulation and has a lower optical index of refraction. This difference in refractive index means that light travels faster in the cladding than it does in the core. Thus, if the light beam begins to move from the core into the cladding, its path is bent so as to move it back into the core. The light is constrained within the core even if the fibre is bent into a circle.
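The guiding condition follows from Snell's law: a ray striking the core–cladding boundary at more than the critical angle is totally reflected back into the core. A quick computation, with index values chosen only for illustration:

import math

n_core = 1.48   # illustrative refractive indices; real fibre
n_clad = 1.46   # values depend on the glass formulation

# Total internal reflection sets in at the critical angle,
# measured from the normal to the core-cladding boundary.
theta_c = math.degrees(math.asin(n_clad / n_core))
print(f"critical angle ~ {theta_c:.1f} degrees")

With indices this close together, the critical angle exceeds 80 degrees, so only rays travelling nearly parallel to the fibre axis are trapped, which is what makes the fibre an effective light guide.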
The core of early optical fibres was of such a diameter (several micrometres [μm], or about one-tenth the diameter of a human hair) that the various rays of light in the core could travel in slightly different paths, the shortest directly down the axis and other longer paths wandering back and forth across the core. This limited the maximum distance that a pulse of light could travel without becoming unduly spread by the time it arrived at the receiving end of the fibre, with the central ray arriving first and others later. In a digital communications system, successive pulses can overlap one another and be indistinguishable at the receiving end. Such fibres are called multimode fibres, in reference to the various paths (or modes) that the light can follow.
During the late 1970s, fibres were made with smaller core diameters in which the light was constrained to follow only one path. This occurs if the core has a diameter a little larger than the wavelength of the light traveling in it—i.e., about 10 to 15 μm (0.01 to 0.015 mm, or 0.0004 to 0.0006 inch). These single-mode fibres avoid the difficulty described above. By 1993 optical fibres capable of carrying light signals more than 215 km (135 miles) became available. Such distance records have become obsolete with the use of specialized fibres that incorporate integral amplifying features. Fibres employing these optical amplifiers carry light signals over transoceanic distances without the conventional pulse regeneration measures that were needed in the past.
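Whether a fibre supports one mode or many can be estimated from its normalized frequency, the V number; a step-index fibre is single-mode when V is below about 2.405. A sketch under assumed dimensions and indices:

import math

wavelength_um = 1.55            # an assumed operating wavelength
core_radius_um = 4.0            # an assumed single-mode core radius
n_core, n_clad = 1.467, 1.462   # illustrative index values

na = math.sqrt(n_core**2 - n_clad**2)   # numerical aperture
v = 2 * math.pi * core_radius_um * na / wavelength_um

print(f"V = {v:.2f} ->", "single-mode" if v < 2.405 else "multimode")

With these numbers V comes out near 2, so the fibre carries a single mode; doubling the core radius would push V past the threshold and reintroduce the pulse spreading described above.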
Optical fibres have several advantages over copper wires or coaxial cables. They can carry information at a much higher rate, they occupy less space (an important feature in large cities and in buildings), and they are quite insensitive to electrical noise. Moreover, it is virtually impossible to make unauthorized connections to them. Costs, initially high, have dropped to the point where most new installations of telephone circuits between switching centres and over longer distances consist of optical fibres.
Given the fact that communication signals arrive at a central switching office in optical form, it has been attractive to consider switching them from one route to another by optical means rather than electrically, as is done today. The distances between central offices in most cases are substantially shorter than the distance light can travel within a fibre. Optical switching would make unnecessary the detection and regeneration of the light signals, steps that are currently required. Such optical central-office switches are ready for installation today and will further advance the dramatic changes wrought by the use of light waves rather than electrons.
Another direction in optoelectronics builds in part on the foregoing developments but to a quite different end. A key problem in developing faster computers and faster integrated circuits to use in them is related to the time required for electrical signals to travel over wire interconnections. This is a difficulty both for the integrated circuits themselves and for the connections between them. Under the best circumstances, electrical signals can travel in a wire at about 90 percent of the speed of light. A more usual rate is 50 percent. Light travels about 30 cm (12 inches) in a billionth of a second. Modern computers operate at speeds of more than one billion operations per second. Thus, if two signals that start simultaneously from different sites are to arrive at their destination simultaneously, the paths they travel must not differ in length by more than a few centimetres.
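The constraint can be checked with simple arithmetic (assumed figures: a one-nanosecond clock period and signals moving at half the speed of light):

c_cm_per_ns = 30.0                 # light covers ~30 cm per nanosecond
signal_speed = 0.5 * c_cm_per_ns   # a typical speed in wire: half of c
clock_period_ns = 1.0              # one billion operations per second

# If two paths may disagree by at most a tenth of a clock period,
# the tolerable difference in length is only a couple of centimetres.
max_skew_ns = 0.1 * clock_period_ns
print(f"allowed path mismatch ~ {signal_speed * max_skew_ns:.1f} cm")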
Two approaches can be envisioned. In one, all the integrated circuits are placed as close together as possible to minimize the distances that signals must travel. This creates a cooling problem, because the integrated circuits generate heat. In the other possible approach, all the paths for signals are made equal to the longest path. This requires the use of much more wire, because most paths are longer than they would otherwise be. All this wire takes space, which means that the integrated circuits have to be placed farther apart than is preferable.
Ultimately, as computers operate even faster, neither approach will work, and a radically new technique must be used. Optical communication between integrated circuits is one possible answer. Light beams do not take up space or interfere with cooling air. If the communication is optical, then the computation might be done optically as well. Optical computation will require a radically different form of integrated circuit, which can in principle be made of gallium arsenide and related III-V compounds. These matters are currently under serious study in research laboratories.
Superconducting electronics
___________________________
Numerous metals completely lose their resistance to the flow of electric current at temperatures approaching absolute zero (0 K, −273 °C, or −460 °F) and become superconducting. Other equally dramatic changes in electrical properties occur as well. One of these is the Josephson effect, named for the British physicist Brian D. Josephson, who predicted the phenomenon in 1962; it was confirmed experimentally soon afterward. The Josephson effect governs the passage of current from one superconducting metal to another through a very thin insulating film between them (the Josephson junction) and the effects of small magnetic fields on this current.
Josephson junction devices change from one electrical state to another in extraordinarily short times, offering the possibility of producing superconducting microcircuits that operate faster than any other kind known. Serious efforts have been made to construct a computer on this basis, but most of the projects have been either discontinued or sharply cut back because of technical difficulties. Interest in the approach has also waned because of increases in the speed of III-V semiconductor microcircuits.
Josephson junctions have other uses in science. They make extremely sensitive detectors of small magnetic fields, for example. The voltage across a Josephson junction is known on theoretical grounds to be dependent only on the values of certain basic physical constants. Since these constants are known to great accuracy, Josephson junctions are now used to provide the absolute standard of voltage.
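The relation behind the standard is V = n·f/KJ, where f is the frequency of microwave radiation applied to the junction, n is an integer step number, and KJ = 2e/h is the Josephson constant. Because e and h are now fixed by definition, the voltage follows exactly; a quick check with an assumed 70 GHz drive:

e = 1.602176634e-19   # elementary charge, coulombs (exact by definition)
h = 6.62607015e-34    # Planck constant, joule-seconds (exact by definition)

K_J = 2 * e / h       # Josephson constant, ~483,597.8 GHz per volt
f = 70e9              # an assumed microwave drive frequency, 70 GHz

# Each voltage step contributes f / K_J volts (n = 1 here); arrays of
# many junctions in series build these steps up to practical voltages.
print(f"one step = {f / K_J * 1e6:.2f} microvolts")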
Other important applications of Josephson junctions have to do with the metrology of very high-speed signals. Measurements of fast phenomena require the use of even faster measurement tools, which Josephson devices provide.
Flat-panel displays
-------------------
Display devices convey information in visible form from electronic devices to human viewers. Common examples are the faces on digital watches, numerical indicators on stereo equipment, and the picture tubes in television sets and computer monitors. Until recently the most versatile of these has been the picture tube, which can present numbers, letters, graphs, and both still and moving pictures. While picture tubes set a very high standard of performance and provide bright colour images, they are bulky, heavy, and expensive. Designers of television receivers have long desired a display device having the virtues of the picture tube but fewer of the disadvantages, so that a “picture on the wall” television set can be produced.
New developments in flat-panel displays have made this possible. Such displays are advanced versions of the liquid crystal display familiar in digital watch faces. They are essentially two parallel sheets of thin glass having the facing sides coated with a transparent yet electrically conducting film such as indium tin oxide. The film layer nearer the viewer is patterned, while the other layer is not. The space between the films is filled with a fluid with unusual electrical and optical properties, so that, if an electrical field is established between the two thin films, the molecules of the fluid line up in such a way that the light-reflecting or light-transmitting properties of the assembly are radically changed. The electro-optical fluid is an electrical insulator, so very little electric current flows. Thus, almost no power is consumed, making the display well suited for use in battery-powered applications. All flat-panel displays have these characteristics in common, but the many different varieties exploit the electro-optical effects in numerous ways.
Displays that produce images are patterned with myriads of tiny picture elements that can be electrically activated independently to produce patterns of light and dark or arbitrary forms. Superposed colour filters having arrays of elements corresponding to those in the display permit the formation of colour images of a quality rivaling that of colour cathode-ray tube displays. Such displays are used as viewing devices for television sets, computers, and video and digital cameras.
Colour (also spelled color) is the aspect of any object that may be described in terms of hue, lightness, and saturation. In physics, colour is associated specifically with electromagnetic radiation of a certain range of wavelengths visible to the human eye; radiation of such wavelengths constitutes the portion of the electromagnetic spectrum known as the visible spectrum, i.e., light. Vision is obviously involved in the perception of colour. A person can see in dim light, however, without being able to distinguish colours; only when more light is present do colours appear. Light of some critical intensity, therefore, is also necessary for colour perception.
The science of electronics
--------------------------
Valence electrons
Since electronics is concerned with the control of the motion of electrons, one must keep in mind that electrons, being negatively charged, are attracted to positive charges and repelled by other negative charges. Thus, electrons in a vacuum tend to space themselves apart from one another and form a cloud, subject to the influences of other charges that may be present. An electric current is created by the motion of electrons, whether in a vacuum, in a wire, or in any other electrically conducting medium. In each of these cases, electrons move as a result of their attraction to positive charges or repulsion from negative ones.
An atom consists of a nucleus of protons and neutrons around which electrons, equal in number to the protons in the nucleus, travel in orbits much like those of the planets around the Sun. Because of this equality in the number of positively and negatively charged constituent particles, the atom as a whole is electrically uncharged. When atoms are combined into certain solids called covalent solids (notably the elements of column IV of the periodic table), the valence electrons (outer electrons) are shared between neighbouring atoms, and the atoms thereby become bound together. This occurs not only in elemental solids, wherein all the atoms are of the same kind, but also in chemical compounds (e.g., the III-V compounds).
Different materials vary greatly in their ability to conduct electricity, depending directly on the ease or difficulty of setting electrons free from their atoms. In insulating materials all the outermost electrons of the atoms are tightly bound in the chemical bonds between atoms and are not free to move. In metals there are more valence electrons than are required for bonding, and these excess electrons are freely available for electrical conduction.
The fabrication processes used to make real devices are not as well understood as the underlying device physics, although much has been learned. Theoretical designs incorporate the assumptions that the materials are entirely pure, that dopants exist only in the proper amounts and distributions, and that the dimensions of structures have the intended values. These assumptions are true in practice only to a limited degree. Major efforts in universities and company laboratories are focused on better understanding these issues and on developing improved computer-based modeling and process-design methods. Large sums of money are spent to provide equipment and manufacturing environments that adequately control each process step and protect the material being processed from contamination.
Integrated electronic control brings these instrument functions together; such functions are commonly used in integrated-circuit devices within advanced and smart electronics networks.
**********************************************
Electronics media settings for communication