Chipmakers


Alexander Graham Bell

Alexander Graham Bell (March 3, 1847 – August 2, 1922)[4] was an eminent Scottish-born scientist, inventor, engineer and innovator who is credited with inventing the first practical telephone.[N 3]

Bell’s father, grandfather, and brother had all been associated with work on elocution and speech, and both his mother and wife were deaf, profoundly influencing Bell’s life’s work.[7] His research on hearing and speech further led him to experiment with hearing devices which eventually culminated in Bell being awarded the first U.S. patent for the telephone in 1876.[N 4] Bell considered his most famous invention an intrusion on his real work as a scientist and refused to have a telephone in his study.[9][N 5]

Many other inventions marked Bell’s later life, including groundbreaking work in optical telecommunications, hydrofoils and aeronautics. In 1888, Bell became one of the founding members of the National Geographic Society.[11]

Wikipedia | Alexander Graham Bell


Volta Laboratory and Bureau

The Volta Laboratory (also known as the “Alexander Graham Bell Laboratory”, the “Bell Carriage House” and the “Bell Laboratory”) and the Volta Bureau were created in Georgetown, Washington, D.C. by Alexander Graham Bell.[3]

The Volta Laboratory was founded in 1880–1881 with Charles Sumner Tainter and Bell’s cousin, Chichester Bell,[4] for the research and development of telecommunication, phonograph and other technologies.

Using funds generated by the Volta Laboratory, Bell later founded the Volta Bureau in 1887 “for the increase and diffusion of knowledge relating to the deaf”; the Bureau merged with the American Association for the Promotion and Teaching of Speech to the Deaf (AAPTSD) in 1908.[5] It was renamed the Alexander Graham Bell Association for the Deaf in 1956 and then the Alexander Graham Bell Association for the Deaf and Hard of Hearing in 1999.[6]

Wikipedia | Volta Laboratory and Bureau


Bell Labs

At its peak, Bell Laboratories was the premier facility of its type, developing a wide range of revolutionary technologies, including radio astronomy, the transistor, the laser, information theory, the UNIX operating system, the C programming language and the C++ programming language. Eight Nobel Prizes have been awarded for work completed at Bell Laboratories.[8]

The Turing Award has twice been won by Bell Labs researchers:

  • 1968: Richard Hamming for his work on numerical methods, automatic coding systems, and error-detecting and error-correcting codes.
  • 1983: Ken Thompson and Dennis Ritchie for their work on operating systems theory, and their development of Unix.

Bell Laboratories in Murray Hill, New Jersey

Wikipedia | Bell Labs


Who invented the cell phone?

Do you remember a time when cell phones were rare? Today, it’s hard to imagine a world without them. Even if you don’t own one yourself, you probably see dozens of people talking on a cell phone every day. The rate at which we adopted the devices is astounding. But who invented them?

The first cell phone was larger even than this monster. © iStockphoto/Thinkstock

To get the answer to that question, we need to look back more than a century. Alexander Graham Bell invented the telephone in 1876. Then, on December 23, 1900, on the outskirts of Washington, D.C., an inventor named Reginald Fessenden accomplished a remarkable feat: he made the first wireless telephone call. He was the first to transmit the human voice via radio waves, sending a signal from one radio tower to another.

Fessenden’s work paved the way for broadcast radio, but it also provided the foundation for cell phones and networks. In 1947, an engineer named William Rae Young proposed that radio towers arranged in a hexagonal pattern could support a telephone network. Young worked under another engineer named D.H. Ring, who led a team at Bell Laboratories, which was part of AT&T at the time.

Howstuffworks | Who invented the cell phone?


Books: The Moses of Silicon Valley

BOOK REVIEWED – Broken Genius: The Rise and Fall of William Shockley, Creator of the Electronic Age


Walter Houser Brattain

Brattain, Walter Houser (1902-1987), an American physicist, shared the Nobel Prize in physics in 1956 with his colleagues John Bardeen and William Shockley for inventing the transistor, which ushered in the era of microminiature electronic parts and led to today’s computers.

Brattain was born in Xiamen, China, where his father, a recent graduate of Whitman College in Walla Walla, Washington, was teaching science and math. The following year, the Brattain family returned to Washington, where Walter and his brother, Robert, spent much of their youth helping out on the family’s cattle ranch near the Canadian border. In 1920, he enrolled at Whitman College, where he majored in physics and math. After completing a bachelor’s degree in physics at Whitman in 1924, Brattain earned a master’s degree at the University of Oregon (1926) and a doctorate at the University of Minnesota (1929). Brattain’s first job, as a radio engineer at the National Bureau of Standards, left him anxious to return to physics. At a meeting of the American Physical Society, his thesis adviser, John Tate, introduced him to Joseph Becker of the Bell Telephone Laboratories, a major U.S. corporate research center. Becker hired Brattain, who remained at Bell Labs until his retirement in 1967.

Howstuffworks | Walter Houser Brattain


John Bardeen, the only man to win two Nobel Prizes in physics

The history of science is not always fair to its protagonists: while some scientists enjoy enormous popularity, others are little known or even forgotten by the general public. Curiously, and paradoxically, these “little known” scientists have often made enormously important contributions to science. Such is the case of the physicist John Bardeen, one of the most important scientists of the 20th century, who unfortunately does not enjoy a fame to match his contributions. The Chicago Tribune captured Bardeen’s place in history perfectly:

“To scientists, Bardeen is an Einstein. To the general public, he is a … John who?”

Bardeen was born in Madison, Wisconsin, in 1908. His father was a professor of anatomy and became the first dean of the medical school at the University of Wisconsin, and his mother, who enjoyed some renown, worked in the art world. John was thus born into an intellectual family that always encouraged him in his studies. He was also a bright child with a passion for science: when he was in seventh grade, his teacher told him that he had a great talent for mathematics and that one day he could find work in that field.

Blogspot | John Bardeen, the only man to win two Nobel Prizes in physics


Who remembers John Bardeen in this centenary year of his birth

John Bardeen has the honor of being the only scientist to have received two Nobel Prizes in Physics, for the discovery of the transistor and for his theory of superconductivity. Frederick Sanger won the Nobel Prize in Chemistry twice, in 1958 and 1980; Marie Curie won the Physics prize in 1903 and the Chemistry prize in 1911; and Linus Carl Pauling won the Chemistry prize in 1954 and the Nobel Peace Prize in 1962. It is worth remembering Bardeen in this centenary year of his birth. The wiki entry is brief but effective. His best-known biography is “True Genius: The Life and Science of John Bardeen. The Only Winner of Two Nobel Prizes in Physics,” by Lillian Hoddeson and Vicki Daitch, Joseph Henry Press, Washington, 2002.

John Bardeen walked slowly down the corridor of the physics building, seemingly lost in his thoughts. It was November 1, 1956; he had been a professor of physics at the University of Illinois for five years, and he was trying to digest the news he had received that very morning: he and two of his colleagues, William Shockley and Walter Brattain, had won the Nobel Prize in Physics for the invention of the transistor in December 1947, when he was working at the Bell Telephone Laboratories.

Naukas | Who remembers John Bardeen in this centenary year of his birth


William Shockley

William Shockley, Nobel Prize in physics

Shockley, William (1910 – 1989) was an American physicist. He received the 1956 Nobel Prize in physics for inventing the transistor, a tiny device that controls the flow of electric current in radios, television sets, computers, and almost every other kind of electronic equipment. Shockley shared the prize with two members of his research staff, the American physicists John Bardeen and Walter Houser Brattain.

In the early 1970’s, Shockley’s views on race and intelligence sparked much controversy. He claimed that heredity, rather than environment, was mainly responsible for whites generally scoring higher than blacks on intelligence tests. Most geneticists and psychologists disagreed with this theory.

William Bradford Shockley was born on Feb. 13, 1910, in London. His parents were Americans working in England. His father, William Hillman Shockley, was a mining engineer. His mother, May (Bradford) Shockley, was a mineral surveyor. The family returned to the United States in 1913 and lived in Palo Alto, California. Shockley’s early interest in science was encouraged by his parents and by a neighbor who was a physics professor at nearby Stanford University. As a child, Shockley was first educated at home. He later attended Palo Alto Military Academy and then Hollywood High School, from which he graduated in 1927.

Shockley attended the University of California at Los Angeles (UCLA) for one year before transferring to the California Institute of Technology (often called Caltech) to study physics. He received a B.S. degree in physics from Caltech in 1932. He then obtained a teaching fellowship that allowed him to pursue graduate work at the Massachusetts Institute of Technology (MIT).

Shockley received his Ph.D. degree in physics from MIT in 1936. His doctoral thesis, titled “Calculations of Wave Functions for Electrons in Sodium Chloride Crystals,” reflected his early research in solid-state physics.

Howstuffworks | Shockley, William


Fairchild Semiconductor

From left to right: Gordon Moore, C. Sheldon Roberts, Eugene Kleiner, Robert Noyce, Victor Grinich, Julius Blank, Jean Hoerni and Jay Last. (1960)

In 1956, William Shockley opened Shockley Semiconductor Laboratory as a division of Beckman Instruments in Mountain View, California; his plan was to develop a new type of “4-layer diode” that would work faster and have more uses than then-current transistors. At first he attempted to hire some of his former colleagues from Bell Labs, but none were willing to move to the West Coast or work with Shockley again at that time. Shockley then founded the core of the new company with what he considered the best and brightest graduates coming out of American engineering schools.

While Shockley was effective as a recruiter, he was less effective as a manager. A core group of Shockley employees, later known as the traitorous eight, became unhappy with his management of the company. The eight men were Julius Blank, Victor Grinich, Jean Hoerni, Eugene Kleiner, Jay Last, Gordon Moore, Robert Noyce, and Sheldon Roberts. Looking for funding on their own project, they turned to Sherman Fairchild’s Fairchild Camera and Instrument, an Eastern U.S. company with considerable military contracts.[4] In 1957 the Fairchild Semiconductor division was started with plans to make silicon transistors at a time when germanium was still the most common material for semiconductor use.

Wikipedia | Fairchild Semiconductor


Historic Transistor Photo Gallery: 2N697

The unique history of Fairchild Semiconductor and its pioneering devices has been documented in a variety of sources.  The initial product line of this company, developed shortly after the founders departed as a group from Shockley Transistor Corporation in 1957, was a high performance silicon mesa transistor.  Initial shipments were made to IBM in 1958 for use as a memory core driver ($200 for each transistor).  By 1958, Fairchild began marketing the transistor commercially as the 2N697.   These were the best performing silicon transistors available at the time, and were very successful commercially.  Other semiconductor companies quickly joined in, and the 2N697 (and its related cousins, the 2N696, 2N1131 and 2N1132) soon became commonly available and were produced for many years by many companies.  Early prices were quite high and made substantial profit for those companies capable of silicon transistor manufacturing.  Texas Instruments, for example, offered the 2N697 for $28.50 each in the 1960 Lafayette radio catalog.  (Note: the earliest Fairchild devices are easily identified by the stylized “F” stamped in the top of the metal case, as in the example shown in the photo above.)

Transistor Museum | FAIRCHILD 2N697


Jack Kilby

Jack St. Clair Kilby (November 8, 1923 – June 20, 2005) was an American electrical engineer who took part (along with Robert Noyce) in the realization of the first integrated circuit while working at Texas Instruments (TI) in 1958. He was awarded the Nobel Prize in physics on December 10, 2000.[1] To congratulate him, US President Bill Clinton wrote, “You can take pride in the knowledge that your work will help to improve lives for generations to come.”[2]

He is also the inventor of the handheld calculator and the thermal printer, for which he has patents. He also has patents for seven other inventions.[3]

In mid-1958, Kilby, as a newly employed engineer at Texas Instruments (TI), did not yet have the right to a summer vacation. He spent the summer working on the problem in circuit design that was commonly called the “tyranny of numbers” and finally came to the conclusion that manufacturing the circuit components en masse in a single piece of semiconductor material could provide a solution. On September 12 he presented his findings to management, which included Mark Shepherd. He showed them a piece of germanium with an oscilloscope attached, pressed a switch, and the oscilloscope showed a continuous sine wave, proving that his integrated circuit worked and thus that he had solved the problem. U.S. Patent 3,138,743 for “Miniaturized Electronic Circuits”, the first integrated circuit, was filed on February 6, 1959.[4] Along with Robert Noyce (who independently made a similar circuit a few months later), Kilby is generally credited as co-inventor of the integrated circuit.

Jack Kilby’s original integrated circuit

Wikipedia | Jack Kilby


Miniaturized electronic circuits: US 3138743 A

Many methods and techniques for miniaturizing electronic circuits have been proposed in the past. At first, most of the effort was spent upon reducing the size of the components and packing them more closely together. Work directed toward reducing component size is still going on but has nearly reached a limit. Other efforts have been made to reduce the size of electronic circuits such as by eliminating the protective coverings from components, by using more or less conventional techniques to form components on a single substrate, and by providing the components with a uniform size and shape to permit closer spacings in the circuit packaging therefor.

All of these methods and techniques require a very large number and variety of operations in fabricating a complete circuit. For example, of all circuit components, resistors are usually considered the most simple to form, but when adapted for miniaturization by conventional techniques, fabrication requires at least the following steps:

(a) Formation of the substrate.

(b) Preparation of the substrate.

(c) Application of terminations.

(d) Preparation of resistor material.

(e) Application of the resistor material.

(f) Heat treatment of the resistor material.

(g) Protection or stabilization of the resistor.

Capacitors, transistors, and diodes when adapted for miniaturization each require at least as many steps in the fabrication thereof. Unfortunately, many of the steps required are not compatible. A treatment that is desirable for the protection of a resistor may damage another element, such as a capacitor or transistor, and as the size of the complete circuit is reduced, such conflicting treatments, or interactions, become of increasing importance. Interactions may be minimized by forming the components separately and then assembling them into a complete package, but the very act of assembly may cause damage to the more sensitive components.

US Patents | Miniaturized electronic circuits: US 3138743 A


How Transistors Work

Without transistors, engineers might never have created amazingly small and powerful digital products. Ethan Miller/Getty Images

Once mass-produced transistorized hearing aids and radios became realities, engineers realized that transistors would replace vacuum tubes in computers, too. One of the first pre-transistor computers, the famous ENIAC (Electronic Numerical Integrator and Computer) weighed 30 tons, thanks in part to its more than 17,000 vacuum tubes. It was obvious that transistors would completely change computer engineering and result in smaller machines.

Germanium transistors certainly helped start the computer age, but silicon transistors revolutionized computer design and spawned an entire industry in California’s aptly-named Silicon Valley.

In 1954, Gordon Teal, a scientist at Texas Instruments, created the first silicon transistor. Soon after, manufacturers developed methods for mass-producing silicon transistors, which were cheaper and more reliable than germanium-based transistors.

Silicon transistors worked wonderfully for computer production. With smart engineering, transistors helped computers power through huge numbers of calculations in a short time. The simple switch operation of transistors is what enables your computer to complete massively complex tasks. In a computer chip, transistors switch between two binary states — 0 and 1. This is the language of computers. One computer chip can have millions of transistors continually switching, helping complete complex calculations.

In a computer chip, the transistors aren’t isolated, individual components. They’re part of what’s called an integrated circuit (also known as a microchip), in which many transistors work in concert to help the computer complete calculations. An integrated circuit is one piece of semiconductor material loaded with transistors and other electronic components.

Computers use those currents in tandem with Boolean algebra to make simple decisions. With many transistors, a computer can make many simple decisions very quickly, and thus perform complex calculations very quickly, too.
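As a purely illustrative sketch of the switching described above (a toy model, not how any real chip is simulated), the Boolean operations can be expressed in Python by treating each gate as a function of 0/1 switch states; NAND is universal, so every other operation can be composed from it:

```python
# Illustrative sketch only: model each transistor-based gate as a
# function of 0/1 inputs. NAND is "universal", so the other Boolean
# operations can be composed from it, just as chips compose complex
# logic from simple transistor switches.

def nand(a: int, b: int) -> int:
    # Output goes low only when both inputs are high.
    return 0 if (a and b) else 1

def not_(a: int) -> int:
    return nand(a, a)

def and_(a: int, b: int) -> int:
    return not_(nand(a, b))

def or_(a: int, b: int) -> int:
    return nand(not_(a), not_(b))

# Truth table for AND and OR, built purely from NAND:
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", and_(a, b), or_(a, b))
```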

Computers need millions or even billions of transistors to complete tasks. Thanks to the reliability and incredibly small size of individual transistors, which are much smaller than the diameter of a single human hair, engineers can pack an unfathomable number of transistors into a wide array of computer and computer-related products.

Howstuffworks | How Transistors Work


IBM DNA Transistor

YouTube | IBM Research


Texas Instruments

TI’s new signboard at its Dallas headquarters

Texas Instruments was founded in 1951.[9] It emerged after a reorganization of Geophysical Service. This company manufactured equipment for use in the seismic industry as well as defense electronics. TI began research in transistors in the early 1950s and produced the world’s first commercial silicon transistor. In 1954, Texas Instruments designed and manufactured the first transistor radio and Jack Kilby invented the integrated circuit in 1958 while working at TI’s Central Research Labs. The company produced the first integrated circuit-based computer for the U.S. Air Force in 1961. TI researched infrared technology in the late 1950s and later made radar systems as well as guidance and control systems for both missiles and bombs. The hand-held calculator was introduced to the world by TI in 1967.

In the 1970s and 80s the company focused on consumer electronics including digital clocks, watches, hand-held calculators, home computers as well as various sensors. In 1997, its defense business was sold to Raytheon. In 2007, Texas Instruments was awarded the Manufacturer of the Year for Global Supply Chain Excellence by World Trade magazine. Texas Instruments is considered to be one of the most ethical companies in the world.[10]

After the acquisition of National Semiconductor in 2011, the company had a combined portfolio of nearly 45,000 analog products and customer design tools,[11] making it the world’s largest maker of analog technology components. In 2011, Texas Instruments ranked 175 in the Fortune 500. TI is made up of two main divisions: Semiconductors (SC) and Educational Technology (ET), of which Semiconductor products account for approximately 96% of TI’s revenue.

Wikipedia | Texas Instruments


Evolution of the Electronic Calculator

Graphing calculators have many advanced functions, including solving and graphing equations. © iStockphoto.com/mbbirdy

Several electronics companies and inventors may claim a first when it comes to the development of the electronic calculator. Japanese company Sharp is said to have created the first desktop calculator, the CS-10A, in 1964. This model resembled a cash register and cost about as much as a mid-sized car.

The next few years became something of a race between manufacturers to make calculators smaller, more accessible and less expensive. In 1972, British inventor Sir Clive Sinclair introduced the Sinclair Executive, which is considered by many to be the world’s first affordable pocket calculator. Its thickness was that of a pack of cigarettes.

These continued advancements in calculator technology were largely made possible by the development of the single-chip microprocessor in the late 1960s. Before this time, engineers built the computing “brains” of calculators (and computers) with multiple chips or other components. Basically, a single-chip microprocessor allows an entire central processing unit (CPU) to exist on one silicon microchip. (To learn more about this technology, check out How Microprocessors Work.)

Howstuffworks | Evolution of the Electronic Calculator


Texas Instruments: from calculators to cars

by Lauren Silverman | Tuesday, May 5, 2015 – 16:00

 In the 1980s, Texas Instruments was excited about its microchips in a hot toy called the Speak & Spell.

TI’s Speak & Spell used the first single-chip voice synthesizer, a tiny device that just a few years later gave the beloved alien E.T. a voice.

E.T. took advantage of the microchip, and later so did some Chrysler vehicles.

Despite its reputation for calculators, Texas Instruments isn’t new to the car business. TI’s automotive business is growing faster than the rest of the company, thanks to selling microprocessors and car technology.

“Most of the major car brands have TI tech inside of them that you don’t even know about,” says Automotive Processors general manager Curt Moore.

Microprocessors created by TI are in lots of cars, including Fords and BMWs, where they help control everything from car windows to power steering.

MARKETPLACE BUSINESS | Texas Instruments: from calculators to cars


How Moore’s Law Works

Gordon Moore at Intel’s headquarters AP Photo/Paul Sakuma

There’s a joke about personal computers that has been around almost as long as the devices have been on the market: You buy a new computer, take it home and just as you finish unpacking it you see an advertisement for a new computer that makes yours obsolete. If you’re the kind of person who demands to have the fastest, most powerful machines, it seems like you’re destined for frustration and a lot of trips to the computer store.

While the joke is obviously an exaggeration, it’s not that far off the mark. Even one of today’s modest personal computers has more processing power and storage space than the famous Cray-1 supercomputer. In 1976, the Cray-1 was state-of-the-art: it could process 160 million floating-point operations per second (flops) and had 8 megabytes (MB) of memory.

Today, many personal computers can perform more than 10 times that number of floating-point operations in a second and have 100 times the amount of memory. Meanwhile, on the supercomputer front, the Cray XT5 Jaguar at the Oak Ridge National Laboratory performed a sustained 1.4 petaflops in 2008.

The prefix peta means 10 to the 15th power — in other words, one quadrillion. That means the Cray XT5 can process 8.75 million times more flops than the Cray-1. It only took a little over three decades to reach that milestone.
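The arithmetic behind that comparison can be restated directly, using the figures quoted above:

```python
# The figures quoted in the text, restated as arithmetic:
cray_1_flops = 160e6      # Cray-1 (1976): 160 million flops
jaguar_flops = 1.4e15     # Cray XT5 Jaguar (2008): 1.4 petaflops sustained

ratio = jaguar_flops / cray_1_flops
print(f"Jaguar / Cray-1 = {ratio:,.0f}x")  # the "8.75 million times" figure
```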

Moore’s law

If you were to chart the evolution of the computer in terms of processing power, you would see that progress has been exponential. The man who first made this famous observation is Gordon Moore, a co-founder of the microprocessor company Intel. Computer scientists, electrical engineers, manufacturers and journalists extrapolated Moore’s Law from his original observation. In general, most people interpret Moore’s Law to mean the number of transistors on a 1-inch (2.5 centimeter) diameter of silicon doubles every x number of months.
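Under the doubling interpretation described above, the growth is a simple exponential. As a sketch (the function name and the 24-month doubling period below are illustrative assumptions, not figures from the article):

```python
# Illustrative only: the 24-month doubling period used in the example
# is an assumption, not a figure from the article.
def transistor_count(n0: float, months: float, doubling_period: float) -> float:
    """Count after `months` months, doubling every `doubling_period` months."""
    return n0 * 2 ** (months / doubling_period)

# With a 24-month doubling period, a decade yields 2**5 = 32x growth:
print(transistor_count(1.0, 120, 24))  # 32.0
```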

Howstuffworks | How Moore’s Law Works


Intel History

In 1968, Robert Noyce and Gordon Moore were two unhappy engineers working for the Fairchild Semiconductor Company who decided to quit and create their own company at a time when many Fairchild employees were leaving to create start-ups. People like Noyce and Moore were nicknamed the “Fairchildren”.

Robert Noyce typed up a one-page summary of what he wanted to do with his new company, and that was enough to convince San Francisco venture capitalist Art Rock to back Noyce’s and Moore’s new venture.

Rock raised $2.5 million in less than 2 days by selling convertible debentures. Art Rock became the first chairman of Intel.

Intel Trademark

The name “Moore Noyce” was already trademarked by a hotel chain, so the two founders decided upon the name “Intel” for their new company, a shortened version of “Integrated Electronics”. However, the rights to the name had to be bought from a company called Intelco first.

ABOUT | Intel History


Intel 8080

The Intel 8080 (“eighty-eighty”) was the second 8-bit microprocessor designed and manufactured by Intel and was released in April 1974.[1] It was an extended and enhanced variant of the earlier 8008 design, although without binary compatibility. The initial specified clock frequency limit was 2 MHz, and with common instructions taking 4, 5, 7, 10, or 11 cycles, it operated at an effective speed of a few hundred thousand instructions per second.

The 8080 has sometimes been labeled “the first truly usable microprocessor”, although earlier microprocessors were used for calculators, cash registers, computer terminals, industrial robots[2] and other applications. The architecture of the 8080 strongly influenced Intel’s 8086 CPU architecture, which spawned the x86 family of processors.

Federico Faggin, the originator of the 8080 architecture in early 1972, proposed it to Intel’s management and pushed for its implementation. He finally got permission to develop it six months later. Faggin hired Masatoshi Shima from Japan, who did the detailed design under his direction, using the design methodology for random logic with silicon gates that Faggin had created for the 4000 family. Stanley Mazor contributed a couple of instructions to the instruction set.

CPU Intel i8080

The 8080 required two support chips to function: the i8224 clock generator/driver and (most often) the i8228 bus controller. It was implemented using non-saturated enhancement-load NMOS, demanding an extra +12 V and a −5 V supply in addition to the main TTL-compatible +5 V supply.

The Intel 8080 was the successor to the 8008. It used the same basic instruction set and register model as the 8008 (developed by Computer Terminal Corporation), even though it was not source code compatible nor binary compatible with its predecessor. Every instruction in the 8008 has an equivalent instruction in the 8080 (even though the actual opcodes differ between the two CPUs). The 8080 also added a few 16-bit operations to its instruction set. Whereas the 8008 required the use of the HL register pair to indirectly access its 14-bit memory space, the 8080 added addressing modes to allow direct access to its full 16-bit memory space. In addition, the internal 7-level push-down call stack of the 8008 was replaced by a dedicated 16-bit stack pointer (SP) register. The 8080’s large 40-pin DIP packaging permitted it to provide a 16-bit address bus and an 8-bit data bus, allowing easy access to 64 KB of memory.
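The 64 KB figure follows directly from the bus width; a quick check of the address-space arithmetic:

```python
# Each address line doubles the number of distinct addresses,
# so a 16-bit bus reaches 2**16 bytes.
address_lines = 16
addressable = 2 ** address_lines
print(addressable, "bytes =", addressable // 1024, "KB")  # 65536 bytes = 64 KB

# The 8008's 14-bit address space, for comparison:
print(2 ** 14 // 1024, "KB")  # 16 KB
```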

Wikipedia | Intel 8080


Motorola

Motorola, Inc. /moʊtəˈroʊlə/ was a multinational[5] telecommunications company based in Schaumburg, Illinois, United States (U.S.). After having lost $4.3 billion from 2007 to 2009, the company was divided into two independent public companies, Motorola Mobility and Motorola Solutions, on January 4, 2011.[6] Motorola Solutions is generally considered to be the direct successor to Motorola, Inc., as the reorganization was structured with Motorola Mobility being spun off.[7]

Motorola designed and sold wireless network equipment such as cellular transmission base stations and signal amplifiers. Motorola’s home and broadcast network products included set-top boxes, digital video recorders, and network equipment used to enable video broadcasting, computer telephony, and high-definition television. Its business and government customers consisted mainly of wireless voice and broadband systems (used to build private networks), and public safety communications systems like Astro and Dimetra. These businesses (except for set-top boxes and cable modems) are now part of Motorola Solutions. Google sold Motorola Home (the former General Instrument cable businesses) to the Arris Group in 2012.[8]

The company began making televisions in 1947 with the model VT-71 with 7-inch cathode ray tube. In 1960, it introduced the world’s first large-screen portable (19-inch), transistorized, cordless television. In 1963, it introduced the first rectangular color picture tube and in 1967 introduced the modular Quasar brand. In 1974, Motorola sold its television business to the Japan-based Matsushita – the parent company of Panasonic.

In 1952, Motorola opened its first international subsidiary in Toronto, Canada to produce radios and televisions. In 1953, the company established the Motorola Foundation to support leading universities in the United States. In 1964, it opened its first company Research and Development branch outside of the United States, in Israel under the management of Moses Basin.

In 1969 Neil Armstrong spoke the famous words “one small step for a man, one giant leap for mankind” from the Moon on a Motorola transceiver.[21]

In 1973, Motorola demonstrated the first hand-held portable telephone.[22]

In 1974, Motorola introduced its first microprocessor, the 8-bit MC6800, used in automotive, computing and video game applications.[23]

In 1976, Motorola moved its headquarters to the Chicago suburb of Schaumburg, Illinois.

Motorola’s next-generation 16/32-bit microprocessor, the MC68000, introduced in 1979, led the wave of technologies that spurred the computing revolution of 1984, powering devices from companies such as Apple, Commodore, Atari, Sun, and Hewlett-Packard.[24]

Dr. Martin Cooper of Motorola made the first private handheld mobile phone call on a larger prototype model in 1973. This is a reenactment in 2007.

In September 1983, the U.S. Federal Communications Commission (FCC) approved the DynaTAC 8000X telephone, the world’s first commercial cellular device. By 1998, cellphones accounted for two thirds of Motorola’s gross revenue.[25] The company was also strong in semiconductor technology, including integrated circuits used in computers. In particular, it is known for the 6800 family and 68000 family of microprocessors used in the Atari ST, Commodore Amiga, Color Computer, and Apple Macintosh personal computers. The PowerPC family was developed with IBM and in a partnership with Apple (known as the AIM alliance). Motorola also had a diverse line of communication products, including satellite systems, digital cable boxes, and modems.

In 1986, Motorola invented the Six Sigma quality improvement process. This became a global standard. In 1990, General Instrument Corporation, which was later acquired by Motorola, proposed the first all-digital HDTV standard. In the same year, the company introduced the Bravo numeric pager which became the world’s best-selling pager.

In 1991, Motorola demonstrated the world’s first working prototype of a digital cellular system and phones using the GSM standard in Hanover, Germany. In 1994, Motorola introduced the world’s first commercial digital radio system that combined paging, data, cellular communications, and voice dispatch in a single radio network and handset. In 1995, Motorola introduced the world’s first two-way pager, which allowed users to receive text messages and e-mail and reply with a standard response.

In 1998, Motorola was overtaken by Nokia as the world’s biggest seller of mobile phone handsets.[21]

Wikipedia | Motorola


Motorola 68000

The Motorola 68000 (“sixty-eight-thousand”; also called the m68k or Motorola 68k, “sixty-eight-kay”) is a 16/32-bit[1] CISC microprocessor core designed and marketed by Motorola Semiconductor Products Sector (now Freescale Semiconductor). Introduced in 1979 with HMOS technology as the first member of the successful 32-bit m68k family of microprocessors, it is generally software forward compatible with the rest of the line despite being limited to a 16-bit wide external bus.[2] After 35 years in production, the 68000 architecture is still in use.

Pre-release XC68000 chip manufactured in 1979.

The 68000 grew out of the MACSS (Motorola Advanced Computer System on Silicon) project, begun in 1976 to develop an entirely new architecture without backward compatibility. It would be a higher-power sibling complementing the existing 8-bit 6800 line rather than a compatible successor. In the end, the 68000 did retain a bus protocol compatibility mode for existing 6800 peripheral devices, and a version with an 8-bit data bus was produced. However, the designers mainly focused on the future, or forward compatibility, which gave the 68000 platform a head start against later 32-bit instruction set architectures. For instance, the CPU registers are 32 bits wide, though few self-contained structures in the processor itself operate on 32 bits at a time. The MACSS team drew heavily on the influence of minicomputer processor design, such as the PDP-11 and VAX systems, which were similarly microcoded.

In the mid-1970s, the 8-bit microprocessor manufacturers raced to introduce the 16-bit generation. National Semiconductor had been first with its IMP-16 and PACE processors in 1973–1975, but these had issues with speed. Intel had worked on their advanced 16/32-bit iAPX432 (alias 8800) since 1975 and their Intel 8086 since 1976 (it was introduced in 1978 but became widespread in the form of the almost identical 8088 in the IBM PC a few years later). Arriving late to the 16-bit arena afforded the new processor more transistors (roughly 40,000 active versus 20,000 active in the 8086), 32-bit macroinstructions, and acclaimed general ease of use.

The original MC68000 was fabricated using an HMOS process with a 3.5 µm feature size. Formally introduced in September 1979,[3] initial samples were released in February 1980, with production chips available over the counter in November.[4] Initial speed grades were 4, 6, and 8 MHz. 10 MHz chips became available during 1981, and 12.5 MHz chips by June 1982.[4] The 16.67 MHz “12F” version of the MC68000, the fastest version of the original HMOS chip, was not produced until the late 1980s. Tom Gunter, retired Corporate Vice President at Motorola, is known as the “Father of the 68000”.

Two Hitachi 68HC000 CPUs being used on an arcade game PCB

The 68k instruction set was particularly well suited to implement Unix,[5] and the 68000 became the dominant CPU for Unix-based workstations, including Sun workstations and Apollo/Domain workstations, and was also used for mass-market computers such as the Apple Lisa, Macintosh, Amiga, and Atari ST. The 68000 was used in Microsoft Xenix systems as well as an early NetWare Unix-based server. The 68000 was used in the first generation of desktop laser printers, including the original Apple LaserWriter and the HP LaserJet. In 1982, the 68000 received an update to its ISA, allowing it to support virtual memory and to conform to the Popek and Goldberg virtualization requirements. The updated chip was called the 68010. A further extended version, which exposed 31 bits of the address bus, was also produced in small quantities as the 68012.

Wikipedia | Motorola 68000
