North South University
Department of Electrical Engineering & Computer Science
ETE 521 Assignment # 3
Name: MD. Rakibul Islam Monshy
ID: 1131048556
Ans. to Question No. (a):
Define
ILEC: An incumbent local exchange carrier (ILEC) is a local telephone company in the United States that was in existence at the time of the breakup of AT&T into the Regional Bell Operating Companies (RBOCs), also known as the "Baby Bells." The ILEC is the former Bell System or independent telephone company responsible for providing local telephone exchange services in a specified geographic area. GTE was the second-largest ILEC after the Bells, but it has since been absorbed into Verizon, an RBOC. ILECs compete with competitive local exchange carriers (CLECs). In technical communities, ILEC is often used simply to mean a telephone provider. In Canada, the term ILEC refers to the original telephone companies such as Telus (BC Tel and AGT), SaskTel, Manitoba Telecom Services (MTS Allstream), Bell Canada Enterprises and Aliant.
An ILEC, with respect to an area in the United States, is a local exchange carrier (LEC) that:
on the date of enactment of the Telecommunications Act of 1996, provided telephone exchange service in that area and, on that date, was deemed to be a member of the exchange carrier association pursuant to the Code of Federal Regulations (C.F.R.) Title 47, section 69.601(b); or
is a person or entity that, on or after that date of enactment, became a successor or assignee of a member described above.
The Federal Communications Commission (FCC) may, by rule, provide for the treatment of an LEC (or class or category thereof) as an ILEC if:
such carrier occupies a position in the market for telephone exchange service within an area that is comparable to the position occupied by a carrier described previously;
such carrier has substantially replaced an ILEC described previously; and
such treatment is consistent with the public interest, convenience and necessity.
Example for the USA: AT&T.
Example for Bangladesh: BTCL (Bangladesh Telecommunications Company Limited, the former BTTB); note that BTRC is the regulator, not a carrier.
CLEC:
A competitive local exchange carrier (CLEC), in the United States, is a telecommunications provider company (sometimes called a "carrier") competing with other, already established carriers (generally the incumbent local exchange carrier, or ILEC). Local exchange carriers (LECs) are divided into incumbent (ILECs) and competitive (CLECs). The ILEC is usually the original, monopoly LEC in a given area and receives different regulatory treatment from the newer CLECs. A data local exchange carrier (DLEC) is a CLEC specializing in DSL services by leasing lines from the ILEC and reselling them to Internet service providers.
Example for the USA: Access Fibre Group Inc., ACN Communication Services Inc., AireSpring Inc., American Telecommunications Systems, Inc., Armstrong Telecommunications, Inc., AT&T Communications of Ohio, Inc.
Example for Bangladesh: Banglalink (Orascom Telecom), GrameenPhone Limited, Robi, Teletalk, Dhaka Mobile, Airtel Bangladesh, OneTel Communication Ltd., Anik Telecom.
DLEC
(Data Local Exchange Carrier) A CLEC that specializes in DSL services, primarily by leasing lines from local phone companies and reselling them to ISPs. The DLEC industry got off to a big start in the late 1990s but ended up in trouble just a couple of years later. Several minor DLECs and two of the three largest (Rhythms and NorthPoint) have gone out of business. The incumbent local telcos (ILECs) have always had the lion's share of DSL customers and have increased their share since the DLEC demise.
Example for the USA: Covad Communications (Rhythms and NorthPoint, mentioned above, were also DLECs).
Example for Bangladesh: Banglalion, Qubee.
BLEC
Building local exchange carriers
Unlike their competitive local exchange carrier (CLEC) and incumbent LEC (ILEC) counterparts, BLECs are concerned exclusively with bringing data and voice services to office towers, industrial parks, hotels and apartment complexes. While CLECs and ILECs focus on building out broadband networks that terminate at the edges of buildings, BLECs concentrate on running broadband inside buildings.
Example for the USA: Cablevision, Comcast, FairPoint.
Example for Bangladesh: Aamra Networks Limited, Access Telecom (BD) Ltd., Aftab IT Ltd.
ICP
Integrated communications provider: An ICP (though the term has not caught on) is a company that offers both voice and data services in local markets.
Example for the USA: Comcast, MegaPath.
Example for Bangladesh: Banglalink (Orascom Telecom), GrameenPhone Limited, Robi, Teletalk.
ASP
An application service provider (ASP) is a business that provides computer-based services to customers over a network. Software offered using an ASP model is also sometimes called On-demand software or software as a service (SaaS). The most limited sense of this business is that of providing access to a particular application program (such as customer relationship management) using a standard protocol such as HTTP.
The need for ASPs has evolved from the increasing costs of specialized software, which have far exceeded the price range of small to medium-sized businesses. As well, the growing complexity of software has led to huge costs in distributing the software to end-users. Through ASPs, the complexity and costs of such software can be cut down. In addition, the issues of upgrading have been eliminated from the end-firm by placing the onus on the ASP to maintain up-to-date services, 24x7 technical support, physical and electronic security and in-built support for business continuity and flexible working. The importance of this marketplace is reflected by its size. As of early 2003, estimates of the United States market ranged from 1.5 to 4 billion dollars. Clients for ASP services include businesses, government organizations, non-profits, and membership organizations.
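To make the "application over HTTP" idea above concrete, here is a minimal sketch of a client reaching a hosted application over plain HTTP. The host name and endpoint are hypothetical, invented only for illustration; a real ASP would publish its own API.

```python
# Minimal sketch of the ASP/SaaS idea: the application runs on the
# provider's servers, and the customer reaches it over standard HTTP.
# The URL below is a hypothetical example, not a real service.
import json
import urllib.request

def fetch_customer_record(customer_id: int) -> dict:
    # The ASP hosts the CRM; the client needs only HTTP, no local install.
    url = f"https://crm.example-asp.com/api/customers/{customer_id}"
    with urllib.request.urlopen(url) as response:
        return json.load(response)

if __name__ == "__main__":
    print(fetch_customer_record(42))
```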
There are several forms of ASP business. These are:
A specialist or functional ASP delivers a single application, such as credit card payment processing or timesheet services;
A vertical market ASP delivers a solution package for a specific customer type, such as a dental practice;
An enterprise ASP delivers broad spectrum solutions;
A local ASP delivers small business services within a limited area.
Some analysts identify a volume ASP as a fifth type. This is basically a specialist ASP that offers a low-cost packaged solution via its own website. PayPal was an instance of this type, and its volume was one way to lower the unit cost of each transaction. In addition to these types, some large multi-line companies (such as HP and IBM) use ASP concepts as a particular business model that supports some specific customers.
Example for the USA: Microsoft, IBM, and Oracle.
Example for Bangladesh: DataSoft Systems Bangladesh Limited, eGeneration, Millennium Information Solution Ltd.
IXC
An interexchange carrier (IXC) is a U.S. legal and regulatory term for a telecommunications company, commonly called a long-distance telephone company, such as MCI (before its absorption by Verizon), Sprint and the former AT&T (before its merger with SBC in 2005) in the United States. It is defined as any carrier that provides inter-LATA communication, where a LATA is a local access and transport area. An IXC carries traffic, usually voice traffic, between telephone exchanges. Telephone exchanges are usually identified in the United States by the three-digit area code (NPA) and the first three digits of the phone number (NPA-NXX). Different exchanges are generally in different geographic locations, such as separate central offices (COs, also called "wire centres"). IXCs used to carry voice traffic on analogue lines, but these days most voice traffic is digitized, so voice traffic is more typically a data stream. These voice data streams can therefore be intermixed with other data traffic, such as uplinks for DSL. Most commonly, links between IXCs and COs are ATM links carried on optical fibre.
Example for the USA: AT&T.
Example for Bangladesh: Banglalink (Orascom Telecom), GrameenPhone Limited, Robi, Teletalk.
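As a toy illustration of the NPA-NXX numbering structure described above (my sketch, not from the original text), the snippet below splits a 10-digit North American number into its area code (NPA), exchange code (NXX), and line number. The sample number is fictitious.

```python
# Toy illustration of NANP structure: NPA (area code), NXX (exchange code),
# and the four-digit line number. The sample number is fictitious.
def split_nanp(number: str) -> tuple[str, str, str]:
    digits = "".join(ch for ch in number if ch.isdigit())
    if len(digits) != 10:
        raise ValueError("expected a 10-digit NANP number")
    return digits[:3], digits[3:6], digits[6:]

npa, nxx, line = split_nanp("212-555-0123")
print(f"NPA={npa}, NXX={nxx}, line={line}")  # NPA=212, NXX=555, line=0123
```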
CAP
A competitive access provider (CAP) is a type of telecommunications service provider that provides local telecommunications services, mainly to business customers, in competition with a local Bell Operating Company (BOC). Teleport and MFS were the two major CAPs operating in major metropolitan areas in the United States.
Ans. to Question No. (b):
1) Sketch of a local telephone network interfacing with a long-distance telephone network.
[Diagram: each telephone connects over a telephone line to an ILEC office switch (A or B); each ILEC office switch connects over shared trunks to an IXC long-distance switch, and the two IXC long-distance switches are linked to each other by shared trunks.]
2) Sketch of an international telephone network interfacing with a long-distance telephone network.
[Diagram: each telephone connects over a telephone line to an ILEC office switch (A or B); each office switch connects over shared trunks to an IXC long-distance switch and then to a local long-distance switch, and the two countries' networks are joined through their international gateway switches.]
6) My friend Jose's and my Burlington-to-Muncie, Indiana telephone network.
[Diagram: each telephone connects over a telephone line to its ILEC office switch (the Oak Street office in Burlington and the Clinton office in Muncie); each office switch connects over shared trunks to its local long-distance switch (Burlington and Muncie, Indiana), and the two sides are joined through IXC long-distance switches over shared trunks.]
7) My friend Karl's and my USA-to-Denmark telephone network.
[Diagram: each telephone connects over a telephone line to its local ILEC office switch (the Oak Street office on the USA side and a local office in Denmark); calls pass over shared trunks through local long-distance and IXC long-distance switches, and the two countries are connected through the USA and Denmark international gateway switches.]
Ans. to Question No. (c):
Technological and regulatory milestones in telecom history, 1920 to present
1928
One-Way Police Radio Communication:
At this site on April 7, 1928, the Detroit Police Department commenced regular one-way radio communication with its patrol cars. Developed by personnel of the department's radio bureau, the system was the product of seven years of experimentation under the direction of police commissioner William P. Rutledge. Their work proved the practicality of land-mobile radio for police work and led to its adoption throughout the country.
In the 1920s gangster era, bank robbers and bootleggers made clean getaways time after time, to the great consternation of police, for this was before reliable mobile-radio communications existed, communications that could have quickly dispatched patrol cars to the scene of the crime. But in 1928, a dedicated Detroit patrolman and an electronics buff devised the first successful one-way radio link between police headquarters and cruisers. Critical news of crimes in progress could now be transmitted from the stationhouse to police cars as they drove. Electronics was a fledgling science when Detroit patrolman Kenneth Cox and Robert L. Batts, an engineering student, built a stable radio receiver and antenna system. Their successful one-way radio, coming after years of trial and error, was installed in April 1928. The Detroit Police Department made history as the first to dispatch patrol cars regularly by radio. Many city police departments shortly followed suit with their own systems.
Imagine driving along, listening to music on your car radio. All of a sudden, someone starts reading, on the air, a list of stolen vehicles. That is how the first police radio system in the world operated, and it operated in Detroit. Between 1921 and 1927, radio buffs Kenneth R. Cox, Walter Vogler and Bernard Fitzgerald, all Detroit police officers, experimented with radio sets they had installed in the back seat of a Model T Ford police patrol car. The receivers picked up signals, but not very consistently. Frequently, broadcasts would fade out as the car passed large buildings or under railroad bridges. Also, police had no designated band on which to broadcast, so the system operated like any radio station. The station was appropriately called KOP. KOP was listed officially as an entertainment station. To meet FRC (Federal Radio Commission, predecessor of the FCC) licensing requirements, police officers broadcast recorded music in between lists of stolen vehicles and descriptions of missing children. Persistent work by Cox and Robert Batts led to the development of an improved receiver in 1927. A broadcasting station, W8FS, was set up on Belle Isle and regular dispatches began in 1928. (From the Detroit Free Press, Thursday, May 7, 1987.)
1929
Yosami Radio Transmitting Station
In April 1929, the Yosami Station established the first wireless communications between Japan and Europe with a long wave operating at 17.442 kHz. An inductor-type high-frequency alternator provided output power at 500 kW. The antenna system used eight towers, each 250 m high. The facilities were used for communicating with submarines by the Imperial Japanese Navy from 1941 to 1945 and by the United States Navy from 1950 to 1993.
Establishment of wireless communications in Japan: Before the First World War, Japan did not have its own overseas communication networks and depended on the wired networks operated by a foreign company. After the First World War, the Japanese government recognized that its own communication network was indispensable for dealing with the increasing amount of trade and diplomatic negotiations, and decided to establish long wave wireless transmitting stations for communications between Japan and the US and between Japan and Europe. The station for communication with the US was established in 1927 in Iwaki, north of Tokyo, and the station for Europe in 1929 in Yosami, near Nagoya.
Long wave generation using machine-senders: In the 1920s, when there was no vacuum tube with high output power, long waves (continuous carrier waves) with high output power were generated by machine-senders, i.e., high-frequency (HF) generators. Two types of generator were proposed for the machine-senders. In the Yosami station, an inductor-type alternator was selected with the aim of high output power. The station started communications to Warsaw, Poland on April 15, 1929, as the first destination, with a long wave of 17.442 kHz and output power of 500 kW. Communications to Berlin, Paris and London followed in turn. By using a generator with such high output power, the long wave could cross a distance of 9,000 km.
1933
Two-Way Police Radio Communication
In 1933, the police department in Bayonne, New Jersey initiated regular two-way communications with its patrol cars, a major advance over previous one-way systems. The very high frequency system developed by radio engineer Frank A. Gunther and station operator Vincent J. Doyle placed transmitters in patrol cars to enable patrolmen to communicate with headquarters and other cars instead of just receiving calls. Two-way police radio became standard throughout the country following the success of the Bayonne system. The plaque can be viewed in Bayonne, New Jersey, in the municipal park at the corner of 26th St. and Ave. C.
In March 1933, the first two-way AM mobile radio was installed in a patrol car of the Bayonne Police Department. The system was designed by Lieutenant Vincent J. Doyle of the Bayonne Police and radio engineer Frank Gunther. Through the use of a combined transmitter and receiver in the patrol car, the two-way system allowed communication between patrol cars and with the police station. The Bayonne system was developed less than five years after the deployment of the one-way AM mobile-radio system by the Detroit Police Department. On April 7, 1928, the Detroit Police commenced regular one-way radio communication with its patrol cars, using a system developed by Patrolman Kenneth Cox and Robert L. Batts, an engineering student. This system proved the practicality of land-mobile radio for police work and led to its adoption throughout the country.
1934
Long-Range Shortwave Voice
Beginning 3 February 1934, Vice Admiral Richard E. Byrd's Antarctic expedition transmitted news releases to New York via short-wave radio voice equipment. From New York, the US nationwide CBS network broadcast the news releases to the public. Previous expeditions had been limited to dot-dash telegraphy, but innovative equipment from the newly formed Collins Radio Company made this long-range voice transmission feasible. With this expedition, voice transmissions over shortwave became a reality. Previous expeditions had used telegraph code for their shortwave messages. The change to voice was major and, consequently, during 1933 quality was lacking. Mr. Collins' careful attention to insulation, to the reduction of high circulating currents and to the reduction of stray RF fields helped give the voice transmissions a thumbs-up approval. "Clear as a bell" was the response of many. Other new ideas which were later widely used in the radio field were multiple pre-tuned radio-frequency bays, which allowed frequency changes much more quickly, and unitized construction, which gave the radio top-notch quality and reliability. A final important feature of the Type 20B transmitter was called Class B modulation. Its application to low- and medium-powered transmitters was virtually pioneered by Collins Radio Company and was used to produce large audio power from relatively small tubes.
1934
The Federal Communications Commission (FCC)
The FCC is an independent agency of the United States government, created by Congressional statute (see 47 U.S.C. § 151 and 47 U.S.C. § 154), with the majority of its commissioners appointed by the current President. The FCC works towards six goals in the areas of broadband, competition, the spectrum, the media, public safety and homeland security. The Commission is also in the process of modernizing itself. The FCC took over wire communication regulation from the Interstate Commerce Commission. The FCC's mandated jurisdiction covers the 50 states, the District of Columbia, and U.S. possessions. The FCC also provides varied degrees of cooperation, oversight, and leadership for similar communications bodies in other countries of North America. The FCC is funded entirely by regulatory fees. It has an estimated fiscal-2011 budget of US$335.8 million and a proposed fiscal-2012 budget of $354.2 million. It has 1,898 federal employees.
The FCC's mission, specified in section one of the Communications Act of 1934 and amended by the Telecommunications Act of 1996 (amendment to 47 U.S.C. §151), is to "make available so far as possible, to all the people of the United States, without discrimination on the basis of race, colour, religion, national origin, or sex, rapid, efficient, Nation-wide, and world-wide wire and radio communication services with adequate facilities at reasonable charges." The Act furthermore provides that the FCC was created "for the purpose of the national defence" and "for the purpose of promoting safety of life and property through the use of wire and radio communications."
Consistent with the objectives of the Act as well as the 1993 Government Performance and Results Act (GPRA), the FCC has identified six goals in its 2006-2011 Strategic Plan. These are:
Broadband: "All Americans should have affordable access to robust and reliable broadband products and services. Regulatory policies must promote technological neutrality, competition, investment, and innovation to ensure that broadband service providers have sufficient incentives to develop and offer such products and services."
Competition: "Competition in the provision of communication services, both domestically and overseas, supports the Nation's economy. The competitive framework for communications services should foster innovation and offer consumers reliable, meaningful choice in affordable services."
Spectrum: "Efficient and effective use of non-federal spectrum domestically and internationally promotes the growth and rapid development of innovative and efficient communication technologies and services."
Media: "The Nation's media regulations must promote competition and diversity and facilitate the transition to digital modes of delivery."
Public Safety and Homeland Security: "Communications during emergencies and crises must be available for public safety, health, defence, and emergency personnel, as well as all consumers in need. The Nation's critical communications infrastructure must be reliable, interoperable, redundant, and rapidly restorable."
Modernize the FCC: "The Commission shall strive to be a highly productive, adaptive, and innovative organization that maximizes the benefits to stakeholders, staff, and management from effective systems, processes, resources, and organizational culture."
The FCC is directed by five commissioners appointed by the U.S. president and confirmed by the U.S. Senate for five-year terms, except when filling an unexpired term. The president designates one of the commissioners to serve as chairman. Only three commissioners may be members of the same political party. None of them may have a financial interest in any FCC-related business.
1946
Electronic Numerical Integrator and Computer
A major advance in the history of computing occurred at the University of Pennsylvania in 1946 when engineers put the Electronic Numerical Integrator and Computer (ENIAC) into operation. Designed and constructed at the Moore School of Electrical Engineering under a U.S. Army contract during World War II, the ENIAC established the practicality of large-scale electronic digital computers and strongly influenced the development of the modern stored-program, general-purpose computer.
Press reports announced the unveiling of "an amazing machine that applies electronic speeds for the first time to mathematical tasks hitherto too difficult and cumbersome for solution... Leaders who saw the device in action for the first time," one report continued, "heralded it as a tool with which to begin to rebuild scientific affairs on new foundations." With those words, the world's first large-scale electronic general-purpose digital computer, developed at the Moore School of Electrical Engineering at the University of Pennsylvania in Philadelphia, emerged from the wraps of secrecy under which it had been constructed in the last years of World War II.
ENIAC (Electronic Numerical Integrator and Computer) was completed in 1945. Its first major job, finished in a little over two hours, would have required many years of labor to perform by conventional calculating methods. That task, assigned to the ENIAC on its first test run, involved many thousand computations connected with top-secret studies on thermonuclear reactions. While many projects were scrapped at the end of the war, the ENIAC had proven to be significant for military research at Los Alamos as well as the Ballistic Research Laboratory in neighboring Maryland.
ENIAC was built at the Moore School, under a contract with the U.S. Army. The machine contained more than 18,000 vacuum tubes, which were cooled through the use of eighty air blowers. It measured 8 feet high, 3 feet wide and almost one hundred feet long, filled a 30-by-50-foot room, and weighed thirty tons. In much the same way that the airplane expanded man's physical domain, this invention extended the capacity of human reason. In the brief span since its inception, individuals have explored our nearest planets, walked on the moon, and revolutionized the business, scientific and engineering worlds. It is a far cry from today's laptops and PDAs.
1947
Transistor: A transistor is a semiconductor device used to amplify and switch electronic signals and power. It is composed of a semiconductor material with at least three terminals for connection to an external circuit. A voltage or current applied to one pair of the transistor's terminals changes the current flowing through another pair of terminals. Because the controlled (output) power can be higher than the controlling (input) power, a transistor can amplify a signal. Today, some transistors are packaged individually, but many more are found embedded in integrated circuits. The transistor is the fundamental building block of modern electronic devices, and is ubiquitous in modern electronic systems. Following its development in the early 1950s, the transistor revolutionized the field of electronics and paved the way for smaller and cheaper radios, calculators, and computers, among other things.
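As a rough numerical illustration of the amplification described above (my sketch, not from the original text): in a bipolar transistor operating in its active region, the collector current is approximately the base current multiplied by a current gain beta, so a small input current controls a much larger output current. The beta value below is a typical textbook assumption.

```python
# Rough illustration of transistor current amplification: Ic ~ beta * Ib.
# beta = 100 is a typical small-signal textbook value, assumed here.
beta = 100          # current gain (assumed)
i_base = 20e-6      # 20 microamps of base (input) current
i_collector = beta * i_base
print(f"Collector current: {i_collector * 1e3:.1f} mA")  # 2.0 mA
```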
The thermionic triode, a vacuum tube invented in 1907, propelled the electronics age forward, enabling amplified radio technology and long-distance telephony. The triode, however, was a fragile device that consumed a lot of power. Physicist Julius Edgar Lilienfeld filed a patent for a field-effect transistor (FET) in Canada in 1925, which was intended to be a solid-state replacement for the triode. Lilienfeld also filed identical patents in the United States in 1926 and 1928. However, Lilienfeld did not publish any research articles about his devices, nor did his patents cite any specific examples of a working prototype. Since the production of high-quality semiconductor materials was still decades away, Lilienfeld's solid-state amplifier ideas would not have found practical use in the 1920s and 1930s, even if such a device were built. In 1934, German inventor Oskar Heil patented a similar device.
From November 17, 1947 to December 23, 1947, John Bardeen and Walter Brattain at AT&T's Bell Labs in the United States performed experiments and observed that when two gold point contacts were applied to a crystal of germanium, a signal was produced with the output power greater than the input.[8] Solid State Physics Group leader William Shockley saw the potential in this, and over the next few months worked to greatly expand the knowledge of semiconductors. The term transistor was coined by John R. Pierce as a portmanteau of the term "transfer resistor". According to Lillian Hoddeson and Vicki Daitch, authors of a biography of John Bardeen, Shockley had proposed that Bell Labs' first patent for a transistor should be based on the field effect and that he be named as the inventor. Having unearthed Lilienfeld's patents, which had gone into obscurity years earlier, lawyers at Bell Labs advised against Shockley's proposal, since the idea of a field-effect transistor which used an electric field as a "grid" was not new. Instead, what Bardeen, Brattain, and Shockley invented in 1947 was the first bipolar point-contact transistor.[6] In acknowledgement of this accomplishment, Shockley, Bardeen, and Brattain were jointly awarded the 1956 Nobel Prize in Physics "for their researches on semiconductors and their discovery of the transistor effect."
In 1948, the point-contact transistor was independently invented by German physicists Herbert Mataré and Heinrich Welker while working at the Compagnie des Freins et Signaux, a Westinghouse subsidiary located in Paris. Mataré had previous experience in developing crystal rectifiers from silicon and germanium in the German radar effort during World War II. Using this knowledge, he began researching the phenomenon of "interference" in 1947. By witnessing currents flowing through point-contacts, similar to what Bardeen and Brattain had accomplished earlier in December 1947, Mataré, by June 1948, was able to produce consistent results using samples of germanium produced by Welker. Realizing that Bell Labs' scientists had already invented the transistor before them, the company rushed to get its "transistron" into production for amplified use in France's telephone network. The first silicon transistor was produced by Texas Instruments in 1954. This was the work of Gordon Teal, an expert in growing crystals of high purity, who had previously worked at Bell Labs. The first MOS transistor actually built was by Kahng and Atalla at Bell Labs in 1960. The transistor is the key active component in practically all modern electronics.
Many consider it to be one of the greatest inventions of the 20th century. Its importance in today's society rests on its ability to be mass produced using a highly automated process (semiconductor device fabrication) that achieves astonishingly low per-transistor costs. The invention of the first transistor at Bell Labs was named an IEEE Milestone in 2009. Although several companies each produce over a billion individually packaged (known as discrete) transistors every year, the vast majority of transistors now are produced in integrated circuits (often shortened to IC, microchips or simply chips), along with diodes, resistors, capacitors and other electronic components, to produce complete electronic circuits. A logic gate consists of up to about twenty transistors whereas an advanced microprocessor, as of 2011, can use as many as 3 billion transistors (MOSFETs). "About 60 million transistors were built in 2002 ... for [each] man, woman, and child on Earth." The transistor's low cost, flexibility, and reliability have made it a ubiquitous device. Transistorized mechatronic circuits have replaced electromechanical devices in controlling appliances and machinery. It is often easier and cheaper to use a standard microcontroller and write a computer program to carry out a control function than to design an equivalent mechanical control function.
1947
Invention of the First Transistor at Bell Telephone Laboratories
At this site, in Building 1, Room 1E455, from 17 November to 23 December 1947, Walter H. Brattain and John A. Bardeen -- under the direction of William B. Shockley -- discovered the transistor effect, and developed and demonstrated a point-contact germanium transistor. This led directly to developments in solid-state devices that revolutionized the electronics industry and changed the way people around the world lived, learned, worked, and played. This was the beginning of solid-state electronics, which quickly reduced the size and power requirements of existing tube-based electronic devices. It revolutionized the electronics field and eventually ushered in the information age via small, low-power electronic devices and, eventually, low-cost integrated circuits.
It has been cited as the most important invention of the 20th century. It is hard to imagine any field of human endeavour where it has not had a profound positive impact: communications, computing, transportation, medicine, and more. The invention of the first working transistor set the stage for all subsequent solid-state device developments. Although Julius Edgar Lilienfeld successfully patented a number of field-effect transistor configurations starting in 1925 and into the 1930s, there is no indication that he ever produced any working models. If he did, they would not have worked very well due to the state of germanium and silicon crystal growth at that time. By 1947, advances in germanium semiconductors made it possible for Bardeen and Brattain to make the first working point-contact transistor.
1956
The First Submarine Transatlantic Telephone Cable System (TAT-1)
In September 1956, the first transatlantic undersea telephone system, TAT-1, went into service. This site is the eastern terminal of the transatlantic cable that stretched west to Clarenville, Newfoundland. TAT-1 was a great technological achievement, providing unparalleled reliability with fragile components in hostile environments. It was made possible through the efforts of engineers at AT&T Bell Laboratories and the British Post Office. The system operated until 1978. The plaques can be viewed in three locations: at 52 Cormack Dr., Clarenville, Newfoundland, Canada; at the Cape Breton Fossil Centre in Sydney Mines on Cape Breton Island, Canada; and in Gallanach Bay, in Oban, Scotland.
The first transatlantic telephone cable, TAT-1, inaugurated the modern era of global communications. Many of the basic concepts and processes developed for achieving highly reliable submarine infrastructure have not changed significantly from those used in TAT-1. Before TAT-1, voice was carried across the Atlantic on unreliable and expensive radio channels. Text messaging was carried on submarine telegraph cables (the technology of the previous 90 years), which were reliable but slow and expensive. Cooperation between North America and the United Kingdom to build an electrical bridge across the Atlantic went back over a century. After a period of failure and learning, the Great Eastern, the world's largest ship, laid the first permanent transatlantic link in 1866 under the leadership of Cyrus Field, and telegraph communication began. However, the communication capacity of the first transatlantic cable was very limited, while the demand for rapid communication continued to increase. Telegraph systems developed steadily over the years. Advances in materials and techniques, such as inductive loading, led to gradual increases in performance to the point that, in 1919, a study of deep-water submarine telephony began. In 1928 this work culminated in a proposal for a repeaterless cable bearing a single voice channel. Two considerations, however, killed the project: radio circuits were continuously improving, and the cost estimate was $15 million, a prohibitive price tag after the economic collapse that began in 1929.
A commercial radiotelegraph service, which began in 1908, had greatly contributed to transatlantic communication. Transatlantic long-wave and short-wave services had been established in 1927 and 1928, respectively. The first commercial voice link across the Atlantic, which was launched in 1927 with a single radiotelephone circuit, shed new light on the desirability of a transatlantic telephone cable. While radio circuits provided a voice service, the vagaries of sunspots and of seasonal and daily variations were never overcome entirely. Moreover, radio did not guarantee its users privacy and security. Recognition of the technical limitations of radio for transatlantic telephony led to studies of the feasibility of a North Atlantic submarine telephone cable. By the mid-1930s, electronic technology had advanced to the point where a submarine cable system with repeaters, electrical devices that would boost voice signals after they had reached the fading point along a circuit, became feasible. Since the repeaters had to have sufficiently long lives to operate with small likelihood of failure over a period of time, they were subject to rigid reliability requirements. Most fragile, however, were the vacuum tubes, which were the only means of amplification. Development of these tubes began in 1933, and they were continually tested for a period of eighteen years.
The North American side utilized the flexible repeater technology of the 1950 Havana-Key West cable, which adopted an earlier version of the TAT-1 repeater. The British Post Office had developed a single-repeater system and used it for shallow-water links in the 1940s. In 1953 the agreement for the first transatlantic telephone cable was signed. TAT-1 was a joint effort of AT&T Bell Laboratories, the British Post Office Engineering Department, and the Canadian Overseas Telecommunication Corporation. The design of the TAT-1 repeater provided a unique solution to the historic challenge of placing a telephone cable two and a half miles beneath the surface of the North Atlantic. The repeater was flexible, allowing it to be wound over a standard cable drum. It was eight feet long and had a diameter of 2.875 inches, tapering down to the cable width of 1.625 inches over twenty feet. The main Atlantic link, designed by the Bell System, called for two cables (one in each direction of transmission), which embodied one-way flexible repeaters at 37-mile intervals. H.M.T.S. Monarch, then the world's largest cable ship, laid the two cables in the summers of 1955 and 1956, respectively. The links ran from Clarenville, Newfoundland to Oban, Scotland. Each cable had fifty-one repeaters in a cable stretching over approximately 1950 nautical miles. Each repeater provided 65 dB of gain and 144 kHz of bandwidth around 164 kHz. Amplification in each repeater was made possible by means of three vacuum tubes, whose design, testing and manufacture set new standards of reliability. The vacuum tubes of the original TAT-1 never failed in twenty-two years of continuous service from 1956 to 1978. TAT-1 also included an overland portion and an underwater link. The Canadians provided an overland line-of-sight radio system from Nova Scotia to Montreal and to a point in Maine where the Bell System took over. Under the shallow waters of the Cabot Strait, British-pioneered two-way rigid repeaters allowed transmission from Newfoundland to the mainland through Sydney Mines, Nova Scotia over a single cable. TAT-1's initial service provided twenty-nine telephone circuits between London and New York, six circuits between London and Montreal, and a single circuit split among the three destinations for telegraph and other narrow-band applications.
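A quick back-of-envelope check of the figures above (my arithmetic, not from the original text): dividing the route length by the repeater count gives the average spacing, and multiplying the per-repeater gain by the count gives the total loss the chain could offset.

```python
# Back-of-envelope check on the TAT-1 numbers quoted above (my arithmetic).
route_nmi = 1950        # route length in nautical miles (from the text)
repeaters = 51          # repeaters per cable (from the text)
gain_db = 65            # gain per repeater (from the text)

spacing = route_nmi / repeaters
total_gain = repeaters * gain_db
print(f"Average spacing: {spacing:.1f} nautical miles per repeater")  # ~38.2
print(f"Total chain gain: {total_gain} dB")                           # 3315 dB
# ~38 nautical miles per span suggests the quoted "37-mile intervals"
# is measured in nautical miles.
```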
Over the fifty years since TAT-1 went into service, the capacity of telephone cables has grown explosively, from the initial thirty-six voice-band channels to modern broadband optical fibre systems. Today, single cables can support eight fibre pairs and carry in excess of eight terabits of capacity across the Atlantic and the Pacific Oceans, approximately four million times the number of voice circuits carried on TAT-1. With communications traffic travelling at the speed of light on undersea cable, optical or electrical, the delay encountered between end points across the ocean or across a city is barely noticeable; hence, there is little difference between a voice call to another continent and one within one's own city. The transmission capabilities of undersea optical fibre are crucial for linking computers on different continents. Whether surfing the internet, making a reservation or calling a friend in another country on another continent, all these services are made possible by the unique technologies deployed in modern global submarine cable systems, whose progenitor was TAT-1.
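A rough sanity check of the "four million times" claim (my arithmetic; the per-channel rate is the standard 64 kbit/s digital voice rate, assumed here):

```python
# Rough sanity check of the "four million times TAT-1" claim (my arithmetic).
# Assumes the standard 64 kbit/s rate for one digital voice circuit.
modern_capacity_bps = 8e12      # ~8 terabits/s per modern cable (from the text)
voice_circuit_bps = 64e3        # 64 kbit/s per voice channel (assumed)
tat1_circuits = 36              # TAT-1 voice-band channels (from the text)

equivalent_circuits = modern_capacity_bps / voice_circuit_bps
ratio = equivalent_circuits / tat1_circuits
print(f"Equivalent voice circuits: {equivalent_circuits:,.0f}")  # 125,000,000
print(f"Ratio to TAT-1: {ratio:,.0f}x")  # ~3,500,000x, a few million
```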
1958
Integrated Circuit
As with many inventions, two people had the idea for an integrated circuit at almost the same time. Transistors had become commonplace in everything from radios to phones to computers, and now manufacturers wanted something even better. Sure, transistors were smaller than vacuum tubes, but for some of the newest electronics they weren't small enough. There was a limit on how small each transistor could be made, since after it was made it had to be connected to wires and other electronics. The transistors were already at the limit of what steady hands and tiny tweezers could handle. So scientists wanted to make a whole circuit -- the transistors, the wires, everything else they needed -- in a single blow. If they could create a miniature circuit in just one step, all the parts could be made much smaller. One day in late July 1958, Jack Kilby was sitting alone at Texas Instruments. He had been hired only a couple of months earlier, and so he wasn't able to take vacation time when practically everyone else did. The halls were deserted, and he had lots of time to think. It suddenly occurred to him that all parts of a circuit, not just the transistor, could be made out of silicon. At the time, nobody was making capacitors or resistors out of semiconductors. If it could be done, then the entire circuit could be built out of a single crystal, making it smaller and much easier to produce. Kilby's boss liked the idea and told him to get to work. By September 12, Kilby had built a working model, and on February 6, 1959, Texas Instruments filed a patent. Their first "Solid Circuit", the size of a pencil point, was shown off for the first time in March. But over in California, another man had similar ideas. In January of 1959, Robert Noyce was working at the small Fairchild Semiconductor startup company. He also realized a whole circuit could be made on a single chip. While Kilby had hammered out the details of making individual components, Noyce thought of a much better way to connect the parts. That spring, Fairchild began a push to build what they called "unitary circuits", and they also applied for a patent on the idea. Knowing that TI had already filed a patent on something similar, Fairchild wrote out a highly detailed application, hoping that it wouldn't infringe on TI's similar device. All that detail paid off. On April 25, 1961, the patent office awarded the first patent for an integrated circuit to Robert Noyce, while Kilby's application was still being analyzed. Today, both men are acknowledged as having independently conceived of the idea.
The integrated circuit is the invention that enabled the modern electronics industry. Originally used in military applications, it quickly became the core of commercial and consumer electronics, and moved into medical equipment, household appliances, automobiles and even musical greeting cards. It is estimated that the average person encounters thousands of integrated circuits every day. Because of this invention, the electronics industry has grown from $29 billion in 1961 to $1,500 billion today. Among the remarkable things it has enabled are:
Space exploration;
Personal computers;
Cell phones;
Digital cameras;
Anti-locking brakes;
Cochlear implants that help the deaf to hear and cornea implants that help the blind to see;
Picture-perfect images for sonograms and medical diagnostics.
The invention of the integrated circuit won its inventor, Jack Kilby, the Nobel Prize in Physics in 2000, the National Medal of Science in 1970, and induction into the National Inventors Hall of Fame in 1982. This invention set in motion the technology that would enable the second industrial revolution, and its in-situ form made it possible for future generations of integrated circuits to become orders of magnitude smaller and more powerful. Today, the integrated circuit is the fundamental building block of all electronic equipment. The integrated circuit was the answer to a difficult technological problem known as the "tyranny of numbers." At the time, the recently invented transistor was inspiring engineers to design ever more complex electronic circuits and equipment containing hundreds or thousands of discrete components such as transistors, diodes, rectifiers and capacitors. But the problem was that these components still had to be interconnected to form electronic circuits, and hand-soldering thousands of components to thousands of bits of wire was expensive and time-consuming. It was also unreliable; every soldered joint was a potential source of trouble. The challenge was to find cost-effective, reliable ways of interconnecting these components and producing them. It wasn't until the invention of the integrated circuit by Jack Kilby that this could be done and electronic equipment could start its dramatic course of commercialization and miniaturization. In 1976, Kilby provided insight into his thinking by explaining, "Further thought led me to the conclusion that semiconductors were all that were really required -- that resistors and capacitors [passive devices], in particular, could be made from the same material as the active devices [transistors]. I also realized that, since all of the components could be made of a single material, they could also be made in situ, interconnected to form a complete circuit." The invention caused a lot of buzz and controversy in its first few years as it was shown at trade shows. Recognizing the need for a "demonstration product" to speed widespread use of the integrated circuit, TI management challenged Kilby to design a calculator as powerful as the large electro-mechanical desktop models of the day, but small enough to fit in a coat pocket. The resulting electronic hand-held calculator, of which Kilby is a co-inventor, successfully commercialized the integrated circuit.
1960
First Working Laser
On this site in May 1960 Theodore Maiman built and operated the first laser. A number of teams around the world were trying to construct this theoretically anticipated device from different materials. Maiman’s was based on a ruby rod optically pumped by a flash lamp. The laser was a transformative technology in the 20th century and continues to enjoy wide application in many fields of human endeavour. Theodore Maiman developed the first working laser at Hughes Research Lab in 1960 and his paper describing the operation of the first laser was published in Nature three months later. Since then, more than 55,000 patents involving the laser have been granted in the United States. Today, lasers are used in countless areas of modern life. Some examples include telecommunications, medical diagnostics and surgery, manufacturing, environmental sensing, basic scientific research, space exploration and entertainment. In the past, the IEEE has recognized the significance of the laser as being one of the key technical achievements of the 20th century.
Although there is some controversy over the proper credit for the "open cavity" design used by Maiman, there is complete consensus among both historians and the broader public (e.g., the National Inventors Hall of Fame) that, among all the teams seeking to build a laser based on the Schawlow-Townes paper, Maiman was the first to succeed (see Joan Lisa Bromberg, The Laser in America 1950-1970, MIT Press, Cambridge, Mass., 1991; pp. 86-92). While there were no lasers before Maiman's achievement, a predecessor of the laser, called the MASER, for "Microwave Amplification by Stimulated Emission of Radiation", was independently developed in 1954 at Columbia University by Charles Townes and coworkers and in Russia by Nicolay Basov and Alexsandr Prokhorov. Soon after the maser was demonstrated, Schawlow at Bell Labs and Townes began thinking about ways to make infrared or visible-light masers (called optical masers by Townes and Schawlow). While microwave cavities were well understood in the 1950s, it was not clear how one might make an optical cavity that incorporated gain. In 1957 Schawlow and Townes eventually realized the solution was to align two highly reflecting mirrors parallel to each other, forming a Fabry-Perot cavity, and to place the amplifying medium in between. Resonator side walls were not necessary as they were in the microwave case. They soon performed a detailed analysis of laser theory as well as requirements and published a seminal Physical Review paper in 1958. The acronym LASER follows the example of the MASER and stands for "Light Amplification by Stimulated Emission of Radiation". Gordon Gould, a graduate student at Columbia University working on optical and microwave spectroscopy, independently proposed the idea of a Fabry-Perot cavity and was the first to publicly use the term "laser".
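As a small numerical aside (mine, not from the original text): the resonant modes of the Fabry-Perot cavity mentioned above are spaced by the free spectral range c/2L, where L is the mirror separation. A minimal sketch, assuming an example cavity length of 10 cm:

```python
# Minimal sketch: mode spacing (free spectral range) of a Fabry-Perot
# cavity, FSR = c / (2 * L). The 10 cm cavity length is an assumed example.
c = 299_792_458.0   # speed of light, m/s
L = 0.10            # cavity length in metres (assumed)

fsr_hz = c / (2 * L)
print(f"Free spectral range: {fsr_hz / 1e9:.2f} GHz")  # ~1.50 GHz
```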
1969
UNIX (officially trademarked as UNIX, sometimes also written as Unix)
UNIX is a multitasking, multi-user computer operating system originally developed in 1969 by a group of AT&T employees at Bell Labs, including Ken Thompson, Dennis Ritchie, Brian Kernighan, Douglas McIlroy, Michael Lesk and Joe Ossanna. The Unix operating system was first developed in assembly language, but by 1973 had been almost entirely recoded in C, greatly facilitating its further development and porting to other hardware. Today's Unix system evolution is split into various branches, developed over time by AT&T as well as various commercial vendors, universities (such as the University of California, Berkeley's BSD), and non-profit organizations. The Open Group, an industry standards consortium, owns the UNIX trademark. Only systems fully compliant with and certified according to the Single UNIX Specification are qualified to use the trademark; others might be called Unix system-like or Unix-like, although the Open Group disapproves of this term. However, the term Unix is often used informally to denote any operating system that closely resembles the trademarked system. During the late 1970s and early 1980s, the influence of Unix in academic circles led to large-scale adoption of Unix (particularly of the BSD variant, originating from the University of California, Berkeley) by commercial startups, the most notable results of which are Solaris, HP-UX and AIX, as well as Darwin, which forms the core set of components upon which Apple's OS X, Apple TV, and iOS are based.[2][3] Today, in addition to certified Unix systems such as those already mentioned, Unix-like operating systems such as MINIX, Linux, Android, and BSD descendants (FreeBSD, NetBSD, OpenBSD, and DragonFly BSD) are commonly encountered. The term traditional Unix may be used to describe an operating system that has the characteristics of either Version 7 Unix or UNIX System V.
1970
The World's First Low-Loss Optical Fibre for Telecommunications
Donald Keck developed a highly pure optical glass that effectively transmitted light signals over long distances. This astounding medium, which is thinner than a human hair, revolutionized global communications. By 2011, the world depended upon the continuous transmission of voice, data, and video along more than 1.6 billion kilometres of optical fibre installed around the globe.
During the mid-1960s, members of the British Post Office came to Corning seeking assistance in creating pure glass fibre optics. Their design required a single-mode fibre (100 micron diameter with a 0.75 micron core) having a total attenuation of about 20 dB/km. The very best bulk optical glasses of the day had attenuations of approximately 1,000 dB/km; this meant Corning's scientists had to achieve an improvement in transparency of 10^98 in order to reach the 20 dB/km goal. It seemed impossible, but they did it, inventing an optical fibre with an attenuation of 17 dB/km. As a result, Corning's invention of the first low-loss optical fibre, and the manufacturing process used to produce it, revolutionized the telecommunications industry and changed the world forever. The explosion of the Internet and other information technologies would not have been possible without optical fibre. Only optical fibre provides the bandwidth required for the high-speed transmission of voice, data, and video the world depends upon for the way we live, work, and play. Today, there are more than 1.6 billion kilometres of fibre installed around the globe.
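To see where that transparency figure comes from (my arithmetic): going from 1,000 dB/km to 20 dB/km reduces the loss over one kilometre by 980 dB, and a dB difference converts to a power ratio of 10^(dB/10).

```python
# Where the 10^98 transparency figure comes from (my arithmetic):
# a change in loss of X dB corresponds to a power ratio of 10**(X/10).
old_loss_db_per_km = 1000.0   # best bulk optical glass of the day
goal_db_per_km = 20.0         # the British Post Office target

improvement_db = old_loss_db_per_km - goal_db_per_km   # 980 dB per km
exponent = improvement_db / 10
print(f"Required transparency improvement: 10^{exponent:.0f}")
# -> 10^98 per kilometre of fibre
```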
This breakthrough work established the optical fibre category. There were no similar achievements at the time of the invention. In recognition of this achievement, the three scientists responsible for inventing low-loss optical fibre– Dr. Robert Maurer, Dr. Peter Schultz, and Dr. Donald Keck – have been inducted into the Inventors Hall of Fame and were awarded the National Medal of Technology.
Optical communication systems date back to the 1790s, to the optical semaphore telegraph invented by French inventor Claude Chappe. In 1880, Alexander Graham Bell patented an optical telephone system, which he called the Photophone. However, his earlier invention, the telephone, was more practical and took tangible shape. The Photophone remained an experimental invention and never materialized. During the 1920s, John Logie Baird in England and Clarence W. Hansell in the United States patented the idea of using arrays of hollow pipes or transparent rods to transmit images for television or facsimile systems.
In 1954, Dutch scientist Abraham Van Heel and British scientist Harold H. Hopkins separately wrote papers on imaging bundles. Hopkins reported on imaging bundles of unclad fibres, whereas Van Heel reported on simple bundles of clad fibres. Van Heel covered a bare fibre with a transparent cladding of a lower refractive index. This protected the fibre reflection surface from outside distortion and greatly reduced interference between fibres.
Abraham Van Heel is also notable for another contribution. Stimulated by a conversation with the American optical physicist Brian O'Brien, Van Heel made the crucial innovation of cladding fibre-optic cables. All earlier fibres were bare and lacked any form of cladding, with total internal reflection occurring at a glass-air interface. Abraham Van Heel covered a bare fibre of glass or plastic with a transparent cladding of lower refractive index. This protected the total-reflection surface from contamination and greatly reduced crosstalk between fibres. By 1960, glass-clad fibres had an attenuation of about 1 decibel (dB) per metre, fine for medical imaging but much too high for communications. In 1961, Elias Snitzer of American Optical published a theoretical description of a fibre with a core so small it could carry light with only one waveguide mode. Snitzer's proposal was acceptable for a medical instrument looking inside the human body, but the fibre had a light loss of 1 dB per metre. Communication devices needed to operate over much longer distances and required a light loss of no more than 10 or 20 dB per kilometre. By 1964, a critical theoretical specification was identified by Dr. Charles K. Kao for long-range communication devices: the 10 or 20 dB of light loss per kilometre standard. Dr. Kao also illustrated the need for a purer form of glass to help reduce light loss.
In the summer of 1970, one team of researchers began experimenting with fused silica, a material capable of extreme purity, with a high melting point and a low refractive index. Corning Glass researchers Robert Maurer, Donald Keck, and Peter Schultz invented fibre-optic wire, or "optical waveguide fibres" (patent no. 3,711,262), capable of carrying 65,000 times more information than copper wire, through which information carried by a pattern of light waves could be decoded at a destination even a thousand miles away. The team had solved the decibel-loss problem presented by Dr. Kao, developing a single-mode fibre (SMF) with a loss of 17 dB/km at 633 nm by doping titanium into the fibre core. By June of 1972, Robert Maurer, Donald Keck, and Peter Schultz had invented multimode germanium-doped fibre with a loss of 4 dB per kilometre and much greater strength than titanium-doped fibre. By 1973, John MacChesney had developed a modified chemical vapour-deposition process for fibre manufacture at Bell Labs. This process spearheaded the commercial manufacture of fibre-optic cable. In April 1977, General Telephone and Electronics tested and deployed the world's first live telephone traffic through a fibre-optic system, running at 6 Mbps, in Long Beach, California. They were soon followed by Bell in May 1977, with an optical telephone communication system installed in the downtown Chicago area, covering a distance of 1.5 miles (2.4 kilometres). Each optical-fibre pair carried the equivalent of 672 voice channels and was equivalent to a DS3 circuit. Today more than 80 percent of the world's long-distance voice and data traffic is carried over optical-fibre cables.
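A small illustrative calculation (mine, not from the original text): given a loss budget in dB and a fibre's attenuation in dB/km, the maximum unrepeated span is simply the budget divided by the attenuation, which shows why dropping from 17 dB/km to 4 dB/km mattered so much. The 30 dB budget below is an assumed example figure.

```python
# Illustrative span calculation (my example): maximum unrepeated distance
# is the available loss budget divided by the fibre attenuation.
loss_budget_db = 30.0   # assumed link budget between transmitter and receiver

for atten_db_per_km in (17.0, 4.0):   # 1970 titanium- vs 1972 germanium-doped
    max_km = loss_budget_db / atten_db_per_km
    print(f"{atten_db_per_km:>4} dB/km -> max span ~{max_km:.1f} km")
# 17 dB/km -> ~1.8 km; 4 dB/km -> ~7.5 km
```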
1971
Microprocessor
It all began in 1971, when Intel invented the microprocessor. Or perhaps more accurately, they invented the term "microprocessor." An earlier 8-bit chip, the Four-Phase Systems AL1, had been invented by Lee Boysel in 1969 as part of a multi-chip CPU. But it wasn't called a "microprocessor" until a court case in the 1990s, when it was demonstrated that the AL1 could function as the core of a computer. For all practical purposes, however, the PC age began with Intel's first microprocessor, which by itself contained as much processing power as the most powerful computer that had existed in the world at the time: the ENIAC, which filled an entire room. The world's first commercially viable microprocessor was dubbed the Intel 4004. It was quickly succeeded less than a year later by the 8008, which was twice as powerful. In 1978, Intel released the 16-bit 8086 processor. The 8088, also a 16-bit chip, followed less than a year later. The 8088 incorporated technologies designed to make it backward-compatible with the 8-bit chips that were still in wide use at the time. IBM chose the 8088 chip to power their original Personal Computer. And so it was that IBM, Intel, and a little start-up company called Microsoft brought computing to the masses. In 1985, Intel released the i386 processor. The 386 was the first commercially available 32-bit microprocessor. For the first time, it made multitasking (running more than one program at a time) possible on desktop computers. The i486 added an onboard math coprocessor, improved data transfer, and an onboard memory cache, all of which were stunning advances in technology in that era. The Intel Pentium processor, released in 1993, was the first commercially available microprocessor capable of executing two instructions for every clock cycle. More recent releases in the Pentium line have revolutionized everything from the way data is moved about on the chips to the way that multimedia content is handled.
1975
Radio telephone system
Inventors: Martin Cooper, Richard W. Dronsuth, Albert J. Mikulski, Charles N. Lynk Jr., James J. Mikulski, John F. Mitchell, Roy A. Richardson, John H. Sangster
Dr Martin Cooper, a former general manager for the systems division at Motorola, is considered the inventor of the first modern portable handset. Cooper made the first call on a portable cell phone in April 1973, to his rival Joel Engel, head of research at Bell Labs. Bell Laboratories had introduced the idea of cellular communications in 1947 with police-car radio technology, but Motorola was the first to incorporate the technology into a portable device designed for use outside an automobile. Cooper and his co-inventors are listed above.
By 1977, AT&T and Bell Labs had constructed a prototype cellular system. A year later, public trials of the new system began in Chicago with over 2,000 trial customers. In 1979, in a separate venture, the first commercial cellular telephone system began operation in Tokyo. In 1981, Motorola and American Radio Telephone started a second U.S. cellular radio-telephone system test in the Washington/Baltimore area. By 1982, the slow-moving FCC finally authorized commercial cellular service for the USA. A year later, the first American commercial analogue cellular service, AMPS (Advanced Mobile Phone Service), was made available in Chicago by Ameritech.
Despite the incredible demand, it took 37 years from the 1947 concept for cellular phone service to become commercially available in the United States. Consumer demand quickly outstripped the 1982 system standards; by 1987, cellular telephone subscribers exceeded one million and the airwaves were crowded.
Three ways of improving service existed:
One - increase the frequency allocation
Two - split existing cells
Three - improve the technology
The FCC did not want to hand out any more bandwidth, and building or splitting cells would have been expensive and would have added bulk to the network. To stimulate the growth of new technology, the FCC declared in 1987 that cellular licensees could employ alternative cellular technologies in the 800 MHz band. The cellular industry then began to research new transmission technologies as an alternative.
1980
1G
1G (or 1-G) refers to the first generation of wireless telephone technology for mobile telecommunications. These are the analogue telecommunications standards introduced in the 1980s and used until they were replaced by 2G digital telecommunications. The main difference between the two succeeding mobile telephone systems, 1G and 2G, is that the radio signals 1G networks use are analogue, while 2G networks are digital. Although both systems use digital signalling to connect the radio towers (which listen to the handsets) to the rest of the telephone system, the voice itself during a call is encoded to digital signals in 2G, whereas in 1G it is only modulated to a higher frequency, typically 150 MHz and up. One such standard is NMT (Nordic Mobile Telephone), used in the Nordic countries, Switzerland, the Netherlands, Eastern Europe and Russia. Others include AMPS (Advanced Mobile Phone System) used in North America and Australia,[1] TACS (Total Access Communications System) in the United Kingdom, C-450 in West Germany, Portugal and South Africa, Radiocom 2000[2] in France, and RTMI in Italy. In Japan there were multiple systems: three standards, TZ-801, TZ-802, and TZ-803, were developed by NTT, while a competing system operated by DDI used the JTACS (Japan Total Access Communications System) standard. 1G speeds varied between that of a 28k modem (28 kbit/s) and a 56k modem (56 kbit/s),[3] meaning actual download speeds of roughly 2.9 KB/s to 5.6 KB/s. The antecedent to 1G technology is the mobile radio telephone, or 0G. The first commercially automated cellular network (the 1G generation) was launched in Japan by NTT (Nippon Telegraph and Telephone) in 1979, initially in the metropolitan area of Tokyo. Within five years, the NTT network had been expanded to cover the whole population of Japan and became the first nationwide 1G network. In 1981, this was followed by the simultaneous launch of the Nordic Mobile Telephone (NMT) system in Denmark, Finland, Norway and Sweden; NMT was the first mobile phone network featuring international roaming. The first 1G network launched in the USA was Chicago-based Ameritech in 1983, using the Motorola DynaTAC mobile phone. Several countries then followed in the early-to-mid 1980s, including the UK, Mexico and Canada.
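As a quick sanity check on those figures, the sketch below (illustrative only, not from the source) converts a raw link rate in kbit/s to kilobytes per second; real-world 1G throughput fell somewhat below the raw conversion because of protocol overhead.

    # Convert a raw link rate in kbit/s to KB/s (8 bits per byte).
    # Actual 1G throughput was lower still because of protocol overhead.
    def kbps_to_kilobytes_per_sec(kbps):
        return kbps / 8

    for kbps in (28, 56):
        print(f"{kbps} kbit/s = {kbps_to_kilobytes_per_sec(kbps):.1f} KB/s before overhead")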
1982
Forced divestiture of AT&T
The Bell System divestiture, or the breakup of AT&T, was initiated by the U.S. Department of Justice's filing of an antitrust lawsuit against AT&T in 1974. The case, United States v. AT&T, led to a settlement finalized on January 8, 1982, under which the Bell System agreed to divest its local exchange service operating companies in return for a chance to go into the computer business with AT&T Computer Systems. Effective January 1, 1984, AT&T's local operations were split into seven independent Regional Holding Companies, also known as Regional Bell Operating Companies (RBOCs), or "Baby Bells". Afterwards, AT&T, reduced in value by approximately 70%, continued to operate all of its long-distance services, although in the ensuing years it lost portions of its market share to competitors such as MCI and Sprint.
Regional Bell Operating Companies (RBOCs)
Bell Atlantic – (acquired GTE in 2000 and changed its name to Verizon)
BellSouth – (acquired by AT&T Inc. in 2006)
NYNEX – (acquired by Bell Atlantic in 1996, now part of Verizon)
Pacific Telesis – (acquired by SBC in 1997, now part of AT&T Inc.)
Southwestern Bell – (changed its name to SBC in 1995; acquired AT&T Corp. in 2005 and changed its name to AT&T Inc.)
U S WEST – (acquired by Qwest in 2000, which in turn was acquired by CenturyLink in 2011)
Non-BOC Bell System members
The only difference between these two incumbent local exchange carriers (ILECs) and the seven divested Baby Bells (RBOCs) was that AT&T owned only a minority interest in these ILECs as opposed to owning them outright before the breakup. Both were monopolies in their coverage areas much like the RBOCs.
Cincinnati Bell is the only remaining former Bell System member not owned by a Baby Bell; it covers the Greater Cincinnati area.
SNET, the other non-RBOC Bell System member, was acquired by SBC Communications in 1998 and rebranded as AT&T in 2005; it covered Connecticut.
Effects
The breakup led to a surge of competition in the long-distance telecommunications market from companies such as Sprint and MCI. AT&T Computer Systems, AT&T's gambit in exchange for its divestiture, failed. After spinning off its manufacturing operations (most notably Western Electric, which became Lucent, now Alcatel-Lucent) and shedding misguided acquisitions such as NCR and AT&T Broadband, AT&T was left with only its core business, with roots in AT&T Long Lines and its successor AT&T Communications. It was at this point that AT&T was purchased by one of its own spin-offs, SBC Communications, which had started as Southwestern Bell and proceeded to buy two other RBOCs, a former AT&T-associated operating company, AT&T itself, and then another RBOC.
One consequence of the breakup was that local residential service rates, formerly subsidized by long-distance revenues, were forced to rise faster than the rate of inflation. Long-distance rates, meanwhile, fell, due both to the end of this subsidy and to increased competition. The FCC established a system of access charges under which long-distance networks paid the more expensive local networks both to originate and to terminate a call; in this way, the implicit subsidies of Ma Bell became explicit post-divestiture. These access charges became a source of strong controversy as one company after another sought to arbitrage the network and avoid the fees. In 2002, the FCC declared that Internet service providers would be treated as if they were local and would not have to pay access charges. This led VoIP service providers to argue that they did not have to pay access charges either, resulting in significant savings for VoIP calls. The FCC has since been split on this issue: VoIP services that use IP but in every other way look like a normal phone call generally have to pay access charges, while VoIP services that look more like applications on the Internet and do not interconnect with the public telephone network do not.

Another consequence of the divestiture was in how both national broadcast television networks (ABC, NBC, CBS, PBS) and radio networks (NPR, Mutual, ABC Radio) distributed their programming to their local affiliate or member stations. Prior to the breakup, the broadcast networks relied on AT&T Long Lines' infrastructure of terrestrial microwave relay, coaxial cable, and, for radio, broadcast-quality leased-line networks to deliver their programming to local stations. By the mid-1970s, however, the then-new technology of satellite distribution offered by companies such as RCA Astro Electronics and Western Union, with their respective Satcom 1 and Westar 1 satellites, began to give the Bell System competition in the broadcast distribution field, the satellites providing higher video and audio quality as well as much lower transmission costs. The networks nevertheless stayed with AT&T (while simulcasting their feeds via satellite from the late 1970s to the early 1980s), both because some stations were not yet equipped with earth-station receiving equipment to receive the networks' satellite feeds and because of the networks' contractual obligations with AT&T. When those contracts ended with the breakup in 1984, the networks immediately switched to satellite exclusively, drawn by the much cheaper transmission rates offered by satellite operators, who were not bound by the high tariffs AT&T set for broadcast customers, and prompted by the split of the Bell System into separate RBOCs.
1981
IBM Personal Computer
The IBM Personal Computer, commonly known as the IBM PC, is the original version and progenitor of the IBM PC compatible hardware platform. It is IBM model number 5150, and was introduced on August 12, 1981. It was created by a team of engineers and designers under the direction of Don Estridge of the IBM Entry Systems Division in Boca Raton, Florida. Alongside "microcomputer" and "home computer", the term "personal computer" was already in use before 1981; it was used as early as 1972 to characterize Xerox PARC's Alto. However, because of the success of the IBM Personal Computer, the term "PC" came to mean more specifically a microcomputer compatible with IBM's PC products.
Type: Personal computer
Release date: August 12, 1981
Discontinued: April 2, 1987
Operating systems: IBM BASIC / PC-DOS 1.0, CP/M-86, UCSD p-System
CPU: Intel 8088 @ 4.77 MHz
Memory: 16 KB to 256 KB
Sound: 1-channel PWM
DOS (English pronunciation: /dɒs/), short for "Disk Operating System", is an umbrella term for several closely related operating systems that dominated the IBM PC compatible market between 1981 and 1995, or until about 2000 if one includes the partially DOS-based Microsoft Windows versions 95, 98, and Millennium Edition.
Related systems include MS-DOS, PC-DOS, DR-DOS, FreeDOS, PTS-DOS, ROM-DOS, Novell DOS, OpenDOS and several others. In spite of the common usage, none of these systems was simply named "DOS" (a name given only to an unrelated IBM mainframe operating system in the 1960s). A number of unrelated, non-x86 microcomputer disk operating systems also had "DOS" in their names and are often referred to simply as "DOS" when discussing machines that use them (e.g. AmigaDOS, AMSDOS, ANDOS, Apple DOS, Atari DOS, Commodore DOS, CSI-DOS, ProDOS, and TRS-DOS). While providing many of the same operating-system functions for their respective computer systems, programs running under any one of these operating systems would not run under the others.
1984
Macintosh
The Macintosh (/ˈmækɨntɒʃ/ MAK-in-tosh), or Mac, is a series of personal computers (PCs) designed, developed, and marketed by Apple Inc. The first Macintosh was introduced by Apple's then-chairman Steve Jobs on January 24, 1984; it was the first commercially successful personal computer to feature a mouse and a graphical user interface rather than a command-line interface. The company continued to have success through the second half of the 1980s, primarily because sales of the Apple II series remained strong even after the introduction of the Macintosh, only to see that success dissipate in the 1990s as the personal computer market shifted toward the "Wintel" platform: IBM PC compatible machines running MS-DOS and Microsoft Windows. In 1998, Apple consolidated its multiple consumer-level desktop models into the iMac all-in-one, which proved a sales success and revitalized the Macintosh brand. Current Mac systems are mainly targeted at the home, education, and creative professional markets. These include the descendants of the original iMac, the entry-level Mac mini desktop, the Mac Pro tower graphics workstation, and the MacBook Air and MacBook Pro laptops. The Xserve server was discontinued on January 31, 2011. Production of the Mac is based on a vertical integration model: Apple facilitates all aspects of its hardware and creates its own operating system, which is pre-installed on all Mac computers. This is in contrast to most IBM PC compatibles, where multiple vendors create and integrate hardware intended to run another company's operating software. Apple exclusively produces Mac hardware, choosing internal systems, designs, and prices, though it does use third-party components such as graphics subsystems from nVidia and ATI. Current Mac CPUs use Intel's x86-64 architecture; the earliest models (1984–1994) used Motorola's 68k, and models from 1994 to 2006 used the AIM alliance's PowerPC. Apple also develops the Mac's operating system, Mac OS X, currently on version 10.7 "Lion". The modern Mac, like other personal computers, is capable of running alternative operating systems such as Linux, FreeBSD, and, in the case of Intel-based Macs, Microsoft Windows. However, Apple does not license Mac OS X for use on non-Apple computers.
The first commercial cellular phone, the Motorola DynaTAC 8000X, was also made available in 1984, back when phones were the size and weight of a car battery and more expensive than the average colour TV. Early cellular technology raised health concerns: cellular phones give off electromagnetic energy, a non-ionizing form of radiation similar to the radiation produced during thunderstorms.
1990
2G
In the 1990s, the 'second generation' (2G) mobile phone systems emerged. Two systems competed for supremacy in the global market: the European-developed GSM standard and the U.S.-developed CDMA standard. These differed from the previous generation by using digital instead of analogue transmission, and also by fast out-of-band phone-to-network signalling. The rise in mobile phone usage as a result of 2G was explosive, and this era also saw the advent of prepaid mobile phones.
In 1991 the first GSM network (Radiolinja) launched in Finland. In general, the frequencies used by 2G systems in Europe were higher than those in America, though with some overlap. For example, the 900 MHz frequency range was used for both 1G and 2G systems in Europe, so the 1G systems were rapidly closed down to make space for the 2G systems. In America, the IS-54 standard was deployed in the same band as AMPS and displaced some of the existing analogue channels.
Coinciding with the introduction of 2G systems was a trend away from the larger "brick" phones toward tiny 100–200 g hand-held devices. This change was possible not only through technological improvements such as more advanced batteries and more energy-efficient electronics, but also because of the higher density of cell sites needed to accommodate increasing usage; the shorter average transmission distance from phone to base station meant longer battery life whilst on the move. The second generation introduced a new variant of communication called SMS, or text messaging. It was initially available only on GSM networks but eventually spread to all digital networks. The first machine-generated SMS message was sent in the UK on 3 December 1992, followed in 1993 by the first person-to-person SMS, sent in Finland. The advent of prepaid services in the late 1990s soon made SMS the communication method of choice amongst the young, a trend which spread across all ages. 2G also introduced the ability to access media content on mobile phones. In 1998 the first downloadable content sold to mobile phones was the ring tone, launched by Finland's Radiolinja (now Elisa). Advertising on the mobile phone first appeared in Finland when a free daily SMS news headline service, sponsored by advertising, was launched in 2000. Mobile payments were trialled in 1998 in Finland and Sweden, where a mobile phone was used to pay for a Coca-Cola vending machine and for car parking; commercial launches followed in Norway in 1999. The first commercial payment system to mimic banks and credit cards was launched in the Philippines in 1999, simultaneously, by mobile operators Globe and Smart. The first full Internet service on mobile phones was introduced by NTT DoCoMo in Japan in 1999.
1995
Internet
The history of the Internet began with the development of computers in the 1950s. This began with point-to-point communication between mainframe computers and terminals, expanded to point-to-point connections between computers, and then to early research into packet switching. Packet-switched networks such as the ARPANET, the Mark I at NPL in the UK, CYCLADES, Merit Network, Tymnet, and Telenet were developed in the late 1960s and early 1970s using a variety of protocols. The ARPANET in particular led to the development of protocols for internetworking, by which multiple separate networks could be joined into a network of networks. In 1982 the Internet Protocol Suite (TCP/IP) was standardized, and the concept of a worldwide network of fully interconnected TCP/IP networks, called the Internet, was introduced. Access to the ARPANET was expanded in 1981 when the National Science Foundation (NSF) developed the Computer Science Network (CSNET), and again in 1986 when NSFNET provided access to supercomputer sites in the United States for research and education organizations. Commercial Internet service providers (ISPs) began to emerge in the late 1980s and 1990s. The ARPANET was decommissioned in 1990. The Internet was commercialized in 1995 when NSFNET was decommissioned, removing the last restrictions on the use of the Internet to carry commercial traffic. Since the mid-1990s the Internet has had a drastic impact on culture and commerce, including the rise of near-instant communication by electronic mail, instant messaging, Voice over Internet Protocol (VoIP) "phone calls", two-way interactive video calls, and the World Wide Web with its discussion forums, blogs, social networking, and online shopping sites. The research and education community continues to develop and use advanced networks such as NSF's very high speed Backbone Network Service (vBNS), Internet2, and National LambdaRail. Increasing amounts of data are transmitted at higher and higher speeds over fibre-optic networks operating at 1 Gbit/s, 10 Gbit/s, or more. The Internet continues to grow, driven by ever greater amounts of online information and knowledge, commerce, entertainment and social networking.
2000
3G
3G, or 3rd generation mobile telecommunications, is a generation of standards for mobile phones and mobile telecommunication services fulfilling the International Mobile Telecommunications-2000 (IMT-2000) specifications of the International Telecommunication Union.[1] Application services include wide-area wireless voice telephony, mobile Internet access, video calls and mobile TV, all in a mobile environment.
Several telecommunications companies market wireless mobile Internet services as 3G, indicating that the advertised service is provided over a 3G wireless network. Services advertised as 3G are required to meet IMT-2000 technical standards, including standards for reliability and speed (data transfer rates). To meet the IMT-2000 standards, a system is required to provide peak data rates of at least 200 kbit/s (about 0.2 Mbit/s). However, many services advertised as 3G provide higher speeds than the minimum technical requirements for a 3G service. Recent 3G releases, often denoted 3.5G and 3.75G, also provide mobile broadband access of several Mbit/s to smartphones and to mobile modems in laptop computers.
The following standards are typically branded 3G:

The UMTS system, first offered in 2001, standardized by 3GPP, used primarily in Europe, Japan, China (with a different radio interface) and other regions dominated by GSM 2G infrastructure. The cell phones are typically UMTS/GSM hybrids. Several radio interfaces are offered, sharing the same infrastructure:
The original and most widespread radio interface is called W-CDMA.
The TD-SCDMA radio interface was commercialised in 2009 and is offered only in China.
The latest UMTS release, HSPA+, can provide theoretical peak data rates of up to 56 Mbit/s in the downlink (28 Mbit/s in existing services) and 22 Mbit/s in the uplink.

The CDMA2000 system, first offered in 2002, standardized by 3GPP2, used especially in North America and South Korea, sharing infrastructure with the IS-95 2G standard. The cell phones are typically CDMA2000/IS-95 hybrids. The latest release, EV-DO Rev. B, offers peak rates of 14.7 Mbit/s downstream.
The above systems and radio interfaces are based on spread spectrum radio transmission technology. While the GSM EDGE standard ("2.9G"), DECT cordless phones and Mobile WiMAX standards formally also fulfill the IMT-2000 requirements and are approved as 3G standards by ITU, these are typically not branded 3G, and are based on completely different technologies.
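To make these data rates concrete, here is a minimal sketch (an illustration added here, not from the source) that computes idealized download times for a 5 MB file at the peak rates quoted above; real-world throughput is lower due to contention, signalling and protocol overhead.

    # Idealized download time for a 5 MB file at the peak rates quoted above.
    FILE_BITS = 5 * 8 * 10**6  # 5 megabytes expressed in bits

    peak_rates_kbps = {
        "IMT-2000 minimum (200 kbit/s)": 200,
        "EV-DO Rev. B (14.7 Mbit/s)": 14_700,
        "HSPA+ downlink (56 Mbit/s)": 56_000,
    }

    for name, kbps in peak_rates_kbps.items():
        seconds = FILE_BITS / (kbps * 1000)
        print(f"{name}: {seconds:.1f} s")

At the IMT-2000 floor the file takes over three minutes, while at the HSPA+ peak it takes under a second, which is why the 3.5G/3.75G releases were marketed as mobile broadband.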
A new generation of cellular standards has appeared approximately every ten years since 1G systems were introduced in 1981/1982. Each generation is characterized by new frequency bands, higher data rates and non-backwards-compatible transmission technology. The first release of the 3GPP Long Term Evolution (LTE) standard does not completely fulfill the ITU 4G requirements, called IMT-Advanced. First-release LTE is not backwards compatible with 3G; it is a pre-4G or 3.9G technology, though sometimes branded "4G" by service providers. Its evolution, LTE Advanced, is a 4G technology. WiMAX is another technology verging on, or marketed as, 4G.
2009
4G
In telecommunications, 4G is the fourth generation of cell phone mobile communications standards, a successor to the third-generation (3G) standards. A 4G system provides mobile ultra-broadband Internet access, for example to laptops with USB wireless modems, to smartphones, and to other mobile devices. Conceivable applications include improved mobile web access, IP telephony, gaming services, high-definition mobile TV, video conferencing and 3D television.
Two 4G candidate systems are commercially deployed: the Mobile WiMAX standard (first in South Korea in 2006) and the first-release Long Term Evolution (LTE) standard (in Scandinavia since 2009). It has, however, been debated whether these first-release versions should be considered 4G at all. In the U.S., Sprint Nextel has deployed Mobile WiMAX networks since 2008, and MetroPCS was the first operator to offer LTE service, in 2010. USB wireless modems have been available since the start, while WiMAX smartphones have been available since 2010 and LTE smartphones since 2011. Equipment made for different continents is not always compatible because of differing frequency bands. Mobile WiMAX and LTE smartphones are currently (April 2012) not available for the European market.