With the breakup of the Bell System, Bell Labs became a subsidiary of AT&T Technologies in 1984, which resulted in a drastic decline in its funding. In 1996, AT&T spun off AT&T Technologies, renamed Lucent Technologies, with headquarters at the Murray Hill site. Bell Laboratories was split, with AT&T retaining parts as AT&T Laboratories. In 2006, Lucent merged with the French telecommunications company Alcatel to form Alcatel-Lucent, which was acquired by Nokia in 2016.
Origin and historical locations.
Bell's personal research after the telephone.
In 1880, when the French government awarded Alexander Graham Bell the Volta Prize of 50,000 francs for the invention of the telephone (equivalent to about US$10,000 at the time), he used the award to fund the Volta Laboratory (also known as the "Alexander Graham Bell Laboratory") in Washington, D.C., in collaboration with Sumner Tainter and Bell's cousin Chichester Bell. The laboratory was variously known as the "Volta Bureau", the "Bell Carriage House", the "Bell Laboratory" and the "Volta Laboratory".
It focused on the analysis, recording, and transmission of sound. Bell used his considerable profits from the laboratory to fund further research and education advancing the diffusion of knowledge relating to the deaf. This resulted in the founding of the Volta Bureau at the Washington, D.C. home of his father, linguist Alexander Melville Bell. The carriage house there, at 1527 35th Street N.W., became their headquarters in 1889.
In 1893, Bell constructed a new building close by at 1537 35th Street N.W., specifically to house the lab. This building was declared a National Historic Landmark in 1972.
After the invention of the telephone, Bell maintained a relatively distant role with the Bell System as a whole, but continued to pursue his own personal research interests.
Early antecedent.
The Bell Patent Association was formed by Alexander Graham Bell, Thomas Sanders, and Gardiner Hubbard when filing the first patents for the telephone in 1876.
Bell Telephone Company, the first telephone company, was formed a year later. It later became a part of the American Bell Telephone Company.
In 1884, the American Bell Telephone Company created the Mechanical Department from the Electrical and Patent Department formed a year earlier.
The American Telephone and Telegraph Company, originally a subsidiary of American Bell, took control of American Bell and the Bell System by 1899.
American Bell held a controlling interest in Western Electric, the manufacturing arm of the business, while AT&T conducted research for the service providers.
Formal organization and location changes.
In 1896, Western Electric bought property at 463 West Street to centralize the manufacturers and engineers who had been supplying AT&T with such technology as telephones, telephone exchange switches, and transmission equipment.
During the early 20th century, several historically significant laboratories were established. In 1915, the first radio transmissions were made from a shack in Montauk, Long Island. That same year, tests were performed on the first transoceanic radio telephone at a house in Arlington County, Virginia. A radio reception laboratory was established in 1919 in the Cliffwood section of Aberdeen Township, New Jersey. Also in 1919, a transmission-studies site was established in Phoenixville, Pennsylvania; in 1929 it built the coaxial conductor line used for the first tests of long-distance transmission at various frequencies.
On January 1, 1925, Bell Telephone Laboratories, Inc. was organized to consolidate the development and research activities in the communication field and allied sciences for the Bell System. Ownership was evenly shared between Western Electric and AT&T. The new company had 3600 engineers, scientists, and support staff. Its space was expanded with a new building occupying about one quarter of a city block.
The first chairman of the board of directors was John J. Carty, AT&T's vice president, and the first president was Frank B. Jewett, also a board member, who stayed there until 1940. The operations were directed by E. B. Craft, executive vice-president, and formerly chief engineer at Western Electric.
In the early 1920s, several outdoor facilities and radio communications development facilities were established. In 1925, test plot studies were established at Gulfport, Mississippi, where numerous telephone pole samples were installed for wood preservation. At the Deal, New Jersey location, work was done on ship-to-shore radio telephony. In 1926, in the Whippany section of Hanover Township, New Jersey, land was acquired for the development of a 50-kilowatt broadcast transmitter; in 1931, the Whippany site was enlarged with land added from a nearby property. In 1928, a site in Chester Township, New Jersey, was leased for outdoor tests, though the facility proved inadequate for such purposes, and in 1930 additional land was purchased there for a new outdoor plant development laboratory. Before Chester was established, a test plot similar to the one at Gulfport was installed at Limon, Colorado, in 1929. The three test plots at Gulfport, Limon, and Chester were outdoor facilities for studying preservatives and prolonging the service life of telephone poles. Also in 1929, the Deal labs were enlarged with additional land for radio transmission studies.
The beginning of the 1930s saw three facilities established for radio communications experiments and chemical testing. By 1939, the Summit, New Jersey chemical laboratory had been operating for nearly ten years in a three-story building, conducting corrosion experiments and testing various fungicides on cables, metallic components, and wood. In 1929, land was purchased in Holmdel Township, New Jersey, for a radio reception laboratory to replace the Cliffwood location that had been in operation since 1919; Cliffwood ended its operations in 1930 as Holmdel was established. Also in 1930, a location in Mendham Township, New Jersey, was established to continue radio receiver development farther from the Whippany location and to eliminate transmitter interference with that work. The Mendham location worked on communication equipment and broadcast receivers for marine, aircraft, and police services, and also performed precision frequency measurements, field-strength measurements, and radio interference studies.
By the early 1940s, Bell Labs engineers and scientists had begun to move to other locations away from the congestion and environmental distractions of New York City, and in 1967 Bell Laboratories headquarters was officially relocated to Murray Hill, New Jersey.
Among the later Bell Laboratories locations in New Jersey were Holmdel Township, Crawford Hill, the Deal Test Site, Freehold, Lincroft, Long Branch, Middletown, Neptune Township, Princeton, Piscataway, Red Bank, Chester Township, and Whippany. Of these, Murray Hill and Crawford Hill remain in existence (the Piscataway and Red Bank locations were transferred to and are now operated by Telcordia Technologies and the Whippany site was purchased by Bayer).
The company's largest concentration of employees, about 11,000 prior to 2001, was at Naperville-Lisle, Illinois, in the Chicago area. There also were groups of employees in Indianapolis, Indiana; Columbus, Ohio; North Andover, Massachusetts; Allentown, Pennsylvania; Reading, Pennsylvania; Breinigsville, Pennsylvania; Burlington, North Carolina (1950s–1970s, moved to Greensboro in the 1980s); and Westminster, Colorado. Since 2001, many of the former locations have been scaled down or closed.
Bell's Holmdel research and development laboratory was closed in 2007. The mirrored-glass building was designed by Eero Saarinen. In August 2013, Somerset Development bought the building, intending to redevelop it into a mixed commercial and residential project. A 2012 article expressed doubt about the success of the newly named Bell Works site, but several large tenants had announced plans to move in through 2016 and 2017.
List of Bell Labs (1974).
Bell Labs' 1974 corporate directory listed 22 labs in the United States, located in:
List of Bell Labs (2024).
Nokia Bell Labs' 2024 website pictured 10 labs, located in:
Also listed as research locations, without additional information, were Sunnyvale, California, US, and Tampere, Finland.
The Naperville, Illinois Bell Labs location near Chicago was considered the Chicago Innovation Center and hosted Nokia's second annual Algorithm World event in 2022.
Discoveries and developments.
Bell Laboratories was, and is, regarded by many as the premier research facility of its type, developing a wide range of revolutionary technologies, including radio astronomy, the transistor, the laser, information theory, the operating system Unix, the programming languages C and C++, solar cells, the charge-coupled device (CCD), and many other optical, wireless, and wired communications technologies and systems.
1920s.
In 1924, Bell Labs physicist Walter A. Shewhart proposed the control chart as a method to determine when a process was in a state of statistical control. Shewhart's methods were the basis for statistical process control (SPC): the use of statistically based tools and techniques to manage and improve processes. This was the origin of the modern quality control movement, including Six Sigma.
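Shewhart's scheme can be illustrated with a short sketch (the values and function names below are illustrative, not Bell Labs code): control limits are set three standard deviations around the mean of an in-control baseline, and later points falling outside those limits signal an assignable cause.

```python
import statistics

def control_limits(baseline, sigma_level=3):
    """Limits for an individuals chart from an in-control baseline:
    center line +/- sigma_level standard deviations (Shewhart's 3-sigma rule)."""
    mean = statistics.fmean(baseline)
    sd = statistics.stdev(baseline)
    return mean - sigma_level * sd, mean + sigma_level * sd

def out_of_control(points, lcl, ucl):
    """Points outside the limits suggest the process is not in statistical control."""
    return [x for x in points if not lcl <= x <= ucl]

# Baseline measurements taken while the process is believed stable
baseline = [10.0, 10.2, 9.8, 10.1, 9.9, 10.0, 10.1, 9.9]
lcl, ucl = control_limits(baseline)
flagged = out_of_control([10.1, 9.7, 10.9], lcl, ucl)   # 10.9 lies outside
```

In practice the limits are computed from a "Phase I" baseline and then applied to new observations, as sketched here.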
In 1926, the laboratories invented an early synchronous-sound motion picture system, in competition with Fox Movietone and DeForest Phonofilm.
In 1927, a Bell team headed by Herbert E. Ives successfully transmitted long-distance 128-line television images of Secretary of Commerce Herbert Hoover from Washington to New York. In 1928 the thermal noise in a resistor was first measured by John B. Johnson, for which Harry Nyquist provided the theoretical analysis; this is now termed "Johnson-Nyquist noise". During the 1920s, the one-time pad cipher was invented by Gilbert Vernam and Joseph Mauborgne at the laboratories. Bell Labs' Claude Shannon later proved that it is unbreakable.
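The one-time pad itself is simple to sketch: each message byte is XORed with a truly random key byte that is as long as the message and used only once, which are exactly the conditions of Shannon's perfect-secrecy proof. The function name below is illustrative.

```python
import secrets

def otp_xor(data: bytes, key: bytes) -> bytes:
    """One-time pad: XOR each byte with a key byte. The key must be truly
    random, at least as long as the message, and never reused."""
    assert len(key) >= len(data)
    return bytes(d ^ k for d, k in zip(data, key))

message = b"attack at dawn"
key = secrets.token_bytes(len(message))   # fresh random key per message
ciphertext = otp_xor(message, key)
recovered = otp_xor(ciphertext, key)      # XOR is its own inverse
```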
In 1928, Harold Black invented the negative feedback system commonly used in amplifiers. Later, Harry Nyquist analyzed Black's design rule for negative feedback. This work was published in 1932 and became known as the Nyquist criterion.
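Black's idea can be shown numerically: with feedback fraction β, the closed-loop gain is A/(1 + Aβ), which for large loop gain approaches 1/β and is almost insensitive to drift in the open-loop gain A. A minimal sketch with illustrative values:

```python
def closed_loop_gain(a_open, beta):
    """Gain of a negative-feedback amplifier with open-loop gain a_open
    and feedback fraction beta: A / (1 + A*beta)."""
    return a_open / (1 + a_open * beta)

# Even if the open-loop gain drops by half, the closed-loop gain
# barely moves -- the key benefit Black's invention provided.
g_nominal = closed_loop_gain(100_000, 0.1)
g_degraded = closed_loop_gain(50_000, 0.1)
```

Here a 50% change in open-loop gain shifts the closed-loop gain by roughly 0.01%, illustrating why feedback amplifiers made long telephone lines practical.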
1930s.
In 1931, a foundation for radio astronomy was laid by Karl Jansky during his work investigating the origins of static on long-distance shortwave communications. He discovered that radio waves were being emitted from the center of the galaxy.
In 1931 and 1932, the labs made experimental high fidelity, long playing, and even stereophonic recordings of the Philadelphia Orchestra, conducted by Leopold Stokowski.
In 1933, stereo signals were transmitted live from Philadelphia to Washington, D.C.
In 1937, the vocoder, an electronic speech compression device, or codec, and the Voder, the first electronic speech synthesizer, were developed and demonstrated by Homer Dudley, the Voder being demonstrated at the 1939 New York World's Fair. Bell researcher Clinton Davisson shared the Nobel Prize in Physics with George Paget Thomson for the discovery of electron diffraction, which helped lay the foundation for solid-state electronics.
1940s.
In the early 1940s, the photovoltaic cell was developed by Russell Ohl. In 1943, Bell developed SIGSALY, the first digital scrambled speech transmission system, used by the Allies in World War II. The British wartime codebreaker Alan Turing visited the labs at this time, working on speech encryption and meeting Claude Shannon.
Bell Labs' Quality Assurance Department gave the world such statisticians as Walter A. Shewhart, W. Edwards Deming, Harold F. Dodge, George D. Edwards, Harry Romig, R. L. Jones, Paul Olmstead, E.G.D. Paterson, and Mary N. Torrey. During World War II, the Emergency Technical Committee – Quality Control, drawn mainly from Bell Labs statisticians, was instrumental in advancing Army and Navy ammunition acceptance and material sampling procedures.
In 1947, the transistor, arguably the most important invention developed by Bell Laboratories, was invented by John Bardeen, Walter Houser Brattain, and William Bradford Shockley (who subsequently shared the Nobel Prize in Physics in 1956). In 1947, Richard Hamming invented Hamming codes for error detection and correction. For patent reasons, the result was not published until 1950.
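A Hamming(7,4) code, the classic instance of Hamming's construction, can be sketched briefly: three parity bits protect four data bits, and the syndrome directly names the position of a single flipped bit. This is an illustrative implementation, not Hamming's original listing.

```python
def hamming74_encode(d):
    """Encode 4 data bits into a 7-bit codeword with parity bits
    at positions 1, 2, and 4 (1-indexed)."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct up to one flipped bit, then return the 4 data bits.
    The syndrome bits spell out the 1-indexed error position."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
received = hamming74_encode(data)
received[4] ^= 1                     # simulate a single-bit transmission error
decoded = hamming74_decode(received)
```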
In 1948, "A Mathematical Theory of Communication", one of the founding works in information theory, was published by Claude Shannon in the "Bell System Technical Journal". It built in part on earlier work in the field by Bell researchers Harry Nyquist and Ralph Hartley, but went much further. Bell Labs also introduced a series of increasingly complex calculators through the decade. Shannon was also the founder of modern cryptography with his 1949 paper "Communication Theory of Secrecy Systems".
1950s.
The 1950s also saw developments based upon information theory. The central development was binary code systems. Efforts concentrated on the prime mission of supporting the Bell System with engineering advances, including the N-carrier system, TD microwave radio relay, direct distance dialing, E-repeater, wire spring relay, and the Number Five Crossbar Switching System.
In 1952, William Gardner Pfann revealed the method of zone melting, which enabled semiconductor purification and level doping.
In 1953, Maurice Karnaugh developed the Karnaugh map, used for simplifying Boolean algebraic expressions.
In January 1954, Bell Labs built one of the first fully transistorized computers, TRADIC (also known as Flyable TRADIC), for the United States Air Force, using 10,358 germanium point-contact diodes and 684 Bell Labs Type 1734 Type A cartridge transistors. The design team was led by electrical engineer Jean Howard Felker, with James R. Harris and Louis C. Brown ("Charlie Brown") as the lead engineers on the project, which started in 1951. The device occupied only 3 cubic feet and consumed just 100 watts, making it far smaller and lower-powered than the vacuum-tube designs of the time. It could be installed in a B-52 Stratofortress bomber and performed up to one million logical operations per second. The flyable version was programmed with a punched Mylar sheet instead of a removable plugboard.
In 1954, the first modern solar cell was invented at Bell Laboratories.
In 1955, Carl Frosch and Lincoln Derick discovered semiconductor surface passivation by silicon dioxide.
In 1956 TAT-1, the first transatlantic communications cable to carry telephone conversations, was laid between Scotland and Newfoundland in a joint effort by AT&T, Bell Laboratories, and British and Canadian telephone companies.
In 1957, Max Mathews created MUSIC, one of the first computer programs to play electronic music. Robert C. Prim and Joseph Kruskal developed new greedy algorithms that revolutionized computer network design.
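Prim's algorithm grows a minimum spanning tree one cheapest frontier edge at a time. The compact rendering below uses a binary heap; it is an illustrative modern sketch, not Prim's original formulation.

```python
import heapq

def prim_mst(n, edges):
    """Total weight of a minimum spanning tree of a connected undirected
    graph. `edges` is a list of (u, v, w); vertices are 0..n-1."""
    adj = [[] for _ in range(n)]
    for u, v, w in edges:
        adj[u].append((w, v))
        adj[v].append((w, u))
    seen = [False] * n
    heap = [(0, 0)]          # (cost to attach vertex, vertex); start at 0
    total = 0
    while heap:
        w, u = heapq.heappop(heap)
        if seen[u]:
            continue         # already attached via a cheaper edge
        seen[u] = True
        total += w
        for wv, v in adj[u]:
            if not seen[v]:
                heapq.heappush(heap, (wv, v))
    return total

# Square 0-1-2-3 with one diagonal; the MST picks edges of weight 1, 1, 2
edges = [(0, 1, 1), (1, 2, 2), (2, 3, 1), (3, 0, 4), (0, 2, 3)]
mst_weight = prim_mst(4, edges)
```

Greedy MST algorithms like this one underlie minimum-cost network design, which is exactly the problem that motivated Prim and Kruskal at Bell Labs.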
In 1957, Frosch and Derick, using masking and predeposition, were able to manufacture silicon dioxide field-effect transistors, the first planar transistors, in which drain and source were adjacent at the same surface. They showed that silicon dioxide insulated and protected silicon wafers and prevented dopants from diffusing into the wafer.
In 1958, a technical paper by Arthur Schawlow and Charles Hard Townes first described the laser.
Following Frosch and Derick's research, Mohamed Atalla and Dawon Kahng proposed a silicon MOS transistor in 1959 and successfully demonstrated a working MOS device with their Bell Labs team in 1960. Their team included E. E. LaBate and E. I. Povilonis, who fabricated the device; M. O. Thurston, L. A. D'Asaro, and J. R. Ligenza, who developed the diffusion processes; and H. K. Gummel and R. Lindner, who characterized the device.
K. E. Daburlos and H. J. Patterson of Bell Laboratories continued the work of C. Frosch and L. Derick and developed a process similar to Hoerni's planar process at about the same time.
J.R. Ligenza and W.G. Spitzer studied the mechanism of thermally grown oxides, fabricated a high quality Si/SiO2 stack and published their results in 1960.
1960s.
On October 1, 1960, the Kwajalein Field Station was announced as a location for the Nike Zeus test program; R. W. Benfer, the program's first director, arrived shortly afterward, on October 5. Bell Labs designed many of the major system elements and conducted fundamental investigations of phase-controlled scanning antenna arrays.
In December 1960, Ali Javan, a PhD physicist from the University of Tehran, with help from Rolf Seebach and his associates William Bennett and Donald Herriott, successfully operated the first gas laser, the first continuous-light laser, operating with unprecedented accuracy and color purity.
In 1962, the electret microphone was invented by Gerhard M. Sessler and James E. West. Also in 1962, John R. Pierce's vision of communications satellites was realized by the launch of Telstar.
On July 10, 1962, the Telstar spacecraft, designed and built by Bell Laboratories, was launched into orbit by NASA. The first worldwide television broadcast followed on July 23, 1962, with a press conference by President Kennedy.
In the spring of 1964, Bell Laboratories planned an electronic switching systems center near Naperville, Illinois. The building, named Indian Hill in 1966, was occupied by development staff from the former electronic switching organization at Holmdel and the Systems Equipment Engineering organization, together with engineers from the Western Electric Hawthorne Works. About 1,200 people were scheduled to work there when it was completed in 1966; the site peaked at 11,000 employees before the Lucent Technologies downsizing of October 2001.
In 1964, the carbon dioxide laser was invented by Kumar Patel, and the operation of the continuous Nd:YAG laser was demonstrated by Joseph E. Geusic "et al." Experiments by Myriam Sarachik provided the first data that confirmed the Kondo effect. The research of Philip W. Anderson into the electronic structure of magnetic and disordered systems led to improved understanding of metals and insulators, for which he was awarded the Nobel Prize for Physics in 1977.
In 1965, Arno Penzias and Robert Wilson discovered the cosmic microwave background, for which they were awarded the Nobel Prize in Physics in 1978.
Frank W. Sinden, Edward E. Zajac, Ken Knowlton, and A. Michael Noll made computer-animated movies during the early to mid-1960s. Ken Knowlton invented the computer animation language BEFLIX. The first digital computer art was created in 1962 by Noll.
In 1966, orthogonal frequency-division multiplexing (OFDM), a key technology in wireless services, was developed and patented by R. W. Chang.
In December 1966, the New York City site was sold and became the Westbeth Artists Community complex.
In 1968, molecular beam epitaxy was developed by J.R. Arthur and A.Y. Cho; molecular beam epitaxy allows semiconductor chips and laser matrices to be manufactured one atomic layer at a time.
In 1969, Dennis Ritchie and Ken Thompson created the computer operating system UNIX for the support of telecommunication switching systems as well as general-purpose computing. Also, in 1969, the charge-coupled device (CCD) was invented by Willard Boyle and George E. Smith, for which they were awarded the Nobel Prize in Physics in 2009.
From 1969 to 1971, Aaron Marcus, the first graphic designer involved with computer graphics, researched, designed, and programmed a prototype interactive page-layout system for the Picturephone.
1970s.
The 1970s and 1980s saw more and more computer-related inventions at the Bell Laboratories as part of the personal computing revolution.
In the 1970s, major central office technology evolved from crossbar electromechanical relay-based technology and discrete transistor logic to Bell Labs-developed thick film hybrid and transistor–transistor logic (TTL), stored program-controlled switching systems; 1A/#4 TOLL Electronic Switching Systems (ESS) and 2A Local Central Offices produced at the Bell Labs Naperville and Western Electric Lisle, Illinois facilities. This technology evolution dramatically reduced floor space needs. The new ESS also came with its own diagnostic software that needed only a switchman and several frame technicians to maintain.
About 1970, the coax-22 cable was developed by Bell Labs. This coaxial cable with 22 strands allowed a total capacity of 132,000 telephone calls. Previously, a 12-strand coaxial cable was used for L-carrier systems. Both types of cable were manufactured at Western Electric's Baltimore Works facility on machines designed by a Western Electric senior development engineer.
In 1970, A. Michael Noll invented a tactile, force-feedback system, coupled with interactive stereoscopic computer display.
In 1971, an improved task priority system for computerized telephone exchange switching systems for telephone traffic was invented by Erna Schneider Hoover, who received one of the first software patents for it.
In 1972, Dennis Ritchie developed the compiled programming language C as a replacement for the interpreted language B; C was then used to rewrite the UNIX operating system. Also, the language AWK was designed and implemented by Alfred Aho, Peter Weinberger, and Brian Kernighan of Bell Laboratories. Also in 1972, Marc Rochkind invented the Source Code Control System.
In 1976, optical fiber systems were first tested in Georgia.
Production of their first internally designed microprocessor, the BELLMAC-8, began in 1977. In 1980 they demonstrated the first single-chip 32-bit microprocessor, the Bellmac 32A, which went into production in 1982.
In 1978, the proprietary operating system Oryx/Pecos was developed from scratch by Bell Labs in order to run AT&T's large-scale PBX switching equipment. It was first used with AT&T's flagship System 75, and until very recently was used in all variations up through and including Definity G3 (Generic 3) switches, now manufactured by Avaya.
1980s.
During the 1980s, the operating system Plan 9 from Bell Labs was developed extending the UNIX model. Also, the Radiodrum, an electronic music instrument played in three space dimensions, was invented.
In 1980, the TDMA digital cellular telephone technology was patented.
In late 1981, the Blit, a graphics terminal for the UNIX operating system designed by Rob Pike and Bart Locanthi Jr. and known inside the Bell Labs Research organization as the Jerq, came into internal use. It was a programmable bitmap graphics terminal with multiple layers of overlapping windows, operated by a keyboard and a distinctive red three-button mouse. It was later sold commercially as the AT&T 5620 DMD terminal. The Blit used the Motorola 68000 microprocessor, whereas the Teletype/AT&T 5620 Dot Mapped Display terminal used the Western Electric WE32000 microprocessor.
The Bell Labs Fellows Award was launched in 1982 to recognize and honor scientists and engineers who have made outstanding and sustained R&D contributions at AT&T. As of the 2021 inductees, 336 people have received the honor.
Ken Thompson and Dennis Ritchie were also named Bell Labs Fellows in 1982. Ritchie had started at Bell Labs in 1967, in the Computer Systems Research department; Thompson had started in 1966. Decades later, the two co-inventors of the UNIX operating system and the C language were also awarded the 2011 Japan Prize for Information and Communications.
In 1982, the fractional quantum Hall effect was discovered by Horst Störmer and former Bell Laboratories researchers Robert B. Laughlin and Daniel C. Tsui; they subsequently won a Nobel Prize in 1998 for the discovery.
In 1984, the first photoconductive antennas for picosecond electromagnetic radiation were demonstrated by Auston and others. This type of antenna became an important component in terahertz time-domain spectroscopy. In 1984, Karmarkar's algorithm for linear programming was developed by mathematician Narendra Karmarkar. Also in 1984, a divestiture agreement signed in 1982 with the American Federal government forced the breakup of AT&T, and Bellcore (now iconectiv) was split off from Bell Laboratories to provide the same R&D functions for the newly created local exchange carriers. AT&T also was limited to using the Bell trademark only in association with Bell Laboratories. "Bell Telephone Laboratories, Inc." became a wholly owned company of the new AT&T Technologies unit, the former Western Electric. The 5ESS Switch was developed during this transition.
In February 1985, the National Medal of Technology was awarded to Bell Labs, the first corporation to achieve this honor, "For contribution over decades to modern communication systems".
In 1985, laser cooling was used to slow and manipulate atoms by Steven Chu and his team. The same year, the modeling language AMPL ("A Mathematical Programming Language") was developed by Robert Fourer, David M. Gay, and Brian Kernighan at Bell Laboratories.
In 1985, the programming language C++ had its first commercial release. Bjarne Stroustrup started developing C++ at Bell Laboratories in 1979 as an extension to the original C language.
Arthur Ashkin invented optical tweezers that grab particles, atoms, viruses and other living cells with their laser beam fingers. A major breakthrough came in 1987, when Ashkin used the tweezers to capture living bacteria without harming them. He immediately began studying biological systems using the optical tweezers, which are now widely used to investigate the machinery of life. He was awarded the Nobel Prize in Physics (2018) for his work involving optical tweezers and their application to biological systems.
In the mid-1980s, the Transmission System departments of Bell Labs developed highly reliable long-haul fiber-optic communications systems based on SONET, and network operations techniques, that enabled very high volume, near-instantaneous communications across the North American continent. Fail-safe and disaster-related traffic management operations systems enhanced the usefulness of the fiber optics. There was a synergy between the land-based and sea-based fiber-optic systems, although they were developed by different divisions within the company. These systems are still in use throughout the U.S. today.
Charles A. Burrus became a Bell Labs Fellow in 1988 for his work as a member of the Technical Staff. Prior to that, he received the AT&T Bell Laboratories Distinguished Technical Staff Award in 1982. Burrus started in 1955 at the Holmdel Bell Labs location and retired in 1996, consulting for Lucent Technologies until 2002.
In 1988, TAT-8 became the first transatlantic fiber-optic cable. Bell Labs in Freehold, NJ developed the 1.3-micron fiber, cable, splicing, laser detector, and 280 Mbit/s repeater for 40,000 telephone-call capacity.
In the late 1980s, realizing that voiceband modems were approaching the Shannon limit on bit rate, Richard D. Gitlin, Jean-Jacques Werner, and their colleagues pioneered a major breakthrough by inventing DSL (digital subscriber line) and creating the technology that enabled megabit transmission on installed copper telephone lines, thus facilitating the broadband era.
1990s.
Bell Labs' John Mayo received the National Medal of Technology in 1990.
In May 1990, Ronald Snare was named AT&T Bell Laboratories Fellow, for "Singular contributions to the development of the common-channel signaling network and the signal transfer points globally." This system began service in the United States in 1978.
In the early 1990s, approaches to increase modem speeds to 56K were explored at Bell Labs, and early patents were filed in 1992 by Ender Ayanoglu, Nuri R. Dagdeviren and their colleagues.
In 1992, the scientist W. Lincoln Hawkins received the National Medal of Technology for work done at Bell Labs.
In 1992, Jack Salz, Jack Winters and Richard D. Gitlin provided the foundational technology to demonstrate that adaptive antenna arrays at the transmitter and receiver can substantially increase both the reliability (via diversity) and capacity (via spatial multiplexing) of wireless systems without expanding the bandwidth. Subsequently, the BLAST system proposed by Gerard Foschini and colleagues dramatically expanded the capacity of wireless systems. This technology, known today as MIMO (Multiple Input Multiple Output), was a significant factor in the standardization, commercialization, performance improvement, and growth of cellular and wireless LAN systems.
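The capacity gain from spatial multiplexing can be illustrated with the standard MIMO capacity formula C = log2 det(I + (SNR/n_t) H Hᴴ). The 2x2 helper below is an illustrative sketch assuming equal power per transmit antenna and channel knowledge at the receiver; the function name is not from any Bell Labs system.

```python
from math import log2

def mimo_capacity_2x2(h, snr):
    """Capacity (bits/s/Hz) of a 2x2 MIMO link with channel matrix h
    (two rows of complex gains): C = log2 det(I + (snr/2) * H H^H)."""
    # G = H H^H, a 2x2 Hermitian matrix
    g = [[sum(h[i][k] * h[j][k].conjugate() for k in range(2))
          for j in range(2)] for i in range(2)]
    s = snr / 2                       # equal power split across 2 antennas
    # Determinant of I + s*G for a 2x2 matrix, expanded directly
    det = (1 + s * g[0][0]) * (1 + s * g[1][1]) - (s * g[0][1]) * (s * g[1][0])
    return log2(det.real)

# Two parallel, non-interfering paths vs. a single antenna at the same SNR
c_mimo = mimo_capacity_2x2([[1, 0], [0, 1]], snr=10)
c_siso = log2(1 + 10)
```

With a well-conditioned channel, the 2x2 link carries markedly more bits per hertz than a single-antenna link at the same total SNR, which is the effect BLAST exploited.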
Amos Joel in 1993 received the National Medal of Technology.
Two AT&T Bell Labs scientists, Joel Engel and Richard Frenkiel, were honored with the National Medal of Technology, in 1994.
In 1994, the quantum cascade laser was invented by Federico Capasso, Alfred Cho, Jerome Faist and their collaborators. Also in 1994, Peter Shor devised his quantum factorization algorithm.
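The classical half of Shor's algorithm can be sketched briefly: given the multiplicative order r of a base a modulo n (the part a quantum computer finds efficiently, brute-forced here), an even r with a^(r/2) not congruent to -1 yields a factor via a gcd. Function names are illustrative.

```python
from math import gcd

def multiplicative_order(a, n):
    """Smallest r with a**r = 1 (mod n). A quantum computer finds this
    efficiently; this classical loop works only for tiny n."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n, a):
    """Classical reduction in Shor's algorithm: derive a factor of n from
    the order of a. Returns None when this choice of a fails."""
    g = gcd(a, n)
    if g != 1:
        return g                 # lucky: a already shares a factor with n
    r = multiplicative_order(a, n)
    if r % 2:
        return None              # odd order: try another a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None              # trivial square root: try another a
    return gcd(y - 1, n)         # nontrivial factor of n

factor = shor_factor(15, 7)      # the textbook example: 15 = 3 * 5
```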
In 1996, SCALPEL electron lithography, which prints features atoms wide on microchips, was invented by Lloyd Harriott and his team. The operating system Inferno, an update of Plan 9, was created by Dennis Ritchie with others, using the then-new concurrent programming language Limbo. A high performance database engine (Dali) was developed which became DataBlitz in its product form.
In 1996, AT&T spun off Bell Laboratories, along with most of its equipment manufacturing business, into a new company named Lucent Technologies. AT&T retained a small number of researchers who made up the staff of the newly created AT&T Labs.
Lucy Sanders was the third woman to receive the Bell Labs Fellow award, in 1996, for her work in creating a RISC chip that allowed more phone calls to be handled in software and hardware on a single server. She had started in 1977 and was one of the few women engineers at Bell Labs.
In November 1997, Lucent planned a Bell Laboratories location at Yokosuka Research Park in Yokosuka, Japan, for developing a third-generation Wideband Code Division Multiple Access (W-CDMA) cellular system.
In 1997, the smallest then-practical transistor (60 nanometers, 182 atoms wide) was built. In 1998, the first optical router was invented.
Rudolph Kazarinov and Federico Capasso received the optoelectronics Rank Prize on December 8, 1998.
In December 1998, Ritchie and Thompson also were honored with the National Medal of Technology for their work at Bell Labs. The award was presented by U.S. President William Clinton in a 1999 White House ceremony.
21st century.
2000 was an active year for the Laboratories, in which DNA machine prototypes were developed; a progressive geometry compression algorithm made widespread 3-D communication practical; the first electrically powered organic laser was invented; a large-scale map of cosmic dark matter was compiled; and F-15, an organic material that makes plastic transistors possible, was invented.
In 2002, physicist Jan Hendrik Schön was fired after his work was found to contain fraudulent data. It was the first known case of fraud at Bell Labs.
In 2003, the New Jersey Institute of Technology Biomedical Engineering Laboratory was created at Murray Hill, New Jersey.
In 2004, Lucent Technologies awarded two women the prestigious Bell Labs Fellow Award. Magaly Spector, a director in the INS/Network Systems Group, was cited for "sustained and exceptional scientific and technological contributions in solid-state physics, III-V material for semiconductor lasers, Gallium Arsenide integrated circuits, and the quality and reliability of products used in high speed optical transport systems for next generation high bandwidth communication." Eve Varma, a technical manager in the MNS/Network Systems Group, was cited for "sustained contributions to digital and optical networking, including architecture, synchronization, restoration, standards, operations and control."
In 2005, Jeong H. Kim, former President of Lucent's Optical Network Group, returned from academia to become the President of Bell Laboratories.
In April 2006, Bell Laboratories' parent company, Lucent Technologies, signed a merger agreement with Alcatel. On December 1, 2006, the merged company, Alcatel-Lucent, began operations. This deal raised concerns in the United States, where Bell Laboratories works on defense contracts. A separate company, LGS Innovations, with an American board was set up to manage Bell Laboratories' and Lucent's sensitive U.S. government contracts. In March 2019, LGS Innovations was purchased by CACI.
In December 2007, it was announced that the former Lucent Bell Laboratories and the former Alcatel Research and Innovation would be merged into one organization under the name of Bell Laboratories. This marked the first period of growth after many years in which Bell Laboratories progressively lost staff to layoffs and spin-offs.
In February 2008, Alcatel-Lucent continued the Bell Laboratories tradition of recognizing outstanding technical contributors. Martin J. Glapa, former Chief Technical Officer of Lucent's Cable Communications Business Unit and Director of Advanced Technologies, was presented with the 2006 Bell Labs Fellow Award by Alcatel-Lucent Bell Labs President Jeong H. Kim, for work in network architecture, network planning, and professional services, with particular focus on cable TV systems and broadband services having "significant resulting Alcatel-Lucent commercial successes." Glapa holds patents and co-wrote the 2004 technical paper "Optimal Availability & Security For Voice Over Cable Networks" and the 2008 IEEE paper "Impact of bandwidth demand growth on HFC networks."
As of July 2008, however, only four scientists remained in physics research, according to a report by the scientific journal "Nature".
On August 28, 2008, Alcatel-Lucent announced it was pulling out of basic science, material physics, and semiconductor research to focus instead on more immediately marketable areas, including networking, high-speed electronics, wireless networks, nanotechnology, and software.
In 2009, Willard Boyle and George Smith were awarded the Nobel Prize in Physics for the invention and development of the charge-coupled device (CCD).
Rob Soni was named an Alcatel-Lucent Bell Labs Fellow in 2009, cited for winning wireless business from North American customers and for helping to define 4G wireless networks with transformative system architectures.
2010s.
Gee Rittenhouse, former Head of Research, returned from his position as chief operating officer of Alcatel-Lucent's Software, Services, and Solutions business in February 2013, to become the 12th President of Bell Labs.
On November 4, 2013, Alcatel-Lucent announced the appointment of Marcus Weldon as President of Bell Labs. His stated charter was to return Bell Labs to the forefront of innovation in Information and communications technology by focusing on solving the key industry challenges, as was the case in the great Bell Labs innovation eras in the past.
On May 20, 2014, Michel Combes, CEO of Alcatel-Lucent, announced the opening of a Bell Labs location in Tel Aviv, Israel by that summer. The research team would be directed by Danny Raz, an Israeli computer scientist and Bell Labs alumnus, and would work on 'cloud networking' technologies for communications. The location would have approximately twenty employees with academic scientific backgrounds.
In July 2014, Bell Labs announced it had broken "the broadband Internet speed record" with a new technology dubbed XG-FAST that promises 10 gigabits per second transmission speeds.
In 2014, Eric Betzig shared the Nobel Prize in Chemistry for his work in super-resolved fluorescence microscopy which he began pursuing while at Bell Labs in the Semiconductor Physics Research Department.
On April 15, 2015, Nokia agreed to acquire Alcatel-Lucent, Bell Labs' parent company, in a share exchange worth $16.6 billion. Their first day of combined operations was January 14, 2016.
In September 2016, Nokia Bell Labs, along with Technische Universität Berlin, Deutsche Telekom T-Labs and the Technical University of Munich achieved a data rate of one terabit per second by improving transmission capacity and spectral efficiency in an optical communications field trial with a new modulation technique.
Antero Taivalsaari became a Bell Labs Fellow in 2016.
In 2017, Dragan Samardzija was named a Bell Labs Fellow.
In 2018, Arthur Ashkin shared the Nobel Prize in Physics for his work on "the optical tweezers and their application to biological systems" which was developed at Bell Labs in the 1980s.
2020s.
In 2020, Alfred Aho and Jeffrey Ullman shared the Turing Award for their work on compilers, starting with their tenure at Bell Labs during 1967–69.
On November 16, 2021, Nokia held the 2021 Bell Labs Fellows Award Ceremony at the Nokia Batvik Mansion in Finland, inducting six new members: Igor Curcio, Matthew Andrews, Bjorn Jelonnek, Ed Harstead, Gino Dion, and Esa Tiirola.
In December 2021, Nokia's Chief Strategy and Technology Officer reorganized Bell Labs into two separate functional organizations: Bell Labs Core Research and Bell Labs Solutions Research. Bell Labs Core Research is charged with creating disruptive technologies on a 10-year horizon, while Bell Labs Solutions Research pursues shorter-term solutions that can provide growth opportunities for Nokia.
The Nokia 2022 Bell Labs Fellows were recognized on November 29, 2022, in a New Jersey ceremony. Five researchers were inducted, bringing the total to 341 recipients since the award's inception by AT&T Bell Labs in 1982. One member was from New Jersey, two were from Cambridge, UK, and two were from Finland, representing the Espoo and Tampere locations.
On December 11, 2023, Nokia announced a state-of-the-art research facility in New Brunswick, New Jersey. The planned relocation from the 80-year-old Murray Hill, New Jersey facility would take place before 2028. The new building would be LEED Gold certified. The Murray Hill location hosted iconic research and historic innovations under AT&T Corp., Lucent Technologies, Alcatel-Lucent, and Nokia.
Nobel Prize, Turing Award, IEEE Medal of Honor.
Eleven Nobel Prizes have been awarded for work completed at Bell Laboratories.
The Turing Award has been won five times by Bell Labs researchers.
First awarded in 1917, the IEEE Medal of Honor is the highest form of recognition by the Institute of Electrical and Electronics Engineers. The IEEE Medal of Honor has been won 22 times by Bell Labs researchers.
Emmy Awards, Grammy Award, and Academy Award.
The Emmy Award has been won five times by Bell Labs: one under Lucent Technologies, one under Alcatel-Lucent, and three under Nokia.
The invention of fiber optics and the research done in digital television and media file formats took place under former AT&T Bell Labs ownership.
The Grammy Award has been won once by Bell Labs under Alcatel-Lucent.
The Academy Award has been won once by E. C. Wente and Bell Labs.
Publications.
The American Telephone and Telegraph Company, Western Electric, and other Bell System companies issued numerous publications, such as local house organs, for corporate distribution, for the scientific and industry communities, and for the general public, including telephone subscribers.
The Bell Laboratories Record was a principal house organ, featuring general interest content such as corporate news, support staff profiles and events, reports of facilities upgrades, but also articles of research and development results written for technical or non-technical audiences. The publication commenced in 1925 with the founding of the laboratories.
A prominent journal for the focused dissemination of original or reprinted scientific research by Bell Labs engineers and scientists was the "Bell System Technical Journal", started in 1922 by the AT&T Information Department. Bell researchers also published widely in industry journals.
Some of these articles were reprinted by the Bell System as Monographs, consecutively issued starting in 1920. These reprints, numbering over 5000, comprise a catalog of Bell research over the decades. Research in the Monographs is aided by access to associated indexes, for monographs 1–1199, 1200–2850 (1958), 2851–4050 (1962), and 4051–4650 (1964).
Essentially all of the landmark work done by Bell Labs is memorialized in one or more corresponding monographs. Examples include:
Bjarne Stroustrup
Bjarne Stroustrup (; ; born 30 December 1950) is a Danish computer scientist, known for the development of the C++ programming language. He led the Large-scale Programming Research department at Bell Labs, served as a professor of computer science at Texas A&M University, and spent over a decade at Morgan Stanley while also being a visiting professor at Columbia University. Since 2022 he has been a full professor at Columbia.
Early life and education.
Stroustrup was born in Aarhus, Denmark. His family was working class, and he attended local schools.
He attended Aarhus University from 1969 to 1975 and graduated with a Candidatus Scientiarum in mathematics with computer science. His interests focused on microprogramming and machine architecture. He learned the fundamentals of object-oriented programming from its inventor, Kristen Nygaard, who frequently visited Aarhus.
In 1979, he received his PhD in computer science from the University of Cambridge, where his research on distributed computing was supervised by David Wheeler.
Career and research.
In 1979, Stroustrup began his career as a member of technical staff in the Computer Science Research Center of Bell Labs in Murray Hill, New Jersey. There, he began his work on C++ and programming techniques. Stroustrup was the head of AT&T Bell Labs' Large-scale Programming Research department, from its creation until late 2002. In 1993, he was made a Bell Labs fellow and in 1996, an AT&T Fellow.
From 2002 to 2014, Stroustrup was the College of Engineering Chair Professor in Computer Science at Texas A&M University. In 2011, he was made a University Distinguished Professor.
From January 2014 to April 2022, Stroustrup was a technical fellow and managing director in the technology division of Morgan Stanley in New York City and a visiting professor in computer science at Columbia University.
As of July 2022, Stroustrup is a full professor of computer science at Columbia University.
C++.
Stroustrup is best known for his work on C++. In 1979, he began developing C++ (initially called "C with Classes"). In his own words, he "invented C++, wrote its early definitions, and produced its first implementation [...] chose and formulated the design criteria for C++, designed all its major facilities, and was responsible for the processing of extension proposals in the C++ standards committee." C++ was made generally available in 1985. For non-commercial use, the source code of the compiler and the foundation libraries was available for the cost of shipping (US$75); this was before Internet access was common. Stroustrup also published a textbook for the language in 1985, "The C++ Programming Language".
The key language-technical areas of contribution of C++ are:
Stroustrup documented his principles guiding the design of C++ and the evolution of the language in his 1994 book, "The Design and Evolution of C++", and three papers for ACM's History of Programming Languages conferences.
Stroustrup was a founding member of the C++ standards committee (from 1989, it was an ANSI committee and from 1991 an ISO committee) and has remained an active member ever since. For 24 years he chaired the subgroup chartered to handle proposals for language extensions (Evolution Working Group).
Awards and honors.
Selected honors
Fellowships
Honorary doctorates and professorships
Publications.
Stroustrup has written or co-written a number of publications, including the books:
In all, these books have been translated into 21 languages.
More than 100 academic articles, including:
More than a hundred technical reports for the C++ standards committee (WG21)
Brain
The brain is an organ that serves as the center of the nervous system in all vertebrate and most invertebrate animals. It consists of nervous tissue and is typically located in the head (cephalization), usually near organs for special senses such as vision, hearing, and olfaction. Being the most specialized organ, it is responsible for receiving information from the sensory nervous system, processing that information (thought, cognition, and intelligence), and coordinating motor control (muscle activity and the endocrine system).
While invertebrate brains arise from paired segmental ganglia (each of which is only responsible for the respective body segment) of the ventral nerve cord, vertebrate brains develop axially from the midline dorsal nerve cord as a vesicular enlargement at the rostral end of the neural tube, with centralized control over all body segments. All vertebrate brains can be embryonically divided into three parts: the forebrain (prosencephalon, subdivided into telencephalon and diencephalon), midbrain (mesencephalon) and hindbrain (rhombencephalon, subdivided into metencephalon and myelencephalon). The spinal cord, which directly interacts with somatic functions below the head, can be considered a caudal extension of the myelencephalon enclosed inside the vertebral column. Together, the brain and spinal cord constitute the central nervous system in all vertebrates.
In humans, the cerebral cortex contains approximately 14–16 billion neurons, and the estimated number of neurons in the cerebellum is 55–70 billion. Each neuron is connected by synapses to several thousand other neurons, typically communicating with one another via cytoplasmic processes known as dendrites and axons. Axons are usually myelinated and carry trains of rapid micro-electric signal pulses called action potentials to target specific recipient cells in other areas of the brain or distant parts of the body. The prefrontal cortex, which controls executive functions, is particularly well developed in humans.
Physiologically, brains exert centralized control over a body's other organs. They act on the rest of the body both by generating patterns of muscle activity and by driving the secretion of chemicals called hormones. This centralized control allows rapid and coordinated responses to changes in the environment. Some basic types of responsiveness such as reflexes can be mediated by the spinal cord or peripheral ganglia, but sophisticated purposeful control of behavior based on complex sensory input requires the information integrating capabilities of a centralized brain.
The operations of individual brain cells are now understood in considerable detail, but the way they cooperate in ensembles of millions remains unresolved. Recent models in modern neuroscience treat the brain as a biological computer, very different in mechanism from a digital computer, but similar in the sense that it acquires information from the surrounding world, stores it, and processes it in a variety of ways.
This article compares the properties of brains across the entire range of animal species, with the greatest attention to vertebrates. It deals with the human brain insofar as it shares the properties of other brains. The ways in which the human brain differs from other brains are covered in the human brain article. Several topics that might be covered here are instead covered there because much more can be said about them in a human context. The most important that are covered in the human brain article are brain disease and the effects of brain damage.
Structure.
The shape and size of the brain varies greatly between species, and identifying common features is often difficult. Nevertheless, there are a number of principles of brain architecture that apply across a wide range of species. Some aspects of brain structure are common to almost the entire range of animal species; others distinguish "advanced" brains from more primitive ones, or distinguish vertebrates from invertebrates.
The simplest way to gain information about brain anatomy is by visual inspection, but many more sophisticated techniques have been developed. Brain tissue in its natural state is too soft to work with, but it can be hardened by immersion in alcohol or other fixatives, and then sliced apart for examination of the interior. Visually, the interior of the brain consists of areas of so-called grey matter, with a dark color, separated by areas of white matter, with a lighter color. Further information can be gained by staining slices of brain tissue with a variety of chemicals that bring out areas where specific types of molecules are present in high concentrations. It is also possible to examine the microstructure of brain tissue using a microscope, and to trace the pattern of connections from one brain area to another.
Cellular structure.
Axons transmit signals to other neurons by means of specialized junctions called synapses. A single axon may make as many as several thousand synaptic connections with other cells. When an action potential, traveling along an axon, arrives at a synapse, it causes a chemical called a neurotransmitter to be released. The neurotransmitter binds to receptor molecules in the membrane of the target cell.
Synapses are the key functional elements of the brain. The essential function of the brain is cell-to-cell communication, and synapses are the points at which communication occurs. The human brain has been estimated to contain approximately 100 trillion synapses; even the brain of a fruit fly contains several million. The functions of these synapses are very diverse: some are excitatory (exciting the target cell); others are inhibitory; others work by activating second messenger systems that change the internal chemistry of their target cells in complex ways. A large number of synapses are dynamically modifiable; that is, they are capable of changing strength in a way that is controlled by the patterns of signals that pass through them. It is widely believed that activity-dependent modification of synapses is the brain's primary mechanism for learning and memory.
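The ~100 trillion figure above is consistent with a simple back-of-envelope calculation; the neuron count and per-neuron synapse figure below are illustrative assumptions, not numbers taken from this article:

```python
# Rough consistency check for the ~100 trillion synapse estimate.
# Both inputs are illustrative assumptions: a commonly cited human
# neuron count and a "several thousand synapses per neuron" figure.
neurons = 86e9               # assumed total neurons in a human brain
synapses_per_neuron = 1.2e3  # assumed average connections per neuron

total_synapses = neurons * synapses_per_neuron
print(f"{total_synapses:.1e}")  # on the order of 1e14, i.e. ~100 trillion
```

Any plausible choice of inputs in these ranges lands within a factor of a few of 1e14, which is why the estimate is usually quoted only to an order of magnitude.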
Most of the space in the brain is taken up by axons, which are often bundled together in what are called "nerve fiber tracts". A myelinated axon is wrapped in a fatty insulating sheath of myelin, which serves to greatly increase the speed of signal propagation. (There are also unmyelinated axons). Myelin is white, making parts of the brain filled exclusively with nerve fibers appear as light-colored white matter, in contrast to the darker-colored grey matter that marks areas with high densities of neuron cell bodies.
Evolution.
Generic bilaterian nervous system.
Except for a few primitive organisms such as sponges (which have no nervous system) and cnidarians (which have a diffuse nervous system consisting of a nerve net), all living multicellular animals are bilaterians, meaning animals with a bilaterally symmetric body plan (that is, left and right sides that are approximate mirror images of each other). All bilaterians are thought to have descended from a common ancestor that appeared late in the Cryogenian period, 700–650 million years ago, and it has been hypothesized that this common ancestor had the shape of a simple tubeworm with a segmented body. At a schematic level, that basic worm-shape continues to be reflected in the body and nervous system architecture of all modern bilaterians, including vertebrates. The fundamental bilateral body form is a tube with a hollow gut cavity running from the mouth to the anus, and a nerve cord with an enlargement (a ganglion) for each body segment, with an especially large ganglion at the front, called the brain. The brain is small and simple in some species, such as nematode worms; in other species, such as vertebrates, it is a large and very complex organ. Some types of worms, such as leeches, also have an enlarged ganglion at the back end of the nerve cord, known as a "tail brain".
There are a few types of existing bilaterians that lack a recognizable brain, including echinoderms and tunicates. It has not been definitively established whether the existence of these brainless species indicates that the earliest bilaterians lacked a brain, or whether their ancestors evolved in a way that led to the disappearance of a previously existing brain structure.
Invertebrates.
This category includes tardigrades, arthropods, molluscs, and numerous types of worms. The diversity of invertebrate body plans is matched by an equal diversity in brain structures.
Two groups of invertebrates have notably complex brains: arthropods (insects, crustaceans, arachnids, and others), and cephalopods (octopuses, squids, and similar molluscs). The brains of arthropods and cephalopods arise from twin parallel nerve cords that extend through the body of the animal. Arthropods have a central brain, the supraesophageal ganglion, with three divisions and large optical lobes behind each eye for visual processing. Cephalopods such as the octopus and squid have the largest brains of any invertebrates.
There are several invertebrate species whose brains have been studied intensively because they have properties that make them convenient for experimental work:
Vertebrates.
The first vertebrates appeared over 500 million years ago (Mya) during the Cambrian period, and may have resembled the modern jawless fish (hagfish and lamprey) in form. Jawed vertebrates appeared by 445 Mya, tetrapods by 350 Mya, amniotes by 310 Mya and mammaliaforms by 200 Mya (approximately). Each vertebrate clade has an equally long evolutionary history, but the brains of modern fish, amphibians, reptiles, birds and mammals show a gradient of size and complexity that roughly follows the evolutionary sequence. All of these brains contain the same set of basic anatomical structures, but many are rudimentary in the hagfish, whereas in mammals the foremost part (forebrain, especially the telencephalon) is greatly developed and expanded.
Brains are most commonly compared in terms of their mass. The relationship between brain size, body size and other variables has been studied across a wide range of vertebrate species. As a rule of thumb, brain size increases with body size, but not in a simple linear proportion. In general, smaller animals tend to have proportionally larger brains, measured as a fraction of body size. For mammals, the relationship between brain volume and body mass essentially follows a power law with an exponent of about 0.75. This formula describes the central tendency, but every family of mammals departs from it to some degree, in a way that reflects in part the complexity of their behavior. For example, primates have brains 5 to 10 times larger than the formula predicts. Predators, who have to implement various hunting strategies against the ever changing anti-predator adaptations, tend to have larger brains relative to body size than their prey.
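The power-law relationship described above can be sketched in a few lines; the exponent 0.75 comes from the text, while the scaling constant is an arbitrary illustrative assumption:

```python
def expected_brain_mass(body_mass_kg, c=0.06, exponent=0.75):
    """Expected brain mass (kg) under the power law brain = c * body**exponent.

    The exponent 0.75 is the mammalian value cited in the text; the
    constant c is an illustrative assumption, not a fitted value.
    """
    return c * body_mass_kg ** exponent

# A consequence of the sublinear exponent: doubling body mass raises the
# expected brain mass by only 2**0.75 (about 1.68x), so smaller animals
# carry proportionally larger brains.
ratio = expected_brain_mass(2.0) / expected_brain_mass(1.0)
```

Departures from this central tendency, such as primates exceeding the prediction by 5 to 10 times, are what the encephalization quotient discussed later quantifies.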
All vertebrate brains share a common underlying form, which appears most clearly during early stages of embryonic development. In its earliest form, the brain appears as three vesicular swellings at the front end of the neural tube; these swellings eventually become the forebrain (prosencephalon), midbrain (mesencephalon) and hindbrain (rhombencephalon), respectively. At the earliest stages of brain development, the three areas are roughly equal in size. In many aquatic/semiaquatic vertebrates such as fish and amphibians, the three parts remain similar in size in adults, but in terrestrial tetrapods such as mammals, the forebrain becomes much larger than the other parts, the hindbrain develops a bulky dorsal extension known as the cerebellum, and the midbrain becomes very small as a result.
The brains of vertebrates are made of very soft tissue. Living brain tissue is pinkish on the outside and mostly white on the inside, with subtle variations in color. Vertebrate brains are surrounded by a system of connective tissue membranes called meninges, which separate the skull from the brain. Cerebral arteries pierce the outer two layers of the meninges, the dura and arachnoid mater, into the subarachnoid space and perfuse the brain parenchyma via arterioles perforating into the innermost layer of the meninges, the pia mater. The endothelial cells in the cerebral blood vessel walls are joined tightly to one another, forming the blood–brain barrier, which blocks the passage of many toxins and pathogens (though at the same time blocking antibodies and some drugs, thereby presenting special challenges in treatment of diseases of the brain). As a result of the osmotic restriction by the blood-brain barrier, the metabolites within the brain are cleared mostly by bulk flow of the cerebrospinal fluid within the glymphatic system instead of via venules like other parts of the body.
Neuroanatomists usually divide the vertebrate brain into six main subregions: the telencephalon (the cerebral hemispheres), diencephalon (thalamus and hypothalamus), mesencephalon (midbrain), cerebellum, pons and medulla oblongata, with the midbrain, pons and medulla often collectively called the brainstem. Each of these areas has a complex internal structure. Some parts, such as the cerebral cortex and the cerebellar cortex, are folded into convoluted gyri and sulci in order to maximize surface area within the available intracranial space. Other parts, such as the thalamus and hypothalamus, consist of many small clusters of nuclei known as "ganglia". Thousands of distinguishable areas can be identified within the vertebrate brain based on fine distinctions of neural structure, chemistry, and connectivity.
Although the same basic components are present in all vertebrate brains, some branches of vertebrate evolution have led to substantial distortions of brain geometry, especially in the forebrain area. The brain of a shark shows the basic components in a straightforward way, but in teleost fishes (the great majority of existing fish species), the forebrain has become "everted", like a sock turned inside out. In birds, there are also major changes in forebrain structure. These distortions can make it difficult to match brain components from one species with those of another species.
Here is a list of some of the most important vertebrate brain components, along with a brief description of their functions as currently understood:
Reptiles.
Modern reptiles and mammals diverged from a common ancestor around 320 million years ago. The number of extant reptiles far exceeds the number of mammalian species, with 11,733 recognized species of reptiles compared to 5,884 extant mammals. Along with the species diversity, reptiles have diverged in terms of external morphology, from limbless to tetrapod gliders to armored chelonians, reflecting adaptive radiation to a diverse array of environments.
Morphological differences are reflected in the nervous system phenotype, such as: absence of lateral motor column neurons in snakes, which innervate limb muscles controlling limb movements; absence of motor neurons that innervate trunk muscles in tortoises; presence of innervation from the trigeminal nerve to pit organs responsible for infrared detection in snakes. Variation in size, weight, and shape of the brain can be found within reptiles. For instance, crocodilians have the largest brain volume to body weight proportion, followed by turtles, lizards, and snakes. Reptiles vary in their investment in different brain sections. Crocodilians have the largest telencephalon, while snakes have the smallest. Turtles have the largest diencephalon per body weight, whereas crocodilians have the smallest. On the other hand, lizards have the largest mesencephalon.
Yet their brains share several characteristics revealed by recent anatomical, molecular, and ontogenetic studies. Vertebrates show the highest levels of similarity during embryological development, which is controlled by conserved transcription factors and signaling centers, including gene expression and morphological and cell-type differentiation. In fact, high levels of these transcription factors can be found in all areas of the brain in reptiles and mammals, with shared neuronal clusters shedding light on brain evolution. Conserved transcription factors indicate that evolution acted on different areas of the brain by either retaining similar morphology and function, or diversifying it.
Anatomically, the reptilian brain has fewer subdivisions than the mammalian brain; however, it has numerous conserved aspects, including the organization of the spinal cord and cranial nerves, as well as an elaborated pattern of brain organization. Elaborated brains are characterized by neuronal cell bodies that have migrated away from the periventricular matrix, the region of neuronal development, to form organized nuclear groups. Aside from reptiles and mammals, other vertebrates with elaborated brains include hagfish, galeomorph sharks, skates, rays, teleosts, and birds. Overall, elaborated brains are subdivided into the forebrain, midbrain, and hindbrain.
The hindbrain coordinates and integrates sensory and motor inputs and outputs responsible for, but not limited to, walking, swimming, or flying. It contains input and output axons interconnecting the spinal cord, midbrain, and forebrain, transmitting information from the external and internal environments. The midbrain links sensory, motor, and integrative components received from the hindbrain, connecting it to the forebrain. The tectum, which includes the optic tectum and torus semicircularis, receives auditory, visual, and somatosensory inputs, forming integrated maps of the sensory and visual space around the animal. The tegmentum receives incoming sensory information and forwards motor responses to and from the forebrain. The isthmus connects the hindbrain with the midbrain. The forebrain is particularly well developed and is further divided into the diencephalon and telencephalon. The diencephalon is involved in the regulation of eye and body movement in response to visual stimuli, sensory information, circadian rhythms, olfactory input, and the autonomic nervous system. The telencephalon is involved in the control of movement, sensory systems, and cognitive functions, and contains the neurotransmitters and neuromodulators responsible for integrating inputs and transmitting outputs.
Mammals.
The most obvious difference between the brains of mammals and other vertebrates is their size. On average, a mammal has a brain roughly twice as large as that of a bird of the same body size, and ten times as large as that of a reptile of the same body size.
Size, however, is not the only difference: there are also substantial differences in shape. The hindbrain and midbrain of mammals are generally similar to those of other vertebrates, but dramatic differences appear in the forebrain, which is greatly enlarged and also altered in structure. The cerebral cortex is the part of the brain that most strongly distinguishes mammals. In non-mammalian vertebrates, the surface of the cerebrum is lined with a comparatively simple three-layered structure called the pallium. In mammals, the pallium evolves into a complex six-layered structure called neocortex or "isocortex". Several areas at the edge of the neocortex, including the hippocampus and amygdala, are also much more extensively developed in mammals than in other vertebrates.
The elaboration of the cerebral cortex carries with it changes to other brain areas. The superior colliculus, which plays a major role in visual control of behavior in most vertebrates, shrinks to a small size in mammals, and many of its functions are taken over by visual areas of the cerebral cortex. The cerebellum of mammals contains a large portion (the neocerebellum) dedicated to supporting the cerebral cortex, which has no counterpart in other vertebrates.
In placental mammals, a wide nerve tract called the corpus callosum connects the cerebral hemispheres.
Primates.
The brains of humans and other primates contain the same structures as the brains of other mammals, but are generally larger in proportion to body size. The encephalization quotient (EQ) is used to compare brain sizes across species. It takes into account the nonlinearity of the brain-to-body relationship. Humans have an average EQ in the 7-to-8 range, while most other primates have an EQ in the 2-to-3 range. Dolphins have values higher than those of primates other than humans, but nearly all other mammals have EQ values that are substantially lower.
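The allometric idea behind the EQ can be sketched numerically. The constants below follow one common parameterization (Jerison's, with masses in grams and an exponent of 2/3), and the human brain and body masses are illustrative round figures, not measurements from the text:

```python
def encephalization_quotient(brain_mass_g, body_mass_g, c=0.12, r=2 / 3):
    """Jerison-style EQ: observed brain mass divided by the brain mass
    expected for a typical mammal of the same body mass. The constants
    c and r are one common fit; other exponents (e.g. 0.75) also appear
    in the literature."""
    expected = c * body_mass_g ** r
    return brain_mass_g / expected

# Illustrative figures for an average adult human (assumed values):
# a 1350 g brain and a 65 kg body give an EQ of roughly 7.
print(round(encephalization_quotient(1350, 65000), 1))
```

The nonlinearity matters: doubling body mass only raises the expected brain mass by a factor of about 1.6, which is why raw brain-to-body ratios are misleading across species of very different sizes.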
Most of the enlargement of the primate brain comes from a massive expansion of the cerebral cortex, especially the prefrontal cortex and the parts of the cortex involved in vision. The visual processing network of primates includes at least 30 distinguishable brain areas, with a complex web of interconnections. It has been estimated that visual processing areas occupy more than half of the total surface of the primate neocortex. The prefrontal cortex carries out functions that include planning, working memory, motivation, attention, and executive control. It takes up a much larger proportion of the brain for primates than for other species, and an especially large fraction of the human brain.
Development.
The brain develops in an intricately orchestrated sequence of stages. It changes in shape from a simple swelling at the front of the nerve cord in the earliest embryonic stages, to a complex array of areas and connections. Neurons are created in special zones that contain stem cells, and then migrate through the tissue to reach their ultimate locations. Once neurons have positioned themselves, their axons sprout and navigate through the brain, branching and extending as they go, until the tips reach their targets and form synaptic connections. In a number of parts of the nervous system, neurons and synapses are produced in excessive numbers during the early stages, and then the unneeded ones are pruned away.
For vertebrates, the early stages of neural development are similar across all species. As the embryo transforms from a round blob of cells into a wormlike structure, a narrow strip of ectoderm running along the midline of the back is induced to become the neural plate, the precursor of the nervous system. The neural plate folds inward to form the neural groove, and then the lips that line the groove merge to enclose the neural tube, a hollow cord of cells with a fluid-filled ventricle at the center. At the front end, the ventricles and cord swell to form three vesicles that are the precursors of the prosencephalon (forebrain), mesencephalon (midbrain), and rhombencephalon (hindbrain). At the next stage, the forebrain splits into two vesicles called the telencephalon (which will contain the cerebral cortex, basal ganglia, and related structures) and the diencephalon (which will contain the thalamus and hypothalamus). At about the same time, the hindbrain splits into the metencephalon (which will contain the cerebellum and pons) and the myelencephalon (which will contain the medulla oblongata). Each of these areas contains proliferative zones where neurons and glial cells are generated; the resulting cells then migrate, sometimes for long distances, to their final positions.
Once a neuron is in place, it extends dendrites and an axon into the area around it. Axons, because they commonly extend a great distance from the cell body and need to reach specific targets, grow in a particularly complex way. The tip of a growing axon consists of a blob of protoplasm called a growth cone, studded with chemical receptors. These receptors sense the local environment, causing the growth cone to be attracted or repelled by various cellular elements, and thus to be pulled in a particular direction at each point along its path. The result of this pathfinding process is that the growth cone navigates through the brain until it reaches its destination area, where other chemical cues cause it to begin generating synapses. Considering the entire brain, thousands of genes create products that influence axonal pathfinding.
Similar things happen in other brain areas: an initial synaptic matrix is generated as a result of genetically determined chemical guidance, but then gradually refined by activity-dependent mechanisms, partly driven by internal dynamics, partly by external sensory inputs. In some cases, as with the retina-midbrain system, activity patterns depend on mechanisms that operate only in the developing brain, and apparently exist solely to guide development.
In humans and many other mammals, new neurons are created mainly before birth, and the infant brain contains substantially more neurons than the adult brain. There are, however, a few areas where new neurons continue to be generated throughout life. The two areas for which adult neurogenesis is well established are the olfactory bulb, which is involved in the sense of smell, and the dentate gyrus of the hippocampus, where there is evidence that the new neurons play a role in storing newly acquired memories. With these exceptions, however, the set of neurons that is present in early childhood is the set that is present for life. Glial cells are different: as with most types of cells in the body, they are generated throughout the lifespan.
There has long been debate about whether the qualities of mind, personality, and intelligence can be attributed to heredity or to upbringing. Although many details remain to be settled, neuroscience shows that both factors are important. Genes determine both the general form of the brain and how it reacts to experience, but experience is required to refine the matrix of synaptic connections, resulting in greatly increased complexity. The presence or absence of experience is critical at key periods of development, and the quantity and quality of experience also matter. For example, animals raised in enriched environments have thicker cerebral cortices, indicating a higher density of synaptic connections, than animals raised with restricted levels of stimulation.
Physiology.
The functions of the brain depend on the ability of neurons to transmit electrochemical signals to other cells, and their ability to respond appropriately to electrochemical signals received from other cells. The electrical properties of neurons are controlled by a wide variety of biochemical and metabolic processes, most notably the interactions between neurotransmitters and receptors that take place at synapses.
Neurotransmitters and receptors.
Neurotransmitters are chemicals that are released at synapses when the local membrane is depolarized and Ca2+ enters the cell, typically when an action potential arrives at the synapse. Neurotransmitters attach themselves to receptor molecules on the membrane of the synapse's target cell (or cells), and thereby alter the electrical or chemical properties of the receptor molecules. With few exceptions, each neuron in the brain releases the same chemical neurotransmitter, or combination of neurotransmitters, at all the synaptic connections it makes with other neurons; this rule is known as Dale's principle. Thus, a neuron can be characterized by the neurotransmitters it releases. The great majority of psychoactive drugs exert their effects by altering specific neurotransmitter systems. This applies to drugs such as cannabinoids, nicotine, heroin, cocaine, alcohol, fluoxetine, chlorpromazine, and many others.
The two neurotransmitters that are most widely found in the vertebrate brain are glutamate, which almost always exerts excitatory effects on target neurons, and gamma-aminobutyric acid (GABA), which is almost always inhibitory. Neurons using these transmitters can be found in nearly every part of the brain. Because of their ubiquity, drugs that act on glutamate or GABA tend to have broad and powerful effects. Some general anesthetics act by reducing the effects of glutamate; most tranquilizers exert their sedative effects by enhancing the effects of GABA.
There are dozens of other chemical neurotransmitters that are used in more limited areas of the brain, often areas dedicated to a particular function. Serotonin, for example—the primary target of many antidepressant drugs and many dietary aids—comes exclusively from a small brainstem area called the raphe nuclei. Norepinephrine, which is involved in arousal, comes exclusively from a nearby small area called the locus coeruleus. Other neurotransmitters such as acetylcholine and dopamine have multiple sources in the brain but are not as ubiquitously distributed as glutamate and GABA.
Electrical activity.
As a side effect of the electrochemical processes used by neurons for signaling, brain tissue generates electric fields when it is active. When large numbers of neurons show synchronized activity, the electric fields that they generate can be large enough to detect outside the skull, using electroencephalography (EEG) or magnetoencephalography (MEG). EEG recordings, along with recordings made from electrodes implanted inside the brains of animals such as rats, show that the brain of a living animal is constantly active, even during sleep. Each part of the brain shows a mixture of rhythmic and nonrhythmic activity, which may vary according to behavioral state. In mammals, the cerebral cortex tends to show large slow delta waves during sleep, faster alpha waves when the animal is awake but inattentive, and chaotic-looking irregular activity, called beta and gamma waves, when the animal is actively engaged in a task. During an epileptic seizure, the brain's inhibitory control mechanisms fail to function and electrical activity rises to pathological levels, producing EEG traces that show large wave and spike patterns not seen in a healthy brain. Relating these population-level patterns to the computational functions of individual neurons is a major focus of current research in neurophysiology.
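The rhythms named above are conventionally labeled by frequency. A minimal lookup illustrates the scheme; the cutoff values in hertz are commonly cited conventions, but boundaries vary slightly between sources:

```python
def eeg_band(frequency_hz):
    """Map a dominant EEG frequency (in Hz) to its conventional band
    name. Boundary values are common conventions, not universal."""
    if frequency_hz < 4:
        return "delta"   # large slow waves, deep sleep
    if frequency_hz < 8:
        return "theta"
    if frequency_hz < 12:
        return "alpha"   # awake but inattentive
    if frequency_hz < 30:
        return "beta"    # active task engagement
    return "gamma"

print(eeg_band(2), eeg_band(10), eeg_band(40))
```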
Metabolism.
All vertebrates have a blood–brain barrier that allows metabolism inside the brain to operate differently from metabolism in other parts of the body. The neurovascular unit regulates cerebral blood flow so that activated neurons can be supplied with energy. Glial cells play a major role in brain metabolism by controlling the chemical composition of the fluid that surrounds neurons, including levels of ions and nutrients.
Brain tissue consumes a large amount of energy in proportion to its volume, so large brains place severe metabolic demands on animals. The need to limit body weight in order, for example, to fly, has apparently led to selection for a reduction of brain size in some species, such as bats. Most of the brain's energy consumption goes into sustaining the electric charge (membrane potential) of neurons. Most vertebrate species devote between 2% and 8% of basal metabolism to the brain. In primates, however, the percentage is much higher—in humans it rises to 20–25%. The energy consumption of the brain does not vary greatly over time, but active regions of the cerebral cortex consume somewhat more energy than inactive regions; this forms the basis for the functional brain imaging methods of PET, fMRI, and NIRS. The brain typically gets most of its energy from oxygen-dependent metabolism of glucose (i.e., blood sugar), but ketones provide a major alternative source, together with contributions from medium chain fatty acids (caprylic and heptanoic acids), lactate, acetate, and possibly amino acids.
Function.
Information from the sense organs is collected in the brain. There it is used to determine what actions the organism is to take. The brain processes the raw data to extract information about the structure of the environment. Next it combines the processed information with information about the current needs of the animal and with memory of past circumstances. Finally, on the basis of the results, it generates motor response patterns. These signal-processing tasks require intricate interplay between a variety of functional subsystems.
The function of the brain is to provide coherent control over the actions of an animal. A centralized brain allows groups of muscles to be co-activated in complex patterns; it also allows stimuli impinging on one part of the body to evoke responses in other parts, and it can prevent different parts of the body from acting at cross-purposes to each other.
Perception.
The human brain is provided with information about light, sound, the chemical composition of the atmosphere, temperature, the position of the body in space (proprioception), the chemical composition of the bloodstream, and more. In other animals additional senses are present, such as the infrared heat-sense of snakes, the magnetic field sense of some birds, or the electric field sense mainly seen in aquatic animals.
Each sensory system begins with specialized receptor cells, such as photoreceptor cells in the retina of the eye, or vibration-sensitive hair cells in the cochlea of the ear. The axons of sensory receptor cells travel into the spinal cord or brain, where they transmit their signals to a first-order sensory nucleus dedicated to one specific sensory modality. This primary sensory nucleus sends information to higher-order sensory areas that are dedicated to the same modality. Eventually, via a way-station in the thalamus, the signals are sent to the cerebral cortex, where they are processed to extract the relevant features, and integrated with signals coming from other sensory systems.
Motor control.
Motor systems are areas of the brain that are involved in initiating body movements, that is, in activating muscles. Except for the muscles that control the eye, which are driven by nuclei in the midbrain, all the voluntary muscles in the body are directly innervated by motor neurons in the spinal cord and hindbrain. Spinal motor neurons are controlled both by neural circuits intrinsic to the spinal cord, and by inputs that descend from the brain. The intrinsic spinal circuits implement many reflex responses, and contain pattern generators for rhythmic movements such as walking or swimming. The descending connections from the brain allow for more sophisticated control.
The brain contains several motor areas that project directly to the spinal cord. At the lowest level are motor areas in the medulla and pons, which control stereotyped movements such as walking, breathing, or swallowing. At a higher level are areas in the midbrain, such as the red nucleus, which is responsible for coordinating movements of the arms and legs. At a higher level yet is the primary motor cortex, a strip of tissue located at the posterior edge of the frontal lobe. The primary motor cortex sends projections to the subcortical motor areas, but also sends a massive projection directly to the spinal cord, through the pyramidal tract. This direct corticospinal projection allows for precise voluntary control of the fine details of movements. Other motor-related brain areas exert secondary effects by projecting to the primary motor areas. Among the most important secondary areas are the premotor cortex, supplementary motor area, basal ganglia, and cerebellum. In addition to all of the above, the brain and spinal cord contain extensive circuitry to control the autonomic nervous system which controls the movement of the smooth muscle of the body.
Sleep.
Many animals alternate between sleeping and waking in a daily cycle. Arousal and alertness are also modulated on a finer time scale by a network of brain areas. A key component of the sleep system is the suprachiasmatic nucleus (SCN), a tiny part of the hypothalamus located directly above the point at which the optic nerves from the two eyes cross. The SCN contains the body's central biological clock. Neurons there show activity levels that rise and fall with a period of about 24 hours (circadian rhythms); these activity fluctuations are driven by rhythmic changes in expression of a set of "clock genes". The SCN continues to keep time even if it is excised from the brain and placed in a dish of warm nutrient solution, but it ordinarily receives input from the optic nerves, through the retinohypothalamic tract (RHT), that allows daily light-dark cycles to calibrate the clock.
The SCN projects to a set of areas in the hypothalamus, brainstem, and midbrain that are involved in implementing sleep-wake cycles. An important component of the system is the reticular formation, a group of neuron-clusters scattered diffusely through the core of the lower brain. Reticular neurons send signals to the thalamus, which in turn sends activity-level-controlling signals to every part of the cortex. Damage to the reticular formation can produce a permanent state of coma.
Sleep involves great changes in brain activity. Until the 1950s it was generally believed that the brain essentially shuts off during sleep, but this is now known to be far from true; activity continues, but patterns become very different. There are two types of sleep: "REM sleep" (with dreaming) and "NREM" (non-REM, usually without dreaming) sleep, which repeat in slightly varying patterns throughout a sleep episode. Three broad types of distinct brain activity patterns can be measured: REM, light NREM and deep NREM. During deep NREM sleep, also called slow wave sleep, activity in the cortex takes the form of large synchronized waves, whereas in the waking state it is noisy and desynchronized. Levels of the neurotransmitters norepinephrine and serotonin drop during slow wave sleep, and fall almost to zero during REM sleep; levels of acetylcholine show the reverse pattern.
Homeostasis.
For any animal, survival requires maintaining a variety of parameters of bodily state within a limited range of variation: these include temperature, water content, salt concentration in the bloodstream, blood glucose levels, blood oxygen level, and others. The ability of an animal to regulate the internal environment of its body—the milieu intérieur, as the pioneering physiologist Claude Bernard called it—is known as homeostasis (Greek for "standing still"). Maintaining homeostasis is a crucial function of the brain. The basic principle that underlies homeostasis is negative feedback: any time a parameter diverges from its set-point, sensors generate an error signal that evokes a response that causes the parameter to shift back toward its optimum value. (This principle is widely used in engineering, for example in the control of temperature using a thermostat.)
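The negative-feedback principle can be sketched in a few lines. The gain and temperature values below are arbitrary illustrations of the thermostat analogy, not physiological parameters:

```python
def feedback_step(value, set_point, gain=0.3):
    """One iteration of negative feedback: a sensor measures the
    deviation from the set-point (the error signal), and the response
    is proportional to that error, pushing the value back toward
    its optimum."""
    error = set_point - value
    return value + gain * error

# A "body temperature" starting far from the 37.0 set-point is
# pulled back toward it over repeated corrections.
temp = 15.0
for _ in range(20):
    temp = feedback_step(temp, set_point=37.0)
print(round(temp, 2))
```

The key property is that the correction shrinks as the error shrinks, so the system settles near the set-point rather than oscillating away from it (for small enough gain).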
In vertebrates, the part of the brain that plays the greatest role is the hypothalamus, a small region at the base of the forebrain whose size does not reflect its complexity or the importance of its function. The hypothalamus is a collection of small nuclei, most of which are involved in basic biological functions. Some of these functions relate to arousal or to social interactions such as sexuality, aggression, or maternal behaviors; but many of them relate to homeostasis. Several hypothalamic nuclei receive input from sensors located in the lining of blood vessels, conveying information about temperature, sodium level, glucose level, blood oxygen level, and other parameters. These hypothalamic nuclei send output signals to motor areas that can generate actions to rectify deficiencies. Some of the outputs also go to the pituitary gland, a tiny gland attached to the brain directly underneath the hypothalamus. The pituitary gland secretes hormones into the bloodstream, where they circulate throughout the body and induce changes in cellular activity.
Motivation.
Individual animals need to express survival-promoting behaviors, such as seeking food, water, shelter, and a mate. The motivational system in the brain monitors the current state of satisfaction of these goals, and activates behaviors to meet any needs that arise. The motivational system works largely by a reward–punishment mechanism. When a particular behavior is followed by favorable consequences, the reward mechanism in the brain is activated, which induces structural changes inside the brain that cause the same behavior to be repeated later, whenever a similar situation arises. Conversely, when a behavior is followed by unfavorable consequences, the brain's punishment mechanism is activated, inducing structural changes that cause the behavior to be suppressed when similar situations arise in the future.
Most organisms studied to date use a reward–punishment mechanism: for instance, worms and insects can alter their behavior to seek food sources or to avoid dangers. In vertebrates, the reward-punishment system is implemented by a specific set of brain structures, at the heart of which lie the basal ganglia, a set of interconnected areas at the base of the forebrain. The basal ganglia are the central site at which decisions are made: the basal ganglia exert a sustained inhibitory control over most of the motor systems in the brain; when this inhibition is released, a motor system is permitted to execute the action it is programmed to carry out. Rewards and punishments function by altering the relationship between the inputs that the basal ganglia receive and the decision-signals that are emitted. The reward mechanism is better understood than the punishment mechanism, because its role in drug abuse has caused it to be studied very intensively. Research has shown that the neurotransmitter dopamine plays a central role: addictive drugs such as cocaine, amphetamine, and nicotine either cause dopamine levels to rise or cause the effects of dopamine inside the brain to be enhanced.
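The reward mechanism described above, in which favorable consequences make a behavior more likely to be repeated, is often caricatured with an incremental value-update rule. The sketch below is a standard abstraction from reinforcement-learning models, not a description of basal ganglia circuitry:

```python
def update_value(value, reward, learning_rate=0.2):
    """Move a behavior's stored value a fraction of the way toward
    the reward just received. Positive reward strengthens the
    tendency to repeat the behavior; negative reward weakens it."""
    return value + learning_rate * (reward - value)

# Two candidate behaviors: "a" is consistently rewarded,
# "b" is consistently punished.
values = {"a": 0.0, "b": 0.0}
for _ in range(50):
    values["a"] = update_value(values["a"], reward=1.0)
    values["b"] = update_value(values["b"], reward=-1.0)

# The rewarded behavior now dominates selection.
best = max(values, key=values.get)
print(best, round(values["a"], 2), round(values["b"], 2))
```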
Learning and memory.
Almost all animals are capable of modifying their behavior as a result of experience—even the most primitive types of worms. Because behavior is driven by brain activity, changes in behavior must somehow correspond to changes inside the brain. Already in the late 19th century theorists like Santiago Ramón y Cajal argued that the most plausible explanation is that learning and memory are expressed as changes in the synaptic connections between neurons. Until 1970, however, experimental evidence to support the synaptic plasticity hypothesis was lacking. In 1971 Tim Bliss and Terje Lømo published a paper on a phenomenon now called long-term potentiation: the paper showed clear evidence of activity-induced synaptic changes that lasted for at least several days. Since then technical advances have made these sorts of experiments much easier to carry out, and thousands of studies have been made that have clarified the mechanism of synaptic change, and uncovered other types of activity-driven synaptic change in a variety of brain areas, including the cerebral cortex, hippocampus, basal ganglia, and cerebellum. Brain-derived neurotrophic factor (BDNF) and physical activity appear to play a beneficial role in the process.
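The kind of activity-induced synaptic change described above is commonly abstracted as a Hebbian learning rule ("cells that fire together wire together"). The sketch below is a textbook caricature of that idea, not a model of the Bliss and Lømo experiments; the weight, activity, and rate values are arbitrary:

```python
def hebbian_update(weight, pre, post, learning_rate=0.1):
    """Hebbian rule: a synapse is strengthened in proportion to the
    coincidence of presynaptic and postsynaptic activity."""
    return weight + learning_rate * pre * post

w = 0.5
# Repeated coincident activity potentiates the synapse...
for _ in range(10):
    w = hebbian_update(w, pre=1.0, post=1.0)
# ...while activity in only one of the two cells leaves it unchanged.
w_unchanged = hebbian_update(w, pre=1.0, post=0.0)
print(round(w, 2), round(w_unchanged, 2))
```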
Neuroscientists currently distinguish several types of learning and memory that are implemented by the brain in distinct ways.
Research.
The field of neuroscience encompasses all approaches that seek to understand the brain and the rest of the nervous system. Psychology seeks to understand mind and behavior, and neurology is the medical discipline that diagnoses and treats diseases of the nervous system. The brain is also the most important organ studied in psychiatry, the branch of medicine that works to study, prevent, and treat mental disorders. Cognitive science seeks to unify neuroscience and psychology with other fields that concern themselves with the brain, such as computer science (artificial intelligence and similar fields) and philosophy.
The oldest method of studying the brain is anatomical, and until the middle of the 20th century, much of the progress in neuroscience came from the development of better cell stains and better microscopes. Neuroanatomists study the large-scale structure of the brain as well as the microscopic structure of neurons and their components, especially synapses. Among other tools, they employ a plethora of stains that reveal neural structure, chemistry, and connectivity. In recent years, the development of immunostaining techniques has allowed investigation of neurons that express specific sets of genes. Also, "functional neuroanatomy" uses medical imaging techniques to correlate variations in human brain structure with differences in cognition or behavior.
Neurophysiologists study the chemical, pharmacological, and electrical properties of the brain: their primary tools are drugs and recording devices. Thousands of experimentally developed drugs affect the nervous system, some in highly specific ways. Recordings of brain activity can be made using electrodes, either glued to the scalp as in EEG studies, or implanted inside the brains of animals for extracellular recordings, which can detect action potentials generated by individual neurons. Because the brain does not contain pain receptors, it is possible using these techniques to record brain activity from animals that are awake and behaving without causing distress. The same techniques have occasionally been used to study brain activity in human patients with intractable epilepsy, in cases where there was a medical necessity to implant electrodes to localize the brain area responsible for epileptic seizures. Functional imaging techniques such as fMRI are also used to study brain activity; these techniques have mainly been used with human subjects, because they require a conscious subject to remain motionless for long periods of time, but they have the great advantage of being noninvasive.
Another approach to brain function is to examine the consequences of damage to specific brain areas. Even though it is protected by the skull and meninges, surrounded by cerebrospinal fluid, and isolated from the bloodstream by the blood–brain barrier, the delicate nature of the brain makes it vulnerable to numerous diseases and several types of damage. In humans, the effects of strokes and other types of brain damage have been a key source of information about brain function. Because there is no ability to experimentally control the nature of the damage, however, this information is often difficult to interpret. In animal studies, most commonly involving rats, it is possible to use electrodes or locally injected chemicals to produce precise patterns of damage and then examine the consequences for behavior.
Computational neuroscience encompasses two approaches: first, the use of computers to study the brain; second, the study of how brains perform computation. On one hand, it is possible to write a computer program to simulate the operation of a group of neurons by making use of systems of equations that describe their electrochemical activity; such simulations are known as "biologically realistic neural networks". On the other hand, it is possible to study algorithms for neural computation by simulating, or mathematically analyzing, the operations of simplified "units" that have some of the properties of neurons but abstract out much of their biological complexity. The computational functions of the brain are studied both by computer scientists and neuroscientists.
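A minimal example of the "simplified units" approach is the leaky integrate-and-fire neuron, which reduces the membrane's electrochemistry to a single differential equation integrated with Euler steps. All parameter values below are illustrative, not fitted to any particular cell type:

```python
def simulate_lif(input_current, v_rest=-65.0, v_thresh=-50.0,
                 v_reset=-65.0, tau=10.0, dt=0.1, steps=2000):
    """Leaky integrate-and-fire neuron: the membrane potential leaks
    back toward rest while being driven by an input current; crossing
    the threshold counts as a spike and resets the potential.
    Returns the number of spikes over the simulated interval."""
    v = v_rest
    spikes = 0
    for _ in range(steps):
        dv = (-(v - v_rest) + input_current) / tau  # leak + drive
        v += dv * dt                                # Euler integration
        if v >= v_thresh:
            spikes += 1
            v = v_reset
    return spikes

# Stronger input drives a higher firing rate.
print(simulate_lif(20.0), simulate_lif(30.0))
```

Models like this sit partway between the two approaches in the paragraph above: more abstract than a biophysically realistic simulation, but still governed by an explicit equation for membrane dynamics.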
Computational neurogenetic modeling is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes.
Recent years have seen increasing applications of genetic and genomic techniques to the study of the brain and a focus on the roles of neurotrophic factors and physical activity in neuroplasticity. The most common subjects are mice, because of the availability of technical tools. It is now possible with relative ease to "knock out" or mutate a wide variety of genes, and then examine the effects on brain function. More sophisticated approaches are also being used: for example, using Cre-Lox recombination it is possible to activate or deactivate genes in specific parts of the brain, at specific times.
History.
The oldest brain to have been discovered was in Armenia in the Areni-1 cave complex. The brain, estimated to be over 5,000 years old, was found in the skull of a 12 to 14-year-old girl. Although the brain was shriveled, it was well preserved due to the climate found inside the cave.
Early philosophers were divided as to whether the seat of the soul lies in the brain or heart. Aristotle favored the heart, and thought that the function of the brain was merely to cool the blood. Democritus, the inventor of the atomic theory of matter, argued for a three-part soul, with intellect in the head, emotion in the heart, and lust near the liver. The unknown author of "On the Sacred Disease", a medical treatise in the Hippocratic Corpus, came down unequivocally in favor of the brain, writing:
The Roman physician Galen also argued for the importance of the brain, and theorized in some depth about how it might work. Galen traced out the anatomical relationships among brain, nerves, and muscles, demonstrating that all muscles in the body are connected to the brain through a branching network of nerves. He postulated that nerves activate muscles mechanically by carrying a mysterious substance he called "pneumata psychikon", usually translated as "animal spirits". Galen's ideas were widely known during the Middle Ages, but not much further progress came until the Renaissance, when detailed anatomical study resumed, combined with the theoretical speculations of René Descartes and those who followed him. Descartes, like Galen, thought of the nervous system in hydraulic terms. He believed that the highest cognitive functions are carried out by a non-physical "res cogitans", but that the majority of behaviors of humans, and all behaviors of animals, could be explained mechanistically.
The first real progress toward a modern understanding of nervous function, though, came from the investigations of Luigi Galvani (1737–1798), who discovered that a shock of static electricity applied to an exposed nerve of a dead frog could cause its leg to contract. Since that time, each major advance in understanding has followed more or less directly from the development of a new technique of investigation. Until the early years of the 20th century, the most important advances were derived from new methods for staining cells. Particularly critical was the invention of the Golgi stain, which (when correctly used) stains only a small fraction of neurons, but stains them in their entirety, including cell body, dendrites, and axon. Without such a stain, brain tissue under a microscope appears as an impenetrable tangle of protoplasmic fibers, in which it is impossible to determine any structure. In the hands of Camillo Golgi, and especially of the Spanish neuroanatomist Santiago Ramón y Cajal, the new stain revealed hundreds of distinct types of neurons, each with its own unique dendritic structure and pattern of connectivity.
In the first half of the 20th century, advances in electronics enabled investigation of the electrical properties of nerve cells, culminating in work by Alan Hodgkin, Andrew Huxley, and others on the biophysics of the action potential, and the work of Bernard Katz and others on the electrochemistry of the synapse. These studies complemented the anatomical picture with a conception of the brain as a dynamic entity. Reflecting the new understanding, in 1942 Charles Sherrington visualized the workings of the brain waking from sleep:
The invention of electronic computers in the 1940s, along with the development of mathematical information theory, led to a realization that brains can potentially be understood as information processing systems. This concept formed the basis of the field of cybernetics, and eventually gave rise to the field now known as computational neuroscience. The earliest attempts at cybernetics were somewhat crude in that they treated the brain as essentially a digital computer in disguise, as for example in John von Neumann's 1958 book, "The Computer and the Brain". Over the years, though, accumulating information about the electrical responses of brain cells recorded from behaving animals has steadily moved theoretical concepts in the direction of increasing realism.
One of the most influential early contributions was a 1959 paper titled "What the frog's eye tells the frog's brain": the paper examined the visual responses of neurons in the retina and optic tectum of frogs, and came to the conclusion that some neurons in the tectum of the frog are wired to combine elementary responses in a way that makes them function as "bug perceivers". A few years later David Hubel and Torsten Wiesel discovered cells in the primary visual cortex of monkeys that become active when sharp edges move across specific points in the field of view—a discovery for which they won a Nobel Prize. Follow-up studies in higher-order visual areas found cells that detect binocular disparity, color, movement, and aspects of shape, with areas located at increasing distances from the primary visual cortex showing increasingly complex responses. Other investigations of brain areas unrelated to vision have revealed cells with a wide variety of response correlates, some related to memory, some to abstract types of cognition such as space.
Theorists have worked to understand these response patterns by constructing mathematical models of neurons and neural networks, which can be simulated using computers. Some useful models are abstract, focusing on the conceptual structure of neural algorithms rather than the details of how they are implemented in the brain; other models attempt to incorporate data about the biophysical properties of real neurons. No model on any level is yet considered to be a fully valid description of brain function, though. The essential difficulty is that sophisticated computation by neural networks requires distributed processing in which hundreds or thousands of neurons work cooperatively—current methods of brain activity recording are only capable of isolating action potentials from a few dozen neurons at a time.
Furthermore, even single neurons appear to be complex and capable of performing computations. So, brain models that do not reflect this are too abstract to be representative of brain operation; models that do try to capture this are very computationally expensive and arguably intractable with present computational resources. However, the Human Brain Project is trying to build a realistic, detailed computational model of the entire human brain. The wisdom of this approach has been publicly contested, with high-profile scientists on both sides of the argument.
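The kind of abstract neuron model described above can be illustrated with a leaky integrate-and-fire unit, one of the simplest models used in computational neuroscience. This is a minimal sketch for illustration only; the parameter values are arbitrary choices and are not drawn from the text.

```python
import numpy as np

def simulate_lif(i_input, dt=0.1, tau=10.0, v_rest=-65.0,
                 v_thresh=-50.0, v_reset=-65.0, r_m=10.0):
    """Simulate a leaky integrate-and-fire neuron.

    i_input: array of input currents, one per time step of length dt (ms).
    Returns the membrane-voltage trace and a list of spike times (ms).
    """
    v = v_rest
    voltages = []
    spike_times = []
    for step, i in enumerate(i_input):
        # Membrane potential leaks toward rest while being driven by input.
        dv = (-(v - v_rest) + r_m * i) / tau
        v += dv * dt
        if v >= v_thresh:
            # Threshold crossing: record a spike and reset the potential.
            spike_times.append(step * dt)
            v = v_reset
        voltages.append(v)
    return np.array(voltages), spike_times

# Drive the model neuron with a constant current for 100 ms.
current = np.full(1000, 2.0)
voltage, spike_times = simulate_lif(current)
```

Even this caricature reproduces the qualitative behavior the text alludes to: a sufficiently strong input drives the cell to fire repetitively, while the leak term keeps subthreshold activity bounded. Real neurons, as the passage notes, are far more complex than this.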
In the second half of the 20th century, developments in chemistry, electron microscopy, genetics, computer science, functional brain imaging, and other fields progressively opened new windows into brain structure and function. In the United States, the 1990s were officially designated as the "Decade of the Brain" to commemorate advances made in brain research, and to promote funding for such research.
In the 21st century, these trends have continued, and several new approaches have come into prominence, including multielectrode recording, which allows the activity of many brain cells to be recorded all at the same time; genetic engineering, which allows molecular components of the brain to be altered experimentally; genomics, which allows variations in brain structure to be correlated with variations in DNA properties; and neuroimaging.
Society and culture.
As food.
Animal brains are used as food in numerous cuisines.
In rituals.
Some archaeological evidence suggests that the mourning rituals of European Neanderthals also involved the consumption of the brain.
The Fore people of Papua New Guinea are known to eat human brains. In funerary rituals, those close to the dead would eat the brain of the deceased to create a sense of immortality. A prion disease called kuru has been traced to this.
Byzantium
Byzantium () or Byzantion () was an ancient Greek city in classical antiquity that became known as Constantinople in late antiquity and Istanbul today. The Greek name "Byzantion" and its Latinization "Byzantium" continued to be used as a name of Constantinople sporadically and to varying degrees during the thousand-year existence of the Eastern Roman Empire, which also became known by the former name of the city as the Byzantine Empire. Byzantium was colonized by Greeks from Megara in the 7th century BC and remained primarily Greek-speaking until its conquest by the Ottoman Empire in AD 1453.
Etymology.
The etymology of "Byzantium" is unknown. It has been suggested that the name is of Thracian origin. It may be derived from the Thracian personal name Byzas which means "he-goat". Ancient Greek legend refers to the Greek king Byzas, the leader of the Megarian colonists and founder of the city. The name "Lygos" for the city, which likely corresponds to an earlier Thracian settlement, is mentioned by Pliny the Elder in his "Natural History".
"Byzántios", plural "Byzántioi", referred to Byzantion's inhabitants; it was also used as an ethnonym for the people of the city and as a family name. In the Middle Ages, "Byzántion" was also a synecdoche for the eastern Roman Empire. "Byzantinós" denoted an inhabitant of the empire. The Anglicization of Latin "Byzantinus" yielded "Byzantine", with 15th- and 16th-century forms including "Byzantin", "Bizantin(e)", "Bezantin(e)", and "Bysantin" as well as "Byzantian" and "Bizantian".
The names "Byzantius" and "Byzantinus" were applied from the 9th century to gold Byzantine coinage, reflected in the French "besant" ("d'or"), Italian "bisante", and English "besant", "byzant", or "bezant". The English usage, derived from Old French "besan" (pl. "besanz") and relating to the coin, dates from the 12th century.
Later, the name "Byzantium" became common in the West to refer to the Eastern Roman Empire, whose capital was Constantinople. As a term for the east Roman state as a whole, "Byzantium" was introduced by the historian Hieronymus Wolf only in 1555, a century after the last remnants of the empire, whose inhabitants continued to refer to their polity as the Roman Empire (), had ceased to exist.
Other places were historically known as "Byzántion" (Βυζάντιον) – a city in Libya mentioned by Stephanus of Byzantium and another on the western coast of India referred to by the Periplus of the Erythraean Sea; in both cases the names were probably adaptations of names in local languages. Faustus of Byzantium was from a city of that name in Cilicia.
History.
The origins of Byzantium are shrouded in legend. Tradition says that Byzas of Megara (a city-state near Athens) founded the city when he sailed northeast across the Aegean Sea. The date is usually given as 667 BC on the authority of Herodotus, who states the city was founded 17 years after Chalcedon. Eusebius, who wrote almost 800 years later, dates the founding of Chalcedon to 685/4 BC, but he also dates the founding of Byzantium to 656 BC (or a few years earlier depending on the edition). Herodotus' dating was later favored by Constantine the Great, who celebrated Byzantium's 1,000th anniversary between the years 333 and 334.
Byzantium was mainly a trading city due to its location at the Black Sea's only entrance. Byzantium later conquered Chalcedon, across the Bosphorus on the Asiatic side.
The city was taken by the Persian Empire at the time of the Scythian campaign (513 BC) of Emperor Darius I (r. 522–486 BC), and was added to the administrative province of Skudra. Though Achaemenid control of the city was never as stable as compared to other cities in Thrace, it was considered, alongside Sestos, to be one of the foremost Achaemenid ports on the European coast of the Bosphorus and the Hellespont.
Byzantium was besieged by Greek forces during the Peloponnesian War. As part of Sparta's strategy for cutting off grain supplies to Athens during their siege of Athens, Sparta took control of the city in 411 BC, to bring the Athenians into submission. The Athenian military later retook the city in 408 BC, when the Spartans had withdrawn following their settlement.
After siding with Pescennius Niger against the victorious Septimius Severus, the city was besieged by Roman forces and suffered extensive damage in AD 196. Byzantium was rebuilt by Septimius Severus, now emperor, and quickly regained its previous prosperity. It was bound to Perinthus during the period of Septimius Severus. After the war, Byzantium lost its city status and free city privileges, but Caracalla persuaded Severus to restore these rights. In appreciation, the Byzantines named Caracalla an archon of their city. The strategic and highly defensible (due to being surrounded by water on almost all sides) location of Byzantium attracted Roman Emperor Constantine I who, in AD 330, refounded it as an imperial residence inspired by Rome itself, known as Nova Roma. Later the city was called Constantinople (Greek Κωνσταντινούπολις, "Konstantinoupolis", "city of Constantine").
This combination of imperialism and location would affect Constantinople's role as the nexus between the continents of Europe and Asia. It was a commercial, cultural, and diplomatic centre and for centuries formed the capital of the Byzantine Empire, which decorated the city with numerous monuments, some still standing today. With its strategic position, Constantinople controlled the major trade routes between Asia and Europe, as well as the passage from the Mediterranean Sea to the Black Sea. On May 29, 1453, the city was conquered by the Ottoman Turks, and again became the capital of a powerful state, the Ottoman Empire. The Turks called the city "Istanbul" (although it was not officially renamed until 1930); the name derives from the Greek phrase "στην πόλη", which means "to the city". To this day it remains the largest and most populous city in Turkey, although Ankara is now the national capital.
Emblem.
By the late Hellenistic or early Roman period (1st century BC), the star and crescent motif was associated to some degree with Byzantium, even though it became more widely used as the royal emblem of Mithradates VI Eupator (who for a time incorporated the city into his empire).
Some Byzantine coins of the 1st century BC and later show the head of Artemis with bow and quiver, and feature a crescent with what appears to be an eight-rayed star on the reverse.
According to accounts which vary in some of the details, in 340 BC the Byzantines and their allies the Athenians were under siege by the troops of Philip of Macedon. On a particularly dark and wet night Philip attempted a surprise attack but was thwarted by the appearance of a bright light in the sky. This light is occasionally described by subsequent interpreters as a meteor, sometimes as the moon, and some accounts also mention the barking of dogs. However, the original accounts mention only a bright light in the sky, without specifying the moon. To commemorate the event the Byzantines erected a statue of Hecate "lampadephoros" (light-bearer or bringer). This story survived in the works of Hesychius of Miletus, who in all probability lived in the time of Justinian I. His works survive only in fragments preserved in Photius and the tenth century lexicographer Suidas. The tale is also related by Stephanus of Byzantium, and Eustathius.
Devotion to Hecate was especially favored by the Byzantines for her aid in having protected them from the incursions of Philip of Macedon. Her symbols were the crescent and star, and the walls of her city were her provenance. This contradicts claims that only the symbol of the crescent was meant to symbolize Hecate, whereas the star was only added later in order to symbolize the Virgin Mary, as Constantine I is said to have rededicated the city to her in the year 330.
It is unclear precisely how the symbol of Hecate/Artemis, one of many goddesses, would have been transferred to the city itself, but it seems likely to have been an effect of her being credited with the intervention against Philip and of the subsequent honors. This was a common process in ancient Greece, as in Athens, where the city was named after Athena in honor of such an intervention in time of war.
Cities in the Roman Empire often continued to issue their own coinage. "Of the many themes that were used on local coinage, celestial and astral symbols often appeared, mostly stars or crescent moons." The wide variety of these issues, and the varying explanations for the significance of the star and crescent on Roman coinage, precludes their discussion here. It is, however, apparent that by the time of the Romans, coins featuring a star or crescent in some combination were not at all rare.
Biotic
Biotic describes the living or once-living components of a community; for example, organisms such as animals and plants.
Biotic may refer to:
Berlin Wall
The Berlin Wall was a guarded concrete barrier that encircled West Berlin from 1961 to 1989, separating it from East Berlin and the German Democratic Republic (GDR; East Germany). The government of the GDR began construction of the Berlin Wall on 13 August 1961. It included guard towers placed along large concrete walls, accompanied by a wide area (later known as the "death strip") that contained anti-vehicle trenches, beds of nails and other defenses. The primary intention for the Wall's construction was to prevent East German citizens from fleeing to the West.
Soviet Bloc propaganda portrayed the Wall as protecting its population from "fascist elements conspiring to prevent the will of the people" from building a communist state in the GDR. The authorities officially referred to the Berlin Wall as the "Anti-Fascist Protection Rampart". Conversely, West Berlin's city government sometimes referred to it as the "Wall of Shame", a term coined by mayor Willy Brandt in reference to the Wall's restriction on freedom of movement. Along with the separate and much longer inner German border, which demarcated the border between East and West Germany, it came to symbolize physically the Iron Curtain that separated the Western Bloc and Soviet satellite states of the Eastern Bloc during the Cold War.
Before the Wall's erection, 3.5 million East Germans circumvented Eastern Bloc emigration restrictions and defected from the GDR, many by crossing over the border from East Berlin into West Berlin; from there they could then travel to West Germany and to other Western European countries. Between 1961 and 1989, the deadly force associated with the Wall prevented almost all such emigration. During this period, over 100,000 people attempted to escape, and over 5,000 people succeeded in escaping over the Wall; the estimated death toll of those killed by East German authorities ranges from 136 to more than 200 in and around Berlin.
In 1989, a series of revolutions in nearby Eastern Bloc countries (Poland and Hungary in particular) and the events of the "Pan-European Picnic" set in motion a peaceful development during which the Iron Curtain largely broke down and rulers in the East came under public pressure to cease their repressive policies. After several weeks of civil unrest, the East German government announced on 9 November 1989 that all GDR citizens could visit the FRG and West Berlin. Crowds of East Germans crossed and climbed onto the Wall, joined by West Germans on the other side, and souvenir hunters chipped away parts of the Wall over the next few weeks. The Brandenburg Gate, a few meters from the Berlin Wall, reopened on 22 December 1989, with demolition of the Wall beginning on 13 June 1990 and concluding in 1994. The fall of the Berlin Wall paved the way for German reunification, which formally took place on 3 October 1990.
Background.
Post-war Germany.
After the end of World War II in Europe, what remained of pre-war Germany west of the Oder-Neisse line was divided into four occupation zones (as per the Potsdam Agreement), each one controlled by one of the four occupying Allied powers: the United States, the United Kingdom, France and the Soviet Union. The capital, Berlin, as the seat of the Allied Control Council, was similarly subdivided into four sectors despite the city's location, which was fully within the Soviet zone.
Within two years, political divisions increased between the Soviets and the other occupying powers. These included the Soviets' refusal to agree to reconstruction plans making post-war Germany self-sufficient, and to a detailed accounting of industrial plants, goods and infrastructure—some of which had already been removed by the Soviets. France, the United Kingdom, the United States, and the Benelux countries later met to combine the non-Soviet zones of Germany into one zone for reconstruction, and to approve the extension of the Marshall Plan.
Eastern Bloc and the Berlin airlift.
Following the defeat of Nazi Germany in World War II, the Soviet Union engineered the installation of communist regimes in most of the countries occupied by Soviet military forces at the end of the war, including Poland, Hungary, Czechoslovakia, Bulgaria, Romania, and the GDR, which together with Albania formed the Comecon in 1949 and later a military alliance, the Warsaw Pact. The beginning of the Cold War saw the Eastern Bloc of the Soviet Union confront the Western Bloc of the United States, with the latter grouping becoming largely united in 1949 under NATO and the former grouping becoming largely united in 1955 under the Warsaw Pact. As the Soviet Union already had an armed presence and political domination all over its eastern satellite states by 1955, the pact has been long considered "superfluous", and because of the rushed way in which it was conceived, NATO officials labeled it a "cardboard castle". There was no direct military confrontation between the two organizations; instead, the conflict was fought on an ideological basis and through proxy wars. Both NATO and the Warsaw Pact led to the expansion of military forces and their integration into the respective blocs. The Warsaw Pact's largest military engagement was the Warsaw Pact invasion of Czechoslovakia, its own member state, in August 1968.
Since the end of the war, the USSR installed a Soviet-style regime in the Soviet occupation zone of Germany and later founded the GDR, with the country's political system based on a centrally planned socialist economic model with nationalized means of production, and with repressive secret police institutions, under party dictatorship of the SED (Sozialistische Einheitspartei Deutschlands; Socialist Unity Party of Germany) similar to the party dictatorship of the Soviet Communist Party in the USSR.
At the same time, a parallel country was established under the control of the Western powers in the zones of post-war Germany occupied by them, culminating in the foundation of the Federal Republic of Germany in 1949, which initially claimed to be the sole legitimate power in all of Germany, East and West. The material standard of living in the Western zones of Berlin began to improve quickly, and residents of the Soviet zone soon began leaving for the West in large numbers, fleeing hunger, poverty and repression in the Soviet Zone for a better life in the West. Soon residents of other parts of the Soviet zone began to escape to the West through Berlin, and this migration, called in Germany "Republikflucht", deprived the Soviet zone not only of working forces desperately needed for post-war reconstruction but disproportionately of highly educated people, which came to be known as the "Brain Drain".
In 1948, in response to moves by the Western powers to establish a separate, federal system of government in the Western zones, and to extend the US Marshall Plan of economic assistance to Germany, the Soviets instituted the Berlin Blockade, preventing people, food, materials and supplies from arriving in West Berlin by land routes through the Soviet zone. The United States, the United Kingdom, France, Canada, Australia, New Zealand and several other countries began a massive "airlift", supplying West Berlin with food and other supplies. The Soviets mounted a public relations campaign against the Western policy change. Communists attempted to disrupt the elections of 1948 and then suffered large losses in them, while 300,000 Berliners demonstrated for the international airlift to continue. In May 1949, Stalin lifted the blockade, permitting the resumption of Western shipments to Berlin.
The German Democratic Republic (the "GDR"; East Germany) was declared on 7 October 1949. On that day, the USSR ended the Soviet military government which had governed the Soviet Occupation Zone (Sowjetische Besatzungszone) since the end of the war and handed over legal power to the Provisorische Volkskammer under the new Constitution of the GDR, which came into force that day. However, until 1955, the Soviets maintained considerable legal control over the GDR state, including the regional governments, through the Sowjetische Kontrollkommission and maintained a presence in various East German administrative, military, and secret police structures. Even after legal sovereignty of the GDR was restored in 1955, the Soviet Union continued to maintain considerable influence over administration and lawmaking in the GDR through the Soviet embassy and through the implicit threat of force, which could be exercised through the continuing large Soviet military presence in the GDR and which was used to bloodily suppress protests in East Germany in June 1953.
East Germany differed from West Germany (Federal Republic of Germany), which developed into a Western capitalist country with a social market economy and a democratic parliamentary government. Continual economic growth starting in the 1950s fueled a 20-year "economic miracle". As West Germany's economy grew, and its standard of living steadily improved, many East Germans wanted to move to West Germany.
Emigration westward in the early 1950s.
After the Soviet occupation of Eastern Europe at the end of World War II, the majority of those living in the newly acquired areas of the Eastern Bloc aspired to independence and wanted the Soviets to leave. Taking advantage of the zonal border between occupied zones in Germany, the number of GDR citizens moving to West Germany totaled 187,000 in 1950; 165,000 in 1951; 182,000 in 1952; and 331,000 in 1953. One reason for the sharp 1953 increase was fear of potential further Sovietization, given the increasingly paranoid actions of Joseph Stalin in late 1952 and early 1953. In the first six months of 1953, 226,000 had fled.
Erection of the inner German border.
By the early 1950s, the Soviet approach to controlling national movement, restricting emigration, was emulated by most of the rest of the Eastern Bloc, including East Germany. The restrictions presented a quandary for some Eastern Bloc states, which had been more economically advanced and open than the Soviet Union, such that crossing borders seemed more natural—especially where no prior border existed between East and West Germany.
Up until 1952, the demarcation lines between East Germany and the western occupied zones could be easily crossed in most places. On 1 April 1952, East German leaders met the Soviet leader Joseph Stalin in Moscow; during the discussions, Stalin's foreign minister Vyacheslav Molotov proposed that the East Germans should "introduce a system of passes for visits of West Berlin residents to the territory of East Berlin [so as to stop] free movement of Western agents" in the GDR. Stalin agreed, calling the situation "intolerable". He advised the East Germans to build up their border defenses, telling them that "The demarcation line between East and West Germany should be considered a border—and not just any border, but a dangerous one ... The Germans will guard the line of defence with their lives."
Consequently, the inner German border between the two German states was closed, and a barbed-wire fence erected. The border between the Western and Eastern sectors of Berlin, however, remained open, although traffic between the Soviet and the Western sectors was somewhat restricted. This resulted in Berlin becoming a magnet for East Germans desperate to escape life in the GDR, and also a flashpoint for tension between the United States and the Soviet Union.
In 1955, the Soviets gave East Germany authority over civilian movement in Berlin, passing control to a regime not recognized in the West. Initially, East Germany granted "visits" to allow its residents access to West Germany. However, following the defection of large numbers of East Germans (known as "Republikflucht") under this regime, the new East German state legally restricted virtually all travel to the West in 1956. The Soviet ambassador to East Germany, Mikhail Pervukhin, observed that "the presence in Berlin of an open and essentially uncontrolled border between the socialist and capitalist worlds unwittingly prompts the population to make a comparison between both parts of the city, which unfortunately does not always turn out in favour of Democratic [East] Berlin."
Berlin emigration loophole.
Although the inner German border was officially closed in 1952, the border in Berlin remained considerably more accessible because it was administered by all four occupying powers. Accordingly, Berlin became the main route by which East Germans left for the West. On 11 December 1957, East Germany introduced a new passport law that reduced the overall number of refugees leaving East Germany.
It had the unintended result of drastically increasing the percentage of those leaving through West Berlin from 60% to well over 90% by the end of 1958. Those caught trying to leave East Berlin were subjected to heavy penalties, but with no physical barrier and subway train access still available to West Berlin, such measures were ineffective. The Berlin sector border was essentially a "loophole" through which Eastern Bloc citizens could still escape. The 3.5 million East Germans who had left by 1961 totalled approximately 20% of the entire East German population.
An important reason that passage between East Germany and West Berlin was not stopped earlier was that doing so would cut off much of the railway traffic in East Germany. Construction of a new railway bypassing West Berlin, the Berlin outer ring, commenced in 1951. Following the completion of the railway in 1961, closing the border became a more practical proposition.
Brain drain.
The emigrants tended to be young and well-educated, leading to the "brain drain" feared by officials in East Germany. Yuri Andropov, then the CPSU Director on Relations with Communist and Workers' Parties of Socialist Countries, wrote an urgent letter on 28 August 1958 to the Central Committee about the significant 50% increase in the number of East German intelligentsia among the refugees. Andropov reported that, while the East German leadership stated that they were leaving for economic reasons, testimony from refugees indicated that the reasons were more political than material. He stated "the flight of the intelligentsia has reached a particularly critical phase."
By 1960, the combination of World War II and the massive emigration westward left East Germany with only 61% of its population of working age, compared to 70.5% before the war. The loss was disproportionately heavy among professionals: engineers, technicians, physicians, teachers, lawyers, and skilled workers. The direct cost of manpower losses to East Germany (and corresponding gain to the West) has been estimated at $7 billion to $9 billion, with East German party leader Walter Ulbricht later claiming that West Germany owed him $17 billion in compensation, including reparations as well as manpower losses. In addition, the drain of East Germany's young population potentially cost it over 22.5 billion marks in lost educational investment. The brain drain of professionals had become so damaging to the political credibility and economic viability of East Germany that the re-securing of the German communist frontier was imperative.
The exodus of emigrants from East Germany presented two minor potential benefits: an easy way to smuggle East German secret agents to West Germany, and a reduction in the number of citizens hostile to the communist regime. Neither of these advantages, however, proved particularly useful.
Start of the construction (1961).
On 15 June 1961, First Secretary of the Socialist Unity Party and GDR State Council chairman Walter Ulbricht stated in an international press conference, "No one has the intention of erecting a wall!" It was the first time the colloquial term for "wall" had been used in this context.
The transcript of a telephone call between Nikita Khrushchev and Ulbricht, on 1 August in the same year, suggests that the initiative for the construction of the Wall came from Khrushchev. However, other sources suggest that Khrushchev had initially been wary about building a wall, fearing negative Western reaction. Nevertheless, Ulbricht had pushed for a border closure for some time, arguing that East Germany's existence was at stake.
Khrushchev had become emboldened upon seeing US president John F. Kennedy's youth and inexperience, which he considered a weakness. At the 1961 Vienna summit, Kennedy made the error of admitting that the US would not actively oppose the building of a barrier. A feeling of miscalculation and failure immediately afterwards was admitted by Kennedy in a candid interview with "New York Times" columnist James "Scotty" Reston. On Saturday, 12 August 1961, the leaders of the GDR attended a garden party at a government guesthouse in a wooded area to the north of East Berlin. There, Ulbricht signed the order to close the border and erect a wall.
At midnight, the police and units of the East German army began to close the border and, by Sunday morning, 13 August, the border with West Berlin was closed. East German troops and workers had begun to tear up streets running alongside the border to make them impassable to most vehicles and to install barbed wire entanglements and fences along the border around the three western sectors and along the line that divided West and East Berlin. The date of 13 August became commonly referred to as Barbed Wire Sunday in Germany.
The barrier was built inside East Berlin on East German territory to ensure that it did not encroach on West Berlin at any point. Generally, the Wall was only slightly inside East Berlin, but in a few places it was some distance from the legal border, most notably at Potsdamer Bahnhof and the Lenné Triangle that is now much of the Potsdamer Platz development.
Later, the initial barrier was built up into the Wall proper, the first concrete elements and large blocks being put in place on 17 August. During the construction of the Wall, National People's Army (NVA) and Combat Groups of the Working Class (KdA) soldiers stood in front of it with orders to shoot anyone who attempted to defect. Additionally, chain fences, walls, minefields and other obstacles were installed along the length of East Germany's western border with West Germany proper. A wide no man's land was cleared as well to provide a better overview and a clear line of fire at fleeing refugees.
Immediate effects.
United States and UK sources had expected the Soviet sector to be sealed off from West Berlin but were surprised by how long the East Germans took for such a move. They considered the Wall as an end to concerns about a GDR/Soviet retaking or capture of the whole of Berlin; the Wall would presumably have been an unnecessary project if such plans were afoot. Thus, they concluded that the possibility of a Soviet military conflict over Berlin had decreased.
The East German government claimed that the Wall was an "anti-fascist protective rampart" intended to dissuade aggression from the West. Another official justification was the activities of Western agents in Eastern Europe. The East German government also claimed that West Berliners were buying out state-subsidized goods in East Berlin. East Germans and others greeted such statements with skepticism, as most of the time the border was closed only for citizens of East Germany traveling to the West, not for residents of West Berlin travelling to the East. The construction of the Wall caused considerable hardship to families divided by it. Most people believed that the Wall was mainly a means of preventing the citizens of East Germany from entering or fleeing to West Berlin.