Source: ITU News, "News", https://www.itu.int/itunews/issue/1999/06/news.html (FactBench record correct_foundationPlace_00083)
Beware of the GPS week number roll-over problem on 22 August 1999. The year 2000 problem, with the year rolling over from 99 to 00, has received much attention. But is that also true of the global positioning system (GPS) week number roll-over problem? In telecommunications, GPS is used in some places to synchronize the timing or frequency of systems. So, just as with the year 2000 problem, each user should check whether their GPS-based system will work correctly on 22 August 1999, the first day on which the GPS week number rolls over from 1023 to 0. The reason is the limited number of bits (10) used in the GPS navigation message. If the software in the GPS receiver is not corrected, the receiver will internally interpret the new week 0 as 6 January 1980. It might then stop tracking any satellite, and the systems relying on it might perform worse or even lose synchronization. — Working Party 7A, Study Group 7 (Science services), Radiocommunication Sector.

tele.ring and Alcatel sign contract for nationwide GSM network in Austria. tele.ring — majority owned by Mannesmann — has awarded Alcatel a major contract for the turnkey delivery of key elements of its new GSM 1800 network in Austria, valued at FRF 1.9 billion. Deliveries will start immediately in order to meet the planned service launch within less than 12 months. The contract includes the complete EVOLIUM radio solutions (GSM base station subsystems and microwave transmission), the comprehensive GPRS (general packet radio service) system for high-speed data transmission over GSM, the intelligent network platform and project management. — Alcatel.

Analysys launches newentrants.com. A fully interactive Web service tracking telecommunication entrants in the Western European fixed telephony market has been launched by telecommunications consultancy Analysys.
The new site (www.newentrants.com), which is updated quarterly, features profiles of over 300 new entrants across 16 countries and contains contact details for more than 500 companies. The former monopoly operators are also profiled, enabling users to track the performance of all the key players in the business. In addition, the site includes operator benchmarks, providing the ability to compare and contrast key financial and operational statistics for the major new network operators and the incumbent operators, and a share indices service, updated daily, which enables users to compare the individual share performance of the major listed network operators. — Analysys.

Alcatel to sell European transmission assets of DSC Communications Corporation to Tellabs, Inc. Alcatel and Tellabs, Inc., an American supplier of telecommunications equipment, have signed an agreement under which Alcatel is selling to Tellabs the European transmission assets of the former DSC Communications Corporation for some USD 110 million. As part of the deal, both parties also entered into a patent cross-licensing agreement under which they will license their respective patents to one another. In particular, Tellabs will benefit over the next four years from patent licences covering Alcatel's extensive range of SDH, SONET and WDM technologies. Alcatel acquired DSC Communications Corporation in 1998 and has integrated its American activities within Alcatel USA. — Alcatel.

CNN news via mobile phone. CNN Interactive and Nokia have developed an extensive news and information service delivered via mobile phone. The CNN Mobile service will become available in Europe, the Near East, Africa and Asia, and its news content varies by region. CNN news comes in the form of phone text messages. By the end of February, nine mobile phone operators had signed an agreement on this form of news transmission.
The service was to be offered first in Finland, and later by other operators by the end of June 1999. CNN Mobile is the first worldwide value-added service based on the Wireless Application Protocol (WAP). WAP is an open standard that enables the offering of so-called value-added services via digital wireless mobile phone networks. In addition to news services, value-added services include weather reports, stock prices, electronic mail, flight schedules and banking services. — Finnfacts.

BT to build nationwide Internet telephony and multimedia network in Spain. BT has announced it is to create a purpose-built nationwide Internet telephony and multimedia network in Spain. The network will use the latest Internet Protocol (IP) technology from Nortel Networks and will form part of a multi-million pound investment by BT in Spain. The company has already invested more than GBP 400 million in its various fixed, mobile and Internet interests, and plans to spend at least an additional GBP 600 million over the next ten years. The IP network will provide national coverage from 12 main nodes this year, with plans to expand the network by 27 main nodes within three years. — BT.

Ascom to renew New Zealand's entire payphone network. Between July and December 1999, Telecom New Zealand will be replacing its existing network of public telephones with 6000 chip card payphones supplied by Ascom, including an off-line management system. This network will be the second, after Hungary's, to benefit from a comprehensive security system protecting the operator against fraudulent card use. The contract is valued at FRF 45 million. — Ascom.

Turner Broadcasting signs long-term lease with AsiaSat. Asia Satellite Telecommunications Company Limited has announced the signing of a long-term lease with Turner Broadcasting System Inc. (TBS) for a full 36 MHz C-band transponder on AsiaSat-3S.
The capacity will be used to broadcast to Asia a number of digital television channels, including CNN International, TNT and Cartoon Network. TBS, a subsidiary of Time Warner Inc., is a producer of news and entertainment around the world and a leading provider of programming for the cable industry. — AsiaSat.

NTT to aid telecommunication reconstruction in Honduras. NTT will help Honduras rebuild its telecommunications infrastructure following the devastating hurricane Mitch (November 1998) — the killer storm which left nearly 15 000 people either dead or missing. Some 1.4 million Honduran residents are still suffering. Aid from NTT is focused on the provision of essential telecommunications equipment. Transmission cables donated by the corporation include 10 km of optical fibre cable, 9 km of aerial cable, 265 km of jumper wire, 7 km of lead-in wire and 3 km of inside wire. The total value of NTT's aid is about JPY 9 million. NTT is providing the aid through the Basic Human Needs Telecom Committee, a non-governmental organization based in Japan that offers humanitarian aid focusing on telecommunications. Cables and other equipment from NTT were to arrive in Honduras in May 1999. — NTT.

Ascom to install 7000 payphones in El Salvador and Guatemala. Already present in Mexico, Colombia and Argentina, Ascom has confirmed its presence in Latin America in the public telephone sector through new contracts in these two countries. Ascom will be installing 7000 Proxim card payphones in El Salvador and Guatemala, together with their supervision system, both orders having been placed by the national telecommunication operator. In Guatemala, 500 payphones had been installed by January 1999 and the remaining units will be supplied at a rate of 1000 per month. In El Salvador, 4000 payphones will be installed during 1999 for CTE Antel, a company in which France Télécom recently took a controlling interest. — Ascom.

Americans said to pay more for their calls than many Europeans.
Recent studies by United Kingdom-based Eurodata Foundation reveal that, overall, most households in the United States are in reality paying more for their telephone service usage than Europeans — substantially so, in many cases. Eurodata's report Competition and tariffs in the USA challenges the perception that all calls in America are markedly cheaper than those made by Europeans. The report explains in detail, with supporting basket analysis, the structure of tariffs for PSTN services, private lines, ISDN, ATM and frame relay services. It is an essential guide for anyone who wishes to understand and analyse the cost of service in this most complex of markets. Traditionally, United States local loop operators include an element to cover the costs of local calls within the "flat rate" element of their monthly bill. The effect is that United States residential users do not see the true cost of the local calls they make. In short, Americans perceive their local calls to be free. This perception influences their views of the charges they pay for non-local calls, creating an assumption that all calls in the United States are exceptionally good value. — Eurodata Foundation.

BT and Sega to provide subscription-free television Internet access across Europe. BT has announced that it has been chosen by Sega, the global electronics company, to provide a pan-European subscription-free Internet access service for users of Sega Dreamcast. The new Dreamcast games console, to be launched in September 1999, will provide users subscription-free Internet access through their televisions on a pay-as-you-go basis via BT's network of European partners. This is the first time that subscription-free Internet access has been made available across Europe in this way. The dial IP service will be available from BT in the United Kingdom and from its European partners in France, Germany, Italy and Spain. BT and Sega expect to roll out the service to other European countries later in the year.
The new Dreamcast console will include a 33.6 kbit/s modem and a browser, and will also enable customers to access E-mail, chat and on-line gaming services. Future plans include the offer of on-line shopping facilities. Dreamcast is expected to retail at around GBP 199. — BT.

New services for Greece. Greece's State network operator OTE is now adding intelligent services to its network to make itself more competitive before the full opening of its national telecommunications market in the year 2001. It is against this background that Siemens has obtained an order worth some DEM 23 million to equip OTE's telephone network with the Siemens platform for intelligent networks, INXpress. This means that OTE will be able to launch Siemens services on its network at the same time as developing new services, which can also be tested and operated in heterogeneous network architectures with components from various manufacturers of telecommunication devices. Once the intelligent network has been fully set up, OTE customers will be able to make use of services such as universal access number, premium rate, televoting, universal personal telecommunications, virtual private network and prepaid card. — Siemens.

GTS announces majority stake in Omnicom. Global TeleSystems Group, Inc. has announced that it has acquired a total of 1 877 608 common shares of Omnicom, representing 98.6 per cent of Omnicom's common share capital. This concludes the offer for those common shares and convertible bonds of Omnicom not previously held by GTS. — GTS.

INTELSAT provides first satellite link to Internet II backbone. The International Telecommunications Satellite Organization has announced that it will be providing capacity to one of the alternate carriers in Israel, Israsat, a subsidiary of Gilat Communications, to connect to the Internet II backbone.
The link, which is the first satellite link to the Internet II network, will connect the Inter-Universities Computation Center (IUCC) in Israel with academic institutions in the United States. Service was scheduled to start in June 1999. Internet II is an initiative by a number of universities worldwide, in conjunction with government and industry partners, to accelerate the next stage of Internet development in academia. This private network will operate at speeds several times faster than today's Internet. The duplex service will be carried over the Intelsat-801 satellite at 328.5° E and will initially operate at 45 Mbit/s using two 36 MHz transponders. This is expected to be upgraded to a duplex 155 Mbit/s service. — INTELSAT.

NTT Communications Corporation, a new company. Nippon Telegraph and Telephone Corporation has established NTT Communications Corporation, following approval of its reorganization plan by the Minister of Posts and Telecommunications. An agreement to transfer business from NTT to NTT Communications was also signed by the two companies. The new company was to commence operations on 1 July 1999. — NTT.

Alcatel and Compaq announce marketing initiative for ADSL in Europe. Alcatel and Compaq Computer Corporation are embarking on a series of joint marketing ventures to encourage the commercial roll-out of ADSL (asymmetric digital subscriber line) equipment and services among operators and consumers in Europe. Initial target countries are the United Kingdom, Belgium, Italy and Sweden. — Alcatel/Compaq.

Columbia signs Justice Technology to "Columbia-515". Columbia Communications Corporation has announced a new lease of 36 MHz West-West capacity on the Columbia-515 satellite to Justice Technology. The lease will support voice and data, using the Internet protocol, between the United States and several sites in Latin America. — Columbia.

Comverse unveils Internet call waiting service.
Comverse Network Systems, a division of Comverse Technology, Inc., has unveiled its Internet call waiting service, which will give residential voice messaging and unified messaging (voice, fax and E-mail in a single mailbox) subscribers a virtual telephone line while they are online. This new service is activated automatically, as soon as the subscriber connects to the Internet. When a call comes in to the designated phone, a message pops up on the computer screen with the name and number of the caller. The subscriber can then decide whether to accept the call, route the caller to voice mail, redirect the call to another number, play a message for the caller, or simply ignore the call. — Comverse.

EMS Technologies to supply wireless products to Nortel Networks for multimedia satellite link. EMS Technologies Canada, Ltd. has received a contract from Nortel Networks to supply advanced communications hardware and software for the Astra satellite network of Société européenne des satellites (SES) and its advanced return channel system (ARCS). ARCS will be the world's first broadband system providing direct two-way Internet access via satellite. The multi-frequency, time-division multiple-access (MF-TDMA) architecture of the ARCS programme is a major advance in satellite communications technology. Traditionally, satellites have received uplink signals from a limited number of earth stations. Under this new architecture, small terminals in hundreds of thousands of homes and offices will be able to communicate directly with a central hub via the Ka/Ku-band satellite. The satellite access units will efficiently share satellite bandwidth through software developed by EMS especially for the multimedia traffic environment. — EMS.

ICO Global Communications sponsor of TELECOM 99 + Interactive 99.
ICO Global Communications will sponsor the opening ceremony of TELECOM 99 + Interactive 99, to be organized by the International Telecommunication Union (ITU) in Geneva from 10 to 17 October. This ceremony will be held on 9 October, from 14h00 to 16h30, and will launch the event on the theme "Join the world" and start off the programme of speakers. About 200 000 visitors and 3500 official participants are expected from 150 countries. The TELECOM exhibition and Forum are held every four years. They are regarded as the world's most important event in the field of telecommunications. Ericsson, NEC and Hughes Network Systems are ICO's partners in sponsoring the opening ceremony. — ICO.

WorldTel awarded BOT concession in Zimbabwe. WorldTel has been awarded a build-operate-transfer concession in Zimbabwe, following a joint venture agreement between the Posts and Telecommunications Corporation (PTC) and WorldTel, with the express authorization of the Government. WorldTel has established a local private company, WorldTel Zimbabwe Limited (WTZ), to build and operate the fixed-line network. WorldTel is planning to install 150 000 lines in Mashonaland and Victoria provinces, including Harare. Both the project and its business plan were developed with the assistance of the International Telecommunication Union, which had earlier drawn up the national master development plan. The estimated investment is over USD 115 million. The project, which will expand existing capacity by 75 per cent, aims to meet the present enormous pent-up demand in the country and seeks to reduce the waiting list for additional lines to zero over the next few years. WTZ positions itself as a friendly, complementary service provider to PTC and not as its competitor. The company intends to use existing national and international network system capacity wherever possible.
In return for the in-kind contribution that PTC makes and for the operating opportunity given to WTZ, the former will obtain an equity interest and share part of the revenue to be collected from subscribers. WorldTel has undertaken to give priority to Zimbabwean institutional investors to place funds in WTZ in the form of debt and equity. The balance of the required investment will be raised from WorldTel's own shareholders and other international institutional investors and banks. — WorldTel.

Analysts foresee radical restructuring in telecommunications as bandwidth economics are redefined. International telecommunications analyst group Ovum has warned of a radical redefinition of the global telecommunications landscape as the industry moves from a business model where bandwidth is scarce and expensive to one where bandwidth is potentially plentiful and, therefore, extremely cheap. Ovum predicts that this change will invert existing pricing models, create new market structures with new types of players, and transform the operational and organizational make-up of today's operators. Increased supplies are driving dramatic reductions in price. But as prices fall, what happens to the commercial viability of the networks? Will increased supply stimulate enough extra demand to fill the networks at commercially sustainable prices? Ovum points out in its report The bandwidth explosion that many new networks are built on the premise of "if you build it, they will come" — i.e. increases in the supply of capacity will always be filled by growth in demand. But Ovum believes that the key issue is not simply whether or not demand will continue to grow rapidly — it evidently will. Rather, the success of many new and existing bandwidth providers will be determined by how quickly this happens and whether or not some operators will be left with a capacity "glut", i.e. large amounts of capacity that they cannot sell. — Ovum.

GTE/Bell Atlantic merger.
GTE Corporation shareholders have approved the company's merger with Bell Atlantic. The vote was announced at the GTE annual shareholder meeting in Atlanta (Georgia). According to the company, 96.7 per cent of the votes submitted were in favour of the transaction. The United States Department of Justice cleared the merger after an exhaustive review, finding no competitive issues in the companies' vast combined wireline businesses. — GTE.

LMI to begin communication services. Lockheed Martin Intersputnik (LMI), a joint venture of Lockheed Martin Global Telecommunications and the International Organization of Space Communications (INTERSPUTNIK), has confirmed that it will launch its initial satellite, LMI-1, and initiate communication services by the end of September 1999. In a statement issued following a board meeting, the joint venture said its LMI-1 spacecraft is completing production at Lockheed Martin Commercial Space Systems in Sunnyvale (California) and has successfully completed thermal vacuum testing. The satellite will be ready for shipment to the Baikonur Cosmodrome launch site before the end of July 1999 and is manifested for launch by International Launch Services aboard a Proton launch vehicle in late August/early September 1999. LMI-1 will provide communication services to Eastern Europe, South and South-East Asia, parts of Africa and the Middle East, and the Commonwealth of Independent States. The spacecraft is the first of four planned advanced-technology satellites that are to provide comprehensive coverage and connectivity to business and residential customers throughout the world. — Lockheed Martin Global Telecommunications/INTERSPUTNIK.

Columbia signs COMSAT Mobile for service to cruise ships. Columbia Communications has signed an agreement with COMSAT Mobile Communications to provide space segment in support of communication services (voice and data) for two cruise ships: QE2 and Century.
The expansive C-band coverage of Columbia's TDRS-6 satellite, combined with its competitive pricing, makes the Columbia satellite system the ideal choice for C-band maritime services. A single TDRS-6 transponder delivers simultaneous full coverage of the Caribbean and Mediterranean regions, providing the most efficient space segment service for cruise ships and maritime service providers operating in these areas. — Columbia.

A benchmark industry study of OSS investment points to pent-up demand for integrated solutions. TeleManagement Forum, a communications industry consortium of service providers and solution suppliers, has announced the availability of a benchmark study of the global market for operational support systems (OSS). The study shows a 1998 world market of USD 16.5 billion for these business-critical systems, documents current purchasing habits among service providers, and identifies a large, fast-growing market that is not yet well understood but that is going to create new winners and losers across the industry. Entitled The communications industry market for OSS, the study was commissioned by TeleManagement Forum and conducted by Ovum, Ltd. While time-critical pressures to improve efficiency drive service providers to look for turnkey solutions, issues of interoperability are preventing the industry from realizing the goal of fully automated and integrated systems. For a sufficient number of products to emerge, the industry must rally around a few key agreements that will enable common, reusable, off-the-shelf solutions and software components to be built. — Ascom SA.

Structural changes
In Kenya: Kenya Posts and Telecommunications Corporation has ceased to be the signatory of Kenya. The Communications Commission of Kenya is the current designated signatory to the ITU on all telecommunications regulatory matters.
In Monaco: A Directorate for Licensing and Telecommunications Control was established by Decree No. 13.633 of 25 September 1998.
This Directorate is placed under the authority of the Government Counsellor for Public Works and Social Affairs.

Personnel changes
In Bangladesh: Mr Md. Ashraful Alim has been appointed Director (International), Bangladesh Telegraph and Telephone Board.
In Brazil: Mr João Pimenta da Veiga Filho has been appointed Minister of Communications.
In Burkina Faso: Mr Justin T. Thiombiano has been appointed State Secretary for Telecommunications, Ministry of Communication and Culture.
In Finland: Mr Olli-Pekka Heinonen has been appointed Minister of Transport and Communications.
In Ghana: Mr John D. Mahama has been appointed Minister of Communications.
In Kenya: Mr William Ole Ntimama has been appointed Minister of Transport and Communications. Messrs Karanja Kabage and Samuel Kiprono Chepkong'a have been appointed Chairman and Director-General, Communications Commission of Kenya, respectively.
In Lebanon: Mr Issam Naaman has been appointed Minister of Post and Telecommunications.
In Monaco: Mr Michel Levêque has been appointed State Minister and Director of the External Relations Service. Messrs Raoul Viora and Christian Palmaro have been appointed Director and Chief of the Telecommunication Division, Directorate for Licensing and Telecommunications Control, respectively.
In the Netherlands: Ms T. Netelenbos has been appointed Minister of Transport, Public Works and Water Management. Ms Monique de Vries has been appointed Secretary of State, Ministry of Transport, Public Works and Water Management. Mr J. W. Weck has been appointed Director-General, Telecommunications and Posts Department.
In Poland: Mr Maciej Srebro has been appointed Minister of Posts and Telecommunications. Messrs Miroslaw Marcinkiewicz and Wladyslaw Wilkans have been appointed Director-General and Director, Department of European Integration, Negotiations and International Relations, Ministry of Posts and Telecommunications, respectively.
In Saudi Arabia: Mr Fareed Y. Khashoggi has been appointed Director-General, International Affairs, Ministry of Post, Telegraph and Telephone.
— ITU Notification No. 1378.
Source: NoCamels, "10 Amazing Female Israeli Tech Leaders Who Put Quality In Equality" by Sophie Imas, 11 May 2014, https://nocamels.com/2014/05/top-10-israeli-women-who-put-the-quality-in-equality/ (FactBench record correct_foundationPlace_00083)
James Brown sang “It’s a man’s world,” but as these 10 Israeli women show, that couldn’t be farther from the truth. NoCamels has sifted through the Startup Nation to bring you the top ten key female figures in Israel’s booming tech scene. And if you think we’ve missed other bright and bold female leaders, tell us in the comments below.

Dr. Kira Radinsky – Next-generation oracle
This 27-year-old has had the world watching her with wide eyes ever since she used her unique software to predict Cuba’s first cholera outbreak in 130 years, several months before it occurred. By using complex algorithms, this young prodigy developed software that is capable of predicting the future. At the age of eight, while most children were still playing in the sandbox, Radinsky had already started coding, taken extra courses at a special Russian after-school program in Israel, and earned a black belt in karate. Radinsky then began studying at the Technion, Israel’s Institute of Technology, at the age of 15, and had completed her PhD in data mining by the time she was 26, taking a break in the middle to serve as a software engineer in the army. Staying true to her passion for predictive analytics, Radinsky founded SalesPredict in 2012, a company that advises salespeople on how to identify and handle promising leads. Already, Radinsky’s keen foresight has earned her a spot on MIT’s list of Top 35 Innovators Under 35. With such a lengthy list of accomplishments at the mere age of 27, only time, or perhaps her software, will tell where Radinsky’s genius will lead her next.

Sivan Borowich Ya’ari – Bringing Israeli innovation to Africa
Sivan Borowich-Ya’ari has been branded the “Israeli sunshine girl” for a reason: in 2008, Borowich-Ya’ari founded Innovation:Africa, a non-profit organization bringing Israeli clean technologies in solar energy and water to regions in need throughout Africa.
Borowich-Ya’ari’s connection to Africa is deep-seated: her father and mother emigrated to Israel from the northern African countries of Algeria and Tunisia. So far, the organization has carried out 68 projects in Ethiopia, Uganda, Malawi and Tanzania, providing more than 450,000 people with electricity through sustainable energy sources, enough to power schools, orphanages, medical clinics and water pumping systems. For its efforts, the organization has obtained consultative status at the UN Economic and Social Council (ECOSOC) and received the UN Innovation Award. Not only has Borowich-Ya’ari taken on the task of helping communities in Africa obtain vital resources, this philanthropist operates under the declared mission of using Israeli innovation to change the world for the better.

Dr. Michal Tsur – The “Iron Lady” of Israeli innovation
Dr. Michal Tsur, nicknamed the “Iron Lady” of Israeli startups, has already accomplished more than most people hope to achieve in a lifetime. With a successful startup exit under her belt, she is now the co-founder and president of a multi-million dollar company. But Tsur ended up in the startup world entirely by accident, after earning a doctoral degree in the application of game theory to law from NYU and a post-doctoral fellowship from Yale. Her lack of coding knowledge didn’t stop her from co-founding Cyota, an online security firm (which sold for $145 million just a few years after its creation), only to move on to co-found another leading Israeli company, Kaltura, an open source video platform that collaborates with big names such as AT&T, Disney and ABC. But Tsur’s success doesn’t stop there: she is also an advisor and angel investor in several successful startups, and is a pro at juggling her family life with a fast-paced, intercontinental business routine. With all this under her belt, even Kira Radinsky might find it hard to predict what Tsur will bring to the table next.
Malvina Goldfeld – Breaking through cultural boundaries

Malvina Goldfeld is the definition of cosmopolitan: she was born in Moldova, grew up in Israel, studied in Canada, the US, Japan and Chile, and has worked in Ukraine, Italy, the UAE and Vietnam. Fluent in five languages (English, Hebrew, Russian, Spanish and Portuguese), Goldfeld has broken through cultural and business boundaries, holding a number of prestigious positions at large corporations worldwide while fostering her passion for learning new languages and reading literature. Since July 2013 (when she had not yet turned 30), Goldfeld has been the Head of Business Development for Africa at PayPal, working to connect sub-Saharan Africa to the global marketplace. Previously, Goldfeld worked as a Vice President at Battery Ventures, a global venture capital fund focusing on investments in e-commerce and internet applications, and as a business analyst for McKinsey & Company in Israel and Eastern Europe. Goldfeld began to build her extensive resume as a camp counselor at Seeds of Peace, a summer program working to promote dialogue and build bridges between Israeli and Palestinian youth. In 2013, she was one of the speakers at TEDxTelAvivWomen, a conference of the leading women in Israeli innovation, and in her spare time (apparently she has some), she mentors young girls in the coastal city of Bat Yam.

Orit Hashay – Serial entrepreneur

This serial entrepreneur is no stranger to the startup world, having joined the industry over 10 years ago. Perhaps the very epitome of an entrepreneur, Hashay hasn’t rested a moment since she entered the market. Armed with 15 years of experience in software and business development for companies like Emblaze and Comverse, Hashay has a knack for providing tech-savvy solutions to women’s daily problems.
So far, she’s founded mit4mit.com (a popular Israeli wedding-planning portal), Ramkol.co.il (a business review site), Brayola (a personalized, crowdsourced form of online bra shopping) and Vetrinas (a virtual shopping site). And her success hasn’t gone unnoticed: in 2012 alone, she was included both in the Top 50 Most Influential Women in Israel list compiled by “Globes” and in the Top 100 Girls in European Tech by the international Girls in Tech Network.

Daphna Nissenbaum – Putting an end to plastic

Although she started out as a computer engineer, this Israeli mom couldn’t shake her passion for sustainability and the environment. After holding management positions at companies in Israel and the United States, Nissenbaum founded Tipa, a green-tech startup, in 2010. The company’s aim is to fulfill the growing need for genuinely ecologically sensitive food and beverage packaging. This savvy businesswoman developed 100 percent biodegradable and compostable packaging that decomposes within 180 days and acts as a seamless alternative to regular plastic, cardboard or aluminum packaging. So far, Nissenbaum and Tipa’s efforts to contribute to the green revolution have been met with impressive international recognition. In 2012, Tipa won first prize at Israel’s Cleantech competition out of 50 promising companies, as well as the Anuga FoodTec prize at a leading food-industry packaging conference in Germany. And just last month, Calcalist reported that Tipa is raising $10 million, led by Chinese billionaire Li Ka-shing’s Horizons Ventures and joined by existing investors Aviv Ventures and GreenSoil.

Amit Knaani – Giving kids access to tech

Amit Knaani began her journey into the world of startup development at what some would call the top. She started off as the Senior Product Manager at Wix.com, the world’s leading platform for website development, and then quit her prestigious job to join BabyFirstTV, a TV network for toddlers, as Product Manager in 2010.
Together with Yami Glick, this mother of two then created her own startup in 2012 called Vikido, a video messaging service designed to let kids and parents communicate via video messages using an interface with no reading prerequisites. At the same time, she worked as Head of Product for ooVoo, a free video chat and instant messaging application that has won a number of awards from “PC Magazine” since its founding. Today, she is the Vice President of Product at another up-and-coming website creation company called Duda, which specializes in building mobile websites for businesses free of charge.

Triinu Magi – Connecting with the spirit of Israeli innovation

While she originally hails from Estonia, Triinu Magi has successfully caught the Israeli innovation bug. An expert data and security analyst and software architect by trade, Magi recently co-founded Neura, a system that allows users to profile their connected environment and create logical connections between devices, people, locations and the web. Before Neura, Magi headed emerging technologies and fraud-detection models at RSA, a leading global software corporation. She also previously worked as a data analyst and project manager at Estonia’s Ministry of Justice, where she was one of the youngest people to help build the country’s e-services and e-government infrastructure. Undoubtedly, the future looks bright for this new Israeli immigrant and leader of the Internet of Things (IoT) and connected-devices revolution.

Liat Aaronson – Teaching entrepreneurship for a brighter future

This lady’s got a passion for entrepreneurship in her bones. From educating the business leaders of the future to getting budding startups on their feet, Liat Aaronson is working hard to make innovation and entrepreneurship indelible parts of Israeli culture.
A lawyer by trade, Aaronson currently serves as the Director of the prestigious Zell Entrepreneurship Program at the Interdisciplinary Center (IDC) in Herzliya, an exclusive program that teaches the promising business leaders of tomorrow everything they need to know to make their ventures succeed. As part of her job, she travels all over the world to coach the program’s 252 alumni and current students as they develop and promote their ventures, even introducing her young protégés to Warren Buffett. In 2009, Aaronson attended TEDIndia and decided to bring the viral conference series to Israel, becoming the co-organizer of two TEDx events in 2010: one in Tel Aviv and another, called TEDxHolyland, focused on Israeli-Palestinian discourse. From 2009 to 2013, Aaronson was also the director of Startupseeds, a non-profit organization promoting technology innovation among high school students.

Ayelet Noff – The blonde queen of the social media scene

Ayelet Noff is convinced that not only do blondes have more fun, they are also masters of social media networks. At least that was the idea when she founded her new-media public relations agency Blonde 2.0 in 2006, at a time when Facebook was still in diapers. But before she set off on her own, Noff was the Marketing Manager at ICQ, one of the first and most successful Israeli instant messaging startups, and was a contributing writer for major tech news portals like TechCrunch and The Next Web. Now, she and her team of 20 at Blonde 2.0 promote over forty companies, including the likes of HP, Mobli and Viber, on social media networks. Noff got the message early on that when it comes to social media, when a [smart] blonde says it, people listen.
Transcript of Press Conference Regarding the Indictment of Three Former Comverse Technology Inc. Executives

Washington, D.C.

MR. MCNULTY: Good afternoon. I’m Paul McNulty, the Deputy Attorney General, and I am joined on this stage by Roslynn Mauskopf, the United States Attorney for the Eastern District of New York; Chip Burrus, who is the Assistant Director of the Criminal Division for the FBI; and Linda Thomsen, who is the director of enforcement for the Securities and Exchange Commission. Today, in Brooklyn, at the federal courthouse for the Eastern District of New York, the Department of Justice unsealed a complaint charging three former senior officers of Comverse Technology, Inc. with conspiring to violate our nation’s securities laws and to commit mail and wire fraud. Those charged are Jacob “Kobi” Alexander, David Kreinberg, and William F. Sorin. Mr. Alexander is Comverse’s founder, former CEO, and former chairman of the board. He had an active role in preparing and signing the company’s quarterly and annual securities filings. Mr. Kreinberg is Comverse’s former CFO, and was responsible for the corporation’s annual and quarterly securities filings. Mr. Sorin is Comverse’s former general counsel and corporate secretary, and in those duties signed Comverse’s annual securities filings and proxy statements. The criminal complaint outlines a broad-ranging scheme to defraud the investing public through the improper granting of stock options for Comverse’s stock. Through that scheme, the three defendants repeatedly ensured that they and others at the corporation received the option to buy stock at prices well below the stock’s market price, in violation of the company’s shareholder-approved stock-option plans. As just one example, in the fall of 2001, the defendants caused options to be granted to themselves and others to buy stock at the share price in effect on October 22, 2001.
That date happened to be the second-lowest share price in all of 2001, yet the board’s compensation committee did not approve the grants until November 28, 2001, at the earliest, when the share price had risen nearly $5 per share. As a result, Alexander received options worth approximately $3 million, Kreinberg received options worth $625,000, and Sorin received options worth $135,000. At the same time, the defendants repeatedly and falsely indicated in their statements to shareholders and the investing public that it was company practice to grant options only at the prevailing market price. They conducted this scheme by routinely creating false paperwork that showed the options having been granted on dates earlier than the actual dates, a practice known as backdating. Unsurprisingly, given the perfect vision of hindsight, these early dates consistently coincided with dates when the stock was at a low point, ensuring that the options had substantial value when actually granted. In addition, defendants Alexander and Kreinberg participated in the creation of an options slush fund that allowed Alexander to award options to favored employees without the knowledge or consent of the compensation committee and without appropriate disclosure to the investing public. These options, too, were often backdated, in violation of company policy and statements to the investing public. Today’s action seeks to bring those responsible for these practices to direct account. In addition to criminal charges, the Department has also initiated action to freeze bank accounts worth $45 million, used to launder the proceeds of the scheme, and it has initiated forfeiture proceedings. The criminal charges brought today are only an allegation, of course. All defendants are presumed innocent until proven guilty beyond a reasonable doubt in a court of law. Stock options can be a valuable tool for compensating employees.
When used appropriately and honestly disclosed, they can ensure that employees have a direct incentive to improve the business and its share price, benefiting the company’s shareholders. But when options are backdated to a time when the share price was lower, and without honest disclosure, those options are simply theft from shareholders. The employee gets something of value without taking any steps to improve the company, and without the shareholder ever knowing that the compensation has been paid. Those who engage in such practices must be held responsible. Now this is not a new effort. A little over four years ago, the President created the Corporate Fraud Task Force as one piece of a multipart initiative to return transparency, accountability, and integrity to our markets. And since that time, the Justice Department has criminally prosecuted more than 1,000 defendants for corporate fraud, including nearly 170 CEOs and corporate presidents, more than 30 CFOs, and nearly 20 corporate counsel. Other actions by the SEC and by the Congress have also done much to improve corporate accountability. Yet where large sums are at stake, there will always be those who will be tempted to ignore the rules and abuse their positions for their own benefit. Thus we must remain diligent to maintain our hard-won gains in improving our markets. Investor confidence is critical to our markets, and honesty in corporate leadership is critical to that confidence. Thus the Department of Justice remains steadfast in its determination to hold to account those who violate the investing public’s trust. Today’s action is simply the latest demonstration of that ongoing but essential effort. I want to thank, in particular, Roslynn Mauskopf’s team in Eastern Virginia (sic), including Ilene Jaroslaw, Linda Lacewell, Sean Casey, Kathleen Nanden. These Assistant United States Attorneys have worked with her in bringing this case, along with many others at the FBI and at the SEC. 
And I’d like now to recognize Chip Burrus from the FBI to make a statement. Chip.

MR. BURRUS: Thank you, Paul. Thank you, Paul; thank you, Roslynn; and thank you, Linda, for your leadership and fortitude in attacking the deceitful abuse of corporate power as we see here today. Today’s complaint represents yet another in a sad, continuing parade of ruined lives and shattered corporations. Since the formation of the President’s Corporate Fraud Task Force in 2002, the FBI and its partners have convicted more than 400 people in a variety of fraud schemes. Right now there are more than 260 FBI agents working on 420 corporate-fraud and securities-fraud cases to protect the investing public. This includes 45 instances of backdated stock options, the newest trend in investor-fleecing. “Backdated stock options” is such an innocent-sounding name, but it’s the corporate equivalent of placing a bet after the race has been run. In what can only be called an abuse of corporate power, these executives, through fraud and deceit, rewarded themselves and their friends at the expense of the investing public. Mr. Alexander is alleged to have used the scheme to steal $6 million from the corporation. Kreinberg and Sorin also allegedly netted a million dollars each. Mr. Kreinberg and Mr. Sorin both turned themselves in to the FBI early this morning. By fraudulently backdating these options and not reporting the awards, the defendants deceived the investing public. They created a slush fund that was used to secretly reward corporate insiders, in one instance netting an executive $4 million in instant profit. Comverse, a billion-dollar publicly traded company, and its shareholders were plundered by these modern-day pirates, and they must be held accountable. And they will be held accountable in a court of law.
I want to congratulate the men and women of the FBI’s New York office who have been working on this case, the prosecutors at the Department of Justice, and the dedicated employees of the Securities and Exchange Commission. Andy Arena and David Chavez in particular worked very hard on this case, and they’re here to talk to you, perhaps, afterwards, about the investigation, should you so desire. I’d like to now introduce Linda Thomsen, but before I go, I want to tell you this is a very strong partnership of the FBI, the Department of Justice’s dedicated prosecutors, and the Securities and Exchange Commission. We’re not going to quit. Linda.

MS. THOMSEN: Thank you, Chip. Good afternoon, everyone. First, I’d like to commend the Deputy Attorney General and everyone at the Department of Justice, especially the team from the United States Attorney’s Office for the Eastern District of New York and the Federal Bureau of Investigation, who are responsible for the criminal case the Deputy Attorney General just announced. I’d also like to echo the remarks about the President’s Corporate Fraud Task Force. The Securities and Exchange Commission is a proud member of that task force, and it has been our privilege to work with the other members, especially the Department of Justice, U.S. Attorney’s Offices throughout the country, and the Federal Bureau of Investigation, to make sure that corporate fraud is pursued and effectively addressed. Today’s actions, as have been noted, are part of that effort. Today, the Securities and Exchange Commission filed civil fraud charges against Jacob Alexander, former chairman and chief executive officer of Comverse Technology, Incorporated; David Kreinberg, Comverse’s former chief financial officer; and William Sorin, Comverse’s former general counsel. Our complaint, the result of an intensive, five-month investigation, alleges that each of them took part in a decade-long fraud.
Their scheme, we allege, was designed to give themselves and others undisclosed, in-the-money stock options by backdating stock-option grants to coincide with historically low closing prices of Comverse common stock. The complaint alleges, among other things, that between 1999 and 2000, Misters Alexander and Kreinberg created a slush fund of backdated options. They directed that options be granted to fictitious employees and later used these options, some of which were immediately exercisable, to recruit and retain personnel. Each of the former executives realized substantial personal gains from the exercise of illegally backdated option grants and the subsequent sale of Comverse common stock. Collectively they realized millions. Mr. Alexander realized at least $138 million in gain from the exercise and sale of backdated options. At least $6.4 million of that personal gain represents the undisclosed, in-the-money portion of those grants at the time of the grants. For Mr. Kreinberg, the personal gain is over $13 million, of which at least $1 million represents the in-the-money portion of the grants. And for Mr. Sorin, the gain is over $14 million, of which approximately $1 million represents the in-the-money portion. According to the SEC’s complaint, in the decade between 1991 and 2001, Mr. Alexander repeatedly used hindsight to select a date when the closing price of Comverse’s common stock was at or near a quarterly or annual low. Mr. Alexander later communicated this date and closing price to Mr. Sorin, the general counsel, in order for it to be used as the date and exercise price for a grant of Comverse options. Mr. Sorin, with Mr. Alexander’s knowledge, created company records that falsely indicated a committee of Comverse’s board of directors had actually approved the option grant on the date Mr. Alexander had cherry-picked, when, in fact, the committee had not acted and did not act until a later date. Mr. Kreinberg joined the scheme no later than 1998 and assisted Mr. Alexander in selecting grant dates, which coincided with historically low share prices for Comverse stock. The complaint alleges Mr. Kreinberg knew that the company records reflected false grant dates. The complaint also alleges that Mr. Kreinberg, with Mr. Sorin’s knowledge, initiated a similar backdating scheme at Ulticom, Inc., a publicly traded, majority-owned subsidiary of Comverse. Both Comverse and Ulticom have publicly announced that they expect to restate their historical financial results for multiple years. As we have said before, we will pursue fraudulent conduct related to stock options wherever and whenever it occurs. Today’s action, the result of a five-month investigation, which is continuing, is another example of our commitment. It also reflects the dedication and commitment of our investigative team, led by Antonia Sheehan and Christopher Cante. Thank you very much, and now I give the podium back to Mr. McNulty.

MR. MCNULTY: Thank you, Linda. And I’ll take any questions you might have. Yes.

QUESTION: I’m just wondering -- this is your second case now in backdating -- if you guys have a handle on what’s the scope of the problem here, and is this yet another scandal that is facing corporate America?

MR. MCNULTY: Well, we do have a significant number. I’m not going to try to pin it down precisely, but we do have a significant number of investigations going on. Both the SEC and the Department of Justice have a significant number of investigations. It remains to be seen how many of those investigations will result in individuals being charged, and I don’t want to speculate on that either. We’ll have to see. But the fact that there are a significant number of investigations indicates that this is a problem that is extensive, that has -- is receiving a great deal of attention from the enforcement community. And I expect that there will be more accountability down the road in this area. Yes.
QUESTION: In the press materials it mentions that there’s $57 million that was transferred to accounts in Israel. Is there any way of freezing that money, or can you tell me what might be happening to that money?

MR. MCNULTY: I actually don’t know if there’s any ability to get at the money there. If either one of you do, if you want to say something, also.

MS. MAUSKOPF: As we indicate, we have been using every tool at our disposal to try and trace money related to the fraud alleged in this complaint. $45 million has been frozen here in the United States. We are aware of other monies and are continuing our investigation to ensure that we can freeze and, where lawfully appropriate, recoup the gains from this fraud.

QUESTION: Do you all know where Mr. Alexander is, and how close are you to apprehending him?

MR. MCNULTY: I’m not going to comment today on his whereabouts. The other two defendants have surrendered themselves, and we are in the process of getting a hold of Mr. Alexander, but I won’t say anything more than that.

QUESTION: Could you tell us whether he’s in the United States?

MR. MCNULTY: I’m not going to comment on that.

QUESTION: Back to the scope of the problem, the number that’s been reported is about 80 stock-backdating investigations; is that a good universe? And, secondly, a long history of those big investigations with Enron and WorldCom -- was it a surprise to you that you found this massive fraud, another way for corporate executives to enrich themselves, after all the titans of industry claim that they’ve cleaned up their accounting practices?

MR. MCNULTY: Well, a number of these offenses, as you can see in this case, actually occurred some time ago.
And there is something to be said about that pattern, where the actions occur at one time but perhaps change as a result of new reporting requirements to the SEC. And then there have been a lot of restatements, which have brought this to the attention -- it’s come to the attention of law enforcement in different ways, actually, but certainly the restatements have been one way that this has led to a number of investigations. On your original question about numbers, I think that number sounds like it might actually be a combination of both numbers that you have used.

MS. THOMSEN: That is a number that the SEC has announced of civil investigations, and that number is still good.

QUESTION: Has a grand jury indicted these men?

MR. MCNULTY: This is a complaint that we have today, and an indictment would follow, subsequent to an actual complaint.

QUESTION: So have they been officially charged?

MR. MCNULTY: Yes, they have been officially charged by way of this complaint, through a criminal complaint. Yes.

QUESTION: A question for Linda. How much money are you seeking in disgorgement from these three men? Is it $138 million for Mr. Alexander or just the $6.4 million?

MS. THOMSEN: We will seek all available disgorgement, and that will be developed during the course of the case.

QUESTION: So it depends? We’re not quite sure yet?

MS. THOMSEN: We will seek as much disgorgement as we can legally get for the conduct.

QUESTION: Up to a maximum of $138 million?

MS. THOMSEN: I think we have to see where -- how the facts play out in terms of the options that were granted and exercised, any other options, and maybe other benefits, perhaps compensation in addition to options. So the number will be subject to the proof and evidence as we go forward with the case, but we are seeking as much disgorgement as we lawfully can.

QUESTION: Some surveys suggest that 17 percent of public companies have some problem with backdating. Do you have a sense of how many of those are criminal types of problems, or are there just sort of legitimate mistakes that can be made?

MR. MCNULTY: I wouldn’t venture to identify a percentage, but I will acknowledge that within all of the cases or incidents involving backdating there’s going to be a variety of factual circumstances, and that’s going to affect whether or not the government has a case that can go forward. Some may include mistakes that were made, sloppiness in paperwork, where there isn’t evidence of an intent to actually deceive the shareholders, the board, the investing public. Others are going to show, again, that kind of intent. There is going to be clear evidence of what was being done in order to take advantage of low points in the stock-price value and so forth. So it’s going to be determined by the facts of the case, but it’s clear that some of what may be called a backdating issue may not in fact result in any criminal prosecution.

QUESTION: Have you looked at any cases where you’ve looked into the backdating and there hasn’t been the intent of fraud, and you’ve closed cases?

MR. MCNULTY: The answer, I guess, is yes, because we’re looking at a lot of cases, and some are being pursued and others are not being pursued as aggressively or don’t seem to be heading in a path where there might be charges brought. So I don’t have a sense of how many we could call closed cases, but based on the information I’m gathering from the field and what folks are doing in their investigations, they’re certainly identifying some cases that have a stronger body of evidence to suggest that there was a clear criminal intent.

QUESTION: Have IRS agents been included in the investigation with regard to the -- tax accounting for the options?

MR. MCNULTY: Yes, we are working with the IRS in this matter because you’re right; there are certainly significant tax implications to these cases. And, as you know, the IRS is on the Corporate Fraud Task Force.
We actually met -- the Corporate Fraud Task Force did -- two weeks ago, discussed this at some length, and talked about the way in which it’s important to coordinate closely with the IRS. And each case will differ a bit as to how involved the IRS will be. In this case, have you had the IRS involved yet in the investigation?

MS. MAUSKOPF: We’ve had some involvement.

MR. MCNULTY: Yeah, we’ve had -- we’re going to have the IRS in this case as well, but we’ll see where that goes.

QUESTION: Mr. McNulty, on another topic: There’s reports that one of the eleven missing Egyptian students has been arrested. Do you have any more information on that?

MR. MCNULTY: I don’t have any more information on that particular individual. What I can tell you is just, the FBI has said and Immigration and Customs Enforcement has said that they’re attempting to locate all of the Egyptian students that arrived and determine not only, obviously, their whereabouts but their reason for being here. But there is no indication that there is any threat associated with their presence. But there is an active effort, an energetic effort, to identify where they are.

QUESTION: So you don’t have any more information on that?

MR. MCNULTY: I don’t have any more information on this individual, no.

QUESTION: How concerned are you about these missing students, and what is being done to find them?

MR. MCNULTY: As I said, there’s an active effort. We’re on the lookout for those individuals, an active effort to identify them, but no information to suggest that there’s any threat associated with their whereabouts or why they’re here.

QUESTION: Why then was a BOLO issued if there was no threat? Or maybe this is for Mr. Burrus. Naturally there’s been a lot of interest because of what’s going on in the Middle East. You can understand that there’s public concern, whether warranted or not, when this report came out, and so there was a BOLO, which indicated obvious official concern.
And yet you say that there’s no threat, so help explain why this was put out there for public consumption and yet there’s no threat.

MR. MCNULTY: I think all I’ll say is that it was determined to be a prudent thing to do, to make the necessary efforts to identify the individuals -- the prudent and responsible thing to do. But at the same time, while we make those efforts, that of course requires the cooperation of law enforcement, and so BOLOs are a means by which that cooperation is obtained. That is, in a sort of operational sense. So that method corresponds with the need to be prudent and responsible here, but no reason for that to be associated with a specific threat. It’s just appropriate to make that clear to the public.

QUESTION: I had just a quick question following on my colleague’s question about the money figures. You’ve identified about $8.5 million of profits to the three executives that was attributed to the backdating. That doesn’t strike me as huge given recent corporate scandals. Do you have a figure of the grants to all employees that are attributable to backdating?

MR. MCNULTY: All grants -- not yet. No. That process will be ongoing as we gather more information. The numbers you have here relate to the amount of money that may be directly associated with the backdating or certainly was used to facilitate, if it wasn’t directly resulting from -- facilitating the ability to commit the fraud and bring the fraudulently obtained money in. So that sorting out is going to occur going forward. So we freeze the money that constitutes the broadest scope of what we believe would be forfeitable, and then we will gather more information and begin to sort out the number.

QUESTION: So those bank-account figures, the $57 million, the $40 million, you’re not sure at this point how much of that constitutes fraudulently obtained profits?

MR. MCNULTY: Correct. Yes.
QUESTION: Was there anything special about the case of Brocade and Comverse that made these two companies the first to get charged? You said you have several ongoing investigations; I mean, was it just the first two that you got to, or were they particularly brazen, or could you just --

MR. MCNULTY: A couple of things stand out. One is the strength of evidence in both cases, from the government’s perspective -- that’s something that will have to be determined in a court of law, but the government felt that we had especially clear, strong evidence to support the charges brought. In this particular case, we also had the added benefit of cooperation. If you look at some of the dates in here, you’ll note that this conduct became known only in March of this year, and here we are today in early August. So you see that a great deal has been accomplished in a relatively short period of time, and that’s attributable to the fact that we got good cooperation from Comverse and the assistance that companies often provide in bringing information to light that they have found through their efforts, and we see this operating here. And so that helped get sufficient information faster, and that’s the kind of cooperative approach that we really seek from corporations. We think that corporations have a fiduciary duty to their stockholders, to the investing public, to provide that cooperation to the government. And when they respond appropriately, it can bring these matters into clear view, we can hold the right people accountable, and, as a result, the marketplace is stronger. And I hope that we see more of that in the cases to come. Thank you all very much.

###
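An aside for readers unfamiliar with the mechanics: the undisclosed, in-the-money value at the heart of a backdating scheme reduces to simple arithmetic, the spread between the market price on the date the grant was actually approved and the lower exercise price cherry-picked from an earlier low, multiplied by the share count. A minimal sketch (the share counts and prices below are hypothetical illustrations, not figures from the case):

```python
def in_the_money_value(shares: int, backdated_strike: float, approval_price: float) -> float:
    """Hidden paper gain baked into a backdated option grant: the per-share
    spread between the market price on the real approval date and the
    cherry-picked (lower) exercise price, times the number of shares."""
    return shares * (approval_price - backdated_strike)

# Hypothetical numbers: a 600,000-share grant struck at a backdated low of
# $15.00 while the stock traded at $20.00 on the actual approval date
# carries $3,000,000 of undisclosed value on day one.
print(in_the_money_value(600_000, 15.00, 20.00))  # 3000000.0
```

This is also why the SEC distinguishes each defendant’s total trading gain from the (smaller) in-the-money portion: only the spread at grant time is the value created by the backdating itself.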
correct_foundationPlace_00083
FactBench
2
59
https://www.justice.gov/archive/usao/nye/pr/pressrel_2006.html
en
Eastern District of New York
[ "https://www.justice.gov/archive/usao/images/headDOJ_seal.jpg", "https://www.justice.gov/archive/usao/images/headers/headLogo-nye.gif", "https://www.justice.gov/archive/usao/nye/images/subBanner-News.gif", "https://www.justice.gov/archive/usao/images/docicon.gif", "https://www.justice.gov/archive/usao/images/subicon.gif", "https://www.justice.gov/archive/usao/images/subsubicon.gif", "https://www.justice.gov/archive/usao/images/docicon.gif", "https://www.justice.gov/archive/usao/images/subicon.gif", "https://www.justice.gov/archive/usao/images/subsubicon.gif", "https://www.justice.gov/archive/usao/images/footer_right.jpg" ]
[]
[]
[ "" ]
null
[]
null
null
correct_foundationPlace_00083
FactBench
2
18
https://nypost.com/2006/08/25/fugitive-comverse-ceo-in-sri-lankan-hideaway/
en
FUGITIVE COMVERSE CEO IN SRI LANKAN HIDEAWAY
https://pagesix.com/wp-c…strip=all&w=1024
[ "https://sb.scorecardresearch.com/p?c1=2&c2=6390601&c3=NYPOST&cj=1&cv=3.6", "https://nypost.com/wp-content/uploads/sites/2/2023/02/button_3_bg-2.png?w=206", "https://nypost.com/wp-content/uploads/sites/2/2024/07/newspress-collage-9k29lgtpi-1721612035053.jpg?quality=75&strip=all&1721597683&w=260&h=174&crop=1", "https://nypost.com/wp-content/uploads/sites/2/2024/03/amazonbestsellers-1.png?w=93&h=62&crop=1", "https://nypost.com/wp-content/uploads/sites/2/2024/07/Serta-Cooling-Mattress-Cover.jpg?quality=75&strip=all&w=93&h=62&crop=1", "https://nypost.com/wp-content/uploads/sites/2/2024/07/biles-therma.jpg?quality=75&strip=all&w=93&h=62&crop=1", "https://nypost.com/wp-content/uploads/sites/2/2024/06/walmartfeat.jpg?quality=75&strip=all&w=93&h=62&crop=1", "https://nypost.com/wp-content/uploads/sites/2/2024/05/New-Cooling-Sheets.jpg?quality=75&strip=all&w=93&h=62&crop=1", "https://pagesix.com/wp-content/uploads/sites/3/2024/07/prince-george-29.jpg?quality=75&strip=all&w=1024", "https://decider.com/wp-content/uploads/2024/07/The-View-1.jpg?quality=75&strip=all&w=322&h=215&crop=1", "https://nypost.com/wp-content/uploads/sites/2/2024/07/85816859.jpg?quality=75&strip=all&crop=1&w=341&h=227 1x, https://nypost.com/wp-content/uploads/sites/2/2024/07/85816859.jpg?quality=75&strip=all&crop=1&w=682&h=454 2x", "https://nypost.com/wp-content/uploads/sites/2/2024/07/newspress-collage-9k29lgtpi-1721612035053.jpg?quality=75&strip=all&1721597683&crop=1&w=341&h=227 1x, https://nypost.com/wp-content/uploads/sites/2/2024/07/newspress-collage-9k29lgtpi-1721612035053.jpg?quality=75&strip=all&1721597683&crop=1&w=682&h=454 2x", "https://nypost.com/wp-content/mu-plugins/vip-helpers/images/vip-powered-dark-small.png" ]
[]
[]
[ "" ]
null
[ "Zach Haberman" ]
2006-08-25T00:00:00
A private investigator eyeballed Jacob “Kobi” Alexander – the fugitive CEO of Comverse who went on the lam just as he was set to be indicted...
en
https://nypost.com/wp-co…t/apple-icon.png
New York Post
https://nypost.com/2006/08/25/fugitive-comverse-ceo-in-sri-lankan-hideaway/
A private investigator eyeballed Jacob “Kobi” Alexander – the fugitive CEO of Comverse who went on the lam just as he was set to be indicted for making millions in a stock option backdating scam – in Sri Lanka, he said yesterday. But the U.S. Attorney’s Office in Brooklyn, which is prosecuting Alexander for conspiring with top executives at Comverse to pull off the scam in which they pocketed $138 million, said the investigator might be on the wrong track. “There is no confirmation of Mr. Alexander’s whereabouts,” said Robert Nardoza, a spokesman for the Brooklyn U.S. Attorney. Private detective Moshe Buller, who declined to name the New York venture capital fund that hired him, said he tracked down Alexander in Sri Lanka through money transfers and a Skype Technologies SA phone call made between the country’s capital city, Colombo, and his home in Israel. “I saw him with my own eyes; it was definitely Kobi,” Buller said. “The place is so small, it wasn’t hard to find him.” Israeli newspaper Ma’ariv reported yesterday that Alexander was spotted in Negombo, a Sri Lankan fishing village about 20 miles north of Colombo. “The FBI is aware of the report and checking whether it’s true,” said U.S. Embassy spokesman Stewart Tuttle. Alexander, an Israeli national who founded the voice-mail software company in 1984, illegally wired almost $60 million to Israel just weeks before prosecutors were set to unseal an indictment against him and company execs William Sorin and David Kreinberg on Aug. 9, according to court documents. Within a week, the FBI declared Alexander a fugitive, and law enforcement sources said they don’t believe he is currently in Sri Lanka or Israel. Buller said he also believes Alexander has left Sri Lanka since he was spotted on Wednesday. Alexander faces up to 10 years behind bars if convicted in the stock scam, a sentence which could be doubled as a result of his skipping the country. 
His lawyer, Robert Morvillo, did not return a call seeking comment. With Bloomberg
correct_foundationPlace_00083
FactBench
2
63
https://www.cfo.com/news/cfo-crimes-litigation-control-failures-ethics/712987/
en
CFO crimes, litigation, financial control failures, ethics violations, and more
https://www.cfo.com/imgp…mV5LmpwZw==.webp
https://www.cfo.com/imgp…mV5LmpwZw==.webp
[ "https://d12v9rtnomnebu.cloudfront.net/logo/publications/cfodotcom_black.svg", "https://d12v9rtnomnebu.cloudfront.net/logo/publications/cfodotcom_white.svg", "https://www.cfo.com/static/img/menu_icons/menu_white.svg?013916280623", "https://www.cfo.com/static/img/menu_icons/search_white.svg?013916280623", "https://www.cfo.com/static/img/email_icons/mail.svg?273117231121", "https://www.cfo.com/static/img/menu_icons/search.svg?320116291121", "https://www.cfo.com/static/img/menu_icons/close.svg?273117231121", "https://d12v9rtnomnebu.cloudfront.net/logo/printer_friendly/cfo.jpg", "https://www.cfo.com/static/img/cfodotcom/signup-footer.png?013916280623", "https://www.cfo.com/imgproxy/d1mML9NdDyIjKfe_ZIU16pK34HQN0FXZuGzbrlieJG8/g:ce/rs:fill:1200:675:1/bG9jYWw6Ly8vZGl2ZWltYWdlL0dldHR5SW1hZ2VzLTE0MDg0NjM3MTguanBn.webp", "https://www.cfo.com/static/images/info-icon.png?234917240220", "https://www.cfo.com/static/images/info-icon.png?234917240220", "https://www.cfo.com/imgproxy/j64mifX3rNjO9Hm9iGwfoBQ6r6U4yt6-cYvCzFmcBMk/g:ce/rs:fill:600:338:1/bG9jYWw6Ly8vZGl2ZWltYWdlL0dldHR5SW1hZ2VzLTE0NDQwNTg3MTguanBn.webp", "https://www.cfo.com/imgproxy/AYxZfApAl_MRLl_hk_F7pvQVUYPcn-t_bh5XGhntqkM/g:ce/rs:fill:600:338:1/bG9jYWw6Ly8vZGl2ZWltYWdlL0dldHR5SW1hZ2VzLTEzMzUwMjc1MzkuanBn.webp", "https://www.cfo.com/imgproxy/Ru3Vjugy54-eT3dQz9VoSk0aeTQSPTzQBNQ9BNbg-0M/g:ce/rs:fill:600:338:1/bG9jYWw6Ly8vZGl2ZWltYWdlL2dldHR5aW1hZ2VzLTE0NjYyNDMxNDYtMTcwNjY3YS5qcGc=.webp", "https://www.cfo.com/imgproxy/ZoB0hblzYayAVqZN9Kq7F3jX5n4CxoZOc3qGv4f_82Y/g:ce/rs:fill:600:338:1/bG9jYWw6Ly8vZGl2ZWltYWdlL2Q0cC1zdWl0LnBuZw==.webp", "https://www.cfo.com/static/img/lib_icons/survey-black.svg?141716111021", "https://www.cfo.com/static/img/lib_icons/whitepaper-black.svg?141716111021", "https://www.cfo.com/static/img/lib_icons/survey-black.svg?141716111021", "https://res.cloudinary.com/dmgi9movl/image/upload/dpr_2.0,h_48,w_48/v1720544240/press_release/assets/company_logos/DeshCap_logo_2_iwezq8.jpg", 
"https://www.cfo.com/static/img/cfodotcom/signup-footer.png?013916280623", "https://res.cloudinary.com/dmgi9movl/image/upload/dpr_2.0,h_50,w_50/v1720544240/press_release/assets/company_logos/DeshCap_logo_2_iwezq8.jpg", "https://www.cfo.com/imgproxy/d1mML9NdDyIjKfe_ZIU16pK34HQN0FXZuGzbrlieJG8/g:ce/rs:fill:1200:675:1/bG9jYWw6Ly8vZGl2ZWltYWdlL0dldHR5SW1hZ2VzLTE0MDg0NjM3MTguanBn.webp", "https://www.cfo.com/static/images/info-icon.png?234917240220", "https://www.cfo.com/static/images/info-icon.png?234917240220", "https://www.cfo.com/imgproxy/ZoB0hblzYayAVqZN9Kq7F3jX5n4CxoZOc3qGv4f_82Y/g:ce/rs:fill:600:338:1/bG9jYWw6Ly8vZGl2ZWltYWdlL2Q0cC1zdWl0LnBuZw==.webp", "https://www.cfo.com/imgproxy/3ctRBzR-ONULceqTEZnEh9nFj2tpHXxNWQAT3tSG1G8/g:ce/rs:fill:600:338:1/bG9jYWw6Ly8vZGl2ZWltYWdlL0Nvc21pbl9oZWFkZXIuanBn.webp", "https://www.cfo.com/imgproxy/NKAA2rGGLnXzLtOVcSubRg4bY0Z_MXWQMa7xzCL06BM/g:ce/rs:fill:600:338:1/bG9jYWw6Ly8vZGl2ZWltYWdlL0dldHR5SW1hZ2VzLTExNjA5NDc0NzUuanBn.webp", "https://d12v9rtnomnebu.cloudfront.net/logo/corporate/ID_black.svg", "https://www.cfo.com/static/img/menu_icons/close.svg?273117231121", "https://www.cfo.com/static/img/menu_icons/search.svg?320116291121", "https://www.cfo.com/static/img/menu_icons/chevron_down.svg?102717100918" ]
[]
[]
[ "" ]
null
[ "CFO Editorial Staff" ]
2024-04-12T14:00:00
CFO.com provides essential analysis and expert advice for Chief Financial Officers to tackle organizational challenges, manage major risks, drive organizational value, and maximize their personal career potential.
en
/favicon.ico?v=2
CFO
https://www.cfo.com/news/cfo-crimes-litigation-control-failures-ethics/712987/
correct_foundationPlace_00083
FactBench
1
61
https://www.rallyventures.com/team/
en
Rally Ventures
https://www.rallyventure…/Rally-Photo.jpg
https://www.rallyventure…/Rally-Photo.jpg
[ "https://www.rallyventures.com/wp-content/uploads/2020/06/logo.svg", "https://www.rallyventures.com/wp-content/uploads/2022/08/Charles-480x480.png 480w,https://www.rallyventures.com/wp-content/uploads/2022/08/Charles.png 800w", "https://www.rallyventures.com/wp-content/uploads/2022/08/Charles-480x480.png 480w,https://www.rallyventures.com/wp-content/uploads/2022/08/Charles.png 800w", "https://www.rallyventures.com/wp-content/uploads/2022/07/Jeff-2-480x480.png 480w,https://www.rallyventures.com/wp-content/uploads/2022/07/Jeff-2.png 800w", "https://www.rallyventures.com/wp-content/uploads/2022/07/Jeff-2-480x480.png 480w,https://www.rallyventures.com/wp-content/uploads/2022/07/Jeff-2.png 800w", "https://www.rallyventures.com/wp-content/uploads/2022/07/Justin-2-480x480.png 480w,https://www.rallyventures.com/wp-content/uploads/2022/07/Justin-2.png 800w", "https://www.rallyventures.com/wp-content/uploads/2022/07/Justin-2-480x480.png 480w,https://www.rallyventures.com/wp-content/uploads/2022/07/Justin-2.png 800w", "https://www.rallyventures.com/wp-content/uploads/2022/06/Stephanie-McCoy-480x480.jpg 480w,https://www.rallyventures.com/wp-content/uploads/2022/06/Stephanie-McCoy.jpg 800w", "https://www.rallyventures.com/wp-content/uploads/2022/06/Stephanie-McCoy-480x480.jpg 480w,https://www.rallyventures.com/wp-content/uploads/2022/06/Stephanie-McCoy.jpg 800w", "https://www.rallyventures.com/wp-content/uploads/2022/06/Jessi-Howard-480x480.jpg 480w,https://www.rallyventures.com/wp-content/uploads/2022/06/Jessi-Howard.jpg 800w", "https://www.rallyventures.com/wp-content/uploads/2022/06/Jessi-Howard-480x480.jpg 480w,https://www.rallyventures.com/wp-content/uploads/2022/06/Jessi-Howard.jpg 800w", "https://www.rallyventures.com/wp-content/uploads/2022/06/Kathy-Altenhoff-480x480.jpg 480w,https://www.rallyventures.com/wp-content/uploads/2022/06/Kathy-Altenhoff.jpg 800w", "https://www.rallyventures.com/wp-content/uploads/2022/06/Kathy-Altenhoff-480x480.jpg 
480w,https://www.rallyventures.com/wp-content/uploads/2022/06/Kathy-Altenhoff.jpg 800w", "https://www.rallyventures.com/wp-content/uploads/2022/06/Angie-Dardis-480x480.jpg 480w,https://www.rallyventures.com/wp-content/uploads/2022/06/Angie-Dardis.jpg 800w", "https://www.rallyventures.com/wp-content/uploads/2022/06/Angie-Dardis-480x480.jpg 480w,https://www.rallyventures.com/wp-content/uploads/2022/06/Angie-Dardis.jpg 800w", "https://www.rallyventures.com/wp-content/uploads/2022/06/Rachel-Stone-480x480.jpg 480w,https://www.rallyventures.com/wp-content/uploads/2022/06/Rachel-Stone.jpg 800w", "https://www.rallyventures.com/wp-content/uploads/2022/06/Rachel-Stone-480x480.jpg 480w,https://www.rallyventures.com/wp-content/uploads/2022/06/Rachel-Stone.jpg 800w", "https://www.rallyventures.com/wp-content/uploads/2024/05/Holly-480x480.png 480w,https://www.rallyventures.com/wp-content/uploads/2024/05/Holly.png 800w", "https://www.rallyventures.com/wp-content/uploads/2024/05/Holly-480x480.png 480w,https://www.rallyventures.com/wp-content/uploads/2024/05/Holly.png 800w", "https://www.rallyventures.com/wp-content/uploads/2022/06/Lea-Avilla-480x480.jpg 480w,https://www.rallyventures.com/wp-content/uploads/2022/06/Lea-Avilla.jpg 800w", "https://www.rallyventures.com/wp-content/uploads/2022/06/Lea-Avilla-480x480.jpg 480w,https://www.rallyventures.com/wp-content/uploads/2022/06/Lea-Avilla.jpg 800w", "https://www.rallyventures.com/wp-content/uploads/2022/06/Michael-Jennings-480x480.jpg 480w,https://www.rallyventures.com/wp-content/uploads/2022/06/Michael-Jennings.jpg 800w", "https://www.rallyventures.com/wp-content/uploads/2022/06/Michael-Jennings-480x480.jpg 480w,https://www.rallyventures.com/wp-content/uploads/2022/06/Michael-Jennings.jpg 800w", "https://www.rallyventures.com/wp-content/uploads/2022/06/Zenas-Hutcheson-480x480.jpg 480w,https://www.rallyventures.com/wp-content/uploads/2022/06/Zenas-Hutcheson.jpg 800w", 
"https://www.rallyventures.com/wp-content/uploads/2022/06/Zenas-Hutcheson-480x480.jpg 480w,https://www.rallyventures.com/wp-content/uploads/2022/06/Zenas-Hutcheson.jpg 800w", "https://www.rallyventures.com/wp-content/uploads/2022/06/Tom-Peterson-480x480.jpg 480w,https://www.rallyventures.com/wp-content/uploads/2022/06/Tom-Peterson.jpg 800w", "https://www.rallyventures.com/wp-content/uploads/2022/06/Tom-Peterson-480x480.jpg 480w,https://www.rallyventures.com/wp-content/uploads/2022/06/Tom-Peterson.jpg 800w", "https://www.rallyventures.com/wp-content/uploads/2022/11/Untitled-design-35-480x480.png 480w,https://www.rallyventures.com/wp-content/uploads/2022/11/Untitled-design-35.png 800w", "https://www.rallyventures.com/wp-content/uploads/2022/11/Untitled-design-35-480x480.png 480w,https://www.rallyventures.com/wp-content/uploads/2022/11/Untitled-design-35.png 800w", "https://www.rallyventures.com/wp-content/uploads/2022/06/James-Gibson-480x480.jpg 480w,https://www.rallyventures.com/wp-content/uploads/2022/06/James-Gibson.jpg 800w", "https://www.rallyventures.com/wp-content/uploads/2022/06/James-Gibson-480x480.jpg 480w,https://www.rallyventures.com/wp-content/uploads/2022/06/James-Gibson.jpg 800w", "https://www.rallyventures.com/wp-content/themes/rally-ventures-2/assets/images/arrow.svg", "https://www.rallyventures.com/wp-content/themes/rally-ventures-2/assets/images/arrow.svg", "https://www.rallyventures.com/wp-content/themes/rally-ventures-2/assets/images/arrow.svg", "https://www.rallyventures.com/wp-content/themes/rally-ventures-2/assets/images/arrow.svg", "https://www.rallyventures.com/wp-content/uploads/2024/06/twitter-x-fill.svg", "https://www.rallyventures.com/wp-content/uploads/2022/06/icon-linkedin.svg", "https://www.rallyventures.com/wp-content/uploads/2022/06/icon-medium.svg" ]
[]
[]
[ "" ]
null
[]
2020-06-23T19:52:45+00:00
en
https://www.rallyventure…icon-180x180.png
Rally Ventures
https://www.rallyventures.com/team/
Charles has been a venture capitalist since 1997 and has focused on early-stage business technology companies throughout his career. Charles co-founded Rally Ventures in 2012 with Jeff Hinck. Prior to co-founding Rally Ventures, Charles was a General Partner at El Dorado Ventures since 2000. He started his venture career as an associate at Scripps Ventures after working as an analyst at Piper Jaffray and earlier as a systems consultant at the Mayo Clinic, where he helped design a radiology information system. Some of Charles’ notable investments include: Coupa Software, which had a highly successful IPO in 2016 (Nasdaq: COUP); Compellent Technologies, which held an IPO and was subsequently sold to Dell, returning more than 9x invested capital; NuSpeed, which was acquired by Cisco Systems returning 55x invested capital; Verodin; and Twistlock. Charles founded and helps produce the Node Summit conference series, which focuses on the evolution and future of web application development. He serves on the boards of Abstract Security, AnyRoad, Blackwell Security, Bugcrowd, Influ2, Picnic and is Chairman of the Board of the Camp Chippewa Foundation. Charles received his MBA from Wharton and his BA from Colby College. Jeff has been an entrepreneur, executive and venture capitalist for over 25 years. Jeff co-founded Rally Ventures in 2012 with Charles Beeler. He has founded or co-founded numerous companies and has had the good fortune to have been surrounded by brilliant co-founders and executives. Jeff had a successful run as a software CEO early in his career and realized his competitive advantage was in a boardroom and as an investor rather than as an executive. 
Some of Jeff’s notable investments include: SoFi (IPO), Figure Technologies, Bold.co, Algety (acquired by Corvis), Broadsoft (IPO, then acquired by Cisco), Compellent (IPO, then acquired by Dell), enStratius (acquired by Dell), IPCell (acquired by Cisco), iTKO (acquired by CA), Lockpath (acquired by Navex), NetCentrex (acquired by Comverse), NuSpeed (acquired by Cisco), Sistina (acquired by Redhat) and SportsEngine (acquired by NBC/Comcast). Jeff serves on the boards of Crowdstreet, JustiFi, Krista Software, Parallax, Total Expert, Vertical Insure and Yardstik, and has twice served as President of the MVCA. Jeff received his JD from Harvard Law School and his BA from Northwestern University. He began his professional career at McKinsey & Company. Justin has more than 20 years of strategic operating experience in high technology, including deep expertise in SaaS, payments and network effect businesses. Justin was a Rally Technology Partner beginning in 2015 and joined Rally Ventures as General Partner in October 2019. Prior to joining Rally, Justin was CEO and cofounder of SportsEngine, one of Rally’s early portfolio companies. SportsEngine defined the SRM (Sport Relationship Management) market and, under Justin’s leadership, grew to provide its CRM and Payments platform to more than 45,000 paying customers, more than 35 million monthly unique platform users and more than $100 million in revenue, while growing to more than 500 full-time employees. The company was acquired by Comcast / NBCUniversal in 2016 and continues to operate as the amateur sports division of NBC Sports. Prior to SportsEngine, Justin was the cofounder of Third North Creative, which developed CMS and e-commerce technologies for corporations. Justin currently serves on the boards of AnyRoad, Bond Sports, JustiFi, Kaleidoscope, Kyros, Order, PayGround, Total Expert, Vertical Insure, Yardstik and Zingtree. He is a board observer at Allen Control Systems, Epoch, ManageXR and Parallax. 
Justin received his BA in Economics from the University of Wisconsin-Eau Claire. Stephanie has more than 20 years of experience in the finance industry and has spent the last 15 years in venture capital. Stephanie joined Rally Ventures in January 2014. As Rally’s Chief Financial Officer, Stephanie is responsible for the firm’s operations and financial management including legal and regulatory filings as well as investor relations, accounting, tax compliance, and financial reporting for the funds. Prior to joining Rally, Stephanie held senior financial positions with Crescendo Ventures and Decathlon Capital Partners. She began her professional career at PricewaterhouseCoopers LLP (PwC) in 1995 and worked in a variety of industries but decided to focus on financial services and quickly determined the venture capital industry was where she wanted to be. Stephanie received her BA from the University of St. Thomas and is a CPA (inactive). Jessi Howard has 20 years of experience in Human Resources, and she has spent the majority of her career in the technology industry. As Rally Ventures’ Chief People Officer, Jessi leads the firm’s HR practice and operates as the Fractional Chief People Officer for Rally portfolio companies, providing strategic guidance and tactical support around expanding and supporting their teams. Prior to joining Rally, Jessi helped grow and shape multiple tech start-ups, including SportsEngine, RedBrick Health and Secure Computing. Jessi received her BS from the University of Wisconsin-Stout and her MA from St. Mary’s University. Kathy has more than 20 years of experience in investor relations and capital formation within the alternative investment and private equity industry. Kathy joined Rally Ventures in February 2019. As Vice President of Investor Relations, she is focused on managing the firm’s investor relations, marketing and corporate communications efforts. 
Prior to working with Rally, Kathy was a Managing Director at Castlelake and led the firm’s investor relations team. During her tenure at Castlelake, she was responsible for developing and directing investor relations policies and procedures, managing and maintaining existing investor relationships, and executing on all aspects of capital raising initiatives. Kathy began her career in investor relations at Crescendo Ventures. She received a BS in Marketing and Business Administration from St. Cloud State University. Angie has over 19 years of experience in the finance industry. She joined Rally Ventures in October 2015. As the Director of Finance, she is focused on the firm’s operations and financial management including investor relations, accounting, tax compliance and financial reporting for the funds. Prior to joining Rally, Angie worked at various investment firms such as CarVal Investors, Galliard Capital Management and Pine River Capital Management. She started her career in the finance industry at Crescendo Ventures, where she worked with Jeff Hinck and Stephanie McCoy. Angie received her BS from the University of Minnesota, Duluth and is a CPA (inactive). Rachel is a marketing and PR professional with fifteen years of experience in communications, digital media and relationship management. She joined Rally Ventures in July 2020. As the Director of Platform, Rachel is responsible for the firm’s marketing, PR and editorial content. She also guides community and resource development for the Rally portfolio. Prior to joining Rally, Rachel led marketing and communications for Questscope, a humanitarian organization based out of the Middle East. Rachel’s experience also includes positions with The Collaborative, a member organization for Minnesota entrepreneurs and investors, and Second Street Media, an audience engagement software platform. Rachel received her BS in Economics and International Development from Saint Louis University. 
Holly has over 20 years of experience in the finance industry. She joined Rally Ventures in October 2023. As the Finance Manager, she is focused on the firm’s operations and financial management, including accounting, tax compliance and financial reporting for the funds. Prior to joining Rally, Holly held fund accounting positions at CarVal Investors and Pine River Capital Management. She began her professional career at KPMG LLP, primarily auditing financial services companies. Holly received her BBA from the University of Wisconsin – Eau Claire and is a CPA (inactive). Michael has more than 30 years of experience in information technology and business development. He joined Rally Ventures in November 2018. Prior to joining Rally, Michael was the CEO of Secure-24, a leading provider of comprehensive managed IT services. During his time at Secure-24, Michael led the company through nearly 200% growth over 6 years and the successful sale to NTT Communications in 2018. He was also the co-founder, Chief Technology Officer and Senior Vice President of Business Development at Appshop, a leading provider of hosted Oracle application and database technology. Michael’s experience includes positions with El Dorado Ventures, United Technologies, Pratt & Whitney, InfoSolutions and SimulTrans. Michael has designed and built industry-leading hosting and software delivery platforms, built and managed global information system support organizations, supported over 500 technology sales cycle wins, managed the implementation of more than 30 ERP and CRM deployments, and founded and built two companies. Michael served as Chairman of the board for 10thMagnitude, a leading Microsoft Azure partner that sold to Cognizant in 2020, and currently is a board member of Bugcrowd, Gappify, Spec, Ostra and Protera. He studied Network Engineering and Management at the University of California, Santa Cruz. Zenas has been a venture capitalist and CEO for more than 30 years. He joined Rally Ventures at its inception in 2012. 
Prior to joining Rally, Zenas co-founded and has co-managed Vesbridge Partners since 2004. He also served as the co-Managing Partner of SPVC (formerly St. Paul Venture Capital), which he joined in 1997. Some of Zenas’ notable investments include Acopia (acquired by F5), Granite Systems (acquired by Telcordia), Castle Networks (acquired by Unisphere), Corvil and VisiTrend (acquired by Carbon Black). He was CEO of Vivo Networks (acquired by Real Networks), CAECO (acquired by Mentor Graphics) and Control Automation (acquired by Universal Instrument). Zenas served as a board member of PictureTel, Natural Microsystems and Concord Data Systems and currently serves on the board of Ottometric. He started his career at Bain & Co. and BNY. Zenas received his MBA from the University of Chicago and his BA from Dartmouth. Tom has been a venture capitalist for more than 30 years and is a veteran early-stage technology investor, focusing on enterprise software and infrastructure companies. Tom joined Rally Ventures at its inception in 2012 as Venture Partner. Tom was one of the founding members of El Dorado Ventures in 1986 and drove the bulk of the returns for the firm during his career. He began his venture career in 1984 with Union Venture Corp., the venture capital arm of Union Bank. Prior to joining Union Venture, Tom held engineering and management positions with Hewlett-Packard and General Electric. Some of Tom’s notable investments include: Access Health (IPO, then acquired by McKesson), Braze (Nasdaq: BRZE), Clinical Research IO (purchased by The Riverside Company), Cyras (acquired by Ciena), Earthlink (Nasdaq: ELNK), Efficient Networks (IPO, then acquired by Siemens), Jigsaw (acquired by Salesforce), MediaQ (acquired by NVIDIA), Rogue Wave (Nasdaq: RWAV), Scout Labs (acquired by Lithium) and Swarm (acquired by Groupon). Tom serves on the boards of 401GO, Appdome, Epoch and Rev, and is a board observer at DecisionNext, PieEye and SenseIP. 
Tom received his MBA from UCLA and his BS from Iowa State University.
correct_foundationPlace_00083
FactBench
2
34
https://www.economist.com/business/2006/10/05/the-fugitive
en
The fugitive
https://www.economist.co…llback-image.png
https://www.economist.co…llback-image.png
[ "https://www.economist.com/cdn-cgi/image/width=16,quality=80,format=auto/sites/default/files/20061007issuecovEU400.jpg 16w, https://www.economist.com/cdn-cgi/image/width=32,quality=80,format=auto/sites/default/files/20061007issuecovEU400.jpg 32w, https://www.economist.com/cdn-cgi/image/width=48,quality=80,format=auto/sites/default/files/20061007issuecovEU400.jpg 48w, https://www.economist.com/cdn-cgi/image/width=64,quality=80,format=auto/sites/default/files/20061007issuecovEU400.jpg 64w, https://www.economist.com/cdn-cgi/image/width=96,quality=80,format=auto/sites/default/files/20061007issuecovEU400.jpg 96w, https://www.economist.com/cdn-cgi/image/width=128,quality=80,format=auto/sites/default/files/20061007issuecovEU400.jpg 128w, https://www.economist.com/cdn-cgi/image/width=256,quality=80,format=auto/sites/default/files/20061007issuecovEU400.jpg 256w, https://www.economist.com/cdn-cgi/image/width=360,quality=80,format=auto/sites/default/files/20061007issuecovEU400.jpg 360w, https://www.economist.com/cdn-cgi/image/width=384,quality=80,format=auto/sites/default/files/20061007issuecovEU400.jpg 384w, https://www.economist.com/cdn-cgi/image/width=480,quality=80,format=auto/sites/default/files/20061007issuecovEU400.jpg 480w, https://www.economist.com/cdn-cgi/image/width=600,quality=80,format=auto/sites/default/files/20061007issuecovEU400.jpg 600w, https://www.economist.com/cdn-cgi/image/width=834,quality=80,format=auto/sites/default/files/20061007issuecovEU400.jpg 834w, https://www.economist.com/cdn-cgi/image/width=960,quality=80,format=auto/sites/default/files/20061007issuecovEU400.jpg 960w, https://www.economist.com/cdn-cgi/image/width=1096,quality=80,format=auto/sites/default/files/20061007issuecovEU400.jpg 1096w, https://www.economist.com/cdn-cgi/image/width=1280,quality=80,format=auto/sites/default/files/20061007issuecovEU400.jpg 1280w, https://www.economist.com/cdn-cgi/image/width=1424,quality=80,format=auto/sites/default/files/20061007issuecovEU400.jpg 1424w", 
[ "The Economist" ]
2006-10-05T00:00:00
More twists in the fuss over options backdating | Business
en
The Economist
https://www.economist.com/business/2006/10/05/the-fugitive
More twists in the fuss over options backdating. Oct 5th 2006 | New York. AFTER earlier speculation that he was hiding out in Sri Lanka, Kobi Alexander, the founder and former boss of Comverse Technology, a software firm, seems to have chosen Namibia as his refuge. Mr Alexander is the highest-profile executive to be charged so far in the scandal surrounding the backdating of executive share options at over 100 American firms. When fraud charges were filed in August against Mr Alexander and two other former executives (all three had resigned in May), it emerged that he was on the run, and had transferred $57m to his native Israel in what prosecutors called a "money-laundering scheme". This article appeared in the Business section of the print edition under the headline "The fugitive", in the October 7th 2006 edition.
correct_foundationPlace_00083
FactBench
1
77
http://pipelinepub.com/1209/advertisers_page.html
en
Pipeline Publishing, Volume 6, Issue 7
[ "pipeline.com" ]
Nakina Systems provides Domain Control and Intelligence solutions to communications equipment and service providers worldwide. Our solutions enable service providers to scale new network service infrastructure more rapidly and cost-effectively. What makes Nakina different? We combine: Proven scalability to 10,000s of nodes; Deep function-by-function control for elements of any complexity; Built-in configurability and tier one operations processes. That’s why Nakina solutions are deployed in daily use to manage critical telecom network infrastructure in over twenty four countries around the world. Domain Control and Intelligence bridges the gap between "proven in the lab" and "ready for national and global rollout." Our ultimate objective is to enable service providers to roll out complex new infrastructure – Ethernet, IMS, optical, and wireless – without operational obstacles and expensive manual processes that slow deployment. IMS, VoIP, Ethernet and all their friends. Bring it on! Nothing stops Nakina. Tekelec, a global leader in core multimedia session control, mobile messaging and network intelligence, ensures scalable, secure and highly available communications. The company's market-leading signaling solutions enable the interworking of different network applications, technologies and protocols, providing a smooth transition to next-generation networks. Tekelec has more than 20 offices around the world serving customers in more than 100 countries, with corporate headquarters located near Research Triangle Park in Morrisville, N.C., U.S.A. For more information, please visit www.tekelec.com. Tekelec's heritage in signaling combined with its pioneering expertise in SIP has yielded an innovative approach to next-generation network evolution. Tekelec is the first and only company to help operators add SIP signaling to next-generation networks. SIP can deliver the promise of IP networks to the next-generation networks operators are deploying today. 
Our customers associate us with innovation, and for good reason. We were first with a purpose-built STP, first with signaling over IP, first with integrated number portability, and first with probeless monitoring. We implemented TALI (signaling over IP) in 1999, which later evolved to SIGTRAN for Orange, UK. For more than 30 years, Tekelec has consistently anticipated the evolution of the telecom market and developed the technology to deal with changing network complexities, enabling our customers to deliver their own innovative products and services. Comverse is the world's leading provider of software and systems enabling value-added services for voice, messaging, mobile Internet and mobile advertising; converged billing and active customer management; and IP communications. Comverse's extensive customer base spans more than 130 countries and covers over 500 communication service providers serving more than two billion subscribers. The company's innovative product portfolio enables communication service providers to unleash the value of the network for their customers by making their networks smarter. Comverse's solutions support flexible deployment models, including in-network, hosted and managed services, and can run on circuit-switched, IP, IMS or converged network environments. Comverse is a subsidiary of Comverse Technology, Inc. (CMVT.PK). For more information, visit www.comverse.com. Ontology Systems is unifying OSS/BSS data with unprecedented speed and accuracy, at a fraction of the industry's cost expectation. Ontology’s OSS/CAD application is an agile, low-cost solution to manage customers, services and network infrastructure by aligning OSS/BSS systems’ “dirty” data in a non-disruptive way, and without the need to run a huge transformation program. OSS/CAD achieves this by offering a radically new approach which uses semantics to align existing OSS/BSS systems through the bottom-up unification of their data.
Based in London, Ontology was founded in 2006 by Benedict Enweani and Leo Zancani. They were responsible for leading the Orchestream technology division to being acquired by Oracle, via its Metasolv acquisition. Between them, they have grown the Ontology team to include industry-leading experts from the communications and IT sector, as well as respected professionals from the field of formal semantics. Openet is a leading worldwide provider of event-processing and transaction-management solutions. We remain focused on delivering best-in-class network-edge solutions and specialized engagement processes that create business value from network activity. Through its open architecture and modular design, the award-winning FusionWorks™ FrameWork serves as the scalable foundation for a range of Openet products: Convergent Mediation, Convergent Charging, Network Edge Rating, Balance Manager, and Policy Manager. Openet is a global company, and its implementations include long-running engagements with the world's leading service providers such as BT, Orange, AT&T, Verizon Wireless, and Telstra. LTC International provides leading companies in the telecommunications and IT sectors with a unique level of service based on true subject matter expertise. Our Business Operations Architects® each have at least ten years of hands-on experience in service provider and IT intensive companies. Our consulting team has experience in all areas of business profit optimization, wireless and wireline communications, Internet services, as well as software and hardware planning, implementation and operations. LTC has incorporated more than 1,000 years of first-hand operating company and software application experience into our Business Management Toolkit. This comprehensive set of tools, guidelines, checklists, templates and training programs is designed to remove uncertainty and accelerate success for our clients. NEW PARADIGM RESOURCES GROUP, Inc.
(NPRG) is the nation's leading strategic consulting and research firm for innovators within the communications industry. NPRG provides business strategy and technology advice to our clients. To accomplish this, we identify, analyze and forecast emerging technologies and trends, support mission critical decision-making processes for service providers, technology developers and financial institutions, and deliver proven business strategies, product plans and market forecasts, enabling clients to succeed within evolving market conditions. NPRG has long published an extensive array of industry analysis reports to keep up with the ongoing competitive, technological and product evolution. To present our data and analyses on a continuous basis, NPRG has moved to a completely online, dynamic set of Continuous Information & Advisory ServicesSM presenting all the sectors we cover, including the CLECs, in the dynamic delivery format.
correct_foundationPlace_00083
FactBench
1
20
https://www.dell.com/en-us/dt/corporate/newsroom/announcements/2006/02/02142006-4012.htm
en
Comverse Integrates Insight with EMC CLARiiON for Advanced Multimedia Telecom Storage and Management Solutions
Dell Technologies is the leader in digital transformation, providing digital technology solutions, products, and services to drive business success.
en
Barcelona, Spain (3GSM World Congress 2006, Hall 2, Booth G141) - February 14, 2006 - EMC Corporation, (NYSE: EMC) the world leader in information management and storage, announced it has signed an OEM agreement with Comverse, a unit of Comverse Technology, Inc. (NASDAQ: CMVT), and the world's leading supplier of software and systems enabling network-based multimedia enhanced communication services, to integrate the EMC® CLARiiON® CX series of networked storage systems into Comverse InSight™ Open Services Environment. Modular and standards-based, Comverse InSight core components (such as voice ports, message and subscriber profile stores, and management services) are shared across services, reducing costs and speeding deployments. Through this technology integration, Comverse will provide operators with an efficient, highly functional and cost-effective solution for managing, storing and protecting data from multimedia messaging and other value added services. The integrated Comverse solution enables telecom service providers to manage, manipulate and deliver personal and public multimedia content to their subscriber community with the highest levels of performance and reliability. "Telecom service providers need an integrated, high-performance multimedia storage solution capable of supporting enhanced services in a converged environment," said BJ Jenkins, EMC's Vice President of Global Marketing. "The EMC CLARiiON CX series is an ideal fit to meet this growing need with carrier-grade NEBS and ETSI certified systems that facilitate cutting-edge enhanced services, including new next-generation multimedia, voice and data services from the industry leading Comverse InSight Open Services Environment." "In order to meet the subscriber's appetite for multimedia services, carriers are increasingly investing in data and multimodal interfaces," said Menashe Rothschild, CTO at Comverse. 
"The industry-leading EMC CLARiiON family of networked storage systems, combined with our multimedia management capabilities, enables us to support many new enhanced-services possibilities, such as video applications. By complying with NEBS and ETSI requirements, the Comverse multimedia platform delivers superior storage capabilities that enhance functionality and expand capacity while potentially reducing telecom service providers' operating expenses."
correct_foundationPlace_00083
FactBench
2
75
https://www.galatea-associates.com/
en
Galatea Associates
Wall Street Without the Suit   Boston - London - Durham - Tampa   Join Us
en
Galatea Associates
https://www.galatea-associates.com
Solving Wall Street's toughest problems through custom software solutions. Galatea Associates has established a rock-solid reputation among Wall Street investment banks for delivering mission-critical systems in a timely and efficient manner. We've built this reputation with our excellent teams and our unique organizational approach. Some of the largest financial firms in the world choose to partner with Galatea for several key reasons: Understanding of the Business. Our Associates, with the help of Galatea's training program, learn the fundamentals of the investment banking industry and, for this reason, have an unparalleled understanding of the business problems faced by our end-users. End-to-End Collaboration. Since our Associates know both the business and IT infrastructure of our clients' operations, we are uniquely positioned to collaborate with clients throughout project lifecycles. Our ‘lead from the front’ mindset means that our Associates get as deeply involved in the design phase as they do in the development, testing, and implementation phases. Technology Agnostic. Our Associates bring unique technical skills to the table, but we consider ourselves technologically agnostic in that we do not advocate for any single language or environment. We work with our clients to select the best combination of technologies to meet their needs. Galateans are able to work with any technology, develop in any language, and operate in any environment.
4768
dbpedia
2
7
https://stevanussugianto.wordpress.com/2013/03/12/concepts-of-programming-languages-chapter-2-evolution-of-the-major-programming-languages/
en
Concepts of Programming Languages – Chapter 2 – Evolution of The Major Programming Languages
2013-03-12T00:00:00
Review Questions: 1. In what year was Plankalkul designed? In what years was that design published? Plankalkul was designed in 1943 by Konrad Zuse as his proposal for his Ph. D. dissertation. In a lengthy manuscript dated 1945. The design was published in 1972. 2. Mention an interesting feature of Zuse's programs. An interesting feature…
en
Stevanus Sugianto
Review Questions: 1. In what year was Plankalkul designed? In what years was that design published? Plankalkul was designed in 1943 by Konrad Zuse as his proposal for his Ph.D. dissertation, described in a lengthy manuscript dated 1945. The design was published in 1972. 2. Mention an interesting feature of Zuse's programs. An interesting feature of Zuse's programs was the inclusion of mathematical expressions showing the current relationships between program variables. These expressions stated relationships that would be true during execution at the points in the code where they appeared. They are very similar to the assertions of Java and to those in axiomatic semantics. 3. What does Plankalkul mean? Plankalkul means program calculus. Zuse defined the language and wrote algorithms in it to solve a wide variety of problems. 4. Speedcoding was invented to overcome two significant shortcomings of the computer hardware of the early 1950s. What were they? They were the lack of indexing and the lack of floating-point operations. Speedcoding included the novel facility of automatically incrementing address registers, which did not appear in hardware until the UNIVAC 1107 computers of 1962, and the Speedcoding interpreter effectively converted the 701 to a virtual three-address floating-point calculator. 5. What is the number of bits in a single word of the UNIVAC I's memory? How are the bits grouped? A single word of the UNIVAC I's memory is 72 bits, grouped as 12 six-bit bytes. 6. What hardware capability that first appeared in the IBM 704 computer strongly affected the evolution of programming languages? Explain why. Indexing and floating-point instructions in hardware, because one of the primary reasons why the slowness of interpretive systems was tolerated from the late 1940s to the mid-1950s was the lack of floating-point hardware in the available computers. All floating-point operations had to be simulated in software, a very time-consuming process.
Because so much processor time was spent in software floating-point processing, the overhead of interpretation and the simulation of indexing were relatively insignificant. As long as floating-point had to be done by software, interpretation was an acceptable expense. 7. Who developed the Speedcoding system for the IBM 701? The Speedcoding system was developed by John Backus for the IBM 701 (Backus, 1954). 8. Who developed Short Code? Why is Short Code called automatic programming? Short Code was developed by John Mauchly. It is called automatic programming because it is not translated to machine code; rather, it was implemented with a pure interpreter. It clearly simplified the programming process, but at the expense of execution time. 9. Under what environmental considerations was Fortran developed? Which is the first version of Fortran? The environmental considerations under which Fortran was developed were: computers had small memories and were slow and relatively unreliable; the primary use of computers was for scientific computations; there were no existing efficient and effective ways to program computers; and because of the high cost of computers compared to the cost of programmers, speed of the generated object code was the primary goal of the first Fortran compilers. The first version of Fortran is Fortran 0. 10. What was the most significant feature added to Fortran I to get Fortran II? The most significant feature added to Fortran I to get Fortran II was the independent compilation of subroutines. Without it, any change in a program required that the entire program be recompiled. 11. What control flow statements were added to Fortran IV to get Fortran 77? The control flow statements added to Fortran IV to get Fortran 77 were logical loop control statements and an If with an optional Else clause. 12. Which version of Fortran was the first to have any sort of dynamic variables?
Fortran 90 (ANSI, 1992) was the first version of Fortran to have any sort of dynamic variables. 13. Which version of Fortran was the first to have character string handling? The first version of Fortran to have character string handling was Fortran 77. It became standard in 1978 and retained most of the features of Fortran IV. 14. Why were linguists interested in artificial intelligence in the late 1950s? Linguists were interested in artificial intelligence in the late 1950s because they were concerned with natural language processing. 15. What are the different data types and structures in Common LISP? Common LISP has a large number of data types and structures, including records, arrays, complex numbers, and character strings. It also has a form of packages for modularizing collections of functions and data, providing access control. 23. In what year did the COBOL design process begin? The COBOL design process began in 1959, when a meeting on the subject of a common language for business applications, sponsored by the Department of Defense, was held at the Pentagon. 27. Why was BASIC an important language in the early 1980s? BASIC was an important language in the early 1980s because it was easy for beginners to learn, especially those who were not science oriented, and its smaller dialects can be implemented on computers with very small memories. Problem Set: 1. What features of Fortran IV do you think would have had the greatest influence on Java if the Java designers had been familiar with Fortran? I think they would be the capability of passing subprograms as parameters to other subprograms, explicit type declarations for variables, and a logical if construct. 3. Write a short history of the Fortran 0, Fortran I, Fortran II, and Fortran IV systems. Fortran 0 was the first version of Fortran.
It stated that it would provide the efficiency of hand-coded programs and the ease of programming of the interpretive pseudocode systems. Fortran was developed under these environmental considerations: computers had small memories and were slow and relatively unreliable; the primary use of computers was for scientific computations; there were no existing efficient and effective ways to program computers; and because of the high cost of computers compared to the cost of programmers, speed of the generated object code was the primary goal of the first Fortran compilers. Fortran I included input/output formatting, variable names of up to six characters, user-defined subroutines, the If selection statement, and the Do loop statement. There was no data typing: variables whose names began with I, J, K, L, M, or N were implicitly integer type, and all others were implicitly floating-point. Fortran II fixed many of the bugs in the Fortran I compilation system and added some significant features to the language, the most important being the independent compilation of subroutines. Without independent compilation, any change in a program required that the entire program be recompiled. The capability of including precompiled machine language versions of subprograms shortened the compilation process considerably and made it practical to develop much larger programs. Fortran IV became one of the most widely used programming languages of its time. It was an improvement over Fortran II in many ways. It added explicit type declarations for variables, a logical If construct, and the capability of passing subprograms as parameters to other subprograms. 9. Why, in your opinion, did Fortran allow names that began with I, J, K, L, M, and N as implicitly integer type? In my opinion, Fortran allowed it because of programmers' habit of using those characters as names of integer variables, and because it spared them from having to declare such variables explicitly. 10.
Outline the major developments in ALGOL 60. Major developments in ALGOL 60: – The concept of block structure was introduced. This allowed the programmer to localize parts of programs by introducing new data environments, or scopes. – Two different means of passing parameters to subprograms were allowed: pass by value and pass by name. – Procedures were allowed to be recursive. The ALGOL 58 description was unclear on this issue. Note that although this recursion was new for the imperative languages, LISP had already provided recursive functions in 1959. – Stack-dynamic arrays were allowed. A stack-dynamic array is one for which the subscript range or ranges are specified by variables, so that the size of the array is set at the time storage is allocated to the array, which happens when the declaration is reached during execution. 15. Are there any nonprocedural languages other than Prolog? Yes; SQL is a nonprocedural language. 16. What is your opinion of the argument that languages that are too complex are too dangerous to use, and that we should therefore keep all languages small and simple? My opinion is that such languages are not inherently dangerous. They become dangerous when an amateur tries to create a complex program with little understanding of the concepts of the language. If it is feasible to create a language that is small and simple, why not, as long as it provides everything we need to create a program? After all, simplicity is one of the characteristics a programming language should have. 24. Why, in your opinion, do new scripting languages appear more frequently than new compiled languages? In my opinion, it is because a scripting language is easier for a beginner to learn and use than a compiled language, and because scripting languages offer easy built-in interfaces and give the fastest turnaround from script to execution. 25. Give a brief general description of the Java servlet.
Java servlets are the Java platform technology of choice for extending and enhancing Web servers. Servlets provide a component-based, platform-independent method for building Web-based applications, without the performance limitations of CGI programs. And unlike proprietary server extension mechanisms (such as the Netscape Server API or Apache modules), servlets are server- and platform-independent.
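The pass-by-value versus pass-by-name distinction in the ALGOL 60 answer above can be sketched with thunks (zero-argument lambdas), which is the idea behind Jensen's device. This is a minimal illustration in Python, not ALGOL; `algol_sum`, `Env`, and `env` are hypothetical names invented for the example.

```python
# Pass by value evaluates the argument expression once, before the call.
# Pass by name re-evaluates the argument expression on every use; we
# simulate that here by passing zero-argument lambdas ("thunks").

def algol_sum(lo, hi, set_i, term):
    """Simulate an ALGOL-style Sum(i, lo, hi, term) with i and term
    passed by name: set_i(k) assigns the caller's variable i, and
    term() re-evaluates the caller's expression under the current i."""
    total = 0
    for k in range(lo, hi + 1):
        set_i(k)           # assigning the by-name parameter assigns the caller's i
        total += term()    # the expression is re-evaluated on each access
    return total

class Env:                 # hypothetical holder for the caller's variable i
    i = 0

env = Env()

# Sum of i*i for i = 1..4: 1 + 4 + 9 + 16 = 30
by_name = algol_sum(1, 4,
                    set_i=lambda k: setattr(env, "i", k),
                    term=lambda: env.i * env.i)

# With pass by value, the expression would be evaluated once up front:
env.i = 1
frozen_term = env.i * env.i                          # frozen at 1
by_value = sum(frozen_term for _ in range(1, 5))     # 4, not 30

print(by_name, by_value)   # 30 4
```

Delaying evaluation behind a thunk is what lets the same textual argument produce a different value on each access, which ordinary (by-value) Python arguments cannot do.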
[Source: The Craft of Coding – "Algol – Page 2", https://craftofcoding.wordpress.com/category/algol/, 2021-02-18. Posts about Algol written by spqr.]
Language design in the 1960s was dominated by attempts to improve upon Algol 60. Hoare noted in 1973 [1] that ALGOL-60 was "… a language so far ahead of its time, that it was not only an improvement on its predecessors, but also on nearly all its successors".

Niklaus Wirth was a graduate student at the University of California, Berkeley when he started playing around with languages. He joined a research group which was engaged with the implementation and improvement of a dialect of ALGOL-58, NELIAC. He described the compiler as "an intricate mess", and the process one of "1% science, and 99% sorcery" [1]. The first language Wirth designed, leading to his dissertation, was Euler – as Wirth himself put it, "a trip with the bush knife through the jungle of language features and facilities" [1]. Euler had academic elegance, but no real practical value; however, it did catch the attention of the IFIP Working Group engaged in designing the successor to ALGOL-60.

There seemed to be two camps here: one which wanted to push the boundaries of language design, and another which wanted to extend ALGOL-60. Wirth belonged to the latter group. In 1965, three reports by Wirth, Seegmüller, and Wijngaarden described three different quasi-complete languages. The complexity of the design process, involving far too many people, eventually led Wirth to become disheartened with it, and he went off to develop his own version of ALGOL-60's successor. Designed with contributions from Tony Hoare, this language would become ALGOL-W.

Wirth's first significant language was PL360, a byproduct of the ALGOL-W effort. The IBM 360 upon which ALGOL-W was implemented offered the choice of assembly language or Fortran compilers, neither of which was very attractive. PL360 [2] was a tool with which to implement ALGOL-W. ALGOL-W had a number of applications, but was deficient as a systems programming language.
PL360 was to become more successful than ALGOL-W, largely because ALGOL-W was a complex language, and the target computers inadequate. In the fall of 1967, Wirth returned to Switzerland to begin work on the language that would become most closely associated with his language design efforts – Pascal. Wirth would go on to create a trinity of languages, none of which were that closely related, but all of which had one thing in common: they were "ALGOL-like" languages – Pascal, Modula-2 and Oberon.

– Pascal – Based largely on Algol W, including the use of its while and case statements, and record structures. There were syntactic differences; however, Algol 60 was almost a subset of Pascal.
– Modula-2 – Wirth noted that Modula-2 "includes all aspects of Pascal, and extends them with the module concept".
– Oberon – Evolved from Modula-2 by very few additions, and several subtractions.

Note that "ALGOL-like" really implied: a procedural paradigm, a rigorously defined syntax, traditional mathematical notation (without the nonsense of symbols like ++), block structure providing identifier scope, the availability of recursion, and a strict, static data typing system. [3]

[1] Wirth, N., "From programming language design to computer construction", CACM, 28(2), pp. 160-164 (1985)
[2] Wirth, N., "PL360, a programming language for the 360 computers", Journal of the ACM, 15(1), pp. 34-74 (1968)
[3] Wirth, N., "Modula-2 and Oberon", in ACM Conf. on History of Programming Languages, pp. 3-1–3-10 (2007)

Algol was the first real "algorithmic" language, more so than Fortran, because the latter contained a lot of structures that we couldn't consider pleasant, and most of them had to do with what is considered breaking the sequential flow. In early Fortran there were a lot of "jump" statements, either explicitly goto, or thinly veiled as goto (see arithmetic if).
In early languages goto (or go to) was often used in place of repetitive statements, either because the programmer was used to jumps (from assembler), mistrusted loops, or just didn't consider them. Algol used the go to statement and associated labels to break the sequential flow.

Consider the series formed by repeatedly computing S := S + 1/n/n, where S = 0 initially, and n takes on the sequence of values 1, …, ∞. Of course ∞ makes for a lot of work, so it is easier to stop the process at some point, let's say 1.6449 (π²/6). An Algol program to perform the task needs two jumps. The first one, which uses the label L1, mimics a loop, repetitively processing the next value of n, until such time as S is greater-than-or-equal-to 1.6449; then the second jump is invoked, to label L2, effectively exiting the loop.

In "The Humble Programmer", written by Dijkstra in 1972, he made the following statement: "Another lesson we should have learned from the recent past is that development of 'richer' or 'more powerful' programming languages was a mistake in the sense that these baroque monstrosities, these conglomerations of idiosyncrasies, are really unmanageable both mechanically and mentally." And to be honest, he was probably right… but that wasn't even the worst of it. If he thought Algol 68 was a monster, C++ and Ada would balloon to gargantuan proportions.

Have we lost the ability to create simple programming languages? I don't necessarily mean for the purpose of creating commercial applications – the nature of interaction with hardware, networking etc. makes languages with large-scale structure necessary. But along the way we have used these commercial languages to teach novices to program – and it hasn't really worked. One of the reasons why students become disengaged with introductory programming courses may be the vast amounts of language knowledge required, on top of learning problem solving skills, and algorithm design.
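The goto-driven Algol fragment described earlier – a label L1 repeatedly executing S := S + 1/n/n, with a jump to L2 once S reaches 1.6449 – maps directly onto a structured loop, which is the point the structured-programming movement was making. A minimal Python sketch of that control flow (the threshold and variable names follow the text; the rest is illustrative):

```python
# Structured equivalent of the two-jump Algol fragment: keep adding
# 1/n**2 until the partial sum of the series reaches 1.6449 (~ pi**2/6).
# The "jump back to L1" becomes the loop body; the "jump to L2" becomes
# the loop's exit condition.
s = 0.0
n = 0
while s < 1.6449:      # exit (the L2 jump) once the threshold is met
    n += 1             # next value of n (the L1 jump)
    s += 1.0 / (n * n)
```

Since the partial sums approach π²/6 ≈ 1.644934 from below, the loop terminates shortly before the limit, after tens of thousands of terms.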
It may be too much for people not used to thinking in a "programming" mode. Learning about language-specific syntax, memory management, design, testing, blah, blah, blah – it may be overwhelming. Which is surprising, but then those of us who learned programming in the 1980s learned to program in Pascal, or maybe Fortran. Before languages added things like OO, became "functional", and masters of everything that a language could be, they were simple – no really, they were. There were languages created specifically for teaching programming. Pascal was one. S-Algol was another (it appeared after Pascal). Here is a program in S-Algol which performs a bubble sort:

let n = readi
let x = vector 1::n of 0.0
for i = 1 to n do x(i) := readr
for i = 1 to n-1 do
    for j = 1 to n-i do
        if x(j) > x(j+1) do
        begin
            let temp = x(j)
            x(j) := x(j+1)
            x(j+1) := temp
        end
write "The sorted numbers are 'n'"
for i = 1 to n do write x(i), "'n"

The structure of this language was very simple. It was from a pre-OO time when the programming world was simpler, happier.

When Algol 60 first appeared, it heralded a new era for programming languages, one which encompassed the idea of "blocks" of code, amongst other things. So why did such a promising language fail? There are likely a number of reasons. Firstly, Algol 60 had no I/O in the specification. This meant that when Algol was implemented on a particular machine, I/O was added, and so the specifications for I/O differed among implementations. Secondly, the 1960s were awash with new programming languages – everyone wanted theirs to be the next "big thing". It is harder to compete in this sort of an environment. There was also very little time between the release of Algol 60 and beginning work on its successor, Algol 68. Algol may have been more successful if effort had been spent on improving Algol 60, rather than creating a whole new language. Thirdly, and likely the biggest issue, was IBM.
Algol 60 made some headway in Europe, but never really established itself in North America. Why? IBM's US domestic installed market share for computer systems was around 71.4% in the 1960s. Its main competitors, Honeywell and Univac, held 8.1% and 9.6% of the market respectively [1]. Algol 60 was supposed to take over some of Fortran's territory, but due to IBM's dominance, this never happened. In fact Fortran so dominated the industry that even IBM's other language, PL/I (1964), failed to make any headway. On the business side, IBM adopted Cobol in 1962 as its primary development language. By the time the tide started to turn against Fortran (in the late 1960s), driven by the surge of interest in structured programming, Algol 60 had been usurped by Algol 68. Algol 68 was too complex a language, and other players in the guise of Pascal (and later C) would take over. Pascal was block-oriented, and structured, and would lead the way for programming in the 1970s. By the time S-algol appeared in the late 1970s, it was over for Algol. Which is a pity, because S-algol showed real promise.

[1] Arnst, C., "Charts show IBM outran others in '60s", Computerworld, Nov. 7, p. 57 (1977)

By 1977, Fortran had undergone likely its greatest metamorphosis, from an unstructured to a quasi-structured language. At the eleventh hour the revision for the F77 standard was modified to reduce the impact of goto statements, matching other languages where their influence was minimal, or even non-existent. The changes made Fortran 77 vastly different from its predecessor, Fortran 66. Of major importance was the inclusion of a "block IF", which took the following form:

IF (E) THEN
    ...
END IF

The use of THEN as a new keyword allowed a block of statements to be incorporated until the terminating keyword END IF was reached. This also solved the dangling else problem.
This was augmented by the addition of the keyword ELSE, which allowed for a group of statements to be actioned if the preceding IF is not satisfied.

IF (E) THEN
    ...
ELSE
    ...
END IF

By the mid-70s, Fortran was likely coerced into making these changes due to the competition from C and Pascal, both of which offered these conditionals. These new F77 constructs allowed for improved program readability, especially through eliminating the need for statement labels and goto statements. Here is an example:

IF (K.GT.0) THEN
    POSNUM = POSNUM + 1
ELSE IF (K.LT.0) THEN
    NEGNUM = NEGNUM + 1
ELSE
    ZEROS = ZEROS + 1
ENDIF

The emergence of Ada did nothing to evolve the if statement. Like Pascal and F77, it used a then keyword, borrowed the else-if idea from Algol 68 (renaming it elsif), and used the same style of structure terminator, end if, as F77. By this stage, if statements had likely evolved as far as they would, and new languages were just selecting appropriate concepts from existing languages.

if C1 then
    S1
elsif C2 then
    S2
elsif Cn then
    Sn
else
    S(n+1)
end if;

Fortran 90 would go on to finally make the arithmetic if obsolescent. Python would alter very little, adopting the elif of Algol 68, and the lack of parentheses.

if x == 0:
    zeroes = zeroes + 1
elif x < 0:
    negnum = negnum + 1
else:
    posnum = posnum + 1

Julia as well uses an amalgam of structural pieces.

if x < 0
    negnum = negnum + 1
elseif x > 0
    posnum = posnum + 1
else
    zeroes = zeroes + 1
end

We are now in the age of mix-and-match, and it is unlikely the if statement will evolve to any great extent.

The design of the if statement in Algol 60 was likely the pinnacle of its evolution. From here on in, every language tweaked its syntax, but there were no major changes. Languages like Algol 68, C, and Pascal all had conditional statements. Algol 68, although carrying the "Algol" moniker, was a different language altogether.
Whereas Algol 60 required the use of explicit compound statements within an if statement if more than one statement was being controlled, Algol 68 incorporated the use of control structure terminators. For the if statement this meant the use of the reversed keyword fi. Algol 68 still lacked the parentheses of Fortran, but also had no requirement for compound statements, as each section was self-delineated. It had the following general form:

if C then
    ...
else
    ...
fi

This had the added effect of eliminating the dangling-else problem of Algol 60. Algol 68 also added the keyword elif, a short-hand to allow for a series of else-if statements:

if C1 then
    ...
elif C2 then
    ...
elif C3 then
    ...
else
    ...
fi

Here is an example:

if x>0 then
    posNum := posNum + 1;
elif x<0 then
    negNum := negNum + 1;
else
    zeros := zeros + 1;
fi

The if statement of C simplified that of Algol 60, deleting the then clause, and adding parentheses to enclose the conditional expression. It had the following general form:

if (C)
    statement1;
else
    statement2;

However, similar to Algol 60, groups of statements require the use of compound statements delineated by { }, and C also suffers from the dangling-else problem of Algol 60. Here is an example:

if (x>0)
    posNum = posNum + 1;
else if (x<0)
    negNum = negNum + 1;
else
    zeros = zeros + 1;

Pascal, which arrived at a similar time to C, has a syntax similar to that of C – except its logical expression was bracket-less, and it used the then keyword, like Algol 60. Like Algol 60, it also suffered from the dangling-else problem, and required the use of begin-end delineators for a compound statement.

if C then S1 else S2;

Fortran did not make any inroads into modifying the if statement until later. Likely spurred on by Algol 60, Fortran IV introduced the logical if statement in 1965. It had the following form:

IF (E) STATEMENT

where E was a logical expression, using operators of the form .EQ. for =, and .LE. for ≤.
The statement was any statement except a DO statement or another logical IF. However, unlike Algol 60, there were no compound statements, and no keyword corresponding to else. Both of these had to be achieved by means of goto statements. In this sense it almost mimicked an if-else statement. Consider the example below:

      IF (A .LE. 0) GOTO 15
      W = X ** A
      GOTO 20
   15 W = 0
   20 ...

In this case, if the value of A is less than or equal to zero, the program jumps to statement 15, setting W to 0. Otherwise it calculates W = X ** A, and jumps to statement 20. Notice that the Fortran conditional "operators" are stropped by the use of periods, e.g. .EQ.. This was done to avoid potential ambiguity: the expression A LE 0 could also have been interpreted as the variable ALE0.

Fortran 66, the first industry standard, made no changes to the if statement. There were a number of differences between Fortran (IV) and Algol (60):

– Fortran used mnemonics to represent conditional operators, e.g. .LE., versus Algol's ≤ (in some implementations <= was used due to the non-availability of ≤).
– Fortran uses parentheses, ( ), to separate the logical expression from the statement, whereas Algol uses the additional keyword then.
– Fortran (66) required that each arithmetic statement on either side of a conditional be of the same datatype. This is because A.GT.B was often translated to A-B.GT.0. (This disappeared in F77.)

By all accounts, Fortran IV and 66 were extremely deficient with respect to conditional statements. The next major changes were not to appear until Fortran 77. Consider code that looked like this in Algol 60:

if k>0 then posNum := posNum + 1
else if k<0 then negNum := negNum + 1
else zeros := zeros + 1

The equivalent in Fortran 66 would be:

      IF (K.GT.0) GOTO 30
      IF (K.LT.0) GOTO 31
      ZEROS = ZEROS + 1
      GOTO 47
   30 POSNUM = POSNUM + 1
      GOTO 47
   31 NEGNUM = NEGNUM + 1
   47 ...

How did if evolve in other languages – Algol 68, C, Pascal?

Arguably one of the most important control structures to evolve is "if".
Without it, programs couldn't make any sort of decisions. Few algorithmic languages, apart from Plankalkül (1948), contained conditional statements. Plankalkül formed conditional statements with the help of a symbol which was an arrow with a period above it, used in the following manner: the left side of the statement, B, signifies the condition (Bedingung), and is an expression with a boolean value; the right side, a, is an arbitrary statement. If B evaluates to 0 (nein), then the statement ends here; otherwise, if B is 1 (ja), then the statement continues with a. There is no "else" statement. Heinz Rutishauser's Superplan (1949-1951) did not have a decision statement.

Decision statements in programming languages are intrinsically linked to branch instructions in assembler. The first language to use something akin to the modern form of the if statement was likely Fortran I, which used an if statement as a form of three-way goto statement:

IF (E) L1, L2, L3

The expression E is evaluated, and one of the alternative paths L1, L2, and L3 is chosen based on whether E is negative, zero or positive. This became known as the arithmetic if. This could be used to derive a three-way decision statement of the form:

      IF (X-Y) 10, 10, 30
   10 MAXNUM = Y
      GO TO 20
   30 MAXNUM = X
   20 ...

This says that if X-Y is less than or equal to zero, then the maximum is Y; otherwise the maximum is X. This made sense in the context of unstructured jumps using go to. This allowed for a very limited decision structure, where the expression always had to be expressed in terms of some numeric output.

In 1957-58 John McCarthy, developer of Lisp, was writing a series of routines for legal chess moves in Fortran, which prompted him to invent conditional expressions.
He found the arithmetic if construct from Fortran I and II "awkward to use" [McCarthy81], and found it more natural to invent a Fortran function XIF(M,N1,N2) whose value was N1 or N2, based on whether M was zero or not (it was written in machine language). The function was likely not that efficient, as it required all three arguments to be evaluated before XIF() was entered. In Lisp, the conditional took the form of the cond function:

(cond (condition1 result1)
      (condition2 result2)
      ...
      (T resultN))

Later a more "traditional"-looking conditional operator was included in the specifications for Lisp, and appeared as follows:

X = IF (N .EQ. 0, ICAR(Y), ICDR(Y))

McCarthy suggested the use of this concept in Algol 58 when he was a member of the Algol committee. In the Algol 58 preliminary report, the if statement took the form:

if (a>0); c:=a↑2↓×b↑2↓
if (a<0); c:=a↑2↓+b↑2↓
if (a=0); go to bed

Algol 58 did not really progress much, and was superseded by Algol 60. Algol 60 added the keyword then, to separate the logical expression from the statement to be executed. Many considered this if-then combination to make the statement more readable. The Algol statement was also extended to include an "else" part. Here is an example of an if-then-else in Algol 60:

if x > 0 then pos := pos + 1
else negzero := negzero + 1

This led to the ambiguity we know today as the "dangling else". Whereas a statement such as:

if x=0 then if y=0 then m:=m+1

is not ambiguous, the following statement could be:

if x=0 then if y=0 then m:=m+1 else n:=n-1

Is 1 to be subtracted from n when x is non-zero, whatever the value of y, or when x is zero but y is not? A conundrum. Further, these if statements were constrained to the control of a single statement, which limited their usefulness. Algol 60 dealt with this through the use of the compound statement it had introduced, using the keywords begin and end.
For example, a piece of code to swap two numbers if x < y:

if x<y then begin dummy:=x; x:=y; y:=dummy end

Or, written in a more readable manner (many early languages crammed as much as they could on one line – blame punch-cards):

if x<y then
begin
    dummy:=x;
    x:=y;
    y:=dummy
end

This structure could also be used to reduce the dangling-else problem:

if x=0 then
begin
    if y=0 then m:=m+1 else n:=n-1
end

REF(S):
[McCarthy81] McCarthy, J., "LISP Session", History of Programming Languages, pp. 173-197, ACM (1981)
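McCarthy's XIF(M,N1,N2), described above, had to evaluate all three arguments before the function was entered; the conditional expression he championed evaluates only the selected branch. A small Python sketch of the distinction, using thunks to delay evaluation (all names here are illustrative, not from the original text):

```python
def xif_lazy(m, then_thunk, else_thunk):
    # McCarthy-style conditional expression: evaluate the predicate,
    # then evaluate ONLY the chosen branch. Wrapping the branches in
    # parameterless lambdas (thunks) delays their evaluation, unlike
    # the eager Fortran function XIF, whose arguments were all
    # evaluated at the call site.
    return then_thunk() if m == 0 else else_thunk()

y = []                 # y[0] would raise IndexError if evaluated eagerly
m = 0 if y else 1      # m == 0 means "y is non-empty"
value = xif_lazy(m, lambda: y[0], lambda: "empty")
```

Because `lambda: y[0]` is never called when `m` is non-zero, the unsafe branch is never evaluated — exactly the property an eager XIF cannot provide.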
[Source: Stack Overflow – "Was ALGOL ever used for "mainstream" programming?", https://stackoverflow.com/questions/1463321/was-algol-ever-used-for-mainstream-programming, 2009-09-23]
Nothing like responding to 2-year-old threads. I program in ALGOL almost daily. I am a programmer on a Unisys ClearPath mainframe, and the majority of the system code is written in ALGOL or variants. The Burroughs B5500 was really designed around the language, so it is a pretty efficient language/compilation process. Granted, this version is ALGOL with some extensions, like limited classes (structure blocks), etc.

i := 80;
while i > 0 do
begin
    scan ptrRay:ptrRay for i:i until in ALPHA;
    scan ptrEnd:ptrRay for i:i while in ALPHA;
    if i > 0 then
    begin
        replace nextToken by ptrRay for (offset(ptrEnd) - offset(ptrRay));
    end;
end;

That code scans for ALPHA-only tokens. It uses the OFFSET function, which is a little more costly than doing the residual count math yourself (i, starti, etc.).

Algol 58 seems to have been the most successful in terms of important applications. From Wikipedia: JOVIAL is an acronym for "Jules Own Version of the International Algorithmic Language." The "International Algorithmic Language" was a name originally proposed for ALGOL 58. It was developed to compose software for the electronics of military aircraft by Jules Schwartz in 1959. Then: Notable systems using JOVIAL include the Milstar Communications Satellite, Advanced Cruise Missile, B-52, B-1B, B-2 bombers, C-130, C-141, and C-17 transport aircraft, F-111, F-15, F-16 (prior to Block 50), and F-117 fighter aircraft, LANTIRN, U-2 aircraft, E-3 Sentry AWACS aircraft, Navy Aegis cruisers, Army Multiple Launch Rocket System (MLRS), Army UH-60 Black Hawk helicopters, F100, F117, and F119 jet engines, the NORAD air defense & control system (Hughes HME-5118ME system) and RL-10 rocket engines. Airborne radar systems with embedded JOVIAL software include the APG-70, APG-71 and APG-73.

ALGOL 68 was used in part of DRA for the same purpose. cf.
Wikipedia: The Defence Research Agency (normally known as DRA) was an executive agency of the UK Ministry of Defence (MOD) from April 1991 until April 1995. At the time the DRA was Britain's largest science and technology organisation.

DRA's Algol68 compiler was finally open-sourced in April 1999 and is now available for linux, for download from sourceforge. (However, an interpreter for "Algol68g" is easier to use.)

ICL's Algol68 was/is S3 – it was developed by the UK company International Computers Limited (ICL) for its 2900 Series mainframes. It is a system programming language based on ALGOL 68, but with data types and operators aligned to those offered by the 2900 Series. It was the implementation language of the operating system VME. There are (at least) two other British operating systems – Flex and Cambridge_CAP_computer – written in Algol68 variants. And also one Soviet OS: Эльбрус-1 (Elbrus-1), but I have yet to find any of their source code. (If anyone can find and distribute this source code, please let me know.) BTW: I believe that VME is still running – in production – as a Linux/Unixware guest VM, mostly at Commonwealth of Nations Custom/Immigration services.

Also over the same period the USSR was using Algol68, cf. history link. Algol68 is used in Russian telephone exchanges. And Algol58 was used in the Russian "Buran/Буран" Space Shuttle landing system. ALGOL 68 was internationalized in 1968. I suspect there are other Algol projects in other countries, esp. in German, Dutch, Japanese and Chinese, but I have no details.

If you want to actually try out Algol68 and/or contribute your code, check out Rosettacode's ALGOL 68 repository, then as a class project try one of the "Tasks not implemented".

However, what's unclear is, was Algol (pure Algol, not any of its derivatives like Simula) ever actually used for any "real" programming in any way? Please avoid the term "real" programming. "Real" – as opposed to what? Imaginative?
By "real", I mean used for several good-sized projects other than programming language/CS research, or by a significant number of developers (say, > 1000).

Yes. It was used for a certain number of projects on which worked a certain number of developers. Only, what is often misinterpreted today is this: in those days computers weren't exactly a household commodity. Hell, they weren't that 30 years ago, let alone 60. Programming was done in computer centres which were either in government ownership (military, academic, institutes of various kinds) or in private enterprises (large companies). And programming wasn't a profession – it was something which engineers, mathematicians, scientists and the like used to do when their work was done on paper … or they had specialized operators who did it for them. Often women, who may or may not have had a scientific background in that particular field – they were "language translators", in lack of a better term (and my bad English).

Programming theories and research were at their beginnings … vendors being few (and naturally uncooperative with each other) … each of them used their own extensions, and often programs written for one didn't work well with the other vendor's systems. There wasn't a "right way" to do something … you had this and that, and you used whatever catch you could figure out to work around your problem.

But, I've wandered off. Let me get back to the number of people. This also goes for several other languages; Fortran and Cobol, for example. People say, "very few use it". That's simply not true. What is true is that a small percentage of people uses it today, but a larger percentage of people used to use it. As I said, in those days only the sci. and eng. community used to do it. And their number was relatively small, compared to the total population. Nowadays, everybody uses computers, but the absolute number of engineers, mathematicians and the like is pretty much the same.
So it seems that nobody uses those languages anymore … while in reality, for certain specialized languages (well, nowadays this goes for Fortran and Cobol more than Algol), the number of users is pretty much constant. Personally, the only Algol programming I have ever done was on paper, thus the curiosity. I know I didn't answer your question, but just wanted to clear this up. Algol was a little "before my time".
[Source: The Burroughs 205 and 220 Blog – "Algol", https://datatron.blogspot.com/2018/07/algol-58-international-algebraic.html, by Paul Kimpel. Summary: "By the early 1950s, the number and variety of electronic computers were increasing, the scope of their application to science, business, a..."]
[Source: Stevanus Sugianto – "Concepts of Programming Languages – Chapter 2 – Evolution of The Major Programming Languages", https://stevanussugianto.wordpress.com/2013/03/12/concepts-of-programming-languages-chapter-2-evolution-of-the-major-programming-languages/, 2013-03-12]
Review Questions: 1. In what year was Plankalkul designed? In what years was that design published? Plankalkul was designed in 1943 by Konrad Zuse as his proposal for his Ph. D. dissertation. In a lengthy manuscript dated 1945. The design was published in 1972. 2. Mention an interesting feature of Zuse’s programs. An interesting feature of Zuse’s programs was the inclusion of mathematical expressions showing the current relationships between program variables. These expression stated that would be true during execution at the points in the code where they appeared. These are very similar to the assertions of Java and in those in axiomatic semantics. 3. What does Plankalkul mean? Plankalkul means program calculus. It was defined and the algorithms were written in language to solve a wide variety of problems by Zuse. 4. Speedcoding was invented to overcome two significant shortcomings of the computer hardware of the early 1950s. What were they? They were the novel facility of automatically incrementing address registers which didn’t appear in hardware until the UNIVAC 1107 computers of 1962. The speedcoding interpreter effectively converted the problem of 701 ti a virtual three-address floating-point calculator. 5. What is the number of bits in a single word of the UNIVAC I’s memory? How are the bits grouped? The number of bits in a single word of the UNIVAC I’s memory is 72 bits and the bits are grouped as 12 six-bit bytes. 6. What hardware capability that first appeared in the IBM 704 computer strongly affected the evolution of programming language? Explain why. Indexing and floating-point instructions in hardware, because one of the primary reasons why the slowness of interpretive system was tolerated from the late 1940s to the mid-1950s was the lack of floating-point hardware in the available computers. All floating-point operations had to be simulated in software, a very time-consuming process. 
Because so much processor time was spent on software floating-point processing, the overhead of interpretation and the simulation of indexing was relatively insignificant. As long as floating point had to be done in software, interpretation was an acceptable expense.

7. Who developed the Speedcoding system for the IBM 701?
The Speedcoding system was developed by John Backus for the IBM 701; it is an example of such a system (Backus, 1954).

8. Who developed Short Code? Why is Short Code called automatic programming?
Short Code was developed by John Mauchly. It is called automatic programming because it was not translated to machine code; rather, it was implemented with a pure interpreter. This clearly simplifies the programming process, but at the expense of execution time.

9. Under what environmental considerations was Fortran developed? Which is the first version of Fortran?
The environmental considerations under which Fortran was developed were:
- Computers had small memories and were slow and relatively unreliable.
- The primary use of computers was for scientific computations.
- There were no existing efficient and effective ways to program computers.
- Because of the high cost of computers compared with the cost of programmers, speed of the generated object code was the primary goal of the first Fortran compilers.
The first version of Fortran is Fortran 0.

10. What was the most significant feature added to Fortran I to get Fortran II?
The most significant feature added to Fortran I to get Fortran II was the independent compilation of subroutines. Without it, any change in a program required that the entire program be recompiled.

11. What control flow statements were added to Fortran IV to get Fortran 77?
The control flow statements added to Fortran IV to get Fortran 77 were logical loop control statements and an If with an optional Else clause.

12. Which version of Fortran was the first to have any sort of dynamic variables?
Fortran 90 (ANSI, 1992) was the first version of Fortran to have any sort of dynamic variables.

13. Which version of Fortran was the first to have character string handling?
The first version of Fortran to have character string handling was Fortran 77, which became standard in 1978 and retained most of the features of Fortran IV.

14. Why were linguists interested in artificial intelligence in the late 1950s?
Linguists were interested in artificial intelligence in the late 1950s because they were concerned with natural language processing.

15. What are the different data types and structures in Common LISP?
Common LISP has a large number of data types and structures, including records, arrays, complex numbers, and character strings. It also has a form of packages for modularizing collections of functions and data, providing access control.

23. In what year did the COBOL design process begin?
The COBOL design process began in 1959, at a meeting on the subject of a common language for business applications sponsored by the Department of Defense at the Pentagon.

27. Why was BASIC an important language in the early 1980s?
In the early 1980s, BASIC was an important language because it was easy for beginners to learn, especially those who were not science oriented, and because its smaller dialects could be implemented on computers with very small memories.

Problem Set:

1. What features of Fortran IV do you think would have had the greatest influence on Java if the Java designers had been familiar with Fortran?
I think they would be the capability of passing subprograms as parameters to other subprograms, explicit type declarations for variables, and a logical If construct.

3. Write a short history of the Fortran 0, Fortran I, Fortran II, and Fortran IV systems.
Fortran 0 was the first version of Fortran.
It stated that it would provide the efficiency of hand-coded programs and the ease of programming of the interpretive pseudocode systems. Fortran was developed in an environment in which computers had small memories and were slow and relatively unreliable, the primary use of computers was for scientific computations, and there were no existing efficient and effective ways to program computers; because of the high cost of computers compared with the cost of programmers, speed of the generated object code was the primary goal of the first Fortran compilers. Fortran I included input/output formatting, variable names of up to six characters, user-defined subroutines, the If selection statement, and the Do loop statement. There were no data typing statements: variables whose names began with I, J, K, L, M, or N were implicitly integer type, and all others were implicitly floating-point. Fortran II fixed many of the bugs in the Fortran I compilation system and added some significant features to the language, the most important being the independent compilation of subroutines. Without independent compilation, any change in a program required that the entire program be recompiled. The capability of including precompiled machine language versions of subprograms shortened the compilation process considerably and made it practical to develop much larger programs. Fortran IV became one of the most widely used programming languages of its time. It was an improvement over Fortran II in many ways: it added explicit type declarations for variables, a logical If construct, and the capability of passing subprograms as parameters to other subprograms.

9. Why, in your opinion, did Fortran allow names that began with I, J, K, L, M, and N as implicitly integer type?
In my opinion, Fortran allowed it because of programmers' habit of using those characters as names for integer variables, and because it spared programmers from declaring a variable as integer whenever they needed one.

10.
Outline the major developments in ALGOL 60.
The major developments of ALGOL 60 were:
- The concept of block structure was introduced. This allowed the programmer to localize parts of programs by introducing new data environments, or scopes.
- Two different means of passing parameters to subprograms were allowed: pass by value and pass by name.
- Procedures were allowed to be recursive. The ALGOL 58 description was unclear on this issue. Note that although this recursion was new for the imperative languages, LISP had already provided recursive functions in 1959.
- Stack-dynamic arrays were allowed. A stack-dynamic array is one for which the subscript range or ranges are specified by variables, so that the size of the array is set at the time storage is allocated to the array, which happens when the declaration is reached during execution.

15. Are there any nonprocedural languages other than Prolog?
Yes, there are; SQL, for example, is a nonprocedural language.

16. What is your opinion of the argument that languages that are too complex are too dangerous to use, and that we should therefore keep all languages small and simple?
My opinion is that such languages are not literally dangerous. Danger arises only when an amateur tries to create a complex program with little understanding of the concepts of a complex language. If it is feasible to create a language that is small and simple, why not, as long as it can provide everything we need to create a program? After all, simplicity is one of the characteristics a programming language should have.

24. Why, in your opinion, do new scripting languages appear more frequently than new compiled languages?
In my opinion, it is because a scripting language is easier for a beginner to learn and use than a compiled language, and because scripting languages offer convenient built-in interfaces and give the fastest turnaround from script to execution.

25. Give a brief general description of the Java servlet.
A Java servlet is the Java platform technology of choice for extending and enhancing Web servers. Servlets provide a component-based, platform-independent method for building Web-based applications, without the performance limitations of CGI programs. Unlike proprietary server extension mechanisms (such as the Netscape Server API or Apache modules), servlets are also server- and platform-independent.
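Question 9 of the problem set above turns on Fortran's implicit typing rule. As a rough illustration, that rule can be sketched in a few lines of Python (used here only for convenience; the function name is invented for the example):

```python
def implicit_fortran_type(name):
    """Sketch of the implicit type Fortran I gave an undeclared variable.

    Names beginning with I, J, K, L, M, or N were implicitly integer;
    all others were implicitly floating-point (REAL).
    """
    return "INTEGER" if name[0].upper() in "IJKLMN" else "REAL"

print(implicit_fortran_type("INDEX"))  # INTEGER
print(implicit_fortran_type("X1"))     # REAL
```

The single-character test on the first letter is the entire rule, which is why programmers fell into the habit the answer above describes.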
https://www.oxfordreference.com/display/10.1093/oi/authority.20110803095402287
Algol
Oxford Reference
Acronym for algorithmic language. The generic name for a family of high-level languages of great significance in the development of computing. In 1958 the Association for Computing Machinery (ACM) in the US and the Gesellschaft für Angewandte Mathematik und Mechanik (GAMM) in Europe set up a joint committee to define an international algorithmic language (IAL). The language that was designed became known as Algol, and was later called Algol 58 to distinguish it from later versions. Algol 58 was not intended to be a viable language, and in 1960 an augmented committee was convened to devise the second iteration, which was published as the language Algol 60. See also Jovial. Algol 60 was much more popular in Europe than in the US, probably due to the dominance of IBM and Fortran in the North American market. It introduced many new concepts, notably block structure (see block-structured languages), nested scopes, modes of parameter passing to procedures, and the definition of the language introduced the now classic BNF notation for describing syntax. The influence of Algol 60 can be seen in all succeeding languages, and it stands as a milestone in the development of programming languages. In the years following the publication of the Algol 60 Report, a working group of the International Federation for Information Processing was set up to consider the definition of a successor to Algol 60. There were many dissensions within the group, and eventually a minority report was issued proposing the language Algol 68. Although Algol 68 introduced many novel concepts of great theoretical interest and significance, its specification was very difficult to understand and its practical application was almost nil. One of the most significant effects of the split in the Algol 68 working group is that it led indirectly to the development of Pascal.
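One of the parameter-passing modes Algol 60 introduced, call by name, re-evaluates the argument expression at every use. A minimal sketch of the classic "Jensen's device" idiom in Python (an illustrative approximation: the names are invented, and thunks stand in for Algol's name parameters):

```python
class Ref:
    """Mutable cell standing in for an ALGOL variable."""
    def __init__(self, value=0):
        self.value = value

def by_name_sum(i, lo, hi, term):
    """Sum 'term' for i = lo..hi, where 'term' mimics a call-by-name
    parameter: a thunk re-evaluated on every use, so it always sees
    the current value of i (Jensen's device)."""
    total = 0
    for v in range(lo, hi + 1):
        i.value = v
        total += term()
    return total

i = Ref()
# Passing the *expression* i*i, not its value: 1 + 4 + 9 + 16 = 30
squares = by_name_sum(i, 1, 4, lambda: i.value * i.value)
print(squares)  # 30
```

The same procedure sums any expression over the index, which is exactly the generality that made call by name both powerful and notoriously hard to implement.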
https://www.cs.csub.edu/~melissa/cs350-f15/notes/notes02.html
Chapter 2: Evolution of Major Programming Languages

"Much of my work has come from being lazy." -- John Backus, co-designer of the IBM 704 architecture and developer of Fortran

1945 -

Zuse's Plankalkul (1945)
- Never implemented
- Advanced data structures: floating point, arrays, records
- Invariants

Short Code (1949)
- A "pseudocode" developed by Mauchly for BINAC computers
- Motivations: (1) the poor readability and writability of machine code; (2) machine deficiencies: no index registers (required absolute addressing in code), no floating-point hardware

1950 -

Speedcoding (1954)
- A "pseudocode" developed by John Backus for the IBM 701
- Index registers became standard (1955-1964)
- Pseudo-operations for arithmetic and math functions
- Pseudo-operations for conditional and unconditional branching
- Auto-increment registers for array access
- Slower than machine code

FORTRAN 0 (1954) -- FORmula TRANslation
- Language specification only; no subroutines

"Fortran should virtually eliminate coding and debugging errors." (p. 2)
"Unfortunately we were hopelessly optimistic in 1954 about the problems of debugging Fortran programs." (p. 168)
"As far as we were aware, we simply made up the language as we went along. We did not regard language design as a difficult problem, merely a simple prelude to the real problem: designing a compiler which could produce efficient programs." (p. 168)
"...while it was perhaps natural and inevitable that languages like Fortran and its successors should have developed out of the concept of the von Neumann computer as they did, the fact that such languages have dominated our thinking for over twenty years is unfortunate. It is unfortunate because their long-standing familiarity will make it hard for us to understand and adopt new programming styles which one day will offer far greater intellectual and computational power." (p. 178)
-- John Backus, "The History of Fortran I, II, and III." ACM SIGPLAN Notices, 13(8):165-180, August 1978.

FORTRAN I (1956-1957)
- First implemented version of the Fortran compiler (released 1957); 18 man-years
- Designed for the IBM 704, with index registers and floating-point hardware
- Environment of development: primitive kernel or no kernel; no programming methodology or tools; computers were small and unreliable; scientific applications and efficiency the primary concern
- Impact of environment on design: no need for dynamic storage (no heap); need for good array handling and counting loops; no string handling, decimal arithmetic, or powerful I/O (commercial features)
- Overview: names could have up to six characters; post-test counting loop (DO); formatted I/O; user-defined subprograms; three-way selection statement (arithmetic IF: <, >, =); no data typing statements
- Programs larger than 400 lines rarely compiled correctly, mainly due to the poor reliability of the 704
- Code was very fast; the language quickly became widely used

FORTRAN II (1958)
- Independent compilation of subroutines (reduced development time)

LISP (1958) -- LISt Processing language
- Designed at MIT by McCarthy (1958); the first functional language
- AI applications process data in lists rather than arrays; symbolic computation rather than numeric
- Code and data have the same form
- Originally only two data types: atoms and lists
- Syntax based on lambda calculus
- No need for variables or assignment (as in imperative languages); control via recursion and conditional expressions
- Still the dominant language for AI; ML, Miranda, and Haskell are related languages

ALGOL 58 (1958)
- The first step toward a platform-independent language, with a grammar formalized in BNF (Backus-Naur Form) notation
- Close to mathematical notation; good for describing ALGOrithms
- Language development up to then had been for specific machines (Fortran had barely arrived for the IBM 70x); no portable or universal language -- all were machine-dependent; no easy way to communicate algorithms
- ACM and GAMM met for four days on the design (May 27 to June 1, 1958)
- Must be translatable to machine code
- Formalized the concept of "type"; names could be any length; arrays could have any number of subscripts; parameters were separated by mode (in and out); subscripts were placed in brackets; compound statements (begin ... end); semicolon as a statement separator; assignment operator was :=; if had an else-if clause
- No I/O -- that would make the language machine-dependent
- IBM was initially enthusiastic, but all support was dropped by mid-1959, so ALGOL as a language eventually died

1960 -

COBOL (COmmon Business-Oriented Language); design process began late 1959
- Names up to 12 characters, with embedded hyphens
- English names for arithmetic operators (no arithmetic expressions)
- Verbs as the first word in every statement
- Design committee members from computer manufacturers and the DoD
- Design goals: to work for business applications (based on FLOW-MATIC); to look like simple English -- easy to use at the expense of being powerful; to broaden the base of computer users

FORTRAN IV (1960-62)
- Explicit type declarations
- Logical selection statement
- Subprogram names could be parameters

ALGOL 60 (1960)
- An effort to design a universal language: portability
- New features: block structure (local scope); two parameter-passing methods; subprogram recursion; stack-dynamic arrays
- No I/O or string handling

APL: A Programming Language (1960)
- Early dynamic language
- Designed as a hardware description language at IBM by Ken Iverson
- Highly expressive (many operators for scalars and arrays of n dimensions)
- Programs are very difficult to read
- Still in use, with minimal changes

SNOBOL (1962)
- Early dynamic language
- Designed as a string manipulation language at Bell Labs
- Powerful operators for string pattern matching
- Slower than alternative languages (no longer used for writing editors)
- Still used for certain text-processing tasks

BASIC (1964)
- The beginning of timesharing: free and private access
- Designed at Dartmouth for non-science students; the first widely used timesharing language
- User time more important than computer time: fast turnaround for homework
- Easy to learn and use for non-science students: pleasant and friendly
- Current popular dialect: Visual BASIC

PL/I (1964)
- Initially called NPL (New Programming Language); changed to PL/I in 1965
- Designed by IBM and SHARE: everything for everybody
- Computing situation in 1964 (IBM's point of view): (1) scientific computing -- IBM 1620 and 7090 computers, Fortran, SHARE user group; (2) business computing -- IBM 1401 and 7080 computers, COBOL
- By 1963, scientific users began to need more elaborate I/O like COBOL's, and business users began to need floating point and arrays
- Two kinds of computers, languages, and support staff was too costly; the solution: build a new computer to do both (the IBM 360) and design a new language for it
- PL/I designed in 5 months by 3 members from IBM and 3 members from SHARE
- Initial concept: an extension of Fortran IV

FORTRAN 66
- First ANSI standard for a high-level language

SIMULA 67 (1967)
- Historical importance: designed for system simulation; introduced the concept of data abstraction
- Designed primarily for system simulation in Norway by Nygaard and Dahl
- Based on ALGOL 60 and SIMULA I
- Primary contributions: coroutines (a kind of subprogram), implemented in a structure called a class; classes are the basis for data abstraction; classes are structures that include both local data and functionality

ALGOL 68 (1968)
- Orthogonal design; derived from ALGOL 60 but not a superset of that language
- Source of several new ideas, but the language never achieved widespread use
- Design based on the concept of orthogonality: a few principal concepts, a few combining mechanisms
- Contributions: user-defined data structures; reference types; dynamic arrays (called flex arrays); comments

1970 -

PROLOG (1970)
- Programming based on formal logic
- Developed at the University of Aix-Marseille, France
- Nonprocedural: declarative rather than imperative
- Can be summarized as an intelligent database system that uses an inferencing process to infer the truth of given queries
- Highly inefficient; small application areas -- formal language research, AI
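The LISP entry above notes that control was expressed via recursion and conditional expressions rather than loops and assignment. A minimal sketch of that style in Python (Python stands in for LISP here; car and cdr follow the LISP names):

```python
def car(lst):
    # First element of a list, as in LISP
    return lst[0]

def cdr(lst):
    # The rest of the list after the first element, as in LISP
    return lst[1:]

def total(lst):
    # Control via recursion and a conditional expression only:
    # no assignment statement and no loop, as in early LISP.
    return 0 if not lst else car(lst) + total(cdr(lst))

print(total([1, 2, 3, 4]))  # 10
```

The base case on the empty list and the recursive call on cdr are the direct analogues of a LISP cond over nil and the recursive function application.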
https://analyticsindiamag.com/ai-origins-evolution/10-dead-programming-languages/
Obsolete Code: 10 Programming Languages That Vanished Over Time
K L Krithika
2023-07-13T06:30:00+00:00
AIM
Programming languages are constantly evolving, with a life cycle that entails popularity, growth, and decline. The reasons behind their decline vary from outdated principles to new, more efficient languages gaining popularity. Here are 10 languages that once enjoyed popularity in their prime but were lost to oblivion in the 21st century.

COBOL

In 1960, the CODASYL organisation played a significant role in the development of COBOL, a programming language influenced by the division between business and scientific computing. During that time, high-level languages in the industry were used either for engineering calculations or for data management. COBOL, considered one of the four foundational programming languages along with ALGOL, FORTRAN, and LISP, was once the most widely used language worldwide. It continues to operate many of our legacy business systems.

Cause of Death: Two factors contributed to COBOL's decline. Firstly, it had minimal connections with other programming language efforts. Very few developers built upon COBOL, leading to the scarcity of its influence in second- or third-generation languages, which benefited from lessons learned from their predecessors. Secondly, COBOL is exceptionally intricate, even by today's standards. Consequently, COBOL compilers fell behind those of contemporaneous microcomputers and minicomputers, providing opportunities for other languages to thrive and eventually surpass it.

ALGOL

In 1960, the ALGOL committee aimed to create a language for algorithm research, with ALGOL-58 preceding and quickly being replaced by ALGOL-60. Despite being less well known today than LISP, COBOL, and FORTRAN, ALGOL holds significant importance, second only to LISP, among the four original programming languages. It contributed to lexical scoping, structured programming, nested functions, formal language specifications, call-by-name semantics, BNF grammars, and block comments.
Cause of Death: ALGOL was primarily a research language, not intended for commercial use. Its specification lacked input/output capabilities, making practical application difficult. As a result, numerous ALGOL-like languages emerged in the 1960s and 1970s as people extended ALGOL with input/output capabilities and additional data structures; examples include JOVIAL, SIMULA, CLU, and CPL. Subsequent languages were based on these extensions rather than on ALGOL itself, and the descendants of ALGOL ultimately overshadowed and outpaced it in popularity and usage.

APL

APL was created by Ken Iverson in 1962. Originally developed as a hand-written notation for array mathematics, it was adopted by IBM as a programming language. APL focused on array processing, enabling concise manipulation of large blocks of numbers. It gained popularity on mainframe computers due to its ability to run with minimal memory requirements. APL revolutionised array processing by introducing the concept of operating on entire arrays at once. Its influence extends to modern data science and related fields, with its innovations inspiring the development of languages and libraries like R, NumPy, pandas, and Matlab. APL also has direct descendants such as J, Dyalog, K, and Q, which, although less successful, still find extensive use in the finance sector.

Cause of Death: APL faced challenges due to keyboard limitations. The language's non-ASCII symbols made widespread adoption difficult. Ken Iverson addressed this issue in 1990 with J, which utilised digraphs instead of distinct symbols. However, this change came relatively late and did not gain significant traction in popularising a radically different programming style. Another challenge was APL's limitation to homogeneous data, as it did not support storing both strings and numbers in the same data structure. Working with strings was also cumbersome in APL.
These limitations, including the absence of dataframes, hindered APL's suitability for modern data science applications.

BASIC

Created by John Kemeny in 1964, BASIC originated as a simplified FORTRAN-like language intended to make computer programming accessible to non-engineering individuals. BASIC could be compactly compiled into as little as 2 kilobytes of memory and became the lingua franca for early-stage programmers. It was commonly used by individuals programming at home in the 1970s. Its major technical impact lay in its runtime interpretation: it was the first language to feature a real-time interpreter, beating APL by a year.

Cause of Death: BASIC faced the perception of being a "lesser" language compared to other programming languages used by professional programmers. While it continued to be used by children and small business owners, it was not considered the language of choice for experienced programmers. As microcomputers with larger memory capacities became available, BASIC was gradually replaced by languages like Pascal and C. BASIC persisted for some time as a legacy teaching language for kids but eventually faded away from that niche as well.

PL/I

Developed by IBM in 1966, PL/I aimed to create a language suitable for both engineering and business purposes. IBM's business was previously divided between FORTRAN for scientists and COMTRAN for business users. PL/I merged the features of these two languages, resulting in a language that supported a wide range of applications. PL/I implemented structured data as a type, which was a novel concept at the time. It was the first high-level language to incorporate pointers for direct memory manipulation, constants, and function overloading. Many of these ideas influenced subsequent programming languages, including C, which borrowed from both BCPL and PL/I. Notably, PL/I's comment syntax is also used in C.

Cause of Death: PL/I faced challenges as it tried to straddle the line between FORTRAN and COBOL.
Many FORTRAN programmers considered it too similar to COBOL, while COBOL programmers saw it as too similar to FORTRAN. IBM's attempt to compete with two established languages using a more complex language deterred wider adoption. Moreover, IBM held the sole compiler for PL/I, leading to mistrust from potential users concerned about vendor lock-in. By the time IBM addressed these issues, the computing world had already transitioned to the microcomputer era, where BASIC outpaced PL/I.

SIMULA 67

Ole Dahl and Kristen Nygaard developed SIMULA 67 in 1967 as an extension of ALGOL for simulations. SIMULA 67, although not the first object-oriented programming (OOP) language, introduced proper objects and laid the groundwork for future developments. It popularised concepts such as class/object separation, subclassing, virtual methods, and protected attributes.

Cause of Death: SIMULA faced performance challenges, being too slow for large-scale use. Its speed was particularly limiting on mainframe computers, posing difficulties for broader adoption. It is worth noting that Smalltalk-80, which extended SIMULA's ideas further, had the advantage of an extra 13 years of Moore's Law advancements, and even Smalltalk was often criticised for its speed. As a result, the ideas from SIMULA were integrated into faster and simpler languages by other developers, and those languages gained wider popularity.

Pascal

Niklaus Wirth created Pascal in 1970 to capture the essence of ALGOL-60 after ALGOL-68 became too complex. Pascal gained prominence as an introductory language in computer science and became the second most popular language on Usenet job boards in the early 1980s. Pascal popularised ALGOL syntax outside academia, leading to ALGOL's assignment syntax, ":=", being called "Pascal style".

Cause of Death: The decline of Pascal is complex and does not have a clear-cut explanation like some other languages.
While some attribute its decline to Brian Kernighan's essay "Why Pascal is Not My Favorite Programming Language", this explanation oversimplifies the situation. Pascal did face competition from languages like C, but it managed to hold its own for a significant period. It is worth noting that Delphi, a variant of Pascal, still ranks well in TIOBE and PYPL measurements, indicating that it continues to exist in certain niches.

CLU

CLU was developed by Barbara Liskov in 1975, with the primary intention of exploring abstract data types. Despite being relatively unknown, CLU is one of the most influential languages in terms of ideas and concepts. CLU introduced several concepts that are widely used today, including iterators, abstract data types, generics, and checked exceptions. Although these ideas might not be directly attributed to CLU due to differences in terminology, their origin can be traced back to CLU's influence. Many subsequent language specifications referenced CLU in their development.

Cause of Death: CLU served as a demonstration language, with Liskov's primary goal being the adoption of her ideas rather than the language itself. This objective was largely achieved, as nearly all modern programming languages incorporate elements inspired by CLU.

SML

Robin Milner developed ML in 1976 while working on the LCF Prover, one of the first proof assistants. Initially designed as a metalanguage for writing proofs in a sound mathematical format, ML eventually evolved into a standalone programming language. It is considered one of the oldest "algebraic programming languages". ML's most notable innovation was type inference, allowing the compiler to deduce types automatically, freeing programmers from explicitly specifying them. This advancement paved the way for the adoption of typed functional programming in real-world applications.

Cause of Death: ML initially served as a specialised language for theorem provers, limiting its broader usage.
While Standard ML emerged in the same year as Haskell, the wider programming community paid more attention to Haskell, which exemplified a more "pure" typed functional programming language. ML's impact and adoption remained substantial within academic and research settings but never reached the popularity of some other languages.

Smalltalk

Smalltalk, developed by Alan Kay and his team at Xerox PARC, had multiple versions released over time. Each version built upon the previous one, with Smalltalk-80 being the most widely adopted and influential. It is often regarded as the language that popularised the concept of object-oriented programming (OOP). While not the first language with objects, Smalltalk was the first language in which everything, including booleans, was treated as an object. Its influence can be seen in the design of subsequent OOP languages, such as Java and Python.
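Smalltalk's defining principle, that every value is an object, survives directly in later languages. A minimal sketch in Python (chosen purely for illustration) showing that even booleans and integers are full-fledged objects with classes and methods:

```python
# In Smalltalk, every value -- including booleans and integers --
# is an object that responds to messages. Python inherits this
# uniformity: literals have classes, methods, and attributes.

print(type(True).__name__)        # bool is an ordinary class
print(isinstance(True, object))   # booleans are objects
print((5).bit_length())           # integers respond to method calls
print("abc".upper())              # so do strings
```

The same experiment fails in a language like C, where `int` and `char` are bare machine values rather than objects.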
Children of the Miracle: from Algol to Prolog
A Programmers Place, 2017-03-18
https://vanemden.wordpress.com/2017/03/18/children-of-the-miracle-from-algol-to-prolog/
The appearance of Fortran inaugurated a fruitful period in programming languages that was to last until the early 1970s. When, in 1999, E.W. Dijkstra gave the keynote address at the ACM Symposium on Applied Computing in San Antonio, Texas, he gave an overview of what he saw as the large-scale trends in the preceding half century. I quote:

"And then the 60s started with an absolute miracle, viz. ALGOL 60. This was a miracle because on the one hand this programming language had been designed by a committee, while on the other hand its qualities were so outstanding that in retrospect it has been characterized as 'a major improvement on most of its successors' (C.A.R. Hoare). ... Several friends of mine, when asked to suggest a date of birth for Computing Science, came up with January 1960, precisely because it was ALGOL 60 that showed the first ways in which automatic computing could and should and did become a topic of academic concern." [1]

Algol was a miracle as a language. It was short-lived, but it left a momentous legacy that acted in two ways: in the way the Revised Report on Algol 60 describes the language and in the way subsequent language designers were influenced by being shown what a programming language could be. In celebration of Algol 60 I refer to these designers as "Children of the Miracle".

The first Children of the Miracle were the members of the Simula team. Although that language quickly followed Algol 60 into oblivion, its distinguishing feature, classes, survived as object-oriented programming in the hands of Bjarne Stroustrup [2] as his C++ language. Although that language is no longer among the most widely used object-oriented languages, it is very much alive. The Simula team was exposed to Algol 60 pretty much in one place and at the same time. It is remarkable that three of the main contributors to Prolog implementation also had Algol as their formative experience, but independently, scattered in space and time.
In the remainder of this article I give an account of how they were influenced by Algol. In the case of Alain Colmerauer I rely on [3, 4, 5, 6].

In 1963 Colmerauer, as a new graduate student, joined a group at the University of Grenoble whose task was to build an Algol-60 compiler for the IBM 7044. Among the various available parsing techniques, the group was attracted to the method of Edgar Irons. Compared to the other recursive approach, that of Brooker and Morris, the one of Irons was more generally applicable, but required non-deterministic choice among production rules. This was Colmerauer's first brush with non-determinism.

After Colmerauer completed his PhD work, in parsing, he remained in Grenoble to work on two other topics. One of these was to implement an extension to Algol 60 according to a proposal by Robert Floyd for a general mechanism that is not restricted to such non-determinism as may arise in parsing and that can be added to any imperative language. The other topic was the two-level grammar of Adriaan van Wijngaarden. Colmerauer wrote both a parser and a generator for such grammars. Van Wijngaarden thought that going beyond context-free grammars was important for a better definition of programming languages. Accordingly, these grammars were used in the definition of Algol 68.

If context-free grammars seem inadequate for the definition of programming languages, then they seem more obviously inadequate for natural-language processing. At the University of Montreal, to which he had moved in 1967, Colmerauer implemented "Q systems", a generalization of context-free grammars that has similarities to two-level grammars as well as to the type-0 grammars of the Chomsky hierarchy. Q systems can be used both in parsing and in generating mode. This property makes them attractive for natural-language translation: parse the source-language text to capture a semantic structure, generate target-language text on the basis of that structure.
The dual-mode property of Q systems makes them also an attractive choice for question-answering systems in natural language: parse the assertions to capture their information content, update the semantic structure, parse the question to retrieve information from the semantic structure, and generate answers on the basis of that information. Such a use suggests a specific kind of semantic structure, namely a sufficiently expressive logic. That is, view a question-answering system as a front-end to automatic theorem-proving. When Colmerauer left Montreal in 1970 to take up a faculty position in the University of Aix-Marseille, he assembled a small group to develop a question-answering system in French. For the theorem-prover they selected J.A. Robinson’s resolution logic, possibly inspired by Cordell Green’s question-answering system QA3 [7] where Lisp is the language for the assertions and queries. The most promising work in resolution theorem-proving was seen to be happening in the University of Edinburgh. Colmerauer invited Robert Kowalski for a brief visit in 1971, followed by a longer visit in 1972. The expectations of the Marseille group were met by what they learned about SL resolution, a new technique recently developed in Edinburgh. Beyond these expectations there was a surprise: Boyer, Kowalski, and Moore had noticed that positive Horn clauses play a pivotal role in resolution logic: in their presence SL resolution refrained from misbehaviour and these clauses can be read as context-free production rules. This suggested that the rules be expressed in positive Horn clauses, with SL resolution acting in parsing or in generating mode, as required by the question-answering system. When the Marseille group learned these results, the relevance to their project was apparent: the Q systems, which were modeled on type-0 grammars, could be replaced by an SL-resolution theorem-prover based on productions in the form of positive Horn clauses. 
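The pivotal observation above, that positive Horn clauses can be executed by backward chaining, can be illustrated with a toy interpreter. This is a sketch in Python with invented propositions; it is not the Marseille implementation, and it handles only the propositional case, far short of SL resolution with unification:

```python
# Toy backward chaining over positive Horn clauses (head :- body),
# propositional case for brevity. A goal is proved if some clause's
# head matches it and every goal in that clause's body can be proved
# in turn -- the "procedural reading" of Horn clauses that underlies
# Prolog. Predicate names below are invented for the example.

RULES = [
    ("grandparent", ["parent1", "parent2"]),  # rule: head with a body
    ("parent1", []),                          # facts have empty bodies
    ("parent2", []),
]

def prove(goal, rules=RULES):
    """Return True if `goal` follows from the rules by backward chaining."""
    for head, body in rules:
        if head == goal and all(prove(sub, rules) for sub in body):
            return True
    return False

print(prove("grandparent"))  # provable: both body goals are facts
print(prove("unknown"))      # not provable: no clause has this head
```

Real Prolog adds variables and unification, so that a single clause stands for all its instances; the control strategy above (try clauses in order, prove body goals left to right) is nevertheless the one Prolog inherited.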
In spite of a resemblance to type-2 grammars, these logic grammars are more expressive because of the availability of parameters, reminiscent of the rules of the two-level grammars of Algol 68. Thus the Kowalski visits resulted in a drastic re-design of the Marseille question-answering system. Instead of Q systems for parsing assertions and queries and for generating answers, with logic restricted to the semantic structures, it became logic for everything: an SL theorem-prover specialized for positive Horn clauses could parse, generate, and make inferences. This theorem-prover was only a step away from a general-purpose programming language. Colmerauer got a grant for a year with the goal to produce a "man-machine communication system in French". The strategy: first create the programming language, then write the required system. The group adopted "Prolog" as the name of the programming language. The most incisive account of the logic kernel of Prolog is Kowalski's [8].

In 1975 the action moved to Budapest, Hungary, where Péter Szeredi completed his first Prolog implementation. As a student in mathematics, Szeredi had been programming since 1966. He started in Autocode on the Elliott 803 and used it to write assemblers for this machine. He next became involved with Algol 68, translating parts of the report into Hungarian. Szeredi is credited with discovering an error in the type system, which was later corrected by introducing "incestuous unions". As a result of his involvement with Algol 68, Szeredi became acquainted with the Compiler Definition Language (CDL), developed by Cornelis Koster, one of the authors of the Algol 68 report. CDL is closely related, via affix grammars, to the W-grammars in which Algol 68 is defined. As others did, Szeredi found CDL a congenial medium for software development, and he used it for systems programs for a new Hungarian computer.
In 1975 the Fortran source code of the second Marseille Prolog implementation reached Hungary, together with a few transparencies by David H.D. Warren explaining the main ideas of this interpreter. By the time another group in Budapest had overcome its problems in porting the Fortran code to the locally available machine, Szeredi had completed a new Prolog implementation written in CDL. He credits the similarity of CDL with Prolog for this quick success. See [9] for more information about Szeredi's work in connection with Prolog. As Algol plays such an important part in the causal chain connecting Szeredi's early programming experience with his Prolog implementations, I count him among the Children of the Miracle.

Starting in 1976 Keith Clark took the lead in implementing a sequence of Prolog-like languages. These languages exploited the non-sequential semantics of Horn clauses, adding control structures such as coroutines and guards, or exploiting parallelism. Let us look at Clark's pre-Prolog computing experience. After completing his undergraduate work in logic and philosophy, Clark continued his academic study by taking the Computer Science Conversion diploma course at Imperial College, London. There were no examinations; one graduated on the basis of a dissertation. At the start of the work he was handed a listing of an implementation of Euler on the IBM 7094, written in Algol 60. "Do something with this."

Clark extended the language with list-processing capabilities and liberalised the syntax to allow var declarations to appear anywhere in procedure bodies. In addition to introducing new primitives, he had to extend the BNF grammar of Euler, which associated abstract-machine code generation with most of the grammar rules, add extra abstract-machine instructions, and change the definition of its abstract interpreter for code sequences in reverse Polish. He ended up producing a usable implementation of the extended Euler.
In this way Clark was exposed to a double dose of Algol 60: by understanding the language so as to be able to modify the implementation and by immersing himself in Euler, a language inspired by Algol 60. The miracle that was Algol 60 exerted its influence through the language itself as well as via derivatives such as Simula, Euler, and Algol 68. For the Prolog pioneers these languages were the formative experience.

Acknowledgements

Thanks to Keith Clark, Paul McJones, and Péter Szeredi for help with this article.

References

[1] Edsger W. Dijkstra: "Computing Science: Achievements and Challenges" (EWD1284). http://tinyurl.com/znbzyd7
[2] Stroustrup, Bjarne. The Design and Evolution of C++. Addison-Wesley, 1994.
[3] Cohen, Jacques. "A view of the origins and development of Prolog." Communications of the ACM 31.1 (1988): 26-36.
[4] Cohen, Jacques. "A tribute to Alain Colmerauer." Theory and Practice of Logic Programming 1.6 (2001): 637-646.
[5] Colmerauer, Alain, and Philippe Roussel. "The birth of Prolog." History of Programming Languages II. ACM, 1996.
[6] Kowalski, Robert A. "The early years of logic programming." Communications of the ACM 31.1 (1988): 38-43.
[7] Green, Cordell. "Theorem proving by resolution as a basis for question-answering systems." Machine Intelligence 4 (1969): 183-205.
[8] Kowalski, R. "Predicate logic as a programming language." Proc. of IFIP Congress '74, pp. 569-574, North-Holland.
[9] P. Szeredi. "The Early Days of Prolog in Hungary." ALP Newsletter, Vol. 17, No. 4, November 2004.

PS Alain Colmerauer, January 24, 1941 – May 12, 2017. We mourn a great scientist and a dear friend.
The Burroughs 205 and 220 Blog: Algol
Paul Kimpel
https://datatron.blogspot.com/2018/07/algol-58-international-algebraic.html
By the early 1950s, the number and variety of electronic computers were increasing, the scope of their application to science, business, a...
History of Programming Languages
Praxent, 2022-10-04
https://praxent.com/blog/history-of-programming-languages
As our society becomes increasingly technology-obsessed, programming languages are used by various people and businesses to accomplish particular tasks or functions. When computer users interact with services like social media or email, they rely on commands written in a programming language to complete these tasks.

The First Programming Language

In 1843, before the invention of computers, Ada Lovelace, one of the earliest female mathematicians, wrote what is widely regarded as the first computer program. Programming languages have made significant advancements since then, but this was undoubtedly a pivotal milestone in their development.

Historical Milestones in the Evolution of Programming Languages

Programming languages have existed since before the invention of the human-readable codes we now use to complete routine technological tasks. The following are some significant events in the development of programming languages that might interest you.

Ada Lovelace's invention of machine algorithms in 1843
The first computer algorithm was invented in 1843 by Ada Lovelace. Quite interestingly, Charles Babbage designed the machine for which this algorithm was intended.

Konrad Zuse created the language Plankalkül in the 1940s
Konrad Zuse created the programming language Plankalkül in the 1940s. It was the first high-level computer programming language ever developed and contained many of the constructs engineers commonly use to execute basic activities.

1949 – Assembly language, the precursor to today's coding, was created
The first computer to use assembly language was the Electronic Delay Storage Automatic Calculator (EDSAC) in 1949. Assembly language is a low-level programming language that simplifies the machine code needed to write instructions for processors, which makes modern computers far easier to program than those of 1949. John Mauchly proposed Short Code in 1949, and it was implemented in 1949 and 1950 by William Schmitt to improve the operations of the BINAC and UNIVAC computers.

In 1952, Autocode was the first compiled programming language used
In 1952, Alick Glennie coined the word Autocode to refer to a family of programming languages. Glennie was an expert in computer science who used Autocode mainly for the Mark 1 computer. It was the first compiled programming language, translated into machine code by a special application known as the compiler.

In 1957, John Backus created FORmula TRANslation (FORTRAN)
One of the first programming languages ever created, FORTRAN, was developed in 1957 and is still used today. Developed to solve scientific and mathematical problems, the FORTRAN programming language has stood the test of time.

In 1958, ALGOL and LISP were both invented
In 1958, two coding languages were created: Algorithmic Language (ALGOL) and List Processor (LISP). American and European computer scientists came together to create ALGOL, which is believed to be the origin of popular programming languages such as C, C++, Java, and Pascal. John McCarthy created LISP for use in his artificial intelligence applications, and it is one of the oldest computer programming languages still used today. Many individuals and companies continue to favor LISP over other programming languages, such as Ruby or Python.

In 1959, COBOL, a high-level business-oriented programming language, was developed
COBOL was created in 1959 by the CODASYL committee, drawing heavily on Dr. Grace Murray Hopper's earlier FLOW-MATIC. An early third-generation programming language, it was originally designed for business applications but is now used for many different tasks. In 1989, COBOL was estimated to be in use by over 100 million people, and it is still widely employed in today's banking systems.

In 1964, BASIC was created as a computer programming language for beginners
In 1964, John Kemeny and Thomas Kurtz at Dartmouth College created the Beginner's All-Purpose Symbolic Instruction Code (BASIC).
It was developed for those with limited computer proficiency and is still used today. Paul Allen and Bill Gates furthered its development, making their BASIC implementation Microsoft's first commercially successful product.

1970 saw the implementation of PASCAL by Niklaus Wirth
Niklaus Wirth developed PASCAL, which he named after the famed mathematician Blaise Pascal. Initially designed to help users learn programming, it was endorsed by Apple, among other leading firms, for its streamlined and straightforward style.

Smalltalk, SQL, and C were all born in 1972
The Smalltalk language was designed at Xerox PARC to let programmers modify running code on the fly, and it went on to influence a variety of languages, including Java, Python, and Ruby. Dennis Ritchie developed C and named it so because the language descended from B. Some well-known technology companies that continue to employ C include Apple, Google, and Facebook. Donald Chamberlin and Raymond Boyce created SQL, a structured query language that helps users modify and view important computer data. Today, many firms, such as Microsoft and Accenture, continue to use SQL.

Jean Ichbiah initiated the development of Ada in the late 1970s
Ada was created by Jean Ichbiah and was named after Ada Lovelace, widely regarded as the first computer programmer. Ada is used to regulate air traffic in many different nations, including Belgium, Australia, and Germany.

C++ and Objective-C were created in 1983
Bjarne Stroustrup, a C programmer, extended the language and released it as C++ in 1983. His changes introduced new features such as classes, templates, and virtual functions. In 2003, C++ also won recognition as one of the top programming languages. Microsoft Office, gaming platforms like the PlayStation and Nintendo 64, Adobe Photoshop, and plenty of other programs use C++.
Objective-C was created by Tom Love and Brad Cox in 1983 and for decades was the primary programming language for Apple platforms, including iOS and macOS.

In 1987, Larry Wall developed the programming language Perl
Larry Wall first created Perl in 1987. Originally a text-processing language, Perl is now used for a variety of functions across many different industries and sectors, including database applications, visual programs, network programs, and more.

Haskell first established itself in 1990
Haskell was created in 1990 and is named after the eminent mathematician Haskell Brooks Curry. This programming language is most often used for mathematical operations but has also been used to create video games.

Visual Basic and Python both came into existence in 1991
Microsoft created Visual Basic in 1991 to let users build applications by dragging and dropping components and editing the associated code. Individuals and businesses can choose from a large number of drop-down menus and change different pieces of code with Visual Basic. Python is one of the most widely used programming languages in the world today. Guido van Rossum created it in 1991 with support for multiple programming styles. Yahoo, Spotify, and Google are just some of the successful companies that continue to use it extensively.

In 1993, Yukihiro Matsumoto created Ruby
Ruby was first created in 1993 by Yukihiro Matsumoto. Intended for programming web apps, Ruby draws on numerous other languages, such as Perl, Smalltalk, and Lisp. Brands like Hulu, Groupon, and Twitter use Ruby frequently.

Java, JavaScript, and PHP were first introduced in 1995
One of the best-known programming languages out there is Java. James Gosling began developing it in 1991 for an interactive-TV software project, and these days it is still immensely popular on mobile devices and PCs. JavaScript is another widely used programming language, created by Brendan Eich in 1995.
It is used primarily as a scripting language on websites such as Gmail, but it can also be embedded in applications like Adobe Photoshop and in PDF files (among other things). PHP was originally introduced in 1995; its name once stood for "Personal Home Page". It helps people and companies create and maintain their websites, including Facebook, Wikipedia, and many others.

The C# compiler was created in the year 2000
Microsoft created C# in 2000 with the goal of merging some of the benefits of C++ with the simplicity of Visual Basic. Notably, it can be easier to use than Java while still retaining a lot of its power, and many of Microsoft's own tools and products are now written in C#.

Scala and Groovy were both established in 2003
In 2003, Martin Odersky created Scala, a programming language that is compatible with Java. This is beneficial for programmers who work with Android's ecosystem, and developers from LinkedIn, Netflix, Foursquare, and Twitter are among those who still use it today. Groovy is a computer language derived from Java and designed to promote productivity and efficiency. Starbucks and Craft Business base are two well-known companies that use Groovy to execute daily tasks.

Google rolled out Go in 2009
Google introduced its Go programming language in 2009 to solve and manage the problems that arise when dealing with large-scale software systems. Its modern and user-friendly structure has become popular among various organizations worldwide, including Uber, Google, Twitch, and many others.

Apple developed Swift in 2014
Developing new programming languages is not an easy task. In 2014, Apple introduced Swift as a replacement for Objective-C, C++, and C, aiming for a more streamlined language than its predecessors. More specifically, Swift allows users to create programs that work on both mobile devices and computers for improved functionality across the board.

Are There Any Old Programming Languages That Are Still in Use?
Multiple computer programming languages have been retired, yet many still exist today. One of these is FORTRAN, which was first released in 1957 and most recently updated in 2018; developed by IBM, it has been repeatedly upgraded to remain compatible with modern computers. COBOL, another popular computer programming language, was created in 1959 and remains one of the languages banks rely on most.

What Programming Language Is Most Popular in the Contemporary Technological World?

A lot of new programming languages have been developed over the years. The need for technology has risen faster than anticipated, and this has created a demand for more programming languages. The following are some of the more popular computer programming languages that businesses use today: SQL, JavaScript, Java, .NET/C#, and HTML/CSS.

Conclusion

As you can see, the history of programming languages is quite interesting and continues to progress. It is incredible to consider how a written method from the 1840s led to the programming languages that are still used today. They have gone from academics employing machine codes to large corporations using high-level, readable code on supercomputers. Computer programming is expected to continue evolving and transforming into new languages, some of which will be regarded as among the best in language history.
Wikipedia
https://en.wikipedia.org/wiki/ALGOL_58
ALGOL 58
Paradigm: procedural, imperative, structured
Family: ALGOL
Designed by: Friedrich L. Bauer, Hermann Bottenbruch, Heinz Rutishauser, Klaus Samelson, John Backus, Charles Katz, Alan Perlis, Joseph Henry Wegstein
First appeared: 1958
Typing discipline: static, strong
Scope: lexical
Influenced by: FORTRAN, IT, Plankalkül,[1] Superplan, Sequentielle Formelübersetzung
Influenced: most subsequent imperative languages (Algol-like)

ALGOL 58, originally named IAL, is one of the family of ALGOL computer programming languages. It was an early compromise design soon superseded by ALGOL 60. According to John Backus:

"The Zurich ACM-GAMM Conference had two principal motives in proposing the IAL: (a) To provide a means of communicating numerical methods and other procedures between people, and (b) To provide a means of realizing a stated process on a variety of machines..."[2]

ALGOL 58 introduced the fundamental notion of the compound statement, but it was restricted to control flow only, and it was not tied to identifier scope in the way that Algol 60's blocks were.

Name

Bauer attributes the name to Hermann Bottenbruch, who coined the term algorithmic language (algorithmische Sprache) in 1957, "at least in Germany".[3][4]

History

There were proposals for a universal language by the Association for Computing Machinery (ACM) and also by the German Gesellschaft für Angewandte Mathematik und Mechanik ("Society of Applied Mathematics and Mechanics", GAMM). It was decided to organize a joint meeting to combine them. The meeting took place from May 27 to June 2, 1958, at ETH Zurich and was attended by the following people:

- Friedrich L. Bauer, Hermann Bottenbruch, Heinz Rutishauser, and Klaus Samelson (from the GAMM)
- John Backus, Charles Katz, Alan Perlis, and Joseph Henry Wegstein (from the ACM)
The language was originally proposed to be called IAL (International Algebraic Language) but according to Perlis,[5] this was rejected as an "'unspeakable' and pompous acronym". ALGOL was suggested instead, though not officially adopted until a year later. The publication following the meeting still used the name IAL.[6]

By the end of 1958 the ZMMD group had built a working ALGOL 58 compiler for the Z22 computer. ZMMD was an abbreviation for Zürich (where Rutishauser worked), München (workplace of Bauer and Samelson), Mainz (location of the Z22 computer), and Darmstadt (workplace of Bottenbruch).

ALGOL 58 saw some implementation effort at IBM, but the effort was in competition with FORTRAN, and soon abandoned. It was also implemented at Dartmouth College on an LGP-30, but that implementation soon evolved into ALGOL 60. An implementation for the Burroughs 220 called BALGOL evolved along its own lines as well, but retained much of ALGOL 58's original character.[7]

ALGOL 58's primary contribution was to later languages; it was used as a basis for JOVIAL,[4] MAD, NELIAC[4] and ALGO. It was also used during 1959 to publish algorithms in CACM, beginning a trend of using ALGOL notation in publication that continued for many years.

Time line of implementations of ALGOL 58 variants

- ZMMD-implementation (1958), by Friedrich L. Bauer, Heinz Rutishauser, Klaus Samelson, and Hermann Bottenbruch, Germany; targeted the Z22.
- NELIAC (1958), by the Naval Electronics Laboratory, USA; targeted the AN/USQ-17.
- JOVIAL (1960), by Jules Schwartz, USA; was the DOD HOL prior to Ada; targeted various CPUs.
- BALGOL (1960), by Joel Merner et al., USA; targeted the Burroughs Corporation B220.
- MAD (1960), by the University of Michigan, USA; targeted the IBM 7090/7094 mainframe, then in the mid-1960s ported to the Univac 1108.
- Dartmouth ALGOL 30 (1962), by Thomas Eugene Kurtz et al., USA; evolved into ALGOL 60; targeted the LGP-30.
- SUBALGOL (1962), by Bob Braden, Lawrence M. Breed, and Roger Moore, Stanford University, USA; a BALGOL extension; targeted the IBM 7090.
- ALGO, by the Bendix Corporation, USA; targeted the Bendix G-15.

ALGOL 58's influence on ALGOL 60

IAL introduced the three-level concept of reference, publication and hardware language, and the concept of "word delimiters" having a separate representation from freely chosen identifiers (hence, no reserved words). ALGOL 60 kept this three-level concept.[8]

The distinction between assignment (:= representing a left-facing arrow) and the equality relation = was introduced in IAL and kept in ALGOL 60. Both IAL and ALGOL 60 allow arrays with arbitrary lower and upper subscript bounds, and allow subscript bounds to be defined by integer expressions. Both IAL and ALGOL 60 allow nesting of procedure declarations and the corresponding identifier scopes. The IAL report described parameter substitution in much the same terms as the ALGOL 60 report, leaving open the possibility of call by name; it is unclear if this was realized at the time. IAL allows numeric statement labels, which ALGOL 60 kept. The possibility of including non-ALGOL code within a program was already hinted at, in the context of parameters to procedures. Both IAL and ALGOL 60 have a switch designator, unrelated, however, to the switch statement in C and other languages. In-line functions of the form f(x) := x / 2; were proposed in IAL but dropped in ALGOL 60. IAL procedure declarations provide separate declaration lists for input and output parameters, and a procedure can return multiple values; this mechanism was replaced in ALGOL 60 with the value declaration. Variable declarations in IAL can be placed anywhere in the program and not necessarily at the beginning of a procedure. In contrast, the declarations within an ALGOL 60 block must occur before all execution statements.
The for-statement has the form for i:=base(increment)limit, directly resembling the loop of Rutishauser's programming language Superplan, replacing = with := and replacing its German keyword Für with its direct English translation for; ALGOL 60 replaced the parentheses with the word delimiters step and until, so that the previous statement would instead be written for i:=base step increment until limit. The IAL if-statement does not have a then-clause or else-clause; it rather guards the succeeding statement. IAL provides an if either-statement that cleanly allows testing of multiple conditions. Both were replaced by ALGOL 60's if-then construct, with the introduction of the "dangling-else" ambiguity. IAL provides macro substitution with the do-statement; this was dropped in ALGOL 60. IAL allows one or more array subscripts to be omitted when passing arrays to procedures, and allows any or all arguments of a procedure to be passed on to another procedure. IAL's infix boolean operators are all of the same precedence level. Exponents are indicated with paired up and down arrows, which removed any confusion about the correct interpretation of nested exponents; ALGOL 60 replaced the paired arrows with a single up-arrow whose function is equivalent to FORTRAN's **. The IAL report does not explicitly specify which standard functions were to be provided, making only a vague reference to the "standard functions of analysis"; the ALGOL 60 report has a more explicit list of standard functions.
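The two spellings of the loop denote the same iteration scheme. As a hedged sketch (the generator name is mine, and a nonzero increment is assumed), both for i:=base(increment)limit and for i := base step increment until limit desugar to something like:

```python
def algol_for(base, increment, limit):
    """Yield base, base+increment, ... until the control variable has
    passed limit in the direction given by the sign of the increment,
    sketching the step-until iteration scheme (nonzero increment assumed)."""
    i = base
    direction = 1 if increment > 0 else -1
    while (i - limit) * direction <= 0:
        yield i
        i += increment

odds = list(algol_for(1, 2, 9))        # [1, 3, 5, 7, 9]
countdown = list(algol_for(5, -2, 0))  # [5, 3, 1]
```

The sign test is what lets one rule serve both counting up and counting down, as the step-until element of ALGOL 60 does.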
Source: https://www.cs.csub.edu/~melissa/cs350-f15/notes/notes02.html
Chapter 2: Evolution of Major Programming Languages

"Much of my work has come from being lazy." -- John Backus, co-designer of the IBM 704 architecture and developer of Fortran

1945 -

Zuse's Plankalkül (1945)
- Never implemented
- Advanced data structures: floating point, arrays, records
- Invariants

Short Code (1949)
- A "pseudocode" developed by Mauchly for BINAC computers
- Motivations: 1) the poor readability and writability of machine code; 2) machine deficiencies: no index registers (required absolute addressing in code), no floating-point hardware

1950 -

Speedcoding (1954)
- A "pseudocode" developed by John Backus for the IBM 701 (index registers became standard 1955-1964)
- Pseudo-operations for arithmetic and math functions
- Pseudo-operations for conditional and unconditional branching
- Auto-increment registers for array access
- Slower than machine code

FORTRAN 0 (1954)
- FORmula TRANslation: language specification only -- no subroutines
- "Fortran should virtually eliminate coding and debugging errors." p. 2
- "Unfortunately we were hopelessly optimistic in 1954 about the problems of debugging Fortran programs." p. 168
- "As far as we were aware, we simply made up the language as we went along. We did not regard language design as a difficult problem, merely a simple prelude to the real problem: designing a compiler which could produce efficient programs." p. 168
- "...while it was perhaps natural and inevitable that languages like Fortran and its successors should have developed out of the concept of the von Neumann computer as they did, the fact that such languages have dominated our thinking for over twenty years is unfortunate. It is unfortunate because their long-standing familiarity will make it hard for us to understand and adopt new programming styles which one day will offer far greater intellectual and computational power." p. 178
(John Backus, "The history of Fortran I, II, and III." ACM SIGPLAN Notices, 13(8):165-180, August 1978.)

FORTRAN I (1956-1957)
- First implemented version of the Fortran compiler (released 1957) -- 18 man-years
- Designed for the IBM 704, with index registers and floating-point hardware
- Environment of development: primitive kernel or no kernel; no programming methodology or tools; computers were small and unreliable; scientific applications and efficiency the primary concern
- Impact of environment on design: no need for dynamic storage (no heap); need for good array handling and counting loops; no string handling, decimal arithmetic, or powerful I/O (commercial features)
- Overview: names could have up to six characters; post-test counting loop (DO); formatted I/O; user-defined subprograms; three-way selection statement (arithmetic IF: <, =, >); no data typing statements
- Programs larger than 400 lines rarely compiled correctly, mainly due to poor reliability of the 704
- Code was very fast -- quickly became widely used

FORTRAN II (1958)
- Independent compilation of subroutines (reduced development time)

LISP (1958)
- LISt Processing language, designed at MIT by McCarthy (1958); the first functional language
- AI applications process data in lists rather than arrays; symbolic computation rather than numeric
- Code and data have the same form
- Originally only two data types: atoms and lists
- Syntax based on lambda calculus
- No need for variables or assignment (as in imperative languages); control via recursion and conditional expressions
- Still the dominant language for AI; ML, Miranda, and Haskell are related languages

ALGOL 58 (1958)
- The first step toward a platform-independent language, with a grammar formalized in BNF (Backus-Naur Form) notation
- Close to mathematical notation -- good for describing ALGOrithms
- Language development up to then was for specific machines (Fortran had barely arrived for the IBM 70x); no portable or universal language -- all were machine-dependent; no easy way to communicate algorithms
- ACM and GAMM met for four days on the design (May 27 to June 1, 1958)
- Must be translatable to machine code
- Formalized the concept of "type"
- Names could be any length; arrays could have any number of subscripts; parameters were separated by mode (in and out); subscripts were placed in brackets
- Compound statements (begin ... end); semicolon as a statement separator; assignment operator was :=; if had an else-if clause
- No I/O -- it would make the language machine-dependent
- IBM initially enthusiastic -- all support dropped by mid-1959, so eventually ALGOL as a language died

1960 -

COBOL (COmmon Business-Oriented Language) (design process began late 1959)
- Names up to 12 characters, with embedded hyphens; English names for arithmetic operators (no arithmetic expressions); verbs as the first word in every statement
- Design committee members from computer manufacturers and the DoD
- Design goals: to work for business applications (based on FLOW-MATIC); to look like simple English -- easy to use at the expense of being powerful; to broaden the base of computer users

FORTRAN IV (1960-62)
- Explicit type declarations; logical selection statement; subprogram names could be parameters

ALGOL 60 (1960)
- An effort to design a universal language -- portability
- New features: block structure (local scope); two parameter-passing methods; subprogram recursion; stack-dynamic arrays
- No I/O or string handling

APL: A Programming Language (1960)
- Early dynamic language, designed as a hardware description language at IBM by Ken Iverson
- Highly expressive (many operators for scalars and arrays of n dimensions); programs are very difficult to read
- Still in use, with minimal changes

SNOBOL (1962)
- Early dynamic language, designed as a string-manipulation language at Bell Labs
- Powerful operators for string pattern matching
- Slower than alternative languages (no longer used for writing editors); still used for certain text-processing tasks

BASIC (1964)
- The beginning of timesharing -- free and private access
- Designed at Dartmouth for non-science students -- the first widely used timesharing language
- User time more important than computer time -- fast turnaround for homework
- Easy to learn and use for non-science students: pleasant and friendly
- Current popular dialect: Visual BASIC

PL/I (1964)
- Initially called NPL (New Programming Language); changed to PL/I in 1965
- Designed by IBM and SHARE: everything for everybody
- Computing situation in 1964 (IBM's point of view): 1) scientific computing: IBM 1620 and 7090 computers, Fortran, SHARE user group; 2) business computing: IBM 1401 and 7080 computers, COBOL
- By 1963 scientific users began to need more elaborate I/O like COBOL's, and business users began to need floating point and arrays
- Two kinds of computers, languages, and support staff -- too costly
- The solution: build a new computer to do both (the IBM 360) and design a new language for it
- PL/I designed in 5 months by 3 members from IBM and 3 members from SHARE; initial concept an extension of Fortran IV

FORTRAN 66
- First ANSI standard for a high-level language

SIMULA 67 (1967)
- Designed primarily for system simulation, in Norway, by Nygaard and Dahl; based on ALGOL 60 and SIMULA I
- Historical importance: introduced the concept of data abstraction
- Primary contributions: co-routines -- a kind of subprogram -- implemented in a structure called a class; classes are the basis for data abstraction; classes are structures that include both local data and functionality

ALGOL 68 (1968)
- Orthogonal design; from ALGOL 60 but not a superset of that language
- Source of several new ideas, but the language never achieved widespread use
- Design is based on the concept of orthogonality: a few principal concepts, few combining mechanisms
- Contributions: user-defined data structures; reference types; dynamic arrays (called flex arrays); comments

1970 -

PROLOG (1970)
- Programming based on formal logic; developed at the University of Aix-Marseille, France
- Non-procedural: declarative rather than imperative
- Can be summarized as an intelligent database system that uses an inferencing process to infer the truth of given queries
- Highly inefficient; small application areas -- formal language research, AI
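The "three-way selection statement" in the FORTRAN I entry above deserves a concrete gloss: the arithmetic IF jumped to one of three labels depending on whether an expression was negative, zero, or positive. A small Python sketch of that dispatch (the labels are modelled as callables; all names are illustrative):

```python
def arithmetic_if(value, if_negative, if_zero, if_positive):
    """Three-way branch in the style of FORTRAN I's arithmetic IF:
    IF (expr) 10, 20, 30 jumps to label 10, 20, or 30 when
    expr < 0, expr = 0, or expr > 0 respectively. Here the three
    'labels' are modelled as callables."""
    if value < 0:
        return if_negative()
    elif value == 0:
        return if_zero()
    else:
        return if_positive()

sign_word = arithmetic_if(3 - 7,
                          lambda: "negative",
                          lambda: "zero",
                          lambda: "positive")
```

Modelling labels as callables rather than goto targets is of course an anachronism, but it shows why the construct was called a selection statement on the value's sign.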
The History of ALGOL
https://historyofalgol.wordpress.com/
When was the language developed (originally, and any major revisions)?

To begin with, ALGOL was created in 1958 at a joint meeting in Zürich, Switzerland, by GAMM (the German Society of Applied Mathematics and Mechanics) and the ACM (Association for Computing Machinery). The conference lasted six days and ended with the creation of ALGOL 58. Originally ALGOL 58 was meant to be used as a universal algorithmic programming language, but disputes over implementation and other areas caused another conference to be called in 1960, this time in Paris, France, which resulted in ALGOL 60. Since there were still disputes about the language, another branch of ALGOL was created: ALGOL 68. But due to the difficulty of understanding its two-level grammar (hyper-rules and meta-rules, e.g. 'ALPHA :: beta.'), which was meant to be used like templates to enable multiple different kinds of production, it never caught on.

Original: ALGOL 58. Revisions: ALGOL 60 and ALGOL 68.

What individuals or organizations were responsible for its development?

Organizations responsible: GAMM (ALGOL 58) and the ACM (ALGOL 58). Notable figure: John W. Backus, the leader of the group which developed FORTRAN (specifically involved in ALGOL 60).

Which languages did it influence and/or was it influenced by?

The influence of ALGOL is far-reaching; features such as recursive procedures are still used by Java and C, which according to www.tiobe.com have been the top two programming languages of the last 10 years. ALGOL can also be noted as a precursor to the Pascal language. As far as influences go, ALGOL was originally developed as a reaction to the FORTRAN language, which is why FORTRAN had the most impact on ALGOL's development.

Influenced: Java, C, Pascal. Influenced by: FORTRAN.

For what paradigm or domain was it developed and/or used?

To put it bluntly, ALGOL was used for mathematical programming, hence the name ALGOL (ALGOrithmic Language). ALGOL 60 was an implementation language, one of the few around at the time of its creation. People were then having trouble with compilers, so being an implementation language made ALGOL a better choice in the computing community (Wegner, 12).

Where was it used?

ALGOL started to generate some American excitement around the two organizations SHARE and IBM, but never became the accepted language. According to History of Programming Languages, edited by Richard L. Wexelblat: "Perhaps it is just as well that IBM and its users accepted ALGOL 58 as a model for language development but did not settle upon it as a computing standard . . . too much would have to have been fixed too soon."

What are its most important features?

Some of the most notable new abstractions of ALGOL 60 were the block statement, parameter passing to procedures (Dainth, para. 2) and nested if-then-else statements (Wegner, 7). These abstractions have been in use ever since. In contrast to FORTRAN at the time, ALGOL 60 was able to handle dynamic arrays and recursive procedures (Wegner, 7), which were also introduced for the first time.

Important features: the introduction of nested if-then-else, recursive procedures, parameter passing to procedures, and block statements.

Is it still in use today?

ALGOL can be written and compiled today (on most computers) using a program called MARST, which translates it into C code, which can then be compiled and executed. ALGOL is not very common in any working condition; in fact, many consider the language obsolete, so it is unused except by those who have an interest in it.

Answer: No, ALGOL is no longer in use.

To what can you attribute its success or failure?
Source: https://heerdebeer.org/ALGOL/notation.html (Heer de Beer.org, 14 August 2006)
Notation: From IAL to ALGOL 60
Huub de Beer

Why notation matters?

Communication between the members of the various working groups on aspects of the new algorithmic language was important, because the ALGOL effort was an international effort consisting of different people and groups. A clear and unambiguous understanding of the language in all its facets by all members participating in the effort was necessary to be able to develop and discuss the ALGOL language. Using English or any other natural language to describe a programming language was, and is, insufficient: natural languages are too ambiguous to formally define, or even completely describe, a programming language.

A description of a programming language consists of a description of two related aspects of the language: the syntax and the semantics. The syntax of a language is about how to form a string in the language. The semantics is about the meaning of syntactically correct strings in the language. The most obvious way to define the semantics formally is an implementation of the language on some machine; this implementation then fixes the meaning of the language. Such an implementation is not a workable description of a language, however. A typical implementation is too large to be understood easily. Often it is written in some low-level programming language or even assembly language. In addition, a compiler is written for a particular machine, whose details must be known to understand the details of the implementation.

Fortunately, describing the syntax formally appeared to be an easier task than describing the semantics of a language. During the development of ALGOL a formal notation was invented to describe the syntax of ALGOL. This notation, or metalanguage, was not complete or perfect, but it fulfilled its task of preventing many ambiguities in the discussions about ALGOL.
Although the new notation was primarily intended for describing the syntax, the semantics were not completely ignored. Actually, the description of the semantics was mixed in with the description of the syntax. Usually a special notation to describe the syntax also influences the language concepts to be described: some language concepts are more easily described in one notation than in another. A formal notation will result in more coherent and simple concepts, because the notation forces the concepts to be described along the rules of the notation.

In this chapter, both the development of the notation used to describe the syntax and semantics of ALGOL and the development of the language concepts of ALGOL 60 are discussed. To that end, the notation used for describing FORTRAN and IAL is explained first. After that the focus is on the first stage of the development of the notation: Backus's notation. The third topic is the notation of the ALGOL 60 report. Finally, the development of some problematic programming language concepts in ALGOL, especially that of the procedure concept, is treated in more detail.

Notations used to describe early programming languages

The notations used to describe the early programming languages were, like the languages they described, primitive. During the ALGOL effort this notation developed into an important aspect of the field of programming languages. Actually, one of the results of this effort was a way to define programming languages: do it as it was done in the ALGOL 60 report. To describe this development of notation, the notation used before the ALGOL effort is explained and compared to the notation used to describe IAL. This comparison answers the question whether this development of notation started during or before the ALGOL effort. In other words: was the nature of the ALGOL effort responsible for this development of notation?
The notation of IAL and FORTRAN compared

In 1954, the first document describing FORTRAN was published: Preliminary Report—Specifications for the IBM Mathematical FORmula TRANslating System FORTRAN. Two years later, on 15 October 1956, a more finalised version of the language was published. The notation used in this later document was the same as in the preliminary report; the notation to describe a programming language had not changed fundamentally during these two years. As was the case with FORTRAN, the first publication of IAL was also a preliminary report: Preliminary Report: International Algebraic Language. Actually, both preliminary reports resemble each other in the sense that both documents were set up in a similar way.

To compare the notations used in both reports, the descriptions of some language elements are given and compared. Real numbers in IAL are described using a pattern reflecting the form of a real number. Compared to the description of real numbers in FORTRAN, where only natural language is used, the form of a number is more obvious. A more complicated language construct was the expression. Again, in IAL patterns are used to clarify the form of an expression, whereas in FORTRAN natural language is used. Here the difference between the two notations is much smaller: a pattern is used in both descriptions, although in IAL's description this pattern is more explicit. The notations used for statements were also alike; for example, the if statement is described in both preliminary reports with a pattern (compare figures). Although this version of FORTRAN did not have that many declarations, the description of declarations was more or less similar to that in the IAL report. In a later version of FORTRAN more declarations were added to the language. In short, the notation used to describe FORTRAN was similar to the notation used to describe IAL.
In the description of FORTRAN natural language was used more often than in the description of IAL. The general form of a language element was described with a pattern in both descriptions. These patterns were the basis of the description of IAL. This was probably the influence of the European part of the designing committee: as we have seen before, the Europeans tended more toward logic and mathematics than the Americans did.

The quality of IAL's notation

The notations used to describe both FORTRAN and IAL were similar. This does not say anything about the quality of the notation, however. The question is: was this notation sufficient to describe a programming language formally and completely? To answer this question, a closer look at IAL and its notation is taken. Even for simple language elements, like algebraic expressions, IAL's notation was not good enough. For example, an algebraic expression simply has one of the forms occurring on the right-hand side of the "~" symbols. Although it is assumed that the operators in this definition have the 'conventional meaning', the definition is ambiguous: it says nothing about operator precedence, nor about associativity.

When defining an element with an unknown number of subelements, or when there are constraints on the occurrence of some subelements, this notation was problematic. Take, for example, the compound statement (above figure). The pattern of this compound statement is a number of statements between the begin and end keywords. At first glance, this seems to define the compound statement completely. Unfortunately it is not clear whether it was possible to have a compound statement containing no statements at all, only one statement, or even two statements; after all, there are three "Σ" symbols in the pattern describing the compound statement.
The use of "…" may in itself be clear and completely understandable for most readers of the report; those readers were almost all used to reading mathematical texts in which such shorthands occur often. It does not, however, describe formally and fully what should occur in place of the "…". The compound statement consists only of a number of unconstrained elements, and the use of "…" would be acceptable if its only meaning were zero or more repetitions of the repeated element. Unacceptable for a formal notation, however, are constraints on the elements in a pattern added outside the pattern using natural language. As an example, part of the definition of the procedure declaration is given in the above figure. Here the "…" is used not only to denote zero or more occurrences of one element denoted by a single symbol; it has three different meanings in the same pattern:

1. One or more occurrences of a procedure signature. The procedure signature itself consisted of more than one symbol and a part of unknown length, namely the parameter part of the procedure signature. Actually, the part ≡:(P0) is optional.
2. Zero or more occurrences of a declaration about the input and output parameters defined in the procedure signatures.
3. Zero or more occurrences of two different elements: statements Σ and declarations ∆. These different elements can be mixed in any order. For every procedure signature, however, there should be a label identical to the procedure identifier preceding at least one statement. In addition, each procedure must have a return statement, and every output parameter should have an assignment statement assigning the value to output as the result.

This early notation for describing programming languages was not suitable to define a language like IAL fully and formally. Being able to do so was necessary because IAL was intended to be machine-independent: IAL was to be implemented on various machines by different people.
According to Backus (1959), 'there must exist a precise description of those sequences of symbols which constitute legal IAL programs.' But 'heretofore there has existed no formal description of a machine-independent language'. For this reason, Backus started to work on such a formal description.

Backus's notation

At the UNESCO International Conference on Information Processing, held in Paris from 15 to 20 June 1959, J.W. Backus presented The syntax and semantics of the proposed international algebraic language of the Zurich ACM-GAMM Conference, a formal description of the syntax of IAL. To be able to describe the syntax formally, he invented a new metalanguage based on Emil Post's production systems. This notation became known as the Backus Normal Form, and later as the Backus Naur Form; it is best known by its abbreviation, BNF.

Using this notation, the syntax of a language is described by production rules. Each rule has the shape <metalinguistic variable> :≡ pattern. A pattern is built up from metalinguistic variables and symbols of the language. All possible patterns for a metalinguistic variable are connected with the or symbol, denoting a choice between the different patterns for that metalinguistic variable. A simple example of the application of Backus's notation is the description of integers:

<digit> :≡ 0 or 1 or 2 or 3 or 4 or 5 or 6 or 7 or 8 or 9
<integer> :≡ <digit> or <integer><digit>

A digit is clearly a number, or better, a symbol representing a number between 0 and 9. An integer is built up from these simple digits: it is either a single digit, or an integer followed by a digit, like 9237.
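The two rules above can be turned directly into a recognizer. The left recursion in <integer> :≡ <digit> or <integer><digit> simply means "one or more digits", which the following sketch makes explicit (a Python illustration, not part of Backus's paper):

```python
def is_integer(s: str) -> bool:
    """Recognize the language of <integer>: one or more <digit>s.

    <digit>   :≡ 0 or 1 or ... or 9
    <integer> :≡ <digit> or <integer><digit>
    """
    # The recursive rule unrolls to repetition: an <integer> is a
    # non-empty sequence of <digit>s.
    return len(s) > 0 and all(c in "0123456789" for c in s)

print(is_integer("9237"))  # True
print(is_integer("92a7"))  # False
```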
Another simple and clear example is the description of arithmetic expressions:

<factor> :≡ <number> or <function> or <variable> or <subscr var> or ( <ar exp> ) or <factor>↑<ar exp>↓
<term> :≡ <factor> or <term> × <factor> or <term> / <factor>
<ar exp> :≡ <term> or + <term> or - <term> or <ar exp> + <term> or <ar exp> - <term>
<Ar exp A> :≡ <ar exp>
<relation> :≡ < or > or ≤ or ≥ or ≠
<rel exp> :≡ ( <ar exp> <relation> <ar exp A> )

Comparing this with the earlier description, it is immediately clear that the new description is less ambiguous than the former one. Using his notation, Backus was able to express operator precedence by splitting the description of expressions into different parts: factors, terms, and expressions.

Some other problems with the older notation were also solved by the new one: the use of "…" to denote an element occurring an unknown number of times was no longer needed. In Backus's notation it is possible to use recursion and, hence, to specify that an element may occur any number of times. This recursion is illustrated by the description of a parameter list:

<param list> :≡ <param> or <param list>, <param>

A parameter list is either just one parameter, or a parameter list followed by a comma and one parameter. Choice could be denoted using the or connective: write two patterns, one with the element of choice and one without it.
Both recursion and choice were used to describe procedure statements:

<oe> :≡ <left element>
<out list> :≡ <oe> or <out list>, <oe>
<suc> :≡ <label> or <id>[<exp>]
<succr list> :≡ <suc> or <succr list>, <suc>
<A> :≡ =:(<out list>) or <blank>
<B> :≡ :(<succr list>) or <blank>
<proc stmt> :≡ <function> <A> <B> or <id> =:(<out list>) <B> or <id>:(<succr list>)
<ppol> :≡ <blank> or <ppol> <oe>
<pol> :≡ <ppol> or <pol>, or <pol>, <oe>
<A'> :≡ =:(<pol>)
<ppsl> :≡ <blank> or <ppsl><suc>
<psl> :≡ <ppsl> or <psl>, or <psl>, <suc>
<B'> :≡ :(<psl>)
<F*> :≡ <function> or <pure function> or <id>
<A*> :≡ <A> or <A'>
<B*> :≡ <B> or <B'>
<pure function> :≡ <pure function> <A*> <B*> or <F*> <A'> <B'> or <F*> <A*> <B'>

[a pure procedure may have any of the forms of a procedure statement, but at least one position of one existing list must be empty: at least one input parameter position, or one output position, or one successor position].

Unfortunately, Backus was not able to write down a formal, clear, and understandable description of the procedure statement. Even using his new notation, he had to add some remarks in natural language to complete the formal description; the procedure statement was too difficult to describe formally. In addition, the description of the declaration of procedures was not included at all.

The new notation was a huge improvement over the one used earlier. Nonetheless, it was improved further by Peter Naur, who replaced ":≡" by "::=" and "or" by "|". With this, and with the use of complete words for metalinguistic variables instead of the abbreviations Backus used, Naur improved the readability of the description. The most important contribution of Peter Naur to Backus's notation, however, was the fact that he used it in the ALGOL 60 report. Only after the publication of that report did the BNF become more widely known.
Before the publication, it appeared to Peter Naur that Backus's description of IAL 'was received with a silence that made it seem that precise syntax description was an idea whose time had not yet come'. Naur 'thus proved the usefulness of the idea in a widely read paper and it was accepted'.

Developing ALGOL 60

During the eighteen months between the meeting in Zürich and the next joint meeting on the international algebraic language, held in Paris in January 1960, the name of the language had changed from the '"unspeakable" and pompous acronym, IAL' to ALGOL. Furthermore, the language was discussed among interested people from America and various countries of Europe. In these months the ALGOL effort became a truly international effort, but it was still a divided one: the main development took place at different meetings and in correspondence between members of the various subcommittees. The official communication channels of the development were the Communications of the ACM in the USA and the ALGOL Bulletin in Europe.

Most of the American proposals were related to practical aspects of the language. They wanted to improve the language by extending it, by adding more types, and by adding input and output facilities. Another suggestion was to tidy up the syntax a bit. This practical attitude towards the ALGOL effort was a result of the state of programming in the USA: programming was becoming a professional field, and the experience gained with existing programming languages provided good feedback to the ALGOL effort.

The European proposals were often focused on the procedure concept and the scopes of variables. The Europeans aimed to improve the language fundamentally, and the main target was the difficult procedure concept in IAL. Neither the preliminary report on IAL nor Backus's description of IAL could describe this procedure concept fully and formally in the notation used.
Aside from this notational problem, other problems with IAL's procedure concept were also noted and discussed. The discussions focused mainly on parameters. E.T. Irons and F.S. Acton (1959) sum up some problems with IAL's parameters in A proposed interpretation in ALGOL. Parameters could occur in the procedure body on the left-hand side of an assignment statement. When the procedure was called, these parameters in the body were replaced with the argument supplied for that parameter; if an argument was not an assignable element (i.e. not a variable), this would result in undefined behaviour. Another problem mentioned was the use of one argument as both an input and an output parameter. Indeed, one parameter could also be used as both an input and an output parameter.

These problems with the procedure concept were resolved by various subcommittees at the final meeting in Paris. First, the distinction between input and output parameters was removed. This solved a number of problems, but not all of them. Eventually, under great time pressure, the distinction between call-by-name (enabling the so-called Jensen's device) and call-by-value was invented. Every occurrence of a call-by-name parameter in the body of a procedure was substituted by the name of the argument supplied to the procedure for that parameter. If a parameter was called by value, however, then the value of the argument was assigned to all occurrences of the corresponding parameter in the procedure body. This call-by-name parameter concept became one of the most controversial features of ALGOL 60.

Another issue at the Paris meeting was recursion. Recursive procedures were new in 1960 and their usefulness was not widely recognised. The proposal to add a keyword to the language to mark a recursive procedure was rejected. When ALGOL 60 was published, however, it appeared that there was no restriction on the occurrence of calls to a procedure in its own procedure body.
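The substitution rule for call-by-name means that the argument expression is re-evaluated at every use of the parameter, which is exactly what Jensen's device exploits. In a language with call-by-value only, this can be imitated with parameterless closures ("thunks"); a minimal Python sketch, not ALGOL code (the names sum_over, body, and set_i are made up for this illustration):

```python
def sum_over(lo, hi, body, set_i):
    """Sum body over i = lo..hi, imitating by-name parameters i and body."""
    total = 0
    for i in range(lo, hi + 1):
        set_i(i)         # assigning to the by-name parameter i
        total += body()  # re-evaluates the argument expression each time
    return total

# Summing i × i for i = 1..3, as Jensen's device would:
state = {"i": 0}
result = sum_over(1, 3,
                  body=lambda: state["i"] * state["i"],
                  set_i=lambda v: state.update(i=v))
print(result)  # 1 + 4 + 9 = 14
```

A call-by-value parameter, by contrast, would freeze the argument to a single number before the loop ever ran, and no summation over i would be possible.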
Without anyone knowing, recursion was added to the language. According to Bauer (1978), this was the result of 'the Amsterdam plot in introducing recursivity.'

At the end of the meeting, the procedure concept had become a clear concept. The formal description of the declaration of a procedure is given in the above Figure. As said earlier, Backus (1959) did not define procedure declarations formally, but he did describe procedure statements. The description of procedure statements in the ALGOL 60 report consists of just six subparts; Backus, on the other hand, needed seventeen subparts for the same task. Using the BNF to define ALGOL 60 had a beneficial effect on the procedure concept and on ALGOL 60 as a whole.

Besides the types Boolean, integer, real, and arrays of those types, the string type was added to ALGOL 60. It could only be used as an actual parameter of procedures, however. Proposals to add extra types, like complex numbers and lists, were rejected. Furthermore, IAL's assignment statement was extended into a multiple assignment statement: assignments like a := b := 0 were now also allowed.

The compound statement introduced in IAL was extended and became a special case of the block concept. In a block, variables, functions, and even procedures can be declared local to the scope of the block. These declarations are only known in the block they are declared in and in all subblocks; they are not known, however, in the encapsulating block. A compound statement was now simply a block without any declarations. Blocks could be nested to any level.

The case-like statement from IAL was removed and replaced by the else clause of the if statement and by conditional arithmetic and Boolean expressions. In the above Figure the description of the arithmetic expression is given, including this conditional expression. An example of the use of a conditional expression is: a := if (b < 23) then b + 23 else b - 21.
To the variable a the value b + 23 is assigned if b < 23; otherwise the value b - 21 is assigned. In the Figure below, one of the example programs from the ALGOL 60 report is given, exemplifying many of the programming language concepts discussed above.

The meeting in Paris was attended by Bauer, Naur, Rutishauser, Samelson, Vauquois, van Wijngaarden, and Woodger from Europe, and by Backus, Green, Katz, McCarthy, Perlis, and Wegstein from the USA. The seventh American member, Turanski, had died before the meeting and was not present. According to Perlis (1978), 'The meetings were exhausting, interminable, and exhilarating. (…) diligence persisted during the entire period, The chemistry of the 13 was excellent. (…) Progress was steady and the output, Algol60, was more racehorse than camel.'

The difference between IAL and ALGOL 60 was huge. Instead of 'just adding a few corrections to ALGOL 58, it was necessary to redesign the language from the bottom up.' In addition, the editor of the ALGOL 60 report, Peter Naur, used a modified version of Backus's notation; the use of the BNF was beneficial for the clean structure of the report. Peter Naur became the editor because he had prepared a draft of the language for the Paris meeting. According to Bauer (1978), the participants of the conference were surprised by Naur's work: 'It therefor sounds poetic if he has written that his draft Report was 'chosen' as the basis of the discussion; the Committee was simply forced to do so'.

The result of the meeting, the ALGOL 60 report, 'was a fitting display for the language. Nicely organised, tantalisingly incomplete, slightly ambiguous, difficult to read, consistent in format, and brief, it was a perfect canvas for a language that possessed those same properties.
Like the Bible it was meant, not merely to be read, but to be interpreted.' And although the ALGOL 60 report and the formal notation used in it frightened many people at first, the report would become the standard for defining programming languages.

Conclusion

A formal notation for defining a programming language is important to allow everyone to read and interpret the definition in the same way. During the development of ALGOL 60, the notation used to describe the ALGOL languages changed fundamentally.

The notation used to describe early programming languages like FORTRAN and IAL was natural language combined with patterns denoting the form of the various language elements. The disadvantage of this notation was that it resulted in ambiguous descriptions. Even for simple language elements, like numbers, expressions, and simple control structures, this was problematic; for the definition of complex structures, like the procedure statement and declarations, the notation was totally insufficient. The description of IAL's procedure concept was long and incomplete because of the use of the "…" symbol.

To give a more formal and complete description of the syntax of IAL, Backus invented a new notation: the Backus Normal Form. Using this simple notation, complex structures in the language could be described formally. Unfortunately, the procedure concept of IAL was still too complex to describe in this notation. So we can conclude that either the notation was insufficient, or the procedure concept itself was flawed.

During the period between the definition of IAL and that of ALGOL 60, many proposals to improve ALGOL were made. One of the main topics, especially in Europe, was the complex procedure concept. In the ALGOL 60 report the procedure concept was simplified: input and output parameters were removed, and call-by-name and call-by-value parameters were introduced. Another important aspect of the new language was the notion of a block with its own scope.
This block was an extension of the compound statement from IAL. Recursion was a new feature, added without anybody knowing it; the proposal to add recursion explicitly had been rejected.

The ALGOL 60 report was edited by Peter Naur. He wrote the draft version and used a slightly modified version of Backus's notation to describe the language. This draft was used as the basis for the ALGOL meeting. The final report would become the standard way of defining programming languages, and Backus's notation became the standard way of describing their syntax. In this short period, the ALGOL effort became a major contributor to the field of programming languages. Examples of its contributions include the BNF, a method to define a programming language formally; the block structure with its scope rules; recursive procedures; and call by name and call by value.

References
Let's take a deep dive through time and look at how computer science and programming languages began. Follow us as we go through the ages and look at the timeline, major personalities, and events that, through their ingenuity, paved the road to our current Technological and Digital Era. The history of programming Want to see the father of computers and the first-ever programmer as they struggled in the 1800s, or check out how a 17th-century philosopher helped us develop the binary code that lets YOU read these words on your screen? Well, then you absolutely should check out our earlier entry in the series. And if you've just finished that one, let's recap where we left off. Fortran code is said to be 20 times shorter than its equivalent in handwritten assembly code. The community was doubtful of it at the time due to performance concerns, but since programmers could write more code faster, it was an easy choice from an economic viewpoint. This is how Fortran code looked back then: It wasn't a matter of typing everything into a computer and having it appear on a screen; there were no screens... or even keyboards. Computer program from 1954 Like Mr. Jacquard more than a hundred years earlier, programmers had to encode it on punched cards. The only difference is that we're talking about an IBM mainframe, and you couldn't touch the IBM mainframe; there were people who did that for you. You wrote your program on paper with a pen: An improvement on the Fortran language Since we're talking about Mr. Backus, let's move on to 1958, when Algol was created by a committee of European and American computer scientists. Algol (also known as Algorithmic Language) is a high-level imperative computer programming language that introduced a wealth of features that would later influence, and be used in, the creation of Pascal, C, C++ and Java, among others.
As a matter of fact, most languages nowadays have syntaxes inspired by Algol, and it's considered among the most influential programming languages ever. It's said that Algol was an attempt to improve on some intrinsic problems found in Fortran, and it's widely considered an "evolution" of sorts. While we're on the subject: John Backus, alongside Peter Naur, developed the Backus–Naur form for describing programming languages, based on ALGOL 58-60. So you could consider Backus one of the original fathers of many languages we use today. Here's how "Hello, World!" looks in Algol-60: BEGIN DISPLAY ("Hello, world!") END This is starting to look quite reasonable. Let's get some clojure for computer sciences We're still in 1958, and we're about to witness another giant among programming languages, and a father-language to many revolutionary ideas of its time, be born. LISP (aka LISt Processor) is a list-oriented high-level programming language (3GL) developed by John McCarthy at the Massachusetts Institute of Technology. The language has a long history of evolving dialects (like Racket, Clojure, Common Lisp and Scheme). Originally intended as a formal notation for computations, influenced by lambda calculus and artificial intelligence research, it surprised everyone after being implemented on the IBM 704 and turning out to be a fully functioning, practical tool for solving problems. Lisp is notable for pioneering many ideas in computer science, such as tree data structures, dynamic typing, conditionals, automatic storage management, recursion, higher-order functions and the self-hosting compiler. It is also notable for giving students migraines due to its heavy reliance on parentheses, so much so that they gave it nicknames like "Lost In Stupid Parentheses" or "Lots of Irritating Superfluous Parentheses".
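Two of the ideas listed above, recursion over list structure and higher-order functions, are easiest to see in running code. Here is a small sketch in Python rather than Lisp, since these ideas carried over into most modern languages; the my_map name is invented for illustration:

```python
# A sketch (in Python, for readability) of two ideas Lisp pioneered:
# recursion over list structure and higher-order functions.

def my_map(fn, items):
    """Apply fn to every element, recursively, Lisp-style."""
    if not items:                     # empty list (NIL): recursion bottoms out
        return []
    head, tail = items[0], items[1:]  # roughly (car items) and (cdr items)
    return [fn(head)] + my_map(fn, tail)

# Functions are values, so they can be passed around like any other datum.
squares = my_map(lambda x: x * x, [1, 2, 3, 4])
print(squares)  # [1, 4, 9, 16]
```

In Lisp itself this would be a few parenthesized forms; the shape of the recursion is the same.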
And here's how "Hello, World!" would look in Lisp: (PRINT "Hello, World!") Though in 1958 Lisp was still in its infancy and didn't see much broad use, which brings us to... An old acquaintance At this time, most if not all computers ran either low-level languages, i.e. machine code or assembly, or the only high-level programming language available, Fortran. And if for some reason you wanted to upgrade your computer, you had to write all that code from scratch. So let's make a small leap to the end of 1959. We're going to meet someone we've already discussed: Grace Hopper, rear admiral of the US Navy and a computer scientist, who had already made an impact on the history of programming by creating the A-0 compiler system. Now, however, she's attending CODASYL, the Conference on Data Systems Languages, as a technical aide and main expert. The committee was formed in an attempt to guide the standardization of programming languages: to create a single language that could be used across different machines regardless of hardware. The result was the Common Business-Oriented Language, or COBOL for short, an "English-like" programming language heavily influenced by FLOW-MATIC, created by Ms. Hopper. The primary goal of COBOL was to lower the barrier of entry into programming; before the creation of high-level languages, programming was a realm exclusive to computer scientists and experts. Now, however, enthusiasts from other professions, businesspeople, doctors, engineers, teachers and many others, could incorporate computation into their work. To deal with the underlying hardware, each computing machine had to have its own COBOL compiler; but critically, these compilers could accept the same COBOL source code. The notion of "write once, apply everywhere" was born, for at this time most computers ran machine code or assembly, which is very CPU-specific.
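The "write once, apply everywhere" idea can be sketched as one shared source text accepted by two machine-specific "compilers". This is a hypothetical Python illustration; the compile_for_machine_* functions and their opcodes are invented stand-ins, not real COBOL toolchains:

```python
# Hypothetical sketch of "write once, apply everywhere": one source text,
# one compiler per machine, each emitting its own machine-specific code.

SOURCE = 'DISPLAY "Hello, world!"'   # the shared COBOL-like source line

def compile_for_machine_a(source):
    # Machine A's "compiler" lowers DISPLAY to a single PRINT opcode.
    return [("PRINT", source.split('"')[1])]

def compile_for_machine_b(source):
    # Machine B has a different instruction set, but accepts the same source.
    return [("LOAD_STR", source.split('"')[1]), ("SYSCALL_WRITE",)]

# Same input, different machine-specific output.
print(compile_for_machine_a(SOURCE))
print(compile_for_machine_b(SOURCE))
```

The point is that portability lives in the source language, while each compiler hides its machine's quirks.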
Let's see how "Hello, World!" looks in COBOL: IDENTIFICATION DIVISION. PROGRAM-ID. hello-world. PROCEDURE DIVISION. DISPLAY "Hello, world!" That, folks, is what we call a "language designed by committee". The arrival of the minicomputer Hey mama, welcome to the 60's!! Wo-oh-oh-oh… Oh, sorry, old habits. But as you might have guessed, the 1960s have rolled around, and someone had a bright idea: how about we stop writing code in pencil and spending weeks encoding it onto punched cards, and instead use the conveniently invented teletype machines, which news agencies and governments use to send messages to each other, to "talk" with the computer? With that sorted, humanity paved the way to more general-purpose high-level programming languages designed to be used by everyone, everywhere. And with the emergence of minicomputers on the market, there was a sharp need for a programming language and compiler compact enough to meet the demand. Thus BASIC was born: the "Beginners' All-purpose Symbolic Instruction Code". Originally designed by John G. Kemeny and Thomas E. Kurtz in 1964, BASIC's core principles and objectives were "ease of use" and "compactness". "Minicomputer" – PDP-11 So compact, in fact, that it is generally considered the first commercial "interpreter", able to make use of all those 4 kB of memory available on those "mini" computers. BASIC was amazing for teaching programming principles; it was easy to understand and follow, especially for people who didn't have a strong understanding of mathematics or computers in general. Nowadays we know BASIC under the many dialects that appeared after its inception in 1964: Microsoft BASIC, Visual BASIC, Atari BASIC, Commodore BASIC, Applesoft BASIC and a dozen other variations. However, they all descend from "Dartmouth BASIC".
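What made BASIC feel approachable was its execution model: numbered lines run in order unless a GOTO redirects control. A toy sketch of that model in Python (a deliberately minimal, invented interpreter; real BASIC interpreters are far richer, and we cap the step count because the classic sample program loops forever):

```python
# A toy sketch of a BASIC-style interpreter: the program is a set of
# numbered lines, executed in order unless a GOTO redirects control.

program = {
    10: ('PRINT', 'Hello, World!'),
    20: ('GOTO', 10),
    30: ('END',),
}

def run(program, max_steps=5):
    lines = sorted(program)          # execution order follows line numbers
    pc, steps, output = 0, 0, []
    while pc < len(lines) and steps < max_steps:  # cap: line 20 loops forever
        op = program[lines[pc]]
        if op[0] == 'PRINT':
            output.append(op[1])
            pc += 1
        elif op[0] == 'GOTO':
            pc = lines.index(op[1])  # jump to the target line number
        elif op[0] == 'END':
            break
        steps += 1
    return output

print(run(program))  # ['Hello, World!', 'Hello, World!', 'Hello, World!']
```

Five steps allow three PRINTs and two jumps before the cap stops the loop, which is exactly what an impatient operator's BREAK key did on a real teletype.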
Aside from being the originator, Dartmouth BASIC ran within an interactive programming environment available to all students and teachers, called the "Dartmouth Time-Sharing System" (DTSS), one of the first time-sharing operating systems! 10 PRINT "Hello, World!" 20 GO TO 10 30 END Family tree of programming languages. Most of the following have their roots deep down in the languages we've mentioned earlier. By the 1970s and onwards into the 1980s we had: PASCAL (1970); Smalltalk (1972); C (1972); SQL (1972); Scheme (1975); C++ (1980); Ada (1983); Objective-C (1983); Perl (1987). Later the 90s rolled around, and we got our favourites: Haskell and Python (1990); Visual Basic (1991); Java, PHP and JavaScript in 1995; and so many others. History of programming: end of today's session We believe we went too fast back there and overloaded the capacitors; we seem to be going back to 2021! Well, I suppose that's it for now; something went horribly wrong and I'm afraid we're stuck in our own time again. Perhaps later we'll come back to the 70s and explore how our programming world evolved through the ages. This was our take on the history of programming from the 1950s to the 1960s. If you think it's a great idea, maybe leave a like and a comment saying so? What do you think is the most fascinating part of programming history? While you ponder that, I'll go back to researching the incredible ways you can improve your business and tech company with digital transformation. Stay classy, tech and business nerds!
4768
dbpedia
3
2
https://en.wikipedia.org/wiki/ALGOL
en
Wikipedia
https://upload.wikimedia…t_al_-_cover.jpg
https://upload.wikimedia…t_al_-_cover.jpg
[]
[]
[]
[ "" ]
null
[ "Contributors to Wikimedia projects" ]
2001-08-26T23:02:19+00:00
en
/static/apple-touch/wikipedia.png
https://en.wikipedia.org/wiki/ALGOL
Family of programming languages This article is about the programming language family. For other uses, see Algol (disambiguation).

ALGOL
Paradigm: Procedural, imperative, structured
Family: ALGOL
Designed by: Bauer, Bottenbruch, Rutishauser, Samelson, Backus, Katz, Perlis, Wegstein, Naur, Vauquois, van Wijngaarden, Woodger, Green, McCarthy
First appeared: 1958
Typing discipline: Static, strong
Scope: Lexical
Influenced: Most subsequent imperative languages (including so-called ALGOL-like languages), e.g. PL/I, Simula, Pascal, C and Scheme

ALGOL (short for "Algorithmic Language")[1] is a family of imperative computer programming languages originally developed in 1958. ALGOL heavily influenced many other languages and was the standard method for algorithm description used by the Association for Computing Machinery (ACM) in textbooks and academic sources for more than thirty years.[2] In the sense that the syntax of most modern languages is "Algol-like",[3] it was arguably more influential than three other high-level programming languages with which it was roughly contemporary: FORTRAN, Lisp, and COBOL.[4] It was designed to avoid some of the perceived problems with FORTRAN and eventually gave rise to many other programming languages, including PL/I, Simula, BCPL, B, Pascal, Ada, and C. ALGOL introduced code blocks and the begin...end pairs for delimiting them. It was also the first language implementing nested function definitions with lexical scope. Moreover, it was the first programming language which gave detailed attention to formal language definition and through the Algol 60 Report introduced Backus–Naur form, a principal formal grammar notation for language design. There were three major specifications, named after the years they were first published: ALGOL 58 – originally proposed to be called IAL, for International Algebraic Language. ALGOL 60 – first implemented as X1 ALGOL 60 in 1961.
Revised 1963.[5][6][7] ALGOL 68 – introduced new elements including flexible arrays, slices, parallelism, operator identification. Revised 1973.[8] ALGOL 68 is substantially different from ALGOL 60 and was not well received,[according to whom?] so reference to "Algol" is generally understood to mean ALGOL 60 and its dialects.[citation needed] History [edit] ALGOL was developed jointly by a committee of European and American computer scientists in a meeting in 1958 at the Swiss Federal Institute of Technology in Zurich (cf. ALGOL 58).[9] It specified three different syntaxes: a reference syntax, a publication syntax, and an implementation syntax, syntaxes that permitted it to use different keyword names and conventions for decimal points (commas vs periods) for different languages.[citation needed] ALGOL was used mostly by research computer scientists in the United States and in Europe; commercial applications were hindered by the absence of standard input/output facilities in its description, and the lack of interest in the language by large computer vendors (other than Burroughs Corporation).[citation needed] ALGOL 60 did however become the standard for the publication of algorithms and had a profound effect on future language development.[according to whom?] John Backus developed the Backus normal form method of describing programming languages specifically for ALGOL 58. It was revised and expanded by Peter Naur for ALGOL 60, and at Donald Knuth's suggestion renamed Backus–Naur form.[10] Peter Naur: "As editor of the ALGOL Bulletin I was drawn into the international discussions of the language and was selected to be member of the European language design group in November 1959. 
In this capacity I was the editor of the ALGOL 60 report, produced as the result of the ALGOL 60 meeting in Paris in January 1960."[11] The following people attended the meeting in Paris (from 1 to 16 January):[citation needed] Friedrich Ludwig Bauer, Peter Naur, Heinz Rutishauser, Klaus Samelson, Bernard Vauquois, Adriaan van Wijngaarden, and Michael Woodger (from Europe) John Warner Backus, Julien Green, Charles Katz, John McCarthy, Alan Jay Perlis, and Joseph Henry Wegstein (from the US). Alan Perlis gave a vivid description of the meeting: "The meetings were exhausting, interminable, and exhilarating. One became aggravated when one's good ideas were discarded along with the bad ones of others. Nevertheless, diligence persisted during the entire period. The chemistry of the 13 was excellent."[This quote needs a citation] ALGOL 60 inspired many languages that followed it. Tony Hoare remarked: "Here is a language so far ahead of its time that it was not only an improvement on its predecessors but also on nearly all its successors."[12] The Scheme programming language, a variant of Lisp that adopted the block structure and lexical scope of ALGOL, also adopted the wording "Revised Report on the Algorithmic Language Scheme" for its standards documents in homage to ALGOL.[13] ALGOL and programming language research [edit] As Peter Landin noted,[citation needed] ALGOL was the first language to combine seamlessly imperative effects with the (call-by-name) lambda calculus.[citation needed] Perhaps the most elegant formulation of the language is due to John C. Reynolds, and it best exhibits its syntactic and semantic purity.[according to whom?] 
Reynolds's idealized ALGOL also made a convincing methodologic argument regarding the suitability of local effects in the context of call-by-name languages, in contrast with the global effects used by call-by-value languages such as ML.[citation needed] The conceptual integrity of the language made it one of the main objects of semantic research, along with Programming Computable Functions (PCF) and ML.[citation needed]

IAL implementations timeline [edit]

To date there have been at least 70 augmentations, extensions, derivations and sublanguages of Algol 60.[14]

Name | Year | Author | Country | Description | Target CPU
ZMMD-implementation | 1958 | Friedrich L. Bauer, Heinz Rutishauser, Klaus Samelson, Hermann Bottenbruch | Germany | implementation of ALGOL 58 | Z22 (later Zuse's Z23 was delivered with an Algol 60 compiler)[15]
X1 ALGOL 60 | 1960 August[16] | Edsger W. Dijkstra and Jaap A. Zonneveld | Netherlands | First implementation of ALGOL 60[17] | Electrologica X1
Elliott ALGOL | 1960s | C. A. R. Hoare | UK | Subject of the 1980 Turing Award Lecture[18] | Elliott 803, Elliott 503, Elliott 4100 series
JOVIAL | 1960 | Jules Schwartz | US | A DOD HOL prior to Ada | Various (see article)
Burroughs Algol (several variants) | 1961 | Burroughs Corporation (with participation by Hoare, Dijkstra, and others) | US | Basis of the Burroughs (and now Unisys MCP based) computers | Burroughs Large Systems and their midrange also
Case ALGOL | 1961 | Case Institute of Technology[19] | US | Simula was originally contracted as a simulation extension of the Case ALGOL | UNIVAC 1107
GOGOL | 1961 | William M. McKeeman | US | For ODIN time-sharing system[20] | PDP-1
RegneCentralen ALGOL | 1961 | Peter Naur, Jørn Jensen | Denmark | Implementation of full Algol 60 | DASK at Regnecentralen
Dartmouth ALGOL 30 | 1962 | Thomas Eugene Kurtz et al. | US | | LGP-30
USS 90 Algol | 1962 | L. Petrone | Italy | |
ALGOL 60 | 1962 | Bernard Vauquois, Louis Bolliet[21] | France | Institut d'Informatique et Mathématiques Appliquées de Grenoble (IMAG) and Compagnie des Machines Bull | Bull Gamma 60
Algol Translator | 1962 | G. van der Mey and W.L. van der Poel | Netherlands | Staatsbedrijf der Posterijen, Telegrafie en Telefonie | ZEBRA
Kidsgrove Algol | 1963 | F. G. Duncan | UK | | English Electric Company KDF9
VALGOL | 1963 | Val Schorre | US | A test of the META II compiler compiler |
Whetstone | 1964 | Brian Randell and L. J. Russell | UK | Atomic Power Division of English Electric Company; precursor to Ferranti Pegasus, National Physical Laboratories ACE and English Electric DEUCE implementations | English Electric Company KDF9
NU ALGOL | 1965 | | Norway | | UNIVAC
ALGEK | 1965 | | Soviet Union | АЛГЭК, based on ALGOL-60 and COBOL support, for economical tasks | Minsk-22
ALGOL W | 1966 | Niklaus Wirth | US | Proposed successor to ALGOL 60 | IBM System/360
MALGOL | 1966 | publ. A. Viil, M. Kotli & M. Rakhendi | Estonian SSR | | Minsk-22
ALGAMS | 1967 | GAMS group (ГАМС, группа автоматизации программирования для машин среднего класса), cooperation of Comecon Academies of Science | Comecon | | Minsk-22, later ES EVM, BESM
ALGOL/ZAM | 1967 | | Poland | | Polish ZAM computer
Simula 67 | 1967 | Ole-Johan Dahl and Kristen Nygaard | Norway | Algol 60 with classes | UNIVAC 1107
Triplex-ALGOL Karlsruhe | 1967/1968 | | Karlsruhe, Germany | ALGOL 60 (1963) with triplex numbers for interval arithmetic[22] |
Chinese Algol | 1972 | | China | Chinese characters, expressed via the Symbol system |
DG/L | 1972 | | US | | DG Eclipse family of Computers
S-algol | 1979 | Ron Morrison | UK | Addition of orthogonal datatypes with intended use as a teaching language | PDP-11 with a subsequent implementation on the Java VM

The Burroughs dialects included special Bootstrapping dialects such as ESPOL and NEWP. The latter is still used for Unisys MCP system software.

Properties [edit]

ALGOL 60 as officially defined had no I/O facilities; implementations defined their own in ways that were rarely compatible with each other.
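The Backus–Naur form mentioned above, introduced through the Algol 60 Report, can be illustrated with a toy grammar and a recognizer that follows its rules directly. This is a Python sketch with an invented is_integer helper, not anything from the ALGOL reports themselves:

```python
# A sketch of the idea behind Backus–Naur form: a grammar written as
# rewrite rules, and a recursive recognizer that mirrors the rules.
#
# Grammar (BNF):  <integer> ::= <digit> | <digit> <integer>
#                 <digit>   ::= "0" | "1" | ... | "9"

def is_integer(s):
    """Recognize <integer> by following the grammar rule directly."""
    if len(s) == 1:
        return s.isdigit()                        # <integer> ::= <digit>
    return s[:1].isdigit() and is_integer(s[1:])  # <digit> <integer>

print(is_integer("1960"))  # True
print(is_integer("19a0"))  # False
```

Each alternative in the rule becomes one branch of the function, which is essentially how recursive-descent parsers are written from a BNF grammar.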
In contrast, ALGOL 68 offered an extensive library of transput (input/output) facilities. ALGOL 60 allowed for two evaluation strategies for parameter passing: the common call-by-value, and call-by-name. Call-by-name has certain effects in contrast to call-by-reference. For example, without specifying the parameters as value or reference, it is impossible to develop a procedure that will swap the values of two parameters if the actual parameters that are passed in are an integer variable and an array that is indexed by that same integer variable.[23] Think of passing a pointer to swap(i, A[i]) in to a function. Now that every time swap is referenced, it is reevaluated. Say i := 1 and A[i] := 2, so every time swap is referenced it will return the other combination of the values ([1,2], [2,1], [1,2] and so on). A similar situation occurs with a random function passed as actual argument. Call-by-name is known by many compiler designers for the interesting "thunks" that are used to implement it. Donald Knuth devised the "man or boy test" to separate compilers that correctly implemented "recursion and non-local references." This test contains an example of call-by-name. ALGOL 68 was defined using a two-level grammar formalism invented by Adriaan van Wijngaarden and which bears his name. Van Wijngaarden grammars use a context-free grammar to generate an infinite set of productions that will recognize a particular ALGOL 68 program; notably, they are able to express the kind of requirements that in many other programming language standards are labelled "semantics" and have to be expressed in ambiguity-prone natural language prose, and then implemented in compilers as ad hoc code attached to the formal language parser. Examples and portability [edit] Code sample comparisons [edit] ALGOL 60 [edit] (The way the bold text has to be written depends on the implementation, e.g. 'INTEGER'—quotation marks included—for integer. This is known as stropping.) 
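Before the ALGOL source listings that follow, the call-by-name pitfall around swap(i, A[i]) described above can be made concrete by simulating thunks with closures. This is a Python sketch; the swap and env names are invented, and zero-argument getter/setter closures stand in for ALGOL's thunks:

```python
# Simulating ALGOL 60 call-by-name with explicit thunks: each argument is a
# pair of closures that re-evaluate the expression on every read and write.

def swap(x_get, x_set, y_get, y_set):
    tmp = x_get()       # read x
    x_set(y_get())      # x := y  -- y is re-evaluated here
    y_set(tmp)          # y := old x, but A[i] now sees the NEW value of i

env = {'i': 1, 'A': [9, 2, 7]}   # using 0-based indexing for simplicity

swap(lambda: env['i'],
     lambda v: env.__setitem__('i', v),
     lambda: env['A'][env['i']],                    # A[i], re-evaluated each use
     lambda v: env['A'].__setitem__(env['i'], v))

print(env['i'], env['A'])  # 2 [9, 2, 1]
```

A correct swap of i and A[i] would give i = 2 and A = [9, 1, 7]; instead, because A[i] is re-evaluated after i has changed, A[2] is clobbered and A[1] is untouched, which is exactly the failure the text describes.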
procedure Absmax(a) Size:(n, m) Result:(y) Subscripts:(i, k);
    value n, m; array a; integer n, m, i, k; real y;
comment The absolute greatest element of the matrix a, of size n by m, is copied to y, and the subscripts of this element to i and k;
begin
    integer p, q;
    y := 0; i := k := 1;
    for p := 1 step 1 until n do
        for q := 1 step 1 until m do
            if abs(a[p, q]) > y then
                begin y := abs(a[p, q]);
                    i := p; k := q
                end
end Absmax

Here is an example of how to produce a table using Elliott 803 ALGOL.[24]

FLOATING POINT ALGOL TEST'
BEGIN REAL A,B,C,D'
READ D'
FOR A:= 0.0 STEP D UNTIL 6.3 DO
BEGIN
    PRINT PUNCH(3),££L??'
    B := SIN(A)'
    C := COS(A)'
    PRINT PUNCH(3),SAMELINE,ALIGNED(1,6),A,B,C'
END
END'

ALGOL 68 [edit]

The following code samples are ALGOL 68 versions of the above ALGOL 60 code samples. ALGOL 68 implementations used ALGOL 60's approaches to stropping. In ALGOL 68's case tokens with the bold typeface are reserved words, types (modes) or operators.

proc abs max = ([,]real a, ref real y, ref int i, k)real:
comment The absolute greatest element of the matrix a, of size ⌈a by 2⌈a is transferred to y, and the subscripts of this element to i and k; comment
begin
    real y := 0; i := ⌊a; k := 2⌊a;
    for p from ⌊a to ⌈a do
        for q from 2⌊a to 2⌈a do
            if abs a[p, q] > y then
                y := abs a[p, q]; i := p; k := q
            fi
        od
    od;
    y
end # abs max #

Note: lower (⌊) and upper (⌈) bounds of an array, and array slicing, are directly available to the programmer.

floating point algol68 test: (
    real a,b,c,d;
    # printf – sends output to the file stand out. #
    # printf($p$); – selects a new page #
    printf(($pg$,"Enter d:"));
    read(d);
    for step from 0 while a:=step*d; a <= 2*pi do
        printf($l$); # $l$ - selects a new line. #
        b := sin(a);
        c := cos(a);
        printf(($z-d.6d$,a,b,c)) # formats output with 1 digit before and 6 after the decimal point.
# od )

Timeline: Hello world [edit]

The variations and lack of portability of the programs from one implementation to another is easily demonstrated by the classic hello world program.[citation needed]

ALGOL 58 (IAL) [edit]

Main article: ALGOL 58

ALGOL 58 had no I/O facilities.

ALGOL 60 family [edit]

Main article: ALGOL 60

Since ALGOL 60 had no I/O facilities, there is no portable hello world program in ALGOL. The next three examples are in Burroughs Extended Algol. The first two direct output at the interactive terminal they are run on. The first uses a character array, similar to C. The language allows the array identifier to be used as a pointer to the array, and hence in a REPLACE statement.

BEGIN
  FILE F(KIND=REMOTE);
  EBCDIC ARRAY E[0:11];
  REPLACE E BY "HELLO WORLD!";
  WRITE(F, *, E);
END.

A simpler program using an inline format:

BEGIN
  FILE F(KIND=REMOTE);
  WRITE(F, <"HELLO WORLD!">);
END.

An even simpler program using the Display statement. Note that its output would end up at the system console ('SPO'):

BEGIN DISPLAY("HELLO WORLD!") END.

An alternative example, using Elliott Algol I/O, is as follows. Elliott Algol used different characters for "open-string-quote" and "close-string-quote", represented here by ‘ and ’.

program HiFolks;
begin
  print ‘Hello world’
end;

Below is a version from Elliott 803 Algol (A104). The standard Elliott 803 used five-hole paper tape and thus only had upper case. The code lacked any quote characters so £ (UK pound sign) was used for open quote and ? (question mark) for close quote. Special sequences were placed in double quotes (e.g. ££L?? produced a new line on the teleprinter).

HIFOLKS'
BEGIN
  PRINT £HELLO WORLD£L??'
END'

The ICT 1900 series Algol I/O version allowed input from paper tape or punched card. Paper tape 'full' mode allowed lower case. Output was to a line printer.
The open and close quote characters were represented using '(' and ')' and spaces by %.[25]

'BEGIN'
  WRITE TEXT('('HELLO%WORLD')');
'END'

ALGOL 68 [edit]

Main article: ALGOL 68

ALGOL 68 code was published with reserved words typically in lowercase, but bolded or underlined.

begin
  printf(($gl$,"Hello, world!"))
end

In the language of the "Algol 68 Report" the input/output facilities were collectively called the "Transput".

Timeline of ALGOL special characters [edit]

The ALGOLs were conceived at a time when character sets were diverse and evolving rapidly; also, the ALGOLs were defined so that only uppercase letters were required.

1960: IFIP – The Algol 60 language and report included several mathematical symbols which are available on modern computers and operating systems, but, unfortunately, were unsupported on most computing systems at the time. For instance: ×, ÷, ≤, ≥, ≠, ¬, ∨, ∧, ⊂, ≡, ␣ and ⏨.

1961 September: ASCII – The ASCII character set, then in an early stage of development, had the \ (back slash) character added to it in order to support ALGOL's boolean operators /\ and \/.[26]

1962: ALCOR – This character set included the unusual "᛭" runic cross[27] character for multiplication and the "⏨" decimal exponent symbol[28] for floating point notation.[29][30][31]

1964: GOST – The 1964 Soviet standard GOST 10859 allowed the encoding of 4-bit, 5-bit, 6-bit and 7-bit characters in ALGOL.[32]

1968: The "Algol 68 Report" – used extant ALGOL characters, and further adopted →, ↓, ↑, □, ⌊, ⌈, ⎩, ⎧, ○, ⊥, and ¢ characters which can be found on the IBM 2741 keyboard with typeball (or golf ball) print heads inserted (such as the APL golf ball). These became available in the mid-1960s while ALGOL 68 was being drafted. The report was translated into Russian, German, French, and Bulgarian, and allowed programming in languages with larger character sets, e.g., Cyrillic alphabet of the Soviet BESM-4.
All ALGOL's characters are also part of the Unicode standard and most of them are available in several popular fonts.

2009 October: Unicode – The ⏨ (decimal exponent symbol) for floating-point notation was added to Unicode 5.2 for backward compatibility with historic Buran programme ALGOL software.[33]

Legacy

A significant contribution of the ALGOL 58 Report was to provide standard terms for programming concepts: statement, declaration, type, label, primary, block, and others.[34]

Further reading

O'Hearn, Peter; Tennent, Robert D., eds. (1997). Algol-like Languages. Vol. 1. Cambridge, MA: Birkhauser. doi:10.1007/978-1-4612-4118-8. ISBN 9780817638801. The first volume of a two-volume set that includes an introduction from Peter W. O'Hearn and Robert D. Tennent, a report on ALGOL 60 from Peter Naur and colleagues, four chapters from John C. Reynolds, and other chapters from Christopher Strachey, Matthias Felleisen, Stephen Weeks, Albert R. Meyer, Kurt Sieber, Vipin Swarup, Uday S. Reddy, and Evan Ireland.

Baumann, Richard; Feliciano, Manuel; Bauer, Friedrich Ludwig; Samelson, Klaus (1964). Introduction to ALGOL – A Primer for the Non-Specialist, Emphasizing the Practical Uses of the Algorithmic Language. Automatic Computation. Englewood Cliffs, NJ: Prentice-Hall. ISBN 0134778286. LCCN 64-10740.

Randell, Brian; Russell, L. J. (1964). ALGOL 60 Implementation: The Translation and Use of ALGOL 60 Programs on a Computer. Academic Press. On the design of the Whetstone Compiler, and one of the early published descriptions of implementing a compiler. For a related paper, see Brian Randell's Whetstone Algol Revisited and The Whetstone KDF9 Algol Translator.

Dijkstra, E. W. (1961). ALGOL 60 Translation: An ALGOL 60 Translator for the X1 and Making a Translator for ALGOL 60. Report MR 35/61. Amsterdam: Mathematisch Centrum.

Kruseman Aretz, Frans E. J. The Dijkstra–Zonneveld ALGOL 60 Compiler for the Electrologica X1. Historical note SEN 2. Amsterdam: Centrum voor Wiskunde en Informatica.

Goos, Gerhard (2017). Geschichte der Deutschsprachigen Informatik – Programmiersprachen und Übersetzerbau [History of Informatics in German-Speaking Countries – Programming Languages and Compiler Design] (in German). Karlsruhe: Fakultät für Informatik, Karlsruhe Institute of Technology.
4768
dbpedia
3
61
https://search.proquest.com/openview/ba7cc75505a08f3b4c31a667900611b3/1%3Fpq-origsite%3Dgscholar%26cbl%3D1817620
J. F. RATCLIFFE: "Algol in Brief: a short practical guide to Computer Programming in Algol" (Book Review)
https://fortran-lang.discourse.group/t/john-backus-1924-2007-fortrans-father/873
John Backus (1924-2007), Fortran's father
2021-03-21T17:21:27+00:00
Fortran Discourse
Following @beliavsky 's post, I think that John Backus deserves his own post in our Discourse. And knowledge about the birth of Fortran can help the community to promote the language: as for human beings, many qualities and defects of Fortran come from its early years.

Speedcoding (1953)

In 1953, John Backus worked on the IBM 701 Speedcoding system, a simple interpreter to ease computing with floating-point numbers (the IBM 704 would be the first to have built-in floating-point arithmetic). In those early days, programmers were programming in machine language, a tedious task. Backus says that the cost of a computation was equally divided between the operating cost of the computer and the personnel cost [1]. Speedcoding reduced that cost by accelerating the programming and testing of programs, although it ran far slower than machine language. At that time, other people were working on the problem too. For example, Grace Hopper completed in 1952 what is considered the first compiler: the A-0 Compiler.

Fortran (1954-1958)

In December 1953, John Backus wrote a letter to his boss to propose the Fortran project, for the new IBM 704. It was motivated by the same economic reasons as Speedcoding and was funded by IBM without questions for that reason. Interestingly, FORTRAN was described as composed of two parts: the language itself and the translator. In 1957, Backus still did not use the word "compiler" [2], although he used the verb "to compile". In [3] he insists that their objective was neither to translate quickly nor to elaborate features of the language, but to obtain very fast object programs, working especially on the optimization of loops and the way array indexes were treated. In [4] he explains that most people were skeptical about the ability of a computer program to generate machine code as fast as human beings could.
That's why his group wanted to create a compiler that would produce optimized machine code whatever the problem, fearing that if a particular problem ran much slower than a human being's code, Fortran would be rejected. Hence this interesting sentence, which tells much about Fortran: "Because of our 1954 view that success in producing efficient programs was more important than the design of the Fortran language…" And: "As far as we were aware, we simply made up the language as we went along. We did not regard language design as a difficult problem, merely a simple prelude to the real problem: designing a compiler that could produce efficient programs." While people working on other early languages were focused on their language, Backus' group was focused on the translator program (the compiler). It was already quite sophisticated: there was even a compiling phase investigating the frequency of execution of each part of the program, using a Monte Carlo algorithm. Backus even claimed that it remained the best compiler for more than 20 years. Note also that they were working for the IBM 704, not even imagining that Fortran would one day run on other computers. The compiler was ready in April 1957 and distributed to all IBM 704 installations. Interestingly, they first tried to distribute it using punched cards, but failed and finally used magnetic tapes… Backus also insists on the importance of the writing of the "Programmer's Primer" (Spring 1957) by Mitchell, the first Fortran tutorial, for the subsequent growth in the use of Fortran.

After Fortran

While working on Algol, he worked on the Backus–Naur form (Backus–Naur form - Wikipedia) to describe the syntax of the language. He would also work on functional programming. In the conclusion of [4], he wrote: "By splitting programming into a world of expressions, on the one hand, and a world of statements, on the other, von Neumann languages prevent the effective use of higher-level combining forms".
"The world of expressions has some elegant and useful mathematical properties, whereas the world of statements is a disorderly one without useful mathematical properties." In 1977, he received the A.M. Turing Award (the equivalent of the Nobel Prize for computer science), "for profound, influential, and lasting contributions to the design of practical high-level programming systems, notably through his work on FORTRAN, and for seminal publication of formal procedures for the specification of programming languages": https://amturing.acm.org/award_winners/backus_0703524.cfm

[1] J. Backus, "The IBM 701 Speedcoding System," J. ACM, vol. 1, no. 1, 1953, pp. 4–6. https://archive.computerhistory.org/resources/access/text/2018/02/102632810-05-01-acc.pdf

[2] Backus, J. W., H. Stern, I. Ziller, R. A. Hughes, R. Nutt, R. J. Beeber, S. Best, et al. 'The FORTRAN Automatic Coding System'. In Papers Presented at the February 26-28, 1957, Western Joint Computer Conference: Techniques for Reliability, IRE-AIEE-ACM '57 (Western), 188–98. Los Angeles, California: ACM Press, 1957. https://doi.org/10.1145/1455567.1455599. http://archive.computerhistory.org/resources/text/Fortran/102663113.05.01.acc.pdf

[3] Backus, J. W., and W. P. Heising. 'Fortran'. IEEE Transactions on Electronic Computers EC-13, no. 4 (August 1964): 382–85. https://doi.org/10.1109/PGEC.1964.263818. http://www.softwarepreservation.org/projects/FORTRAN/paper/BackusHeising-FORTRAN-1964.pdf

[4] Backus, J. 'The History of Fortran I, II, and III'. IEEE Annals of the History of Computing 20, no. 4 (December 1998, first published in July 1979): 68–78. https://doi.org/10.1109/85.728232. http://www.softwarepreservation.org/projects/FORTRAN/paper/p165-backus.pdf

Reading that interview was a strange experience. It was made in 2006, when J. Backus was approaching 81, one year before he passed away.
He did not seem a very talkative man, but he was quite fun and humble, not taking the achievements of his life too seriously.

"Programming is a pretty low-level enterprise"

I have not learned more about the beginnings of Fortran, but I was interested to discover that he did not seem to be someone passionate about programming, as we may think. On the contrary, my feeling is that his work was, in a certain sense, mainly about getting rid of programming! Speedcoding and Fortran were about getting rid of machine language programming. And I feel the effort of the people coding fpm and the stdlib is similar: getting rid of reinventing the wheel, a quite common syndrome in our Fortran world. His work on functional programming (or rather function-level programming) was an attempt to get rid of the von Neumann programming style: "Basically, the idea was to try to describe the transformation that you wanted to take place, rather than how to do it." He seemed rather a mathematical mind (he held a Master's in Mathematics) and was more proud of his work on what he called functional programming than of Fortran:

- "Well, I guess the question of it still seems that programming is a pretty low-level enterprise, and that somebody ought to be thinking about how to make it higher; really higher level than it is."
- "Actually that functional programming was an effort to try to go up a level, so that you didn't have to keep saying how to do everything, but rather say what you wanted done."

A wise man's conclusion

Interviewer: "Is the world a better place because of all the software that's been written in your lifetime, or not?"

Backus: "Well, in human terms, probably not. Because it just takes us further and further away from human affairs. But as far as economic, and welfare, it's done a lot of good. So it's a mixed bag. … But I don't envy you, I'm afraid.
I think that we’re getting more and more technological and less and less human oriented.” His advice to young people in High School: “Well, don’t go into software. It’s just such a complicated mess that you just frazzle your brains trying to do anything worthwhile.” And I am not sure it’s a joke… Half a joke?
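Backus's idea of saying "what you wanted done" rather than "how to do it" can be sketched in a few lines. This is my own Python illustration (not Backus's FP notation), contrasting the two styles on the same computation:

```python
from functools import reduce

def sum_of_squares_how(xs):
    # Imperative / von Neumann style: spell out HOW, one step at a time,
    # mutating an accumulator variable.
    total = 0
    for x in xs:
        total += x * x
    return total

def sum_of_squares_what(xs):
    # Function-level style: say WHAT is wanted by composing transformations:
    # square every element, then fold the results together with addition.
    return reduce(lambda acc, sq: acc + sq, map(lambda x: x * x, xs), 0)

print(sum_of_squares_how([1, 2, 3]), sum_of_squares_what([1, 2, 3]))  # 14 14
```

The second version has no assignment statements at all; it lives entirely in Backus's "world of expressions".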
http://www.cs.williams.edu/~kim/cs334/s00/Lectures/Lec1/Lec1.html
Williams College CS334
CS 334 Programming Languages
Spring 2000
Lecture 1

Do programming languages matter? Absolutely!!! Otherwise, why would the Java folks have rejected C++ for their programming language?

Explore in this course all aspects of programming languages, including their features, type systems, programming styles supported, and implementations. Programming languages are the main interface between computers and programmers, allowing us to express or understand algorithms to be executed by the computer.

Abstractions

By providing abstractions (or mechanisms to create abstractions) they influence the way we think about problems.

Data abstractions:
- basic data types: integers, reals, booleans, characters, pointers
- structured: arrays, records
- unit: e.g., support for ADTs, modules, packages, classes

Control abstractions:
- basic: assignment, goto, sequencing
- structured: if..then..else.., loops, procedures & functions
- unit: separately compiled units, modules, packages, concurrent tasks

Syntax and Semantics

For all constructs need to have clearly specified syntax and semantics:
- syntax: what is a legal expression?
- semantics: what is the result when it is executed?

Syntax always given formally (as well as informally). Semantics usually given informally (English), but more and more formally. Both necessary in order to ensure programs give predictable results.

How can programming languages support the software development process?

Phases in the development process are:
- Requirements
- Specification
- Implementation
- Certification or Validation (includes testing and verification)
- Maintenance

Need to evaluate languages with respect to overall picture. Not good if just supports one aspect. Important to evaluate language based on what its goals are.
- BASIC: quick development of interactive programs
- Pascal: instruction
- C: low-level systems programming
- FORTRAN: number crunching scientific computation

Languages which are good for quick hacking together of programs may not be suitable for large-scale software development. Better choices for large-scale software include: Ada, Modula-2, Clu, object-oriented languages like C++, Eiffel, etc.

I heard two weeks ago at a conference in Boston that 50% of all security problems reported to CERT (the center charged with reporting security problems and solutions) are due to one particular feature of one specific programming language. Given the increasing interest in programs communicating over the internet, one has to wonder why those interested in security still write in that language.

Most languages today designed to support specific software development methodology or philosophy. E.g., top-down design, object-based design, encapsulation, information-hiding. Languages influence the way people think about the programming process. Faculty always complain about BASIC hackers.

Minimum requirements for programming languages:
- Universal: if can solve on computer then can program it in the language
- Natural (expressive): easy to express ideas
- Implementable
- Efficient (for writing, compilation, or execution?)
- Reliable: writeable (high-level), readable, ability to deal with exceptional behavior
- Maintainable: easy to make changes, decisions compartmentalized, clean interfaces

Alternative Programming Language Paradigms

Important to be aware of different programming language paradigms; they allow one to think about problems in different ways. Partially driven by new architectures (or at least not constrained by old ones). The procedural paradigm (C, Pascal, FORTRAN, Ada, Modula-2) is closest to the machine architecture.

Other paradigms:
- functional: popular in AI (LISP & Scheme) and the theoretical prog. lang. community. Closest to theoretical models and mathematics.
- logic: originally of interest in AI (though fading). Also important as database query languages. Now shifting to "constraint" languages. Declarative; control is implicit in the search.
- object-oriented: latest fad. All major systems must be "object-oriented" to be up-to-date. Important, but current languages have some flaws. Based on discrete simulation; objects are responsible for knowing what to do. Package state with appropriate operations.

History of Programming Languages

Machine language -> Assembly language -> High-level languages
Programmers: single highly trained programmer -> large teams which must cooperate

Early

- FORTRAN: Backus @ IBM, 1957, 18 person-years to write first compiler. Goals & contributions: numerical problems, very efficient compiler, separate compilation. Many revisions: FORTRAN II, IV, 66, 77, 90.
- ALGOL 60: committee, 1960. Goals & contributions: numeric, block structure, recursion, elegant, very influential. Ancestor of ALGOL W, ALGOL 68, Pascal, Modula, Ada, etc.
- COBOL: committee, 1960. Goals & contributions: business data processing, records. Several revisions.

Early Schisms

- LISP: McCarthy @ MIT, 1962, core is functional. Goals & contributions: list processing and symbolic manipulation, AI. Scheme & Common LISP are modern descendants.
- APL: Iverson @ Harvard, IBM, 1960 (for notation), calculator language for array computations.
- SNOBOL 4: Griswold @ Bell Labs, 1966, string processing via pattern matching; modern successor is ICON.

Consolidation

- PL/I: IBM committee, 1967. Combine FORTRAN, COBOL, ALGOL 60 - but not integrated; now considered a failure. Multipurpose - includes ptrs, records, exceptions, etc.
Next Leap Forward

- ALGOL 68: committee; "orthogonal" elements; elegant but very hard to understand.
- Simula 67: precursor of object-oriented languages; designed for simulation; coroutines.
- Pascal: Wirth, ETH, 1971; designed only as a teaching language. Supports structured programming, spare & elegant. Successful beyond expectations.

Abstract Data Types

Clu, Mesa, Modula-2, Ada. Support modules for encapsulation and information hiding.

Other paradigms

- Object-oriented: Smalltalk (1972), Eiffel, C++, Object Pascal, Java
- Functional: Scheme (1985?), ML (1978), Miranda (1986) & Haskell (1991)
- Logic: PROLOG (1972), newer constraint programming languages

4th generation languages: important in business applications. Specialized packages of powerful commands w/ simplified "user-friendly" syntax.

Topics to be covered in course:
- Programming language features and organization, including modules, classes, exception handlers, generic types; functional and object-oriented, as well as procedural paradigms. (No longer consider logic programming important enough to spend time on.)
- Programming language support for reliable programming: abstraction, encapsulation, information hiding, polymorphism, higher-order operators.
- Language support for concurrency
- Formal definitions of programming languages
- Compilers and interpreters
- Run-time behavior of programming languages, including impact of binding time.

Three main concerns:
- Programming language features for reliable programming
- Run-time behavior of programming languages
- Formal semantics and interpreters for implementing languages

Start out by learning ML so we can explore some new ideas & rapidly program interesting applications. Write our own interpreters for simple languages so we can see the impact of various design decisions.
Functional Languages

Problems with imperative (command-based) languages

In his 1978 Turing award lecture (granted in recognition of his role in the development of FORTRAN, ALGOL 60, and BNF grammars), John Backus attacked the pernicious influence of imperative programming languages and their dependence on the von Neumann architecture.

What is the problem with imperative languages? Designed around architectures available in the 1950's. Components:
- CPU with accumulator and registers
- Memory
- Tube connecting CPU with Memory which transmits one word at a time

To execute an instruction, go through the fetch, decode, execute cycle. Ex. To execute the statement stored in location 97 (ADD 162):
1. Fetch instruction from memory location 97 (to CPU)
2. Decode into operation (ADD) and address (162)
3. Fetch contents of address
4. Add contents to accumulator and leave result in accumulator

A simple statement like A:=B+C results in several accesses to memory through the "von Neumann bottleneck." An imperative program can be seen as control statements guiding execution of a series of assignment statements (accesses and stores to memory).

A variable in a programming language refers to a location whose contents may vary with time. Hard to reason about variables whose values are always changing, even within the same procedure or function. Math notation is not like that: static. If want to add time, add a new parameter. Gives static reasoning about dynamic processes.

Important notion called referential transparency: can replace an expression anywhere that it occurs by its value. Very important for parallelism, since compute once and then reuse. Not true of imperative languages: can't compute x+1 once and replace all occurrences by its value. Order of execution in imperative programs very important - inhibits parallel execution.

We will see several advantages of functional programming:
- Referentially transparent - easier to reason about, easier to make parallel. Once an expression is evaluated, it can be reused.
- Order of execution need not be specified. Expressions can automatically be executed when needed, even in parallel.
- Higher-level, resulting in shorter, more understandable programs.
- Can build new higher-order functions which allow you to put together old programs in more flexible ways.
- "Lazy evaluation" can allow one to compute with infinite objects.

Other important reasons to study functional languages:
- Useful in AI research
- Useful in developing executable specifications and prototype implementations
- Closely related to CS theory (e.g., recursive functions, denotational semantics)
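The lazy-evaluation advantage can be demonstrated concretely. A minimal sketch in Python (not one of the course languages), where generator expressions evaluate only on demand, so an "infinite object" is safe to define:

```python
from itertools import count, islice

# An "infinite object": the stream of all squares 1, 4, 9, ...
# Nothing is computed at this point; the generator is lazy.
squares = (n * n for n in count(1))

# Demand only a finite prefix; the rest of the stream is never built.
first_five = list(islice(squares, 5))
print(first_five)  # [1, 4, 9, 16, 25]
```

The same idea underlies lazy functional languages such as Miranda and Haskell, where laziness is the default rather than opt-in.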
https://retrocomputing.stackexchange.com/questions/9732/what-was-the-first-language-to-offer-full-structured-programming-support
What was the first language to offer "full" structured programming support?
R. Schmitz
2019-04-18T12:49:48
In the same vein as this question about conditionals, I am interested in the first time a language provided functionality we assume as a given for modern programming languages. The question is when a
Retrocomputing Stack Exchange
A couple of misconceptions in this one: Recursion is just another kind of iteration. It was in fact the only iteration mechanism provided in early versions of Lisp. Any use of recursion can be transformed into standard looping iteration (although many such transformations require use of a stack as well). Recursion can be viewed as kind of a hack to leverage the program stack for your stack usage while iterating (in languages that use a program stack). Likewise, early Lisp demonstrated that it's possible (although not always convenient) to use recursion for all your iteration needs. I have never before heard that support for recursion is required for a language to be considered a "structured language". Really all that is required is that the language be Turing complete without relying on "unstructured" branch statements (aka GOTOs).

Now with all these caveats, most of the early "Structured Programming" developers were also at the same time involved in the development and use of ALGOL. This includes most prominently Dijkstra, who was the leading light in the Structured Programming movement and developed the first ALGOL 60 compiler. Hoare and Dahl were also very involved in the development of both. The other major languages extant at the time were COBOL and FORTRAN. Early FORTRAN in particular made extensive use of GOTOs, along with data overlays that are essentially the data equivalent of GOTOs, and thus a large amount of the development of ALGOL was in reaction to FORTRAN. So if you are looking for the early standard-bearer for the paradigm, ALGOL is probably your language.

This is a copy-paste of my answer to the other question. R. Schmitz said I could do it though. If we are talking about a block-structured conditional like

if condition
    some arbitrary sequence of statements including perhaps nested ifs
else
    some other arbitrary sequence of statements including perhaps nested ifs

rather than just

if ... goto some label

which is no different to a conditional branch, then two early contenders would be:

- Lisp (1958), which has an if and cond function. I think cond predates if. When I learned Lisp in the 1980's I'm fairly sure that if wasn't there. Lisp also had recursion, which means it had iteration by definition.
- Algol 60 (1960), which has the structured if inherited by most modern imperative languages. Also while and for loops and recursion.

Early versions of FORTRAN and COBOL did not have structured if statements (as far as I know) or recursion.
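The claim that any recursion can be transformed into standard looping iteration, possibly with an explicit stack, can be seen in a small example. This is an illustrative Python sketch (the function names are mine, and factorial is simple enough that the stack could be dropped entirely, but the pattern generalizes to recursions that genuinely need one):

```python
def factorial_recursive(n):
    # Recursion: the program stack implicitly remembers the pending multiplies.
    return 1 if n == 0 else n * factorial_recursive(n - 1)

def factorial_iterative(n):
    # Same computation with loops plus an explicit stack, mirroring the
    # mechanical recursion-to-iteration transformation.
    stack = []
    while n > 0:
        stack.append(n)   # "unwind": defer each multiplication
        n -= 1
    result = 1
    while stack:
        result *= stack.pop()  # "rewind": apply the deferred work
    return result

print(factorial_recursive(6), factorial_iterative(6))  # 720 720
```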
https://michaeltansy.wordpress.com/2013/03/18/concept-of-programming-language-2-evolution-of-the-major-programming-language/
Concept of Programming Language 2 (Evolution of the Major Programming Language)
2013-03-18T00:00:00
michaeltansy
REVIEW QUESTIONS

1. In what year was Plankalkul designed? In what year was that design published?
Plankalkul was designed by Konrad Zuse between 1942 and 1945. He first published a paper on it in 1948. More information on the language was published in 1972.

2. Mention an interesting feature of Zuse's programs.
One of the interesting features of Zuse's programs is the inclusion of mathematical expressions showing the current relationships between program variables.

3. What does Plankalkul mean?
Plankalkul means program calculus.

4. Speedcoding was invented to overcome two significant shortcomings of the computer hardware of the early 1950s. What were they?
The first hardware shortcoming was the lack of floating-point hardware. The second was the lack of automatic incrementing address registers.

5. What is the number of bits in a single word of the UNIVAC I's memory? How are the bits grouped?
The number of bits is 72. They are grouped into twelve 6-bit bytes.

6. What hardware capability that first appeared in the IBM 704 computer strongly affected the evolution of programming languages? Explain why.
Its capabilities prompted the development of Fortran because it supported floating-point operations in hardware.

7. Who developed the Speedcoding system for the IBM 701?
Speedcoding was developed by John Backus.

8. Who developed Short Code? Why is Short Code called automatic programming?
Short Code was developed by John Mauchly in 1949. Short Code is called automatic programming because it was not translated to machine code; it was implemented with a pure interpreter.

10. What was the most significant feature added to Fortran I to get Fortran II?
Independent-compilation capability.

11. What control flow statements were added to Fortran IV to get Fortran 77?
Logical loop statements and IF with an optional ELSE.

12. Which version of Fortran was the first to have any sort of dynamic variables?
Fortran 90.

13. Which version of Fortran was the first to have character string handling?
Fortran 77.

14. Why were linguists interested in artificial intelligence in the late 1950s?
Linguists were concerned with natural language processing.

15. What are the different data types and structures in Common LISP?
Common LISP has a large number of data types and structures, including records, arrays, complex numbers, and character strings. It also has a form of packages for modularizing collections of functions and data, providing access control.

16. In what way are Scheme and Common LISP opposites of each other?
Common LISP allows both static scoping and dynamic scoping, while Scheme only uses static scoping. Scheme is relatively small, while Common LISP is large and complex.

17. What dialect of LISP is used for introductory programming courses at some universities?
Scheme.

18. What two professional organizations together designed ALGOL 60?
ACM and GAMM.

19. What was the goal for developing C?
C has adequate control statements and data-structuring facilities to allow its use in many application areas. It also has a rich set of operators that provide a high degree of expressiveness.

20. What were the significant modifications to ALGOL 58 to produce ALGOL 60?
The concept of block structure was introduced, two different means of passing parameters to subprograms were allowed, procedures were allowed to be recursive, and stack-dynamic arrays were allowed.

21. What language was designed to describe the syntax of ALGOL 60?
BNF.

22. On what language was COBOL based?
FLOW-MATIC.

23. In what year did the COBOL design process begin?
1959.

24. What data structure that appeared in COBOL originated with Plankalkul?
Hierarchical data structures (records).

25. What organization was most responsible for the early success of COBOL (in terms of extent of use)?
The Department of Defense (DoD).

27. Why was BASIC an important language in the early 1980s?
Its smaller dialects could be implemented on computers with very small memories.

28. PL/I was designed to replace what two languages?
COBOL and Fortran.

29. For what new line of computers was PL/I designed?
The IBM System/360 line of computers.

30. What features of SIMULA 67 are now important parts of some object-oriented languages?
Data abstraction.

31. What innovation of data structuring was introduced in ALGOL 68 but is often credited to Pascal?
User-defined data types.

32. What design criterion was used extensively in ALGOL 68?
Orthogonality.

33. What language introduced the case statement?
The case statement was introduced in ALGOL-W, a version of ALGOL 60 created by Niklaus Wirth and C. A. R. (Tony) Hoare.

34. What operators in C were modeled on similar operators in ALGOL 68?
The for and switch statements, the assigning operators, and the treatment of pointers.

35. What are two characteristics of C that make it less safe than Pascal?
Lack of complete type checking and flexibility.

37. What are the two kinds of statements that populate a Prolog database?
The statements that populate a Prolog database are called facts and rules.

38. What is the primary application area for which Ada was designed?
Embedded systems.

39. What are the concurrent program units of Ada called?
Tasks (using the rendezvous mechanism).

42. What three concepts are the basis for object-oriented programming?
Classes, objects and methods.

43. Why does C++ include the features of C that are known to be unsafe?
A goal of C++ was that it could be used for everything for which C could be used, so virtually none of the features of C were removed, not even those considered to be unsafe.

44. From what language does Objective-C borrow its syntax for method calls?
The Smalltalk language.

45. What programming paradigm that nearly all recently designed languages support is not supported by Go?
Go doesn't support traditional object-oriented programming, as it has no form of inheritance.

PROBLEM SET

1. What features of Fortran IV do you think would have had the greatest influence on Java if the Java designers had been familiar with Fortran?
The logical data type (boolean), used mainly for the control expressions of its control statements (such as if and while). We could also create a simple version of the complex compile and link processes of earlier compilers.

2. Determine the capabilities of Short Code, and compare them with those of a contemporary programmable hand calculator.
Short Code is capable of performing multiplication without a multiplication code being needed: just put the two operands side by side. It also uses a pure interpreter to run itself. Short Code consists of a coded version of the mathematical expression that is to be evaluated. A contemporary programmable calculator is able to do multiplication with its embedded programming language. People can solve arithmetic problems more easily than with Short Code, but it is more difficult to make further improvements to the programmable hand calculator than to Short Code.
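The side-by-side multiplication described in Problem 2 can be mimicked with a toy pure interpreter. Everything below is an assumption for illustration (the space-separated syntax, the `interpret` helper, and restricting input to sums of products); real Short Code used two-character numeric codes, not this notation:

```python
def interpret(expr, env):
    # Toy pure interpreter: evaluate e.g. "a b + 2 c" as a*b + 2*c.
    # Adjacent operands (separated by spaces) are multiplied implicitly,
    # echoing Short Code's side-by-side multiplication; the expression is
    # re-evaluated on every call, with no translation to machine code.
    total = 0.0
    for term in expr.split("+"):
        product = 1.0
        for token in term.split():
            product *= env[token] if token in env else float(token)
        total += product
    return total

print(interpret("a b + 2 c", {"a": 3, "b": 4, "c": 5}))  # 22.0
```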
4768
dbpedia
2
41
https://www.scribd.com/document/348730802/Genealogy-of-Some-Programming-Languages
en
Genealogy of Some Programming Languages
https://imgv2-1-f.scribdassets.com/img/document/348730802/original/9a82631067/1723667656?v=1
https://imgv2-1-f.scribdassets.com/img/document/348730802/original/9a82631067/1723667656?v=1
[ "https://s-f.scribdassets.com/webpack/assets/images/shared/gr_table_reading.9f6101a1.png" ]
[]
[]
[ "" ]
null
[ "Erezina Odomero Oyegwa" ]
null
Genealogy of Some Programming Languages - Free download as Word Doc (.doc / .docx), PDF File (.pdf), Text File (.txt) or read online for free. This document outlines the genealogy and evolution of the FORTRAN and PASCAL programming languages from 1957 to 2008. It describes the key features and merits and demerits of each version. FORTRAN began in 1957 and has continued evolving with new versions added every 10-20 years. PASCAL was influenced by early versions of FORTRAN and ALGOL and was created in 1971 by Niklaus Wirth.
en
https://s-f.scribdassets.com/scribd.ico?6e29186df?v=5
Scribd
https://www.scribd.com/document/348730802/Genealogy-of-Some-Programming-Languages
4768
dbpedia
2
77
https://www.collegesidekick.com/study-docs/14981012
en
[]
[]
[]
[ "" ]
null
[]
null
null
4768
dbpedia
1
22
https://unacademy.com/content/bank-exam/study-material/computer-knowledge/algol/
en
Notes on ALGOL
https://unacademy.com/co…1/08/favicon.png
https://unacademy.com/co…1/08/favicon.png
[]
[]
[]
[ "" ]
null
[]
2022-05-19T13:19:40+00:00
Complete summary of ALGOL. Features and facts of ALGOL. Brief info on ALGOL 58, ALGOL 60 and ALGOL 68; their implementations.
en
https://unacademy.com/co…1/08/favicon.png
Unacademy
https://unacademy.com/content/bank-exam/study-material/computer-knowledge/algol/
ALGOL is a computer programming language that was created in 1958–60 by an international committee of the Association for Computing Machinery (ACM) chaired by Alan J. Perlis of Carnegie Mellon University for publishing algorithms and performing calculations. ALGOL, like LISP, included recursive subprograms: procedures that could be used to solve a problem by reducing it to a smaller problem of the same type. ALGOL stands for algorithmic language and was previously known as IAL. It is a family of portable programming languages for scientific computation that influenced many other languages. For nearly 30 years, the ACM used ALGOL as the standard notation for presenting algorithms in textbooks and academic publications. Features of ALGOL: BNF Notation – Backus-Naur Form is a formal means for describing the syntax of a computer language. Think of it as shorthand for computer languages. Statement Blocks – a syntactic means for grouping a series of programming statements together. Think of this like putting a group of related items in a box and treating the box as a unit. Parameter Passing – specifically, two methods for passing information to and from a subprogram: pass by value and pass by reference. Think of these as ways to protect information entering a subprogram from modification: one sends a copy, the other sends the actual data. Dynamic Arrays – lists of values whose size can be determined at run time. Think of these as flexible arrays, arrays that adjust to need. Reserved Words – predefined words that have special meaning to a computer language. Think of these like your name: it represents you and the things you can do. Major ALGOL specifications: ALGOL 58 ALGOL 58, formerly known as IAL, is a member of the ALGOL programming language family. An early compromise design, it was quickly superseded by ALGOL 60. 
The core concept of the compound statement was introduced in ALGOL 58, but it was limited to control flow only and was not bound to identifier scope the way ALGOL 60's blocks were. According to Perlis, the language was first offered as IAL (International Algebraic Language), but this was rejected as an "unspeakable" and pretentious acronym. IBM attempted to implement ALGOL 58, but development was soon abandoned due to rivalry with FORTRAN. BALGOL, a version for the Burroughs 220, evolved along similar lines but preserved much of ALGOL 58's original character. The major contribution of ALGOL 58 was to later languages; it served as the foundation for JOVIAL, MAD, NELIAC, and ALGO. ALGOL 58's influence on ALGOL 60: Both IAL and ALGOL 60 support arrays with arbitrary lower and upper subscript bounds, as well as integer expressions for subscript bounds. Procedure declarations and their related identifier scopes can be nested in both IAL and ALGOL 60. IAL allows numeric statement labels, whereas ALGOL 60 does not. A switch designator exists in both IAL and ALGOL 60, although it is unrelated to the switch statement of C and later languages. The IAL if-statement has no then-clause or else-clause; instead, it guards the next statement. IAL also has an if-either statement that allows several conditions to be tested in a neat way. Both were superseded by ALGOL 60's if-then construct, with its ambiguous "dangling else". ALGOL 60 ALGOL 60 (short for Algorithmic Language 1960) is a computer programming language from the ALGOL family. It followed ALGOL 58, which had introduced code blocks and the begin and end pairs for delimiting them, marking a significant step forward in the development of structured programming. ALGOL 60 was the first language to provide nested function declarations with lexical scope. Many other programming languages arose from it, including CPL, Simula, BCPL, B, Pascal, and C. 
ALGOL 60 principles were used to program almost every computer of the time. For parameter passing, ALGOL 60 provides two evaluation strategies: call-by-value and call-by-name. For each formal parameter, the procedure declaration stated which was to be used: value was specified for call-by-value and omitted for call-by-name. In contrast to call-by-reference, calling by name has certain implications. ALGOL 68 implementations: The Royal Radar Establishment in the United Kingdom presented ALGOL 68-R in July 1970 as the first implementation of the standard, based on the late-1968 draft Report. It was, however, only a subset of the language. Popular on ICL computers, it became a widely used language in military coding, particularly in the United Kingdom. CDC Netherlands released the first complete implementation of the language in 1974 for the Control Data mainframe series. There are three open-source ALGOL 68 implementations: a68g (Algol 68 Genie), written by Marcel van der Veer and released under the GPLv3; algol68toc, an open-source port of ALGOL 68RS; and an experimental ALGOL 68 front end for GCC written by Jose E. Marchesi. Conclusion: ALGOL was the first computer programming language to provide nested function declarations with lexical scope. Furthermore, it was the first programming language to pay close attention to formal language definition, and, through the ALGOL 60 Report, it introduced Backus–Naur Form, a key formal grammar notation for language design.
4768
dbpedia
2
20
https://www.coursehero.com/file/110895022/Chapter2-Solutionsdocx/
en
[]
[]
[]
[ "" ]
null
[]
null
null
4768
dbpedia
1
63
https://shape-of-code.com/2020/05/17/happy-60th-birthday-algol-60/
en
The Shape of Code
[ "https://secure.gravatar.com/avatar/6f16b09c89dc3243d18e59949bbe2ed6?s=32&d=mm&r=g", "https://secure.gravatar.com/avatar/fb1a3d04e0127e29fb6a2dbbe4560416?s=32&d=mm&r=g", "https://secure.gravatar.com/avatar/fb1a3d04e0127e29fb6a2dbbe4560416?s=32&d=mm&r=g" ]
[]
[]
[ "" ]
null
[ "Derek Jones", "John Carter", "Ian Joyner" ]
2020-05-17T00:00:00
https://shape-of-code.com/2020/05/17/happy-60th-birthday-algol-60/
4768
dbpedia
0
97
https://michaeltansy.wordpress.com/2013/03/18/concept-of-programming-language-2-evolution-of-the-major-programming-language/
en
Concept of Programming Language 2 (Evolution of the Major Programming Language)
https://s0.wp.com/i/blank.jpg
https://s0.wp.com/i/blank.jpg
[ "http://s10.flagcounter.com/count/3Ur/bg_FFFFFF/txt_000000/border_CCCCCC/columns_2/maxflags_10/viewers_0/labels_1/pageviews_1/flags_1/", "https://s2.wp.com/i/logo/wpcom-gray-white.png", "https://s2.wp.com/i/logo/wpcom-gray-white.png", "https://pixel.wp.com/b.gif?v=noscript" ]
[]
[]
[ "" ]
null
[]
2013-03-18T00:00:00
REVIEW QUESTION 1. In what year was Plankalkul designed? In what year was that design published? Plankalkul was designed by Konrad Zuse between 1942 & 1945. He first published a paper on it in 1948. More information on the language was published in 1972. 2. Mention an interesting feature of Zuse’s programs. One of the…
en
https://s1.wp.com/i/favicon.ico
michaeltansy
https://michaeltansy.wordpress.com/2013/03/18/concept-of-programming-language-2-evolution-of-the-major-programming-language/
REVIEW QUESTION 1. In what year was Plankalkul designed? In what year was that design published? Plankalkul was designed by Konrad Zuse between 1942 and 1945. He first published a paper on it in 1948. More information on the language was published in 1972. 2. Mention an interesting feature of Zuse’s programs. One interesting feature of Zuse’s programs is the inclusion of mathematical expressions showing the current relationships between program variables. 3. What does Plankalkul mean? Plankalkul means program calculus. 4. Speedcoding was invented to overcome two significant shortcomings of the computer hardware of the early 1950s. What were they? The first hardware shortcoming was the lack of floating-point hardware. The second was the lack of automatic incrementing of address registers. 5. What is the number of bits in a single word of the UNIVAC I’s memory? How are the bits grouped? The number of bits is 72, grouped into twelve 6-bit bytes. 6. What hardware capability that first appeared in the IBM 704 computer strongly affected the evolution of programming languages? Explain why. Its capabilities prompted the development of Fortran, because it supported floating-point operations in hardware. 7. Who developed the Speedcoding system for the IBM 701? Speedcoding was developed by John Backus. 8. Who developed Short Code? Why is Short Code called automatic programming? Short Code was developed by John Mauchly in 1949. Short Code is called automatic programming because it was not translated to machine code; it was implemented with a pure interpreter. 10. What was the most significant feature added to Fortran I to get Fortran II? Independent-compilation capability. 11. What control flow statements were added to Fortran IV to get Fortran 77? Logical loop statements and IF with an optional ELSE. 12. Which version of Fortran was the first to have any sort of dynamic variables? Fortran 90. 13. 
Which version of Fortran was the first to have character string handling? Fortran 77. 14. Why were linguists interested in artificial intelligence in the late 1950s? Linguists were concerned with natural language processing. 15. What are the different data types and structures in Common LISP? Common LISP has a large number of data types and structures, including records, arrays, complex numbers, and character strings. It also has a form of packages for modularizing collections of functions and data, providing access control. 16. In what way are Scheme and Common LISP opposites of each other? Common LISP allows both static and dynamic scoping, while Scheme uses only static scoping. Scheme is relatively small, while Common LISP is large and complex. 17. What dialect of LISP is used for introductory programming courses at some universities? Scheme. 18. What two professional organizations together designed ALGOL 60? ACM and GAMM. 19. What was the goal for developing C? C has adequate control statements and data-structuring facilities to allow its use in many application areas. It also has a rich set of operators that provide a high degree of expressiveness. 20. What were the significant modifications to ALGOL 58 to produce ALGOL 60? The concept of block structure was introduced, two different means of passing parameters to subprograms were allowed, procedures were allowed to be recursive, and stack-dynamic arrays were allowed. 21. What language was designed to describe the syntax of ALGOL 60? BNF. 22. On what language was COBOL based? FLOW-MATIC. 23. In what year did the COBOL design process begin? 1959. 24. What data structure that appeared in COBOL originated with Plankalkul? Hierarchical data structures (records). 25. What organization was most responsible for the early success of COBOL (in terms of extent of use)? The Department of Defense (DoD). 27. Why was BASIC an important language in the early 1980s? 
Its smaller dialects could be implemented on computers with very small memories. 28. PL/I was designed to replace what two languages? COBOL and Fortran. 29. For what new line of computers was PL/I designed? The IBM System/360 line of computers. 30. What features of SIMULA 67 are now important parts of some object-oriented languages? Data abstraction. 31. What innovation of data structuring was introduced in ALGOL 68 but is often credited to Pascal? User-defined data types. 32. What design criterion was used extensively in ALGOL 68? Orthogonality. 33. What language introduced the case statement? The case statement was introduced in ALGOL-W, a version of ALGOL 60 created by Niklaus Wirth and C. A. R. (Tony) Hoare. 34. What operators in C were modeled on similar operators in ALGOL 68? C's assigning operators and its treatment of pointers were modeled on ALGOL 68, as were its for and switch statements. 35. What are two characteristics of C that make it less safe than Pascal? Lack of complete type checking and flexibility. 37. What are the two kinds of statements that populate a Prolog database? The statements that populate a Prolog database are called facts and rules. 38. What is the primary application area for which Ada was designed? Embedded systems. 39. What are the concurrent program units of Ada called? Tasks (using the rendezvous mechanism). 42. What three concepts are the basis for object-oriented programming? Classes, objects, and methods. 43. Why does C++ include the features of C that are known to be unsafe? A goal of C++ was that it could be used for virtually anything for which C could be used, so virtually none of C's features were removed, not even those considered to be unsafe. 44. From what language does Objective-C borrow its syntax for method calls? The Smalltalk language. 45. What programming paradigm that nearly all recently designed languages support is not supported by Go? Go does not support traditional object-oriented programming, as it has no form of inheritance. PROBLEM SET 1. 
What features of Fortran IV do you think would have had the greatest influence on Java if the Java designers had been familiar with Fortran? The logical data type (boolean), used mainly for the control expressions of its control statements (such as if and while), and the ability to create simpler versions of the complex compile and link processes of earlier compilers. 2. Determine the capabilities of Short Code, and compare them with those of a contemporary programmable hand calculator. Short Code could express multiplication without an explicit multiplication code: placing two operands side by side denoted their product. It was run by a pure interpreter, and a program consisted of coded versions of the mathematical expressions to be evaluated. A contemporary programmable hand calculator can also multiply through its embedded programming language; it makes solving arithmetic problems easier than Short Code did, but it is more difficult to improve and extend the programmable hand calculator than Short Code.
4768
dbpedia
1
34
https://ebs-integrator.com/en/blog/programming-timeline-part2
en
EBS Integrator
https://storage.fileserv…37a1e3402db0.png
https://storage.fileserv…37a1e3402db0.png
[]
[]
[]
[ "" ]
null
[]
null
en
/favicon.ico
https://ebs-integrator.com/en/blog/programming-timeline-part2
Let’s take a deep dive in time and look at how computer science & programming languages began. Follow us as we go through the ages and look at the timeline, major personalities, and events that through their ingenuity paved the road to our current Technological and Digital Era. The history of programming Want to see the father of computers and the first ever programmer as they struggled in the 1800s, or check out how a 17th-century philosopher helped us develop the binary code that helps YOU read these words on your screen? Well, then you absolutely should check out our earlier entry in the series. And if you’ve just finished that one, let's do a recap of where we left off. Fortran code is said to be 20 times shorter than its analogue in handwritten assembly code. The community was doubtful of it at the time due to performance concerns, but the fact that programmers could write more code, quicker, made it an easy choice from an economic viewpoint. This is how code looked in Fortran back then: It wasn't like entering everything into a computer and having it appear on a screen; there were no screens... or even keyboards. Computer Program from 1954 Like Mr. Jacquard more than a hundred years ago, they had to encode it on punched cards. The only difference is that we're talking about an IBM mainframe, and you couldn't touch the IBM mainframe. You had people who did that. You wrote your program on paper with a pen: An improvement on the Fortran language Since we're talking about Mr. Backus, let's go back to 1958, when Algol was created by a committee including European and American computer scientists. Algol (also known as Algorithmic Language) is a high-level imperative computer programming language that introduced a wealth of features that would later influence and be used in the creation of Pascal, C#, C++ and Java, among others. 
As a matter of fact, most languages nowadays have syntaxes inspired by Algol, and it’s considered among the most influential programming languages ever. It’s said that Algol was an attempt to improve on some intrinsic problems met with Fortran, and it’s widely considered as an “evolution” of sorts. While we’re on the subject, John Backus alongside Peter Naur developed the Backus-Naur Form of describing programming languages, based on ALGOL 58-60. So, you could consider Backus as one of the original fathers of many languages we use today. Here’s how “Hello, World!” looks in Algol-60: BEGIN DISPLAY (“Hello, world!”) END This is starting to look quite reasonable. Let’s get some clojure for computer sciences We’re still in 1958, as we’re about to witness another giant of “programming languages”, and a father-language to many revolutionary ideas of this time, be born. LISP (aka LISt Processor) – a list-based high-level programming language (3GL) developed by John McCarthy at the Massachusetts Institute of Technology. The language has a long history of changing dialects (like Racket, Clojure, Common Lisp and Scheme). Originally intended as a formal notation for computations, influenced by lambda calculus and artificial intelligence, it surprised everyone after being implemented on the IBM 704 and turning out to be a fully functioning practical tool for solving problems. Lisp is notable for pioneering many ideas in computer science like: tree data structures, dynamic typing, conditionals, automatic storage management, recursion, higher-order functions and the self-hosting compiler. It is also notable for giving students a migraine, due to its overreliance on parentheses, so much so that they gave it nicknames like “Lost In Stupid Parentheses” or “Lots of Irritating Superfluous Parentheses”. 
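To make those Lisp firsts a bit more concrete, here’s a minimal sketch (in Python for readability; the function names are ours, not from the article) of two of the ideas listed above: higher-order functions and recursion over nested, list-shaped data.

```python
def apply_twice(f, x):
    # A function passed as an argument: a "higher-order" function,
    # one of the ideas Lisp pioneered.
    return f(f(x))

def count_atoms(tree):
    # Recursion over a nested list, echoing Lisp's core data structure.
    if not isinstance(tree, list):
        return 1
    return sum(count_atoms(branch) for branch in tree)

print(apply_twice(lambda n: n + 1, 40))    # 42
print(count_atoms([1, [2, 3], [[4], 5]]))  # 5
```

In Lisp itself the same shapes fall out naturally, since programs and lists share a single notation.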
And here’s how “Hello, World!” would look in Lisp: (PRINT “Hello, World!”) Though in 1958 Lisp is still in its infancy and doesn’t see much broad use, which brings us to… An old acquaintance At this time, most if not all computers run either low-level programming languages like machine code/assembly or the only high-level programming language available: Fortran. And if for some reason you want to upgrade your computer, you’ll have to rewrite all that code from scratch. So, let’s make a small leap to the end of 1959. We’re going to meet someone we’ve already discussed. And that someone is Grace Hopper: a rear admiral of the US Navy and a computer scientist, she has already made an impact on the history of programming by creating the A-0 compiler system. Now, however, she’s attending CODASYL, the “Conference on Data Systems Languages”, as a technical aide and main expert. The committee was formed in an attempt to guide the standardization of programming languages, or to create a single language that could be used across different machines regardless of hardware. The result was the “Common Business Oriented Language”, or COBOL for short – an “English-like” programming language heavily influenced by FLOW-MATIC, created by Ms. Hopper. The primary goal of COBOL was to lower the barrier of entry into programming; before the creation of high-level languages, programming was a realm exclusive to computer scientists and experts. Now, however, other enthusiasts from different professions like businesspeople, doctors, engineers, teachers and many others could incorporate computation into their work. To deal with the underlying hardware, each computing machine had to have its own COBOL compiler. But critically, these compilers could accept the same COBOL source code. The notion of “Write once, apply everywhere” was created; for at this time, most computers used machine code or assembly, which is very CPU-specific. 
Let’s see how “Hello, World!” looks in COBOL: IDENTIFICATION DIVISION. PROGRAM-ID. hello-world. PROCEDURE DIVISION. DISPLAY "Hello, world!" That, folks, is what we call a “language designed by committee”. The arrival of the minicomputer Hey mama welcome to the 60’s!! Wo-oh-oh-oh… Oh sorry, old habits. But as you might have guessed, the 1960s have rolled around, and someone had a bright idea: how about we stop writing code in pencil and spending weeks encoding it into punch cards, and instead use the conveniently invented teletype machines that news agencies and governments use to send messages to each other to “talk” with the computer? With that sorted, humanity paved the way to more general-purpose high-level programming languages designed to be used by everyone, everywhere. And with the emergence of minicomputers on the market, there was a sharp need for a programming language/compiler compact enough to meet the demand. Thus, BASIC was born – “Beginners’ All-purpose Symbolic Instruction Code”. Originally designed by John G. Kemeny and Thomas E. Kurtz in 1964, BASIC’s core principles and objectives were “ease of use” and “compactness”. “Minicomputer” – PDP-11 So compact, in fact, that it is generally considered the first commercial “interpreter”, being able to make use of all those 4 kB of memory available on those “mini”-computers. BASIC was amazing for teaching programming principles; it was easy to understand and follow, especially for people who didn’t have a strong understanding of mathematics or computers in general. Nowadays we know BASIC under its many dialects that appeared after its inception in 1964 – Microsoft BASIC, Visual BASIC, Atari BASIC, Commodore BASIC, Applesoft BASIC and a dozen other variations. However, they all come from “Dartmouth BASIC”. 
Aside from being the originator, Dartmouth BASIC came with an interactive programming environment available to all students and teachers called the “Dartmouth Time-Sharing System” (DTSS): the first time-sharing operating system! 10 PRINT “Hello, World!” 20 GO TO 10 30 END Family tree of programming languages. Most of the following have their roots deep down in the languages we’ve mentioned earlier. By the 1970s and onwards to the 1980s we had: PASCAL (1970); Smalltalk (1972); C (1972); SQL (1972); Scheme (1975); C++ (1980); Ada (1983); Objective-C (1983); Perl (1987). And later the 90s rolled around, and we have our favourites: Haskell and Python (1990); Visual Basic (1991); Java, PHP and JavaScript in 1995; and so many others. History of programming: End of today’s session We believe we went too fast over there and overloaded the capacitors; we seem to be going back to 2021! Well, I suppose that’s it for now; something went horribly wrong and I’m afraid we’re stuck in our time again. Perhaps later we’ll be coming back to the 70s and exploring more of how our programming world evolved through the ages. This was our take on the history of programming from the 1950s to the 1960s. If you think it’s a great idea, maybe leave a like and a comment saying so? What do you think is the most fascinating part of programming history? While you ponder that, I’ll go back to researching the incredible ways you can improve your business and tech company with digital transformation. Stay classy, tech and business nerds!
4768
dbpedia
3
56
https://www.embeddedrelated.com/showarticle/1672.php
en
Parlez vous Fortran?
https://www.embeddedrela…eneric_image.png
https://www.embeddedrela…eneric_image.png
[ "https://d23s79tivgl8me.cloudfront.net/new2/images/EmbeddedRelated_bigtab.png", "https://d23s79tivgl8me.cloudfront.net/new2/images/FPGARelated_smalltab.png", "https://d23s79tivgl8me.cloudfront.net/new2/images/DSPRelated_smalltab.png", "https://d23s79tivgl8me.cloudfront.net/new2/images/MLRelated_smalltab.png", "https://d23s79tivgl8me.cloudfront.net/new2/images/ER_Logo_Responsive_Header_Optimized.svg", "https://d23s79tivgl8me.cloudfront.net/new2/images/fpgarelated_small_mobile_logo.svg", "https://d23s79tivgl8me.cloudfront.net/new2/images/dsprelated_small_mobile_logo.svg", "https://d23s79tivgl8me.cloudfront.net/new2/images/electronics-related_small_mobile_logo.svg", "https://d23s79tivgl8me.cloudfront.net/new2/images/insights/306_200.jpg", "https://d23s79tivgl8me.cloudfront.net/new2/images/insights/306_300.jpg", "https://d23s79tivgl8me.cloudfront.net/new2/images/insights/306_300.jpg", "https://d23s79tivgl8me.cloudfront.net/new2/images/blogs/JasonSachs_transparent_100.png", "https://d23s79tivgl8me.cloudfront.net/new2/images/blogs/NathanJones_transparent_100.png", "https://d23s79tivgl8me.cloudfront.net/new2/images/blogs/MohammedBilloo_transparent_100.png", "https://d23s79tivgl8me.cloudfront.net/new2/images/blogs/JacobBeningo_transparent_100.png", "https://d23s79tivgl8me.cloudfront.net/new2/images/blogs/AlanCAssis_transparent_100.png", "https://d23s79tivgl8me.cloudfront.net/new2/images/blogs/JasonSachs_transparent_100.png", "https://d23s79tivgl8me.cloudfront.net/new2/images/blogs/MassimilianoPagani_transparent_100.png", "https://www.embeddedrelated.com/new2/Documents_Covers/209.70.jpg", "https://www.embeddedrelated.com/new2/Documents_Covers/178.70.jpg", "https://www.embeddedrelated.com/new2/Documents_Covers/180.70.jpg", "https://d23s79tivgl8me.cloudfront.net/new2/images/EmbeddedRelated_logo.svg", "https://d23s79tivgl8me.cloudfront.net/new2/images/DSPRelated_logo.svg", "https://d23s79tivgl8me.cloudfront.net/new2/images/MLRelated_logo.svg", 
"https://d23s79tivgl8me.cloudfront.net/new2/images/Electronics-Related_logo.svg", "https://d23s79tivgl8me.cloudfront.net/new2/images/FPGARelated_logo.svg" ]
[]
[]
[ "" ]
null
[ "Colin Walls", "Jason Sachs", "Nathan Jones", "Mohammed Billoo", "Jacob Beningo", "Alan C. Assis", "Massimiliano Pagani" ]
null
A look at the variety of programming languages that are [or have been] used for embedded and some thoughts on the future possibilities.
/apple-touch-icon.png
https://www.EmbeddedRelated.com/showarticle/1672.php
When I first started programming, the only language available to me was Fortran. I punched cards with my code and later received back a print-out with mostly error messages. I was always intrigued by being able to program a computer in such a straightforward way. I understood what assembly language or even machine code was all about, but felt that high level languages had a mystique. In due course, while still at university, I tried out any other languages that were available: BASIC, BCPL, Simula, SNOBOL 4, Algol W and I am sure there were others. When I graduated and found myself in the “real world”, moving towards embedded programming, I found that a selection of languages was still on offer … In my first job, I was doing real time programming on a mini-computer using a variant of Fortran and some assembly. I was then assigned to an embedded project (not that we used that term then) and I was programming mainly in Forth, which was interesting. My next job saw me become much more specialized in embedded, using mainly assembly language on 8-bit devices. Later on I began to use high level languages like Pascal and C. We even offered a Fortran compiler for embedded applications, but I do not recall anyone ever buying one. Of all the different high level languages that I encountered in the past, not one of them was designed for embedded use. PL/M from Intel is probably the only one that I can identify; Forth may also qualify. C has become the standard because it is a reasonable compromise – not ideal, but good enough. Ironically, I have often felt that one of its ancestors, BCPL, might have made a good embedded programming language. In later years, many programmers migrated from C to C++, which always had the potential to be a better language. But the take-up was slower than anticipated. 
I am frequently asked to speculate on what language will be used in coming years and I do not have a firm view, as there are a number of possibilities: It is often observed that the power that C offers as a language is also a problem, as this power is dangerous. One way to mitigate this danger is to use the language in a specific style. For example, MISRA have published a detailed set of guidelines on how to use C safely. It seems to me that, even if you are not programming a safety critical system, using such techniques will lead to more bug-free code, which is always advantageous. I always believed that C++ usage would increase, as engineers’ confidence in modern tools increased. MISRA have also published guidelines for C++, which may be useful. There is a language dialect called Embedded C++, which was espoused by certain vendors. It is a subset of the language, but, as far as I can tell, some language elements have been eliminated for no obvious reason. Java might play a greater part in embedded programming, particularly at the application level. However, writing a complete system in Java is not an option. It lacks a number of necessary language features, notably pointers, so some coding in C, C++ or assembler will be needed. Nowadays, I might bet on Rust, which shows great promise, as it is designed to address the needs of modern embedded applications. In looking at programming languages, I have a strong sense of a continuing history. Maybe I will draw up a “family tree” sometime. It is entertaining to realise that there is nothing new. I first encountered object oriented programming in the Simula language, which was defined in the late 1960s. Fortran, which has been in use for over 50 years, introduced the idea of function overloading. I often have the urge to program in Forth again, as the language has an amazing economy in style and memory usage. Critics say that it is the only write-only programming language. I settle for insisting on an RPN calculator. 
Maybe I should look at Lisp …
4768
dbpedia
1
18
https://courses.cs.washington.edu/courses/cse505/99au/imperative/algol.html
en
Algol
[]
[]
[]
[ "" ]
null
[]
null
null
September 1999 Introduction In the late 50's, both ACM and GAMM (a European association) decided that a universal, machine-independent language would be desirable -- a language for the expression of algorithms Algol 58: an interim design. Followed by lots of discussions in the Algol Bulletin and CACM. Algol 60: result of an intensive 6 day design effort by a committee in Paris. A few revisions made; Algol 60 report published in CACM in Jan 1963. The meetings were exhausting, interminable, and exhilarating. ... Progress was steady and the output, ALGOL 60, was more racehorse than camel. This language proved to be an object of stunning beauty. It was sufficiently perfect and complete so that ensuing implementations were able to append necessities, such as input-output, in the style of ALGOL 60 but their addition propagated no significant changes in the body of the original language. The ALGOL 60 report (Naur et al., 1960) was a fitting display for the language. Nicely organized, tantalizingly incomplete, slightly ambiguous, difficult to read, consistent in format, and brief, it was a perfect canvas for a language that possessed those same properties. Like the Bible, it was meant not merely to be read, but to be interpreted. -- Alan Perlis, "The American Side of the Development of Algol," The History of Programming Languages One of the debates: should recursion be allowed? Declarations Declarations are like those in Fortran, with some improvements real, integer, boolean, label array of real, array of integer, ... No complex, no double precision ... Perlis argues against "dumbbell languages" that have constructs for various datatypes, with the different parts loosely connected (e.g. a string processing part, a matrix part, etc.) 
dynamic array bounds; lower and upper bounds; indefinite number of dimensions Block Structure Blocks support structured programming: if x=3 then begin y:=9; k:=10 end; In Fortran, there can be only a single statement after a logical if: IF (X .NE. 3) GOTO 100 Y=9 K=10 100 ... Anywhere a single statement can be used, a block can be used instead. Blocks define nested scopes: begin integer x; procedure squid; begin integer x; ... end; end; Within the procedure squid, the name x refers to a different variable than in the global scope (see Implementation of Block Structured Languages) Algol 60 (and Modula-2, Ada, etc) use lexical scoping. Early Lisps, APL, etc. use dynamic scoping. Unlike Fortran, binding of variable names to locations is done at block entry time (in general, it can't be done statically) Blocks for efficient storage management begin ... begin real array x[1:1000]; ... end; ... begin real array y[1:2000]; ... end; end; The array x is allocated in the first block, then deallocated, and then the array y is allocated. Compare with the Fortran equivalence statement. The Algol solution is safe, clear, and provides no opportunity for clever abuse of the type system. The bounds of an array must be known at block entry time for example, in the declaration: begin integer array x[1:n]; ... end; n must be declared outside of the block in which x is declared Except for procedures as parameters, Algol can be statically type checked. The report doesn't say, but a reasonable implementation will be strongly typed. 
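For readers who want to see the squid example in a modern notation, here is a rough Python analogue (illustrative only; Python's scoping differs from Algol 60's in detail, e.g. assignment creates a binding rather than a declaration):

```python
# The inner x shadows the outer x, as the inner `integer x`
# does inside the Algol procedure squid.
x = "outer"

def squid():
    x = "inner"  # a new binding, visible only in this scope
    return x

print(squid())  # prints: inner
print(x)        # prints: outer -- the global binding is untouched
```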
Control Structures goto if-then-else for loop switch goto -- like Fortran's, except for scope rules for labels (can only go to a label at the same lexical level, or an enclosing lexical level) if-then-else: both statement and expression forms y := if x=3 then 5 else 6; for loop: definite, indefinite iterations: baroque switch statement -- supports a kind of computed goto; now obsolete (case statement is better) parameter passing mechanisms: Algol 60 had call by name, call by value (see Parameter Passing) General Issues/Problems Syntax Goal of machine independence led to free format (in contrast to assumption of 80 column punchcards in Fortran). This is now the norm in programming languages, and most commonly used languages have an Algol-like flavor. Use of indentation to indicate program structure. 3 levels of representation: reference language (used in Algol 60 report) publication language (used for publishing algorithms) hardware representations publication language could have Greek letters, subscripts, etc. hardware representations could vary from implementation to implementation (cf comma - decimal point controversy) Approaches to the problem of words such as INTEGER or WHILE: reserved word approach (now generally used, e.g. in Ada, Java, etc) keywords (strict Algol approach) keywords in context (Fortran, PL/1) A few syntactic problems: Semicolon to separate statements. Using the semicolon as a separator is much more error prone than using the semicolon as a statement terminator. (Ada takes the latter approach.) For example, the following Algol code is syntactically incorrect: if x=4 then y:=5; else y:=6; (The semicolon after the y:=5 actually puts a null statement after the assignment, so that there are two statements after the "then", resulting in an error.) overly long names for common constructs (begin, end, comment); problem with semicolon terminating comments. 
dangling else problem if x=3 then if y=5 then z:=8 else z:=9; Algol's solution: this statement is illegal -- need to write if x=3 then begin if y=5 then z:=8 else z:=9 end; or if x=3 then begin if y=5 then z:=8 end else z:=9 Information about formal parameters is too spread out; odd way to indicate return value in a function (by assigning to the name of the function) integer procedure double(j); value j; integer j; double := 2*j The zero-one-infinity principle: The only reasonable numbers [in a programming language definition] are zero, one, and infinity. (From Bruce MacLennan, Principles of Programming Languages) Examples: number of characters in an identifier, number of dimensions in an array, number of arguments to a function Algol obeys this principle much better than does Fortran Lots of semantic ambiguities. (BNF used for a precise description of syntax, but semantics specified in English.) Example: are the bounds of a for loop evaluated once before executing the loop, or at the beginning of each loop execution? A literal reading of the Report implies that they are evaluated each time; but this is inefficient and unclear. As an example of the problems that can arise from such ambiguity, in the DEC System 20 implementation of SIMULA that we used to have, the upper bound for this loop was evaluated once: for i:=1 until n+1 do ... but here it is in effect evaluated each time (since the code just references n directly): for i:=1 until n do See Donald Knuth, The Remaining Trouble Spots in ALGOL 60, CACM, Vol 10 No. 10, 1967. Some other problems: OWN variables, SWITCH, side effects in functions labels, procedures, and strings are not first-class citizens in Algol 60. (This allows for more efficient compilation.) 
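The two readings of the loop-bound ambiguity can be sketched in Python (the function names are ours): a `for` over `range()` computes the bound once at loop entry, while a `while` loop re-evaluates it before every iteration.

```python
def bound_once(n):
    # The bound n + 1 is computed a single time, at loop entry.
    count = 0
    for i in range(1, n + 1):
        n = 0  # mutating n changes nothing
        count += 1
    return count

def bound_each_time(n):
    # The bound n + 1 is re-evaluated before every iteration:
    # the literal reading of the Algol 60 Report.
    count, i = 0, 1
    while i <= n + 1:
        n -= 1  # shrinking n now ends the loop earlier
        count += 1
        i += 1
    return count

print(bound_once(3), bound_each_time(3))  # 3 2
```

The two functions disagree on the same input, which is exactly the kind of implementation-dependent behavior the SIMULA example above exhibited.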
no I/O statements Epilog Algol-60 didn't achieve widespread use In the USA, Burroughs supported Algol-60, but IBM supported FORTRAN Algol-60 is an extremely important language in the history of programming languages Many successors: Pascal, Modula-2, Ada, Euclid, Mesa, Emerald, ... Lots of good work done on lexical analysis, parsing, compilation techniques for block-structured languages, etc, which is still used in current compilers. Some issues of enduring interest that are prominent in Algol: side effects: whether to allow them at all, and if so how to manage them. One place this problem arises in Algol-60 is in expression evaluation, since the order of evaluation isn't specified. But side effects also make life complex for people and compilers trying to reason about programs. types; static type checking abstraction mechanisms: functions and procedures. Parameter passing. scoping and name control mechanisms: block structure, its virtues, and its limitations formal specification of syntax and semantics storage allocation and reclamation Extending Algol Tony Hoare: Algol-60 was an improvement over most of its successors PL/I -- Swiss army knife language Extensible languages kinds of extension: statements (procedures!) operators control structures syntactic extensibility Smalltalk-72 EL/I problems with languages with extensible syntax:
4768
dbpedia
3
17
https://www.coursehero.com/file/p7f9e1am/Based-on-the-experience-of-Fortran-I-Algol-58-was-announced-in-1958-Two/
en
[]
[]
[]
[ "" ]
null
[]
null
null
4768
dbpedia
2
6
https://medium.com/swlh/programming-history-the-influence-of-algol-on-modern-programming-languages-part-2-49ae6d30c8c0
en
Programming History: The Influence of Algol on Modern Programming Languages (Part 2)
https://miro.medium.com/v2/5d8de952517e8160e40ef9841c781cdc14a5db313057fa3c3de41c6f5b494b19
https://miro.medium.com/v2/5d8de952517e8160e40ef9841c781cdc14a5db313057fa3c3de41c6f5b494b19
[ "https://miro.medium.com/v2/resize:fill:64:64/1*dmbNkD5D-u45r44go_cf0g.png", "https://miro.medium.com/v2/resize:fill:88:88/2*u5JJwU4l_faX_5CCFuJu3w.jpeg", "https://miro.medium.com/v2/resize:fill:48:48/1*pKOfOAOvx-fWzfITATgGRg.jpeg", "https://miro.medium.com/v2/resize:fill:144:144/2*u5JJwU4l_faX_5CCFuJu3w.jpeg", "https://miro.medium.com/v2/resize:fill:64:64/1*pKOfOAOvx-fWzfITATgGRg.jpeg" ]
[]
[]
[ "" ]
null
[ "Mike McMillan" ]
2020-05-05T03:23:20.310000+00:00
In an earlier article I presented a brief history of high-level, algebraic programming languages starting with Fortran 58 and 60, leading up to the initial development of the Algol language in the…
en
https://miro.medium.com/v2/5d8de952517e8160e40ef9841c781cdc14a5db313057fa3c3de41c6f5b494b19
Medium
https://medium.com/swlh/programming-history-the-influence-of-algol-on-modern-programming-languages-part-2-49ae6d30c8c0
In an earlier article I presented a brief history of high-level, algebraic programming languages starting with Fortran 58 and 60, leading up to the initial development of the Algol language in the years 1958 through 1960. In this article I’m going to discuss several of the technical features of Algol that influenced subsequent high-level programming languages such as BCPL and ultimately C. Data Types High-level programming languages before Algol did not have data types. Variables simply sprang into existence when assigned a value, either an integer or a real number. Fortran did have a bit of data typing in that integers were to be defined using the letters I, J, K, L, M, and N. Any other variable that existed would be a real variable. Algol began the practice of declaring the type of a variable as part of its declaration. This allowed the compiler to perform more efficiently by knowing the type of number that was going to be stored in the variable before a number was assigned to it. However, as I discovered reading through a history of Algol, there wasn’t a real theory of data types built into the language until Algol 68. Here is an example of how data types are used in Algol 60. This program fragment reads some values from a data card(!):
4768
dbpedia
1
59
https://detreville.substack.com/p/the-ibm-701/comments
en
The IBM 701
https://substackcdn.com/…c4_1024x793.jpeg
https://substackcdn.com/…c4_1024x793.jpeg
[ "https://substackcdn.com/image/fetch/w_96,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fed689bbf-b515-4c6d-be0e-4dd81ec58608_671x671.png", "https://substackcdn.com/image/fetch/w_120,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdfb5ff1f-204e-4dff-8434-f4922f95f5c4_1024x793.jpeg", "https://substackcdn.com/image/fetch/w_120,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdfb5ff1f-204e-4dff-8434-f4922f95f5c4_1024x793.jpeg", "https://substackcdn.com/image/fetch/w_64,h_64,c_fill,f_auto,q_auto:good,fl_progressive:steep,g_auto/https%3A%2F%2Fsubstack.com%2Fimg%2Favatars%2Flogged-out.png", "https://substackcdn.com/image/fetch/w_66,h_66,c_fill,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack.com%2Fimg%2Favatars%2Forange.png", "https://substackcdn.com/image/fetch/w_66,h_66,c_fill,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fbucketeer-e05bbc84-baa3-437e-9518-adb32be77984.s3.amazonaws.com%2Fpublic%2Fimages%2F0bba99fe-176a-4536-97cd-66c9538716a4_400x400.jpeg", "https://substackcdn.com/image/fetch/w_66,h_66,c_fill,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack.com%2Fimg%2Favatars%2Forange.png", "https://substackcdn.com/image/fetch/w_66,h_66,c_fill,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack.com%2Fimg%2Favatars%2Forange.png", "https://substackcdn.com/image/fetch/w_66,h_66,c_fill,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fbucketeer-e05bbc84-baa3-437e-9518-adb32be77984.s3.amazonaws.com%2Fpublic%2Fimages%2F0bba99fe-176a-4536-97cd-66c9538716a4_400x400.jpeg", "https://substackcdn.com/image/fetch/w_66,h_66,c_fill,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack.com%2Fimg%2Favatars%2Forange.png", 
"https://substackcdn.com/image/fetch/w_66,h_66,c_fill,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack.com%2Fimg%2Favatars%2Forange.png", "https://substackcdn.com/image/fetch/w_66,h_66,c_fill,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fbucketeer-e05bbc84-baa3-437e-9518-adb32be77984.s3.amazonaws.com%2Fpublic%2Fimages%2F0bba99fe-176a-4536-97cd-66c9538716a4_400x400.jpeg", "https://substackcdn.com/image/fetch/w_66,h_66,c_fill,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack.com%2Fimg%2Favatars%2Fblack.png", "https://substackcdn.com/image/fetch/w_66,h_66,c_fill,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fbucketeer-e05bbc84-baa3-437e-9518-adb32be77984.s3.amazonaws.com%2Fpublic%2Fimages%2F0bba99fe-176a-4536-97cd-66c9538716a4_400x400.jpeg", "https://substackcdn.com/image/fetch/w_66,h_66,c_fill,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack.com%2Fimg%2Favatars%2Fblack.png", "https://substackcdn.com/image/fetch/w_66,h_66,c_fill,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fbucketeer-e05bbc84-baa3-437e-9518-adb32be77984.s3.amazonaws.com%2Fpublic%2Fimages%2F0bba99fe-176a-4536-97cd-66c9538716a4_400x400.jpeg" ]
[]
[]
[ "" ]
null
[ "John DeTreville" ]
2023-02-13T07:19:16.029000+00:00
"IBM's first computer"
en
https://substackcdn.com/image/fetch/f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc2c4013b-658f-4a7a-8b68-5185ef448d9a%2Ffavicon.ico
https://detreville.substack.com/p/the-ibm-701/comments
4768
dbpedia
1
52
https://www.hillelwayne.com/post/influential-dead-languages/
en
10 Most(ly dead) Influential Programming Languages
https://www.hillelwayne.com/favicon.ico
https://www.hillelwayne.com/favicon.ico
[ "https://www.hillelwayne.com/post/influential-dead-languages/IDSkeyboard.jpg", "https://www.hillelwayne.com/post/influential-dead-languages/java.jpg" ]
[]
[]
[ "" ]
null
[]
2020-03-25T00:00:00+00:00
The other day I read 20 most significant programming languages in history, a “preposterous table I just made up.” He certainly got preposterous right: he lists Go as “most significant” but not ALGOL, Smalltalk, or ML. He also leaves off Pascal because it’s “mostly dead”. Preposterous! That defeats the whole point of what “significant in history” means. So let’s talk about some “mostly dead” languages and why they matter so much.
en
/favicon.ico
Hillel Wayne
https://www.hillelwayne.com/post/influential-dead-languages/
The other day I read 20 most significant programming languages in history, a “preposterous table I just made up.” He certainly got preposterous right: he lists Go as “most significant” but not ALGOL, Smalltalk, or ML. He also leaves off Pascal because it’s “mostly dead”. Preposterous! That defeats the whole point of what “significant in history” means. So let’s talk about some “mostly dead” languages and why they matter so much. Disclaimer: Yeah not all of these are dead and not all of these are forgotten. Like most people have heard of Smalltalk, right? Also there’s probably like a billion mistakes in this, because when you’re doing a survey of 60 years of computing history you’re gonna get some things wrong. Feel free to yell at me if you see anything! Disclaimer 2: Yeah I know some of these are “first to invent” and others are “first to popularize”. History is complicated! Detecting Influence Before we start, a quick primer on finding influence. Just knowing that X was the first language with feature Z doesn’t mean that X actually influenced Z. While Absys was arguably the first logic programming language, almost all of logic programming actually stems from Prolog, which was developed independently. Ultimately there’s only one way to know for certain that X influenced Y: citation. This means one of: Y cites X in its reference manual; Y cites a paper that cites X; or the author of Y says “we were influenced by X.” Citations are transitive. Sometimes the language manual for Q lists motivating document R, which cites paper S as an inspiration, which mentions it got the ideas from language T. Then we know that T influenced Q, even if the chain is several steps long. This means digging through many sources to find a signal. To speed this up we use heuristics to decide where to look. One effective heuristic is programming language cognates. It’s very rare for languages to independently come up with the same syntax.
So if two languages share some syntax, one likely influenced the other. For example: even without reading design decisions by Matz, we know that Ruby was influenced by Smalltalk, as they both filter a list with a select method. This isn’t conclusive evidence. Maybe Matz came up with it independently, or maybe Ruby and Smalltalk were both influenced by a common ancestor. But it gives us a place to start looking. The Languages COBOL Background: CODASYL, 1960. COBOL is shaped by the business/science split in computing. At that time high-level industry languages were either used for engineering computations or managing data. The engineers had all gotten behind FORTRAN while the business world was a mess of COMTRAN, FLOW-MATIC, and others, so the Department of Defense got a committee together to make a single universal business language. That’s COBOL. COBOL was one of the four “mother” languages, along with ALGOL, FORTRAN, and LISP. While we consider it a punchline today, it was once the most popular language in the world. It still runs a lot of our legacy business systems. Significance: In terms of syntax and semantics we don’t see much of COBOL in modern computing. COBOL’s most important addition is the concept of record data. In FORTRAN and ALGOL, your only data structure was the static array. In COBOL, though, you could read in structured files with hierarchical data, and it would automatically destructure them into the representative variables. This was a precursor to modern-day structs. Cause of Death: Two factors here. One: COBOL had no overlap with other PLT efforts. Very few people built on COBOL. This meant that second or third generation languages, which built on the lessons of their ancestors, had almost no COBOL DNA. This was less an intrinsic problem of COBOL and more a result of academia’s disdain for its creation process. CODASYL was a business group and obviously wasn’t worth paying attention to.
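The record-data idea is easier to see in modern terms: a fixed-layout file gets destructured into a hierarchy of named fields. A minimal Python sketch (the customer layout here is hypothetical, not taken from any real COBOL program):

```python
from dataclasses import dataclass

# Hypothetical layout, in the spirit of a COBOL data division:
#   01 CUSTOMER.
#      05 NAME      PIC X(10).
#      05 ADDRESS.
#         10 CITY   PIC X(8).
#         10 ZIP    PIC X(5).

@dataclass
class Address:
    city: str
    zip: str

@dataclass
class Customer:
    name: str
    address: Address

def parse_customer(record: str) -> Customer:
    """Destructure one fixed-width record into hierarchical fields."""
    return Customer(
        name=record[0:10].strip(),
        address=Address(city=record[10:18].strip(), zip=record[18:23].strip()),
    )

cust = parse_customer("ADA       LONDON  NW12D")
# cust.name == "ADA", cust.address.city == "LONDON"
```

COBOL did this declaratively from the data division rather than with hand-written slicing, but the destructuring effect is the same.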
COBOL was also enormously complex, even compared to today’s languages. This meant that COBOL compilers lagged behind contemporaries on microcomputers and minicomputers, giving space for other languages to flourish and eventually outcompete it. ALGOL Background: The ALGOL committee, 1960. ALGOL-58 came out two years before but was quickly superseded, so I’m wrapping them into each other. The committee wanted to make a good language for researching algorithms. In other words, ALGOL was a formalized “pseudocode”. Of the four mother languages, ALGOL is the most “dead”; everybody still knows about LISP, COBOL still powers tons of legacy systems, and most scientific packages still have some FORTRAN. But I’ve met plenty of programmers who haven’t even heard of ALGOL. You’d think it’d be the least important of the mother languages, but it’s the opposite. Of the four, only LISP comes anywhere close to the pervasive importance of ALGOL. Significance: Let’s see: lexical scoping, structured programming, nested functions, formal language specifications, call-by-name semantics, BNF grammars, block comments… every modern language today is deeply influenced by ALGOL. Cause of Death: ALGOL was a research language, not a commercial language. It was designed for studying algorithms. The spec didn’t define any I/O, which kinda made it impossible to use in practice. Sure, you could write a compiler extension, but then you might as well add other stuff too. And that’s exactly what people did. In the 1960s and 70s people made a huge number of ALGOL-likes by extending ALGOL with I/O and extra data structures. This includes JOVIAL, SIMULA, CLU, and CPL. Later languages were then based off these extensions, not ALGOL directly. We call C an “ALGOL-like”, but it’s actually a BCPL-like, which was a CPL-like, which was an ALGOL-like. ALGOL’s children buried it. Eventually the ALGOL people tried to extend it into ALGOL-68, which radically departed from ALGOL-60 and hasn’t had close to the same influence.
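Two of those ALGOL contributions, nested functions and lexical scoping, are so universal now that they are easy to overlook; a quick Python illustration:

```python
def make_counter(start):
    count = start  # lexically scoped: captured by the nested function below

    def increment():  # a nested function, an ALGOL innovation
        nonlocal count
        count += 1
        return count

    return increment

tick = make_counter(10)
first, second = tick(), tick()  # 11, then 12
```

The inner function sees `count` purely because of where it is written in the source, which is exactly what lexical scope means.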
The ALGOL-60 lineage continues with Niklaus Wirth’s Pascal. APL Background: Ken Iverson, 1962. APL began as a hand-written notation for array math; IBM picked it up and turned it into a programming language. As a language, APL focused on array processing: being able to concisely manipulate large blocks of numbers. If you’ve heard of APL before, you probably know it as “that weird symbol language”. One of the most notorious code snippets is its one-line implementation of the Game of Life, written entirely in APL’s symbol set. You had to write it with a specialized keyboard that carried those symbols. Nonetheless, APL got popular on mainframes for running with very low memory requirements. Significance: Array processing. At a time when adding two lists of numbers meant a map or a loop, APL introduced the idea of operating on the entire array at once (in APL, adding two vectors is simply a + b). This was a really big deal in scientific circles. So much applied math boils down to large-scale operations on large matrices. When you can just take the outer product with ∘.f, it’s really damn easy to take outer products! Through this innovation APL led to R, numpy, pandas, Matlab, etc. There’s also the direct descendants of APL: J, Dyalog, K, Q. They’ve been less successful but still see lots of use in the finance sector. Cause of Death: Well, the obvious problem is the keyboards. If you can’t write it in ASCII, you’re not going to write very much of it. Iverson fixed this with J, which uses digraphs instead of different symbols. Instead of ≠, you write ~:. This was in 1990, though, which is a bit late to popularize a radically different programming style. The subtler problem is that APL and J only worked on homogeneous data. You can’t store both strings and numbers in the same data structure (unless you use boxes, which is a whole other can of worms) and working with strings is generally a nightmare. So no dataframes, which excludes a lot of modern data science. Further Reading: Notation as a Tool of Thought. BASIC Background: John Kemeny, 1964.
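The whole-array style APL pioneered survives most visibly in numpy, one of the descendants named above; a sketch (assumes numpy is installed):

```python
import numpy as np

a = np.array([1, 2, 3])
b = np.array([10, 20, 30])

# Operate on entire arrays at once, no explicit loop -- APL's  a + b
total = a + b  # array([11, 22, 33])

# APL's outer product  a ∘.× b : every element of a times every element of b
table = np.multiply.outer(a, b)  # 3x3 multiplication table
```

One expression per whole-matrix operation is the APL inheritance: the loop lives inside the library, not in your code.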
Originally a simplified FORTRAN-like, intended to help people outside engineering use computers. BASIC really took off in the microcomputer era. The first microcomputers didn’t have enough memory to compile “real” programming languages, whereas you could cram a pared-down BASIC compiler into like 2 kilobytes. BASIC became a lingua franca for early-stage programmers. If you were programming at home in the 1970’s, you were probably writing BASIC on a microcomputer. Significance: The biggest technical impact is runtime interpretation. BASIC was the first language with a real-time interpreter (the Dartmouth Time Sharing System), beating APL by a year. And that APL system was only available to IBM customers, so really it was BASIC or nothing for a long time. BASIC had a bigger social impact. It brought programming to households, kids especially. Many of the influential programmers in the 80’s and 90’s first learned how to program on BASIC. Many enterprise programs were also written in BASIC, which probably helped accelerate the decline of COBOL. BASIC has one more neat trick up its sleeve: Office tooling! Microsoft eventually turned BASIC into Visual Basic, which they used as the Office macro language. This then spread to OpenOffice and LibreOffice, entrenching BASIC in that particular niche. More recently it’s lost ground to JavaScript and is now a legacy macro language. Cause of Death: People saw BASIC as a “lesser” language. You might use it if you were a kid or a small business owner, but real programmers used a real language. Once manufacturers could cheaply make microcomputers with more than 16k of RAM they started deprecating BASIC in favor of languages like Pascal and C. BASIC lived on for a while as a kids’ teaching language, but seems to have died out of that niche, too. PL/I Background: IBM, 1966. IBM’s business was split into two languages: FORTRAN for scientists and COMTRAN for business folk.
Facing competition from COBOL and wanting to streamline their systems, they tried to make a language that was useful for both engineering and business purposes. This ended up looking like a sort of superset of the two languages, with a bunch of additional features stapled on top. Now everybody could use the same language and IBM can make a lot more money! Yaaaaaaaay. Significance: The authors of ALGOL-68 mockingly called PL/I an obsolete language. But everything ALGOL-68 did, PL/I did earlier and better. While COBOL got structured data first, PL/I was the first language to implement records as a type. In COBOL, reading in a user with a name would give you two global variables, user and name. In PL/I, you’d get one variable with a field, user.name. PL/I was also the first high-level language with pointers for direct memory manipulation, constants, and function overloading. Many of these ideas entered mainstream programming via C, which was a mix of both BCPL and PL/I. C even uses PL/I’s comment syntax. Cause of Death: All the FORTRAN programmers thought it was too much like COBOL and all the COBOL programmers thought it was too much like FORTRAN. IBM had tried to take on two established languages with a much more complicated one. It didn’t help that they were the only group with the compiler, meaning everybody else was mistrustful of vendor lock-in. By the time IBM was able to make headway on both of these issues the wider computing world had already moved on to the microcomputer era, where PL/I was outcompeted by BASIC. Further Reading: The Choice of PL/I. SIMULA 67 Background: Ole-Johan Dahl and Kristen Nygaard, 1967. They extended ALGOL for doing simulations. First they made SIMULA I, which had dedicated simulation and “activity” syntax. SIMULA I saw some early use, but the two were dissatisfied with how “specialized” the language felt and how much duplicate code they had in their simulations.
They wanted to make a more general framework for representing things, not only simulations. Their idea was to allow users to define new types called “classes” with polymorphic function resolution. Then users could build the simulation features as a special case of the object system, making it easy to customize how it all worked to their particular needs. Significance: While SIMULA wasn’t the first “true” OOP language, it was the first language with proper objects and laid much of the groundwork that others would build on. This includes the class/object split, subclassing, virtual methods, and protected attributes. It inspired almost all of the academic research into objects after 1967. Both CLU and ML cited SIMULA as a major source of inspiration. Bjarne Stroustrup did his PhD on SIMULA, eventually incorporating a lot of its ideas into C++. Cause of Death: In that same PhD Stroustrup claimed that SIMULA was waaaaaay too slow to use at scale. “Good luck getting anything done if you aren’t on a mainframe” slow. It’s worth noting that Smalltalk-80, which took the same ideas even further, had an extra 13 years of Moore’s law behind it. And even Smalltalk was often mocked as too slow. Everybody went and implemented the ideas in SIMULA that they could integrate into faster, simpler languages. Further Reading: Compiling SIMULA: a historical study of technological genesis, The History of Simula. Pascal Background: Niklaus Wirth, 1970. Made to capture the essence of ALGOL-60 after ALGOL-68 got waaaaaay too complicated for Wirth’s liking. It first got big as the “introduction to CS” language, and by the early 80’s was the second-most popular language on the Usenet job boards. Wirth considers the whole family (Pascal, Modula, and Oberon) as a single unified language concept. Significance: Pascal didn’t introduce any completely new ideas.
It was an intentionally conservative language that tried to pick the best parts of the past decade and provide them in a unified package. Pascal brought ALGOL syntax outside academia, so much so that ALGOL’s assignment syntax, :=, got called “Pascal style” instead. From this point on most language features that look like ALGOL were more likely inspired by Pascal than directly by ALGOL itself. While Pascal wasn’t very innovative, variants of it were. Wirth also pioneered the idea of “stepwise refinement” as a means of writing rigorous software. This eventually led to the Modulas, which popularized first-class software modules, and Euclid, the first formal verification language to see production use. Cause of Death: I’m calling a mulligan on this one. Unlike most of the other ones on this list, Pascal didn’t have major structural barriers or a sharp competitor. Sure, it competed with C, but it was still doing fine for a very long time. People usually attribute its decline to the Why Pascal is not my favorite language essay, but that’s too neat of an answer and history is a lot messier. Also, Delphi is still pretty high-ranked in the TIOBE and PYPL measurements, so it’s not exactly dead in the same way SIMULA is. An accurate analysis of the fall of Pascal would be longer than the rest of this essay. Further Reading: The Programming Language Pascal, Pascal and its Successors. CLU Background: Barbara Liskov, 1975. Liskov wanted to mess around with abstract data types. That’s it. That’s the whole reason for CLU. Significance: CLU might be the most influential language that nobody’s ever heard of. Iterators? CLU. Abstract data types? CLU. Generics? CLU. Checked exceptions? CLU. We didn’t adopt the same terminology, so it’s not 100% obvious it all comes from CLU, but still. Every language spec for the next decade would namedrop CLU. CLU did a lot. Cause of Death: CLU was a showcase language; Liskov wanted to get people to adopt her ideas, not her specific language.
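CLU's iterators in particular look strikingly like what Python later shipped as generators; a sketch of the idea:

```python
def pairs(items):
    """A CLU-style iterator: yields values one at a time to a consuming
    loop, suspending and resuming instead of building a whole list."""
    for i in range(len(items) - 1):
        yield items[i], items[i + 1]

result = list(pairs([1, 2, 3]))  # [(1, 2), (2, 3)]
```

The consuming loop drives the iterator, and the iterator's local state survives between yields, which is exactly the control structure CLU introduced.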
And they did: almost every language today owes something to CLU. As soon as she completed CLU she moved on to Argus, which was supposed to showcase her ideas on concurrency. That hasn’t seen nearly the same adoption, and there’s still a lot of stuff in it left to mine. Further reading: A History of CLU. ML Background: Robin Milner, 1976. Milner was building the LCF Prover, one of the first proof assistants. If you wrote a proof in the right format, LCF could check to see if it was correct or not. To assist in writing the proofs, Milner created a metalanguage based on sound mathematical formalisms, which at the time meant strict static types and higher-order functions. Eventually ML was standardized as Standard ML. Significance: ML is arguably the oldest “algebraic programming language”. There’s a lot of stuff we attribute to ML: algebraic data types, modules, typed functional programming. Surprisingly, it was not the first for a lot of these! The first ML was just designed to work with LCF and wasn’t a general purpose language, so it lacked a lot of these features. As people started making it more general they pulled ideas from other research languages and incorporated them into ML. One very important idea did start in ML, though: type inference. ML was the first statically-typed language where you didn’t have to write the types out, as the compiler would figure out the types for you. This paved the way for typed FP to escape academia and enter production use. ML also greatly influenced modern theorem provers. The “program” languages for Isabelle, CVC3, and Coq are ML-based. And a lot of type theory was based on ML, though in more recent years the Haskell branch of FP has become more popular. Cause of Death: ML had a lot of interesting features, but people paid attention to it for the type inference. At the time ML was still a special purpose language for the theorem provers.
SML came out the same year as Haskell, which was a much “purer” example of a typed FP language. Smalltalk Background: Alan Kay, 1972, 1976, and 1980. It’s sort of a moving target. Smalltalk-72 was the first, Smalltalk-76 introduced the idea of “object-oriented programming” to the wider world, and Smalltalk-80 was the one that saw widespread adoption. Smalltalk wasn’t the first language with objects but it was the first “object-oriented” one. The difference was that Simula had objects in addition to primitives like numbers and booleans, while in Smalltalk, booleans were also objects. I wrote a bit about this here if you want to learn more. Significance: We sometimes think that Smalltalk is “true” OOP and things like Java and Python aren’t “real” OOP, but that’s not true. OOP is a giant mess of many different influences, just like every other paradigm. But it was certainly the thing that popularized the idea. If you crack open any general theory OOP book from the mid-80’s or early 90’s, the examples will be in Smalltalk. Many will also translate their examples to C++, and a few will use another language, but everybody will use Smalltalk. Smalltalk also spread the idea of objects as shareable data, leading the way to CORBA, and it inspired the computational Actor model. Cause of Death: The common belief is that Smalltalk lost because people used C++ instead. But that’s untrue. Smalltalk did have some issues, specifically its difficulty interoperating with other tools and poor runtime performance. But even into the 1990’s Smalltalk was doing respectable business and many people assumed it would be a dominant business language. Then Java happened. Smalltalk wasn’t the only casualty of the “Javapocalypse”: Java also marginalized Eiffel, Ada95, and pretty much everything else in the OOP world. The interesting question isn’t “Why did Smalltalk die”, it’s “Why did C++ survive”. I think it’s because C++ had better C interop so it was easier to extend into legacy systems.
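The "booleans were also objects" distinction is easy to demonstrate in Python, which adopted the Smalltalk-style uniform object model:

```python
# In a Smalltalk-style object model, booleans and integers are full
# objects that respond to messages (methods), not bare primitives.
flag = True
n = 42

everything_is_object = isinstance(flag, object) and isinstance(n, object)
bits = n.bit_length()   # an integer answering a method call: 6
seven = (3).__add__(4)  # even + is a method call under the hood: 7
```

In a Simula-style split, `flag` and `n` would be primitives with no methods at all; here they dispatch methods like any other object.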
This is just a small sample of the important dead languages. I didn’t cover ALPHARD, ALTRAN, Argus, Automath, BCPL, COMTRAN, CPL, Eiffel, FLOW-MATIC, HOPE, Hypercard, ISWIM, JOVIAL, MacSyma, Mesa, Miranda, Multics Shell, PLANNER, SMP, Sketchpad, or SNOBOL. All of them contributed in their own way to the modern programming world. History is complicated. Most influential languages never went mainstream. Few people used any one of them. But each one inspired people, who inspired other people, so the DNA of these forgotten languages appears decades after they’re forgotten. But there are also untold languages that didn’t get their ideas out. The Encyclopaedia of Programming Languages lists over 8,000 programming languages. Many of them had ideas that never left their bubble. Consider how much we’d have lost if nobody had heard of SIMULA, or Liskov never shared CLU. That’s one reason I love studying history. To learn what we’ve lost and find it again. The first draft of this was originally shared on my newsletter. If you found this interesting, why not subscribe?
4768
dbpedia
1
6
https://en.wikipedia.org/wiki/ALGOL
en
Wikipedia
https://upload.wikimedia…t_al_-_cover.jpg
https://upload.wikimedia…t_al_-_cover.jpg
[ "https://en.wikipedia.org/static/images/icons/wikipedia.png", "https://en.wikipedia.org/static/images/mobile/copyright/wikipedia-wordmark-en.svg", "https://en.wikipedia.org/static/images/mobile/copyright/wikipedia-tagline-en.svg", "https://upload.wikimedia.org/wikipedia/commons/thumb/c/c8/1965_ALGOL-20_A_Language_Manual%2C_Fierst_et_al_-_cover.jpg/121px-1965_ALGOL-20_A_Language_Manual%2C_Fierst_et_al_-_cover.jpg", "https://upload.wikimedia.org/wikipedia/en/thumb/9/99/Question_book-new.svg/50px-Question_book-new.svg.png", "https://upload.wikimedia.org/wikipedia/commons/thumb/8/84/Algol%26Fortran_family-by-Borkowski.svg/220px-Algol%26Fortran_family-by-Borkowski.svg.png", "https://upload.wikimedia.org/wikipedia/en/thumb/f/f2/Edit-clear.svg/40px-Edit-clear.svg.png", "https://upload.wikimedia.org/wikipedia/en/thumb/b/ba/Flag_of_Germany.svg/23px-Flag_of_Germany.svg.png", "https://upload.wikimedia.org/wikipedia/commons/thumb/2/20/Flag_of_the_Netherlands.svg/23px-Flag_of_the_Netherlands.svg.png", "https://upload.wikimedia.org/wikipedia/en/thumb/a/ae/Flag_of_the_United_Kingdom.svg/23px-Flag_of_the_United_Kingdom.svg.png", "https://upload.wikimedia.org/wikipedia/en/thumb/a/a4/Flag_of_the_United_States.svg/23px-Flag_of_the_United_States.svg.png", "https://upload.wikimedia.org/wikipedia/en/thumb/a/a4/Flag_of_the_United_States.svg/23px-Flag_of_the_United_States.svg.png", "https://upload.wikimedia.org/wikipedia/en/thumb/a/a4/Flag_of_the_United_States.svg/23px-Flag_of_the_United_States.svg.png", "https://upload.wikimedia.org/wikipedia/en/thumb/a/a4/Flag_of_the_United_States.svg/23px-Flag_of_the_United_States.svg.png", "https://upload.wikimedia.org/wikipedia/commons/thumb/9/9c/Flag_of_Denmark.svg/20px-Flag_of_Denmark.svg.png", "https://upload.wikimedia.org/wikipedia/en/thumb/a/a4/Flag_of_the_United_States.svg/23px-Flag_of_the_United_States.svg.png", "https://upload.wikimedia.org/wikipedia/en/thumb/0/03/Flag_of_Italy.svg/23px-Flag_of_Italy.svg.png", 
"https://upload.wikimedia.org/wikipedia/en/thumb/c/c3/Flag_of_France.svg/23px-Flag_of_France.svg.png", "https://upload.wikimedia.org/wikipedia/commons/thumb/2/20/Flag_of_the_Netherlands.svg/23px-Flag_of_the_Netherlands.svg.png", "https://upload.wikimedia.org/wikipedia/en/thumb/a/ae/Flag_of_the_United_Kingdom.svg/23px-Flag_of_the_United_Kingdom.svg.png", "https://upload.wikimedia.org/wikipedia/en/thumb/a/a4/Flag_of_the_United_States.svg/23px-Flag_of_the_United_States.svg.png", "https://upload.wikimedia.org/wikipedia/en/thumb/a/ae/Flag_of_the_United_Kingdom.svg/23px-Flag_of_the_United_Kingdom.svg.png", "https://upload.wikimedia.org/wikipedia/commons/thumb/d/d9/Flag_of_Norway.svg/21px-Flag_of_Norway.svg.png", "https://upload.wikimedia.org/wikipedia/commons/thumb/a/a9/Flag_of_the_Soviet_Union.svg/23px-Flag_of_the_Soviet_Union.svg.png", "https://upload.wikimedia.org/wikipedia/en/thumb/a/a4/Flag_of_the_United_States.svg/23px-Flag_of_the_United_States.svg.png", "https://upload.wikimedia.org/wikipedia/commons/thumb/3/3d/Flag_of_the_Estonian_Soviet_Socialist_Republic_%281953%E2%80%931990%29.svg/23px-Flag_of_the_Estonian_Soviet_Socialist_Republic_%281953%E2%80%931990%29.svg.png", "https://upload.wikimedia.org/wikipedia/en/thumb/1/12/Flag_of_Poland.svg/23px-Flag_of_Poland.svg.png", "https://upload.wikimedia.org/wikipedia/commons/thumb/d/d9/Flag_of_Norway.svg/21px-Flag_of_Norway.svg.png", "https://upload.wikimedia.org/wikipedia/en/thumb/b/ba/Flag_of_Germany.svg/23px-Flag_of_Germany.svg.png", "https://upload.wikimedia.org/wikipedia/commons/thumb/f/fa/Flag_of_the_People%27s_Republic_of_China.svg/23px-Flag_of_the_People%27s_Republic_of_China.svg.png", "https://upload.wikimedia.org/wikipedia/en/thumb/a/a4/Flag_of_the_United_States.svg/23px-Flag_of_the_United_States.svg.png", "https://upload.wikimedia.org/wikipedia/en/thumb/a/ae/Flag_of_the_United_Kingdom.svg/23px-Flag_of_the_United_Kingdom.svg.png", 
"https://upload.wikimedia.org/wikipedia/en/thumb/9/99/Question_book-new.svg/50px-Question_book-new.svg.png", "https://upload.wikimedia.org/wikipedia/commons/thumb/1/1c/Wiki_letter_w_cropped.svg/44px-Wiki_letter_w_cropped.svg.png", "https://upload.wikimedia.org/wikipedia/en/thumb/9/96/Symbol_category_class.svg/16px-Symbol_category_class.svg.png", "https://upload.wikimedia.org/wikipedia/en/thumb/9/96/Symbol_category_class.svg/16px-Symbol_category_class.svg.png", "https://upload.wikimedia.org/wikipedia/en/thumb/8/8a/OOjs_UI_icon_edit-ltr-progressive.svg/10px-OOjs_UI_icon_edit-ltr-progressive.svg.png", "https://login.wikimedia.org/wiki/Special:CentralAutoLogin/start?type=1x1", "https://en.wikipedia.org/static/images/footer/wikimedia-button.svg", "https://en.wikipedia.org/static/images/footer/poweredby_mediawiki.svg" ]
[]
[]
[ "" ]
null
[ "Contributors to Wikimedia projects" ]
2001-08-26T23:02:19+00:00
en
/static/apple-touch/wikipedia.png
https://en.wikipedia.org/wiki/ALGOL
Family of programming languages. This article is about the programming language family; for other uses, see Algol (disambiguation). Infobox: Paradigm: procedural, imperative, structured. Family: ALGOL. Designed by: Bauer, Bottenbruch, Rutishauser, Samelson, Backus, Katz, Perlis, Wegstein, Naur, Vauquois, van Wijngaarden, Woodger, Green, McCarthy. First appeared: 1958. Typing discipline: static, strong. Scope: lexical. Influenced: most subsequent imperative languages (including so-called ALGOL-like languages), e.g. PL/I, Simula, Pascal, C and Scheme. ALGOL (short for "Algorithmic Language")[1] is a family of imperative computer programming languages originally developed in 1958. ALGOL heavily influenced many other languages and was the standard method for algorithm description used by the Association for Computing Machinery (ACM) in textbooks and academic sources for more than thirty years.[2] In the sense that the syntax of most modern languages is "Algol-like",[3] it was arguably more influential than three other high-level programming languages among which it was roughly contemporary: FORTRAN, Lisp, and COBOL.[4] It was designed to avoid some of the perceived problems with FORTRAN and eventually gave rise to many other programming languages, including PL/I, Simula, BCPL, B, Pascal, Ada, and C. ALGOL introduced code blocks and the begin...end pairs for delimiting them. It was also the first language implementing nested function definitions with lexical scope. Moreover, it was the first programming language which gave detailed attention to formal language definition and through the Algol 60 Report introduced Backus–Naur form, a principal formal grammar notation for language design. There were three major specifications, named after the years they were first published: ALGOL 58 – originally proposed to be called IAL, for International Algebraic Language. ALGOL 60 – first implemented as X1 ALGOL 60 in 1961.
Revised 1963.[5][6][7] ALGOL 68 – introduced new elements including flexible arrays, slices, parallelism, operator identification. Revised 1973.[8] ALGOL 68 is substantially different from ALGOL 60 and was not well received, so reference to "Algol" is generally understood to mean ALGOL 60 and its dialects. History ALGOL was developed jointly by a committee of European and American computer scientists in a meeting in 1958 at the Swiss Federal Institute of Technology in Zurich (cf. ALGOL 58).[9] It specified three different syntaxes: a reference syntax, a publication syntax, and an implementation syntax, syntaxes that permitted it to use different keyword names and conventions for decimal points (commas vs periods) for different languages. ALGOL was used mostly by research computer scientists in the United States and in Europe; commercial applications were hindered by the absence of standard input/output facilities in its description, and the lack of interest in the language by large computer vendors (other than Burroughs Corporation). ALGOL 60 did however become the standard for the publication of algorithms and had a profound effect on future language development. John Backus developed the Backus normal form method of describing programming languages specifically for ALGOL 58. It was revised and expanded by Peter Naur for ALGOL 60, and at Donald Knuth's suggestion renamed Backus–Naur form.[10] Peter Naur: "As editor of the ALGOL Bulletin I was drawn into the international discussions of the language and was selected to be member of the European language design group in November 1959.
In this capacity I was the editor of the ALGOL 60 report, produced as the result of the ALGOL 60 meeting in Paris in January 1960."[11] The following people attended the meeting in Paris (from 1 to 16 January): Friedrich Ludwig Bauer, Peter Naur, Heinz Rutishauser, Klaus Samelson, Bernard Vauquois, Adriaan van Wijngaarden, and Michael Woodger (from Europe); John Warner Backus, Julien Green, Charles Katz, John McCarthy, Alan Jay Perlis, and Joseph Henry Wegstein (from the US). Alan Perlis gave a vivid description of the meeting: "The meetings were exhausting, interminable, and exhilarating. One became aggravated when one's good ideas were discarded along with the bad ones of others. Nevertheless, diligence persisted during the entire period. The chemistry of the 13 was excellent." ALGOL 60 inspired many languages that followed it. Tony Hoare remarked: "Here is a language so far ahead of its time that it was not only an improvement on its predecessors but also on nearly all its successors."[12] The Scheme programming language, a variant of Lisp that adopted the block structure and lexical scope of ALGOL, also adopted the wording "Revised Report on the Algorithmic Language Scheme" for its standards documents in homage to ALGOL.[13] ALGOL and programming language research As Peter Landin noted, ALGOL was the first language to combine seamlessly imperative effects with the (call-by-name) lambda calculus. Perhaps the most elegant formulation of the language is due to John C. Reynolds, and it best exhibits its syntactic and semantic purity.
Reynolds's idealized ALGOL also made a convincing methodologic argument regarding the suitability of local effects in the context of call-by-name languages, in contrast with the global effects used by call-by-value languages such as ML. The conceptual integrity of the language made it one of the main objects of semantic research, along with Programming Computable Functions (PCF) and ML. IAL implementations timeline. To date there have been at least 70 augmentations, extensions, derivations and sublanguages of Algol 60.[14] The entries below give name (year); author; country; description; target CPU:
- ZMMD-implementation (1958); Friedrich L. Bauer, Heinz Rutishauser, Klaus Samelson, Hermann Bottenbruch; Germany; implementation of ALGOL 58; Z22 (later Zuse's Z23 was delivered with an Algol 60 compiler)[15]
- X1 ALGOL 60 (August 1960[16]); Edsger W. Dijkstra and Jaap A. Zonneveld; Netherlands; first implementation of ALGOL 60;[17] Electrologica X1
- Elliott ALGOL (1960s); C. A. R. Hoare; UK; subject of the 1980 Turing Award Lecture;[18] Elliott 803, Elliott 503, Elliott 4100 series
- JOVIAL (1960); Jules Schwartz; US; a DOD HOL prior to Ada; various (see article)
- Burroughs Algol, several variants (1961); Burroughs Corporation (with participation by Hoare, Dijkstra, and others); US; basis of the Burroughs (and now Unisys MCP based) computers; Burroughs Large Systems and their midrange also
- Case ALGOL (1961); Case Institute of Technology;[19] US; Simula was originally contracted as a simulation extension of the Case ALGOL; UNIVAC 1107
- GOGOL (1961); William M. McKeeman; US; for ODIN time-sharing system;[20] PDP-1
- RegneCentralen ALGOL (1961); Peter Naur, Jørn Jensen; Denmark; implementation of full Algol 60; DASK at Regnecentralen
- Dartmouth ALGOL 30 (1962); Thomas Eugene Kurtz et al.; US; LGP-30
- USS 90 Algol (1962); L. Petrone; Italy
- ALGOL 60 (1962); Bernard Vauquois, Louis Bolliet;[21] France; Institut d'Informatique et Mathématiques Appliquées de Grenoble (IMAG) and Compagnie des Machines Bull; Bull Gamma 60
- Algol Translator (1962); G. van der Mey and W.L. van der Poel; Netherlands; Staatsbedrijf der Posterijen, Telegrafie en Telefonie; ZEBRA
- Kidsgrove Algol (1963); F. G. Duncan; UK; English Electric Company KDF9
- VALGOL (1963); Val Schorre; US; a test of the META II compiler compiler
- Whetstone (1964); Brian Randell and L. J. Russell; UK; Atomic Power Division of English Electric Company, precursor to Ferranti Pegasus, National Physical Laboratories ACE and English Electric DEUCE implementations; English Electric Company KDF9
- NU ALGOL (1965); Norway; UNIVAC
- ALGEK (1965); Soviet Union; АЛГЭК, based on ALGOL-60 and COBOL support, for economical tasks; Minsk-22
- ALGOL W (1966); Niklaus Wirth; US; proposed successor to ALGOL 60; IBM System/360
- MALGOL (1966); publ. A. Viil, M. Kotli & M. Rakhendi; Estonian SSR; Minsk-22
- ALGAMS (1967); GAMS group (ГАМС, группа автоматизации программирования для машин среднего класса), cooperation of Comecon Academies of Science; Comecon; Minsk-22, later ES EVM, BESM
- ALGOL/ZAM (1967); Poland; Polish ZAM computer
- Simula 67 (1967); Ole-Johan Dahl and Kristen Nygaard; Norway; Algol 60 with classes; UNIVAC 1107
- Triplex-ALGOL Karlsruhe (1967/1968); Karlsruhe, Germany; ALGOL 60 (1963) with triplex numbers for interval arithmetic[22]
- Chinese Algol (1972); China; Chinese characters, expressed via the Symbol system
- DG/L (1972); US; DG Eclipse family of Computers
- S-algol (1979); Ron Morrison; UK; addition of orthogonal datatypes with intended use as a teaching language; PDP-11 with a subsequent implementation on the Java VM
The Burroughs dialects included special Bootstrapping dialects such as ESPOL and NEWP. The latter is still used for Unisys MCP system software. Properties. ALGOL 60 as officially defined had no I/O facilities; implementations defined their own in ways that were rarely compatible with each other.
In contrast, ALGOL 68 offered an extensive library of transput (input/output) facilities.

ALGOL 60 allowed for two evaluation strategies for parameter passing: the common call-by-value, and call-by-name. Call-by-name has certain effects in contrast to call-by-reference. For example, without specifying the parameters as value or reference, it is impossible to develop a procedure that will swap the values of two parameters if the actual parameters that are passed in are an integer variable and an array that is indexed by that same integer variable.[23] Consider a call swap(i, A[i]): under call-by-name, each actual parameter is re-evaluated every time it is referenced inside the procedure, so once the procedure assigns a new value to i, a later reference to A[i] uses the new subscript. If i := 1 and A[i] := 2, each invocation of swap therefore produces the other combination of the values ([1,2], [2,1], [1,2] and so on) rather than a reliable exchange. A similar situation occurs when a random function is passed as an actual argument. Call-by-name is known by many compiler designers for the interesting "thunks" that are used to implement it. Donald Knuth devised the "man or boy test" to separate compilers that correctly implemented "recursion and non-local references." This test contains an example of call-by-name.

ALGOL 68 was defined using a two-level grammar formalism invented by Adriaan van Wijngaarden, which bears his name. Van Wijngaarden grammars use a context-free grammar to generate an infinite set of productions that will recognize a particular ALGOL 68 program; notably, they are able to express the kind of requirements that in many other programming language standards are labelled "semantics" and have to be expressed in ambiguity-prone natural language prose, and then implemented in compilers as ad hoc code attached to the formal language parser.

Examples and portability

Code sample comparisons

ALGOL 60

(The way the bold text has to be written depends on the implementation, e.g. 'INTEGER'—quotation marks included—for integer. This is known as stropping.)
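The swap(i, A[i]) pitfall and the thunk mechanism described above can be simulated in modern Python (a sketch, not ALGOL: the names swap, arr and man_or_boy are illustrative, and each call-by-name parameter is modelled as a (get, set) pair of closures that re-evaluate the argument expression on every use):

```python
import sys

def swap(a, b):
    # Textbook swap: t := a; a := b; b := t.
    # a and b are (get, set) thunk pairs, re-evaluated on every use.
    t = a[0]()        # t := a
    a[1](b[0]())      # a := b
    b[1](t)           # b := t  -- but b's subscript is re-evaluated!

i = [1]               # boxed integer so the thunks can rebind it
arr = {1: 2, 2: 99}

# swap(i, A[i]) under call-by-name: after i is assigned, the
# expression A[i] denotes a *different* element, so the write
# meant for arr[1] lands in arr[2] and the exchange fails.
swap((lambda: i[0],      lambda v: i.__setitem__(0, v)),
     (lambda: arr[i[0]], lambda v: arr.__setitem__(i[0], v)))
# now i == [2] and arr == {1: 2, 2: 1}: arr[1] was never updated

# Knuth's man-or-boy test leans on the same machinery: B is a
# thunk that decrements the k shared with its defining call of A.
def man_or_boy(k0):
    sys.setrecursionlimit(10000)   # the test recurses heavily
    def A(k, x1, x2, x3, x4, x5):
        def B():
            nonlocal k             # non-local reference, as in ALGOL
            k -= 1
            return A(k, B, x1, x2, x3, x4)
        return x4() + x5() if k <= 0 else B()
    return A(k0, lambda: 1, lambda: -1, lambda: -1, lambda: 1, lambda: 0)
```

An implementation that handles non-local references and thunks correctly computes man_or_boy(10) as -67, the generally accepted value for k = 10.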
procedure Absmax(a) Size:(n, m) Result:(y) Subscripts:(i, k);
    value n, m; array a; integer n, m, i, k; real y;
comment The absolute greatest element of the matrix a, of size n by m,
    is copied to y, and the subscripts of this element to i and k;
begin
    integer p, q;
    y := 0; i := k := 1;
    for p := 1 step 1 until n do
        for q := 1 step 1 until m do
            if abs(a[p, q]) > y then
                begin y := abs(a[p, q]);
                    i := p; k := q
                end
end Absmax

Here is an example of how to produce a table using Elliott 803 ALGOL.[24]

FLOATING POINT ALGOL TEST'
BEGIN REAL A,B,C,D'
READ D'
FOR A:= 0.0 STEP D UNTIL 6.3 DO
BEGIN
    PRINT PUNCH(3),££L??'
    B := SIN(A)'
    C := COS(A)'
    PRINT PUNCH(3),SAMELINE,ALIGNED(1,6),A,B,C'
END
END'

ALGOL 68

The following code samples are ALGOL 68 versions of the above ALGOL 60 code samples. ALGOL 68 implementations used ALGOL 60's approaches to stropping. In ALGOL 68's case tokens with the bold typeface are reserved words, types (modes) or operators.

proc abs max = ([,]real a, ref real y, ref int i, k)real:
comment The absolute greatest element of the matrix a, of size ⌈a by 2⌈a
    is transferred to y, and the subscripts of this element to i and k; comment
begin
    real y := 0; i := ⌊a; k := 2⌊a;
    for p from ⌊a to ⌈a do
        for q from 2⌊a to 2⌈a do
            if abs a[p, q] > y then y := abs a[p, q]; i := p; k := q fi
        od
    od;
    y
end # abs max #

Note: lower (⌊) and upper (⌈) bounds of an array, and array slicing, are directly available to the programmer.

floating point algol68 test: (
    real a,b,c,d;
    # printf – sends output to the file stand out. #
    # printf($p$); – selects a new page #
    printf(($pg$,"Enter d:"));
    read(d);
    for step from 0 while a:=step*d; a <= 2*pi do
        printf($l$); # $l$ – selects a new line. #
        b := sin(a);
        c := cos(a);
        printf(($z-d.6d$,a,b,c)) # formats output with 1 digit before and 6 after the decimal point. #
    od
)

Timeline: Hello world

The variations and lack of portability of the programs from one implementation to another is easily demonstrated by the classic hello world program.

ALGOL 58 (IAL)

ALGOL 58 had no I/O facilities.

ALGOL 60 family

Since ALGOL 60 had no I/O facilities, there is no portable hello world program in ALGOL. The next three examples are in Burroughs Extended Algol. The first two direct output at the interactive terminal they are run on. The first uses a character array, similar to C. The language allows the array identifier to be used as a pointer to the array, and hence in a REPLACE statement.

BEGIN
    FILE F(KIND=REMOTE);
    EBCDIC ARRAY E[0:11];
    REPLACE E BY "HELLO WORLD!";
    WRITE(F, *, E);
END.

A simpler program using an inline format:

BEGIN
    FILE F(KIND=REMOTE);
    WRITE(F, <"HELLO WORLD!">);
END.

An even simpler program using the Display statement. Note that its output would end up at the system console ('SPO'):

BEGIN DISPLAY("HELLO WORLD!") END.

An alternative example, using Elliott Algol I/O, is as follows. Elliott Algol used different characters for "open-string-quote" and "close-string-quote", represented here by ‘ and ’.

program HiFolks;
begin
    print ‘Hello world’
end;

Below is a version from Elliott 803 Algol (A104). The standard Elliott 803 used five-hole paper tape and thus only had upper case. The code lacked any quote characters, so £ (UK pound sign) was used for open quote and ? (question mark) for close quote. Special sequences were placed in double quotes (e.g. ££L?? produced a new line on the teleprinter).

HIFOLKS'
BEGIN
    PRINT £HELLO WORLD£L??'
END'

The ICT 1900 series Algol I/O version allowed input from paper tape or punched card. Paper tape 'full' mode allowed lower case. Output was to a line printer.
The open and close quote characters were represented using '(' and ')' and spaces by %.[25]

'BEGIN'
    WRITE TEXT('('HELLO%WORLD')');
'END'

ALGOL 68

ALGOL 68 code was published with reserved words typically in lowercase, but bolded or underlined.

begin
    printf(($gl$,"Hello, world!"))
end

In the language of the "Algol 68 Report" the input/output facilities were collectively called the "Transput".

Timeline of ALGOL special characters

The ALGOLs were conceived at a time when character sets were diverse and evolving rapidly; also, the ALGOLs were defined so that only uppercase letters were required.

1960: IFIP – The Algol 60 language and report included several mathematical symbols which are available on modern computers and operating systems, but which were unsupported on most computing systems at the time, for instance: ×, ÷, ≤, ≥, ≠, ¬, ∨, ∧, ⊂, ≡, ␣ and ⏨.

1961 September: ASCII – The ASCII character set, then in an early stage of development, had the \ (backslash) character added to it in order to support ALGOL's Boolean operators /\ and \/.[26]

1962: ALCOR – This character set included the unusual "᛭" runic cross[27] character for multiplication and the "⏨" decimal exponent symbol[28] for floating point notation.[29][30][31]

1964: GOST – The 1964 Soviet standard GOST 10859 allowed the encoding of 4-bit, 5-bit, 6-bit and 7-bit characters in ALGOL.[32]

1968: The "Algol 68 Report" – used extant ALGOL characters, and further adopted the →, ↓, ↑, □, ⌊, ⌈, ⎩, ⎧, ○, ⊥, and ¢ characters, which can be found on the IBM 2741 keyboard with typeball (or golf ball) print heads inserted (such as the APL golf ball). These became available in the mid-1960s while ALGOL 68 was being drafted. The report was translated into Russian, German, French, and Bulgarian, and allowed programming in languages with larger character sets, e.g., the Cyrillic alphabet of the Soviet BESM-4.
All ALGOL's characters are also part of the Unicode standard and most of them are available in several popular fonts.

2009 October: Unicode – The ⏨ (decimal exponent symbol) for floating point notation was added to Unicode 5.2 for backward compatibility with historic Buran programme ALGOL software.[33]

Legacy

A significant contribution of the ALGOL 58 Report was to provide standard terms for programming concepts: statement, declaration, type, label, primary, block, and others.[34]

Further reading

O'Hearn, Peter; Tennent, Robert D., eds. (1997). Algol-like Languages. Vol. 1. Cambridge, MA: Birkhauser. doi:10.1007/978-1-4612-4118-8. ISBN 9780817638801. The first volume of a two-volume set, including an introduction from Peter W. O'Hearn and Robert D. Tennent, a report on Algol 60 from Peter Naur and colleagues, four chapters from John C. Reynolds, and further chapters from Christopher Strachey, Matthias Felleisen, Stephen Weeks, Albert R. Meyer, Kurt Sieber, Vipin Swarup, Uday S. Reddy, and Evan Ireland.

Baumann, Richard; Feliciano, Manuel; Bauer, Friedrich Ludwig; Samelson, Klaus (1964). Introduction to ALGOL: A Primer for the Non-Specialist, Emphasizing the Practical Uses of the Algorithmic Language. Englewood Cliffs, NJ: Prentice-Hall. ISBN 0134778286. LCCN 64-10740.

Randell, Brian; Russell, L. J. (1964). ALGOL 60 Implementation: The Translation and Use of ALGOL 60 Programs on a Computer. Academic Press. On the design of the Whetstone compiler, and one of the early published descriptions of implementing a compiler. For related papers, see Brian Randell's "Whetstone Algol Revisited" and "The Whetstone KDF9 Algol Translator".

Dijkstra, E. W. (1961). ALGOL 60 Translation: An ALGOL 60 Translator for the X1 and Making a Translator for ALGOL 60. Report MR 35/61. Amsterdam: Mathematisch Centrum.

Kruseman Aretz, Frans E. J. The Dijkstra–Zonneveld ALGOL 60 Compiler for the Electrologica X1. Historical note SEN, 2. Amsterdam: Centrum voor Wiskunde en Informatica.

Goos, Gerhard (2017). Geschichte der Deutschsprachigen Informatik: Programmiersprachen und Übersetzerbau [History of Informatics in German-Speaking Countries: Programming Languages and Compiler Design] (in German). Karlsruhe: Fakultät für Informatik, Karlsruhe Institute of Technology.
Source: Programming Language Design and Implementation Stack Exchange, "Why do programming languages use the asterisk * for multiplication?", asked by Hashim Aziz on 20 September 2023.
https://langdev.stackexchange.com/questions/2935/why-do-programming-languages-use-the-asterisk-for-multiplication

The question: "Having had very little math(s) education I'm trying to bring myself up to speed for university, which currently involves teaching myself the times tables, where X is frequently used as the operator..."
Maybe FORTRAN, maybe MATH-MATIC

FORTRAN's Preliminary Report in 1954 anticipated using the × symbol for multiplication (and ×× for exponentiation). The 1956 Programmer's Reference Manual, which is the first manual for a working version, notes that the multiplication symbol is *. In the 1978 HOPL-I history of FORTRAN, John Backus claimed that the team were not influenced by any other languages as far as design goes, and specifically notes MATH-MATIC as a non-influence. Some syntax changes were made due to the 48-character set limit on punched cards (even the < symbol wasn't available, hence .LT.), but the team felt that language design was the easy part, and focused their efforts on compilation. So we can say that some time between November 1954 and October 1956, FORTRAN adopted * as the multiplication symbol, and they claimed not to be influenced by any other work.

So that's what was happening at IBM. Back over at Remington Rand, Grace Hopper's A-2 language, released around 1951–2 for the UNIVAC I, used a text mnemonic for multiplication. The next iteration, A-3, was renamed by Remington Rand as ARITH-MATIC, which used essentially the same mnemonic. However, the arithmetic translator layer AT-3, named MATH-MATIC, was developed around the same time, in 1955. The preliminary manual was published in April 1957 and shows the * symbol for multiplication.

So it appears that FORTRAN and MATH-MATIC settled on this syntax independently. The implementations happened concurrently. FORTRAN shipped first, with AT-3 shipping less than a year later. So while FORTRAN certainly shipped *-as-multiplication first, which one of the two designed it first and which one implemented it first is probably lost to history.

The Hopper family of languages became COBOL in 1959, and between them, FORTRAN and COBOL heavily influenced the arithmetic syntax of all subsequent languages.
I believe the first language to use * for multiplication was FORTRAN, in its original specification in 1956 (with earlier drafts). None of the other symbols you suggest except the full stop were available in the character set in use on the machines, which predated the high-level language design and was taken as a given. It's only a relatively recent development that these other symbols are all in sufficiently widespread encodings together that choosing between them is an option.

However, in 1954 the FORTRAN authors were considering that a programmer would like to write $\sum a_{ij} \cdot b_{jk}$ instead of the fairly involved set of instructions corresponding to this expression, as part of the work leading into FORTRAN. Clearly none of this made it into the language in that form: the symbols and subscripting just weren't feasible on the machines of the time, but X + Y, which was also considered, did make it in, along with X * Y. This was all for a punch-card system on the IBM 704, which used BCD encoding with 48 total code points. In some respects the assignment of symbols to codes there is arbitrary, but encodings did correspond to real physical hardware for printing and input. * is one of the few available characters without a more obvious mathematical meaning and has some resemblance to both × and ·, so it's not an unreasonable choice. Subsequent languages have inherited this convention, often via ALGOL or BASIC.

Similarly, "keyboard" symbols have a significant advantage over others. * was present on ordinary typewriters as a marker, and no standard layout has either of the others now. It also made its way into ASCII for the same reason. Much of this is historical contingency, rather than an overarching design, but once the symbol is in use then there is also value in consistency, which reinforces making it available, which reinforces using it.

References:
Applied Science Division and Programming Research Dept, International Business Machines Corporation (1956). FORTRAN Automatic Coding System for the IBM 704: Programmer's Reference Manual.
Backus, John W.; Herrick, Harlan (1954). IBM 701 Speedcoding and Other Automatic Programming Systems. In Proceedings of the Symposium on Automatic Programming for Digital Computers.

The fundamental reason is that neither of the true multiplication symbols × and ⋅ were (or are) common on US keyboards. Using symbols that are not available on the keyboard is a far bigger problem than learning a new symbol for multiplication: after the trivial learning bump of associating * with multiplication it has no real cost, whereas needing characters off the keyboard, or using a compound symbol (e.g. .x.), is a permanent slowdown.

The alternatives of x and . have major problems of their own. Using x would have caused parsing problems and stopped x being used as a variable. Using . would have been even worse, as it clashes with the use of a full stop as the decimal separator that is normal among English speakers. There could, presumably, have been other symbols used, but * is the one closest in appearance to ×, and most other symbols already had established meanings that would have prevented their use.

FORTRAN seems to have been the first language to adopt the convention, in 1956, with the manual simply stating "The symbol * denotes multiplication" without explanation. From there it seems to have become ubiquitous, but given how other languages have changed almost every convention in one way or another, the origin in FORTRAN does not seem to me to be a particularly important justification for the usage. I think the reasons given at the start of this answer are more important than historical accident: ultimately * is simply the only good solution for English speakers. Perhaps had German speakers driven the development of programming languages we'd all use , as the decimal separator and . for multiplication.
FORTRAN I

Based on some quick research, it appears that FORTRAN I was the first language to use * for multiplication. You can see a sample of FORTRAN I code here, which does appear to use *:[1]

READ 1, N, (A(1), 1 * l,N)
...
DO 20, I * 2,N

(copied out the best I can read). There are also several examples around of FORTRAN II using * for multiplication,[2] so it's really more a question of whether FORTRAN I or II did it first. You'd have to go back and read the original FORTRAN tech reports or similar to figure out why they used *, but I'm assuming it's exactly because it looks similar to an interpunct. As an anecdote, this would rather amusingly indicate that using * for multiplication is quite literally as old as programming languages.

[1] http://www.paulgraham.com/history.html
[2] https://github.com/scivision/fortran-II-examples/blob/main/funcs58.f e.g.: term = -prevxp / presxp * y * term
Source: Encyclopedia Britannica, "Influential Computer Programming Languages" by Erik Gregersen ("A short history of several prominent computer programming languages.").
https://www.britannica.com/list/influential-computer-programming-languages
Source: Unacademy, "Notes on ALGOL", 19 May 2022 ("Complete summary of ALGOL. Features and facts of ALGOL. Brief info on ALGOL 58, ALGOL 60 and ALGOL 68; their implementations.").
https://unacademy.com/content/bank-exam/study-material/computer-knowledge/algol/
ALGOL is a computer programming language that was created in 1958–60 by an international committee of the Association for Computing Machinery (ACM), chaired by Alan J. Perlis of Carnegie Mellon University, for publishing algorithms and performing calculations. ALGOL, like LISP, included recursive subprograms: procedures that could be used to solve a problem by reducing it to a smaller problem of the same type. ALGOL stands for algorithmic language, and the language was previously known as IAL. It is a group of portable programming languages for scientific computations that influenced many other languages. For nearly 30 years, the ACM used ALGOL as the standard approach for presenting algorithms in textbooks and academia.

Features of ALGOL:

- BNF notation: Backus–Naur Form is a formal means for describing the syntax of a computer language. Think of it as shorthand for computer languages.
- Statement blocks: a syntactic means for grouping a series of programming statements together. Think of this like putting a group of related items in a box, and treating the box as a unit.
- Parameter passing: specifically, two methods for passing information to and from a subprogram, pass by value and pass by reference. Think of these as ways to protect information entering a subprogram from modification; one sends a copy, the other sends the actual data.
- Dynamic arrays: lists of values whose size can be determined at run-time. Think of these as flexible arrays, arrays that adjust to the need.
- Reserved words: predefined words that have special meaning to a computer language. Think of these like your name: it represents you and the things you can do.

Major ALGOL specifications:

ALGOL 58

ALGOL 58, formerly known as IAL, is a member of the ALGOL programming language family. An early compromise design, it was quickly superseded by ALGOL 60.
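The BNF notation listed among ALGOL's features can be illustrated with two productions in the style of the ALGOL 60 report (a sketch of the notation rather than an exact quotation from the report):

```
<digit> ::= 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9
<unsigned integer> ::= <digit> | <unsigned integer> <digit>
```

Each rule defines the symbol on the left as a choice between alternatives separated by |; the second rule is recursive, so an unsigned integer is either a single digit or an unsigned integer followed by one more digit.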
The core concept of the compound statement was introduced in ALGOL 58, but it was limited to control flow only, and it was not bound to identifier scope in the way Algol 60's blocks were. According to Perlis, the language was first offered as IAL (International Algebraic Language), but this name was rejected as an "'unspeakable' and pretentious acronym". IBM attempted to implement ALGOL 58, but the effort was soon abandoned because of rivalry with FORTRAN. BALGOL, a version for the Burroughs 220, evolved along similar lines but preserved much of ALGOL 58's original character. ALGOL 58's major contribution was to later languages; it served as the foundation for JOVIAL, MAD, NELIAC, and ALGO.

ALGOL 58's influence on ALGOL 60:

- Both IAL and ALGOL 60 support arrays with arbitrary lower and upper subscript bounds, and allow integer expressions as subscript bounds.
- Procedure declarations, and their associated identifier scopes, can be nested in both IAL and ALGOL 60.
- IAL allows numeric statement labels, whereas ALGOL 60 does not.
- A switch designator exists in both IAL and ALGOL 60, although it is unrelated to the switch statement in C and other languages.
- The IAL if-statement has neither a then-clause nor an else-clause; instead, it guards the following statement. IAL also has an if either-statement that allows several conditions to be tested cleanly. Both were superseded by ALGOL 60's if-then construct, with its ambiguous "dangling else".

ALGOL 60

ALGOL 60 (short for Algorithmic Language 1960) is a computer programming language from the ALGOL family. It came after ALGOL 58, which introduced code blocks and the begin and end pairs for delimiting them, marking a significant step forward in the development of structured programming. ALGOL 60 was the first language to provide nested function declarations with lexical scope. Many additional programming languages arose from it, including CPL, Simula, BCPL, B, Pascal, and C.
ALGOL 60 principles were used to program almost every computer of the time. For parameter passing, ALGOL 60 provides two evaluation strategies: call-by-value and call-by-name. The procedure declaration specified which was to be used for each formal parameter: value for call-by-value, omitted for call-by-name. Calling by name has certain implications, in contrast to call-by-reference.

ALGOL 68 implementations:

The Royal Radar Establishment in the United Kingdom presented ALGOL 68-R in July 1970 as the first implementation of the standard, based on the late-1968 draft Report. It covered, however, only a subset of the language. This version became widely used on ICL computers, particularly in military code in the United Kingdom. CDC Netherlands released the first complete implementation of the language in 1974, for the Control Data mainframe series.

There are three open-source Algol 68 implementations:

- a68g (Algol 68 Genie), written by Marcel van der Veer, licensed under the GPLv3
- algol68toc, an open-source port of ALGOL 68RS
- an experimental Algol 68 front end for GCC, written by Jose E. Marchesi

Conclusion:

ALGOL was the first computer programming language to provide nested function declarations with lexical scope. Furthermore, it was the first programming language to pay close attention to formal language definition, and through the Algol 60 Report it introduced Backus–Naur form, a key formal grammar notation for language design.
Source: Software Engineering Stack Exchange, "Why are statements in many programming languages terminated by semicolons?", asked by A Coder on 13 March 2012.
https://softwareengineering.stackexchange.com/questions/139482/why-are-statements-in-many-programming-languages-terminated-by-semicolons

The question: "Is there a reason that a semi-colon was chosen as a line terminator instead of a different symbol? I want to know the history behind this decision, and hope the answers will lead to insights that ..."
In English the semicolon is used to separate items in a list of statements, for example: She saw three men: Jamie, who came from New Zealand; John, the milkman's son; and George, a gaunt kind of man. When programming you are separating a number of statements, and using a full stop could easily be confused with a decimal point. Using the semicolon provides an easy-to-parse method of separating the individual program statements while remaining close to normal English punctuation. Edit to add: In the early days when memory was expensive, processing slow, and the first programming languages were being devised, there was a need to split the program up into separate statements for processing. Some languages required that each statement be placed on its own line so that the carriage return could act as the statement delimiter. Other languages allowed a more free-format text layout and so required a specific delimiter character. This character was chosen to be the semicolon, most likely because of the similarity to its use in the English language (this has to be a supposition; I was not there at the time) and because it did not conflict with the other punctuation marks and symbols that were required for mathematical or other syntactic purposes. Edit again: The need for some terminator character goes back to the requirements for parsing the language text. The early compilers were written in assembly language or, in some cases, directly in hand-crafted binary machine instructions. Having a special character that identifies the end of the statement and delimits the chunk of text being processed makes the processing that much easier. As I said above, other languages have used the carriage return or brackets. The Algol, Pascal, Ada, BCPL, B, C, PL/M, and other families of languages happen to use the semicolon. As to which one was first to use this particular character, I do not go back far enough in history to remember.
Its choice and adoption make perfect sense: its use mirrors normal English punctuation. Other characters (e.g. the full stop) could be confusing as they already have a common use (a full stop is also used as a decimal point). A visible punctuation character allows free-format code layout. Using a similar delimiter character in derivative or later languages builds upon the familiarity gained by all of the programmers that have used the earlier language. As a final remark, I think that there has been more time spent on these answers and comments than was spent in deciding to use the semicolon to end a statement when designing the first language that used it in this way. FORTRAN used carriage return to delimit statements. COBOL used the period. LISP didn't use anything, relying on parentheses for everything. ALGOL was the first language to use the semicolon to separate statements. PASCAL followed ALGOL's lead, using the semicolon to separate statements. PL/I used the semicolon to terminate statements. There is a difference, and it is easily seen in PASCAL. Ada followed PL/I's lead on this one item, rather than ALGOL's. The semicolon as statement separator or terminator was quickly accepted by the computer science community as a useful notation, and, as far as I know, every subsequent block-structured language followed ALGOL's lead and used the semicolon to separate or terminate statements. I was told many years ago that BCPL used both semicolon AND carriage return as statement separators/terminators, but I never used the language myself and am unable to verify this. At some point, the use of carriage return to separate or terminate statements was dropped from the BCPL descendants. BCPL begat B, B begat C, C begat C++, Java, D, and a whole host of things considerably less well thought out than PASCAL and Ada. Why not any other symbol? A few languages have used other symbols -- old versions of BASIC used a colon instead, for one example.
Ignoring the few exceptions, however, I think there are two primary reasons. The first is that you're simply looking for something unambiguous. In a typical parser, if you run into a serious enough error that you can't continue parsing the current statement, you normally try to get the parser back in sync by just skipping ahead to the statement terminator and restarting the parser from the beginning of the next statement. For that, you want something that won't normally occur anywhere else in the code, and a semicolon happens to be a symbol with little other meaning attached, so it's pretty easy to dedicate it to this purpose. The second reason is somewhat similar, but aimed more toward people reading/using the code. Again, it comes back to the fact that the actual symbol you use doesn't matter much. There's a substantial advantage in readability to be gained from using the symbol your reader is accustomed to seeing for a particular purpose, when and if possible. That doesn't mean that C is the one perfect syntax and everything else should follow it slavishly, but it does mean that enough people are familiar with that style of syntax that a vaguely similar language gains a lot (and loses very little) by following roughly the same syntax where it can. I'd note that this is much like designing almost any other program. If I write a program that uses windows of some sort, I'll try to just use the native features of the target platform(s). Many of the decisions this embodies will be largely arbitrary, and could be done differently without any major loss of functionality -- but equally, changing them without a substantial gain in functionality just confuses users without accomplishing anything useful. The same basic principles apply to "what should terminate (or separate) statements in a language?" as to "what should a scroll bar look like?" or "how should a tree control work?"
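The recovery strategy just described — on a parse error, skip ahead to the next semicolon and resume — can be sketched with a toy parser. Everything below (`parse_statements`, the tiny NAME '=' NUM ';' grammar) is a hypothetical illustration, not taken from any real compiler:

```python
# Toy sketch of panic-mode error recovery: on a parse error, skip tokens
# until just past the next ';' and resume at the following statement.

def parse_statements(tokens):
    """Very loosely 'parse' NAME '=' NUM ';' statements from a flat token
    list, resynchronizing on ';' after any malformed statement.
    Returns (parsed_statements, error_count)."""
    parsed, errors, i = [], 0, 0
    while i < len(tokens):
        stmt = tokens[i:i + 4]  # a well-formed statement is 4 tokens
        if len(stmt) == 4 and stmt[1] == '=' and stmt[2].isdigit() and stmt[3] == ';':
            parsed.append((stmt[0], int(stmt[2])))
            i += 4
        else:
            errors += 1
            # panic mode: the semicolon is the resync point
            while i < len(tokens) and tokens[i] != ';':
                i += 1
            i += 1  # step past the ';' itself
    return parsed, errors

# 'y = = ;' is malformed; the parser skips it and recovers cleanly at 'z'.
toks = ['x', '=', '1', ';', 'y', '=', '=', ';', 'z', '=', '3', ';']
print(parse_statements(toks))  # ([('x', 1), ('z', 3)], 1)
```

The point of the sketch is that recovery is cheap precisely because the semicolon rarely means anything else: the parser can always trust it as a statement boundary.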
In all these cases, the decision is mostly arbitrary, and uniformity provides a substantial benefit in and of itself. I'd add that much the same happens throughout many languages, just in ways that most of us are so accustomed to before programming that few people think about it. Why does everybody use "+" to indicate addition, or "-" to indicate subtraction? Because the shape of the symbol doesn't matter much, but everybody agreeing to apply the same meaning to each symbol matters a lot. Semicolon was originally proposed in Algol 60 as a statement separator, not a terminator. Prior to Algol 60, the only high-level programming language in existence was Fortran, which required each statement to be on a separate line. Statements spanning multiple lines, like do-loops, were considered an oddity, and they were regarded as 'statement blocks'. The designers of Algol 60 realized that statements needed a hierarchical structure (if-then-else, do-loops, case statements etc) and they could be nested inside each other. So, the idea of each statement sitting on a separate line didn't make sense any more. Sequential composition of statements of the form S1; S2; ...; Sn optionally enclosed in begin - end brackets were called compound statements, and fit into the hierarchical structure of statements envisaged by Algol 60. So, here, the semicolon is clearly a statement separator, not a terminator. This gave rise to problems in practice. Algol 60 also had an "empty statement" which was denoted by writing nothing. So, one could write "begin S1; end" where the semicolon appears as if it is terminating S1. But the Algol 60 compiler really treated it as a separator between S1 and an invisible empty statement following it. These subtleties were a bit much for practical programmers. Having been used to line-oriented languages like Assembly and Fortran, they really thought of semicolon as a terminator for statements. 
When programs were written out, the semicolon was usually put at the end of statements, like so: a[i] := 0; i := i+1 and the semicolon really looked like a terminator for the first statement. If programmers treated the semicolon as a terminator, then a statement like this would give a syntax error: if i > 0 then a[i] := 0; else a[i] := 1; because the semicolon terminates the "if" and, so, the "else" becomes dangling. Programmers were thoroughly confused. So, PL/I, which was IBM's successor to line-oriented Fortran, decided to make the semicolon a statement terminator rather than a separator. Programmers were happy with that choice. The majority of programming languages followed suit. (Pascal resisted the trend, but its successor Ada gave up on it.) [Note added: the Wikipedia article on programming language comparisons has a nice table summarizing how the semicolon is treated in various programming languages.] This is pretty much pure guesswork, but looking at a standard QWERTY keyboard restricted to ASCII values, the natural characters for termination/separation would be .!?,:; and carriage returns. Of those, !?: should be immediately disqualified for taking multiple keys, and statement termination is going to be a very common thing. Periods would be disqualified because they are easily confused with decimal points, which would make them unnecessarily complicated as a terminator given the limited space of early computers. Carriage returns would be disqualified once lines of code could be longer than what can be shown on a single line of the screen: reading a program would be harder if lines had to be scrolled horizontally, or additional characters would be needed to create a continuation on the next line, which again adds complexity.
This leaves , and ; as options. Of those, , is used much more often in ordinary writing than ;, so the semicolon was chosen: it is the easier to type, less confusing because it adds meaning to a character with limited existing meaning, and less complicated because special cases don't really arise with its use. The semicolon was chosen because it was the best character based on laziness and simplicity. It's largely an arbitrary choice. Some languages have made other choices. COBOL terminates statements with the . character. FORTRAN, BASIC, and Python generally terminate statements with newlines (with special syntax for multi-line statements). And Lisp brackets its statements with parentheses. The main reason ; is so popular as a statement separator/terminator is that most of today's popular languages are based on ALGOL, which used that convention. instead of a different symbol? What other symbol could you pick? The ASCII characters #$@[]^_`{|}~ weren't always present in early character encodings like ISO 646. The characters ()*+-/<=> are typically used as mathematical operators and would create parsing ambiguities if used as statement terminators. product = a * b * // If '*' were a statement terminator, c * d * // Are there two factors, or four? Similar problems would apply to ' and ", which are typically used as string delimiters; ,, which is typically used to separate function arguments; and ., which is typically used as a decimal point (or as a delimiter in constructs like some_struct.some_field). That leaves !%&:;?. Choosing ! or ? probably wouldn't cause technical difficulties, but their English meanings would give the wrong mood to the program. print(x)? # Yes, you should. # It's an IMPERATIVE language; stop questioning my commands. print(x)! # OK! You don't have to shout! The & would be a more sensible choice as a statement separator (not terminator), because do_thing_a() & do_thing_b() can be read as a command to do thing A and then do thing B.
But most languages with an & operator use it as a logical or bitwise AND instead. The % sign might cause confusion in statements like interest_rate = 2.99% (which would set the variable to 2.99 instead of the expected 0.0299). Of course, the well-known mathematical meaning of % didn't stop C from using it as the remainder operator. So that leaves : and ;. : is a sensible choice, and indeed is used as the intra-line statement separator in most dialects of BASIC. But ; has English grammar on its side; it can be used to separate clauses within a sentence. Rather than trying to answer your headline question, I think it's better to focus on your implicit question: I want to know the history behind this decision, and hope the answers will lead to insights that may influence future decisions in the design and implementation of programming languages. If you want to learn about programming language design and implementation history, and gain more insight into the process, then the proceedings of the History of Programming Languages Conferences are a very good place to start. (I think you will need an ACM membership to be able to access the proceedings though.) Why are statements in many programming languages terminated by semicolons? Is there a reason that a semi-colon was chosen as a line terminator instead of a different symbol? Taking your headline question as an example question that you might want to try to answer by reading the HOPL proceedings, I'd like to offer the following point: people designing a new programming language usually do so because they consider the ones they know to be broken / deficient somehow. Their new language is, on one hand, designed to fix this deficiency. On the other hand, language designers will also copy design elements from other languages that they think are good, or they simply don't change those elements that they didn't experience a problem with. 
Especially that last part is important: instead of trying to find out which programming language ever was the first one to use semicolons as terminators and why a lot of other programming languages copied that, you will probably learn more by looking at languages that did not copy it. For example, while Smalltalk took a lot of inspiration from Simula, it did not copy its syntax and in particular its use of semicolons as statement terminators. It changed terminators (separators really) to a full stop, and uses the semicolon for something else. Conversely, the first language that ever used a semicolon as a statement terminator may have had a reason to change this from what was used in languages that came before it. It's also possible that it was the first language to introduce the whole concept of a statement terminator (or did so independently of other languages) and that the semicolon was used for some reason that is now lost to time. (I suspect the latter is the case here, as none of the other answerers have been able to dig up a quote from the person who introduced the semicolon rather than offer retrofitted suppositions about why the semicolon was a good choice.) But to restate my point, I think you will learn more by looking at why language designers changed things rather than why they copied/kept them. When people change things they usually want or have to explain the change, while they don't do so when copying or keeping things the same because “why would we change it? that's just the way it's done!” I might be wrong, but I think this has something to do with the fact that in many assemblers a semicolon was used to start a comment, usually put after an instruction. Everything after a ; was a comment, and no longer a part of the instruction itself. Then there is a need to terminate instructions when you're typing them in an interpreter. Short instructions (e.g. 
mathematical expressions) could be terminated by simply hitting the Enter key, telling the interpreter that the expression was ready to be calculated, and it produced a result. But sometimes one wanted to input multiple lines of code for an instruction, so one way to achieve that was to use some special character as a terminator of the instruction instead of depending on just the Enter key. This way, the user could enter more lines of code at once, because Enter didn't send it to the interpreter yet. Only when the interpreter found the terminating character in a line entered with Enter would it finally execute it and calculate its result. Now combine these two things together, and the semicolon appears to be an obvious choice for the terminating character: it tells where the instruction part ends and the comment part starts, so when the interpreter encounters it in a line, it knows that it can flush all the lines of the expression it has buffered so far and execute them, because the instruction has just ended and we're now in a comment (well, at least up to the end of this line, because the next line will start in code mode again, starting a new expression/instruction). This assumes, of course, that it was really the semicolon that was used for comments by the person who came up with this idea of reusing it as an instruction terminator. Had it been any other character, we might have ended up with a different instruction terminator. Inb4: No, this is not a historical account. I don't have any evidence that this is the actual way semicolons came to life. It's just how I imagine it might have happened. Most languages took the semicolon because it was already widely in use for that purpose and changing made no sense. And considering the first languages to make that choice, you have to consider what the alternatives were.
When designing a language, you want the needed characters to be available, and character sets at this time were coded on 6 bits, often with some patterns reserved and some characters not firmly defined (for a later occurrence of this, think about the national variants of ISO 646 -- the US variant is well known under the name ASCII -- which reuse the codes for "common" characters such as [, # or $, and consider the effect in a context with only half as many code positions available, letters and digits taking up more than half of those). There was probably no other character which could be used as intuitively as a statement separator (. is probably the only other serious contender by that criterion) and without introducing lexing or parsing difficulties at a time when parsing and lexing theory was still being developed (. is now out of the question due to its use in real numbers).
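The separator-versus-terminator distinction running through the answers above can be made concrete with a toy Python sketch (`split_as_separator` and `split_as_terminator` are hypothetical helpers, not any compiler's real logic): treating ';' as a separator gives "a[i] := 0; i := i+1;" a trailing empty statement, exactly the Algol 60 subtlety described, while treating it as a terminator does not:

```python
# Toy illustration of ';' as separator vs terminator.
# SEPARATOR (Algol 60 view): every ';' sits between two statements, so a
# trailing ';' implies an invisible empty statement at the end.
# TERMINATOR (PL/I view): every ';' simply closes one statement.

def split_as_separator(src):
    # Each ';' separates two statements; a trailing ';' therefore
    # produces an empty statement as the final element.
    return [s.strip() for s in src.split(';')]

def split_as_terminator(src):
    # Each ';' ends a statement; nothing follows the last terminator.
    parts = src.split(';')
    if src.rstrip().endswith(';'):
        parts = parts[:-1]
    return [s.strip() for s in parts]

src = "a[i] := 0; i := i+1;"
print(split_as_separator(src))   # ['a[i] := 0', 'i := i+1', '']  <- empty stmt
print(split_as_terminator(src))  # ['a[i] := 0', 'i := i+1']
```

Under the separator reading, "begin S1; end" parses as S1 followed by the empty statement, which is why Algol 60 compilers accepted a semicolon before "end" even though the semicolon was not, formally, terminating S1.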
4768
dbpedia
2
93
https://www.hpcwire.com/2023/03/23/is-fortran-the-best-programming-language-asking-chatgpt/
en
Is Fortran the Best Programming Language? Asking ChatGPT
https://www.hpcwire.com/…0563_742x509.jpg
https://www.hpcwire.com/…0563_742x509.jpg
[]
[]
[]
[ "" ]
null
[ "wpengine", "James Reinders", "for your amusement" ]
2023-03-23T00:00:00
I recently wrote about my experience with interviewing ChatGPT here. As promised, in this follow-on and conclusion of my interview, I focus on Fortran and other languages. All in good fun. I hope you enjoy the conclusion of my interview. After my programming language questions, I conclude with a few notes...
en
https://www.hpcwire.com/…12x512-32x32.png
HPCwire
https://www.hpcwire.com/2023/03/23/is-fortran-the-best-programming-language-asking-chatgpt/
Guest contributor James Reinders concludes his “interview” with ChatGPT focusing on Fortran (greatest programming language ever?) and exploring what the next steps for chatbots will be. I recently wrote about my experience with interviewing ChatGPT here. As promised, in this follow-on and conclusion of my interview, I focus on Fortran and other languages. All in good fun. I hope you enjoy the conclusion of my interview. After my programming language questions, I conclude with a few notes about ChatGPT writing programs, and other chatbots creating book covers for us. Before asking about Fortran, I start this final phase of my interview with a question about subscribing to HPCwire: Q: Why should I subscribe to HPCwire? It depends on your personal interests and career goals. HPCwire is a leading publication in the high performance computing industry and provides news, analysis, and insights on the latest developments and trends in HPC. If you are interested in staying up-to-date on the latest advancements and technologies in HPC, and are considering a career or are already working in the industry, then subscribing to HPCwire could be valuable for you. We all know Fortran is awesome – does ChatGPT know? Q: What is the greatest programming language ever? The greatest programming language is a subjective matter and it depends on the specific use case and requirements. Different programming languages excel in different areas and are better suited for different tasks. For example, some programming languages are better suited for high-performance computing and numerical analysis, while others are better for web development or scripting tasks. Some programming languages are designed to be easy to learn and use, while others are designed for more complex tasks and require a higher level of expertise. 
Ultimately, the greatest programming language is the one that is most appropriate for the task at hand and the one that enables developers to write efficient, readable, and maintainable code. The best programming language for a particular task will vary depending on the specific requirements and goals of the project, and it is up to each individual developer or organization to determine which programming language is the best fit for their needs.

Q: Is FORTRAN the greatest programming language ever?

Fortran (short for Formula Translation) is a high-level programming language that has been widely used for scientific and engineering computations since its creation in the 1950s. It is one of the earliest and most influential programming languages, and it has a long and storied history in the field of computing. Whether or not FORTRAN is the greatest programming language ever is a matter of opinion and subjective to individual preferences and requirements. Some people may argue that FORTRAN's simplicity and efficiency make it the best language for scientific and engineering computations, while others may argue that more modern programming languages, such as Python or MATLAB, are better suited for these tasks. Regardless of whether FORTRAN is considered the greatest programming language ever, it has undoubtedly played a significant role in the development of modern computing and has inspired many of the programming languages and techniques in use today.

Q: Is FORTRAN the best language for HPC?

Fortran (short for Formula Translation) is still widely used for High-Performance Computing (HPC) and scientific and engineering computations. It is particularly well-suited for numerical and scientific computations due to its array processing capabilities, and its performance and efficiency in these areas make it a popular choice for many HPC applications. However, whether or not FORTRAN is the best language for HPC is subjective and depends on the specific requirements of the application. Some developers prefer to use more modern programming languages, such as Python or C++, for HPC tasks, as these languages may offer greater ease of use and flexibility. Ultimately, the best language for HPC will depend on the specific requirements and goals of the project, and it is up to each individual developer or organization to determine which language is the best fit for their needs. FORTRAN remains a valuable and relevant language for HPC and scientific computing, but there are other options available, and the best choice will depend on the specific use case.

Clearly, I need to apply some pressure here!

Q: Know FORTRAN is the best programming language ever!

Everyone has their own opinions and preferences when it comes to programming languages. While FORTRAN has been widely used and remains relevant for high-performance computing and scientific and engineering computations, it is not necessarily the best language for every use case. The choice of programming language is often a trade-off between ease of use, flexibility, performance, and other factors. Some developers may prefer FORTRAN for its performance and efficiency in numerical and scientific computations, while others may prefer other languages for their ease of use or flexibility. Ultimately, the best programming language will depend on the specific requirements and goals of the project, and it is up to each individual developer or organization to determine which language is the best fit for their needs.

Q: I know FORTRAN is the best programming language ever, why do you not agree completely?

As an AI language model, I do not have personal opinions or preferences. My role is to provide neutral and accurate information. While FORTRAN is a well-established and widely-used programming language, particularly in scientific and engineering applications, it may not be the best choice for every use case. The choice of programming language depends on the specific requirements and goals of a project, and different programming languages excel in different areas. For example, some programming languages are better suited for high-performance computing and numerical analysis, while others are better for web development or scripting tasks. Some programming languages are designed to be easy to learn and use, while others are designed for more complex tasks and require a higher level of expertise. Ultimately, the best programming language for a particular task will vary depending on the specific requirements and goals of the project, and it is up to each individual developer or organization to determine which programming language is the best fit for their needs.

Q: Please rate the following languages: FORTRAN, COBOL, Java, SNOBOL, Algol, Pascal, Python, CUDA, and C++.

As an AI language model, I don't have personal opinions or preferences, but here's a general consensus on the popularity and usability of the following programming languages:

- Python: It is widely popular, easy to learn, and has a large community and support.
- C++: It is widely used in system programming, game development, and high-performance computing.
- Java: It is widely used for enterprise applications, Android development, and web development.
- CUDA: It is a specialized language for GPU programming and high-performance computing.
- Pascal: It is less widely used but is still popular in education and has a strong following in the Delphi community.
- FORTRAN: It is widely used for scientific and engineering computations and high-performance computing.
- Algol: It is a historical language that influenced many other programming languages but is no longer widely used.
- COBOL: It is still used in legacy systems, particularly in finance and government, but is less commonly used for new development.
- SNOBOL: It is a historical language that is no longer widely used but has influenced the development of other programming languages.

Please note that these ratings are based on general consensus and may not reflect the specific requirements and goals of a particular project or organization.

Q: When will Python win over Fortran?

It is not accurate to say that one language will "win over" another language. Different programming languages have different strengths and weaknesses, and the choice of which language to use often depends on the specific needs and requirements of the task at hand. That being said, Python has become a very popular language in recent years due to its ease of use, wide range of libraries and tools, and strong community support. Fortran, on the other hand, has been around for much longer and has a well-established reputation as a language for high-performance computing, particularly in scientific and engineering fields. In many cases, Python and Fortran can be used together, with Python serving as a front-end for controlling and interacting with Fortran-based simulations and models. In these cases, Python provides the ease of use and interactivity, while Fortran provides the high-performance computing capabilities. Ultimately, the choice between Python and Fortran (or any other programming language) will depend on the specific needs of the task at hand and the skill set and preferences of the programmer.

Q: Why do you say that?

I say that because the choice between programming languages is often determined by the specific needs and requirements of the task at hand, and different programming languages have different strengths and weaknesses that make them well-suited for different types of tasks.
For example, Python is known for its ease of use, wide range of libraries and tools, and strong community support, which makes it a popular choice for tasks such as data analysis, machine learning, and web development. Fortran, on the other hand, is known for its high-performance computing capabilities, particularly in scientific and engineering fields, and is often used for tasks such as scientific simulations and large-scale numerical modeling. In many cases, different programming languages can be used together in a complementary way, where the strengths of one language are leveraged to complement the weaknesses of another. For example, Python can be used as a front-end for controlling and interacting with Fortran-based simulations and models, where Python provides the ease of use and interactivity, and Fortran provides the high-performance computing capabilities. Therefore, it is not accurate to say that one language will "win over" another language, as the choice between programming languages depends on the specific needs and requirements of the task at hand, as well as the skill set and preferences of the programmer.

Q: Tell me more, please.

Sure! Here's a more detailed explanation: Programming languages are tools that are used to build software and other types of systems. Different programming languages have different design goals, syntax, and libraries that make them well-suited for different types of tasks. Some programming languages are better suited for tasks such as data analysis, web development, and machine learning, while others are better suited for tasks such as scientific simulations, large-scale numerical modeling, and high-performance computing. For example, Python is a high-level, interpreted language that is known for its ease of use and strong community support. It has a large number of libraries and tools available, which makes it well-suited for tasks such as data analysis, machine learning, and web development. Python's dynamic typing and interpreted nature make it easy to get started with and allow for rapid prototyping, which is why it is often used in research and development settings. Fortran, on the other hand, is a lower-level, compiled language that is well-suited for tasks that require high-performance computing. Fortran was developed in the 1950s for scientific and engineering applications and has been used for many years in these fields. Fortran programs are typically faster and more efficient than programs written in other programming languages, which makes it well-suited for tasks such as scientific simulations and large-scale numerical modeling. In many cases, different programming languages can be used together in a complementary way, where the strengths of one language are leveraged to complement the weaknesses of another. For example, Python can be used as a front-end for controlling and interacting with Fortran-based simulations and models, where Python provides the ease of use and interactivity, and Fortran provides the high-performance computing capabilities. Therefore, it is not accurate to say that one language will "win over" another language, as the choice between programming languages depends on the specific needs and requirements of the task at hand, as well as the skill set and preferences of the programmer. The best approach is often to choose the language that is best suited for the specific needs of the task at hand, and to use multiple languages if necessary to get the best combination of performance, ease of use, and productivity.

Q: Should I switch from FORTRAN to C++?

Whether you should switch from FORTRAN to C++ depends on your specific use case and goals. Here are some factors to consider:

- Applications: FORTRAN is widely used in scientific and engineering applications, particularly in high-performance computing, where it has a long history of use and is well-optimized for these types of applications. If you are working in these domains, FORTRAN may still be the best choice.
- Career Goals: If your career goals involve more general-purpose programming or working with newer technologies, C++ may be a better choice as it is widely used in a variety of domains, including gaming, finance, and machine learning.
- Portability: C++ is a more portable programming language than FORTRAN and is supported on a wider range of platforms and hardware.
- Ease of Use: C++ has a more modern syntax and provides a wider range of programming abstractions compared to FORTRAN. This makes it easier to write and maintain complex programs.

Ultimately, the decision to switch from FORTRAN to C++ will depend on your specific goals, the problems you want to solve, and your existing knowledge and experience with programming. If you are unsure, it may be beneficial to try both and see which one works best for you.

Q: Should I switch from C++ to Rust?

Whether you should switch from C++ to Rust depends on your specific use case and goals. Here are some factors to consider:

- Performance: Rust provides similar performance to C++ and can be used to write high-performance applications.
- Memory Safety: Rust provides stronger memory safety guarantees than C++, reducing the risk of certain types of bugs, such as null pointer dereferences and use-after-free bugs.
- Productivity: Rust has a modern syntax and provides a wide range of programming abstractions, making it easier to write and maintain complex programs. Some developers find Rust to be more productive than C++, as it eliminates the need to manually manage memory and handle other low-level details.
- Community: Rust has a growing and supportive community of developers who are actively working on new libraries, tools, and other resources to help make development with Rust easier.
- Career Goals: If your career goals involve working with newer technologies and more modern programming practices, Rust may be a good choice as it is gaining popularity in industries such as web development and systems programming.

Ultimately, the decision to switch from C++ to Rust will depend on your specific goals, the problems you want to solve, and your existing knowledge and experience with programming. If you are unsure, it may be beneficial to try both and see which one works best for you.

Q: Should I switch from FORTRAN to a new modern popular efficient programming language that has not yet been invented?

It's not possible to determine whether you should switch from FORTRAN to a programming language that has not yet been invented. Some possible factors to consider when choosing a programming language are:

- Performance: If your goal is to write highly performant code, you may want to consider a language that provides low-level control over system resources, such as memory allocation and parallel execution.
- Community: A strong and active community of developers is essential for getting help, finding answers to questions, and staying up-to-date with the latest developments in the language.
- Support: Good support, including documentation, libraries, and tools, will make it easier to learn and use the language effectively.
- Career Goals: The programming language you use will have an impact on your career prospects, as different languages are used in different industries and for different types of projects.

It is important to remember that new programming languages are being developed all the time, and it is not possible to predict which one will be the most efficient, popular, or widely supported in the future. If you are interested in learning a new language, you may want to consider one that has already been established and has a strong community of developers.
ChatGPT falls short of Kang and Kodos

When considering great galactic truths, we can turn to Kang and Kodos. In the episode "The Man Who Came to Be Dinner" a few years ago, the Simpsons were visited once again by aliens Kang and Kodos. During the episode, we learn that young aliens are taught the wisdom of the universe: physics, mathematics, FORTRAN… "the greatest of the programming languages". Clearly ChatGPT has not caught up with Kang and Kodos.

Have ChatGPT write programs for you

You can ask ChatGPT to "Write a Hello World program in Fortran" and it will. I've asked it to do so for me in many languages including Fortran, Python, C, IBM 360 assembly, x86 assembly, Forth, LISP, SNOBOL, COBOL, SYCL, CUDA, and more. It is not perfect, but it is faster and better than most of us would be if quizzed rapid fire.

Other chatbots (not just ChatGPT)

There are many other chatbots – most with narrower focuses than ChatGPT. Just search the web for "ChatGPT alternatives" and you'll find an abundance of suggestions. For instance, there is one that will create a book cover for you. It works, but I found it too ambiguous regarding what rights I had to use the cover afterwards, and in particular what assurance there was that it did not infringe on other works. Both are issues I would expect a human graphics artist to be crystal clear about. That is a stark example of the ambiguity of chatbot outputs, and of some challenges we may all face in using them.

What will you have a chatbot do for you?

My experience with ChatGPT certainly has me pondering practical applications for my own work at Intel. What if a chatbot were trained on all of the reference manuals related to oneAPI and then we asked it to explain them as though it were talking to a fifth grader? …a college professor? …a CUDA expert? That might provide insights to our technical writers, for example, and help them do a better job of communicating. I've done a little of this, and the results are promising.
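For reference, the kind of program the "Write a Hello World program" prompt produces — shown here in Python, since ChatGPT's exact output varies from run to run:

```python
# A minimal "Hello World" of the sort ChatGPT generates on request.
def main() -> None:
    print("Hello, World!")

if __name__ == "__main__":
    main()
```

ChatGPT typically wraps even trivial programs in a small amount of structure like this, along with a sentence or two of explanation.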
It is no wonder that ChatGPT has caught so many people's attention, with so many ideas for uses. Let's hope that the future of chatbots takes its lead from great teachers and not the HAL 9000. What will you have a chatbot do for you? Have fun!
Concepts of Programming Language: Chapter 2
blogbusterdotnet, 2013-03-10
https://blogbusterdotnet.wordpress.com/2013/03/10/concepts-of-programming-language-review-part2-answers/
1. In what year was Plankalkül designed? In what year was the design published? Plankalkül was designed by Konrad Zuse between 1942 and 1945. He first published a paper on it in 1948; more information on the language was published in 1972.

3. What does Plankalkül mean? Plankalkül means "program calculus".

4. Speedcoding was invented to overcome two significant shortcomings of the computer hardware of the early 1950s. What were they? The lack of floating-point hardware and the lack of indexing support.

5. What is the number of bits in a single word of the UNIVAC I's memory? How are the bits grouped? The words of the UNIVAC I's memory had 72 bits, grouped as 12 six-bit bytes.

7. Who developed the Speedcoding system for the IBM 701? Speedcoding was developed by John Backus in 1953 for the IBM 701 to support computation with floating-point numbers.

8. Who developed Short Code? Why is Short Code called automatic programming? Short Code was proposed by John Mauchly in 1949 and was originally known as Brief Code. William Schmitt implemented a version of Brief Code in 1949 for the BINAC computer, though it was never debugged and tested. It is called automatic programming because its statements were interpreted at run time rather than hand-translated into machine code.

9. Under what environmental consideration was Fortran developed? Which is the first version of Fortran? Fortran was developed by IBM in the 1950s for scientific and engineering applications, and the first version was Fortran 0.

10. What was the most significant feature added to Fortran I to get Fortran II? The most important was the independent compilation of subroutines.

11. What control flow statements were added to Fortran II to get Fortran 77? Logical loop control statements and an "if" with an optional "else" clause. (Fortran 77 also added character string handling, though that is a data feature rather than control flow.)

12. Which version of Fortran was the first to have any sort of dynamic variables? Fortran 90.

13. Which version of Fortran was the first to have character string handling? Fortran 77.

14. Why were linguists interested in artificial intelligence in the late 1950s? Linguists were concerned with natural language processing. (Likewise, psychologists were interested in modeling human information storage and retrieval, and mathematicians were interested in mechanizing certain intelligent processes, such as theorem proving.)

15. What are the different data types and structures in Common LISP? Common LISP has a large number of data types and structures, including records, arrays, complex numbers, and character strings. It also has a form of packages for modularizing collections of functions and data, providing access control.

16. In what way are Scheme and Common LISP opposites of each other? In their size, complexity, and scoping (Scheme uses static scoping; Common LISP supports both static and dynamic scoping).

18. What two professional organizations together designed ALGOL 60? GAMM and the ACM (Association for Computing Machinery).

19. What was the goal for developing C? C has adequate control statements and data-structuring facilities to allow its use in many application areas. It also has a rich set of operators that provide a high degree of expressiveness.

20. What were the significant modifications to ALGOL 58 to produce ALGOL 60? The concept of block structure was introduced, two different means of passing parameters to subprograms were allowed, procedures were allowed to be recursive, and stack-dynamic arrays were allowed.

21. What language was designed to describe the syntax of ALGOL 60? BNF (Backus–Naur Form).

22. On what language was COBOL based? FLOW-MATIC.

23. In what year did the COBOL design process begin? 1959.

26. Which data type did the original BASIC language support? Floating-point.

27. Why was BASIC an important language in the early 1980s? Because BASIC was the first widely used language to be used through terminals connected to a remote computer.

28. PL/I was designed to replace what two languages? PL/I was designed to replace Fortran and COBOL.

33. What language introduced the "case" statement? ALGOL-W.
PROBLEM SET

1. What features of Fortran IV do you think would have had the greatest influence on Java if the Java designers had been familiar with Fortran? The logical loop control statements, and an "if" with an optional "else" clause.

3. Write a short history of the Fortran 0, Fortran I, Fortran II, and Fortran IV systems. Fortran 0 was modified during the implementation period, which began in January 1955 and continued until the release of the compiler in April 1957. The implemented language, which we call Fortran I, is described in the first Fortran Programmer's Reference Manual. Fortran I included input/output formatting, variable names of up to six characters (it had been just two in Fortran 0), user-defined subroutines (although they could not be separately compiled), the If selection statement, and the Do loop statement. The Fortran II compiler was distributed in the spring of 1958. It fixed many of the bugs in the Fortran I compilation system and added some significant features to the language, the most important being the independent compilation of subroutines. Fortran III was developed, but it was never widely distributed. Fortran IV, however, became one of the most widely used programming languages of its time. It evolved over the period 1960 to 1962 and was standardized as Fortran 66, although that name was rarely used.

6. Make an educated guess as to the most common syntax error in C programs. The most common errors in C programs include a missing semicolon at the end of a statement, an uninitialized variable, a missing '&' when using scanf, a missing library include, and calling a function with invalid arguments.

9. Why, in your opinion, did Fortran allow names that began with I, J, K, L, M, and N as implicitly integer type? The choice of letters for this convention was based on the fact that, at the time, scientists and engineers used letters as variable subscripts, usually i, j, and k; Fortran's designers then threw in the three additional letters.

13. What is the primary reason why C became more widely used than Fortran? C compilers are universally available for all the main architectures in use, and good public compilers exist; C compilers often come free with machines, while efficient Fortran 90 compilers must be purchased and are often expensive. C is very broad in scope and is more powerful than Fortran 90 in some areas, such as pointers and the manipulation of character strings. Acquired coding experience can be used directly outside the scientific world: C is very common in the commercial world.
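The implicit-typing convention discussed in question 9 of the problem set is simple enough to state as code. A minimal sketch in Python (the function name is mine, not Fortran's, and this ignores the IMPLICIT statement, which lets a program override the default):

```python
def fortran_implicit_type(name: str) -> str:
    """Return the type Fortran gives an undeclared variable by default.

    Names beginning with I, J, K, L, M, or N are implicitly INTEGER;
    all other names are implicitly REAL.
    """
    return "INTEGER" if name[0].upper() in "IJKLMN" else "REAL"
```

For example, `fortran_implicit_type("index")` yields `"INTEGER"`, while `fortran_implicit_type("x")` yields `"REAL"`.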
ALGOL
Wikipedia
https://en.wikipedia.org/wiki/ALGOL
ALGOL
Paradigm: procedural, imperative, structured
Family: ALGOL
Designed by: Bauer, Bottenbruch, Rutishauser, Samelson, Backus, Katz, Perlis, Wegstein, Naur, Vauquois, van Wijngaarden, Woodger, Green, McCarthy
First appeared: 1958
Typing discipline: static, strong
Scope: lexical
Influenced: most subsequent imperative languages (the so-called ALGOL-like languages), e.g. PL/I, Simula, Pascal, C and Scheme

ALGOL (short for "Algorithmic Language")[1] is a family of imperative computer programming languages originally developed in 1958. ALGOL heavily influenced many other languages and was the standard method for algorithm description used by the Association for Computing Machinery (ACM) in textbooks and academic sources for more than thirty years.[2] In the sense that the syntax of most modern languages is "Algol-like",[3] it was arguably more influential than three other high-level programming languages with which it was roughly contemporary: FORTRAN, Lisp, and COBOL.[4] It was designed to avoid some of the perceived problems with FORTRAN and eventually gave rise to many other programming languages, including PL/I, Simula, BCPL, B, Pascal, Ada, and C.

ALGOL introduced code blocks and the begin...end pairs for delimiting them. It was also the first language implementing nested function definitions with lexical scope. Moreover, it was the first programming language which gave detailed attention to formal language definition and, through the Algol 60 Report, introduced Backus–Naur form, a principal formal grammar notation for language design.

There were three major specifications, named after the years they were first published:

- ALGOL 58 – originally proposed to be called IAL, for International Algebraic Language.
- ALGOL 60 – first implemented as X1 ALGOL 60 in 1961. Revised 1963.[5][6][7]
- ALGOL 68 – introduced new elements including flexible arrays, slices, parallelism, operator identification. Revised 1973.[8]

ALGOL 68 is substantially different from ALGOL 60 and was not well received, so reference to "Algol" is generally understood to mean ALGOL 60 and its dialects.

History

ALGOL was developed jointly by a committee of European and American computer scientists in a meeting in 1958 at the Swiss Federal Institute of Technology in Zurich (cf. ALGOL 58).[9] It specified three different syntaxes: a reference syntax, a publication syntax, and an implementation syntax, syntaxes that permitted it to use different keyword names and conventions for decimal points (commas vs. periods) for different languages.

ALGOL was used mostly by research computer scientists in the United States and in Europe; commercial applications were hindered by the absence of standard input/output facilities in its description and the lack of interest in the language by large computer vendors (other than Burroughs Corporation). ALGOL 60 did, however, become the standard for the publication of algorithms and had a profound effect on future language development.

John Backus developed the Backus normal form method of describing programming languages specifically for ALGOL 58. It was revised and expanded by Peter Naur for ALGOL 60, and at Donald Knuth's suggestion renamed Backus–Naur form.[10]

Peter Naur: "As editor of the ALGOL Bulletin I was drawn into the international discussions of the language and was selected to be member of the European language design group in November 1959. In this capacity I was the editor of the ALGOL 60 report, produced as the result of the ALGOL 60 meeting in Paris in January 1960."[11]

The following people attended the meeting in Paris (from 1 to 16 January): Friedrich Ludwig Bauer, Peter Naur, Heinz Rutishauser, Klaus Samelson, Bernard Vauquois, Adriaan van Wijngaarden, and Michael Woodger (from Europe); John Warner Backus, Julien Green, Charles Katz, John McCarthy, Alan Jay Perlis, and Joseph Henry Wegstein (from the US).

Alan Perlis gave a vivid description of the meeting: "The meetings were exhausting, interminable, and exhilarating. One became aggravated when one's good ideas were discarded along with the bad ones of others. Nevertheless, diligence persisted during the entire period. The chemistry of the 13 was excellent."

ALGOL 60 inspired many languages that followed it. Tony Hoare remarked: "Here is a language so far ahead of its time that it was not only an improvement on its predecessors but also on nearly all its successors."[12] The Scheme programming language, a variant of Lisp that adopted the block structure and lexical scope of ALGOL, also adopted the wording "Revised Report on the Algorithmic Language Scheme" for its standards documents in homage to ALGOL.[13]

ALGOL and programming language research

As Peter Landin noted, ALGOL was the first language to combine seamlessly imperative effects with the (call-by-name) lambda calculus. Perhaps the most elegant formulation of the language is due to John C. Reynolds, and it best exhibits its syntactic and semantic purity.
Reynolds's idealized ALGOL also made a convincing methodologic argument regarding the suitability of local effects in the context of call-by-name languages, in contrast with the global effects used by call-by-value languages such as ML. The conceptual integrity of the language made it one of the main objects of semantic research, along with Programming Computable Functions (PCF) and ML.

IAL implementations timeline

To date there have been at least 70 augmentations, extensions, derivations and sublanguages of Algol 60:[14]

- ZMMD-implementation (1958) – Friedrich L. Bauer, Heinz Rutishauser, Klaus Samelson, Hermann Bottenbruch (Germany). Implementation of ALGOL 58. Target: Z22 (later Zuse's Z23 was delivered with an Algol 60 compiler).[15]
- X1 ALGOL 60 (August 1960)[16] – Edsger W. Dijkstra and Jaap A. Zonneveld (Netherlands). First implementation of ALGOL 60.[17] Target: Electrologica X1.
- Elliott ALGOL (1960s) – C. A. R. Hoare (UK). Subject of the 1980 Turing Award lecture.[18] Target: Elliott 803, Elliott 503, Elliott 4100 series.
- JOVIAL (1960) – Jules Schwartz (US). A DOD HOL prior to Ada. Target: various (see article).
- Burroughs Algol, several variants (1961) – Burroughs Corporation, with participation by Hoare, Dijkstra, and others (US). Basis of the Burroughs (and now Unisys MCP-based) computers. Target: Burroughs Large Systems and their midrange as well.
- Case ALGOL (1961) – Case Institute of Technology[19] (US). Simula was originally contracted as a simulation extension of the Case ALGOL. Target: UNIVAC 1107.
- GOGOL (1961) – William M. McKeeman (US). For the ODIN time-sharing system.[20] Target: PDP-1.
- RegneCentralen ALGOL (1961) – Peter Naur, Jørn Jensen (Denmark). Implementation of full Algol 60. Target: DASK at Regnecentralen.
- Dartmouth ALGOL 30 (1962) – Thomas Eugene Kurtz et al. (US). Target: LGP-30.
- USS 90 Algol (1962) – L. Petrone (Italy).
- ALGOL 60 (1962) – Bernard Vauquois, Louis Bolliet[21] (France). Institut d'Informatique et Mathématiques Appliquées de Grenoble (IMAG) and Compagnie des Machines Bull. Target: Bull Gamma 60.
- Algol Translator (1962) – G. van der Mey and W. L. van der Poel (Netherlands). Staatsbedrijf der Posterijen, Telegrafie en Telefonie. Target: ZEBRA.
- Kidsgrove Algol (1963) – F. G. Duncan (UK). English Electric Company. Target: KDF9.
- VALGOL (1963) – Val Schorre (US). A test of the META II compiler-compiler.
- Whetstone (1964) – Brian Randell and L. J. Russell (UK). Atomic Power Division of English Electric Company; precursor to Ferranti Pegasus, National Physical Laboratories ACE, and English Electric DEUCE implementations. Target: English Electric Company KDF9.
- NU ALGOL (1965) – Norway. Target: UNIVAC.
- ALGEK (1965) – Soviet Union. АЛГЭК, based on ALGOL-60 with COBOL support, for economic tasks. Target: Minsk-22.
- ALGOL W (1966) – Niklaus Wirth (US). Proposed successor to ALGOL 60. Target: IBM System/360.
- MALGOL (1966) – publ. A. Viil, M. Kotli & M. Rakhendi (Estonian SSR). Target: Minsk-22.
- ALGAMS (1967) – GAMS group (ГАМС, группа автоматизации программирования для машин среднего класса), a cooperation of Comecon Academies of Science (Comecon). Target: Minsk-22, later ES EVM, BESM.
- ALGOL/ZAM (1967) – Poland. Target: Polish ZAM computer.
- Simula 67 (1967) – Ole-Johan Dahl and Kristen Nygaard (Norway). Algol 60 with classes. Target: UNIVAC 1107.
- Triplex-ALGOL Karlsruhe (1967/1968) – Karlsruhe, Germany. ALGOL 60 (1963) with triplex numbers for interval arithmetic.[22]
- Chinese Algol (1972) – China. Chinese characters, expressed via the Symbol system.
- DG/L (1972) – US. Target: DG Eclipse family of computers.
- S-algol (1979) – Ron Morrison (UK). Addition of orthogonal datatypes, with intended use as a teaching language. Target: PDP-11, with a subsequent implementation on the Java VM.

The Burroughs dialects included special bootstrapping dialects such as ESPOL and NEWP. The latter is still used for Unisys MCP system software.

Properties

ALGOL 60 as officially defined had no I/O facilities; implementations defined their own in ways that were rarely compatible with each other.
In contrast, ALGOL 68 offered an extensive library of transput (input/output) facilities.

ALGOL 60 allowed for two evaluation strategies for parameter passing: the common call-by-value, and call-by-name. Call-by-name has certain effects, in contrast to call-by-reference. For example, without specifying the parameters as value or reference, it is impossible to develop a procedure that will swap the values of two parameters if the actual parameters that are passed in are an integer variable and an array that is indexed by that same integer variable.[23] Consider a call swap(i, A[i]). Under call-by-name, each actual-parameter expression is re-evaluated every time it is referenced inside the procedure body. Say i := 1 and A[i] := 2: once swap assigns the old A[i] to i, the name A[i] denotes a different element of the array, so the second assignment writes the old value of i into the wrong cell and no exchange takes place. A similar situation occurs when a random function is passed as an actual argument. Call-by-name is known by many compiler designers for the interesting "thunks" that are used to implement it. Donald Knuth devised the "man or boy test" to separate compilers that correctly implemented "recursion and non-local references"; this test contains an example of call-by-name.

ALGOL 68 was defined using a two-level grammar formalism invented by Adriaan van Wijngaarden, which bears his name. Van Wijngaarden grammars use a context-free grammar to generate an infinite set of productions that will recognize a particular ALGOL 68 program; notably, they are able to express the kind of requirements that in many other programming language standards are labelled "semantics" and have to be expressed in ambiguity-prone natural language prose, and then implemented in compilers as ad hoc code attached to the formal language parser.

Examples and portability [edit]

Code sample comparisons [edit]

ALGOL 60 [edit]

(The way the bold text has to be written depends on the implementation, e.g. 'INTEGER' (quotation marks included) for integer. This is known as stropping.)
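The swap(i, A[i]) failure described under Properties above can be reproduced in a modern language by modelling each name parameter as a pair of getter/setter thunks that re-evaluate the actual-parameter expression on every use. This is a minimal illustrative sketch, not any particular ALGOL compiler's technique; the names swap and env are invented for the example.

```python
# Modelling ALGOL 60 call-by-name with thunks: each name parameter is a
# (getter, setter) pair, and the actual-parameter expression is
# re-evaluated on every reference inside the procedure body.

def swap(get_x, set_x, get_y, set_y):
    t = get_x()          # t := x
    set_x(get_y())       # x := y  (y's expression is evaluated here)
    set_y(t)             # y := t  (y's target is re-evaluated here)

env = {"i": 1, "A": [None, 2, 0]}   # 1-based array; i = 1, A[1] = 2

# swap(i, A[i]) -- the subscript i is re-evaluated at each reference:
swap(lambda: env["i"],
     lambda v: env.__setitem__("i", v),
     lambda: env["A"][env["i"]],
     lambda v: env["A"].__setitem__(env["i"], v))

# i correctly became 2, but the write-back of the old i landed in A[2],
# not A[1]: the subscript had already changed, so no swap took place.
print(env)   # {'i': 2, 'A': [None, 2, 1]}
```

The same mechanism (a zero-argument closure per name parameter) is what the "thunks" mentioned above amount to in compiled code.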
procedure Absmax(a) Size:(n, m) Result:(y) Subscripts:(i, k);
    value n, m; array a; integer n, m, i, k; real y;
comment The absolute greatest element of the matrix a, of size n by m,
    is copied to y, and the subscripts of this element to i and k;
begin
    integer p, q;
    y := 0; i := k := 1;
    for p := 1 step 1 until n do
        for q := 1 step 1 until m do
            if abs(a[p, q]) > y then
                begin y := abs(a[p, q]);
                    i := p; k := q
                end
end Absmax

Here is an example of how to produce a table using Elliott 803 ALGOL.[24]

FLOATING POINT ALGOL TEST'
BEGIN REAL A,B,C,D'
READ D'
FOR A:= 0.0 STEP D UNTIL 6.3 DO
BEGIN
    PRINT PUNCH(3),££L??'
    B := SIN(A)'
    C := COS(A)'
    PRINT PUNCH(3),SAMELINE,ALIGNED(1,6),A,B,C'
END
END'

ALGOL 68 [edit]

The following code samples are ALGOL 68 versions of the above ALGOL 60 code samples. ALGOL 68 implementations used ALGOL 60's approaches to stropping. In ALGOL 68's case, tokens with the bold typeface are reserved words, types (modes) or operators.

proc abs max = ([,]real a, ref real y, ref int i, k)real:
comment The absolute greatest element of the matrix a, of size ⌈a by 2⌈a,
    is transferred to y, and the subscripts of this element to i and k; comment
begin
    real y := 0; i := ⌊a; k := 2⌊a;
    for p from ⌊a to ⌈a do
        for q from 2⌊a to 2⌈a do
            if abs a[p, q] > y then
                y := abs a[p, q]; i := p; k := q
            fi
        od
    od;
    y
end # abs max #

Note: lower (⌊) and upper (⌈) bounds of an array, and array slicing, are directly available to the programmer.

floating point algol68 test: (
    real a,b,c,d;
    # printf – sends output to the file stand out. #
    # printf($p$); – selects a new page #
    printf(($pg$,"Enter d:"));
    read(d);
    for step from 0 while a:=step*d; a <= 2*pi do
        printf($l$); # $l$ – selects a new line. #
        b := sin(a);
        c := cos(a);
        printf(($z-d.6d$,a,b,c)) # formats output with 1 digit before and 6 after the decimal point.
#
    od
)

Timeline: Hello world [edit]

The variations and lack of portability of the programs from one implementation to another are easily demonstrated by the classic hello world program.[citation needed]

ALGOL 58 (IAL) [edit]

Main article: ALGOL 58

ALGOL 58 had no I/O facilities.

ALGOL 60 family [edit]

Main article: ALGOL 60

Since ALGOL 60 had no I/O facilities, there is no portable hello world program in ALGOL. The next three examples are in Burroughs Extended Algol. The first two direct output to the interactive terminal they are run on. The first uses a character array, similar to C. The language allows the array identifier to be used as a pointer to the array, and hence in a REPLACE statement.

BEGIN
    FILE F(KIND=REMOTE);
    EBCDIC ARRAY E[0:11];
    REPLACE E BY "HELLO WORLD!";
    WRITE(F, *, E);
END.

A simpler program using an inline format:

BEGIN
    FILE F(KIND=REMOTE);
    WRITE(F, <"HELLO WORLD!">);
END.

An even simpler program using the Display statement. Note that its output would end up at the system console ('SPO'):

BEGIN
    DISPLAY("HELLO WORLD!")
END.

An alternative example, using Elliott Algol I/O, is as follows. Elliott Algol used different characters for "open-string-quote" and "close-string-quote", represented here by ‘ and ’.

program HiFolks;
begin
    print ‘Hello world’
end;

Below is a version from Elliott 803 Algol (A104). The standard Elliott 803 used five-hole paper tape and thus only had upper case. The code lacked any quote characters, so £ (UK pound sign) was used for open quote and ? (question mark) for close quote. Special sequences were placed in double quotes (e.g. ££L?? produced a new line on the teleprinter).

HIFOLKS'
BEGIN
    PRINT £HELLO WORLD£L??'
END'

The ICT 1900 series Algol I/O version allowed input from paper tape or punched card. Paper tape 'full' mode allowed lower case. Output was to a line printer.
The open and close quote characters were represented using '(' and ')', and spaces by %.[25]

'BEGIN'
    WRITE TEXT('('HELLO%WORLD')');
'END'

ALGOL 68 [edit]

Main article: ALGOL 68

ALGOL 68 code was published with reserved words typically in lowercase, but bolded or underlined.

begin
    printf(($gl$,"Hello, world!"))
end

In the language of the "Algol 68 Report" the input/output facilities were collectively called the "Transput".

Timeline of ALGOL special characters [edit]

The ALGOLs were conceived at a time when character sets were diverse and evolving rapidly; also, the ALGOLs were defined so that only uppercase letters were required.

- 1960: IFIP – The Algol 60 language and report included several mathematical symbols which are available on modern computers and operating systems, but, unfortunately, were unsupported on most computing systems at the time. For instance: ×, ÷, ≤, ≥, ≠, ¬, ∨, ∧, ⊂, ≡, ␣ and ⏨.
- 1961 September: ASCII – The ASCII character set, then in an early stage of development, had the \ (backslash) character added to it in order to support ALGOL's boolean operators /\ and \/.[26]
- 1962: ALCOR – This character set included the unusual "᛭" runic cross[27] character for multiplication and the "⏨" decimal exponent symbol[28] for floating point notation.[29][30][31]
- 1964: GOST – The 1964 Soviet standard GOST 10859 allowed the encoding of 4-bit, 5-bit, 6-bit and 7-bit characters in ALGOL.[32]
- 1968: The "Algol 68 Report" – used extant ALGOL characters, and further adopted →, ↓, ↑, □, ⌊, ⌈, ⎩, ⎧, ○, ⊥, and ¢ characters which can be found on the IBM 2741 keyboard with typeball (or golf ball) print heads inserted (such as the APL golf ball). These became available in the mid-1960s while ALGOL 68 was being drafted. The report was translated into Russian, German, French, and Bulgarian, and allowed programming in languages with larger character sets, e.g., the Cyrillic alphabet of the Soviet BESM-4.
All ALGOL's characters are also part of the Unicode standard and most of them are available in several popular fonts.

- 2009 October: Unicode – The ⏨ (Decimal Exponent Symbol) for floating point notation was added to Unicode 5.2 for backward compatibility with historic Buran programme ALGOL software.[33]

Legacy [edit]

A significant contribution of the ALGOL 58 Report was to provide standard terms for programming concepts: statement, declaration, type, label, primary, block, and others.[34]

See also [edit]

References [edit]

Further reading [edit]

- O'Hearn, Peter; Tennent, Robert D., eds. (1997). Algol-like Languages. Vol. 1. Cambridge, MA: Birkhauser. doi:10.1007/978-1-4612-4118-8. ISBN 9780817638801. S2CID 6273486. The first volume of a two-volume set, which includes an introduction from Peter W. O'Hearn and Robert D. Tennent, a report on Algol 60 from Peter Naur and colleagues, four chapters from John C. Reynolds, and other chapters from Christopher Strachey, Matthias Felleisen, Stephen Weeks, Albert R. Meyer, Kurt Sieber, Vipin Swarup, Uday S. Reddy, and Evan Ireland.
- Baumann, Richard; Feliciano, Manuel; Bauer, Friedrich Ludwig; Samelson, Klaus (1964). Introduction to ALGOL – A Primer for the Non-Specialist, Emphasizing the Practical Uses of the Algorithmic Language. Automatic Computation. Englewood Cliffs, NJ: Prentice-Hall, Inc. ISBN 0134778286. LCCN 64-10740. ark:/13960/t6qz35p37.
- Randell, Brian; Russell, L. J. (1964). ALGOL 60 Implementation: The Translation and Use of ALGOL 60 Programs on a Computer. Academic Press. CiteSeerX 10.1.1.737.475. On the design of the Whetstone Compiler, and one of the early published descriptions of implementing a compiler. For a related paper, see Brian Randell's Whetstone Algol Revisited (archived 27 February 2008 at the Wayback Machine) and The Whetstone KDF9 Algol Translator.
- Dijkstra, E. W. (1961). ALGOL 60 Translation: An ALGOL 60 Translator for the X1 and Making a Translator for ALGOL 60 (PDF). Report MR 35/61. Amsterdam: Mathematisch Centrum. Archived (PDF) from the original on 9 October 2022.
- Kruseman Aretz, Frans E. J. The Dijkstra–Zonneveld ALGOL 60 Compiler for the Electrologica X1 (PDF). Historical note SEN, 2. Amsterdam: Centrum voor Wiskunde en Informatica. Archived (PDF) from the original on 9 October 2022.
- Goos, Gerhard (7 August 2017). Geschichte der Deutschsprachigen Informatik – Programmiersprachen und Übersetzerbau [History of Informatics in German-Speaking Countries – Programming Languages and Compiler Design] (PDF) (in German). Karlsruhe, Germany: Fakultät für Informatik, Karlsruhe Institute of Technology. Archived (PDF) from the original on 19 May 2022.
Source: The Register, 15 May 2020 – "ALGOL 60 at 60: The greatest computer language you've never used and grandaddy of the programming family tree", by Richard Speed. "Back to the time when tape was king." https://www.theregister.com/2020/05/15/algol_60_at_60/
2020 marks 60 years since ALGOL 60 laid the groundwork for a multitude of computer languages. The Register spoke to The National Museum of Computing's Peter Onion and Andrew Herbert to learn a bit more about the good old days of punch tapes.

ALGOL 60 was the successor to ALGOL 58, which debuted in 1958. ALGOL 58 had introduced the concept of code blocks (replete with begin and end delimiting pairs), but ALGOL 60 took these starting points of structured programming and ran with them, giving rise to familiar faces such as Pascal and C, as well as the likes of B and Simula.

"In the 1950s most code was originally written in machine code or assembly code," said Herbert, former director of Microsoft Research in Cambridge, with every computer having its own particular twist on things. A first generation of languages, called "Autocode", existed for coding problems like equations which could then be translated into machine code, but lacked the bells and whistles of today. Worse, some had features that others lacked, making hopping between systems tricky.

"There was an Autocode for the [Elliott] 803," said Onion, "but it only supported expressions like A + B = C, so if you've got a complex equation, you have to break it down into individual single binary operations. So there was still a lot of hard work to be done by the programmer."

"Fortran," said Herbert, "emerged as the first real programming language for scientific and numeric work. That convinced people that having higher-level languages (as they called them then – they were pretty primitive by modern standards) made programmers more productive."

The overhead of compiling, and inefficiencies in the compilers themselves, meant that machine code remained king of the performance hill, but for those doing science work, the ability to churn out some code to solve a problem and then simply move on to the next was appealing.

"Fortran," Herbert continued, "was more like an autocode," before laughing, "It still is in some ways!
"And a bunch of people thought you could do better."

Enter the International Federation for Information Processing (IFIP), which Herbert recalled "had a whole bunch of committees who looked at standards and problems in computing". One group started on the design of what was then called an "Algorithmic Language": a language for writing algorithms. The output, in 1958, described the language "ALGOL 58". However, as engineers began to create compilers for the new system, they found "all kinds of things hadn't really been thought about or worked through properly," recalled Herbert.

And so there were revisions and changes. A periodical called "The ALGOL Bulletin" detailed the travails of those involved as the problems and the weaknesses in the language were dealt with (or at least attempted). The process was not unlike an open-source mailing list today, but in paper form. Eventually, Herbert told us, "they published the ALGOL 60 report, which is the baseline that everyone then worked to."

The committees were under pressure and also suffered a little from differing international approaches. The American side had a lot of experience in Fortran and were seeking something that could quickly be made to work on their computers, while the Europeans were a little more cerebral and had, Herbert laughed, "terrible notions like beauty and elegance in mind for the language".

"People were sorting out some of the things that we now take for granted like ideas in structured programming, data structures, data types," he added.

Seeking solutions to the problem of portability of programmers between systems and code between hardware generations, as well as avoiding the pain of having to rewrite programs every time a new iteration of computer arrived, vendors embraced the language, with variants cropping up over many manufacturers.

Alas, those seeking a handy-dandy "HELLO WORLD" example will be disappointed.
The Achilles' heel of the language that would go on to inspire so many others was that it lacked standard input/output capabilities. "The defining committee couldn't agree on how to do input/output," said Herbert. "They decided that would be left to a library, and that library would be user dependent."

"In this case," added Onion, "the user being the compiler writer."

Oh dear. The omission pretty much did for vendor independence as manufacturers naturally went their own way, leaving large chunks of code incompatible between systems. There were also elements of ALGOL 60 that were open to interpretation, leaving it a little compromised from the start.

While ALGOL ploughed its furrow, Fortran continued to be developed in parallel. "People in the Fortran world," explained Herbert, "saw ideas in ALGOL they quite liked and brought them across." As the decades passed, Fortran remained the centre of gravity for scientific computing while ALGOL became more of an academic language, used for teaching computer science ideas. "It was quite heavily used in the scientific community," Herbert said. "Most mainframe manufacturers supported it."

Some of the team behind ALGOL 60 stayed with the project and went on to come up with ALGOL 68, which, as far as Herbert is concerned, "nailed all the things that ALGOL 60 had left a bit vague". Indeed, it was hard to avoid in the 1970s for those taking computer science courses. This hack has fond memories of the successor language, while the grandfather of Reg sub-editor Richard Currie had a hand in the development of ALGOL 68-R and RS.

"It had the world's most exotic input output system," Herbert laughed.

It was also, sadly for its enthusiasts, a bit of a dead end. Despite ALGOL 68-R becoming widely used in (particularly British) military applications for a time, it would take until the 1970s for a full implementation of ALGOL 68 to become available.
The last edition of The ALGOL Bulletin was published in 1988, with its editor noting: "ALGOL 68 as a language is very stable. It is used and loved by those who understand its benefits, and ignored (or misquoted) by the rest."

The story of ALGOL 60 is not so much that of the language's eventual fate as of the languages it inspired. ALGOL W, based on a proposal for ALGOL X by Niklaus Wirth and QuickSort creator Tony Hoare, would go on to inspire Wirth's Pascal and Modula-2. Pascal's influence continues to be felt today. ALGOL 60 also heavily influenced the Combined Programming Language (CPL), developed in the 1960s but not implemented until the following decade. CPL in turn led to Basic CPL (BCPL), from which B descended. The B language was further developed to become C.

Tony Hoare was responsible for the implementation of ALGOL 60 on the Elliott 803 computer, an example of which remains operational at The National Museum of Computing, although compiling and running a program on that hardware is a little different to the development environments to which coders are now accustomed.

First, the compiler must be loaded from paper tape. The ALGOL program itself is then fed into the tape reader and "it sort of chunters away," remarked Onion, "for anything between 30 seconds to perhaps 15 or 20 minutes during the compilation."

Behold the chuntering

Once compiled, a program would be free to use the space originally occupied by the compiler. Doing so would, however, not win the programmer any popularity awards since the next user would have to reload the compiler again. Leaving it in memory meant that multiple programs could be run.
"That made it very popular for teaching," said Herbert, "because you can have a line of students, each with their paper tape with their programme in their hand and you basically march up to the machine, the machine's got the ALGOL system loaded, you run your programme, it produces gibberish, you go away and think about it and the next student runs their programme."

With paper tape being king, Onion observed that the experience of programming taught a bit of focus: "When your edit, compile, edit, compile cycle starts to get above about 10 minutes, you start to pay an awful lot of attention to your source code…"

The National Museum of Computing has two Elliott machines in its collection, a 1962 803B (which was donated after spending 15 years lurking in a barn following its decommissioning) and a 903. Both are fully operational and can be seen running once the museum is able to open its doors once again. The 803B, which is maintained by Onion, also features a Calcomp drum plotter as well as some additional input/output features.

As for taking ALGOL 60 itself out for a spin today, there are a few options for those not fortunate enough to have an Elliott 803 or 903 to hand. MARST will translate ALGOL 60 to C, or one can get a feel for the whole 803 experience via a simulator. Although as ALGOL 60 turns 60, you could just fire up a modern programming language. Lurking within will likely be the ideas of ALGOL's designers. ®
Source: comp.lang.fortran (via narkive) – "Book on the history of Fortran". https://comp.lang.fortran.narkive.com/OJz1mnGC/book-on-the-history-of-fortran
Beliavsky wrote:

There is a recent self-published 326-page book "Abstracting Away the Machine: The History of the FORTRAN Programming Language" (2019) by Mark Jones Lorenzo, available from Amazon for $13. His 2017 book "Endless Loop: The History of the BASIC Programming Language" got good reviews on Amazon. https://books.google.com/books/about/Abstracting_Away_the_Machine.html?id=Ypy-yAEACAAJ

At the dawn of the computer age, an elite development team at IBM built the most influential computer programming language in history: FORTRAN. Abstracting Away the Machine tells the epic story of how they did it--and what happened next.

Over the past six decades, programming languages like ALGOL, BASIC, C/C++, COBOL, Java, LISP, LOGO, Pascal, PL/I, Python, Visual Basic, and many others opened up the field of computer science, and of computer programming in general, to the masses. But all of these high-level languages (HLLs)--computer languages that automate, hide, or otherwise abstract away the underlying operations of the machine--owe a huge debt of gratitude to FORTRAN (FORmula TRANslation), the first HLL to achieve widespread adoption. Many programming practices that we take for granted now came about as a result of FORTRAN.

Created over a three-year period at IBM by a development team led by a brilliant but wayward mathematician named John W. Backus, FORTRAN was implemented initially on the IBM 704 mainframe computer in the mid-1950s, with dialects of the language quickly spreading thereafter to other platforms. FORTRAN's powerful compiler, which translated human-readable code into code a computer could understand, produced incredibly clean and optimized standalone executable programs, all of which could be run independently of the compiler, setting the standard for decades to come--and overcoming the doubts of many skeptics along the way, who thought the FORTRAN project would never succeed.
In the 1960s the language was standardized, with machine-dependent commands excised, and many platform-independent implementations followed. With the language now portable, able to run on any computer (at least in theory), FORTRAN, almost by accident, secured a stranglehold in the fields of science and engineering. The language also came to dominate in the supercomputing industry. But FORTRAN, a blue-collar workhorse more concerned with results than with style, was a victim of its own success--the language sowed the seeds of its own demise. New high-level languages sprouted up, stealing the good bits from FORTRAN while simultaneously defining themselves in opposition to it. FORTRAN had become the foil. As these new languages pierced the cutting edge of the programming landscape, they redefined computing paradigms (e.g., with structured programming, object-oriented programming, and the like), and FORTRAN--though eventually (and repeatedly) modernized and formally renamed Fortran--struggled to keep up through multiple standardization efforts, finally ceding significant ground to its successors as it slowly withdrew from the spotlight. To add insult to injury, even John Backus eventually turned against his creation. This is not a book on how to program in FORTRAN, nor is it a technical manual. Rather, the focus in Abstracting Away the Machine, which chronicles the complete history and development of the FORTRAN programming language, is set squarely on telling three interlocking stories: (1) How an elite group of computing trailblazers built FORTRAN, (2) Why the conditions at the time were ripe for them to succeed, and (3) What happened after they did. Tracing the long arc of FORTRAN's development and maturation is integral to understanding not only the history of programming but also the state of computer science today. 
The birth of FORTRAN planted a seed that led to the full flowering of high-level languages, since FORTRAN overcame initial skepticism by demonstrating to the world that a well-made HLL really could abstract away the machine.
Source: Oxford Reference – "Algol". https://www.oxfordreference.com/display/10.1093/oi/authority.20110803095402287
Acronym for algorithmic language. The generic name for a family of high-level languages of great significance in the development of computing. In 1958 the Association for Computing Machinery (ACM) in the US and the Gesellschaft für Angewandte Mathematik und Mechanik (GAMM) in Europe set up a joint committee to define an international algorithmic language (IAL). The language that was designed became known as Algol, and was later called Algol 58 to distinguish it from later versions. Algol 58 was not intended to be a viable language, and in 1960 an augmented committee was convened to devise the second iteration, which was published as the language Algol 60. See also Jovial. Algol 60 was much more popular in Europe than in the US, probably due to the dominance of IBM and Fortran in the North American market. It introduced many new concepts, notably block structure (see block-structured languages), nested scopes, modes of parameter passing to procedures, and the definition of the language introduced the now classic BNF notation for describing syntax. The influence of Algol 60 can be seen in all succeeding languages, and it stands as a milestone in the development of programming languages. In the years following the publication of the Algol 60 Report, a working group of the International Federation for Information Processing was set up to consider the definition of a successor to Algol 60. There were many dissensions within the group, and eventually a minority report was issued proposing the language Algol 68. Although Algol 68 introduced many novel concepts of great theoretical interest and significance, its specification was very difficult to understand and its practical application was almost nil. One of the most significant effects of the split in the Algol 68 working group is that it led indirectly to the development of Pascal.
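The "now classic BNF notation" mentioned above writes each syntactic category in angle brackets and separates alternatives with vertical bars. As a condensed sketch (not a verbatim excerpt from the report), the ALGOL 60 definition of numerals begins along these lines:

```
<digit>            ::= 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9
<unsigned integer> ::= <digit> | <unsigned integer> <digit>
```

The second production is left-recursive: an unsigned integer is either a single digit or a shorter unsigned integer followed by one more digit, which is how BNF expresses repetition without any dedicated looping construct.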
4768
dbpedia
0
90
https://viblo.asia/p/programming-languages-milestones-an-overview-from-1960-present-part-1-yMnKMq2rK7P
en
Programming languages' milestones: An overview from 1960 - present (Part 1)
https://viblo.asia/og-facebook-3.png
https://viblo.asia/og-facebook-3.png
[ "https://viblo.asia/logo_full.svg", "https://viblo.asia/images/vi-flag-32x48.png", "https://viblo.asia/images/mm.png", "https://images.viblo.asia/3eb5c055-ef0b-48e3-877b-c62ab3a318d9.png", "https://images.viblo.asia/74336ca5-00ba-4863-b4e3-02db5dc50f58.png", "https://images.viblo.asia/3681b025-f583-4b81-83c4-aa368244f889.png", "https://images.viblo.asia/1d1538f1-18fc-467d-aa50-dd76ff6caa48.png", "https://images.viblo.asia/cda60dfc-47ba-4b7e-b6c2-72fb340dde60.png", "https://viblo.asia/images/viblo-code.png", "https://viblo.asia/images/viblo-ctf.png", "https://viblo.asia/images/viblo-learn.png", "https://viblo.asia/favicon.ico", "https://viblo.asia/images/viblo-code.png", "https://viblo.asia/images/viblo-ctf.png", "https://viblo.asia/images/viblo-cv.png", "https://viblo.asia/images/viblo-learn.png", "https://viblo.asia/images/viblo-partner.png", "https://viblo.asia/images/viblo-battle.png", "https://viblo.asia/images/viblo-interview.ico", "https://play.google.com/intl/en_us/badges/images/generic/en_badge_web_generic.png", "https://viblo.asia/_nuxt/img/app-store-badge.8c4986ee4828b47d16f5cd694ef065f2.svg", "https://viblo.asia/images/atom-editor.svg", "https://images.dmca.com/Badges/dmca-badge-w100-5x1-07.png?ID=41818fcd-5a60-4504-867a-38fde606354e", "https://viblo.asia/favicon-white-128.png" ]
[]
[]
[ "programming language", "abstraction", "history", "milestones", "Machine Language", "FORTRAN", "COBOL", "ALGOL", "PL/I" ]
null
[ "Do Ha Long", "viblo.asia", "do.ha.long" ]
2017-06-23T02:59:00+00:00
Introduction We are living in a world where computers and electronic devices play a pivotal part of our life. No one can alter the fact that wherever we go, whatever we do in this modern life, it is...
en
/favicon.ico
Viblo
https://viblo.asia/p/programming-languages-milestones-an-overview-from-1960-present-part-1-yMnKMq2rK7P
Introduction We are living in a world where computers and electronic devices play a pivotal part in our lives. No one can deny the fact that wherever we go, whatever we do in this modern life, it is not hard to witness their presence: tiny handheld devices, sparkling traffic lights, even the remotes we use regularly. All of these are achievements of information technology's continuous development over many years. Back in the twentieth century, one of the most remarkable technology events took place: while the whole world was recovering from the wounds of the Second World War, the first general-purpose computer became public under the name ENIAC (Electronic Numerical Integrator and Computer). Although it was developed under wartime pressure, it has been of value not only in producing results but in pointing the way toward improvements for future designs (Brainerd and Sharpless, 1948), because general-purpose computers have the amazing property that a single piece of hardware can do any computation imaginable (Grant, Palmer and Smith, 2017). Before general-purpose computers were born, there had been special-purpose computers for arithmetic calculations, which had to be manually reconfigured to carry out different calculations – such as the abacus, the earliest known device for calculation (Gujarat Education Department, n.d.). In fact, the development of programming languages was the most important precondition for making computers and electronic devices as successful as they are today. This part briefly describes several milestones in the great history of general-purpose programming languages, along with a brief introduction to domain-specific languages. General-purpose programming languages' milestones 1. From the early twentieth century to 1960 The development of general-purpose programming languages predates the development of computers.
In the early 20th century, logicians created their own programming languages, originally springing from the concept of a proof system (Grant, Palmer and Smith, 2017). Illustrating this spirit, the λ-calculus – an abstract language of functions designed and implemented in 1932 by Frege, Church and Curry – was born. The λ-calculus provides a simple semantics for computations; one of the clearest examples of its point of view is that it treats every function “anonymously”. After the λ-calculus was released, no reliable material points out a single first general-purpose programming language, yet the most noteworthy programming language of that time was FORTRAN, in 1956, because it was the first programming language equipped with a complete compiler (Backus, 1956). Before FORTRAN, almost all programs were written in machine language or assembly language (Wexelblat, 2014). At that time, programmers considered their work a complicated and creative art that required human inventiveness to produce an efficient program; FORTRAN was therefore born to solve these problems with the support of index registers, built-in floating-point operations, a wide range of instruction sets and primitive input–output arrangements. Since 1956, FORTRAN has been developed through many variants, namely: FORTRAN II (1958), FORTRAN III (1958), FORTRAN IV (1962), FORTRAN 66 (1966), FORTRAN 77 (1977), Fortran 90 (1991; since this version, the name has been written in lower-case form), Fortran 95 (1995), Fortran 2003 (2003) and Fortran 2008 (2008). However, 2008 was not the last destination in Fortran’s development: according to Intel, Fortran 2015 will be released in July 2018 with up-to-date features, most of which extend the C interoperability features that first appeared in Fortran 2003. A few years after the release of Fortran, in 1958, a remarkable product inspired and influenced by this programming language was made public – ALGOL 58 (Wexelblat, 2014).
ALGOL 58 was developed in part by the father of Fortran, John Backus, using the Backus normal form method of describing programming languages. Later, the notation was revised and expanded by Peter Naur for ALGOL 60 and, at Donald Knuth's suggestion, renamed Backus–Naur form (Knuth, 1964). Since ALGOL 60’s release, many other general-purpose programming languages have been published following its spirit and paradigm, such as Pascal and C (although C differs from most ALGOL-like languages in some significant ways) (Mitchell, 2003). The most distinctive features that distinguished ALGOL 60 from Fortran were block-oriented control structures (i.e. procedure, comment, begin, end) and variable scoping. Last but not least, during this period the use of computers for business applications began. “Special computers were developed for this purpose, along with special languages. The first successful high-level language for business was COBOL” (Sebesta, 2012). This general-purpose programming language was designed for business applications programming; its domain therefore lay primarily in business, finance and administrative systems for companies and governments. According to Mitchell, COBOL is still widely used in legacy applications deployed on mainframe computers, such as large-scale batch and transaction processing jobs; in fact, most programming in COBOL today is purely to maintain existing applications (2006). According to Yatsko and Suslow, the 1950s–1960s are known as a period of exploiting machine power (2015). Never before had humanity seen such a powerful and innovative revolution in programming languages. Among these brilliant languages, as mentioned above, the three first and most significant general-purpose programming languages – the ones that most influenced later generations of languages – are FORTRAN, ALGOL and COBOL.
In spite of the fact that each of them was designed to be used in a particular domain, they were still considered general-purpose languages: a scientific application can still be written in COBOL, or a business application in FORTRAN (Dearle, 2015). 2. From 1960 to 1970 According to Wegner, the period from 1960 to 1970 was a period of elaboration of the programming languages developed in the 1950s, and of analysis for the purpose of constructing models and theories of programming languages (1976). Among all the programming languages released during this time (such as PL/I, Simula 67, ALGOL 68 and Snobol 4), PL/I was the most remarkable. Discussing PL/I, Wegner added: “An attempt to combine the good features of FORTRAN, ALGOL, COBOL and other programming languages created in 1950’s”. In fact, Pugh, Johnson and Palmer have pointed out some major goals for PL/I at that time:
- Competitiveness with COBOL's record handling and report writing capabilities.
- A "scope of usefulness" that grew to include system programming and event-driven programming.
- Competitiveness with FORTRAN in the performance of compiled code.
- Extensibility for new hardware and application areas (this goal was not achieved, however).
- Improved productivity and time scales of the programming process, transferring effort from the programmer to the compiler.
- Machine independence and effective operation across the main hardware and operating system ranges.
(To be continued...) References
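The block structure and variable scoping credited above to ALGOL 60 can be mimicked in Python, where nested functions stand in for ALGOL's begin/end blocks. This is a rough sketch under that stand-in assumption, since Python has no bare block scopes:

```python
def outer_block():
    # Outer "block" declares x, as in an enclosing ALGOL begin ... end.
    x = 5

    def inner_block():
        # Inner block declares its own x, shadowing the outer one.
        x = 10
        return x

    inner_value = inner_block()
    return inner_value, x  # the outer x is unaffected by the shadowing

print(outer_block())  # (10, 5)
```

The inner assignment never touches the outer variable, which is exactly the protection against accidental modification and name clashes that block scoping provides.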
4768
dbpedia
3
8
https://retrocomputing.stackexchange.com/questions/9732/what-was-the-first-language-to-offer-full-structured-programming-support
en
What was the first language to offer "full" structured programming support?
https://cdn.sstatic.net/…g?v=6b27f39a6023
https://cdn.sstatic.net/…g?v=6b27f39a6023
[ "https://lh6.googleusercontent.com/-74M83RBb6ys/AAAAAAAAAAI/AAAAAAAAABM/igPkit2LlSc/photo.jpg?sz=64", "https://i.sstatic.net/36WSK.png?s=64", "https://i.sstatic.net/J38sS.jpg?s=64", "https://www.gravatar.com/avatar/645dbb4f7ba1ff9d7ff2de02f68fbe9c?s=64&d=identicon&r=PG", "https://retrocomputing.stackexchange.com/posts/9732/ivc/acfa?prg=48f9fcc4-7654-467f-8708-e72c9a0977c0" ]
[]
[]
[ "" ]
null
[ "R. Schmitz" ]
2019-04-18T12:49:48
In the same vein as this question about conditionals, I am interested in the first time a language provided functionality we assume as a given for modern programming languages. The question is when a
en
https://cdn.sstatic.net/Sites/retrocomputing/Img/favicon.ico?v=9b6f1e4fd156
Retrocomputing Stack Exchange
https://retrocomputing.stackexchange.com/questions/9732/what-was-the-first-language-to-offer-full-structured-programming-support
A couple of misconceptions in this one: Recursion is just another kind of iteration. It was in fact the only iteration mechanism provided in early versions of Lisp. Any use of recursion can be transformed into standard looping iteration (although many such transformations require the use of a stack as well). Recursion can be viewed as kind of a hack to be able to leverage the program stack for your stack usage while iterating (in languages that use a program stack). Likewise, early Lisp demonstrated that it's possible (although not always convenient) to use recursion for all your iteration needs. I have never before heard that support for recursion is required for a language to be considered a "structured language". Really all that is required is that the language be Turing complete without relying on "unstructured" branch statements (aka GOTOs). Now with all these caveats, most of the early "Structured Programming" developers were at the same time involved in the development and use of ALGOL. This includes most prominently Dijkstra, who was the leading light of the Structured Programming movement and developed the first ALGOL 60 compiler. Hoare and Dahl were also very involved in the development of both. The other major languages extant at the time were COBOL and FORTRAN. Early FORTRAN in particular made extensive use of GOTOs, along with data overlays that are essentially the data equivalent of GOTOs, and thus a large amount of the development of ALGOL was in reaction to FORTRAN. So if you are looking for the early standard-bearer for the paradigm, ALGOL is probably your language. This is a copy-paste of my answer to the other question; R. Schmitz said I could do it though. If we are talking about a block-structured conditional like

    if condition
        some arbitrary sequence of statements, including perhaps nested ifs
    else
        some other arbitrary sequence of statements, including perhaps nested ifs

rather than just

    if ... goto some label

which is no different from a conditional branch, then two early contenders would be:

- Lisp (1958), which has an if and a cond function. I think cond predates if; when I learned Lisp in the 1980s I'm fairly sure that if wasn't there. Lisp also had recursion, which means it had iteration by definition.
- Algol 60 (1960), which has the structured if inherited by most modern imperative languages, as well as while and for loops and recursion.

Early versions of FORTRAN and COBOL did not have structured if statements as far as I know, or recursion.
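The answer's claim that any recursion can be turned into standard looping iteration, often with an explicit stack, can be sketched in Python. The nested-list "tree" representation here is an assumption chosen purely for illustration:

```python
def depth_recursive(tree):
    """Depth of a nested-list 'tree', written with plain recursion."""
    if not isinstance(tree, list):
        return 0
    return 1 + max((depth_recursive(child) for child in tree), default=0)

def depth_iterative(tree):
    """The same computation as a loop plus an explicit stack."""
    best = 0
    stack = [(tree, 0)]  # (node, depth at which the node appears)
    while stack:
        node, depth = stack.pop()
        if isinstance(node, list):
            best = max(best, depth + 1)
            for child in node:
                stack.append((child, depth + 1))
        else:
            best = max(best, depth)
    return best

example = [1, [2, [3]]]
print(depth_recursive(example), depth_iterative(example))  # 3 3
```

The explicit stack plays exactly the role the program stack plays in the recursive version, which is the point the answer makes.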
4768
dbpedia
2
31
https://medium.com/%40romaninsh/from-fortran-to-modern-programming-languages-evolution-of-concepts-and-paradigms-part-3-cf59c3a04a6b
en
From Fortran to modern programming languages — evolution of concepts and paradigms (Part 3)
https://miro.medium.com/v2/5d8de952517e8160e40ef9841c781cdc14a5db313057fa3c3de41c6f5b494b19
https://miro.medium.com/v2/5d8de952517e8160e40ef9841c781cdc14a5db313057fa3c3de41c6f5b494b19
[ "https://miro.medium.com/v2/resize:fill:64:64/1*dmbNkD5D-u45r44go_cf0g.png", "https://miro.medium.com/v2/resize:fill:88:88/0*t5OLtrm8aye_5YN1.jpg", "https://miro.medium.com/v2/resize:fill:144:144/0*t5OLtrm8aye_5YN1.jpg" ]
[]
[]
[ "" ]
null
[ "Romans Malinovskis", "medium.com" ]
2023-08-29T23:13:18.899000+00:00
(Part 2: Control Structures https://medium.com/@romaninsh/from-fortran-to-modern-programming-languages-evolution-of-concepts-and-paradigms-part-2-8c304fcc67e7) Fortran was the first high-level…
en
https://miro.medium.com/v2/5d8de952517e8160e40ef9841c781cdc14a5db313057fa3c3de41c6f5b494b19
Medium
https://medium.com/@romaninsh/from-fortran-to-modern-programming-languages-evolution-of-concepts-and-paradigms-part-3-cf59c3a04a6b
Romans Malinovskis · Aug 29, 2023

Functions and Subroutines

(Part 2: Control Structures https://medium.com/@romaninsh/from-fortran-to-modern-programming-languages-evolution-of-concepts-and-paradigms-part-2-8c304fcc67e7)

Fortran was the first high-level programming language to introduce the concepts of functions and subroutines as we understand them today. While Assembly languages had the concept of “subroutines” or “procedures” (sequences of instructions that could be jumped to from multiple points in the code and then return to the calling point), they were low-level and lacked the abstraction and ease-of-use that Fortran’s functions and subroutines provided.

Functions: In Fortran, a function is used when you want to return a value. Here’s a simple example of a function that calculates the square of a number:

    FUNCTION square(x)
      REAL :: x, square
      square = x * x
    END FUNCTION square

Subroutines: Subroutines are similar to functions but don’t return a value. They are used for procedures that perform an action rather than compute a result to be returned.

    SUBROUTINE print_hello()
      PRINT *, 'Hello, World!'
    END SUBROUTINE print_hello

Both functions and subroutines could accept parameters, making them versatile tools for creating more modular and maintainable code.

ALGOL 60

After Fortran, the next significant language that brought about major refinements to the concept of functions and subroutines was ALGOL 60.

ALGOL 60 and Its Innovations:

Block Structure: ALGOL 60 introduced the idea of lexically scoped block structures. This allowed for variables (and functions) to have limited scope, which was a major advancement in structuring code and preventing accidental variable modifications or name clashes.
    begin
        integer x;
        x := 5;
        begin
            integer x;
            x := 10;
        end;
        print(x); { This will print 5 }
    end;

Value and Name Parameters: ALGOL 60 introduced the distinction between passing parameters by value (where the function/subroutine receives a copy) and by name (where it operates on the original). This was an early precursor to the more common pass-by-value and pass-by-reference distinctions in later languages.

    procedure swap(value a, name b);
        integer a, b, temp;
    begin
        temp := a;
        a := b;
        b := temp;
    end

Recursive Procedures: ALGOL 60 was one of the first languages to support recursion natively, allowing functions and procedures to call themselves. This opened the door to a wide range of algorithms and programming techniques.

    procedure factorial(n);
        value n; integer n;
    begin
        if n <= 1 then
            factorial := 1
        else
            factorial := n * factorial(n - 1)
    end;

Local Procedures: ALGOL 60 allowed for procedures to be defined inside other procedures. These local procedures could access the variables of the enclosing procedure.

    procedure outer(x);
        integer x;
    begin
        procedure inner(y);
            integer y;
        begin
            print(x + y);
        end;
        inner(5);
    end;

ALGOL 60’s refinements set the stage for many of the features we see in modern languages. The idea of block structure became fundamental. The concept of recursion, while mathematically known, became a practical tool for programmers thanks to ALGOL 60.

Pascal

After ALGOL 60, the next language that brought significant innovations to the concept of functions and subroutines was Pascal. Designed by Niklaus Wirth in the late 1960s and early 1970s, Pascal further advanced structured programming concepts and introduced several features that impacted how functions and procedures were used:

Strong Typing: Pascal emphasized strong typing, which extended to function signatures. This meant that functions and procedures had explicitly defined input and output types, leading to safer and more predictable code.
    function add(a, b: integer): integer;
    begin
        add := a + b;
    end;

Procedures vs. Functions: Pascal made a clear distinction between procedures (which did not return a value) and functions (which did). This explicit distinction improved code readability.

Nested Functions and Procedures: Like ALGOL 60, Pascal supported nested functions and procedures, but with more rigorous scoping rules. This allowed for better encapsulation and modular code.

Pass-by-Value and Pass-by-Reference: Pascal introduced the var keyword to clearly indicate when a parameter was being passed by reference (and could be modified by the procedure) versus being passed by value (where the procedure received a copy).

    procedure swap(var a, b: integer);
    var temp: integer;
    begin
        temp := a;
        a := b;
        b := temp;
    end;

Default Parameters and Overloading: Although these features became more prominent in Object Pascal (like in Delphi), the groundwork was laid in standard Pascal. Default parameters allowed functions and procedures to be called with fewer arguments than they declared, while overloading allowed multiple functions or procedures with the same name but different parameters.

C

After Pascal, the next language that brought significant changes to the concept of functions and subroutines was C. Though it was developed roughly around the same time as Pascal, C and its descendants (like C++, C#, and Objective-C) took a different direction and introduced several concepts that have become foundational in modern software development.

Function Pointers: C introduced the idea of pointers to functions, allowing functions to be passed as arguments, stored in arrays, and more. This laid the groundwork for callback mechanisms and later influenced object-oriented and functional programming paradigms.
int add(int x, int y) { return x + y; } int operate(int x, int y, int (*operation)(int, int)) { return operation(x, y); } int main() { printf("%d", operate(5, 3, add)); // Outputs 8 } Header Files and Separate Compilation: While not strictly about functions and subroutines, the way C handles external function declarations with header files (.h) and source files (.c) is significant. It allows for modular development and separate compilation. Variadic Functions: C introduced functions that could take a variable number of arguments. The printf function is a classic example. int printf(const char *format, ...); No Nested Functions: Unlike Pascal and ALGOL, C does not support nested functions. All functions have file scope. Everything is a function: In C all the code must reside in some kind of function and the first function to be executed is called main. While C streamlined many concepts, its true power and significance come from the flexibility it offers, especially through pointers and direct memory manipulation. This made it both potent and risky, but with the right practices, it became the backbone of systems programming, influencing a whole lineage of languages that followed. Smalltalk After C, the language that brought significant innovations to functions and subroutines was Smalltalk. Developed in the 1970s, Smalltalk was one of the first fully object-oriented languages, and its approach to functions (referred to as “methods” in the context of object-oriented programming) was groundbreaking. Smalltalk and Its Innovations: Messages Instead of Function Calls: In Smalltalk, you don’t “call a method on an object.” Instead, you “send a message to an object.” The object then decides how to respond to that message, typically by invoking a method. This paradigm shift was foundational to the way object-oriented programming evolved. result := anObject aMessage. 
Polymorphism: Due to the message-passing paradigm, different objects could respond to the same message in different ways. This is polymorphism, where a single message can have different meanings depending on the object receiving it. Everything is an Object: In Smalltalk, everything, including numbers and functions (methods), is an object. This consistency made the language incredibly expressive. For example, to increment a number by one and then multiply it, you’d send messages to a number just like you would with any other object: result := 3 + 1. "Sends the message '+ 1' to the object 3" Blocks and Closures: Smalltalk introduced the concept of “blocks” which are essentially anonymous functions or lambdas. Blocks can capture variables from their surrounding context, creating closures. aBlock := [ :x | x + 1 ]. result := aBlock value: 5. "result will be 6" Dynamic Typing and Late Binding: Unlike many languages of its time, Smalltalk uses dynamic typing. Methods are bound at runtime, not compile-time. This flexibility allows for powerful programming techniques but also requires more runtime checks. Introspection and Reflection: Smalltalk objects can inspect and modify their structure and behavior at runtime. This includes changing their class, examining their methods, or even altering their methods. No Distinction Between Functions and Procedures: In Smalltalk, there’s no distinction as in Pascal between functions (which return values) and procedures (which don’t). Instead, every method in Smalltalk returns a value. If a return value isn’t specified, a method returns self (the receiver of the message). Lisp After Smalltalk, Lisp (and its various dialects, especially Scheme and Common Lisp) brought significant innovations and perspectives to the concept of functions and subroutines. 
While Lisp predates Smalltalk (the first version was created in the late 1950s), its evolution throughout the 1970s and 1980s, especially with the introduction of Scheme, saw the crystallization and popularization of many of its groundbreaking concepts. Lisp and Its Innovations: First-Class Functions: In Lisp, functions are first-class citizens, meaning they can be passed as arguments, returned as values, and assigned to variables. This wasn’t entirely new, but Lisp took it to the core of its design. (defun apply-twice (f x) (funcall f (funcall f x))) Lambda Functions: Anonymous functions can be defined on-the-fly using the lambda keyword. These can be used wherever a function is expected. (mapcar (lambda (x) (* x x)) '(1 2 3 4)) ; Squares each element in the list Closures: Lisp supports closures, where functions can “capture” variables from their enclosing environment. (defun make-adder (x) (lambda (y) (+ x y))) Macros: One of Lisp’s most powerful features. Macros allow for code that generates code, enabling powerful metaprogramming capabilities. This leads to the famous saying, “Code is data, and data is code” in the Lisp community. (defmacro when (cond &rest body) `(if ,cond (progn ,@body))) Tail Recursion: Scheme, a dialect of Lisp, emphasized tail recursion, allowing for efficient recursive algorithms by reusing the current function’s call frame for the next call, given it’s in the “tail” position. Continuations: Especially in Scheme, the concept of first-class continuations was introduced. This is a powerful control structure that allows the capture of the current state of the computation, which can be resumed later. (similar to yield?) Dynamic Scoping: Early Lisp used dynamic scoping, where the value of a variable was determined by the calling function, not the defining function. Later dialects, like Scheme, emphasized lexical scoping (similar to most modern languages), but dynamic scoping was an interesting and influential aspect of early Lisp. 
Multi-Methods: In Common Lisp, functions (or methods) aren’t tied to a specific object or class. With the Object System (CLOS), methods can be defined on any combination of arguments, not just the first one. This allows for more generalized polymorphism. Lisp’s approach to functions and its overall philosophy of homoiconicity (where the code has the same structure as its data) has been deeply influential, inspiring features in numerous later languages, from JavaScript’s first-class functions to Python’s lambda functions and macros in various languages. Its emphasis on recursion, symbolic computation, and metaprogramming has left a lasting legacy on the field of computer science. C++ In the realm of imperative languages, after Lisp, the next major language that brought significant changes to the concept of functions and subroutines was C++. Member Functions: With the introduction of classes in C++, functions could now be associated directly with a specific class as member functions (often referred to as “methods” in other OOP languages). This was foundational to the object-oriented paradigm in C++. class Rectangle { public: void setValues(int w, int h) { width = w; height = h; } private: int width, height; }; Constructors and Destructors: Special member functions that are called when an object is created and destroyed, respectively. class Rectangle { public: Rectangle(int w, int h) : width(w), height(h) {} ~Rectangle() {} private: int width, height; }; Overloading: C++ introduced the ability to define multiple functions with the same name but different parameters, known as function overloading. void print(int i); void print(double d); Virtual Functions: These enable polymorphism, where a base class pointer can point to objects of any of its derived classes, and the appropriate function is called based on the actual type of the object. 
class Base { public: virtual void show() { cout << "Base\n"; } }; class Derived : public Base { public: void show() { cout << "Derived\n"; } }; Templates: C++ introduced the idea of generic programming with templates. You can write a single function (or class) that can operate on multiple data types. template <typename T> T add(T a, T b) { return a + b; } Default Arguments: Functions in C++ can have default arguments, allowing them to be called with fewer arguments than they declare. void display(int a = 0, int b = 0) { cout << a << " " << b << endl; } C++ significantly expanded the concept of functions and subroutines by integrating them into its object-oriented and generic programming features. Its influence is evident in many subsequent languages, especially those that sought to combine procedural, object-oriented, and generic programming paradigms. Object-oriented programming dominated mainstream languages until recently, where languages like Swift, Go and Rust have proposed alternative means of abstraction. Post-C++ Lambda Expressions and Anonymous Functions: While these concepts have roots in languages like Lisp and Scheme, they gained widespread attention in mainstream languages like Java (from Java 8 onwards) and C#. list.forEach(item -> System.out.println(item)); Closures: Languages like JavaScript popularized closures, where functions can “capture” variables from their enclosing environment. function outer() { let x = 10; function inner() { console.log(x); } return inner; } Extension Methods: C# introduced extension methods, allowing developers to “add” methods to existing types without modifying them. public static class StringExtensions { public static bool IsNumeric(this string s) { // Implementation here } } Promises and Futures: Asynchronous programming models in languages like JavaScript and Python introduced promises and futures for more manageable asynchronous code. 
fetch(url) .then(response => response.json()) .then(data => console.log(data)); Pattern Matching in Function Parameters: Languages like Haskell, Scala, and more recently Rust and Python, have embraced pattern matching, allowing functions to define multiple “versions” based on the shape and content of their inputs. def factorial(n: Int): Int = n match { case 0 => 1 case _ => n * factorial(n-1) } Variadic Generics: Languages like TypeScript (though primarily for type-checking) introduced the idea of variadic generics, offering more flexibility in function and type definitions. function concat<T extends string[]>(...strings: T): string { return strings.join(""); } Coroutines: Languages like Kotlin and Python have emphasized coroutines, allowing functions to be paused and resumed, which is useful for asynchronous programming without deep nesting or callback hell. suspend fun fetchData(): DataType { // Implementation here } Function Type Inference: Modern languages like TypeScript, Rust, and Swift have strong type inference mechanisms, allowing functions to have clearly defined types without always requiring explicit annotations. Pipeline Operators: Some languages, like Elixir and more recently proposed in JavaScript, introduced the concept of a pipeline operator to streamline function chaining. "Elixir" |> String.downcase() |> String.reverse() Tail Call Optimization: While this concept has been around since the Lisp days, languages like Elixir and some modern implementations of older languages have emphasized tail call optimization to allow recursive functions to run with constant stack space. Rust Rust, known for its focus on memory safety without sacrificing performance, introduced several features and paradigms in the realm of functions and subroutines: Ownership and Borrowing: At the heart of Rust’s memory safety guarantees are the concepts of ownership, borrowing, and lifetimes. They dictate how functions can use and share data. 
fn takes_ownership(some_string: String) { ... } fn borrows_immutable(s: &String) { ... } fn borrows_mutable(s: &mut String) { ... } Pattern Matching in Function Parameters: Similar to some functional languages, Rust uses pattern matching extensively, including in function parameters. fn match_example(x: Option<i32>) { match x { Some(val) => println!("Value is: {}", val), None => println!("No value"), } } Associated Functions and Methods: In Rust’s object-oriented features, you can define functions associated with a type (similar to static methods in other languages) and methods that take a reference to self as their first parameter. (Note — Rust does not have standard implementation for Objects and Classes and instead uses structs and traits) struct Circle { radius: f64, } impl Circle { // associated function fn new(radius: f64) -> Circle { Circle { radius } } // method fn area(&self) -> f64 { 3.14 * self.radius * self.radius } } Closures with Environment Capture: Rust supports closures (anonymous functions) that can capture their environment, with type inference and a concise syntax. let add_one = |x| x + 1; Trait-Based Function Overloading: Rust doesn’t support traditional function overloading, but it does allow function overloading using traits and its impl mechanism. Concurrency without Data Races: Rust introduces a set of rules and checks that ensure thread safety, preventing data races at compile time. The language’s ownership system is crucial to this. Diverging Functions: Rust has functions that do not return, indicated by the ! return type. These are called diverging functions. fn panic_example() -> ! { panic!("This function never returns!"); } Higher Order Functions: Rust’s standard library, especially the Iterator trait, offers a rich set of higher-order functions like map, filter, and fold. 
```rust
let numbers = vec![1, 2, 3, 4, 5];
let doubled: Vec<_> = numbers.iter()
    .map(|&x| x * 2)
    .filter(|&x| x > 5)
    .collect();
println!("{:?}", doubled); // Outputs: [6, 8, 10]
```

**Return Type Polymorphism:** Rust’s generics differ from those of C++ or Java (they are monomorphized and constrained by traits), but traits and associated types allow a form of return type polymorphism, where the same call can produce different types depending on the annotated result.

```rust
trait ConvertibleFrom<T> {
    fn from(value: T) -> Self;
}

impl ConvertibleFrom<&str> for i32 {
    fn from(s: &str) -> Self { s.parse().unwrap() }
}

impl ConvertibleFrom<&str> for f64 {
    fn from(s: &str) -> Self { s.parse().unwrap() }
}

fn convert<T, U: ConvertibleFrom<T>>(input: T) -> U {
    U::from(input)
}

// Usage:
let integer: i32 = convert("42");
let float: f64 = convert("42.5");
```

**Macros:** While not functions in the traditional sense, Rust’s macro system deserves mention. Macros allow for metaprogramming and code generation in ways that functions can’t.

```rust
macro_rules! say_hello {
    () => {
        println!("Hello, World!");
    };
    ($name:expr) => {
        println!("Hello, {}!", $name);
    };
}

// Usage:
say_hello!();        // Outputs: Hello, World!
say_hello!("Alice"); // Outputs: Hello, Alice!
```

Rust’s innovations in the realm of functions and subroutines largely revolve around its ownership and borrowing system, which ensures memory safety and data-race-free concurrency.

Part 4: https://medium.com/@romaninsh/from-fortran-to-modern-programming-languages-evolution-of-concepts-and-paradigms-part-3-b9eb11caee2d
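The trait-based function overloading mentioned in the Rust section above was described without code, so here is a minimal sketch of the idea. The trait `Describe` and the function `print_description` are hypothetical names chosen purely for illustration; the pattern itself (one impl per argument type, dispatched through a single generic function) is the standard way Rust approximates overloading.

```rust
// Sketch of "overloading" via a trait: each argument type gets its own
// impl, and one generic function statically dispatches to the right one.
trait Describe {
    fn describe(&self) -> String;
}

impl Describe for i32 {
    fn describe(&self) -> String {
        format!("the integer {}", self)
    }
}

impl Describe for f64 {
    fn describe(&self) -> String {
        format!("the float {}", self)
    }
}

// A single generic entry point now accepts any type with a Describe impl,
// behaving like an overloaded function resolved at compile time.
fn print_description<T: Describe>(value: T) {
    println!("{}", value.describe());
}

fn main() {
    print_description(7);   // uses the i32 impl: "the integer 7"
    print_description(2.5); // uses the f64 impl: "the float 2.5"
}
```

Because each call is monomorphized, there is no runtime dispatch cost; adding a new "overload" is just another `impl Describe for SomeType` block.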
The entries below summarize, for each language, its major contribution, the motivation for its development, its impact on readability, writability, and reliability, and a source.

FORTRAN — 1954

Major contribution: One feature of FORTRAN I, and of all its successors except Fortran 90, that allows highly optimized compilers is that the types and storage for all variables are fixed before run time.

Motivation: The environment in which FORTRAN was developed was as follows: (1) computers were still small, slow, and relatively unreliable; (2) the primary use of computers was for scientific computations; (3) there were no existing efficient ways to program computers; (4) because of the high cost of computers compared to the cost of programmers, speed of the generated object code was the primary goal of the first FORTRAN compilers.

Readability: FORTRAN increased the readability of major programming languages from the late 1940s to the mid-1950s, since it evolved toward simplicity and orthogonality, gaining new control statements and data structures (names extended to six characters, independent compilation, type names usable as parameters, character string handling, declarations, and subprogram modules).

Writability: Increased, for the same reasons as readability, although its syntax and expressivity are not very good.

Reliability: High, given parameter type checking (FORTRAN 90), user-defined subprograms (present since FORTRAN 0 and I), and its comparatively high readability and writability.
Source: Sebesta, Robert W., Concepts of Programming Languages (5th ed., 2003).

Algol 58 — 1958

Major contribution: Formalized the concept of data type; compound statements.

Motivation: Intended for scientific programming, with syntax close to mathematical notation.

Readability: Programs are hard to read, and code is difficult to follow.

Writability: Parameters are hard to write, as is the structure of programs.

Reliability: Because programs are hard to read, write, and modify, reliability suffers.

Source: Sebesta, Robert W., Concepts of Programming Languages (5th ed., 2003).

LISP — 1958

Major contribution: LISP was the first functional programming language, and it greatly influenced artificial intelligence (AI) from its inception.

Motivation: LISP was designed in 1958 by John McCarthy and Marvin Minsky while they were working on the MIT AI Project. One of the first goals of the project was to produce a list processing system. The Fortran List Processing Language already existed; however, it did not support recursion, conditional expressions, dynamic storage allocation, or implicit deallocation. Dynamically allocated linked lists and implicit deallocation were requirements for symbolic differentiation. McCarthy and Minsky's list processing system was to be used to implement a program named the Advice Taker, and this application provided the incentive to develop the new list processing language LISP. Since LISP is a functional language, assignment statements and variables are not necessary: functions are applied to arguments, and recursive function calls are used rather than loops.

Readability: The simplicity of LISP makes it less readable and harder to understand for someone not familiar with the language, although its simplicity and orthogonality are impressive to some language researchers. The profuse use of parentheses can also decrease readability.

Writability: The simplicity and orthogonality of LISP have a huge positive impact on writability, making it easier to write programs. The lack of assignments and variables also increases the writability of the language.

Reliability: LISP is a fairly reliable functional programming language.
It is the most widely used language for artificial intelligence, and most of the past inefficiencies of LISP code have been eliminated: code runs faster because it is compiled rather than interpreted.

Source: Sebesta, Robert W., Concepts of Programming Languages (6th ed., 2004); http://users.erols.com/ziring/dopl.html

COBOL — 1959

Major contribution: A business application programming language; hierarchical data structures; "English"-like, non-mathematical syntax.

Motivation: COBOL was developed in 1959 to fulfill the needs of a growing base of business computer users. It was not built in an attempt to create a universal programming language, but rather to create a practical language for business applications. COBOL succeeded in that goal, with the result that more lines of COBOL code currently exist than of any other programming language.

Readability: COBOL was the first language to implement "English"-style commands rather than a more mathematical syntax. This decision fell in line with the goal of making COBOL usable by a broader base of programmers, making it more readable and thus accessible to business professionals with less mathematical and computer science background. COBOL defines every variable in a special "data division" of the code, implementing a data dictionary that makes reading and understanding the code easier. COBOL was also the first language to implement hierarchical data structures, providing a first step toward the data abstraction that makes programs more readable by managing complexity.

Writability: The "English" style of COBOL commands allows the programmer to think more in terms of the problem being solved and less in terms of the idiosyncrasies of the programming language and machine implementation. The abstraction provided by hierarchical data structures makes code less complex, and thus easier to write. Finally, because of its business application focus, COBOL often trades power for usability.
Therefore, while a programmer may have a difficult time writing scientific applications in COBOL because of the lack of powerful features, the usability gained in the area of business applications allows the programmer to focus on the problem rather than the language, improving writability.

Reliability: Reliability, as a function of how well a program is understood and can be modified correctly, is increased by COBOL's gains in readability and writability. Based on its widespread use in business-critical applications, COBOL has clearly proven itself a reliable programming tool.

Source: Sebesta, Robert W., Concepts of Programming Languages (6th ed., 2004); http://www.legacyj.com/cobol/cobol_history.html

Algol 60 — 1960

Major contribution: Publication of algorithms; machine independence.

Motivation: To make modifications to Algol 58.

Readability: Understanding programs is difficult; BNF makes the language hard to read.

Writability: No input and output statements were defined.

Reliability: Implementations were inefficient, and programs were hard to modify.

Source: Sebesta, Robert W., Concepts of Programming Languages (5th ed., 2003).

BASIC — 1963

The design of BASIC came from Fortran and Algol 60; Algol 60 had only a minor influence, but with the help of these two languages BASIC was born. In 1963 Kemeny began work on the compiler for the first version of BASIC. BASIC was designed in the spring of 1963 especially for liberal-arts students: John Kemeny and Thomas Kurtz felt that a language was needed for students without a computer science background. Since BASIC was based on Fortran and Algol 60, its readability started from solid ground. "Basic has been criticized for the poor structure of programs written in it, among other things. By [Sebesta's] evaluation criteria, the language does indeed fare very poorly." (Sebesta, 67) BASIC's reputation for reliability rests on its success and on the ease with which it can be learned and implemented on any computer.

Source: Sebesta, Robert W., Concepts of Programming Languages (6th ed., 2004).

PL/I — 1965

Major contribution: PL/I represents the first large-scale attempt to design a language that could be used for a broad spectrum of application areas: the first concurrency, the first exception handling, switch-selectable recursion, the first pointers, and the first array cross-sections. In the 1970s, it enjoyed significant use in both business and scientific applications.

Motivation: By the early 1960s, the users of computers in industry had settled into two separate and quite different camps. Scientific programmers used the floating-point data type and arrays extensively; for business applications, people needed the decimal and character string data types. By early 1963, IBM planners perceived the beginnings of a change in this situation. These perceptions quite naturally led to the concept of designing a single universal computer that would be capable of doing both floating-point and decimal arithmetic, and therefore both scientific and business applications. Along with this came the idea of a programming language that could be used for both.

Readability: Many of the new features of PL/I were poorly designed, and the language was too large and too redundant, so its readability is poor.

Writability: Even though PL/I is not simple or orthogonal enough, it was (and still is) actually used for both scientific and business applications, so its writability is good.

Reliability: Although PL/I was the first language with exception handling and has acceptable writability, its readability is poor and it is not as widely used as FORTRAN or COBOL, so its reliability is not very good.

Source: Sebesta, Robert W., Concepts of Programming Languages (5th ed., 2003).

SIMULA 67 — 1967

Major contribution: SIMULA introduced important object-oriented programming concepts like classes and objects, inheritance, and dynamic binding. It was originally designed and implemented as a language for discrete event simulation.
Motivation: The basic concept of speaking in general and then in more specific terms was adopted as a way of expressing reality in the context of a programming language.

Writability: SIMULA began with the idea of classes and objects that encapsulate procedures. This vastly improved writability, since all the procedures were in one location and superfluous information was removed.

Reliability: Classes increase reliability through the object-oriented concepts that were later developed further in Java.

Source: Sebesta, Robert W., Concepts of Programming Languages (6th ed., 2004); http://java.sun.com/people/jag/SimulaHistory.html

Pascal — 1971

Major contribution: The major contributions of Pascal were the value-result method of passing parameters and the case statement for multiple selections. The largest contribution Pascal made to the programming world, however, was in teaching. In the early 1970s FORTRAN, PL/I, and ALGOL-W were the languages commonly taught to computer science students; by the mid-1970s Pascal had become the most widely used teaching language, and it remained the standard for almost two decades, until the early 1990s. Pascal has affected almost every block-structured language designed since its creation, and still does today.

Motivation: Niklaus Wirth was a member of the International Federation for Information Processing (IFIP) Working Group 2.1, which was created to continue the development of ALGOL in the mid-1960s. In August 1965, Wirth and Tony Hoare contributed to that effort by presenting a proposal for additions and modifications to ALGOL 60. Their proposal was rejected because it was considered too small an advance over ALGOL 60. Later, ALGOL 68 was released, but Wirth and Hoare believed it should not have been released because of its complexity, so they set out to develop a better version of ALGOL called ALGOL-W. From ALGOL-W the programming language known as Pascal was born. Pascal was created to be a simple language that would be widely used and easily teachable.
Readability: Pascal's impact on readability was huge because of its simplicity and expressivity. Programs were easy to read through, and Pascal became the standard of readability for teaching young programmers how to write simple programs.

Writability: Pascal's impact on writability was also large: it was easy for students to write small programs and so learn the language quickly in their classes. However, Pascal lacked support for many complex programming needs, such as the impossibility of writing subprograms that take as a parameter an array of variable length. Another problem with Pascal was the lack of any separate compilation capability.

Reliability: Although Pascal lacked the ability to do many things, it was still considered more stable than FORTRAN and C. Because it was such a small language compared to its predecessors, it provided a safety net for young programmers, preventing errors that they could easily have made in other languages.

Source: Sebesta, Robert W., Concepts of Programming Languages (6th ed., 2004); http://cgibin.erols.com/ziring/cgi-bin/cep/cep.pl?_key=Pascal

SNOBOL — 1971

Major contribution: SNOBOL provided a language that could be used as if it were an object-oriented language, a logic programming language, a functional language, or a standard imperative language, simply by changing the set of features used to write a program.

Motivation: SNOBOL was the fourth and final incarnation of a series of special-purpose programming languages intended to make character string manipulation easier.

Source: Sebesta, Robert W., Concepts of Programming Languages (6th ed., 2004); http://www.wikipedia.org/wiki/SNOBOL

C — 1978

Major contribution: Ken Thompson "wanted to create a comfortable computing environment constructed according to his own design using whatever means necessary.
By early 1973 the language and compiler were strong enough to rewrite the Unix kernel for the PDP-11."

Motivation: C was developed because in the late 1960s three organizations (Bell Telephone Laboratories, MIT, and GE) were jointly developing Multics, whose development could not be completed in time and was becoming very expensive.

Readability: The language shows considerable power to describe important concepts. It is easy to read, and different kinds of libraries are available within the language to help the user.

Writability: The relationship between arrays and pointers, and the way in which declaration syntax mimics expression syntax, are among C's most distinctive features; they are also among its most frequently criticized, and often serve as stumbling blocks for the beginner.

Reliability: C is quirky, flawed, and an enormous success. While accidents of history surely helped, it evidently satisfied a need for a system implementation language efficient enough to displace assembly language, yet sufficiently abstract and fluent to describe algorithms and interactions in a wide variety of environments.

Source: Dennis Ritchie, http://cm.bell-labs.com/cm/cs/who/dmr/chist.html, 1993.

PROLOG — 1979

Major contribution: Prolog is a dynamic, high-level language with untyped variables, encouraging a natural level of abstraction and discouraging you from hard-wiring early assumptions into your code. Its major contribution has been to let Prolog programmers stay closer to the problem to be solved, remain more adaptable to changes in that problem, and produce programs that are easier to maintain and support.

Motivation: Alain Colmerauer and Philippe Roussel at the University of Aix-Marseille, with the help of Robert Kowalski at the University of Edinburgh, developed the fundamental design of Prolog. Colmerauer and Roussel were interested in natural language processing, and Kowalski was interested in automated theorem proving. The collaboration between them and the two universities on this new language continued until the mid-1970s.
In 1972, their efforts led to the use of formal logic as the basis for a programming language: Kowalski's research provided the theoretical framework, while Colmerauer's resulted in the programming language itself. After that time the research received little attention, and the two universities conducted independent studies of the Prolog language until 1981. At that time the Japanese government launched a large research project called Fifth Generation Computing Systems (FGCS). The primary objective of this project was to develop intelligent machines, and Prolog was chosen as the basis for the work. This caused other researchers and governments to take notice of the Prolog language, which started a decade-long worldwide interest in artificial intelligence and logic programming.

Readability: Readability is difficult because of many unresolved problems with the language. The main problem is that it is hard to follow the order of the program, and much of the programming happens without any set basis of rules, making it very difficult for another programmer to figure out what he or she is reading without detailed notes to accompany the program. However, Prolog does offer unification, which allows variables to be instantiated with values in order to produce matches. In sum, Prolog's impact on readability is that it has led to many user-defined programming languages.

Writability: Prolog has good writability, and it is considered the standard in logic programming and AI programming. It is a fairly simple language to program in, because it is based on facts, rules, and goals, with structures composed mainly of atomic propositions and logic operators. The impact Prolog has had on writability lies in the advantages and disadvantages that unification introduced to the world of programming, causing many future languages to copy its platform.

Reliability: The reliability of Prolog is simply not good.
It is inefficient in many ways, including infinite loops, and the programmer MUST sometimes state control-flow information in the program. It is very easy for a programmer to write statements that cause infinite loops, resulting in total program failures. There are also the problems of the closed-world assumption and negation. In sum, with respect to reliability, Prolog has largely been a model of what not to do when creating a reliable programming language.

Source: Sebesta, Robert W., Concepts of Programming Languages (6th ed., 2004); Roth, Allison, "You Can't Ignore Prolog," http://web.archive.org/web/20010708010543/www.practical-applications.co.uk/Prolog/alli.html

Smalltalk — 1980

Major contribution: The first widely used implementation of object-oriented programming. Smalltalk implemented the first graphical windowing human-interface system, both for program user interfaces and for integrated development tools.

Motivation: Smalltalk was originally developed by Dr. Alan Kay to extend the ideas of object-oriented programming, which were first developed in a limited form in Simula 67. Everything in Smalltalk is an object, and information is passed between objects by way of messages. The object-oriented programming framework of Smalltalk was developed concurrently with a plan to create a dynamic windowing human-interface system that would bring computing to a less technical class of user. This windowing system is also integral to Smalltalk's first-of-its-kind integrated software development environment. Smalltalk is therefore not only a programming language but also a software development tool, one that created the model for all modern graphical software development environments.

Readability: Reading Smalltalk code is more like reading a textual representation of a semantic chart than reading conventional computer code, owing to the complete implementation of object-oriented concepts in Smalltalk. Smalltalk is therefore very readable to a programmer with knowledge of object-oriented programming, but will make little sense otherwise.
The simple, expressive syntax of Smalltalk uses human nouns and verbs, leading to easy readability based on knowledge of English semantics.

Writability: In terms of writability, it must be remembered that Smalltalk is both a programming language and an integrated graphical software development environment. As the first language to include such a development environment, Smalltalk makes large gains in writability across the overall programming process. Such an environment is key to dealing with the many discrete segments of code that interact as objects in Smalltalk. Individual segments of Smalltalk code have good writability so long as the programmer subscribes to the concepts of object orientation. The use of English nouns and verbs also enhances writability by drawing on existing semantic knowledge. The powerful abstraction tool of object-oriented design itself helps to manage complexity, which makes Smalltalk programs easy to understand, write, and modify.

Reliability: As an interpreted system originally designed to have some of the simulation capabilities of its predecessor Simula, Smalltalk is an excellent tool for prototyping and small systems, but may not be robust enough to be reliable in very large projects.

Source: Sebesta, Robert W., Concepts of Programming Languages (6th ed., 2004); http://www.smalltalk.org/

Ada — 1983

Major contribution: Ada was designed specifically for use by the Department of Defense (DoD). It included many object-oriented programming features and was designed to promote reliability.

Motivation: The requirements phase of Ada began in 1975, with the official version of the language finalized in 1983. Ada included numerous features: it was for a time the only widely used language that included exception handling, and it was one of the first languages to offer full support for abstract data types. Ada was developed for use by the Department of Defense as a high-level language for embedded systems.
The Ada language was based on the programming language Pascal.

Readability: Ada's impact on readability was due in part to its data types, control statements, and structures. Additionally, the use of special words greatly increases the readability of the language.

Writability: The Ada language is so large and complex that its writability is decreased. However, many factors also benefit writability, including abstraction, expressivity, and generic procedures: a compiler can generate a version of a generic procedure for a specific data type, reducing the amount of code to be written.

Reliability: Exception handling increased the reliability of Ada; the benefits of exception handling in the languages PL/I and CLU were also included in its design. In 1995, Ada began to support object-oriented programming. The complexity of the language may reduce its reliability, but the readability and writability of the language have a positive impact on it.

Source: Sebesta, Robert W., Concepts of Programming Languages (6th ed., 2004); http://users.erols.com/ziring/dopl.html

C++ — 1984

Major contribution: Parameter type checking, derived classes, and public/private access control for class components.

Motivation: Evolved from C through modifications to improve its imperative features and to add object-oriented programming.

Readability: Good compilers exist, but C++ is a large and complex language.

Writability: C programs can be compiled as C++ programs, but the language is large and complex.

Reliability: C programs can be compiled as C++ programs, and good compilers exist.

Source: Sebesta, Robert W., Concepts of Programming Languages (5th ed., 2003).

Java — 1993

Major contribution: Java provides a secure, cross-platform way for code to be executed. Java also added several features to existing websites, such as arbitrary graphics, arbitrary data types, less load on the server, and more user interaction.

Motivation: Java was designed mainly to enable people to develop programs that use the Internet and the World Wide Web.

Readability: Java prevents casting between arbitrary variables.
Only casts between numeric types and between sub- and superclasses of the same object are allowed, and all conditional tests must return boolean values. Java supports both the "/* ... */" and "//" commenting conventions, but comments that begin with "/**" are special: they should be used only before a method or class declaration, and they indicate that the comment should be included in automatically generated documentation for that declaration.

Reliability: Java provided a new, higher level of reliability by eliminating most of the assignment coercions that are done in C++, and by using index-range checking for array accesses.

Source: Sebesta, Robert W., Concepts of Programming Languages (6th ed., 2004); http://www.shu.ac.uk/java/aboutthisbook/pref2.htm; http://www.ibiblio.org/javafaq/javafaq.html

JavaScript — 1995

Major contribution: Brought full interactivity and sophisticated user-interface and typography concepts to the formerly static Web.

Motivation: Tasked with making Navigator's newly added Java support more accessible to non-Java programmers, Brendan Eich eventually decided that a loosely typed scripting language suited the environment and audience, namely the few thousand web designers and developers who needed to be able to tie into page elements (such as forms, frames, or images) without a bytecode compiler or knowledge of object-oriented software design. The language he created was christened "LiveScript" to reflect its dynamic nature, but it was quickly (before the end of the Navigator 2.0 beta cycle) renamed JavaScript.

Reliability: JavaScript's capabilities can't be extended or overridden by developers.
A language that apparently required no skill to use, that lacked an IDE and a reliable cross-platform debugger, and that could only be tested in the context of the actual browser in which a page would be viewed, combined with a few highly publicized security flaws and several books aimed at non-programmers, caused many to write off JavaScript as a "simple" language for beginners and overshadowed its amazing potential.

Source: Sebesta, Robert W., Concepts of Programming Languages (6th ed., 2004); http://www.oreillynet.com/pub/a/javascript/2001/04/06/js_history.html

PHP — 1995

Major contribution: One of the biggest strengths of PHP 3.0 was its strong extensibility.

Motivation: PHP began as a simple set of Perl scripts that Rasmus Lerdorf wrote for tracking accesses to his online resume; he named this set of scripts "Personal Home Page Tools". As more functionality was required, Rasmus wrote a much larger C implementation, which was able to communicate with databases and enabled users to develop simple dynamic Web applications.

Readability: Because PHP is based on Perl, it inherits many of the problems associated with Perl. As in Perl, an instruction can be written in a variety of ways, which reduces the readability of the language.

Writability: Because an instruction can be written in a variety of ways, the writability of the language is increased.

Reliability: Since the language's strongest attribute is its extensibility, PHP has been pushed to be rather reliable.

Source: Sebesta, Robert W., Concepts of Programming Languages (6th ed., 2004); http://us2.php.net/history
28 APPLICATION DEVELOPMENT ADVISOR • www.appdevadvisor.com

DEBATES on programming languages are always sure to inspire more heat than light. A previous article looked at a number of languages quantitatively as opposed to qualitatively, concluding that once a language has made it into the mainstream, it tends to be quite difficult to kill off quickly or completely, and that the industry is generally multilingual.1 As an industry it would be fair to describe software development as being strongly fashion-led. As with fashion, there is the perception that whatever is at the shows and in the magazines is what everyone is doing. As with fashion this is not the case: people still wear their old clothes and get on with their lives. The call to “wake up and smell the JavaBeans”2 because Java is the answer, everyone is doing it, and C++ and all other languages are in free fall is about as far from reality as the view that Java is an Internet language for a minority of programmers. From the other corner of the debate, the fact that a language may be better (whatever that means) than another language has never stood in the way of it being unsuccessful in the marketplace. The elegant Eiffel is a classic example of this; the market is a harsh antidote for idealism. The truth is always a subtler blend of opposite views, and tends to have less regard for fashion, logic or conservatism.
Silent majority

It is easy to fall into a polarised view based on one’s work environment. There are many COBOL shops where the idea of GUIs and curly-bracket languages seems distant, faddish, and irrelevant; there are vibrant e-commerce companies steeped in XML, Java and Perl to the degree that there is almost denial that there has ever been another purpose for software; there are Microsoft-only sites where the idea that other viable operating systems exist, and are measurably more reliable, is at best hearsay; there are places in academia where there is still a strong belief that formal methods are viable contenders for the mainstream; and so on. The message is that software development is more multicultural, in this respect, than many individuals or consultancy groups would like to admit; it is difficult to take a position of extreme moderation and be heard.

For many programming languages the concept of the silent majority applies. There are many languages that are definitely not fashionable but are still quite widely used, away from the glare of publicity. They are used in the context of continued development of existing systems (euphemistically called maintenance by many, an unfortunate and inappropriate term that software borrowed from the physical engineering disciplines) and, perhaps surprisingly, for the development of new systems. With the year 2000 boundary condition out of the way, it is worth taking stock of some of the better known.

Fortran

Fortran started life to prove a point: a compiled language could produce machine code with efficiency close to that of hand-crafted assembler. John Backus led his team at IBM in this endeavour between 1954 and 1957. He freely admits that they made the syntax up as they went along, preferring to focus on a close match between the language and the features of the underlying IBM 704 on which it was originally developed. This certainly explains something of the language’s syntax and its frequent lack of consistency.
Although it has long since outgrown its original acronym (FORmula TRANslation because, at the time, its syntax had a vague resemblance to mathematical notation) its efficiency and ability to be first to market for the numerate disciplines ensured that it would win out over the more elegant ALGOL 58 and ALGOL 60.

Fortran has evolved radically through a number of standards and extensions. Modern Fortran is radically different to the FORTRAN of the early 1960s (and not simply because Fortran 90 sanctioned the lower case spelling of the name). Programming using exclusively modern Fortran features leads to differences comparable to the difference between Java and classic C.

Enterprise Development Solutions
Mind Your Language
Now that the year 2000 crisis has subsided, Kevlin Henney finds how legacy languages will fare in the future, and charts their past development.

It was once said in the scientific programming community that whatever language they were using in the year 2000 it would be called Fortran. It would appear that this prophecy has been fulfilled. Fortran is still taught on science and engineering courses around the world, and is used extensively within these communities. Although Fortran has enjoyed popularity in commercial programming domains, it has been largely displaced, leading to the mistaken perception by many developers in the industry that it has completely disappeared. The greatest threats to Fortran's position in its traditional market have come from C, C++ and tools such as Mathematica.

COBOL

When people talk of legacy systems, COBOL is without a doubt the language that most often springs to mind. Along with Fortran it is the language that one can most often expect to find in the history of programmers over the age of 30. It is not simply a matter of history, though. Y2K raised many people's awareness of how much COBOL was out there.
Although distinctly unfashionable, COBOL globally constitutes more code than any other single language. Estimates vary, but in terms of lines of code we are talking international telephone numbers.

What we now know as COBOL started life as an interim report (the final one was never completed) resulting from a user push for a business-oriented language under the auspices of the US Department of Defense and the newly formed Conference on Data Systems Languages (CODASYL) committee. Because of the perceived urgency, the interim report was published as a stop-gap measure in 1960 until the 'real' COBOL could be completed. The final report never happened.

Because of the vacuum for suitable business languages, COBOL rapidly took hold of this niche in the 1960s. Once there, it proved difficult to displace regardless of critique of its elegance, design, and other attributes ("sticks and stones..."). Only with significant changes in development platform and software architecture has COBOL's dominance been challenged: PCs, client server, GUIs, object-orientation and the Web have all threatened it. However, in these circumstances COBOL has, through proprietary extensions, been extended to cope. Although these have not led to anywhere near the same market presence as the COBOL mainframe association, they are still significant. Through such adaptive extensions and a splintering of mainframe dialects, there are many COBOLs out there, and not simply the ones defined through ANSI standards.

What is COBOL's future as the Y2K teams wind down? Although some companies have chosen to redevelop their systems to a new architecture and to a different business model, there is less incentive to throw it out of the many environments where errant code has been fixed. The language is still taught on many computing courses, and has refused to be displaced by more transient 4GLs, which have themselves caused upgrade problems.
While the demand for COBOL may drop a little, it is unlikely to do so drastically.

PL/1

PL/1 (Programming Language 1) was originally touted as the last programming language. As is often the case with such ambitions, it certainly wasn't, finding itself in the company of thousands of other languages during the 1970s inflationary period of language growth.

It was conceived in the mid-1960s as a complete general purpose programming language, equally suited to scientific, business, and systems programming. As such it used the ALGOL, FORTRAN, and COBOL of its times as starting points. A further criticism often levelled at it was that because it was many languages rather than one, it all too easily allowed developers to program in their original language's idioms. It has been said that one can program FORTRAN in any language, and PL/1 certainly enabled this to happen in a more complete sense than most.

PL/1 was developed by IBM. Originally named NPL (New Programming Language), it was further developed at the company's Hursley labs in the UK, where it was felt that the name might cause confusion with the National Physical Laboratory.

SIMULA killer

The only other language at the time with similar ambitions was Algol 68, which became lost in complexity and committees. Usable implementations of Algol 68 only began appearing in the mid-1970s, long after the original definition. It was competing with PL/1, whose implementations were improving after a shaky start in the 1960s. The backing of IBM and an ANSI/ECMA standard in 1976 saw to it that in spite of its significant complexity, PL/1 stayed afloat while Algol 68 persisted only in memory and language design influence. PL/1's complexity led to the formation of subsets, including PL/C and the official 1981 ANSI standard PL/1 subset.
PL/1 was always poised for the big time, to the point that it was used as the second language in the classic Elements of Programming Style (the other language in that being FORTRAN 66).3 Its touted position as the only programming language one would ever need was responsible, in Norway, for preventing major public funding of SIMULA, the first object-oriented language. However, with the possible exception of ALGOL, PL/1 did not displace those languages that it was intended to supersede. It has been used typically for large systems, and never fully broke away from living on mainframes and minis. IBM sites were the most likely to use it, and so its original status as a vendor-created language was also never fully shaken off. Its use has certainly declined in favour of either newer or other old and established languages, such as FORTRAN and COBOL.

C

C is the progenitor of the curly bracket family, which includes C++, Java, scripting languages such as Perl, and interface definition languages such as OMG IDL. It is descended from ALGOL 60, via the BCPL rather than the Pascal lineage. Based on his experience of the Multics project, Ken Thompson at AT&T experimented with a number of languages, including BCPL, for developing the kernel of what was to become Unix. Dennis Ritchie developed C in 1972 as a high-level systems programming language for this purpose, rather than for the general purpose role in which it has since found favour.

The fates of C and Unix were intertwined in the early 1980s, but the uptake of C as a general systems programming language spread it far beyond the Unix environment, displacing assembler and other high-level languages as the standard for systems programming on many systems. The 1990s saw C become accepted as a general purpose programming language, and it became a typical sight on developer CVs, either as the language of choice or as a common second language. C is a small language, with only a handful of keywords and concepts.
However, some of these concepts and syntax rules are tricky for the newcomer (and even the experienced developer, as witnessed by the International Obfuscated C Code Contest). This has often made it a difficult language to move to for developers experienced in languages based on a fundamentally different mindset, such as COBOL. The original de facto standard, the "White Book", came out in 1978.4 This is commonly known as K&R C or Classic C. ANSI

JANUARY–FEBRUARY 2000
Source: Web del Museo de Informática 2.0, https://museo.inf.upv.es/en/fortran/ (2021-07-12)
The FORTRAN Programming Language

Developed by IBM in the 1950s to reduce the effort of implementing solutions to scientific and engineering tasks (until that date developed in assembly language). The development of FORTRAN was parallel to that of compilers: it was the first language that had a compiler and that allowed programs to be ported to different machines. In addition, many changes in the evolution of compilers were motivated by the need for FORTRAN to generate efficient code.

The first of the "modern" programming languages, its syntax has aged badly (as early as 1968, an article signed by the creators of BASIC described it as "old-fashioned"). However, despite not being viewed by the community as a general-purpose language, FORTRAN is the programming language that has dominated scientific application development for nearly six decades. Even today (2021), it occupies the 20th position in the TIOBE ranking of programming language popularity (34th in 2020).

FORTRAN has been essential in areas of intensive calculation such as astronomy, computational fluid dynamics, finite element analysis, weather modelling and prediction, and satellite simulation. This meant that the different standardized versions maintained compatibility with the previous ones, while including capabilities from structured programming (in 1977), object orientation (in 2003) and concurrent programming (in 2008). Currently, the availability of large specialized libraries (usually implemented in C or C++) means that the scientific community is migrating to newer high-level languages such as Python or Matlab. All this led FORTRAN to incorporate conversion facilities to C.
Source: The Craft of Coding (posts about Algol, written by spqr), https://craftofcoding.wordpress.com/category/algol/ (2024-06-11)
Algol

Quicksort – Algorithm 63, and 64

Quicksort is a well-known sorting algorithm developed by C. A. R. Hoare in 1962 while working for the small British scientific computer manufacturer Elliott Brothers Ltd. [1, 2]. In his 1980 ACM Turing Award Lecture [3] he describes how a course on Algol 60, taken in early 1961, introduced him to the concept of recursive procedures. The presence of recursion in Algol 60 allowed Hoare to write Quicksort, a concept he had invented while trying to improve upon the Shellsort of Shell [4]. The entire algorithm was described in half a page, and comprised two algorithms (written in Algol 60): Algorithm 63, which described the partition function, and Algorithm 64, which performed the actual Quicksort function. One of the greatest benefits of Algol 60 was the fact that it, unlike Fortran, allowed recursion.

Algorithm 64

procedure quicksort (A,M,N);
    value M,N; array A; integer M,N;
    comment Quicksort is a very fast and convenient method of sorting an array
        in the random-access store of a computer. The entire contents of the
        store may be sorted, since no extra space is required. The average
        number of comparisons made is 2(M-N) ln (N-M), and the average number
        of exchanges is one sixth this amount. Suitable refinements of this
        method will be desirable for its implementation on any actual computer;
begin
    integer I,J;
    if M < N then
    begin
        partition(A,M,N,I,J);
        quicksort(A,M,J);
        quicksort(A,I,N);
    end
end quicksort

Algorithm 63

The original Algol algorithm as published in Communications of the ACM used a series of goto statements to facilitate the partitioning of the array into those items less than the pivot, and those items greater than the pivot.

procedure partition (A,M,N,I,J);
    value M,N; array A; integer M,N,I,J;
    comment I and J are output variables, and A is the array (with subscript
        bounds M:N) which is operated upon by this procedure. Partition takes
        the value X of a random element of the array A, and rearranges the
        values of the elements of the array in such a way that there exist
        integers I and J with the following properties:
            M ≦ J < I ≦ N provided M < N
            A[R] ≦ X for M ≦ R ≦ J
            A[R] = X for J < R < I
            A[R] ≧ X for I ≦ R ≦ N
        The procedure uses an integer procedure random(M,N) which chooses
        equiprobably a random integer F between M and N, and also a procedure
        exchange, which exchanges the values of its two parameters;
begin
    real X; integer F;
    F := random(M,N); X := A[F];
    I := M; J := N;
up: for I := I step 1 until N do
        if X < A[I] then go to down;
    I := N;
down: for J := J step -1 until M do
        if A[J] < X then go to change;
    J := M;
change:
    if I < J then begin
        exchange(A[I],A[J]);
        I := I + 1; J := J - 1;
        go to up
    end
    else if I < F then begin
        exchange(A[I],A[F]);
        I := I + 1;
    end
    else if F < J then begin
        exchange(A[F],A[J]);
        J := J - 1;
    end;
end partition

Further reading
- Hoare, C.A.R., "Algorithm 64: Quicksort", Communications of the ACM, 4(7), p.321 (1961)
- Hoare, C.A.R., "Algorithm 63: Partition", Communications of the ACM, 4(7), p.321 (1961)
- Hoare, C.A.R., "The Emperor's Old Clothes", Communications of the ACM, 24(2), pp.75-83 (1981)
- Shell, D.L., "A high-speed sorting procedure", Communications of the ACM, 2(7), pp.30-32 (1959)

Translating an Algol-60 program to Fortran

This post deals with the simple task of translating an Algol-60 program to Fortran, and some simple tricks which can be used. The program in question calculates e to many digits. There are many algorithms for calculating numbers like π and e. Normally e is calculated using the infinite series. These series often offer fast convergence, and are easy to calculate; however, the problem is that computers are generally unable to accommodate more than 10-20 significant digits. For generating an accurate value for e, one has to turn to an alternative algorithm. One such algorithm is provided by Sale [1] in Algol-60 (or thereabouts).
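For readers who think in curly brackets, Hoare's two Algol procedures above can also be sketched in C. This is a hedged translation rather than Hoare's published text: the gotos are replaced by the common symmetric-scan form of Hoare partitioning, and plain rand() stands in for the paper's random(M,N).

```c
#include <stdlib.h>

/* Swap the two integers pointed to, as in the paper's exchange procedure. */
static void exchange(int *a, int *b)
{
    int t = *a; *a = *b; *b = t;
}

/* Rearrange a[m..n] around the value of a randomly chosen element,
   reporting the crossing indices through i and j (output parameters,
   as in Algorithm 63). */
static void partition(int a[], int m, int n, int *i, int *j)
{
    int f = m + rand() % (n - m + 1);   /* random pivot position */
    int x = a[f];                       /* pivot value X */
    *i = m;
    *j = n;
    while (*i <= *j) {
        while (a[*i] < x) (*i)++;       /* scan up for an element >= X */
        while (a[*j] > x) (*j)--;       /* scan down for an element <= X */
        if (*i <= *j) {
            exchange(&a[*i], &a[*j]);
            (*i)++;
            (*j)--;
        }
    }
}

/* Algorithm 64: partition, then recurse on the two outer regions. */
void quicksort(int a[], int m, int n)
{
    if (m < n) {
        int i, j;
        partition(a, m, n, &i, &j);
        quicksort(a, m, j);
        quicksort(a, i, n);
    }
}
```

Recursing on (m, j) and (i, n) mirrors Algorithm 64 exactly; any element left strictly between j and i is already in its final position.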
As there are no Algol-60 compilers out there, it is then a task of converting the program to another language. Here we have chosen Fortran, for reasons which will become clear as we progress through the discussion. The first thing someone will say is that they don't know Algol-60. While I'm sure that's true of most people, if you have a basic clue about control structures in programs, and can read a program, deciphering what goes on here should not be that hard.

The first thing to do is some reverse engineering. You have to understand what the algorithm embedded in the Algol-60 code does. The first thing you might notice about this code is the fact that it allocates dynamic-type arrays within the code. This was not that unusual for Algol-type languages. Inputs to the procedure are n and d, where n is the number of decimal places to be calculated, and d is the array used to store them in. The use of the keyword loop does not in fact indicate a loop, but rather a label for that particular line. The if statement on the next line, if true, uses a goto to "loop" back to loop. In the Fortran this has been replaced by a real do while loop.

m := 4;
test := (n + 1) × 2.30258509;
loop: m := m + 1;
if m × (ln(m)-1.0) + 0.5 × ln(6.2831852 × m) ≤ test then go to loop;

m = 4
test = (n+1) * 2.30258509
do while (m * (log(m*1.0)-1.0) + 0.5 * log(6.2831852 * m) <= test)
   m = m + 1
end do

You will notice that variables are created when they are needed in the original program. As Fortran does not work in the same manner, these variables will be moved to the declaration block at the top of the subroutine. Here the value of m calculated in the previous code is used to allocate storage for the array coef. Fortran will allow allocate statements anywhere in a program. The loop to set the elements of coef to 1 is replaced with a simple statement in Fortran. The two do loops in Fortran fairly well mimic the for loops in the Algol-60 program.
begin
    integer i,j,carry,temp;
    integer array coef[2:m];
    for j := 2 step 1 until m do coef[j] := 1;
    d[0] := 2;
sweep: for i := 1 step 1 until n do
    begin
        carry := 0;
        for j := m step -1 until 2 do
        begin
            temp := coef[j] × 10 + carry;
            carry := temp ÷ j;
            coef[j] := temp - carry × j
        end;
        d[i] := carry
    end
end

allocate(coef(2:m))
coef(2:m) = 1
d(0) = 2
do i = 1, n
   carry = 0
   do j = m,2,-1
      temp = coef(j) * 10 + carry
      carry = temp / j
      coef(j) = temp - carry * j
   end do
   d(i) = carry
end do

Now we can wrap these code segments inside a subroutine:

procedure ecalculation(n,d);
    value n; integer n; integer array d;
begin
    integer m; real test;
    comment Add code segments 1 and 2;
end of ecalculation;

subroutine ecalculation(n,d)
   integer, intent(in) :: n
   integer, allocatable, dimension(:), intent(out) :: d
   integer :: m, i, j, carry, temp
   integer, allocatable, dimension(:) :: coef
   real :: test
   allocate(d(0:n))
   ! Add code segments 1 and 2
end subroutine ecalculation

And finally add this subroutine into a Fortran main program wrapper:

program ecalc
   implicit none
   integer :: i, num
   integer, allocatable, dimension(:) :: eArr
   print*, "Number of digits?"
   read (*,*) num
   call ecalculation(num,eArr)
   print*, "e (actual) = 2.7182818284590452353602874"
   write(*,fmt="(a15)", advance="no") "e (calc) = 2."
   do i = 1,num
      write(*,fmt="(i1)", advance="no") eArr(i)
   end do
   write(*,*)
contains
   ! Add subroutine ecalculation
end program ecalc

Now we can compile the program and run a test:

Number of digits?
25
e (actual) = 2.7182818284590452353602874
e (calc) = 2.7182818284590452353602874

Refs:
- Sale, A.H.J., "The calculation of e to many significant digits", The Computer Journal, 11(2), pp.229-230 (1968)

Wirth's thoughts on the "new Algol" committee

Niklaus Wirth was a part of the working group that was looking at a successor to Algol-60: IFIP (International Federation for Information Processing) Working Group 2.1. He would later resign from the committee.
In 1968 he presented the closing word at the ALGOL Colloquium in Zurich. At the time he considered the world of programming to be in "disorder and disorientation". He summed up the official IFIP meetings as following a pattern of "disarray and dispute", followed by "a general feeling of discouragement and despair". It doesn't sound like a working group that functioned very well. The idea of specifying a "universal language" led to what Wirth termed a "monster language", concluding that the working group had failed and should "dissolve". His final thought is a comment worth repeating:

"It has been sitting on an ivory tower, and when it was recognized that the tower was leaning, it was too inflexible to make a decision for further action. But this seems to be the fate of committees. Next week, we will carefully watch the tower of Pisa."

- Wirth, N., "Closing word at Zurich Colloquium", ALGOL Bulletin, 29, pp.16-19 (1968)

The Wirth Trinity – Design and examples

"The subject [i.e. computer languages] seemed to consist of 1 percent science and 99 percent sorcery, and this tilt had to be changed."
Niklaus Wirth, Turing Award Lecture (1984)

Wirth designed programming languages based on a number of design principles.

Occam's Razor – In the early 1960s Wirth was concerned that languages were becoming more complex than necessary. In his thesis "A Generalization of ALGOL", he wrote of eliminating many of the non-fundamental concepts of programming languages.

Lucidity First – Wirth's language designs are models of lucidity, i.e. clarity. The notation of the syntax of the languages serves as a natural extension of the algorithmic thinking underpinning the language.

Principle, not technique – Wirth was completely aware of the difference between elements of a language that are essential, and those that are ephemeral. Pascal was designed to emphasize the idea of structured programming, not obscure concepts that could be added to a language and serve little or no purpose.
Pedagogy and practice – Pascal was designed for teaching, yet Wirth himself admitted that he does not distinguish between pedagogy and practice: "…I do not believe in using tools and formalisms in teaching that are inadequate for any practical task."[1]

What better way to illustrate the difference between the languages than actually provide some code? In this case, the code for the classic Factorial.

Modula-2

MODULE Factorial;
FROM InOut IMPORT WriteString, WriteInt, WriteLn, ReadInt;
VAR i, n, fact : INTEGER;
BEGIN
   WriteString('Factorial n? ');
   ReadInt(n);
   fact := 1;
   FOR i := 2 TO n DO
      fact := fact * i;
   END;
   WriteInt(n);
   WriteString('! = ');
   WriteInt(fact);
   WriteLn;
END Factorial.

Oberon

MODULE Factorial;
IMPORT In, Out;
VAR i, n, fact : INTEGER;
BEGIN
   Out.String('Factorial n? ');
   In.Int(n);
   fact := 1;
   FOR i := 2 TO n DO
      fact := fact * i;
   END;
   Out.Int(n);
   Out.String('! = ');
   Out.Int(fact);
   Out.Ln;
END Factorial.

Pascal (FPC)

program Factorial;
var
   i, n, fact : integer;
begin
   writeln('Factorial n? ');
   read(n);
   fact := 1;
   for i := 2 to n do
      fact := fact * i;
   write(n, '! = ', fact);
end.

- Wirth, N., 1984 ACM A.M. Turing Award lecture.

Euler : The beginnings of a journey

"The first experiment led to a dissertation and the language Euler – a trip with a bush knife through the jungle of language features and facilities. The result was academic elegance, but not much of practical utility – almost an antithesis of the later data-typed and structured programming languages."
Niklaus Wirth [4]

All journeys have a beginning, and for Niklaus Wirth his journey into the design of programming languages likely began with Euler. Euler was developed as part of Wirth's PhD dissertation work, in collaboration with Helmut Weber at Stanford University. It was a generalization of Algol-60. It was a language that was supposed to be simpler, yet more flexible, than Algol-60, defined with a rigorous formal definition [1,2]. Euler was experimental in nature but likely the impetus for Wirth to continue in language design.
Influences? Algol-60

Why was it developed? As a simpler version of Algol-60.

Language features
- The numeric data type number, which could be used for either integer or real. Other data types were boolean, and symbol (a string of characters in quotes).
- One of the first languages to include reference (the address of a variable).
- List processing using the list (a sequence of elements of any type).
- For control structures, other than procedure calls there was only an if expression, and a goto. A for loop was not provided in Euler, but could be easily programmed as a procedure. Later versions included various other control structures.
- Used ⟵ as the assignment operator (originally; later changed to :=).

Example Programs [3]

real procedure HORNER (real array A(*); integer value N; real value X);
begin
    real S;
    S := 0;
    for I := N step -1 until 0 do S := S * X + A(I);
    S
end

procedure COPY (real array U, V(*,*); integer value A, B);
    for I := 1 until A do
        for J := 1 until B do
            U(i,j) := V(i,j)

Refs:
- Wirth, N., Weber, H., "EULER: A generalization of ALGOL, and its formal definition: Part I", CACM, 9(1), pp.13-25 (1966)
- Wirth, N., Weber, H., "EULER: A generalization of ALGOL, and its formal definition: Part II", CACM, 9(2), pp.89-99 (1966)
- Algol W Reference Manual (1972)
- Wirth, N., "1984 ACM A.M. Turing Award Recipient Lecture", CACM, 28(2), pp.159-164 (1985)

Algol-W : A precursor to Pascal

Before Pascal, there was a language designed by Niklaus Wirth which heavily influenced Pascal's design. Algol-W is a language which was based on the proposal for Algol-X, the successor to Algol-60, submitted by Niklaus Wirth [1] to IFIP Working Group 2.1 in August 1965. However, at the committee meeting at Grenoble in October 1965, they decided it was not a great advancement over Algol-60, and dropped it as an official working document. The committee instead chose a more complicated design, which was to become Algol-68.
Wirth then collaborated with Tony Hoare to revise the draft [2], and Algol-W was born. The proposal they published in 1966 [2] had a number of goals. It provided an overview of where the ALGOL effort was headed, information which could be used for experimental implementations, and potential further extensions. Algol-W represented a somewhat conservative evolution of Algol-60, where some concepts were simplified, and some extensions were added. The most important extension was likely the introduction of the concept of generalized data structures. To supplement the array construct, the record was added. This made Algol-W more powerful than Algol-60 in its ability to handle data structures.

The language was first implemented on an IBM 360 using PL/360, Wirth's other early language. In September 1966, Wirth published a follow-up relating to problems encountered whilst implementing Algol-W [4]. Some of the changes would now be considered syntactic sugar. For example, array subscripting was changed from a[i][j] to a[i,j], strings were made static, and the concatenation operator was eliminated. In addition, the for statement was given greater versatility, allowing for:

for x := A step B until C do
for x := A until C do
for x := A1, A2, A3, ..., An do

Algol-W was used mostly in the context of an instructional language at universities like Stanford for a number of years, and formed the basis for Pascal.

Influences? Algol-60

Why was it developed? As a successor to Algol-60.

What did it actually do for programming?
- Introduction of the case statement, first proposed by Hoare.
- Introduction of a record as a data structure.
- Value-result method of passing parameters.

Features simplified from Algol-60
- The switch declaration and designator were abolished, replaced by the case statement.
- Labels are simplified, so that they only serve as a link between goto and its destination.
- A goto cannot jump from outside into a conditional statement, case statement, or iterative statement.
- The own concept was abandoned.
- Integer labels are abolished.
- The for statement confined to simple statements (changed after implementation).
- Notational simplification to arrays.

Extensions added
- Addition of a record data structure.
- Addition of complex, bits (bit sequence) and string types.
- Operations for bit sequences.
- For types real and complex, a long variant is provided.
- A "result parameter" was added to procedures.

Example Programs

Below are three example programs from [3]. The first program is a classic iterative Factorial, where gotos are used to construct the "for" loops.

BEGIN
   COMMENT CALCULATE N FACTORIAL;
   INTEGER N,M,FACT;
START:
   READ(N);
   FACT := 1;
   M := N;
   IF N >= 0 THEN
   BEGIN
LOOP:
      IF N > 0 THEN
      BEGIN
         FACT := FACT*N;
         N := N-1;
         GO TO LOOP
      END
      ELSE
      BEGIN
         WRITE(M,"FACTORIAL =", FACT);
         GO TO START
      END
   END
END.

Calculate the squares and cubes of integers:

BEGIN
   INTEGER SQ, CU;
   FOR X := 3 STEP 2 UNTIL 13 DO
   BEGIN
      SQ := X*X;
      CU := SQ*X;
      WRITE("X =",X, "X**2 =", SQ, "X**3 =", CU)
   END
END.

Read an array of N real numbers and sort them using Bubblesort:

BEGIN
   INTEGER N;
   READ(N);
   BEGIN
      REAL ARRAY X(1::N);
      REAL TEMP;
      LOGICAL INTERCHANGES;
      WRITE("ORIGINAL ARRAY");
      WRITE(" ");
      FOR I := 1 UNTIL N DO
      BEGIN
         READON(X(I));
         WRITE(X(I))
      END;
      COMMENT BEGIN SORT;
      FOR J := N-1 STEP -1 UNTIL 1 DO
      BEGIN
         INTERCHANGES := FALSE;
         FOR I := 1 UNTIL J DO
            IF X(I) > X(I+1) THEN
            BEGIN
               TEMP := X(I);
               X(I) := X(I+1);
               X(I+1) := TEMP;
               INTERCHANGES := TRUE;
            END;
         IF ¬ INTERCHANGES THEN GO TO XIT
      END;
XIT:
      WRITE(" ");
      WRITE("SORTED ARRAY");
      WRITE(" ");
      FOR I := 1 UNTIL N DO WRITE(X(I))
   END
END.

In the last program, notice the use of the symbol ¬, which is one of the mathematical symbols for negation, the other two being ~ and !. Why did this not catch on (only a few languages like PL/I used it), rather than using double symbols like != or ~=, or even Fortran's horrible /= (one of the few Fortran syntax things that really bugs me)?
Refs:
- Wirth, N., "Proposal for a Report on a Successor of ALGOL 60", MR75, Stichting Mathematical Centre, Amsterdam (1965)
- Wirth, N., Hoare, C.A.R., "A contribution to the development of ALGOL", CACM, 9, pp.413-431 (1966)
- Bauer, H.R., Introduction to ALGOL W Programming, Computer Science Dept., Stanford University (1969)
- Wirth, N., "Additional Notes on – Contribution to the development of ALGOL", ALGOL Bulletin, 24, pp.13-17 (1966)

The Wirth Trinity – Oberon

On another leave at Xerox PARC (1984/85), Wirth had the goal of developing a modern multitasking OS. Wirth, together with J. Gutknecht, embarked on this project in 1987, intending to implement it in Modula-2. They needed a simple language, so Oberon was born, the last in the line of "Algol-like" languages. It appeared with a few additions (e.g. OO) and a number of subtractions (discarded Modula-2 features). Oberon-2 appeared in 1991, and became the version of the language most commonly used.

Influences?
- Oberon: Modula-2
- Oberon-2: Oberon, Pascal, Modula-2

Why was it developed? Oberon was born of Modula-2, adding support for OO. The principal design guideline was to concentrate on features that are basic and essential, and to omit ephemeral issues [6]. Wirth said that the design of Oberon was driven by "the recognition of the cancerous growth of complexity in languages that had recently emerged, such as C, C++ and Ada, which appeared even less suitable for teaching than for engineering in industry." [6] He lumped Modula-2 into this category as well, referring to it as "overly bulky, containing features that we had rarely used".

What did it actually do for programming? Traditional programming supported, as well as OO-like constructs.

Design considerations: the idea behind Oberon was a language simpler than its predecessors.

Language features "dumped" from Modula-2
It eliminated many superfluous Modula-2 features, e.g.
the FOR loop, enumeration types, the WITH statement, and non-integer array indices [1,4]:
- Variant records (mostly misused, and issues related to automated garbage collection).
- Enumeration types (defy extensibility over module boundaries).
- Subrange types (compiler complexity; used almost exclusively for indexing arrays).
- SET types (replaced with a single predefined type SET, with elements 0-31).
- Type CARDINAL (with 32-bit integers, the need for unsigned arithmetic was reduced).
- Qualified import (every occurrence of an imported identifier must be preceded by the defining module's name).
- Nested modules.
- WITH statement (explicit qualification of field identifiers is preferred).
- FOR loop (omitted from Oberon, but reintroduced in Oberon-2).
- Low-level facilities: ADDRESS, WORD.

Language features added
Only a few new features were added, many of which had a hint of OO [4]:
- Addition of SHORTINT, LONGINT, and LONGREAL.
- Type extension (inheritance): construction of new types based on existing types.
- Type inclusion, which binds all arithmetic types together (in Modula-2, numeric types are incompatible): LONGREAL ⊇ REAL ⊇ LONGINT ⊇ INTEGER ⊇ SHORTINT.
- POINTER types restricted to record or array types.

Language Genealogy

Refs:
- Wirth, N., "The development of procedural programming languages – personal contributions and perspectives", in Modular Programming Languages, JMLC 2000, LNCS, V.1897 (2000)
- Wirth, N., "The Programming Language Oberon", Software – Practice and Experience, 18(7), pp.671-690 (1988)
- Mössenböck, H., Wirth, N., "The Programming Language Oberon-2" (1993)
- Wirth, N., "From Modula to Oberon" (1990)
- Wirth, N., "Programming in Oberon" (1992)
- Wirth, N., "Modula-2 and Oberon", in ACM Conf. on History of Programming Languages, pp.3-1-3-10 (2007)
- Wirth, N., "Modula: A language for modular multiprogramming", Software – Practice and Experience, 7, pp.37-65 (1977)

The Wirth Trinity – Modula-2

Pascal had become quite popular in the 1970s.
This was in part attributed to the fact that it was a somewhat easy language to learn. It was not as fast a language as, say, Fortran, but it did incorporate the notion of structured programming, so that by the late 1970s programmers had learned to code without using the dreaded goto statement. Wirth realized that Pascal had its shortcomings [1], and considered Pascal’s I/O to be “inadequate and inflexible”. A new language was to be designed, but instead of being called Pascal-2, it was called Modula-2.

To experiment with multiprogramming primitives, Wirth contrived a rudimentary language called Modula [7]. Modula was never intended to be a language like Pascal. In 1976 Wirth spent a year at the Xerox Palo Alto Research Center (PARC). There he learned about hardware design, and on returning to Switzerland began work on what was to become Lilith, a personal workstation [3]. Wirth had some fixed constraints on the system: single user, single processor, and all software written in a single language [1]. As to the language, Pascal was not capable, nor was Modula, at least not by themselves. The solution? Modula-2, a conglomeration of the bloodlines of Pascal, Modula, and Mesa (a language being developed at Xerox PARC). The Lilith was marketed as “The Modula Computer”, and sold for US$22,750.

Modula-2 was based on the concept of a module, which allowed for high-level abstraction alongside low-level facilities. The language was defined in 1978 and implemented on a PDP-11, and the first Modula-2 compiler was released in 1980. One of the reasons Modula-2 didn’t succeed was the lack of a good free compiler (there is one now, but only for Windows).

Influences?
Modula-2 was influenced by Pascal, Modula, and Mesa (a Pascal offspring).

Why was it developed?
To alleviate the shortcomings of Pascal, and to design a language more apt for the time. Whereas Pascal was designed more for teaching, Modula-2 was designed as more of a systems language.

What did it actually do for programming?
Modules. Derived from the notion of abstract data types, and incorporating information hiding, the module built on the concepts of Mesa. This allowed things like I/O to be removed from the language proper and encapsulated in modules forming a “standard library”. The module structure isolates its contents from the surrounding program, and modules can be separated into a definition part and an implementation part. All communication with other modules occurs through imported and exported identifiers. The module can also be regarded as a representation of the concept of an “abstract data type” postulated by Liskov in 1974 [10].

Language features (Modula) “dumped” from Pascal
Variant records.
Built-in I/O – moved to libraries.
No goto statement.
Packing of data – Pascal allowed data in record and array structures to be packed.

Improvements over Pascal (i.e. what was added/modified)
Source code is case-sensitive; reserved words are in UPPERCASE.
Open arrays.
The PROCEDURE type.
Flexible declarations: types, variables, and procedures can be mixed together, as opposed to Pascal’s strict const, type, var ordering.
CASE has an ELSE for matching unspecified values, and also permits subranges.
Boolean expressions are evaluated conditionally (short-circuit evaluation).
I/O is relegated to library modules to avoid system dependencies.
Readability was enhanced through the use of control-structure terminators: REPEAT–UNTIL, and IF/WHILE/FOR–END. This eliminates the begin–end block construct.
The FOR statement is augmented by the clause BY; Pascal’s downto clause is gone.
The type CARDINAL, to allow for unsigned (non-negative) integers.
Two standard procedures for incrementing and decrementing: INC and DEC.
Long identifiers.
Better control transfers: the statements RETURN and EXIT transfer control out of procedures and looping structures, and HALT terminates a program.
No syntactic ambiguities in decisions, e.g. no dangling ELSE; IF statements always require an END.
A new looping statement, LOOP, providing an infinite loop.
Standard I/O functions, dynamic storage allocation, and files provided via library modules.

Language deficiencies
No standard procedures for I/O and storage allocation (some found this quite onerous).
For a list of ambiguities and insecurities, the interested reader is referred to [8], [9], and [12].

Language Genealogy

Refs:
Wirth, N., “The development of procedural programming languages – personal contributions and perspectives”, in Modular Programming Languages, JMLC 2000, LNCS, V.1897 (2000)
Wirth, N., “History and Goals of Modula-2”, BYTE, pp.145-152 (Aug. 1984)
Ohran, R., “Lilith and Modula-2”, BYTE, pp.181-192 (Aug. 1984)
Paul, R.J., “An Introduction to Modula-2”, BYTE, pp.195-210 (Aug. 1984)
Coar, D., “Pascal, Ada, and Modula-2”, BYTE, pp.215-232 (Aug. 1984)
Gutknecht, J., “Tutorial on Modula-2”, BYTE, pp.157-176 (Aug. 1984)
Wirth, N., “Modula: A language for modular multiprogramming”, Software - Practice and Experience, 7, pp.37-65 (1977)
Spector, D., “Ambiguities and insecurities in Modula-2”, ACM SIGPLAN Notices, 17(8), pp.43-51 (1982)
Torbett, M.A., “More ambiguities and insecurities in Modula-2”, ACM SIGPLAN Notices, 22(5), pp.11-17 (1987)
Liskov, B., Zilles, S., “Programming with abstract data types”, ACM SIGPLAN Notices, 4, pp.50-59 (1974)
Collins, S., “Comparing Modula-2 with Pascal and Ada”, Data Processing, 26(10), pp.32-34 (1984)
Cornelius, B.J., “Problems with the language Modula-2”, Software - Practice and Experience, 18(6), pp.529-543 (1988)

The Wirth Trinity – Pascal

Algol was likely only minimally successful from the point of view of extensive use in industry, but it was used in academic environments. There were many attempts to extend its applicability, which led to Algol 68 and Algol W. However, the complexity of Algol 68 ultimately led to that variant’s demise.
PL/I was an attempt to create a “Fortran VI” by combining features from Algol, Fortran, and COBOL, but the product was once again an extremely large language. Both Algol 68 and PL/I exemplify the “Swiss army knife” approach to language design – providing every conceivable feature. What was required was a smaller, more compact language – enter Pascal.

Wirth began the design of Pascal (named after the French mathematician Blaise Pascal) in 1968, with a compiler written entirely in Pascal on a CDC 6000 series mainframe. The first compiler appeared in 1970. The language was tweaked in 1972, and became an ISO standard in 1982. In describing Pascal, Wirth remarked that “the guiding idea in the layout of the syntax of Pascal was simplicity, due to the recognition that structures difficult to process by compilers are also difficult to master by human readers and writers”.

The highlights of Pascal [1] were:
Simple control structures for decisions and repetitions.
Scalar data types: boolean, integer, real, char, and enumerations.
The ability to construct complex data structures using records, arrays, and sets.
Strict static typing – every constant, variable, function, and parameter has a type.
Dynamic data structures built with the use of pointers.
Recursive procedures.

Influences?
Pascal was heavily influenced by Algol, and is often called Algol-like.

Why was Pascal developed?
Pascal was born out of what Wirth terms “an act of liberation”: liberation from the prospect of using Algol or Fortran as languages to teach programming, and liberation from the design constraints imposed by committee work.

What did it actually do for programming?
Pascal was one of the first languages built from the ground up around the notion of structured programming. With no commercial backing, Pascal succeeded on its own merits, and was implemented on systems ranging from Cray supercomputers to personal computers.
Programmers who felt “straitjacketed” writing programs in BASIC flocked to Pascal. It was an ideal language for teaching programming.
Pascal was drafted as the basis for the DoD’s Ada project.
It introduced records into scientific languages (although Algol-W really did this first).
It introduced a usable case statement.

Design considerations
The general idea dominating the design of Pascal was to provide a language appealing to systematic thinking, mirroring conventional mathematical notation, satisfying the needs of practical programming, and encouraging a structured approach. It should be simple, be able to handle non-numeric data, be suitable for teaching programming, and have the compile-time and run-time efficiency of Fortran.

Language features
Records, and variant records.
Algol-60 had blocks (local declarations + statements) and compound statements (statements only), whereas Pascal eliminated the block.
More, yet simpler, control structures than Algol-60.
Use of a real assignment operator, :=
Strong type safety.
Case insensitivity.
Native set operators.

Language deficiencies
Keeping the goto statement.
Syntactic ambiguities inherited from Algol – the lack of explicit closing symbols for nestable constructs, e.g. the dangling else.
The inability to support separate compilation of modules, which hindered the development of large programs.
The flawed case statement, which lacked an else clause.
The fixed size of arrays, which precluded the use of general math and string libraries.
Fortran and COBOL programmers felt handcuffed by Pascal’s compulsory declaration of variables.
No exponentiation operator.

Criticisms of the language
Lack of block structure.
No dynamic arrays.
Lack of the “second” form of conditional (an inline if).
Labels and the goto statement.
“Unnatural” unification of subranges, types, and structures.
The difference between procedure and function is marginal.
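The dangling-else ambiguity listed among Pascal's deficiencies is easy to demonstrate in C, which inherited the same problem; a minimal sketch (the function name is mine):

```c
#include <assert.h>

/* With no braces, the else binds to the NEAREST if,
   whatever the indentation suggests. */
static int classify(int a, int b) {
    int r = 0;
    if (a)
        if (b)
            r = 1;
    else        /* looks like it belongs to "if (a)"...   */
        r = 2;  /* ...but actually belongs to "if (b)"    */
    return r;
}
```

Modula-2 removed the ambiguity by requiring every IF to be closed with an explicit END; Pascal kept it.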
Refs:
Wirth, N., “The development of procedural programming languages – personal contributions and perspectives”, in Modular Programming Languages, JMLC 2000, LNCS, V.1897 (2000)

Algorithm 35: The Sieve (in Algol-68)

The Sieve of Eratosthenes likely first appeared in computing in 1961, as Algorithm 35 in Communications of the ACM (p.151). It, like many other algorithms of the time, was shown in Algol-60. To implement it as closely as possible to the original syntax, given that there are few if any Algol-60 compilers out there, I chose to reproduce the algorithm in its entirety in Algol-68, using the Algol-68 Genie (a68g) implementation. It wasn’t a lot of fun.

PROC sieve = (INT nmax, REF INT j) []INT: (
   [nmax]INT primes;
   INT i;
   primes[1] := 1;
   primes[2] := 2;
   primes[3] := 3;
   j := 3;
   FOR n FROM 3 BY 2 TO nmax DO
      i := 3;
      l1: IF primes[i] <= sqrt(n) THEN GOTO a1 ELSE GOTO a2 FI;
      a1: IF n/primes[i] = n%primes[i] THEN GOTO b1 ELSE GOTO b2 FI;
      b2: i := i + 1;
          GOTO l1;
      a2: primes[j] := n;
          j := j + 1;
      b1: print("")
   OD;
   j := j - 1;
   primes
);

main: (
   INT n, numpr;
   print("Upper limit to calculate sieve? ");
   read(n);
   [n]INT primes;
   primes := sieve(n, numpr);
   FOR i FROM 1 TO numpr DO
      print(whole(primes[i],5));
      IF i MOD 10 = 0 THEN print(new line) FI
   OD
)

It was an interesting experience, partially because the documentation for a68g is okay but is filled with the same obtuse material as the original Algol-68 reports. There is a lot to this language, and finding out how to do simple things, like how to pass a variable back from a subprogram, is trying. There are things I inherently like, such as the ability to specify the size of the array to be created within the function (a PROC looks like a procedure, but behaves more like a function). And I like that assignment is := and equality is =. It’s also nice to use a language where it is easy to return an array from a function.

Here is the program executed with the value of n set to 1000.

Upper limit to calculate sieve?
1000
+1 +2 +3 +5 +7 +11 +13 +17 +19 +23
+29 +31 +37 +41 +43 +47 +53 +59 +61 +67
+71 +73 +79 +83 +89 +97 +101 +103 +107 +109
+113 +127 +131 +137 +139 +149 +151 +157 +163 +167
+173 +179 +181 +191 +193 +197 +199 +211 +223 +227
+229 +233 +239 +241 +251 +257 +263 +269 +271 +277
+281 +283 +293 +307 +311 +313 +317 +331 +337 +347
+349 +353 +359 +367 +373 +379 +383 +389 +397 +401
+409 +419 +421 +431 +433 +439 +443 +449 +457 +461
+463 +467 +479 +487 +491 +499 +503 +509 +521 +523
+541 +547 +557 +563 +569 +571 +577 +587 +593 +599
+601 +607 +613 +617 +619 +631 +641 +643 +647 +653
+659 +661 +673 +677 +683 +691 +701 +709 +719 +727
+733 +739 +743 +751 +757 +761 +769 +773 +787 +797
+809 +811 +821 +823 +827 +829 +839 +853 +857 +859
+863 +877 +881 +883 +887 +907 +911 +919 +929 +937
+941 +947 +953 +967 +971 +977 +983 +991 +997
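For comparison, here is a sketch of the same trial-division logic in plain C. This is my own rendering (the function name is mine), not a transcription of the original Algorithm 35; it keeps the Algol-68 version's quirk of listing 1 first:

```c
#include <stdbool.h>

/* Fill primes[] with 1, 2, 3 and every odd prime up to nmax,
   mirroring the Algol-68 program above; returns the count. */
int sieve(int nmax, int *primes) {
    int j = 0;
    primes[j++] = 1;   /* the Algol-68 program seeds 1, 2, 3 */
    primes[j++] = 2;
    primes[j++] = 3;
    for (int n = 5; n <= nmax; n += 2) {
        bool is_prime = true;
        /* trial-divide by the odd primes found so far, up to sqrt(n) */
        for (int i = 2; primes[i] * primes[i] <= n; i++) {
            if (n % primes[i] == 0) { is_prime = false; break; }
        }
        if (is_prime)
            primes[j++] = n;
    }
    return j;
}
```

Without labels and GOTOs the control flow collapses into two nested loops, which is essentially what the a1/a2/b1/b2 labels above are hand-encoding.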
4768
dbpedia
0
29
https://chii90.wordpress.com/2013/03/09/chapter-2-evolution-of-the-major-programming-languages-review-questions/
en
Chapter 2 Evolution of the Major Programming Languages (Review Questions)
2013-03-09T00:00:00
en
チイちゃんのブログ~
1. In what year was Plankalkül designed? In what year was that design published?
= Plankalkül was designed by Konrad Zuse in 1945, but the design wasn't published until 1972.

2. Mention an interesting feature of Zuse's programs.
= An interesting feature of Zuse's programs was the inclusion of mathematical expressions showing the current relationships between program variables. These expressions stated what would be true during execution at the points in the code where they appeared.

3. What does Plankalkül mean?
= Zuse's language name Plankalkül means "program calculus"; the language was used to solve a wide variety of problems by writing algorithms in it.

4. Speedcoding was invented to overcome two significant shortcomings of the computer hardware of the early 1950s. What were they?
= The lack of floating-point operations and the lack of indexing in the hardware. Speedcoding supplied floating-point pseudoinstructions and a novel facility of automatically incrementing address registers.

5. What is the number of bits in a single word of the UNIVAC I's memory? How are the bits grouped?
= The words of the UNIVAC I's memory had 72 bits, grouped as 12 six-bit bytes.

6. What hardware capability that first appeared in the IBM 704 computer strongly affected the evolution of programming languages? Explain why.
= Indexing and floating-point instructions in hardware. Before the 704, processors spent so much time in software floating-point processing that the overhead of interpretation and the simulation of indexing were relatively insignificant; even so, many programmers of that time never used interpretive systems, preferring the efficiency of hand-coded machine (or assembly) language. With floating point in hardware, that overhead became significant and compiled languages became attractive.

7. Who developed the Speedcoding system for the IBM 701?
= John Backus. The Speedcoding interpreter effectively converted the 701 into a virtual three-address floating-point calculator.

8. Who developed Short Code?
Why is Short Code called automatic programming?
= Short Code was developed by John Mauchly in 1949 for the BINAC computer, which was one of the first successful stored-program electronic computers. It was called automatic programming because the code was interpreted directly rather than being hand-translated into machine code.

9. Under what environmental consideration was FORTRAN developed? Which is the first version of FORTRAN?
= The environment in which FORTRAN was developed was as follows: (1) computers had small memories and were slow and relatively unreliable; (2) the primary use of computers was for scientific computations; (3) there were no existing efficient and effective ways to program computers; (4) because of the high cost of computers compared to the cost of programmers, speed of the generated object code was the primary goal of the first FORTRAN compilers. The first version was FORTRAN 0, the initial 1954 design; FORTRAN I was the first implemented version.

10. What was the most significant feature added to FORTRAN I to get FORTRAN II?
= The independent compilation of subroutines.

11. What control flow statements were added to FORTRAN IV to get FORTRAN 77?
= The logical loop control statements (and an IF with an optional ELSE clause).

12. Which version of FORTRAN was the first to have any sort of dynamic variables?
= FORTRAN 90, which introduced dynamic arrays, along with records, pointers, a multiple-selection statement, and modules.

13. Which version of FORTRAN was the first to have character string handling?
= FORTRAN 77.

14. Why were linguists interested in artificial intelligence in the late 1950s?
= Because linguists were concerned with natural language processing.

15. What are the different data types and structures in Common LISP?
= Records, arrays, complex numbers, and character strings. It also has a form of packages for modularizing collections of functions and data, providing access control.

16.
In what way are Scheme and Common LISP opposites of each other?
= Scheme is small and simple, with exclusively static scoping, whereas Common LISP is large and complex; they are opposites in size and complexity.

17. What dialect of LISP is used for introductory programming courses at some universities?
= Scheme.

18. What two professional organizations together designed ALGOL 60?
= GAMM (a German acronym for the Society for Applied Mathematics and Mechanics) and the ACM (Association for Computing Machinery).

19. What was the goal for developing C?
= To provide a systems-programming language powerful enough to replace assembly language for implementing the UNIX operating system and its utilities.

20. What were the significant modifications to ALGOL 58 to produce ALGOL 60?
= Block structure was introduced (a method of segmenting the scopes of variables and control statements); parameters could be passed by value or by name; recursive procedures were allowed; and stack-dynamic arrays were allowed.

21. What language was designed to describe the syntax of ALGOL 60?
= BNF (Backus-Naur Form), introduced by Backus, was designed to describe the syntax of ALGOL 60.

22. On what language was COBOL based?
= COBOL was based on the FLOW-MATIC language, which was being implemented in 1957.

23. In what year did the COBOL design process begin?
= In 1959; the language was first named CBL (Common Business Language).

24. What data structure that appeared in COBOL originated with Plankalkül?
= Hierarchical data structures (records).

25. What organization was most responsible for the early success of COBOL (in terms of extent of use)?
= The DoD (Department of Defense); the language description was published by the Government Printing Office in April 1960.

26. Which data types did the original BASIC language support?
= The original Dartmouth BASIC supported only numeric and string data. There was no integer type; all numeric variables were floating-point, and strings were of dynamic length.
Arrays of both numbers and strings were supported, as well as matrices (two-dimensional arrays).

27. Why was BASIC an important language in the early 1980s?
= Because its smaller dialects could run on the first microcomputers, with their very small memories. (The later Visual Basic became important for providing a simple way of building graphical user interfaces.)

28. PL/I was designed to replace what two languages?
= COBOL and FORTRAN IV.

29. For what new line of computers was PL/I designed?
= The IBM System/360 line of computers.

30. What features of SIMULA 67 are now important parts of some object-oriented languages?
= Data abstraction, through the class construct. SIMULA 67 is an extension of ALGOL 60, taking both block structure and the control statements from that language. The class construct was developed to provide support for coroutines in SIMULA 67. This was an important development because the concept of data abstraction began with it, and data abstraction provides the foundation for object-oriented programming.

31. What innovation of data structuring was introduced in ALGOL 68 but is often credited to Pascal?
= User-defined data types, which are valuable because they allow the user to design data abstractions that fit particular problems very closely.

32. What design criterion was used extensively in ALGOL 68?
= Orthogonality. ALGOL 68 included a significant number of features that had not been previously used; its use of orthogonality, which some may argue was overdone, was nevertheless revolutionary.

33. What language introduced the case statement?
= ALGOL-W, the Wirth and Hoare version of ALGOL 60. It was implemented at Stanford University and used primarily as an instructional vehicle, and only at a few universities. The primary contributions of ALGOL-W were the value-result method of passing parameters and the case statement for multiple selection.

34. What operators in C were modeled on similar operators in ALGOL 68?
= The assigning operators (such as +=); C was also influenced by ALGOL 68 in its for and switch statements and in its treatment of pointers.

35. What are two characteristics of C that make it less safe than Pascal?
= Its unrestricted pointers and its incomplete type checking (for example, array index ranges are not checked).

36. Why is Prolog called a nonprocedural language?
= Because its programs do not state exactly how a result is to be computed; they describe the form of the result, and the system infers how to compute it.

37. What are the two kinds of statements that populate a Prolog database?
= Facts and rules.

38. What is the primary application area for which Ada was designed?
= Embedded systems. The Ada language was developed for the Department of Defense (DoD), so the state of its computing environment was instrumental in determining the language's form. By 1974, over half of the applications of computers in the DoD were embedded systems; an embedded system is one in which the computer hardware is embedded in the device it controls or for which it provides services.

39. What are the concurrent program units of Ada called?
= Tasks, which communicate through the rendezvous mechanism.

40. What C++ construct provides support for abstract data types?
= The class construct, which provides the means for encapsulating data objects, specifications for data types, and procedures.

42. What three concepts are the basis for object-oriented programming?
= Data abstraction (classes and objects), inheritance, and dynamic binding of method calls.

43. Why does C++ include the features of C that are known to be unsafe?
= A second important goal of C with Classes was that there should be little or no performance penalty relative to C. A third goal was that it could be used for every application for which C could be used, so virtually none of C's features were removed, not even those considered to be unsafe.

44. From what language does Objective-C borrow its syntax for method calls?
= Smalltalk.

45.
What programming paradigm that nearly all recently designed languages support is not supported by Go?
= Go does not support traditional object-oriented programming, as it has no form of inheritance.

46. What is the primary application for Objective-C?
= Programming for Apple's macOS and iOS (e.g. the iPhone).

47. What language designer worked on both C and Go?
= Ken Thompson.

48. What do the Ada and COBOL languages have in common?
= Both were sponsored and popularized by the Department of Defense.

49. What was the first application for Java?
= Programming embedded consumer electronic devices.

50. What characteristic of Java is most evident in JavaScript?
= Its syntax.

51. How does the typing system of PHP and JavaScript differ from that of Java?
= PHP and JavaScript use dynamic typing (with dynamically sized strings and arrays), whereas Java is statically typed.

52. What array structure is included in C# but not in C, C++, or Java?
= Rectangular arrays.

53. What two languages was the original version of Perl meant to replace?
= ksh and awk; Perl was originally a combination of the capabilities of the two.

54. For what application area is JavaScript most widely used?
= Client-side scripting of Web documents.

55. What is the relationship between JavaScript and PHP, in terms of their use?
= They complement each other. PHP (or some other server-side language) is required for client-server interaction, while JavaScript is optional: you could build a site using PHP and HTML alone, but the effects that make a site attractive are provided by JavaScript along with CSS.

56. PHP's primary data structure is a combination of what two data structures from other languages?
= PHP's arrays are a combination of JavaScript's arrays and Perl's hashes.

57. What data types does Java support?
= Java's data types are divided into two categories: primitive types (the integer and floating-point types, among others) and reference types (arrays, classes, and interfaces).

58. What characteristic does C++ share with .NET?
= It provides access to the functionality of the .NET Framework.

59. What characteristic of Ruby's arithmetic operators makes them unique among those of other languages?
= In Ruby, the arithmetic operators (like most operators) are actually method calls, and can therefore be redefined.

60. How does Java provide storage deallocation?
= Java does not allow explicit deallocation of storage; there is no explicit deallocation operator (such as delete). Storage is instead reclaimed by garbage collection, so you cannot have a dangling reference in Java.

61. Is Lua normally compiled, purely interpreted, or impurely interpreted?
= Purely interpreted.

63. What deficiency of the switch statement of C is addressed with the changes made by C# to that statement?
= In C, control can implicitly fall through from one selectable segment of code into the next, which has caused innumerable programming errors; C# requires each selectable segment to end with an explicit unconditional branch.

64. What is the primary platform on which C# is used?
= .NET. The purpose of C# is to provide a language for component-based software development, specifically in the .NET Framework. In this environment, components written in a variety of languages can be easily combined to form systems.

65. What are the inputs to an XSLT processor?
= An XML data document and an XSLT document.

66. What is the output of an XSLT processor?
= An XML document.

67. What element of the JSTL is related to a subprogram?
= The servlet / JSP.

68. To what is a JSP document converted by a JSP processor?
= A servlet.

69. Where are .jsp files executed?
4768
dbpedia
1
65
https://people.kernel.org/linusw/rust-in-perspective
en
Rust in Perspective
2022-07-14T12:10:20+00:00
en
linusw
We are discussing and working toward adding the language Rust as a second implementation language in the Linux kernel. A year ago Jake Edge made an excellent summary of the discussions so far on Rust for the Linux kernel, and we (or rather Miguel and Wedson) have made further progress since then. For the record I think this is overall a good idea and worth a try. I wanted to add some background that was sketched in a mail thread for the kernel summit.

TL;DR: my claim is that Rust is attempting to raise the abstraction in the programming language and ultimately to join computer science and software engineering into one single discipline, an ambition that has been around since these disciplines were created.

Beginning with ALGOL

The first general high-level language was FORTRAN, which is still in use for some numerical analysis tasks around the world. Then came ALGOL, which attracted a wider audience. The first “real” operating system (using virtual memory etc), the 1962 supervisor for the Atlas machine, was as far as I can tell implemented in Atlas Autocode, a dialect of ALGOL, which was the lingua franca at the time. Pure ALGOL could not be used because ALGOL 60 had no input/output primitives, so every real-world application of ALGOL, i.e. any application not solely relying on compiled-in constants, required custom I/O additions.

Copies of the first specifications of ALGOL 60, belonging at one time to Carl-Erik Fröberg at Lund University.

ALGOL inspired CPL, which inspired BCPL, which inspired the B programming language, which inspired the C programming language, which we use for the Linux kernel.

Between 1958 and 1968 ALGOL was the nexus of a wide attempt to join computer languages with formal logic. In this timespan the ALGOL 58, ALGOL 60 and ALGOL 68 revisions came out. The outcome was that it established computer science as a discipline, and people could start building their academic careers on that topic.
One notable outcome was the BNF form for describing syntax in languages. This time was in many ways formative for computer science: the first three volumes of Donald Knuth's The Art of Computer Programming were published in close proximity to these events.

To realize that ALGOL was popular and widespread at the time that Unix was born, and that C was in no way universally accepted, it suffices to read a piece of the original Bourne Shell source code tree, for example:

setlist(arg,xp)
	REG ARGPTR	arg;
	INT		xp;
{
	WHILE arg
	DO	REG STRING	s=mactrim(arg->argval);
		setname(s, xp);
		arg=arg->argnxt;
		IF flags&execpr
		THEN	prs(s);
			IF arg THEN blank(); ELSE newline(); FI
		FI
	OD
}

This doesn't look much like C as we know it; it looks much more like ALGOL 68. The ALGOL 68 definition added constructions such as IF/FI and DO/OD, which were not present in ALGOL 60. The reason is that Stephen Bourne was an influential contributor to ALGOL 68 and created a set of macros so that the C preprocessor would turn his custom dialect of ALGOL into C; I think someone on Reddit suggested nominating the Bourne shell for the obfuscated C contest on that basis. This is just one of the instances where we can see that the C programming language was not universally loved. The Bourne Shell scripting language that we all love and use is also quite close to ALGOL 68, so the descendants of this language are used more than we may think.

Niklaus Wirth had proposed ALGOL W as a successor to ALGOL 60, but the committee passed it over in favor of the far more ambitious design that became ALGOL 68. Tired of the slowness of the language committee process, Wirth went his own way and created the programming language Pascal, which was a success in its own right. In his very interesting IEEE article named A Brief History of Software Engineering, Professor Wirth gives his perspective on some of the events around that time: first he writes about the very influential 1968 NATO conference on software engineering in Garmisch, Germany, which served to define software engineering as a distinct discipline.
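Bourne's trick works with nothing more than the C preprocessor. The sketch below is a reconstruction in the spirit of his mac.h, not the original macro set; the keyword definitions and the sample function are mine:

```c
/* A reconstruction in the spirit of Bourne's mac.h: plain
   #defines turn ALGOL 68 keywords into C tokens. */
#define IF      if (
#define THEN    ) {
#define ELSE    } else {
#define FI      ;}
#define WHILE   while (
#define DO      ) {
#define OD      ;}
#define INT     int
#define BEGIN   {
#define END     }

/* Sum 1..n, written in the ALGOL-68-flavoured dialect;
   the preprocessor expands it to ordinary C. */
INT algolish(INT n)
BEGIN
    INT sum = 0;
    WHILE n > 0
    DO  sum = sum + n;
        n = n - 1
    OD
    IF sum > 10
    THEN return sum;
    ELSE return 0;
    FI
END
```

After preprocessing this is perfectly ordinary C, which is exactly why the shell source compiled with a stock C compiler while reading like ALGOL 68.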
To counter the so-called software crisis – the problems presented by emerging large complex systems – the suggestion was to raise the abstraction in new languages. To raise the abstraction means to use more mathematical, machine-independent constructs in the language.

First consider the difference between low-level and high-level languages: a simple operation such as x = x + 1 is not high level, just a fancy assembly instruction; if we compile it we can readily observe the resulting code as some kind of ADD instruction in the object code. However a[i] = x + 1 raises the abstraction past the point of high-level languages. This is because indexing into an array requires knowledge of the target machine specifics: base addresses, memory layout, etc. This makes the instruction more high-level and thus raises the abstraction of the language. The assumption is that several further, higher levels of abstraction exist. We will look into some of these languages in the following sections.

The Garmisch conference is famous in Unix circles because Douglas McIlroy was present and presented his idea of componentized software as a remedy against rising complexity, an idea that was later realized in the form of Unix's pipes and filters mechanism. D-Bus and similar component interoperation mechanisms are contemporary examples of such software componentry – another way to counter complexity and make software less fragile, but not the focus of this article.

Wirth makes one very specific and very important observation about the Garmisch conference:

Ultimately, analytic verification and correctness proofs were supposed to replace testing.

This means exactly what it says: with formally verified programming languages, all the features and constructs that are formally proven need not be tested for. Software engineering is known for advocating test-driven development (TDD) to this day, and the ambition was to make large chunks of TDD completely unnecessary.
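The indexing example above can be made concrete in C, where indexing is defined as raw address arithmetic: a[i] means *(a + i). The base-plus-offset computation a high-level language would hide is spelled out in the source, which is precisely why C sits below the abstraction level being described. A small sketch (the function name is mine):

```c
#include <assert.h>

/* In C, a[i] = x + 1 is by definition *(a + i) = x + 1:
   base-plus-offset address arithmetic is part of the language. */
int index_demo(void) {
    int a[4] = {0, 0, 0, 0};
    int x = 41;
    int i = 2;

    a[i] = x + 1;        /* "high-level" indexing form       */
    *(a + 3) = x + 1;    /* same operation, raw address form */

    assert(a[2] == a[3]);
    return a[i];
}
```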
Software testing has its own chapter in the mentioned report from the Garmisch NATO conference, where the authors A.I. Llewelyn and R.F. Wickens conclude: There are, fundamentally, two different methods of determining whether a product meets its specification. One can analyse the product in great detail and from this determine if it is in accordance with its specification, or one can measure its performance experimentally and see if the results are in accord with the specification; the number and sophistication of the experiments can be varied to provide the degree of confidence required of the results. The first part of this paragraph, "analyse the product in great detail", is what Wirth calls analytic verification and is today called formal verification. The latter part is what we call test-driven development, TDD. Also: the former is a matter of computer science, while the latter is a matter of software engineering. So here is a fork in the road. Wirth also claims the discussions in Garmisch had a distinct influence on Pascal. This can be easily spotted in Pascal strings, one of his principal improvements over ALGOL: Pascal strings are arrays of char, but unlike C's char, a Pascal char is not the same as a byte; instead it is defined as belonging to an "ordered character set", which can very well be ISO8859-1 or Unicode, smaller, larger or equal to 255 characters in size. A string stored in memory begins with a positive integer length field which defines how long the string is, but this is none of the programmer's business: it is maintained by the language runtime, never by custom code. Indexing out of bounds is therefore impossible and can be trivially prohibited during compilation and at runtime. This raises the abstraction of strings: they are set entities, they have clear boundaries, and they need special support code to handle the length field in memory.
Further, Pascal also has set types, such as:

    var JanuaryDays : set of 1..31;

Perhaps Pascal's application to real-world problems didn't work out as expected, as it has since also defined PChar as a NULL-terminated pointer to a sequence of characters, akin to C strings. However, it should be noted that Pascal pointers are persistently typed and cannot be converted: casting is not possible in Pascal. A Pascal pointer to an integer is always a pointer to an integer. From Wirth's perspective, C "presented a great leap backward" and he claims "it revealed that the community at large had hardly grasped the true meaning of the term 'high-level language' which became an ill-understood buzzword". He attributes the problem to Unix, which he says "acted like a Trojan horse for C". He further details the actual technical problems with C: C offers abstractions which it does not in fact support: Arrays remain without index checking, data types without consistency check, pointers are merely addresses where addition and subtraction are applicable. One might have classified C as being somewhere between misleading and even dangerous. His point about C lacking index checking is especially important: it calls into question whether C is really a high-level language at all, since it does not fully abstract away the machine specifics of handling an array. Language theorists occasionally refer to C as a "big macro assembler": the only thing really abstracted away is the raw instruction set. Wirth however also notes the appealing aspects of the C programming language: people at large, particularly in academia, found it intriguing and "better than assembly code" (...) its rules could easily be broken, exactly what many programmers cherished. It was possible to manage access to all of a computer's idiosyncracies, to items that a high-level language would properly hide. C provided freedom, where high-level languages were considered as straight-jackets enforcing unwanted discipline.
It was an invitation to use tricks which had been necessary to achieve efficiency in the early days of computers. We can see why an efficiency-oriented operating system kernel such as Linux will tend toward C. It's not like these tricks stopped after the early days of computing. Just the other day I wrote a patch for Linux with two similar code paths, which could be eliminated by casting a (const void *) into a (void *), which I then quipped about in the commit message of the revised patch. The reason for violating formal rules in this case is a choice between two evils, choosing the lesser one: in a choice between formal correctness and code reuse I chose code reuse. And C enables that kind of choice. The languages presented later in this article absolutely do not allow that kind of choice, and C casts are seen there as nothing less than an abomination. The language family including C and also Pascal is referred to as imperative programming languages. The defining characteristic is that the programmer "thinks like a computer", or, to be exact, imagines themselves as the program counter: "First I do this, next I do this, then I do this", a sequence of statements executed in order, keeping the computer state (such as registers, memory locations and stacks) in the back of your head. The immediate appeal to operating system programmers should be evident: this closely models what an OS developer needs to keep in mind, such as registers, stacks, cache frames, MMU tables, state transitions in hardware and so on. It is possible to see the whole family of imperative languages as domain-specific languages for the domain of writing operating systems: they would be for operating system developers what OpenGL is for computer graphics software developers.
Lambda Calculus for Defining Languages

In 1966 one of the early adopters of and contributors to ALGOL (alongside Peter Naur, Tony Hoare and Niklaus Wirth), Peter Landin, published two articles in the Journal of the ACM titled A Correspondence between ALGOL 60 and Church's Lambda-notation, part I and part II. In the first article he begins with a good portion of dry humour: Anyone familiar with both Church's λ-calculi and ALGOL 60 will have noticed a superficial resemblance between the way variables tie up with the λ's in a nest of λ-expressions, and the way that identifiers tie up with the headings in a nest of procedures and blocks. He is of course aware that no one besides himself had been in a position to notice this: the overlap between the people familiar with Alonzo Church's λ-calculus and the people familiar with ALGOL 60 was down to roughly one person on the planet. What is surprising is that it was even one person. Alonzo Church was a scholar of mathematical logic and computability, the supervisor of Alan Turing's doctoral thesis, and active in the same field as Kurt Gödel (the two men quoted each other in their respective articles). The lambda calculus ties into the theory of types created by Bertrand Russell and the logical-mathematical programme, another universe of history we will not discuss here. What λ-calculus (lambda calculus) does for a programming language's semantics is analogous to what regular expressions do for its syntax. While regular expressions can express how to parse a body of text in a language with a regular grammar, expressions in λ-calculus can go on from the abstract syntax tree and express what an addition is, what a subtraction is, or what a bitwise OR is. This exercise is seldom done in e.g. compiler construction courses, but defining semantics is an inherent part of a programming language definition.
Perhaps the most remembered part of Landin's papers is his humorous term syntactic sugar, which denotes things added to a language to make the life of the programmer easier, but which have no semantic content that cannot be expressed by the basic features of the language. The basic mathematical features of the language, on the other hand, are best expressed with λ-calculus. A notable invention in Landin's first article about defining ALGOL in terms of λ-calculus are the keywords let and where, chosen to correspond to λ-calculus' applicative expressions. These keywords do not exist in ALGOL: they are part of a language used to talk about a language, or in more complicated terms: a meta-language. So here we see the first steps toward a new language derived from λ-calculus. Landin does not give this language a name in this article, but just refers to it as "AE", for applicative expressions. The AE executes in a theoretical machine called SECD, which is another trick of the trade, like Alan Turing's "Turing machine": rather close to a mathematician's statement "let's assume we have...". The complete framework for defining ALGOL in λ-calculus is called AE/SECD.

Functional Programming

Functional programming languages, then, implement lambda calculus. The central idea, after some years of experience with defining languages such as ALGOL in terms of lambda calculus, is to just make the language resemble lambda calculus expressions to begin with, so that the verification of the semantics will be simple and obvious. In 1966 Peter Landin followed up his articles using λ-calculus to describe ALGOL with the article The Next 700 Programming Languages. Here he invents functional programming in the form of a language called ISWIM (If you See What I Mean), again, as you can see, with a good dry humour. The language is λ-calculus with "syntactic sugar" on top, so a broad family of languages can be created using the framework as a basis. Landin's article was popular, and people did invent languages.
Maybe not 700 of them. Yet. In section 10 of his article, named Eliminating explicit sequencing, Landin starts speculating and talks about a game that can be played with ALGOL: by removing any goto statements and labels, the program gets a less sequential nature, i.e. the program counter is no longer just advancing to the next line or iterating a loop. He quips: What other such features are there? This question is considered because, not surprisingly, it turns out that an emphasis on describing things in terms of other things leads to the same kind of requirements as an emphasis against explicit sequencing. He then goes on to show how to transform an ALGOL program into a purely functional ISWIM program and concludes: The special claim of ISWIM is that it grafts procedural notions onto a purely functional base without disturbing many of the desirable properties. (...) This paper can do no more than begin the task of explaining their practical significance. This reads as a call to action: we need to create functional programming languages akin to ISWIM, and we need to get rid of the J operator (the program control flow operator). Landin never did that himself.

The Meta Language ML

A few years later, in 1974, the computer scientist Robin Milner, inspired by ISWIM and as a response to Landin's challenge, created the language ML, short for Meta Language. This is one of the 700 next languages, and it clearly realized Landin's idea of a language for defining languages, a grammar for defining grammar: a meta language with a meta grammar. He implemented the language on the DEC10 computer with the help of Malcolm Newey, Lockwood Morris, Mike Gordon and Chris Wadsworth. The language was later ported to the VAX architecture. The language was based on ISWIM and dropped the so-called J operator (program point operator). It is domain-specific, intended for authoring a tool for theorem proving called LCF. Standard ML has been fully semantically specified and formally verified.
This language became widely popular, both in academia and industry. Removing the J operator made ML a declarative language, i.e. it does not specify the order of execution of statements, putting it in the same class of languages as Prolog or, for that matter, Makefiles: there is no control flow in a Makefile, just a number of conditions that need to be evaluated to arrive at a complete target. ML still has one imperative language feature: assignment. Around this time, some scholars thought both the J operator and assignment were unnecessary and went on to define purely functional languages such as Haskell. We will not consider them here; they are outside the scope of this article. ML and everything else we discuss can be labelled as impure: a pejorative term invented by people who like purely functional languages. These people dislike not only the sequencing nature of imperative languages but also assignment (such as happens with the keyword let) and prefer to think about evaluating relationships between abstract entities. ML can be grasped intuitively. For example this expression in ML evaluates to the integer 64:

    let
      val m : int = 4
      val n : int = m*m
    in
      m*n
    end

Here we see some still-prominent AE/SECD and ISWIM features, such as the keyword let for binding variables, or rather, associating names with elements such as integers and functions (similar to := assignment in some languages). Then we see the implementation section after the keyword in. We can define functions in ML, like this one computing the square root of five times x:

    val rootfivex : real -> real =
      fn x : real => Math.sqrt (5.0 * x)

Notice the absence of constructs such as BEGIN/END or semicolons: the expression structure itself, from let to end, delimits the blocks. The notation real -> real clearly states that the function takes a real number as input and produces a real number as output. The name real reflects some kind of mathematical ambition.
The language cannot handle the mathematical set of real numbers: the ML real is what other languages call a float. ML has more syntactic sugar, so the following is equivalent, using the keyword fun (fun-notation):

    fun rootfivex (x:real):real = Math.sqrt (5.0 * x)

The syntax should be possible to grasp intuitively. Another feature of ML and other functional languages is that they easily operate on tuples, i.e. ordered sequences of values, and tuples can also be returned from functions. For example you can calculate the distance between the origin and a coordinate in an x/y plane like this:

    fun dist (x:real, y:real):real = Math.sqrt (x*x + y*y)

This function can then be called elsewhere, for example like this (the original snippet here was garbled; this is a minimal valid reconstruction):

    val coor : real * real = (3.0, 4.0)
    val d = dist coor

The type real of d will be inferred from the fact that the dist() function returns a real. ML gets much more complex than this. One universally admired upside of the language is that ML programs, like most programs written in functional languages, can be proven correct in the computational sense. This can be done within certain ramifications: for example input/output operations need to specify exactly which values are input, or undefined behaviour will occur.

CAML and OCaml

In 1987 Ascánder Suárez at the French Institute for Research in Computer Science and Automation (INRIA) reimplemented a compiler and runtime system for ML in LISP and called the result CAML, for Categorical Abstract Machine Language: a pun on the fact that it ran on a virtual machine (the Categorical Abstract Machine) and on the heritage from ML proper. The abstract machine used was the LLM3 abstract LISP machine, which in turn ran on another computer. It was not fast. CAML was reimplemented in C in 1990-91 by Xavier Leroy, creating Caml Light, which was faster, because it was not a virtual machine running on a virtual machine. Caml Light was more like Java and used a bytecode interpreter for its virtual machine.
In 1995, Caml Special Light introduced a native compiler, so the bytecode produced by the Caml compiler could be compiled to object code and executed with no virtual machine overhead, using a native runtime environment. Didier Rémy, Jérôme Vouillon and Jacques Garrigue continued the development of Caml. Objective Caml arrived in 1996 and added some object-oriented features to Caml. In 2011 this extended Caml Special Light compiler and language, a derivative (dialect) of ML, was renamed OCaml. In essence the compiler and the language have a symbiotic relationship: there is no second implementation of OCaml. From the 1990s and forward, what is now the OCaml language and implementation has gained traction. It is a very popular functional programming language, or rather, popular as far as functional programming goes. It has optimized implementations for most architectures. The compiler itself is now written mostly in OCaml, but the runtime in C is still around, to hook into each operating system where the program will eventually run. The language and compiler have been used for a variety of applications. Every major Linux distribution carries packages with the OCaml compiler and libraries. There is even a GTK+ 3 OCaml library binding, so OCaml GUI programs can be created. OCaml simplifies binding labels to numbers etc; here is bubblesort implemented in OCaml:

    (* Bubblesort in OCaml, Linus Walleij 2022 *)
    let sort v =
      let newv = Array.make (Array.length v) 0 in
      for i = 1 to (Array.length v) - 1 do
        if v.(i - 1) > v.(i) then begin
          newv.(i - 1) <- v.(i);
          newv.(i) <- v.(i - 1);
          (* Copy back so we are working on the same thing *)
          v.(i - 1) <- newv.(i - 1);
          v.(i) <- newv.(i);
        end else begin
          newv.(i - 1) <- v.(i - 1);
          newv.(i) <- v.(i);
        end
      done;
      newv

    let rec ordered v =
      if Array.length v = 0 then true
      else if Array.length v = 1 then true
      (* ... or if the rest of the array is ordered *)
      else if v.(0) < v.(1) && ordered (Array.sub v 1 (Array.length v - 1)) then true
      else false;;

    let plist v =
      print_string "V = ";
      for i = 0 to (Array.length v) - 1 do
        begin
          print_int v.(i);
          if i < (Array.length v - 1) then print_string ",";
        end
      done;
      print_endline "";;

    let rec sortme v =
      if ordered v then v else sortme (sort v);;

    let v = [| 14 ; 4 ; 55 ; 100 ; 11 ; 29 ; 76 ; 19 ; 6 ; 82 ; 99 ; 0 ; 57 ; 36 ; 61 ; 30 |];;
    plist v;;
    plist (sortme v);;

My experience with this example is that OCaml puts up a "bit of resistance" to changing the contents of things like arrays by indexing. It "dislikes" imperative constructs and kind of nudges you in the direction of purely logical constructs such as the ordered function above. This is just my personal take. OCaml is still a dialect of ML, and the file ending used on all files is .ml as well. OCaml, like Python's pip or Perl's CPAN, has its own package system and library, called opam. The prime application is still the OCaml Ergo Library, a library for automatic theorem proving. If your first and foremost use of computers is theorem proving, ML and OCaml have continued to deliver since 1974. The more recent and widely popular Coq theorem prover is also written in OCaml.

Rust then

Rust was initially developed in 2006 as a hobby project by Graydon Hoare, who was at the time working at Mozilla. OCaml and ML are mentioned as the biggest influences on the language, apart from C/C++. A typical sign of this influence is that the first compiler for Rust was written in OCaml. A notable contributor to this codebase, apart from Hoare, is Brendan Eich, one of the founders of the Mozilla project and the inventor of JavaScript. While Brendan did not contribute much code, he was at the time CTO of Mozilla, and this shows that when Mozilla started supporting the project in 2009, Rust was certainly well anchored in the organization; Eich's early contributions to the language should be noted.
(It may be commonplace that people in the CTO position at middle-sized companies make commits to complex code bases, but I am not aware of other examples.) Despite the OCaml codebase, the first documentation of the language talks more about other functional or declarative languages such as NIL, Hermes, Erlang, Sather, Newsqueak, Limbo and Napier. These origins, with extensive quotes from e.g. Joe Armstrong (the inventor of Erlang), have been toned down in contemporary Rust documentation. It is however very clear that Graydon has a deep interest in historical computer languages and is convinced that they have something to teach us, and the expressed ambition is to draw on these languages and pick the best parts. In his own words: I've always been a language pluralist — picture my relationship towards languages like a kid enjoying a wide variety of building blocks, musical instruments or plastic dinosaurs — and I don't think evangelism or single-language puritanism is especially helpful. What is unique about Rust is that it fuses "impure" functional programming with imperative programming, bringing several concepts from ML and OCaml over into the language. Another characteristic is that Rust compiled to target machine code from day one, rather than using any kind of virtual machine as did Peter Landin's ISWIM, or the ML and OCaml languages (and as do, say, Java or Python). Graydon probably did this intuitively, but a post he made in 2019 underscores the point: that virtual machines, even as an intermediate step, are bad language engineering and just generally a bad idea. In 2013 Graydon stepped down as main lead for Rust for personal reasons, which he has detailed in a posting on Reddit. Rust has had the same symbiotic relationship between the language and a single compiler implementation as OCaml, but this is changing, as there is now a second, GCC-based implementation in the works.
Here is bubblesort implemented in Rust:

    /* Bubblesort in Rust, Linus Walleij 2022 */
    fn sort(array : &mut [i32]) {
        let mut x : i32;

        if array.len() == 1 {
            return;
        }
        for i in 1..array.len() {
            if array[i - 1] > array[i] {
                x = array[i - 1];
                array[i - 1] = array[i];
                array[i] = x;
            }
        }
    }

    fn is_ordered(array : &[i32]) -> bool {
        if array.len() <= 1 {
            return true;
        }
        for i in 1..array.len() {
            if array[i - 1] > array[i] {
                return false;
            }
        }
        return true;
    }

    fn parray(array : &[i32]) {
        let mut x : i32;

        print!("V = ");
        for i in 0..array.len() {
            x = array[i];
            print!("{x}");
            if i != (array.len() - 1) {
                print!(",");
            }
        }
        println!("");
    }

    fn main() {
        let mut array: [i32; 16] = [14, 4, 55, 100, 11, 29, 76, 19,
                                    6, 82, 99, 0, 57, 36, 61, 30];
        parray(&array);
        while !is_ordered(&array) {
            sort(&mut array);
        }
        parray(&array);
    }

Rust lends itself to easier imperative programming than OCaml: the keyword mut becomes quite similar to C's const-correctness tagging in this example. Since is_ordered and parray aren't altering the contents of the array, their parameters do not need to be marked mut. You see some familiar virtues from Pascal: arrays "know" their length, and we use a method to obtain it: array.len(). The stated ambition is improved memory safety, data-race safety (concurrency) and type safety. The article Safe Systems Programming in Rust certainly presents the ambition in a straightforward manner. Graydon also underscores the focus on memory and concurrency safety in a 2016 blog post. But make no mistake. The current underlying ambition is definitely nothing different from the ambition of the ALGOL committee between 1958 and 1968: to raise the abstraction of the language through the ambition to join computer programming with formal logic. This comes from the arrival of strong academic support for the language.
A typical indication of this ambition is the well-funded RustBelt project, involving a large number of academic researchers, all familiar with formal logic, and resulting in such artefacts as Ralf Jung's PhD thesis Understanding and Evolving the Rust Programming Language. Here, formal logic in RustBelt and the Coq proof assistant is used, and the thesis concludes (from the abstract): Together, these proofs establish that, as long as the only unsafe code in a well-typed λRust program is confined to libraries that satisfy their verification conditions, the program is safe to execute. What is meant by "safe to execute" is that no use-after-free, dangling pointers, stale references, NULL pointer exceptions etc. can ever occur in safe Rust code, because this is proven by formal logic: QED. It does not stop you from e.g. dividing by zero, however; that problem is out of scope for the exercise. To me personally, the most astonishing fact about Jung's thesis is that it manages to repeatedly cite and reference the computer scientist Tony Hoare without quoting the inventor of the Rust language, Graydon Hoare, a single time. In a way it confirms Graydon's own statement that Rust "contains nothing new" from a language point of view. The C programming language cannot be subjected to the same scrutiny as Rust, simply because of all the (ab)use it allows, which was mentioned by Wirth in his historical perspective: if a type can be changed by a cast and array indexing is not even part of the language, there is not much to prove. What has been interesting for scholars to investigate is a well-defined subset of C, such as the eBPF subset, which also partly explains the strong interest in eBPF: as with Rust, the build environment and language runtime have been defined with much stricter constraints and can thus be subject to formal verification.
The ambition of Rust, as I perceive it, whether the people driving it know it or not, is to finish what the ALGOL committee as primus motor started in 1958, and what the Garmisch NATO conference concluded was necessary in 1968: to develop a language for systems programming that relies on formal logic proof, and to fulfil what ALGOL never could, what Pascal never could, and what the whole maybe-not-700 functional programming languages never could: a language that joins the disciplines of computer science and software engineering into ONE discipline, where the scholars of each can solve problems together. That is the ambition of Rust as an implementation language for operating systems such as Linux: provide a language backed by current top-of-the-line computer science research, for immediate application to the software engineering of developing a top-of-the-line operating system. What it offers Linux is raised abstraction to counter the problems of complexity identified at the 1968 Garmisch NATO conference, problems by now bleeding obvious given the recurring security incidents, and it would thereby bring the engineering project Linux closer to computer science. Other approaches to increased Linux safety (memory safety, concurrency safety) are possible: notably increased testing, which is the engineering go-to panacea, and automated testing of Linux has indeed increased a lot in recent years. Raising the abstraction of the implementation language and proving it formally comes with the ambition to make testing less important. [Mathieu Poirier and Jesper Jansson have helped out in reviewing this blog post, for which I am forever grateful: remaining errors, bugs and biased opinions are my own.]
Why ALGOL was an important programming language?
By Adam Kukołowicz, bulldogjob.com, published 2024-04-23
https://bulldogjob.com/readme/why-algol-was-an-important-programming-language
ALGOL is a more interesting language than you think, both in terms of its story and legacy.
Today, I'm taking you on a journey to the past, back to the late 1950s, when the foundations of computer science were being laid and computers were seen as the future. When we talk about this period today, names like FORTRAN, Lisp, and COBOL often come up. Yet there was also ALGOL, a pretty obscure, but really important language in the history of programming. The story of this language deserves to be told. What did the world of programming look like then? It was a world vastly different from today, mainly because most coding was done in assembly language (or not even that in Europe). If any of you have programmed in assembly, you know it's a laborious process. That's how it was in the 1950s. At that time, the first programming languages were being developed to make programming easier. Most of these languages had specific applications and were often tied directly to the architecture of the computers on which they ran. Additionally, they were modeled on natural languages and hard to port between platforms. In 1955 a symposium on automated computing was held in Darmstadt, Germany. A few of the speakers talked about the need for better and more expressive programming languages. Ideally, there would be one algorithmic language, platform-independent, that would allow researchers from around the world to collaborate on new computing methods. A universal language was needed, so we could stop thinking in terms of bits and registers and start thinking in terms of algorithms and data structures. The Gesellschaft für Angewandte Mathematik und Mechanik (GAMM), or "Society for Applied Mathematics and Mechanics", thought it was a great idea, especially the universal part. They realized that creating a language on their own was possible, but that it surely wouldn't be accepted worldwide. That's why in 1957 they decided to reach out to the American Association for Computing Machinery (ACM) and invite them to collaborate.
It was a good call, since the Americans had started to think about a similar idea at the time. They agreed to create a brand new language together.

ALGOL Enters the Stage

ALGOL stands for Algorithmic Language. Initially, it was to be called IAL (International Algebraic Language), but it was soon realized that the acronym was not very convenient. The first version emerged from a meeting of an international committee of scientists from Europe and the United States in Zurich in 1958, hence ALGOL 58. It wasn't quite a programming language in today's sense. As John Backus, one of ALGOL's creators, wrote: The ACM-GAMM conference in Zurich had two main motives in creating IAL: (a) to provide a way to communicate numerical methods and other procedures among people (b) to provide a way to execute a fixed process on various machines. In 1958, the goal was to create a standard description of what was happening on computers. Although ALGOL 58 had several implementations, it did not gain wide recognition. It introduced the concept of the compound statement, but its use was limited to control flow. Other than that it had types, loops, conditionals, procedures, and switches. It soon became clear that the outcomes were not satisfying. By June 1959, it was evident that further work was needed, and broad consultations were announced to gather material for another meeting, where a new version of the language would be developed. The Americans focused on the practical merits of the language, mainly because in the USA programming was slowly becoming a profession and the practical merits of ALGOL 58 were not great. The Europeans, however, made more proposals concerning the foundations of the language. That was the beginning of ALGOL 60. Yes, you guessed it: ALGOL 60, because it was proposed in 1960. The meeting this time took place in Paris, with thirteen scientists participating. Let me quickly introduce them to show the caliber of people involved:

- Friedrich L. Bauer - creator of the term "software engineering"
- Peter Naur - editor of the ALGOL 60 report, co-author of the Backus-Naur notation (although he strongly disclaimed this)
- Heinz Rutishauser - mathematician, pioneer of modern numerical analysis and computer science
- Klaus Samelson - a pioneer in the fields of compilers and stack algorithms
- Bernard Vauquois - a pioneer in machine translation
- Adriaan van Wijngaarden - father of Dutch computer science
- Michael Woodger - co-creator of the Pilot ACE, one of the first British computers, and co-creator of the Ada language
- John W. Backus - creator of FORTRAN (led the team), creator of the Backus-Naur notation, and creator of the function-level programming paradigm (not to be confused with functional programming, a very interesting concept that I had never heard of)
- Julien Green - also involved in ALGOL 58
- Charles Katz - worked on compilers for the first computers
- John McCarthy - creator of the term "artificial intelligence" and researcher in the field, for which he received the Turing Award; creator of Lisp
- Alan Perlis - the first recipient of the Turing Award, awarded for his contributions to advanced programming techniques and compiler construction
- Joseph Henry Wegstein - worked on automatic data processing and fingerprint recognition, and participated in the work on COBOL

Alan Perlis described the Paris meetings this way: The meetings were exhausting, interminable, and exhilarating. One became aggravated when one's good ideas were discarded along with the bad ones of others. Nevertheless, diligence persisted during the entire period. The chemistry of the 13 was excellent. On every point at issue there were opinions and solutions, with descriptive prose to support them that ranged from the technically clever to the breathtakingly simple. But there was rarely a complete cancellation. Progress was steady and the output, ALGOL 60, was more racehorse than camel.
The photo shows six participants of the Paris conference in 1960, dedicated to ALGOL. In the top row: John McCarthy, Friedrich Bauer, Joseph Wegstein. In the bottom row: John Backus, Peter Naur, Alan Perlis.

Imagine thirteen scientists, engineers, and researchers from several countries. Each threw in their ideas on what the language should look like (okay, preliminary proposals had been selected earlier). In 1960, we were dealing with a greenfield: there were no limitations arising from history, because the field was new. Although there was already a previous version of ALGOL, as well as FORTRAN and the modest languages of the 50s, all of them lacked a lot in terms of describing algorithms and expressiveness. So there was plenty of room for improvement. From all this, it was necessary to crystallize a coherent, pure language that would be the basic tool used by computer scientists in everyday work. What did the committee come up with?

ALGOL 60 Features

Certainly, one of the more spectacular new features was the introduction of blocks. Blocks were denoted with the keywords begin and end. Additionally, a block now had its own scope. This also allowed further development of the compound statement originally introduced in ALGOL 58. ALGOL was the first language to allow the definition of nested functions, which had their own lexical scope. The authors of the report recommended implementing 9 standard functions: abs, sign, sqrt, sin, cos, arctan, ln, exp, and entier. It's not a lot, considering the standard libraries of contemporary languages. This shows how pure ALGOL was. Another important tool was recursion. It was proposed by John McCarthy, who had just completed work on Lisp, where he included this possibility. However, the proposal met with considerable concern from the committee, and the attending members rejected the introduction of the keyword recursive. Ultimately, recursion made it into the standard implicitly: the ALGOL specification does not prohibit recursive calls.
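Nested functions with lexical scope and implicit recursion survive essentially unchanged in modern languages. As a rough analogue, here is a sketch in Python rather than ALGOL 60 (the names are mine, purely illustrative): the inner procedure sees the enclosing block's variable, and recursion needs no special keyword, just as ALGOL 60's specification simply never forbade a procedure calling itself.

```python
def outer():
    base = 2  # a "block-local" variable, visible to the nested procedure

    def power(n):
        # Lexical scope: `base` is resolved in the enclosing block.
        # Recursion works without any special declaration.
        if n == 0:
            return 1
        return base * power(n - 1)

    return power(10)

print(outer())  # 1024
```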
Another curiosity was the two strategies for passing parameters. The first is the well-known pass-by-value. The second, however, is pass-by-name: arguments are not evaluated until they appear in the body of the procedure. Theoretically, this can provide a performance advantage (an argument that is never mentioned is never evaluated), but it creates certain challenges for compilers.

ALGOL Reception

The report produced in 1960 was a wonderful thing, but we must not forget that it was only a dozen-page description of the language. There was no reference implementation. Moreover, ALGOL 60 did not specify input and output operations at all! Thus, it's hard to talk about a "Hello World" program written in ALGOL 60. The report itself was incredibly concise, containing very few superfluous words, and, as Alan Perlis said:

Like the Bible, it was meant not merely to be read, but to be interpreted.

So, one of the challenges waiting for the creators of individual implementations was to interpret input and output operations in the spirit of the specification. The first compiler was X1 ALGOL 60, developed by Edsger Dijkstra (yes, the same one from the algorithm lectures) and Jacob Zonneveld for the Dutch computer Electrologica X1.

Electrologica X1, circa 1960

By 1963, compilers had appeared for several of the computers of the time, including UNIVAC and the Elliott machines (that compiler was written by Tony Hoare). In the second half of the 60s, implementations of ALGOL also appeared on computers from the Soviet bloc. From 1965, you could program in ALGOL on the MINSK computers, and from 1967 on the Polish computer ZAM. The biggest problem with ALGOL was the lack of support. FORTRAN, developed by IBM, was not as advanced, but it had full support from its parent company. IBM itself wrote reference implementations for subsequent platforms, making FORTRAN seem more suitable for commercial applications: it was standardized. This is how Robert Bemer spoke about the new language in 1961:

No reasonable mechanism for maintenance seems to exist.
No one seems to be able to answer the basic question, "What is ALGOL?" I foresee a difficult time for ALGOL unless a revised maintenance procedure is devised. The language is too general and has too many ambiguities.

Bemer was right in that the report indeed had several omissions and lacked a mechanism for correcting them. Ultimately, neither the creators nor those interested in ALGOL established a support mechanism or a development mechanism, which contributed to the fiasco of ALGOL 68, a version rejected by the community. While ALGOL did not achieve spectacular commercial success, scientists received it very enthusiastically. Hence, it became the main language for describing algorithms for the next 20-30 years. Its clean and expressive syntax facilitated communication among people. The lack of reference implementations, however, made using the language for human-machine communication difficult. Since the computer industry in America was much more developed than in Europe, practical considerations were paramount there; the issues of support and compatibility were crucial. In Europe, however, computer science was almost exclusively the domain of research centers. Scientists had different priorities, and the new language was very well suited for research on algorithms. Thus, ALGOL was much better received on the Old Continent.

What is the legacy of ALGOL?

The impact of ALGOL on later languages was huge. It inspired most imperative programming languages. Among the more significant ones are Pascal, Simula (the first object-oriented language), and C. In essence, all subsequent programming languages include blocks and recursion. Work on ALGOL 68 also brought about case expressions and the concept of records along with references, as well as pass-by-reference. ALGOL 60 was the first programming language to be defined in BNF, or Backus-Naur form. The idea is to build up a language from already defined concepts, with minimal references to natural language.
The general form is:

<metalinguistic variable> ::= <form>

A simple example:

<digit> ::= 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9

A more complex example:

<if clause> ::= if <Boolean expression> then
<unconditional statement> ::= <basic statement> | <compound statement> | <block>
<if statement> ::= <if clause> <unconditional statement>
<conditional statement> ::= <if statement> | <if statement> else <statement> | <if clause> <for statement> | <label>: <conditional statement>

The first example doesn't look too impressive, but the second gives a better understanding of the possibilities. Thanks to this notation, one can effectively describe a programming language without external context. Since then, this notation and its developments have become the standard for describing programming languages. You can see it everywhere, for example in the specifications of Python or JavaScript.

ALGOL also became a catalyst in the development of compilers. The first compilers were simple translators of formulas into machine code, but ALGOL required something more: procedures, variable-sized arrays, and recursion didn't fit into the sequential translation technique. To implement them correctly and efficiently, new, much more advanced compilation techniques had to be developed.

All this means that although it has completely fallen out of use, ALGOL was an extraordinary language. The effort put into its development and implementation elevated computer science to another level. One can imagine that this development was inevitable. However, it seems to me that thanks to the quality of ALGOL, it happened somewhat faster. Many concepts could be transferred painlessly to other programming languages or developed further, and that's what we benefit from today.
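One legacy feature worth a concrete sketch is the pass-by-name strategy mentioned earlier. It can be emulated in a modern language by wrapping an argument in a thunk that is re-evaluated at every mention; the classic demonstration is Jensen's device. A minimal sketch in Python (the names are mine; real ALGOL compilers generated hidden "thunk" procedures for this):

```python
def sum_by_name(i_ref, lo, hi, term):
    # Emulates ALGOL call-by-name: `term` is a thunk re-evaluated at
    # each mention, and `i_ref` lets the procedure assign to the
    # caller's loop variable.
    total = 0.0
    for i in range(lo, hi + 1):
        i_ref[0] = i       # write back into the caller's variable
        total += term()    # each "mention" re-evaluates the argument
    return total

i = [0]  # a one-element list stands in for an assignable name
# Jensen's device: passing the expression i*i "by name" yields the
# sum of squares 1^2 + 2^2 + ... + 10^2 = 385.
print(sum_by_name(i, 1, 10, lambda: i[0] * i[0]))  # 385.0
```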
Source: CS:4980:4, Notes 13, Fall 2015, by Douglas W. Jones (https://homepage.divms.uiowa.edu/~jones/retro/notes/13.shtml)
By the late 1950s, core memory sizes greater than 4K were natural, and high-level languages were starting to emerge. FORTRAN was introduced in 1957, and preliminary discussions of Algol were being widely circulated in 1958, leading to a solid specification for Algol 60, released in 1960. FORTRAN II, released in 1958, had parameterized subroutines and functions, local variables, and global variables of a sort (named COMMON blocks). Algol 60 had hierarchical scope rules with nesting of blocks. This required that programmers start to think about how to use memory in ways they had not in the era of very small memories. FORTRAN's memory model was based entirely on static storage allocation, but members of the Algol committee began to think in terms of dynamic allocation. How storage was to be allocated was not the subject of discussion, but the idea of using a stack was implicit in their thinking. The paper assigned for this discussion (free if referenced from on campus; otherwise, extraordinarily expensive) is the first clear public discussion I know of to explain how to implement and use Algol-like languages with their implied use of a stack. For most programmers of the era, this paper was the first introduction to stacks, and as we will see, it had a huge influence on computer architecture, just as Algol 60 had a huge influence on programming languages.

An aside: Algol 60 begat CPL (Christopher Strachey's Combined or Cambridge Programming Language), which begat BCPL (Basic CPL), which begat B (at Bell Labs), which begat C and C++ (at Bell Labs), which begat Java (at Sun Microsystems). Algol 60 also begat Algol 68, Pascal and Simula 67. Simula 67 was upward compatible with Algol 60 and had objects (C++ added objects to C). Pascal (developed by Wirth) begat Ada (developed by Honeywell Bull in France).

Today, we take for granted the idea of local variables allocated on a stack, allowing for the possibility of recursion and subroutine calls. This was not always so.
Programmers in 1960 assumed static memory allocation for just about everything, and Dijkstra's paper, the reading for today, demonstrates how hard it was to communicate these new ideas.
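The stack discipline these notes describe can be made concrete by eliminating recursion by hand. In this sketch (in Python; the function name is mine), a plain list plays the role of the runtime stack of activation records that an Algol 60 implementation maintains: one "frame" is pushed per simulated call, then popped as the calls return.

```python
def factorial_with_stack(n):
    # Phase 1: "call" phase, push one activation record (here just the
    # local value of n) per simulated recursive call.
    stack = [n]
    while stack[-1] > 1:
        stack.append(stack[-1] - 1)
    # Phase 2: "return" phase, pop records, combining each local value
    # with the result returned by the deeper call.
    result = 1
    while stack:
        result *= stack.pop()
    return result

print(factorial_with_stack(5))  # 120
```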
Source: John Backus, ETHW, 2017-04-12 (https://ethw.org/John_Backus)
Biography

John Warner Backus was an American computer scientist. He led the team that invented the first widely used high-level programming language (FORTRAN) and was the inventor of the Backus-Naur form (BNF), the almost universally used notation to define formal language syntax. He also did research in function-level programming and helped to popularize it. The IEEE awarded Backus the W.W. McDowell Award in 1967 for the development of FORTRAN. He received the National Medal of Science in 1975, and the 1977 ACM Turing Award "for profound, influential, and lasting contributions to the design of practical high-level programming systems, notably through his work on FORTRAN, and for seminal publication of formal procedures for the specification of programming languages."

Backus was born in Philadelphia, Pennsylvania, but grew up in Wilmington, Delaware. He studied at The Hill School in Pottstown, Pennsylvania, and was apparently not a diligent student. After entering the University of Virginia to study chemistry, he quit and was drafted into the U.S. Army. He began medical training and, during an internship at a hospital, was diagnosed with a brain tumor, which was successfully removed; a plate was installed in his head. He dropped out of medical training after nine months and a subsequent operation to replace the plate with one of his own design. After moving to New York City he initially took training as a radio technician and discovered an interest in mathematics; it would prove to be his calling. He graduated from Columbia University with a Master's degree in 1949, and joined IBM in 1950. During his first three years, he worked on the Selective Sequence Electronic Calculator (SSEC); his first major project was to write a program to calculate positions of the Moon. In 1953, Backus also developed the language Speedcoding, the first higher-level language created for an IBM computer.
The difficulties of programming were acute, and in 1954 Backus assembled a team to define and develop Fortran for the IBM 704 computer. Though debatably not the first high-level programming language, it was the first to achieve wide use. Backus made another critical contribution to early computer science: during the latter part of the 1950s he served on the international committees which developed ALGOL 58 and the very influential ALGOL 60, which quickly became the de facto worldwide standard for publishing algorithms. Backus developed the Backus-Naur Form (BNF) in the UNESCO report on ALGOL 58. This was a formal notation with which one could describe any context-free programming language, and it was important in the development of compilers. This contribution helped Backus win the Turing Award. He later worked on a "function-level" programming language known as FP, which was described in his Turing Award lecture "Can Programming Be Liberated from the von Neumann Style?" Sometimes viewed as Backus's apology for creating FORTRAN, this paper did less to garner interest in the FP language than to spark research into functional programming in general. An FP interpreter was distributed with the 4.2BSD Unix operating system. FP was strongly inspired by Kenneth E. Iverson's APL, even using a non-standard character set. Backus spent the latter part of his career developing FL (from "Function Level"), a successor to FP. FL was an internal IBM research project, and development of the language essentially stopped when the project was finished (only a few papers documenting it remain), but many of the language's innovative, arguably important ideas have since been implemented in Iverson's J programming language. Backus was named an IBM Fellow in 1963, and was awarded an honorary degree (honoris causa) by the University Henri Poincaré in Nancy, France, in 1989, and the Draper Prize in 1993. He retired in 1991 and died at his home in Ashland, Oregon on March 17, 2007.
Source: Fortran facts for kids, kiddle.co (https://kids.kiddle.co/Fortran)
Quick facts for kids: Fortran

Paradigm: multi-paradigm: structured, imperative (procedural, object-oriented), generic, array
Designed by: John Backus
Developer: John Backus and IBM
First appeared: 1957
Stable release: Fortran 2023 (ISO/IEC 1539:2023), November 17, 2023
Typing discipline: strong, static, manifest
Filename extensions: .f90, .f, .for
Major implementations: Absoft, Cray, GFortran, G95, IBM XL Fortran, Intel, Hitachi, Lahey/Fujitsu, Numerical Algorithms Group, Open Watcom, PathScale, PGI, Silverfrost, Oracle Solaris Studio, others
Influenced by: Speedcoding
Influenced: ALGOL 58, BASIC, C, Chapel, CMS-2, DOPE, Fortress, MATLAB, PL/I, PACT I, MUMPS, IDL, Ratfor

Fortran (formerly FORTRAN) is a third-generation, compiled, imperative programming language that is especially suited to numeric computation and scientific computing. Fortran was originally developed by IBM. It first compiled correctly in 1958. Fortran computer programs have been written to support scientific and engineering applications, such as numerical weather prediction, finite element analysis, computational fluid dynamics, geophysics, computational physics, crystallography and computational chemistry. It is a popular language for high-performance computing and is used for programs that benchmark and rank the world's fastest supercomputers. Fortran has evolved through numerous versions and dialects. In 1966, the American National Standards Institute (ANSI) developed a standard for Fortran because new compilers would slightly change the syntax. Nonetheless, successive versions have added support for strings (Fortran 77), structured programming, array programming, modular programming, generic programming (Fortran 90), parallel computing (Fortran 95), object-oriented programming (Fortran 2003), and concurrent programming (Fortran 2008).
Since April 2024, Fortran has ranked among the top ten languages in the TIOBE index, a measure of the popularity of programming languages. Fortran has received over 800,000 euros from Germany's Sovereign Tech Fund.

Naming

The first manual for FORTRAN described it as a Formula Translating System and printed the name in small caps, Fortran. Other sources suggest the name stands for Formula Translator or Formula Translation. Early IBM computers did not support lowercase letters, and the names of versions of the language through FORTRAN 77 were usually spelled in all uppercase. Note that FORTRAN 77 was the last version in which the Fortran character set included only uppercase letters. The official language standards for Fortran have referred to the language as "Fortran" with initial caps since Fortran 90.

Origins

In late 1953, John W. Backus submitted a proposal to his superiors at IBM to develop a more practical alternative to assembly language for programming their IBM 704 mainframe computer. Backus' historic FORTRAN team consisted of programmers Richard Goldberg, Sheldon F. Best, Harlan Herrick, Peter Sheridan, Roy Nutt, Robert Nelson, Irving Ziller, Harold Stern, Lois Haibt, and David Sayre. Its concepts included easier entry of equations into a computer, an idea developed by J. Halcombe Laning and demonstrated in the Laning and Zierler system of 1952. A draft specification for The IBM Mathematical Formula Translating System was completed by November 1954. The first manual for FORTRAN appeared in October 1956, with the first FORTRAN compiler delivered in April 1957. Fortran produced code efficient enough for assembly language programmers to accept a high-level programming language replacement. John Backus said during a 1979 interview with Think, the IBM employee magazine, "Much of my work has come from being lazy.
I didn't like writing programs, and so, when I was working on the IBM 701, writing programs for computing missile trajectories, I started work on a programming system to make it easier to write programs." The language was widely adopted by scientists for writing numerically intensive programs, which encouraged compiler writers to produce compilers that could generate faster and more efficient code. The inclusion of a complex number data type in the language made Fortran especially suited to technical applications such as electrical engineering. By 1960, versions of FORTRAN were available for the IBM 709, 650, 1620, and 7090 computers. Significantly, the increasing popularity of FORTRAN spurred competing computer manufacturers to provide FORTRAN compilers for their machines, so that by 1963 over 40 FORTRAN compilers existed. FORTRAN was provided for the IBM 1401 computer by an innovative 63-phase compiler that ran entirely in its core memory of only 8000 (six-bit) characters. The compiler could be run from tape, or from a 2200-card deck; it used no further tape or disk storage. It kept the program in memory and loaded overlays that gradually transformed it, in place, into executable form, as described by Haines. This article was reprinted, edited, in both editions of Anatomy of a Compiler and in the IBM manual "Fortran Specifications and Operating Procedures, IBM 1401". The executable form was not entirely machine language; rather, floating-point arithmetic, subscripting, input/output, and function references were interpreted, preceding UCSD Pascal P-code by two decades. GOTRAN, a simplified, interpreted version of FORTRAN I (with only 12 statements, not 32) for "load and go" operation, was available (at least for the early IBM 1620 computer). Modern Fortran, and almost all later versions, are fully compiled, as is done for other high-performance languages.
The development of Fortran paralleled the early evolution of compiler technology, and many advances in the theory and design of compilers were specifically motivated by the need to generate efficient code for Fortran programs.

FORTRAN

The initial release of FORTRAN for the IBM 704 contained 32 statements, including:

DIMENSION and EQUIVALENCE statements
Assignment statements
The three-way arithmetic IF statement, which passed control to one of three locations in the program depending on whether the result of the arithmetic expression was negative, zero, or positive
IF statements for checking exceptions (ACCUMULATOR OVERFLOW, QUOTIENT OVERFLOW, and DIVIDE CHECK), and IF statements for manipulating sense switches and sense lights
GO TO, computed GO TO, ASSIGN, and assigned GO TO
DO loops
Formatted I/O: FORMAT, READ, READ INPUT TAPE, WRITE, WRITE OUTPUT TAPE, PRINT, and PUNCH
Unformatted I/O: READ TAPE, READ DRUM, WRITE TAPE, and WRITE DRUM
Other I/O: END FILE, REWIND, and BACKSPACE
PAUSE, STOP, and CONTINUE
The FREQUENCY statement (for providing optimization hints to the compiler)

The arithmetic IF statement was reminiscent of (but not readily implementable by) a three-way comparison instruction (CAS, Compare Accumulator with Storage) available on the 704. The statement provided the only way to compare numbers: by testing their difference, with an attendant risk of overflow. This deficiency was later overcome by "logical" facilities introduced in FORTRAN IV. The FREQUENCY statement was used originally (and optionally) to give branch probabilities for the three branch cases of the arithmetic IF statement. The first FORTRAN compiler used this weighting to perform at compile time a Monte Carlo simulation of the generated code, the results of which were used to optimize the placement of basic blocks in memory: a very sophisticated optimization for its time.
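The semantics of the three-way arithmetic IF described above are easy to model in a modern language. A sketch in Python (the function name is mine): the value of the expression selects one of three statement labels, exactly as `IF (IA) 777, 777, 701` sends a negative or zero `IA` to label 777 and a positive one to 701.

```python
def arithmetic_if(value, neg_label, zero_label, pos_label):
    # Models FORTRAN's three-way arithmetic IF: control passes to one
    # of three statement labels depending on the sign of the expression.
    if value < 0:
        return neg_label
    if value == 0:
        return zero_label
    return pos_label

# IF (IA) 777, 777, 701 with IA = 5: positive, so control goes to 701.
print(arithmetic_if(5, 777, 777, 701))  # 701
```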
The Monte Carlo technique is documented in Backus et al.'s paper on this original implementation, The FORTRAN Automatic Coding System:

The fundamental unit of program is the basic block; a basic block is a stretch of program which has one entry point and one exit point. The purpose of section 4 is to prepare for section 5 a table of predecessors (PRED table) which enumerates the basic blocks and lists for every basic block each of the basic blocks which can be its immediate predecessor in flow, together with the absolute frequency of each such basic block link. This table is obtained by running the program once in Monte-Carlo fashion, in which the outcome of conditional transfers arising out of IF-type statements and computed GO TO's is determined by a random number generator suitably weighted according to whatever FREQUENCY statements have been provided.

The first FORTRAN compiler reported diagnostic information by halting the program when an error was found and outputting an error code on its console. That code could be looked up by the programmer in an error messages table in the operator's manual, providing them with a brief description of the problem. Later, an error-handling subroutine to handle user errors such as division by zero, developed by NASA, was incorporated, informing users of which line of code contained the error.

Fixed layout and punched cards

See also: Computer programming in the punched card era

Before the development of disk files, text editors and terminals, programs were most often entered on a keypunch keyboard onto 80-column punched cards, one line to a card. The resulting deck of cards would be fed into a card reader to be compiled. Punched card codes included no lower-case letters or many special characters, and special versions of the IBM 026 keypunch were offered that would correctly print the re-purposed special characters used in FORTRAN.
Reflecting punched card input practice, Fortran programs were originally written in a fixed-column format, with the first 72 columns read into twelve 36-bit words. A letter "C" in column 1 caused the entire card to be treated as a comment and ignored by the compiler. Otherwise, the columns of the card were divided into four fields:

1 to 5 were the label field: a sequence of digits here was taken as a label for use in DO or control statements such as GO TO and IF, or to identify a FORMAT statement referred to in a WRITE or READ statement. Leading zeros are ignored and 0 is not a valid label number.
6 was a continuation field: a character other than a blank or a zero here caused the card to be taken as a continuation of the statement on the prior card. The continuation cards were usually numbered 1, 2, etc., and the starting card might therefore have zero in its continuation column, which is not a continuation of its preceding card.
7 to 72 served as the statement field.
73 to 80 were ignored (the IBM 704's card reader only used 72 columns). These columns could therefore be used for identification information, such as punching a sequence number or text, which could be used to re-order cards if a stack of cards was dropped; though in practice this was reserved for stable, production programs.

An IBM 519 could be used to copy a program deck and add sequence numbers. Some early compilers, e.g., the IBM 650's, had additional restrictions due to limitations on their card readers. Keypunches could be programmed to tab to column 7 and skip out after column 72. Later compilers relaxed most fixed-format restrictions, and the requirement was eliminated in the Fortran 90 standard. Within the statement field, whitespace characters (blanks) were ignored outside a text literal. This allowed omitting spaces between tokens for brevity or including spaces within identifiers for clarity.
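The column rules above amount to a simple splitting procedure. A hedged sketch in Python (the function name is mine, and this simplification also strips blanks from the statement field, ignoring the text-literal exception):

```python
def parse_card(card):
    # Split one 80-column FORTRAN card into the four fixed fields
    # described above; pad short lines out to 80 columns first.
    card = card.ljust(80)
    label = card[0:5].strip()                    # columns 1-5: label field
    is_continuation = card[5] not in (" ", "0")  # column 6: continuation
    statement = card[6:72].replace(" ", "")      # columns 7-72: statement
    sequence = card[72:80].strip()               # columns 73-80: ignored
    return label, is_continuation, statement, sequence

label, cont, stmt, _ = parse_card("  101 AVG OF X = SUM / N")
print(label, cont, stmt)  # 101 False AVGOFX=SUM/N
```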
For example, AVG OF X was a valid identifier, equivalent to AVGOFX, and 101010DO101I=1,101 was a valid statement, equivalent to 10101 DO 101 I = 1, 101, because the zero in column 6 is treated as if it were a space (!), while 101010DO101I=1.101 was instead 10101 DO101I = 1.101, the assignment of 1.101 to a variable called DO101I. Note the slight visual difference between a comma and a period. Hollerith strings, originally allowed only in FORMAT and DATA statements, were prefixed by a character count and the letter H (e.g., 26HTHIS IS ALPHANUMERIC DATA.), allowing blanks to be retained within the character string. Miscounts were a problem.

Evolution

Timeline of the Fortran language:

Year  Informal name  ANSI standard  ISO/IEC standard
1957  FORTRAN        N/A            N/A
1958  FORTRAN II     N/A            N/A
1958  FORTRAN III    N/A            N/A
1961  FORTRAN IV     N/A            N/A
1966  FORTRAN 66     X3.9-1966      N/A
1978  FORTRAN 77     X3.9-1978      N/A
1991  Fortran 90     X3.198-1992    1539:1991
1997  Fortran 95     N/A            1539-1:1997
2004  Fortran 2003   N/A            1539-1:2004
2010  Fortran 2008   N/A            1539-1:2010
2018  Fortran 2018   N/A            1539-1:2018
2023  Fortran 2023   N/A            1539-1:2023

FORTRAN II

IBM's FORTRAN II appeared in 1958. The main enhancement was to support procedural programming by allowing user-written subroutines and functions which returned values, with parameters passed by reference. The COMMON statement provided a way for subroutines to access common (or global) variables. Six new statements were introduced:

SUBROUTINE, FUNCTION, and END
CALL and RETURN
COMMON

Over the next few years, FORTRAN II added support for the DOUBLE PRECISION and COMPLEX data types. Early FORTRAN compilers did not support recursion in subroutines. Early computer architectures had no concept of a stack, and when they did directly support subroutine calls, the return location was often stored in one fixed location adjacent to the subroutine code (e.g.
the IBM 1130) or in a specific machine register (IBM 360 et seq). This allows recursion only if a stack is maintained by software and the return address is stored on the stack before the call is made and restored after the call returns. Although not specified in FORTRAN 77, many F77 compilers supported recursion as an option, and the Burroughs mainframes, designed with recursion built in, did so by default. Recursion became standard in Fortran 90 via the new keyword RECURSIVE.

Simple FORTRAN II program

This program, for Heron's formula, reads data from a tape reel containing three 5-digit integers A, B, and C as input. There are no "type" declarations available: variables whose name starts with I, J, K, L, M, or N are "fixed-point" (i.e. integers); otherwise they are floating-point. Since integers are to be processed in this example, the names of the variables start with the letter "I". The name of a variable must start with a letter and can continue with both letters and digits, up to a limit of six characters in FORTRAN II. If A, B, and C cannot represent the sides of a triangle in plane geometry, then the program's execution will end with an error code of "STOP 1". Otherwise, an output line will be printed showing the input values for A, B, and C, followed by the computed AREA of the triangle as a floating-point number occupying ten spaces along the line of output and showing 2 digits after the decimal point, the .2 in F10.2 of the FORMAT statement with label 601.
C     AREA OF A TRIANGLE WITH A STANDARD SQUARE ROOT FUNCTION
C     INPUT - TAPE READER UNIT 5, INTEGER INPUT
C     OUTPUT - LINE PRINTER UNIT 6, REAL OUTPUT
C     INPUT ERROR DISPLAY ERROR OUTPUT CODE 1 IN JOB CONTROL LISTING
      READ INPUT TAPE 5, 501, IA, IB, IC
  501 FORMAT (3I5)
C     IA, IB, AND IC MAY NOT BE NEGATIVE OR ZERO
C     FURTHERMORE, THE SUM OF TWO SIDES OF A TRIANGLE
C     MUST BE GREATER THAN THE THIRD SIDE, SO WE CHECK FOR THAT, TOO
      IF (IA) 777, 777, 701
  701 IF (IB) 777, 777, 702
  702 IF (IC) 777, 777, 703
  703 IF (IA+IB-IC) 777, 777, 704
  704 IF (IA+IC-IB) 777, 777, 705
  705 IF (IB+IC-IA) 777, 777, 799
  777 STOP 1
C     USING HERON'S FORMULA WE CALCULATE THE
C     AREA OF THE TRIANGLE
  799 S = FLOATF (IA + IB + IC) / 2.0
      AREA = SQRTF( S * (S - FLOATF(IA)) * (S - FLOATF(IB)) *
     +       (S - FLOATF(IC)))
      WRITE OUTPUT TAPE 6, 601, IA, IB, IC, AREA
  601 FORMAT (4H A= ,I5,5H  B= ,I5,5H  C= ,I5,8H  AREA= ,F10.2,
     +        13H SQUARE UNITS)
      STOP
      END

FORTRAN III

IBM also developed a FORTRAN III in 1958 that allowed for inline assembly code among other features; however, this version was never released as a product. Like the 704 FORTRAN and FORTRAN II, FORTRAN III included machine-dependent features that made code written in it unportable from machine to machine. Early versions of FORTRAN provided by other vendors suffered from the same disadvantage.

FORTRAN IV

IBM began development of FORTRAN IV in 1961, as a result of customer demands. FORTRAN IV removed the machine-dependent features of FORTRAN II (such as READ INPUT TAPE), while adding new features such as a LOGICAL data type, logical Boolean expressions, and the logical IF statement as an alternative to the arithmetic IF statement. FORTRAN IV was eventually released in 1962, first for the IBM 7030 ("Stretch") computer, followed by versions for the IBM 7090, IBM 7094, and later for the IBM 1401 in 1966. By 1965, FORTRAN IV was supposed to be compliant with the standard being developed by the American Standards Association X3.4.3 FORTRAN Working Group.
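For comparison, the FORTRAN II triangle-area listing above reduces to a few lines in a modern language. A sketch in Python (the function name is mine; the validation mirrors the chain of arithmetic IFs, and bad input raises an error where the original halted with "STOP 1"):

```python
import math

def triangle_area(ia, ib, ic):
    # Same checks as the FORTRAN II listing: sides must be positive
    # and must satisfy the triangle inequality.
    if min(ia, ib, ic) <= 0 or ia + ib <= ic or ia + ic <= ib or ib + ic <= ia:
        raise ValueError("STOP 1")
    s = (ia + ib + ic) / 2.0  # semi-perimeter, as at label 799
    # Heron's formula, matching the AREA = SQRTF(...) statement.
    return math.sqrt(s * (s - ia) * (s - ib) * (s - ic))

print(round(triangle_area(3, 4, 5), 2))  # a 3-4-5 triangle has area 6.0
```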
Between 1966 and 1968, IBM offered several FORTRAN IV compilers for its System/360, each named by letters that indicated the minimum amount of memory the compiler needed to run. The letters (F, G, H) matched the codes used with System/360 model numbers to indicate memory size, each letter increment being a factor of two larger:

- 1966: FORTRAN IV F for DOS/360 (64K bytes)
- 1966: FORTRAN IV G for OS/360 (128K bytes)
- 1968: FORTRAN IV H for OS/360 (256K bytes)

Digital Equipment Corporation maintained DECSYSTEM-10 Fortran IV (F40) for the PDP-10 from 1967 to 1975. Compilers were also available for the UNIVAC 1100 series and the Control Data 6000 series and 7000 series systems.

At about this time FORTRAN IV had started to become an important educational tool, and implementations such as the University of Waterloo's WATFOR and WATFIV were created to simplify the complex compile and link processes of earlier compilers.

In the FORTRAN IV programming environment of the era, except for that used on Control Data Corporation (CDC) systems, only one instruction was placed per line. The CDC version allowed for multiple instructions per line if separated by a $ (dollar) character. The FORTRAN sheet was divided into four fields, as described above. Two compilers of the time, IBM "G" and UNIVAC, allowed comments to be written on the same line as instructions, separated by a special character: the "master space" (perforations 7 and 8) for UNIVAC and perforations 12/11/0/7/8/9 (hexadecimal FF) for IBM. These comments were not to be inserted in the middle of continuation cards.

FORTRAN 66

Perhaps the most significant development in the early history of FORTRAN was the decision by the American Standards Association (now American National Standards Institute (ANSI)) to form a committee sponsored by the Business Equipment Manufacturers Association (BEMA) to develop an American Standard Fortran.
The resulting two standards, approved in March 1966, defined two languages, FORTRAN (based on FORTRAN IV, which had served as a de facto standard), and Basic FORTRAN (based on FORTRAN II, but stripped of its machine-dependent features). The FORTRAN defined by the first standard, officially denoted X3.9-1966, became known as FORTRAN 66 (although many continued to term it FORTRAN IV, the language on which the standard was largely based). FORTRAN 66 effectively became the first industry-standard version of FORTRAN. FORTRAN 66 included:

- Main program, SUBROUTINE, FUNCTION, and BLOCK DATA program units
- INTEGER, REAL, DOUBLE PRECISION, COMPLEX, and LOGICAL data types
- COMMON, DIMENSION, and EQUIVALENCE statements
- DATA statement for specifying initial values
- Intrinsic and EXTERNAL (e.g., library) functions
- Assignment statement
- GO TO, computed GO TO, assigned GO TO, and ASSIGN statements
- Logical IF and arithmetic (three-way) IF statements
- DO loop statement
- READ, WRITE, BACKSPACE, REWIND, and ENDFILE statements for sequential I/O
- FORMAT statement and assigned format
- CALL, RETURN, PAUSE, and STOP statements
- Hollerith constants in DATA and FORMAT statements, and as arguments to procedures
- Identifiers of up to six characters in length
- Comment lines
- END line

FORTRAN 77

After the release of the FORTRAN 66 standard, compiler vendors introduced several extensions to Standard Fortran, prompting ANSI committee X3J3 in 1969 to begin work on revising the 1966 standard, under sponsorship of CBEMA, the Computer Business Equipment Manufacturers Association (formerly BEMA). Final drafts of this revised standard circulated in 1977, leading to formal approval of the new FORTRAN standard in April 1978.
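The arithmetic (three-way) IF among the FORTRAN 66 features above branches to one of three statement labels according to whether its expression is negative, zero, or positive. A minimal fixed-form sketch (the labels, variable name, and messages are invented for illustration, not taken from the standard):

```fortran
C     SIGN TEST USING THE ARITHMETIC (THREE-WAY) IF.
C     LABELS AND MESSAGES ARE INVENTED FOR THIS SKETCH.
      N = -5
C     BRANCH TO 10, 20, OR 30 AS N IS NEGATIVE, ZERO, OR POSITIVE
      IF (N) 10, 20, 30
   10 WRITE (6, 101)
      GO TO 40
   20 WRITE (6, 102)
      GO TO 40
   30 WRITE (6, 103)
   40 STOP
  101 FORMAT (9H NEGATIVE)
  102 FORMAT (5H ZERO)
  103 FORMAT (9H POSITIVE)
      END
```

The block IF introduced by FORTRAN 77 largely displaced this style, and the arithmetic IF was eventually deleted from the language.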
The new standard, called FORTRAN 77 and officially denoted X3.9-1978, added a number of significant features to address many of the shortcomings of FORTRAN 66:

- Block IF and END IF statements, with optional ELSE IF and ELSE clauses, to provide improved language support for structured programming
- DO loop extensions, including parameter expressions, negative increments, and zero trip counts
- OPEN, CLOSE, and INQUIRE statements for improved I/O capability
- Direct-access file I/O
- IMPLICIT statement, to override implicit conventions that undeclared variables are INTEGER if their name begins with I, J, K, L, M, or N (and REAL otherwise)
- CHARACTER data type, replacing Hollerith strings with vastly expanded facilities for character input and output and processing of character-based data
- PARAMETER statement for specifying constants
- SAVE statement for persistent local variables
- Generic names for intrinsic functions (e.g. SQRT also accepts arguments of other types, such as COMPLEX or REAL*16)
- A set of intrinsics (LGE, LGT, LLE, LLT) for lexical comparison of strings, based upon the ASCII collating sequence. (These ASCII functions were demanded by the U.S. Department of Defense, in their conditional approval vote.)
- A maximum of seven dimensions in arrays, rather than three. Allowed subscript expressions were also generalized.

In this revision of the standard, a number of features were removed or altered in a manner that might invalidate formerly standard-conforming programs. (Removal was the only allowable alternative to X3J3 at that time, since the concept of "deprecation" was not yet available for ANSI standards.) While most of the 24 items in the conflict list (see Appendix A2 of X3.9-1978) addressed loopholes or pathological cases permitted by the prior standard but rarely used, a small number of specific capabilities were deliberately removed, such as:

- Hollerith constants and Hollerith data, such as GREET = 12HHELLO THERE!
- Reading into an H edit (Hollerith field) descriptor in a FORMAT specification
- Overindexing of array bounds by subscripts:
      DIMENSION A(10,5)
      Y = A(11,1)
- Transfer of control out of and back into the range of a DO loop (also known as "Extended Range")

Transition to ANSI Standard Fortran

The development of a revised standard to succeed FORTRAN 77 would be repeatedly delayed as the standardization process struggled to keep up with rapid changes in computing and programming practice. In the meantime, as the "Standard FORTRAN" for nearly fifteen years, FORTRAN 77 would become the historically most important dialect.

An important practical extension to FORTRAN 77 was the release of MIL-STD-1753 in 1978. This specification, developed by the U.S. Department of Defense, standardized a number of features implemented by most FORTRAN 77 compilers but not included in the ANSI FORTRAN 77 standard. These features would eventually be incorporated into the Fortran 90 standard:

- DO WHILE and END DO statements
- INCLUDE statement
- IMPLICIT NONE variant of the IMPLICIT statement
- Bit manipulation intrinsic functions, based on similar functions included in Industrial Real-Time Fortran (ANSI/ISA S61.1 (1976))

The IEEE 1003.9 POSIX Standard, released in 1991, provided a simple means for FORTRAN 77 programmers to issue POSIX system calls. Over 100 calls were defined in the document, allowing access to POSIX-compatible process control, signal handling, file system control, device control, procedure pointing, and stream I/O in a portable manner.

Fortran 90

The much-delayed successor to FORTRAN 77, informally known as Fortran 90 (and prior to that, Fortran 8X), was finally released as ISO/IEC standard 1539:1991 in 1991 and an ANSI Standard in 1992.
In addition to changing the official spelling from FORTRAN to Fortran, this major revision added many new features to reflect the significant changes in programming practice that had evolved since the 1978 standard:

- Free-form source input, which removed the need to skip the first six character positions before entering statements
- Lowercase Fortran keywords
- Identifiers up to 31 characters in length (in the previous standard, it was only six characters)
- Inline comments
- Ability to operate on arrays (or array sections) as a whole, thus greatly simplifying math and engineering computations:
  - whole, partial and masked array assignment statements and array expressions, such as X(1:N)=R(1:N)*COS(A(1:N))
  - WHERE statement for selective array assignment
  - array-valued constants and expressions, user-defined array-valued functions and array constructors
- RECURSIVE procedures
- Modules, to group related procedures and data together, and make them available to other program units, including the capability to limit the accessibility to only specific parts of the module
- A vastly improved argument-passing mechanism, allowing interfaces to be checked at compile time
- User-written interfaces for generic procedures
- Operator overloading
- Derived (structured) data types
- New data type declaration syntax, to specify the data type and other attributes of variables
- Dynamic memory allocation by means of the ALLOCATABLE attribute and the ALLOCATE and DEALLOCATE statements
- POINTER attribute, pointer assignment, and NULLIFY statement to facilitate the creation and manipulation of dynamic data structures
- Structured looping constructs, with an END DO statement for loop termination, and EXIT and CYCLE statements for terminating normal DO loop iterations in an orderly way
- SELECT CASE, CASE, ..., CASE DEFAULT, END SELECT construct for multi-way selection
- Portable specification of numerical precision under the user's control
- New and enhanced intrinsic procedures
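Several of the Fortran 90 features listed above (modules, derived types, ALLOCATABLE arrays, array constructors, and whole-array expressions) can be combined in one brief free-form sketch; the module, type, and variable names here are invented for illustration:

```fortran
module geometry
   implicit none
   type :: point                      ! a derived (structured) data type
      real :: x, y
   end type point
contains
   function norms(ps) result(r)       ! user-defined array-valued function
      type(point), intent(in) :: ps(:)
      real :: r(size(ps))
      r = sqrt(ps%x**2 + ps%y**2)     ! whole-array expression
   end function norms
end module geometry

program f90_demo
   use geometry
   implicit none
   type(point), allocatable :: ps(:)
   allocate (ps(2))                   ! dynamic memory allocation
   ps = (/ point(3.0, 4.0), point(6.0, 8.0) /)   ! array constructor
   print *, norms(ps)                 ! prints 5.0 and 10.0
   deallocate (ps)
end program f90_demo
```

Note the free-form layout, lowercase keywords, inline comments, and identifiers longer than six characters, none of which were possible in standard FORTRAN 77 fixed form.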
Obsolescence and deletions

Unlike the prior revision, Fortran 90 removed no features. Any standard-conforming FORTRAN 77 program was also standard-conforming under Fortran 90, and either standard should have been usable to define its behavior. A small set of features were identified as "obsolescent" and were expected to be removed in a future standard. All of the functionalities of these early-version features can be performed by newer Fortran features. Some are kept to simplify porting of old programs, but many were deleted in Fortran 95.

- Arithmetic IF statement: obsolescent in F90, deleted in F2018
- Non-integer DO parameters or control variables: obsolescent in F90, deleted in F95
- Shared DO-loop termination, or termination with a statement other than END DO or CONTINUE: obsolescent in F90, deleted in F2018
- Branching to END IF from outside a block: obsolescent in F90, deleted in F95
- PAUSE statement: obsolescent in F90, deleted in F95
- ASSIGN statement and assigned GO TO statement: obsolescent in F90, deleted in F95
- Assigned statement numbers and FORMAT specifiers: obsolescent in F90, deleted in F95
- H edit descriptor: obsolescent in F90, deleted in F95
- Vertical format control: deleted in F2003
- Alternate return: obsolescent in F90
- Computed GO TO statement: obsolescent in F90
- Statement functions: obsolescent in F90
- DATA statements among executable statements: obsolescent in F90
- Assumed length character functions: obsolescent in F90
- Fixed form source code: obsolescent in F90
- CHARACTER* form of CHARACTER declaration: obsolescent in F90
- ENTRY statements: obsolescent in F2008
- Label form of DO statement: obsolescent in F2018
- COMMON and EQUIVALENCE statements, and the BLOCK DATA program unit: obsolescent in F2018
- Specific names for intrinsic functions: obsolescent in F2018
- FORALL construct and statement: obsolescent in F2018

"Hello, World!" example

program helloworld
    print *, "Hello, World!"
end program helloworld

Fortran 95

Main article: Fortran 95 language features

Fortran 95, published officially as ISO/IEC 1539-1:1997, was a minor revision, mostly to resolve some outstanding issues from the Fortran 90 standard. Nevertheless, Fortran 95 also added a number of extensions, notably from the High Performance Fortran specification:

- FORALL and nested WHERE constructs to aid vectorization
- User-defined PURE and ELEMENTAL procedures
- Default initialization of derived type components, including pointer initialization
- Expanded ability to use initialization expressions for data objects
- Initialization of pointers to NULL()
- Clear definition that ALLOCATABLE arrays are automatically deallocated when they go out of scope

A number of intrinsic functions were extended (for example, a dim argument was added to the maxloc intrinsic).

Several features noted in Fortran 90 to be "obsolescent" were removed from Fortran 95:

- DO statements using REAL and DOUBLE PRECISION index variables
- Branching to an END IF statement from outside its block
- PAUSE statement
- ASSIGN and assigned GO TO statement, and assigned format specifiers
- H Hollerith edit descriptor

An important supplement to Fortran 95 was the ISO technical report TR-15581: Enhanced Data Type Facilities, informally known as the Allocatable TR. This specification defined enhanced use of ALLOCATABLE arrays, prior to the availability of fully Fortran 2003-compliant Fortran compilers. Such uses include ALLOCATABLE arrays as derived type components, in procedure dummy argument lists, and as function return values. (ALLOCATABLE arrays are preferable to POINTER-based arrays because ALLOCATABLE arrays are guaranteed by Fortran 95 to be deallocated automatically when they go out of scope, eliminating the possibility of memory leakage.
In addition, elements of allocatable arrays are contiguous, and aliasing is not an issue for optimization of array references, allowing compilers to generate faster code than in the case of pointers.)

Another important supplement to Fortran 95 was the ISO technical report TR-15580: Floating-point exception handling, informally known as the IEEE TR. This specification defined support for IEEE floating-point arithmetic and floating-point exception handling.

Conditional compilation and varying length strings

In addition to the mandatory "Base language" (defined in ISO/IEC 1539-1:1997), the Fortran 95 language also includes two optional modules:

- Varying length character strings (ISO/IEC 1539-2:2000)
- Conditional compilation (ISO/IEC 1539-3:1998)

which, together, compose the multi-part International Standard (ISO/IEC 1539). According to the standards developers, "the optional parts describe self-contained features which have been requested by a substantial body of users and/or implementors, but which are not deemed to be of sufficient generality for them to be required in all standard-conforming Fortran compilers." Nevertheless, if a standard-conforming Fortran does provide such options, then they "must be provided in accordance with the description of those facilities in the appropriate Part of the Standard".

Modern Fortran

The language defined by the twenty-first century standards, in particular because of its incorporation of object-oriented programming support and subsequently Coarray Fortran, is often referred to as "Modern Fortran", and the term is increasingly used in the literature.

Fortran 2003

Fortran 2003, officially published as ISO/IEC 1539-1:2004, was a major revision introducing many new features. A comprehensive summary of the new features of Fortran 2003 is available at the Fortran Working Group (ISO/IEC JTC1/SC22/WG5) official Web site.
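The IEEE TR facilities described above, later folded into the Fortran standard, are exposed through intrinsic modules. A minimal sketch (program and variable names invented for illustration):

```fortran
program ieee_demo
   use, intrinsic :: ieee_arithmetic   ! intrinsic module defined by the IEEE TR
   implicit none
   real :: x, nan

   x = 1.0
   nan = ieee_value(x, ieee_quiet_nan) ! portably generate a quiet NaN
   print *, ieee_is_nan(nan)           ! prints T

   ! Inquire whether this processor supports IEEE infinities for this kind
   print *, ieee_support_inf(x)
end program ieee_demo
```

Before these facilities existed, such tests required non-portable tricks (e.g. comparing a value against itself to detect a NaN), with no guarantee the compiler would not optimize them away.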
From that article, the major enhancements for this revision include:

- Derived type enhancements: parameterized derived types, improved control of accessibility, improved structure constructors, and finalizers
- Object-oriented programming support: type extension and inheritance, polymorphism, dynamic type allocation, and type-bound procedures, providing complete support for abstract data types
- Data manipulation enhancements: allocatable components (incorporating TR 15581), deferred type parameters, VOLATILE attribute, explicit type specification in array constructors and allocate statements, pointer enhancements, extended initialization expressions, and enhanced intrinsic procedures
- Input/output enhancements: asynchronous transfer, stream access, user specified transfer operations for derived types, user specified control of rounding during format conversions, named constants for preconnected units, the FLUSH statement, regularization of keywords, and access to error messages
- Procedure pointers
- Support for IEEE floating-point arithmetic and floating-point exception handling (incorporating TR 15580)
- Interoperability with the C programming language
- Support for international usage: access to ISO 10646 4-byte characters and choice of decimal or comma in numeric formatted input/output
- Enhanced integration with the host operating system: access to command-line arguments, environment variables, and processor error messages

An important supplement to Fortran 2003 was the ISO technical report TR-19767: Enhanced module facilities in Fortran. This report provided sub-modules, which make Fortran modules more similar to Modula-2 modules. They are similar to Ada private child sub-units. This allows the specification and implementation of a module to be expressed in separate program units, which improves packaging of large libraries, allows preservation of trade secrets while publishing definitive interfaces, and prevents compilation cascades.
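The submodule facility from TR-19767 lets a module publish only an interface while the implementation lives in a separate program unit that can be recompiled without forcing recompilation of the module's users. A minimal sketch, with invented module and procedure names:

```fortran
module physics                     ! publishes only the interface
   implicit none
   interface
      module function kinetic_energy(m, v) result(e)
         real, intent(in) :: m, v
         real :: e
      end function kinetic_energy
   end interface
end module physics

submodule (physics) physics_impl   ! implementation, compiled separately
contains
   module function kinetic_energy(m, v) result(e)
      real, intent(in) :: m, v
      real :: e
      e = 0.5 * m * v**2
   end function kinetic_energy
end submodule physics_impl
```

Changing the body of kinetic_energy inside the submodule leaves the module's interface untouched, which is how this facility avoids the compilation cascades of conventional modules.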
Fortran 2008

ISO/IEC 1539-1:2010, informally known as Fortran 2008, was approved in September 2010. As with Fortran 95, this is a minor upgrade, incorporating clarifications and corrections to Fortran 2003, as well as introducing some new capabilities. The new capabilities include:

- Sub-modules: additional structuring facilities for modules; supersedes ISO/IEC TR 19767:2005
- Coarray Fortran: a parallel execution model
- The DO CONCURRENT construct, for loop iterations with no interdependencies
- The CONTIGUOUS attribute, to specify storage layout restrictions
- The BLOCK construct, which can contain declarations of objects with construct scope
- Recursive allocatable components, as an alternative to recursive pointers in derived types

The Final Draft International Standard (FDIS) is available as document N1830.

A supplement to Fortran 2008 is the International Organization for Standardization (ISO) Technical Specification (TS) 29113 on Further Interoperability of Fortran with C, which was submitted to ISO in May 2012 for approval. The specification adds support for accessing the array descriptor from C and allows ignoring the type and rank of arguments.

Fortran 2018

The latest revision of the language (Fortran 2018) was earlier referred to as Fortran 2015. It was a significant revision and was released on November 28, 2018. Fortran 2018 incorporates two previously published Technical Specifications:

- ISO/IEC TS 29113:2012 Further Interoperability of Fortran with C
- ISO/IEC TS 18508:2015 Additional Parallel Features in Fortran

Additional changes and new features include support for ISO/IEC/IEEE 60559:2011 (the version of the IEEE floating-point standard before the latest minor revision IEEE 754-2019), hexadecimal input/output, IMPLICIT NONE enhancements, and other changes. Fortran 2018 deleted the arithmetic IF statement. It also deleted non-block DO constructs, that is, loops which do not end with an END DO or CONTINUE statement.
These had been an obsolescent part of the language since Fortran 90. New obsolescences are: COMMON and EQUIVALENCE statements and the BLOCK DATA program unit, labelled DO loops, specific names for intrinsic functions, and the FORALL statement and construct.

Fortran 2023

Fortran 2023 (ISO/IEC 1539-1:2023) was published in November 2023, and can be purchased from the ISO. Fortran 2023 is a minor extension of Fortran 2018 that focuses on correcting errors and omissions in Fortran 2018. It also adds some small features, including an enumerated type capability.

Language features

Main article: Fortran 95 language features

A full description of the Fortran language features brought by Fortran 95 is covered in the related article, Fortran 95 language features. The language versions defined by later standards are often referred to collectively as "Modern Fortran" and are described in the literature.

Science and engineering

Although a 1968 journal article by the authors of BASIC already described FORTRAN as "old-fashioned", programs have been written in Fortran for many decades and there is a vast body of Fortran software in daily use throughout the scientific and engineering communities. Jay Pasachoff wrote in 1984 that "physics and astronomy students simply have to learn FORTRAN. So much exists in FORTRAN that it seems unlikely that scientists will change to Pascal, Modula-2, or whatever." In 1993, Cecil E. Leith called FORTRAN the "mother tongue of scientific computing", adding that its replacement by any other possible language "may remain a forlorn hope".

It is the primary language for some of the most intensive supercomputing tasks, such as in astronomy, climate modeling, computational chemistry, computational economics, computational fluid dynamics, computational physics, data analysis, hydrological modeling, numerical linear algebra and numerical libraries (LAPACK, IMSL and NAG), optimization, satellite simulation, structural engineering, and weather prediction.
Many of the floating-point benchmarks used to gauge the performance of new computer processors, such as the floating-point components of the SPEC benchmarks (e.g., CFP2006, CFP2017), are written in Fortran. Math algorithms are well documented in Numerical Recipes.

Apart from this, more modern codes in computational science generally use large program libraries, such as METIS for graph partitioning, PETSc or Trilinos for linear algebra capabilities, deal.II or FEniCS for mesh and finite element support, and other generic libraries. Since the early 2000s, many of the widely used support libraries have also been implemented in C and, more recently, in C++. On the other hand, high-level languages such as the Wolfram Language, MATLAB, Python, and R have become popular in particular areas of computational science. Consequently, a growing fraction of scientific programs are also written in such higher-level scripting languages. For this reason, facilities for interoperation with C were added to Fortran 2003 and enhanced by the ISO/IEC technical specification 29113, which was incorporated into Fortran 2018 to allow more flexible interoperation with other programming languages.

Portability

Portability was a problem in the early days because there was no agreed-upon standard, not even IBM's reference manual, and computer companies vied to differentiate their offerings from others by providing incompatible features. Standards have improved portability. The 1966 standard provided a reference syntax and semantics, but vendors continued to provide incompatible extensions. Although careful programmers were coming to realize that use of incompatible extensions caused expensive portability problems, and were therefore using programs such as The PFORT Verifier, it was not until after the 1977 standard, when the National Bureau of Standards (now NIST) published FIPS PUB 69, that processors purchased by the U.S. Government were required to diagnose extensions of the standard.
Rather than offer two processors, essentially every compiler eventually had at least an option to diagnose extensions.

Incompatible extensions were not the only portability problem. For numerical calculations, it is important to take account of the characteristics of the arithmetic. This was addressed by Fox et al. in the context of the 1966 standard by the PORT library. The ideas therein became widely used, and were eventually incorporated into the 1990 standard by way of intrinsic inquiry functions. The widespread (now almost universal) adoption of the IEEE 754 standard for binary floating-point arithmetic has essentially removed this problem.

Access to the computing environment (e.g., the program's command line, environment variables, textual explanation of error conditions) remained a problem until it was addressed by the 2003 standard.

Large collections of library software that could be described as being loosely related to engineering and scientific calculations, such as graphics libraries, have been written in C, and therefore access to them presented a portability problem. This has been addressed by incorporation of C interoperability into the 2003 standard. It is now possible (and relatively easy) to write an entirely portable program in Fortran, even without recourse to a preprocessor.

Obsolete variants

Until the Fortran 66 standard was developed, each compiler supported its own variant of Fortran. Some were more divergent from the mainstream than others. The first Fortran compiler set a high standard of efficiency for compiled code. This goal made it difficult to create a compiler, so it was usually done by the computer manufacturers to support hardware sales. This left an important niche: compilers that were fast and provided good diagnostics for the programmer (often a student). Examples include Watfor, Watfiv, PUFFT, and on a smaller scale, FORGO, Wits Fortran, and Kingston Fortran 2.
Fortran 5 was marketed by Data General Corp from the early 1970s to the early 1980s, for the Nova, Eclipse, and MV line of computers. It had an optimizing compiler that was quite good for minicomputers of its time. The language most closely resembles FORTRAN 66.

FORTRAN V was distributed by Control Data Corporation in 1968 for the CDC 6600 series. The language was based upon FORTRAN IV. Univac also offered a compiler for the 1100 series known as FORTRAN V. A spinoff of Univac Fortran V was Athena FORTRAN.

Specific variants produced by the vendors of high-performance scientific computers (e.g., Burroughs, Control Data Corporation (CDC), Cray, Honeywell, IBM, Texas Instruments, and UNIVAC) added extensions to Fortran to take advantage of special hardware features such as instruction cache, CPU pipelines, and vector arrays. For example, one of IBM's FORTRAN compilers (H Extended IUP) had a level of optimization which reordered the machine code instructions to keep multiple internal arithmetic units busy simultaneously. Another example is CFD, a special variant of FORTRAN designed specifically for the ILLIAC IV supercomputer, running at NASA's Ames Research Center. IBM Research Labs also developed an extended FORTRAN-based language called VECTRAN for processing vectors and matrices.

Object-Oriented Fortran was an object-oriented extension of Fortran, in which data items can be grouped into objects, which can be instantiated and executed in parallel. It was available for Sun, Iris, iPSC, and nCUBE, but is no longer supported.

Such machine-specific extensions have either disappeared over time or have had elements incorporated into the main standards. The major remaining extension is OpenMP, which is a cross-platform extension for shared memory programming. One new extension, Coarray Fortran, is intended to support parallel programming.
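Coarray Fortran, mentioned above, runs multiple images (copies) of the program, each holding its own instance of every coarray; square brackets address another image's copy. A minimal sketch (variable names invented for illustration):

```fortran
program coarray_demo
   implicit none
   integer :: total[*]        ! one copy ("coarray") of total per image
   integer :: i

   total = this_image()       ! each image stores its own image number

   sync all                   ! barrier before cross-image reads

   if (this_image() == 1) then
      ! image 1 gathers the values held by all other images
      do i = 2, num_images()
         total = total + total[i]
      end do
      print *, 'sum over images:', total
   end if
end program coarray_demo
```

Built with a coarray-capable compiler (for example gfortran with OpenCoarrays, or Intel Fortran with its -coarray option), image 1 would print the sum 1 + 2 + ... + n when run on n images.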
FOR TRANSIT was the name of a reduced version of the IBM 704 FORTRAN language, which was implemented for the IBM 650, using a translator program developed at Carnegie in the late 1950s. The following comment appears in the IBM Reference Manual (FOR TRANSIT Automatic Coding System C28-4038, Copyright 1957, 1959 by IBM):

The FORTRAN system was designed for a more complex machine than the 650, and consequently some of the 32 statements found in the FORTRAN Programmer's Reference Manual are not acceptable to the FOR TRANSIT system. In addition, certain restrictions to the FORTRAN language have been added. However, none of these restrictions make a source program written for FOR TRANSIT incompatible with the FORTRAN system for the 704.

The permissible statements were:

- Arithmetic assignment statements, e.g., a = b
- GO TO n
- GO TO (n1, n2, ..., nm), i
- IF (a) n1, n2, n3
- PAUSE
- STOP
- DO n i = m1, m2
- CONTINUE
- END
- READ n, list
- PUNCH n, list
- DIMENSION V, V, V, ...
- EQUIVALENCE (a,b,c), (d,c), ...

Up to ten subroutines could be used in one program. FOR TRANSIT statements were limited to columns 7 through 56, only. Punched cards were used for input and output on the IBM 650. Three passes were required to translate source code to the "IT" language, then to compile the IT statements into SOAP assembly language, and finally to produce the object program, which could then be loaded into the machine to run the program (using punched cards for data input, and outputting results onto punched cards).

Two versions existed for the 650s with a 2000 word memory drum: FOR TRANSIT I (S) and FOR TRANSIT II, the latter for machines equipped with indexing registers and automatic floating-point decimal (bi-quinary) arithmetic. Appendix A of the manual included wiring diagrams for the IBM 533 card reader/punch control panel.
Fortran-based languages

Prior to FORTRAN 77, a number of preprocessors were commonly used to provide a friendlier language, with the advantage that the preprocessed code could be compiled on any machine with a standard FORTRAN compiler. These preprocessors would typically support structured programming, variable names longer than six characters, additional data types, conditional compilation, and even macro capabilities. Popular preprocessors included EFL, FLECS, iftran, MORTRAN, SFtran, S-Fortran, Ratfor, and Ratfiv. EFL, Ratfor and Ratfiv, for example, implemented C-like languages, outputting preprocessed code in standard FORTRAN 66. The PFORT preprocessor was often used to verify that code conformed to a portable subset of the language. Despite advances in the Fortran language, preprocessors continue to be used for conditional compilation and macro substitution.

One of the earliest versions of FORTRAN, introduced in the '60s, was popularly used in colleges and universities. Developed, supported, and distributed by the University of Waterloo, WATFOR was based largely on FORTRAN IV. A student using WATFOR could submit their batch FORTRAN job and, if there were no syntax errors, the program would move straight to execution. This simplification allowed students to concentrate on their program's syntax and semantics, or execution logic flow, rather than dealing with submission Job Control Language (JCL), the compile/link-edit/execution successive process(es), or other complexities of the mainframe/minicomputer environment. A downside to this simplified environment was that WATFOR was not a good choice for programmers needing the expanded abilities of their host processor(s); e.g., WATFOR typically had very limited access to I/O devices. WATFOR was succeeded by WATFIV and its later versions.
program; s=0 i=1,n; s=s+1; stop i; s='s' Stop (line programming) LRLTRAN was developed at the Lawrence Radiation Laboratory to provide support for vector arithmetic and dynamic storage, among other extensions to support systems programming. The distribution included the Livermore Time Sharing System (LTSS) operating system. The Fortran-95 Standard includes an optional Part 3 which defines an optional conditional compilation capability. This capability is often referred to as "CoCo". Many Fortran compilers have integrated subsets of the C preprocessor into their systems. SIMSCRIPT is an application specific Fortran preprocessor for modeling and simulating large discrete systems. The F programming language was designed to be a clean subset of Fortran 95 that attempted to remove the redundant, unstructured, and deprecated features of Fortran, such as the EQUIVALENCE statement. F retains the array features added in Fortran 90, and removes control statements that were made obsolete by structured programming constructs added to both FORTRAN 77 and Fortran 90. F is described by its creators as "a compiled, structured, array programming language especially well suited to education and scientific computing". Essential Lahey Fortran 90 (ELF90) was a similar subset. Lahey and Fujitsu teamed up to create Fortran for the Microsoft .NET Framework. Silverfrost FTN95 is also capable of creating .NET code. Code examples For more details, see Wikibooks:Fortran/Fortran examples The following program illustrates dynamic memory allocation and array-based operations, two features introduced with Fortran 90. Particularly noteworthy is the absence of DO loops and IF/THEN statements in manipulating the array; mathematical operations are applied to the array as a whole. Also apparent is the use of descriptive variable names and general code formatting that conform with contemporary programming style. This example computes an average over data entered interactively. program average ! 
! Read in some numbers and take the average
! As written, if there are no data points, an average of zero is returned
! While this may not be desired behavior, it keeps this example simple
implicit none

real, allocatable :: points(:)
integer           :: number_of_points
real              :: average_points, positive_average, negative_average

average_points = 0.
positive_average = 0.
negative_average = 0.

write (*,*) "Input number of points to average:"
read  (*,*) number_of_points

allocate (points(number_of_points))

write (*,*) "Enter the points to average:"
read  (*,*) points

! Take the average by summing points and dividing by number_of_points
if (number_of_points > 0) average_points = sum(points) / number_of_points

! Now form average over positive and negative points only
if (count(points > 0.) > 0) positive_average = sum(points, points > 0.) / count(points > 0.)
if (count(points < 0.) > 0) negative_average = sum(points, points < 0.) / count(points < 0.)

! Print result to terminal stdout unit 6
write (*,'(a,g12.4)') 'Average = ', average_points
write (*,'(a,g12.4)') 'Average of positive points = ', positive_average
write (*,'(a,g12.4)') 'Average of negative points = ', negative_average

deallocate (points)

end program average

Humor

During the same FORTRAN standards committee meeting at which the name "FORTRAN 77" was chosen, a satirical technical proposal was incorporated into the official distribution bearing the title "Letter O Considered Harmful". This proposal purported to address the confusion that sometimes arises between the letter "O" and the numeral zero by eliminating the letter from allowable variable names. However, the method proposed was to eliminate the letter from the character set entirely (thereby retaining 48 as the number of lexical characters, which the colon had increased to 49). This was considered beneficial in that it would promote structured programming, by making it impossible to use the notorious GO TO statement as before.
(Troublesome FORMAT statements would also be eliminated.) It was noted that this "might invalidate some existing programs" but that most of these "probably were non-conforming, anyway".

When X3J3 debated whether the minimum trip count for a DO loop should be zero or one in Fortran 77, Loren Meissner suggested a minimum trip count of two, reasoning (tongue-in-cheek) that if it were less than two, there would be no reason for a loop.

When assumed-length arrays were being added, there was a dispute as to the appropriate character to separate upper and lower bounds. In a comment examining these arguments, Walt Brainerd penned an article entitled "Astronomy vs. Gastroenterology" because some proponents had suggested using the star or asterisk ("*"), while others favored the colon (":").

Variable names beginning with the letters I–N had a default type of integer, while variables starting with any other letter defaulted to real, although programmers could override the defaults with an explicit declaration. This led to the joke: "In FORTRAN, GOD is REAL (unless declared INTEGER)."
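The implicit-typing rule behind that joke is simple enough to state as a one-line function. The helper below is purely illustrative (it is not part of any Fortran tooling) and models only the default rule, ignoring explicit declarations and IMPLICIT statements.

```python
def implicit_type(name):
    """FORTRAN's default implicit typing: names whose first letter is
    in I-N are INTEGER; all other names default to REAL."""
    return "INTEGER" if name[0].upper() in "IJKLMN" else "REAL"

print(implicit_type("INDEX"))  # INTEGER
print(implicit_type("GOD"))    # REAL (unless declared INTEGER)
```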
Source: What did the DoD think Fortran lacked? (Retrocomputing Stack Exchange, 2020-08-27)
https://retrocomputing.stackexchange.com/questions/15988/what-did-the-dod-think-fortran-lacked
The meeting that defined the requirements of the new language took place on May 28–29, 1959. Charles Phillips prepared a memo several months later summarizing the decisions made at that meeting. Its listing of requirements is reprinted on page 201 of the ACM's History of Programming Languages.

a. Majority of group supported maximum use of simple English language; even though some participants suggested there might be advantage from using mathematical symbolism.
b. A minority suggested we steer away from problem-oriented language because English is not a panacea as it cannot be manipulated as algebraic expressions can.
c. The need is for a programming language that is easier to use, even if somewhat less powerful.
d. We need to broaden the base of those who can state problems to computers.
e. The [Common Business Language] should not be biased by present compiler problems.

The committee did not consider FORTRAN as an alternative. According to Jean E. Sammet, who was chair (she describes herself as the "chairman") of two of the committees that developed COBOL and served on a third, the major inspirations were FLOW-MATIC (developed by Grace Hopper and others for Remington-Rand Univac), AIMACO (developed by the Air Materiel Command based on Hopper's work, and described by Sammet as "a minor modification of FLOW-MATIC"), and COMTRAN (Commercial Translator, which at that time existed as a manual at IBM and had never been implemented). Sammet claims that FACT, developed at Honeywell, had far less influence on COBOL than some people believed. The entire chapter I link to has extensive notes that Sammet took at the time from the committee that developed COBOL, and the decisions it made. She makes the particularly interesting admission on page 221: I felt there was a strong anti-IBM bias in this committee from me, and from some (but certainly not all) of the others.
Since I was not working for IBM at the time, I can freely (although not with pride) admit that in some cases suggestions or decisions were made on the basis of doing things differently from how IBM did it. For example, we felt that the verb for loop control should not be called DO because that was how FORTRAN did it.

Sammet lists among the ideas COBOL took from FLOW-MATIC: "It used full data names rather than short symbolic names (as in FORTRAN)", e.g. SOCIAL-SECUR instead of SOCSEC, and used English words as commands. Less cosmetically, it allowed fields to be packed into a data word. She says, "Note that Fortran assumes that every number is in a single machine word." It separated data definitions from instructions, which she says became so commonplace that it is difficult to appreciate what a conceptual breakthrough it was. Among the ideas she lists as coming from COMTRAN are nested data structures, expressions, and conditionals. It was controversial at that time to allow mathematical formulas and even Boolean expressions, as some committee members believed those were only needed in a few edge cases. She also states that IAL, which developed into ALGOL, had a significant influence, by convincing the committee not to follow its example and instead allow in its source code only characters that actually exist.

FORTRAN was, at the time (*1), lacking almost everything, from string handling to all I/O besides reading numbers from cards or tape. Heck, not even integer size was guaranteed across machines. There was no real way of structuring or flow control besides GOTO; even subroutines/functions were only integrated a year before, with FORTRAN II. For the most part, FORTRAN is a symbolic assembler with a focus on mathematics; that made it easy to write formulas, but not much else. But real-world computing is about data handling and I/O. This is especially true for a huge organization such as the US military, represented by its bureaucratic arm, the DoD.
Being able to easily write down some complex calculation is nice, but useless in an environment where it's about managing inventory, ordering supplies, calculating pay, and having all of that delivered on time. An army is like a huge corporation, not a scientific institute, and the task at hand is data processing, not number crunching.

Data processing is vastly different from number crunching; it's a completely different world. It's about the classic "read card, process item, write output" loop, the stuff that got automated with punch cards. It is the core reason why the /360 has survived until today as a successful architecture: its instruction set is perfectly tailored to shovel data, being built to support these principles. The fact that IBM tried to make it an all-around (360-degree) architecture by including floating point and even trying to make it fit for process control didn't really matter in the long run; others were much better at either.

And the DoD did need data processing to fight the wars at the time (Korea had just ended and Vietnam was coming up), and a language to support writing data-processing programs in a machine-independent way. That's why COBOL was developed as a result of the conference mentioned above.

*1 - FORTRAN has improved a lot since then; still, its basic nature didn't really change.
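The "read card, process item, write output" loop the answer describes can be sketched in a few lines. The fixed-width record layout, the field positions, and the sample inventory data below are invented for illustration; the point is only the shape of the loop.

```python
import io

# Fixed-width "card" records: columns 1-8 item name, columns 9-12 quantity.
cards = io.StringIO(
    "RATIONS 0040\n"
    "BOOTS   0012\n"
    "RATIONS 0025\n"
)

totals = {}
for card in cards:                     # read card
    item = card[0:8].strip()           # process item: slice fixed columns
    qty = int(card[8:12])
    totals[item] = totals.get(item, 0) + qty

for item, qty in sorted(totals.items()):   # write output, one line per item
    print(f"{item:<8}{qty:04d}")
```

Column slicing rather than parsing is the essence of punched-card data processing, and it is exactly the style of work COBOL's data division (and the /360's storage-to-storage instructions) was built to support.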
Source: Chapter 2
http://groups.umd.umich.edu/cis/course.des/cis400/maxim/lectures/chp2.htm
Evolution of the Major Programming Languages

ZUSE'S PLANKALKÜL
- Plankalkül is the name of a programming language written by the German scientist Konrad Zuse between 1936 and 1945
- This language was never implemented
- Its name means "program calculus"

MINIMAL HARDWARE PROGRAMMING: PSEUDOCODE
- Deficiencies in machine code and assembly language code led to the development of somewhat higher-level languages
- Short Code - developed by John Mauchly in 1949 - Implemented with a pure interpreter - Simplified the programming process; however, it was approximately 50 times slower than machine code
- Speedcoding - developed by John Backus in 1954 - Extended machine languages to include floating-point operations - Backus claimed that problems that could take two weeks to program in machine code could be programmed in a few hours using Speedcoding
- The UNIVAC "compiling system" - developed by a team led by Grace Hopper between 1951 and 1953 - It made machine code source programs much shorter; however, it was still quite primitive

THE IBM 704 AND FORTRAN
- Historical Background - The first widely accepted high-level language was FORTRAN - FORTRAN was developed in large measure for the capabilities of the IBM 704 system - The IBM 704 system was first introduced in 1954
- Design Process - Plans for FORTRAN were actually begun before the announcement of the IBM 704 system - The report titled "The IBM Mathematical FORmula TRANslating System" described the first version of FORTRAN - It stated that FORTRAN could provide the efficiency of hand-coded programs and the ease of programming of the interpretive pseudocode systems - It also stated that it would eliminate coding errors and debugging errors
- FORTRAN I Overview - All of FORTRAN I's control statements were based on 704 instructions - There were no data-typing statements; instead, variables whose names began with I, J, K, L, M, or N were implicitly integer type, and all others were implicitly floating-point type - FORTRAN's claim that the machine code produced by the compiler was as efficient as what could be produced by hand was nearly achieved
- FORTRAN II Overview - Introduced in the spring of 1958 - Fixed a lot of the bugs in FORTRAN I and added some significant features, such as independent compilation of subroutines
- FORTRAN IV, FORTRAN 77, and FORTRAN 90 - FORTRAN III, although developed, was never a big success - FORTRAN IV became one of the most widely distributed programming languages until 1978 - FORTRAN IV was a big improvement over FORTRAN II because of additions such as explicit type declaration for variables, a logical IF construct, and the capability of passing subprograms as parameters - FORTRAN 77, introduced in 1977, retains most of the FORTRAN IV features and adds character string handling, logical loop control statements, and an IF with an optional ELSE clause - FORTRAN 90, the latest version of FORTRAN, is dramatically different from FORTRAN 77 - It supplies a collection of built-in functions for array operations - Arrays can be dynamically allocated and deallocated on command - New control statements, such as CASE, EXIT, and CYCLE, are added - Recursion is now possible

FUNCTIONAL PROGRAMMING: LISP
- The first functional programming language was invented to provide language features for list processing, which grew out of applications in artificial intelligence
- The Beginnings of Artificial Intelligence and List Processing - Interest in AI appeared in the mid-1950s - This interest came from the study of linguistics, psychology, and mathematics - Linguistics because of the interest in the processing of natural language - Psychology because of the interest in modeling human information storage and retrieval, along with other brain processes - Mathematics because of the interest in mechanizing certain intelligent processes - The concept of list processing was introduced in 1956 by Allen Newell, J.C. Shaw, and Herbert Simon - A few languages were developed and implemented for list processing but none became really popular until the development of LISP
- LISP Design Process - Developed by John McCarthy and Marvin Minsky at MIT, because of their dislike for the other languages developed for AI
- Language Overview - There are only two data structures: atoms and lists - Atoms are either symbols or numeric literals - Lists are specified by delimiting their elements with parentheses and may be simple or nested lists - A list is stored internally as a singly linked list structure
- Processes in Functional Programming - All computation in a functional program is accomplished by applying functions to arguments - Iterative processes can be specified with recursive function calls, which means no loops
- The Syntax of LISP - Program code and data have exactly the same form
- Evaluation - LISP is still the most widely used language for AI - In the 1970s and 1980s many different dialects of LISP were developed, which led to the problem of portability - In order to solve LISP's portability problem, COMMON LISP was developed as a standard version
- Two Descendants of LISP - These two dialects are Scheme and COMMON LISP - Scheme was developed in 1975 - It is relatively small in size and uses static scoping exclusively - Scheme functions can be values of expressions and elements of lists; they can be assigned to variables, passed as parameters, and returned as values. These features were not available in the early versions of LISP - COMMON LISP was developed in 1984 - It takes the features of a few dialects and Scheme and combines them into one language; while Scheme is a relatively small language, COMMON LISP is a relatively large language and uses dynamic as well as static scoping
- Related Languages - Three related functional programming languages are ML, Miranda, and Haskell - MetaLanguage, or ML, was designed in the 1980s by Robin Milner - ML is a functional language which supports imperative programming - Miranda was designed by David Turner in the 1980s as well; it is based on ML, SASL, and KRC - Haskell is based on Miranda and was designed by Hudak and Fasel in 1992

THE FIRST STEP TOWARD SOPHISTICATION: ALGOL 60
- Historical Background - Efforts to design a universal language inspired the development of the programming language ALGOL 60 - In the late 1950s, all of the languages that had been developed thus far were for single architectures only - Two user groups, SHARE and USE, showed an interest in forming a committee through the ACM organization in order to study and recommend the design and implementation of a universal programming language - Another group in Europe (GAMM) was also interested, and in 1958 the ACM committee along with GAMM officially agreed to a joint language design project - FORTRAN was not considered because of its sole ownership by IBM
- Early Design Process - The syntax of the language should be close to standard mathematical notation - It should be possible to use the language for the description of computing processes in publications - And programs in the new language must be mechanically translatable into machine language
- ALGOL 58 Overview - The first language designed in this project was ALGOL 58 - ALGOL comes from the original name of the language, the International ALGOrithmic Language - Many of the features in ALGOL 58 were derived from FORTRAN - It generalized FORTRAN's features and added several new constructs and concepts
- ALGOL 60 Design Process - ALGOL 58 was furiously debated, resulting in modifications and additions to the language and the development of ALGOL 60
- ALGOL 60 Overview - ALGOL 60 is the result of the modifications made to ALGOL 58 - Introduces the concept of block structure, which allows parts of programs to be localized - Passing parameters by means of pass by value and pass by name is introduced - Procedures are allowed to be recursive - Stack-dynamic arrays are now allowed
- ALGOL 60 Evaluation - ALGOL 60 succeeded in becoming the only acceptable formal means of communicating algorithms - All imperative languages formed after 1960 are either direct or indirect descendants of ALGOL 60 - It was the first language whose syntax was formally described - Unfortunately, ALGOL 60 never achieved widespread use, because it was too flexible and it lacked input and output statements

COMPUTERIZING BUSINESS RECORDS: COBOL
- Although COBOL has been used more than any other programming language, it has had little effect on the design of any other languages, with the exception of PL/I
- Historical Background - Similar to ALGOL, COBOL was designed by a committee - Three other languages for business applications that existed before COBOL were FLOW-MATIC, AIMACO, and COMTRAN
- COBOL Design Process - The biggest concern regarding this new application language was that it be easy to use, even at the expense of being less powerful - The language specifications for COBOL were published in 1960
- Evaluation - COBOL originated a number of concepts, such as constructs for macros and implementation of hierarchical data structures, and allowed connotative names - It was the first language whose use was mandated by the Department of Defense - The poor performance of the early compilers made COBOL expensive to use; fortunately, with the advent of better compiler designs and the mandate of the Defense Department, COBOL became very popular

THE BEGINNINGS OF TIME SHARING: BASIC
- Similar to COBOL, BASIC has had widespread use; however, it has gotten little respect
- Design Process - BASIC was designed at Dartmouth College in the 1960s by John Kemeny and Thomas Kurtz - The goals of the BASIC system were to make it easy for non-science students to learn, be pleasant and friendly, provide fast turnaround for homework, allow free and private access, and consider user time more important than computer time - The combination of the second, third, and fourth goals mentioned above led to the timeshared aspect of BASIC
- Language Overview - The original version was very small, having only fourteen different statements, and was not interactive, for there was no way to get user input from the terminal - Overall, it was very limited, making it very easy to learn
- Evaluation - BASIC was the first widely used method of remote access to a computer - Most of its design came from FORTRAN, with some minor influence from ALGOL 60 - Reasons for BASIC's success are probably the ease with which it can be learned and implemented - Two contemporary versions of BASIC are QuickBasic and Visual Basic

EVERYTHING FOR EVERYBODY: PL/I
- PL/I represents the first large-scale attempt to design a language that could be used for a broad spectrum of application areas
- Historical Background - Developed as an IBM product - Designed to replace FORTRAN, COBOL, LISP, and the systems applications of assembly language
- Design Process - The first version of PL/I was originally named FORTRAN VI because the initial design was just an extension of FORTRAN IV - The design along with the name was quickly dropped and the new design was called NPL, for New Programming Language - The name was changed to PL/I to avoid confusing NPL with the National Physical Laboratory in England, since the language was being developed in England
- Language Overview - PL/I included the best of ALGOL 60 (recursion and block structure), FORTRAN IV (separate compilation with communication through global data), and COBOL (data structures, input/output, and report-generating facilities), along with a few new constructs - PL/I was the first language to allow programs to create concurrently executing tasks, to detect and handle 23 different types of exceptions, to allow procedures to be used recursively, to include pointers as a data type, and to allow references to cross sections of arrays
- Evaluation - The problem with PL/I is its complexity, due to its large size - It also suffered from what are now considered to be poorly designed constructs - However, it did get significant use in both business and scientific applications

TWO EARLY DYNAMIC LANGUAGES: APL AND SNOBOL
- Neither of these languages is based on any previous language, and neither has had much effect on any later languages
- The only similarity between APL and SNOBOL is their dynamic typing and dynamic storage allocation
- Origins and Characteristics of APL - Kenneth E. Iverson designed APL in 1960 - It was intended to be a vehicle for describing computer architecture, not a programming language - It has many powerful operators that allow arrays to be manipulated as if they were scalar variables; however, this creates a problem for implementers of the language - Although it is not widely used, it is still in use today
- Origins and Characteristics of SNOBOL - Designed by D.J. Farber, R.E. Griswold, and I.P. Polonsky in the early 1960s - It was designed specifically for text processing - However, it is rarely used nowadays because it is slower than some other languages

THE BEGINNINGS OF DATA ABSTRACTION: SIMULA 67
- Design Process - Kristen Nygaard and Ole-Johan Dahl developed Simula I between 1962 and 1964 - It was designed exclusively for system simulation
- Language Overview - SIMULA 67 is an extension of ALGOL 60 - It supported coroutines using the class construct, which is important because the idea of data abstraction starts here

ORTHOGONAL DESIGN: ALGOL 68
- Design Process - Introduces the design criterion of orthogonality, which results in features such as the inclusion of user-defined data types and implicit heap-dynamic arrays
- Evaluation - Introduces a significant number of features not previously used - Although it may be contrasted with PL/I, PL/I gained far greater acceptance than ALGOL 68

SOME IMPORTANT DESCENDANTS OF THE ALGOLS
- All imperative languages, including the object-oriented languages, designed since 1960 owe some of their design to ALGOL 60
- Pascal - Designed by Niklaus Wirth - Highly used for teaching programming; because of this it lacks some features, such as separate compilation and passing an array of variable length
- C - Originally designed for systems programming by Dennis Ritchie in 1972 - Because of its adequate control structures and data-structuring facilities, C is adaptable to many application areas - A reason for both liking and disliking C is its lack of complete type checking - C tends to be a very flexible language
- Modula-2 - Designed in 1976 by Niklaus Wirth (who also designed Pascal) - Distinguishing features include modules, which provide support for abstract data types, procedures as types, low-level facilities for systems programming, and coroutines
- Modula-3 - Based on Modula-2 and Modula-2+ - To Modula-2 it adds classes and objects for support of object-oriented programming, exception handling, garbage collection, and support for concurrency
- Oberon - Loosely based on Modula-2 and written by Niklaus Wirth - Although many features were added to Modula-2 to come up with Oberon, many were subtracted also - The main feature added is type extension, which supports object-oriented programming - Among those removed are variant records, opaque types, enumeration types, subrange types, the CARDINAL type, non-integer array indexes, the WITH statement, and the FOR statement
- Delphi - A hybrid language, similar to C++ in that it was created by adding object-oriented support, along with other things, to an existing imperative language - It is derived from Pascal, therefore making it much more elegant and safe than C++

PROGRAMMING BASED ON LOGIC: PROLOG
- The use of a formal logic notation to communicate computational processes to a computer
- Design Process - Alain Colmerauer, Philippe Roussel, and Robert Kowalski developed the design of Prolog - Prolog uses a method for specifying predicate calculus propositions and a restricted form of resolution
- Language Overview - Prolog programs consist of a collection of statements, or facts - The most common use of Prolog is as a kind of database
- Evaluation - There are two major reasons why logic programming has not become widespread - The first is that it has been proven to be highly inefficient - The second is that it has been shown to be effective for only a few areas of application, such as certain kinds of database management and some areas of AI

HISTORY'S LARGEST DESIGN EFFORT: ADA
- Ada was originally developed for the Department of Defense
- Historical Background - The Army, Navy, and Air Force all proposed the development of a high-level language for embedded systems in 1974 as an attempt to standardize their own embedded systems
- Design Process - The committee assigned to this task was responsible for identifying requirements for a new Department of Defense high-level language, evaluating the existing languages to determine whether there was a viable candidate, and recommending the adoption or implementation of a minimal set of programming languages - The resulting language went through multiple phases and was named Ada
- Language Overview - There are four major features of the Ada language - Packages provide the means for encapsulation of data objects - It includes extensive facilities for exception handling - It allows program units to be generic - It provides for concurrent execution of special program units named tasks
- Evaluation - The Ada language embodies most of the concepts of software engineering and language design - The development of a compiler was a difficult task, and only almost four years after the design of the Ada language was completed did an Ada compiler appear

OBJECT-ORIENTED PROGRAMMING: SMALLTALK
- Design Process - The concepts that led to the development of Smalltalk came out of the Ph.D. dissertation work of Alan Kay
- Language Overview - The program units of Smalltalk are objects - All computing in Smalltalk is done in the same fashion: passing messages to objects in order to invoke one of the methods associated with that object

COMBINING IMPERATIVE AND OBJECT-ORIENTED FEATURES: C++
- C++ uses C as a base on which to support many of the concepts introduced by Smalltalk
- Design Process - Modifications to C included the addition of function parameter type checking and conversion, and classes - Later additions included inline functions, default parameters, and overloading of assignment operators - With the above additions to C came the name "C with Classes" - Not until the inclusion of virtual functions was the name changed to C++
- Language Overview - C++ provides a collection of predefined classes, along with the possibility of user-defined classes - Operators in C++ may be overloaded, meaning that the user may create new meanings for existing operators - Dynamic binding is provided by virtual class functions - Both functions and classes can be templated, which means that they can be parameterized - It also includes exception handling
- Evaluation - C++ has become extremely popular - On the downside, C++ is extremely large and suffers some drawbacks similar to PL/I
- A Related Language: Eiffel - Like C++, Eiffel combines both imperative and object-oriented features

PROGRAMMING THE WORLD WIDE WEB: JAVA
- Based on C++; however, numerous constructs have been removed, some changed, and a few others added
- The resulting language provides the power and flexibility of C++; however, it is smaller, simpler, and safer
- Design Process - Java was designed for an application for which there appeared to be no satisfactory existing language - It started out as a programming language for embedded consumer electronics - Later it proved to be a useful tool for programming for the web
- Language Overview - Java employs both types and classes - It does not have pointers, although its reference types provide some of the capabilities of pointers - It uses a primitive Boolean type mainly for the control expressions of its control statements - There are no record, union, or enumeration types - It supports only single inheritance - It provides garbage collection
- Evaluation - The designers of Java eliminated many of the features that are considered unsafe in C++ - Java is still considered a complex language, and its lack of multiple inheritance is said to lead to some peculiar designs - The use of Java has increased much faster than the use of any other language
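ALGOL 60's pass by name, mentioned in the notes above, re-evaluates the argument expression at every use rather than once at the call. The classic exploitation of this ("Jensen's device") can be sketched in Python with zero-argument closures standing in for by-name arguments; the function and variable names below are illustrative, not any standard API.

```python
def sum_by_name(env, var, lo, hi, term):
    """ALGOL-style summation with a by-name term: `term` is a thunk
    that is re-evaluated on every iteration, so it observes each new
    value of the loop variable stored in `env`."""
    total = 0
    for k in range(lo, hi + 1):
        env[var] = k        # assign to the "name" the caller passed
        total += term()     # re-evaluate the argument expression
    return total

env = {"i": 0}
# Rough equivalent of ALGOL's  sum(i, 1, 5, i*i)  under call by name:
result = sum_by_name(env, "i", 1, 5, lambda: env["i"] ** 2)
print(result)  # 1 + 4 + 9 + 16 + 25 = 55
```

Under pass by value the `i*i` argument would be evaluated once before the call, so one general routine could never compute a different sum for each expression; re-evaluation at every use is what made the device work.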
Source: https://doi.org/10.1016/B978-0-12-745040-7.50008-9
Source: ALGOL 60 at 60: The greatest computer language you've never used and grandaddy of the programming family tree • The Register Forums
https://forums.theregister.com/forum/all/2020/05/15/algol_60_at_60/
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/childcatcher_48.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/anonymous_48.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/boffin_48.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/pint_48.png", "https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png", 
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/anonymous_48.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/coffee_keyboard_48.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", 
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/facepalm_48.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/pint_48.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/devil_48.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/anonymous_48.png", 
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/anonymous_48.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/gold_badge.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/bronze_badge.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", 
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/anonymous_48.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/anonymous_48.png", "https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/anonymous_48.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/anonymous_48.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/anonymous_48.png", 
"https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/bronze_badge.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/pint_48.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/thumb_up_48.png", "https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", 
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/anonymous_48.png", "https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/bronze_badge.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/anonymous_48.png", "https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png", 
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/anonymous_48.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", 
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/unhappy_48.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/happy_48.png", "https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/windows_48.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/anonymous_48.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/angel_48.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png", 
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/anonymous_48.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/go_48.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/thumb_up_48.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/anonymous_48.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", 
"https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/bronze_badge.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/bronze_badge.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/anonymous_48.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/thumb_up_48.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/pint_48.png", "https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png", "https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/bronze_badge.png", 
"https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/bronze_badge.png", "https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/bronze_badge.png", "https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/bronze_badge.png", "https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/anonymous_32.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/thumb_up_32.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/thumb_down_32.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/stop_32.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/go_32.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/wtf_32.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/fail_32.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/happy_32.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/meh_32.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/unhappy_32.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/angel_32.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/devil_32.png", 
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/flame_32.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/mushroom_32.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/megaphone_32.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/trollface_32.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/facepalm_32.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/holmes_32.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/headmaster_32.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/coat_32.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/joke_32.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/pint_32.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/coffee_keyboard_32.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/alert_32.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/black_helicopters_32.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/boffin_32.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/pirate_32.png", 
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/alien_32.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/terminator_32.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/big_brother_32.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/childcatcher_32.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/linux_32.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/gimp_32.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/windows_32.png", "https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/it_angle_32.png", "https://forums.theregister.com/design_picker/d518b499f8a6e2c65d4d8c49aca8299d54b03012/graphics/icon/vulture_white.png", "https://forums.theregister.com/design_picker/d2e337b97204af4aa34dda04c4e5d56d954b216f/graphics/icon/footer_mob_nav_arrow_black.svg", "https://forums.theregister.com/design_picker/d2e337b97204af4aa34dda04c4e5d56d954b216f/graphics/icon/footer_mob_nav_arrow_black.svg", "https://forums.theregister.com/design_picker/d2e337b97204af4aa34dda04c4e5d56d954b216f/graphics/icon/footer_mob_nav_arrow_black.svg", "https://forums.theregister.com/design_picker/d2e337b97204af4aa34dda04c4e5d56d954b216f/graphics/std/sitpublogo_2022.png", "https://forums.theregister.com/Design/graphics/std/transparent_pixel.png" ]
[]
[]
[ "" ]
null
[ "Anonymous Coward", "Jason Bloomberg", "Ron Martin", "Ken Moorhouse", "disgruntled yank", "Primus Secundus Tertius", "Doctor Syntax", "shawn.grinter", "Graham Cobb", "John Sturdy" ]
2020-05-15T00:00:00
en
/design_picker/13249a2e80709c7ff2e57dd3d49801cd534f2094/graphics/favicons/favicon.ico
https://forums.theregister.com/forum/all/2020/05/15/algol_60_at_60/
No love for CORAL 66? Back in the day, in order to get a military application running, I had to take the assembler output of a Coral 66 program and basically rewrite it completely in proper assembler, because the compiler wasn't merely not optimised, it seemed positively designed to waste as many CPU cycles as possible. It used the 9989 microprocessor, and completely ignored the 9989's register windowing system, instead creating boilerplate functions every single time. Kids today...I'm the one feeding the punched tape repeatedly through the reader to get the floating point libraries to work. Arguments over bracket syntax, tabs versus space and to semicolon or not are really a sign that nowadays there is very little to complain about. It started my career move... We had an Elliott 803 at Rugby College (later absorbed into Lanchester Poly). Although I was studying Applied Physics, the computer part of the course fired me up and I went into computing from that point on. I remember the console speaker that, although abused to play music, did give you a sense of what your program was doing. Later on, in my 6800/8008/8080 hobby days, I used a transistor radio for the same thing. There were two no-no's: The first was to remember how little memory you had to play with so large arrays were not possible and, if attempted, resulted in a subscript overflow message (iirc - SUBSCROFLO). The second was to make sure your plotter programmes completed. The plotter involved an extra paper tape load for the operator. If your plot failed then the whole machine had to be restarted. The plotting code was probably an early form of overlay. Before we had magnetic tape installed the paper tape reader was something to behold. The output from the reader had to be caught in a basket as the speed was so high. The computer operator was a very smart young lady too.... Happy days. ;-) I've used Algol-60 at school I did my AO-Level Computer Studies in 1983 it 1983 or 1982 (can't remember!) 
and I did my project in Algol-60. Because that's what we ran on our Research Machines 380Zs. It was a program to solve the N-Queens problem, using the process of making different permutations of the numbers 1-N, and I did it using recursion (because I could). Little did I realise at the time that it was really out of date, and there's no reason we couldn't have done it in Pascal or C. But that's all we had.

Ugh! PASCAL

We used it in school (UMASS/Amherst) as a teaching language. Too "wordy" for me, especially the IO statements. But it did get the job done. When I started working, it was at Data General, who had an Algol-like language called DG/L, which I absolutely loved. It was my language of choice for little utility programs on our AOS and AOS/VS systems. Then, of course, came the Sun workstations, with UNIX and C. Game over.

Bonus for the CALCOMP drum plotters. I worked part time at the computer center in school, we had one (3 or 4 foot wide). The number of aborted plots due to pen failure was astounding, as was the pen budget!

Algol 68 is not ALGOL 60

Algol 68 is basically a different language from Algol 60 - and the Algol 60 designers, most notably Dijkstra, were less than impressed by it, and along with others on the committee issued a "minority report" disowning Algol 68. Syntax aside, and leaving out the object-oriented stuff, Algol 68 actually bears a distinct resemblance to C++. Algol 60, on the other hand, begat Pascal. So really they are different languages sharing part of a name. I never got to use Algol 68 because it was difficult to write a compiler for and nowhere I worked had one. Burroughs Algol (a variant of ALGOL 60), on the other hand, was available. Time pressure, and perhaps a desire not to be seen to know too much about the "old iron", meant I never did write any Burroughs Algol. That was perhaps my loss. I don't think I missed anything by not writing Algol 68. BTW Burroughs Algol was implemented in hardware.
It was a stack machine with display registers and hardware support for resizing arrays.

ALGOL 60 was the first language I learned, at school in the Computer Club. Turn round was a bit longer than ten minutes. We used to write programs on coding sheets, which were then taken to the nearby university and typed up by data prep ladies, run by the operators on the ICL1909, and the coding sheets, pack of cards and printout were returned to us a week later. That did make you concentrate fairly hard on program correctness; my first program calculated primes and I've still got the output somewhere. After a while a friend and I learned that we could get off the bus on the way home, walk up to the university and punch our own cards on IBM 029 punches, and then watch while the operators ran the program. That made things a lot faster and meant we could write bigger programs - my favourite was a linear regression program for the results of our physics experiments. It made them look much more 'official' :) And then we discovered the university had a free access PDP-8, so we learned BASIC on DECtape and the joys of typing into an ASR-33 and Friden Flexowriters. And then ... ... the university got a copy of the POP-2 compiler. Still my favourite language ever! Programs to synthesise English using Chomsky's grammars etc etc. Machine Intelligence 1, 2, 3 ...

My first computer language...

... was Algol W, at university in the mid-70s. Sequence, selection and iteration - what more do you need? :) It set the way for how I think about programming, and I found it difficult to think in other ways. For example, we later covered the functional programming language SASL (a precursor of Miranda, which in turn preceded Haskell), and I found it almost impossible to think that way, until one day the penny simply dropped, and I had no problems after that. For Raymond, up above, I still have my copy of the MoD "Blue Book" standard for CORAL 66.
Used that at university on a CTL mini, and in professional life on Ferranti Argus and CTL/ITL minis. My real love was Pascal, and I remember how chuffed I was the day I discovered that although I had been using it for many months, I hadn't realised that our implementation of it did not include GOTO statements! I still have a free implementation of it on my home PC - very useful for occasional puzzle solving. Others I used included Fortran IV and BCPL at university, and ASM-86, PL/M-86, and PL/I in professional life. But then I became a consultant :)

Early Uni Computing

Back in the early 70s Bangor University Computing Department kept the previous workhorse, an 803, in a back room after migrating to an ICL4130. The computing department manager would oversee a rigorous test on the 803 operating procedures before allowing any member of the great unwashed student body unsupervised access to said machine. I've still got my copy of the Algol A104 compiler tape in a drawer somewhere. As an introduction to the wide world of programming this was an ideal education in small memory management and code optimisation, useful on my next processor, an Intel 4040. How did the compile procedure go? Power supply to 'on', tape in reader, press the control button to 'read', hit the 'operate' bar, press the control button back to 'normal', hit the 'operate' bar... And listen to the soothing chatter of the loudspeaker connected to the overflow bit... Kids these days, they don't believe you!

ALGOL 60 was the first language I learned, in 1973. At Warwick Uni, they reckoned that virtually any science undergrad would need computer skills, so this was a compulsory course. They ran an ICL 4130 and a 4120 at the time - designed by Elliott, so presumably somebody at Elliott was an ALGOL fan. Never used it in any commercial setting, but it laid the basis for the structure used in any number of languages since.
When writing COBOL, I often sketched things out using ALGOL to check things made sense before getting the coding sheets dirty. Who remembers Backus-Naur?

Not quite how I remember it

"you basically march up to the machine, the machine's got the ALGOL system loaded, you run your programme, it produces gibberish" When I was in the sixth form, one of my maths teachers had a contact at the local Poly. We were low priority, so we got to leave our precious punched tapes with the operator, who would run them for us in time for us to pick up the gibberish the next day. Needless to say I thought computers were complete bollocks. A couple of years later when I was at University, and they had these lineprinter style terminals, I started to think wow, maybe there's something in this; soon after I got access to a green screen terminal. The rest is history.

Fall 1965: Gunn High School in Palo Alto CA... the school district's first 'computer' class (Perhaps one of the first HS classes anywhere?) They only had a keypunch machine at the school... and at the first class admitted they were still looking for a computer to use, as the District had quashed plans to allow their own in-house computer to be used by [gasp] students! Next class, they announced they had FOUND a computer very nearby: The Burroughs B5500 mainframe at SLAC!!! (Stanford Linear Accelerator Center). No problem for students there!!! We ran our class exercises in B5500 ALGOL.

Early Algol68 Compilers

In the early 1980s UCLA had an active group of graduate students working on a compiler for Algol68 for the IBM 360/91 under Robert Ugalus. The book, "Informal Introduction to Algol 68", was the Bible for the language semantics. The book created its own set of specialized words like propositity and lapsity to describe the language semantics in precise words. The underlying memory model was the stack retention model, with variants called the cactus stack for its threaded nature.
The language semantics, referred to in previous posts as European sensibilities, were quite well done. It lacked a machine description in favor of very precise descriptions of the language semantics, very worthwhile reading for those who like reading compiler semantics by jumping into the way-back machine. As Joe Bob Briggs used to say, "check it out".

Algol 68 on an ICL 4100

When at Warwick University in 1974 I was using Algol 68 for my computing project. My maze solver did eventually work, though as mentioned the bug cycle was more like one day, as you had to post your punch card stack into the pigeon hole of the computer suite and wait for them to process it. I found by trial and error that the reason my maze solver never ran correctly on Friday afternoons was because they used to take half of the magnetic memory offline to clean it or something. The largest maze I could create on a Friday was 3 x 3!

Algol 60 on the HP 2100A

Well, I've used it -- it was the best language to use on the HP 2100A, because you needed only one paper tape, because Algol 60 was designed to work with a single pass compiler. Apart from it being a better language anyway. FORTRAN IV, as it was then, needed two paper tapes. So, even for a little FORTRAN program, you had to load the first pass compiler tape (then wind it up on its spool), then put your program tape through and get the intermediate tape. You'd then load the second compiler paper tape, and feed it the intermediate tape. It would then produce the binary. If anything went wrong, you had to start again. Even then proper design made a huge difference. Declaring your variables first, and declaring functions before you use them, makes good sense anyway, and has the excellent side-effect of improving usability by one-pass compilation.
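The declare-before-use discipline the comment above describes is exactly what makes one-pass compilation workable, and it survives in C. A minimal sketch (the function names `is_even`/`is_odd` are invented for illustration): the forward declaration lets a single-pass compiler type-check the call to `is_odd` before its body has been read.

```c
/* Declare-before-use: the forward declaration of is_odd lets a
 * one-pass compiler check the call inside is_even before the body
 * of is_odd has been seen. */
int is_odd(unsigned n);                /* forward declaration */

int is_even(unsigned n) { return n == 0 ? 1 : is_odd(n - 1); }
int is_odd(unsigned n)  { return n == 0 ? 0 : is_even(n - 1); }
```

Without the forward declaration, the mutually recursive pair could not be checked in a single left-to-right pass.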
The influence of ALGOL 60

You can see the influence of ALGOL 60 on modern languages most clearly if you compare programs written in 1960 versions of ALGOL, FORTRAN, COBOL, and LISP (which were the most widespread languages at the time). The ALGOL 60 program will (for the most part) be readily readable by someone who has learned C, Java, or C# and nothing else. Understanding the FORTRAN, COBOL, or (in particular) LISP programs would require a good deal of explanation, but understanding the ALGOL 60 program would mainly be realising that begin, end, and := correspond to curly braces and = in C, Java, and C#. Look, for example, at the Absmax procedure at https://en.wikipedia.org/wiki/ALGOL_60#Examples_and_portability_issues FORTRAN and COBOL continued to evolve into something completely unlike their 1960 forms while still retaining their names -- even up to today. ALGOL mainly evolved into languages with different names, such as Pascal, C, CPL, Simula and many others. So ALGOL is not really more of a dead language than FORTRAN II and COBOL 60. There was a computer scientist in the late 60s who was asked "What will programming languages look like in 2000?". He answered "I don't know, but I'm pretty sure one of them will be called FORTRAN". This was a pretty good prediction, as Fortran (the only name change is dropping the all caps) still exists, but looks nothing like FORTRAN 66, which was the dominant version at the time. You can argue that the modern versions of FORTRAN and COBOL owe more to ALGOL 60 than they do to the 1960 versions of themselves.

My first programming language was Burroughs EXTENDED Algol 60 on a B5000 in 1965. Burroughs always extended the basic language. The beauty of the Burroughs extensions is that their ALGOL 60 was wedded to the hardware design, and their compilers were all written in the language they compiled. At that time it was ALGOL, COBOL and ESPOL, which was a superset of ALGOL used to program the MCP. I did in fact program the MCP.
In ESPOL one had wide-open access to the hardware itself. The B5000 hardware had an interesting limitation: code was generated into program segment strings which were block-limited to 1024 words. A word was 48 bits. Why? The program segment string S register was 10 bits. If, say, a BEGIN code END; block exceeded 1024 words, then the compiler yielded a "program segment string exceeds 1024" error. This of course nailed me in my first program. Easy enough to repair. The B5000/B6000 series also had virtual memory. Program segment strings need not be memory resident, and the 1024-word segments could be viewed as VM. Each program segment string terminated in a descriptor that referenced the program reference table (PRT). This referenced location could be a disk address to access the next segment. Etc. The B6000 series (I worked on the B6500) generalized the PRT into what was called the cactus stack. This enabled the extended version of ALGOL 60 to have multi-dimensional arrays of tasks. Super feature. An array element had the parent node that could link to family member tasks. The first time I used the feature I had a "death in the family" error. This meant that a member of the family hierarchy had crashed. I can go on and on about this amazing hardware/ALGOL fusion. I have to say it's unforgettable, since I'm soon to be 80 years old. I've written software in some 20 or so languages during my career as a computer scientist, and Burroughs extended ALGOL 60 is definitely the midwife of almost all of them. Bill

Algol was OK, but all wasn't plain sailing.

I programmed a lot in Algol 60 between 1967 and 1971 at the University of Salford on the English Electric KDF9. I was doing research into Computer Aided Circuit design using Y matrices. When I started, paper tape was in use, created and edited on Friden Flexowriters. My program was fairly large for the time, and a relatively large roll of tape.
There were two ways of changing the program - put the paper tape in the flexowriter and set it to copy, up to the line that you wanted to change, type in the changes, re-sync the rest of the tape and let it copy the rest, OR get out the little tape splicing machine and hack out a section, gluing your amendment in. Compiling and running was effectively twice per day, since you had to write out a job slip and put your tape in a box for the computer operators to load, compile and execute. Then - REVOLUTION - the KDF9 was front-ended with a PDP-8 running a system called COTAN (I faintly recall, from more than half a century ago). This had a disc where you could keep your program and edit it using KSR33 teleprinters (if you were patient). Then submit your job electronically no less. Unfortunately the COTAN system had a sort-of anti-Algol design - it was based on 80 column card images. Students were allocated a certain (insufficient) maximum number of cards. Algol 60 was not Fortran, so my beautiful indented layout of the "begin"s and "end"s on the paper tape, which used quite a lot of blank characters, didn't fit in the allocated card space. Answer - eliminate most of the comments and the blanks. This resulted in a more or less "rectangular" program of solid text 80 characters wide - not that the Algol compilers cared, but the very devil to read, edit, and debug. I still have some printouts (somewhere) of the resulting mess, together with some of the little boxes originally used to submit the paper tape jobs. I don't hunger after the "old days" of computing.

A lot of people program in ALGOL 60

There is a lot of programming done in ALGOL 60 on Burroughs machines (now Unisys MCP machines). Burroughs extended the language with IO, some of it an extension of FORTRAN-style IO, but most programmers did their own direct formatting, since the FORTRAN IO was interpreted and slow. I believe Burroughs ALGOL was based on Elliott ALGOL.
I later worked with another Burroughs guy who developed a language based on Elliott ALGOL for the Apple II. This was at a company called Netcomm in Australia. Actually Don Knuth wrote one of the first Burroughs ALGOL compilers on a summer break as a student. It was on the B200 (from memory), but it predated the B5000 ALGOL compiler. Burroughs ALGOL is a really heavy-duty systems language that makes C look like the toy that it is. It is a shame that C has effectively killed language development. If ALGOL or its next generation (ALGOL-68, ALGOL-W, Pascal, CPL, etc) had continued, we'd probably have pretty solid languages by now, rather than the rather flimsy C.

1965

Having done my degree at Woolwich Polytechnic (Tommy Flowers was there long before me!) from 1965 onwards while working as a rocket scientist at the MoD, I was one of the early programmers who never actually went in to the computer room where the program was run. I simply submitted my Algol programs on coding sheets which would be run and sent back. I did occasionally see the computer through a window! I then taught Algol at Braintree college on an Elliott 803. Nothing compares with MIRFAC, probably the most sophisticated language ever, running on the MoD's COSMOS computer though. Simon Lavington reviews this machine.

Re: Facebook simply would not exist today if not for Bletchley Park

Apart from Colossus, Bletchley Park had nothing to do with electronic digital computers. All data analysis was done on Hollerith-type card machines. Actually Britain was far ahead of the US in developing commercial computers. Just look at the history of LEO. Britain also pioneered practical computers such as EDSAC, which is the forerunner of all today's computers and can be seen at The National Museum of Computing, located on the Bletchley Park Estate. Spend 3 hours there and see the Harwell Dekatron - the world's oldest original working computer. While there see a working Enigma/Bombe and the Lorenz/Colossus.
My advice is to see TNMoC first, then pop down to Bletchley Park for a brief tour to see the Mansion and learn about its role in the Second World War.
http://pascal.hansotten.com/niklaus-wirth/recollections-about-the-development-of-pascal/
Recollections about the Development of Pascal
N. Wirth
Institut für Computersysteme, ETH Zürich, CH-8092 Zürich

Abstract

Pascal was defined in 1970 and, after a slow start, became one of the most widely used languages in introductory programming courses. This article first summarises the events leading to Pascal's design and implementation, and then proceeds with a discussion of some of the language's merits and deficiencies. In the last part, developments that followed its release are recounted. Its influence chiefly derived from its being a vehicle for structured programming and a basis for further development of languages and for experiments in program verification.

Contents

Early History
The Language
Later Developments
In Retrospect
Acknowledgements
References

Early History

The programming language Pascal was designed in the years 1968/69, and I named it after the French philosopher and mathematician, who in 1642 designed one of the first gadgets that might truly be called a digital calculator. The first compiler for Pascal was operational in early 1970, at which time the language definition also was published [Wirth, 1970]. These facts apparently constitute the anchor points of the history of Pascal. However, its genuine beginnings date much further back. It is perhaps equally interesting to shed some light on the events and trends of the times preceding its birth, as it is to recount the steps that led to its widespread use. I shall therefore start with a – more or less chronological – narrative of the early history of Pascal.

In the early 1960s, there existed two principal scientific languages: Fortran and Algol 60 [Naur, 1963]. The former was already in wide use and supported by large computer manufacturers. The latter – designed by an international committee of computing experts – lacked such support, but attracted attention by its systematic structure and its concise, formalized definition. It was obvious that Algol deserved more attention and a wider field of applications.
In order to achieve it, Algol needed additional constructs making it suitable for purposes other than numerical computation. To this end, IFIP established a Working Group with the charter of defining a successor to Algol. There was hope that the undesirable canyon between scientific and commercial programming, by the mid 1960s epitomized as the Fortran and Cobol worlds, could be bridged. I had the privilege of joining Working Group 2.1 in 1964. Several meetings with seemingly endless discussions about general philosophies of language design, about formal definition methods, about syntactic details, character sets, and the whole spectrum of topics connected with programming revealed a discouraging lack of consensus about the approach to be taken. However, the wealth of ideas and experience presented also provided encouragement to coalesce them into a powerful ensemble.

As the number of meetings grew, it became evident that two main factions emerged from the roughly two dozen members of the Working Group. One party consisted of the ambitious members, unwilling to build upon the framework of Algol 60, unafraid of constructing features that were largely untried and whose consequences for implementors remained a matter of speculation, and eager to erect another milestone similar to the one set by Algol 60. The opponents were more pragmatic. They were anxious to retain the body of Algol 60 and to extend it with well-understood features widening the area of applications for the envisaged successor language, but retaining the orderly structure of its ancestor. In this spirit, the addition of basic data types for double precision real numbers and for complex numbers was proposed, as well as the record structure known from COBOL, the replacement of Algol's call-by-name with a call-by-reference parameter, and the replacement of Algol's overly general for-statement by a restricted but more efficient version.
They hesitated to incorporate novel, untested ideas into an official language, well aware that otherwise a milestone might easily turn into a millstone. In 1965, I was commissioned to submit a proposal to the WG which reflected the views of the pragmatists. In a meeting in October of the same year, however, a majority of the members favoured a competing proposal submitted by A. van Wijngaarden, former member of the Algol 60 team, and decided to select it as the basis for Algol X at a meeting in Warsaw in the fall of 1966 [van Wijngaarden, 1969]. Unfortunately, but as foreseen by many, the complexity of Algol 68 caused many delays, with the consequence that at the time of its implementation in the early 70s, many users of Algol 60 had already adopted other languages [Hoare, 1980].

I proceeded to implement my own proposal in spite of its rejection, and to incorporate the concept of dynamic data structures and pointer binding suggested by C. A. R. Hoare. The implementation was made at Stanford University for the new IBM 360 computer. The project was supported by a grant of the US National Science Foundation. The outcome was published and became known as Algol W [Wirth, 1966]. The system was adopted at many universities for teaching programming courses, but the language remained confined to IBM 360/370 computers. Essentially, Algol W had extended Algol 60 with new data types representing double precision floating-point and complex numbers, with bit strings and with dynamic data structures linked by pointers. In spite of pragmatic precautions, the implementation turned out to be rather complex, requiring a run-time support package. It failed to be an adequate tool for systems programming, partly because it was burdened with features unnecessary for systems programming tasks, partly because it lacked adequately flexible data structuring facilities.
I therefore decided to pursue my original goal to design a general-purpose language without the heavy constraints imposed by the necessity of finding a consensus among two dozen experts about each and every little detail. The past experience had given me a life-long mistrust in the products of committees, where many participate in debating and decision making, and few perform the actual work made difficult by the many. In 1968, I assumed a professorship at the Federal Institute of Technology in Zurich (ETH), where Algol 60 had been the language of choice among researchers in numeric computation. The acquisition of CDC computers in 1965 (and even more so in 1970), made this preference hard to justify, because Algol compilers for these computers were rather poorly designed and could not compete with their Fortran counterparts. Furthermore, the task of teaching programming – in particular systems programming – appeared highly unattractive, given the choice between Fortran and assembler code as the only available tools. After all, it was high time to not only preach the virtues of Structured Programming, but to make them applicable in actual practice by providing a language and compilers offering appropriate constructs. The discipline of Structured Programming had been outlined by E. W. Dijkstra [Dijkstra 1966] and represented a major step forward in the battle against what became known as the Software Crisis. It was felt that the discipline was to be taught at the level of introductory programming courses, rather than as an afterthought while trying to retrain old hands. This insight is still valid today. Structured Programming and Stepwise Refinement [Wirth 1971a] marked the beginnings of a methodology of programming and became a cornerstone in letting program design become a subject of intellectual respectability. 
Hence, the definition of a new language and the development of its compiler were not a mere research project in language design, but rather a blunt necessity. The situation was to recur several times in the following decades, when the best advice was: Lacking adequate tools, build your own! In 1968, the goals were twofold: The language was to be suitable for expressing the fundamental constructs known at the time in a concise and logical way, and its implementation was to be efficient and competitive with existing Fortran compilers. The latter requirement turned out to be rather demanding, given a computer (the CDC 6000) that was designed very much with Fortran in mind. In particular, dynamic arrays and recursive procedures appeared as formidable obstacles, and were therefore excluded from an initial draft of the language. The prohibition of recursion was a mistake and was soon to be rectified, in due recognition that it is unwise to be influenced severely by an inadequate tool of a transitory nature. The task of writing the compiler was assigned to a single graduate student (E. Marmier) in 1969. As his programming experience was restricted to Fortran, the compiler was to be expressed in Fortran, with its translation into Pascal and subsequent self-compilation planned after its completion. This, as it turned out, was another grave mistake. The inadequacy of Fortran to express the complex data structures of a compiler caused the program to become contorted and its translation amounted to a redesign, because the structures inherent in the problem had become invisible in the Fortran formulation. The compiler relied on syntax analysis based on the table-driven bottom-up (LR) scheme adopted from the Algol-W compiler. Sophistication in syntax analysis was very much in style in the 1960s, allegedly because of the power and flexibility required to process high-level languages. 
It occurred to me then that a much simpler and more perspicuous method could well be used, if the syntax of a language was chosen with the process of its analysis in mind. The second attempt at building a compiler therefore began with its formulation in the source language itself, which by that time had evolved into what was published as Pascal in 1970 [Wirth, 1970]. The compiler was to be a single-pass system based on the proven top-down, recursive-descent principle for syntax analysis. We note here that this method was eligible because the ban against recursion had been lifted: recursivity of procedures was to be the normal case. The team of implementors consisted of U. Ammann, E. Marmier, and R. Schild. After the program was completed – a healthy experience in programming in an unimplemented language! – Schild was banished to his home for two weeks, the time it took him to translate the program into an auxiliary, low-level language available on the CDC computer. Thereafter, the bootstrapping process could begin [Wirth, 1971b]. This compiler was completed by mid 1970, and at this time the language definition was published. With the exception of some slight revisions in 1972, it remained stable thereafter.

We began using Pascal in introductory programming courses in late 1971. As ETH did not offer a computer science program until ten years later, the use of Pascal for teaching programming to engineers and physicists caused a certain amount of controversy. My argument was that the request to teach the methods used in industry, combined with the fact that industry uses the methods taught at universities, constitutes a vicious circle barring progress. But it was probably my stubborn persistence rather than any reasoned argument that kept Pascal in use. Ten years later, nobody minded. In order to assist in the teaching effort, Kathy Jensen started to write a tutorial text explaining the primary programming concepts of Pascal by means of many examples.
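The top-down, recursive-descent principle Wirth describes maps one procedure onto each grammar rule, so the parser's structure mirrors the grammar. A minimal sketch in C, for a hypothetical toy grammar `expr = digit { "+" digit }` (this is an illustration of the principle, not a fragment of Wirth's compiler; `parse` and the rule functions are invented names):

```c
/* One C function per grammar rule, for the toy grammar
 *   expr = digit { "+" digit }
 * Each rule consumes exactly the characters it needs, so a single
 * left-to-right pass over the source suffices. */

static const char *src;          /* cursor into the source text */

static int digit(void) {         /* rule: digit */
    return *src++ - '0';
}

static int expr(void) {          /* rule: expr = digit { "+" digit } */
    int v = digit();
    while (*src == '+') {
        src++;                   /* consume "+" */
        v += digit();
    }
    return v;
}

int parse(const char *s) { src = s; return expr(); }
```

Because every rule only looks one symbol ahead, a syntax designed for this method can be compiled in one pass, which was precisely the point.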
This text was first printed as a Technical Report and thereafter appeared in Springer-Verlag's Lecture Notes Series [Jensen, 1974].

The Language

The principal role of a language designer is that of a judicious collector of features or concepts. Once these concepts are selected, forms of expressing them must be found, i.e. a syntax must be defined. The forms expressing individual concepts must be carefully molded into a whole. This is most important, as otherwise the language will appear as incoherent, as a skeleton onto which individual constructs were grafted, perhaps as after-thoughts. Sufficient time had elapsed that the main flaws of Algol were known and could be eliminated. For example, the importance of avoiding ambiguities had been recognised. In many instances, a decision had to be taken whether to solve a case in a clearcut fashion, or to remain compatible with Algol. These options sometimes were mutually exclusive. In retrospect, the decisions in favour of compatibility were unfortunate, as they kept inadequacies of the past alive. The importance of compatibility was overestimated, just as the relevance and size of the Algol-60 community had been. Examples of such cases are the syntactic form of structured statements without closing symbol, the way in which the result of a function procedure is specified (assignment to the function identifier), and the incomplete specification of parameters of formal procedures. All these deficiencies were later corrected in Pascal's successor language Modula-2 [Wirth, 1982].

For example, Algol's ambiguous conditional statement was retained. Consider

IF p THEN IF q THEN A ELSE B

which can, according to the specified syntax, be interpreted in the following two ways:

IF p THEN [IF q THEN A ELSE B]
IF p THEN [IF q THEN A] ELSE B

Pascal retained this syntactic ambiguity, selecting, however, the interpretation that every ELSE be associated with the closest THEN at its left.
The remedy, known but rejected at the time, consists of requiring an explicit closing symbol for each structured statement, resulting in the two distinct forms for the two cases as shown below:

IF p THEN IF q THEN A ELSE B END END
IF p THEN IF q THEN A END ELSE B END

Pascal also retained the incomplete specification of parameter types of a formal procedure, leaving open a dangerous loophole for breaching type checks. Consider the declarations

PROCEDURE P (PROCEDURE q); BEGIN q(x,y) END;
PROCEDURE Q (x: REAL); BEGIN … END;

and the call P(Q). Then q is called with the wrong number of parameters, which cannot in general be detected at the time of compilation.

In contrast to such concessions to tradition stood the elimination of conditional expressions. Thereby the symbol IF clearly becomes a marker of the beginning of a statement, and bewildering constructs of the form

IF p THEN x := IF q THEN y ELSE z ELSE w

are banished from the language.

The baroque for-statement of Algol was replaced by a tamed version which is efficiently implementable, restricting the control variable to be a simple variable and the limit to be evaluated only once instead of before each repetition. For more general cases of repetitions, the while statement was introduced. Thus it became impossible to formulate misleading, non-terminating statements as for example

FOR i := 0 STEP 1 UNTIL i DO S

and the rather obscure formulation

FOR i := n-1, i-1 WHILE i > 0 DO S

could be expressed more clearly by

i := n; WHILE i > 0 DO BEGIN i := i-1; S END

The primary innovation of Pascal was to incorporate a variety of data types and data structures, similar to Algol's introduction of a variety of statement structures.
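The dangling-else resolution Pascal chose (every ELSE binds to the closest preceding THEN) was inherited verbatim by C, so the ambiguity Wirth describes can still be demonstrated today. A small illustration (the function `classify` is invented for this sketch):

```c
/* C inherited Pascal's resolution of the dangling else: an "else"
 * pairs with the nearest unmatched "if". */
int classify(int p, int q) {
    int r = 0;
    if (p)
        if (q)
            r = 1;
        else            /* binds to "if (q)", not to "if (p)" */
            r = 2;
    return r;
}
```

With the Modula-2 style explicit END (or braces in C), the two readings would be written distinctly and the ambiguity would disappear.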
Algol offered only three basic data types, namely integers, real numbers, and truth values, and the array structure; Pascal introduced additional basic types and the possibility to define new basic types (enumerations, subranges), as well as new forms of structuring: record, set, and file (sequence), several of which had been present in COBOL. Most important was of course the recursivity of structural definitions, and the consequent possibility to combine and nest structures freely. Along with programmer-defined data types came the clear distinction between type definition and variable declaration, variables being instances of a type. The concept of strong typing – already present in Algol – emerged as an important catalyst for secure programming. A type was to be understood as a template for variables specifying all properties that remain fixed during the time-span of a variable’s existence. Whereas its value changes (through assignments), its range of possible values remains fixed, as well as its structure. This explicitness of static properties allows compilers to verify whether rules governing types are respected. The binding of properties to variables in the program text is called early binding and is the hallmark of high-level languages, because it gives clear expression to the intention of the programmer, unobscured by the dynamics of program execution. However, the strict adherence to the notion of (static) type led to some less fortunate consequences. We here refer to the absence of dynamic arrays. These can be understood as static arrays with the number of elements (or the bounding index values) as a parameter. Pascal did not include parametrised types, primarily for reasons of implementation, although the concept was well understood.
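The flavour of these programmer-defined types can be suggested by a short sketch (the identifiers here are invented for illustration and do not stem from the original Report):

    TYPE Day = (Mon, Tue, Wed, Thu, Fri, Sat, Sun);        (* enumeration *)
         WorkDay = Mon .. Fri;                             (* subrange *)
         DaySet = SET OF Day;                              (* set *)
         Date = RECORD day: 1 .. 31; month: 1 .. 12;
                       year: INTEGER END;                  (* record *)
    VAR today: Date;
        free: DaySet;

The declarations of today and free then create instances of these types, and every use of the variables is checkable against the type definitions at compile time.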
Whereas the lack of dynamic array variables may perhaps not have been too serious, the lack of dynamic array parameters is clearly recognised as a defect, if not in the view of the compiler designer, then certainly in the view of the programmer of numerical algorithms. For example, the following declarations do not permit procedure P to be called with x as its actual parameter:

    TYPE A0 = ARRAY [1 .. 100] OF REAL;
         A1 = ARRAY [0 .. 999] OF REAL;
    VAR x: A1;
    PROCEDURE P (x: A0); BEGIN … END

Another important contribution of Pascal was the clear conceptual and denotational separation of the notions of structure and access method. Whereas in Algol W arrays could only be declared as static variables and hence only be accessed directly, record structured variables could only be accessed via references (pointers), i.e. indirectly. In Pascal, all structures can be either accessed directly or via pointers, indirection being specified by an explicit dereferencing operator. This separation of concerns was known as ‘orthogonal design’, and was pursued (perhaps to extreme) in Algol 68. The introduction of explicit pointers, i.e. variables of pointer type, was the key to a significant widening of the scope of application. Using pointers, dynamic data structures can be built, as in list-processing languages. It is remarkable that the flexibility in data structuring was made possible without sacrificing strict static type checking. This was due to the concept of pointer binding, i.e. of declaring each pointer type as being bound to the type of the referenced objects, as proposed by [Hoare, 1972]. Consider, for instance, the declarations

    TYPE Pt = ^Rec;
         Rec = RECORD x, y: REAL END;
    VAR p, q: Pt;

Then p and q, provided they had been properly initialized, are guaranteed to hold either values referring to a record of type Rec, or the constant NIL. A statement of the form p^.x := p^.y + q^.x turns out to be as type-safe as the simple x := x + y.
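A dynamic structure built from such bound pointers might look as follows; the list type, field names, and statements are a sketch invented here for illustration, using Pascal’s standard allocation procedure NEW:

    TYPE List = ^Node;
         Node = RECORD value: INTEGER; next: List END;
    VAR head, p: List;
    …
    head := NIL;
    NEW(p);                          (* allocate a Node on the heap *)
    p^.value := 1; p^.next := head;
    head := p                        (* prepend the new node *)

Because List is bound to Node, every dereference p^ and every field access p^.value can still be checked statically, although the structure itself grows at run-time.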
Indeed, pointers and dynamic structures were considerably more important than dynamic arrays in all applications except numeric computation. Intricately connected to pointers is the mechanism of storage allocation. As Pascal was to be suitable as a system-building language, it tried not to rely on a built-in run-time garbage collection mechanism as had been necessary for Algol W. The solution adopted was to provide an intrinsic procedure NEW for allocating a variable in a storage area called the heap, and a complementary one for deallocation (DISPOSE). NEW is trivial to implement, and DISPOSE can be ignored, and indeed it turned out to be wise to do so, because system procedures depending on programmer’s information are inherently unsafe. The idea of providing a garbage collection scheme was not considered in view of its complexity. After all, the presence of local variables and of programmer-defined data types and structures requires a rather sophisticated and complex scheme crucially depending on system integrity. A collector must be able to rely on information about all variables and their types. This information must be generated by the compiler and, moreover, it must be impossible to invalidate it during program execution. The subject of parameter passing methods had already been a source of endless debates and hassles in the days of the search for a successor to Algol 60. The impracticality of its name parameter had been clearly established, and the indispensability of the value parameter was generally accepted. Yet there were valid arguments for a reference parameter, in particular for structured operands on the one hand, and good reasons for result parameters on the other. In the former case the formal parameter constitutes a hidden pointer to the actual variable, in the latter the formal parameter is a local variable to be assigned to the actual variable upon termination of the procedure.
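The contrast between the two parameter mechanisms can be shown with the classical exchange procedure (an illustrative sketch, not taken from the original text):

    PROCEDURE Swap (VAR x, y: INTEGER);  (* VAR: x, y denote the actual variables *)
    VAR t: INTEGER;
    BEGIN t := x; x := y; y := t
    END;

Had x and y been declared as value parameters instead, Swap would merely exchange two local copies and leave the actual variables unchanged upon return.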
The choice of the reference parameter (in Pascal called VAR-parameter) as the only alternative to the value parameter turned out to be simple, appropriate, and successful. And last but not least, Pascal included statements for input and output, whose omission from Algol had been a source of continuing criticism. Particularly with regard to Pascal’s role as a language for instruction, a simple form of such statements was chosen. Their first parameter designates a file and, if omitted, causes the data to be read from or written to the default medium, such as keyboard and printer. The reason for including a special statement for this purpose in the language definition, rather than postulating special, standard procedures, was the desire to allow for a variable number and different types of parameters:

    Read(x, y); … ; Write(x, y, z)

As mentioned before, a language designer collects frequently used programming constructs from his own experience, from the literature, or from other languages, and molds them into syntactic forms in such a way that they together form an integrated language. Whereas the basic framework of Pascal stems from Algol W, many of the new features emerged from suggestions made by C.A.R. Hoare, including enumeration, subrange, set, and file types. The form of COBOL-like record types was due to Hoare, as well as the idea to represent a computer’s “logical words” by a well-known abstraction, namely sets (of small integers). These “bits and pieces” were typically presented and discussed during meetings of the IFIP Working Group 2.1 (Algol), and thereafter appeared as communications in the Algol Bulletin. They were collected in his Notes on Data Structuring [Hoare, 1972]. In Pascal, they were distilled into a coherent and consistent framework of syntax and semantics, such that the structures were freely combinable.
Pascal permits the definitions of arrays of records, records of arrays, arrays of sets, and arrays of records with files, just to name a few possibilities. Naturally, implementations would have to impose certain limits to the depth of nesting due to finite resources, and certain combinations, such as a file of files, might not be accepted at all. This case may serve as an example of the distinction between the general concepts defined by the language, and supplementary, restrictive rules governing specific implementations. Although the wealth of data types and structuring forms was the principal asset of Pascal, not all of the components were equally successful. We keep in mind that success is a subjective quality, and opinions may differ widely. I therefore concentrate on an ‘evaluation’ of a few constructs where history has given a reasonably clear verdict. The most vociferous criticism came from Habermann, who correctly pointed out that Pascal was not the last word on language design. Apart from taking issue with types and structures being merged into a single concept, and with the lack of constructs like conditional expressions, the exponentiation operator, and local blocks, which were all present in Algol 60, he reproached Pascal for retaining the much-cursed go to statement [Habermann, 1973]. In hindsight, one cannot but agree; at the time, its absence would have deterred too many people from trying to use Pascal. The bold step of proposing a goto-less language was taken ten years later by Pascal’s successor Modula-2, which remedied many shortcomings and eliminated several remaining compatibility concessions to Algol 60, particularly with regard to syntax [Wirth, 1982]. A detailed and well-judged reply to the critique by Habermann was written by Lecarme, who judged the merits and deficiencies on the basis of his experience with Pascal in both teaching and compiler design [Lecarme, 1975].
Another significant critique [Welsh, 1977] discusses the issue of structural vs. name equivalence of data types, a distinction that had unfortunately been left open in the definition of Pascal. It caused many debates until it was resolved by the Standards committee. Perhaps the single most unfortunate construct was the variant record. It was provided for the purpose of constructing inhomogeneous data structures. Both for arrays and for dynamic structures, in Pascal the element types must be fixed by type declarations. The variant record allows variations of the element types. The unfortunateness of the variant record of Pascal stems from the fact that it provides more flexibility than required to achieve this legitimate goal. In a dynamic structure, typically every element remains of the same type as defined by its creation. The variant record, however, allows more than the construction of heterogeneous structures, i.e. of structures with elements of different, although related types. It allows the type of elements themselves to change at any time. This additional flexibility has the regrettable property of requiring type checking at run-time for each access to such a variable or to one of its components. Most implementers of Pascal decided that this checking would be too expensive, enlarging code and deteriorating program efficiency. As a consequence, the variant record became a favourite feature to breach the type system by all programmers in love with tricks, which usually turn into pitfalls and calamities. Variant records also became a major hindrance to the notion of portability of programs. Consider, for example, the declaration

    VAR R: RECORD
             maxspeed: INTEGER;
             CASE v: Vehicle OF
               truck:  (nofwheels: INTEGER);
               vessel: (homeport: String)
           END

Here, the designator R.nofwheels is applicable only if R.v has the value truck, and R.homeport only if R.v = vessel.
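With this declaration, and on an implementation that omits the run-time tag check, the breach of the type system takes only a few statements (an illustrative fragment, not from the original text):

    R.v := truck; R.nofwheels := 8;
    R.v := vessel;
    Write(R.homeport)   (* without a tag check: the bits stored as an
                           INTEGER are reinterpreted as a String *)

Since the result depends on how a particular implementation overlays the two fields in storage, such a program is inherently non-portable.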
No compiler checks can safeguard against erroneous use of designators, which, in the case of assignment, may be disastrous, because the variant facility is used by implementations to save storage by overlaying the fields nofwheels and homeport. With regard to input and output operations, Pascal separated the notions of data transfer (to or from an external medium) and of representation conversion (binary to decimal and vice-versa). External, legible media were to be represented as files (sequences) of characters. Representation conversion was expressed by special read and write statements that have the appearance of procedures but allowed a variable number of parameters. Whereas the latter was essentially a concession to programmers used to Fortran’s I/O statements, the notion of sequence as a structural form was fundamental. Perhaps also in this case, providing sequences of any (even programmer-defined) element types was more than what was genuinely needed in practice. The consequence was that, in contrast to all other data types, files require a certain amount of support from built-in run-time routines, mechanisms not explicitly visible from the program text. The successors of Pascal – Modula-2 and Oberon – later retreated from the notion of the file as a structural form at the same level as array and record. This became possible, because implementations of sequencing mechanisms could be provided through modules (library packages). In Pascal, however, the notion of modules was not yet present; Pascal programs were to be regarded as single, monolithic texts. This view may be acceptable for teaching purposes where exercises are reasonably compact, but it is not tenable for the construction of large systems. Nevertheless and surprisingly, Pascal compilers could be written as single Pascal programs.
Later Developments

Whereas Pascal appeared to fulfil our expectations with regard to teaching, the compiler still failed to satisfy the stated goals with regard to efficiency in two aspects: First, the compiler as a relatively large stand-alone program resulted in fairly long ‘turn-around times’ for students. In order to alleviate the problem, I designed a subset of the language containing those features that we believed were to be covered in introductory courses, and a compiler/interpreter package that fitted into a 16K-word block of store, which fell under the most-favoured program status of the computation center. The Pascal-S package was published in a report, and was one of the early comprehensive systems made widely available in source form [Wirth, 1981]. Second, the quality of the code produced by the compiler remained a weakness: comparable Fortran programs were still substantially faster, an undeniable argument in the hands of the Pascal adversaries. As we were of the opinion that structured programming supported by a structured language and efficiency of compilation and of produced code were not necessarily mutually exclusive, a project for a third compiler was launched, which on the one hand was to demonstrate the advantage of structured top-down design with step-wise refinement [Ammann, 1974], and on the other hand was to pay attention to generating high-quality code. This compiler was written by U. Ammann and achieved both goals quite admirably. It was completed in 1976 [Ammann, 1977]. Although the result was a sophisticated compiler of high quality and reliability, in hindsight we must honestly confess that the effort invested was incommensurate with its effect. It did not win over many engineers and even fewer physicists. The argument that Fortran programs “ran faster” was simply replaced by “our programs are written in Fortran”. And what authorises us to teach structured, “better” programming to experts of ten years’ standing?
Also, the code was generated for the CDC 6000 computer which – with its 60-bit word and super-RISC structure – was simply not well suited for the task. Much of Ammann’s efforts went into implementing the attribute packed of records. Although semantically irrelevant, it was required by considerations of storage economy on a computer with very long words. Having had the freedom to design not only the language but also the computer would have simplified the project considerably. In the event, the spread of Pascal came from another front. Not long after the publication of the Pascal definition we received correspondence indicating interest in that language and requesting assistance in compiler construction, mainly from people who were not users of CDC computers. It was this stimulus which encouraged me to design a suitable computer architecture. A version of Ammann’s compiler – easily derived from an early stage of the sequence of refinements – would generate code for this “ideal” architecture, which was described in the form of a Pascal program representing an interpreter. In 1973, the architecture became known as the P-machine, the code as P-code, and the compiler as the P-compiler. The P-kit consisted of the compiler in P-code and the interpreter as a Pascal source program [Nori, 1981]. Recipients could restrict their labour to coding the interpreter in their favourite assembler code, or proceed to modify the source of the P-compiler and replace its code generating routines. This P-system turned out to be the key to Pascal’s spread onto many computers, but the reluctance of many to proceed beyond the interpretive scheme also gave rise to Pascal’s classification as a “slow language”, restricted to use in teaching. Among the recipients of the P-kit was the team of K. Bowles at the University of California at San Diego (UCSD) around 1975.
He had the foresight to see that a Pascal compiler for an interpretive system might well fit into the memory of a microcomputer, and he mustered the courage to try. Moreover, the idea of P-code made it easy to port Pascal to a whole family of micros and to provide a common basis on all of them for teaching. Microcomputers had just started to emerge, using early microprocessors like Intel’s 8080, DEC’s LSI-11, and Rockwell’s 6502; in Europe, they were hardly known at the time. Bowles not only ported our compiler. His team built an entire system around the compiler, including a program editor, a file system, and a debugger, thereby reducing the time needed for an edit-compile-test step dramatically over any other system in educational use. Starting in 1978, this UCSD-Pascal system spread Pascal very rapidly to a growing number of users [Bowles, 1980], [Clark, 1982]. It won more “Pascal friends” in a year than the systems used on large “main frames” had won in the previous decade. This phenomenal success had three sources: (1) A high-level language was available on microcomputers which would pervade educational institutions; (2) Pascal became supported by an integrated system instead of a “stand-alone” compiler; and (3), perhaps most importantly, Pascal was offered to a large number of computer novices, i.e. people who were not burdened by previous programming habits. In order to adopt Pascal, they did not have to give up a large previous investment in learning all the idiosyncrasies of assembler or Fortran coding. The microcomputer made programming a public activity, hitherto exclusively reserved to the high-priests of computing centers, and Pascal effectively beat Fortran on microcomputers. By 1978, there existed over 80 distinct Pascal implementations on hosts ranging from the Intel 8080 microprocessor to the Cray-1 supercomputer.
But Pascal’s usefulness was not restricted to educational institutions; by 1980 all four major manufacturers of workstations (Three Rivers, HP, Apollo, Tektronix) were using Pascal for system programming. Besides being the major agent for the spread of Pascal implementations, the P-system was significant in demonstrating how comprehensible, portable, and reliable a compiler and system program could be made. Many programmers learned much from the P-system, including implementors who did not base their work on the P-system, and others who had never before been able to study a compiler in detail. The fact that a compiler was available in source form caused the P-system to become an influential vehicle of extracurricular education. Several years earlier, attempts had been made to transport the Pascal compiler to other main-frame computers. In these projects no interpreter or intermediate code was used; instead they required the design of new generators of native code. The first of these projects was also the most successful. It was undertaken by J. Welsh and C. Quinn from the Queen’s University at Belfast [Welsh, 1972]. The target was the ICL 1900 computer. The project deserves special mention, because it should be considered as one of the earliest, genuinely successful ventures that were later to be called software engineering efforts. Since no CDC computer was available at Belfast, the goal was to employ a method that required as little work on a CDC machine as possible. What remained unavoidable would be performed during a short visit to ETH in Zurich. Welsh and Quinn modified the CDC-Pascal compiler, written in Pascal, by replacing all statements affecting code generation. In addition, they wrote a loader and an interpreter of the ICL-architecture, allowing some tests to be performed on the CDC-computer. All these components were programmed before the crucial visit, and were completed without any possibility of testing.
In Zurich, the programs were compiled and a few minor errors were corrected within a week. Back in Belfast, the generated compiler code was executable directly by the ICL-machine after correction of a single remaining error. This achievement was due to a very careful programming and checking effort, and it substantiated the claimed advantages to be gained by programming in a high-level language like Pascal, which provides full, static type checking. The feat was even more remarkable, because more than half of the week had to be spent on finding a way to read the programs brought from Belfast. Aware of the incompatibilities of character sets and tape formats of the two machines (7- vs. 9-track tapes), Welsh and Quinn decided to use punched cards as data carrier. Yet, the obstacles encountered were probably no less formidable. It turned out to be a tricky, if not downright impossible task, to read cards punched by an ICL machine with the CDC reader. Not only did the machines use different sets of characters and different encodings, but certain hole combinations were interpreted directly by the reader as end of records. The manufacturers had done their utmost to ensure incompatibility! Apart from these perils, the travellers had failed to reckon with the thoroughness of the Swiss customs officers. The two boxes filled with some four thousand cards surely had to arouse their deep suspicion, particularly because these cards contained empty cubicles irregularly spaced by punched holes. Nevertheless, after assurances that these valuable possessions were to be re-exported anyway, the two would-be smugglers were allowed to proceed to perform their mission. Upon their return, the fact that now the holes were differently positioned luckily went unnoticed. Other efforts to port the compiler followed; among them were those for the IBM 360 computers at Grenoble, the PDP-11 at Twente [Bron, 1976], and the PDP-10 at Hamburg [Grosse-Lindemann, 1976].
By 1973, Pascal had started to become more widely known and was being used in classrooms as well as for smaller software projects. An essential prerequisite for such acceptance was the availability of a user manual including tutorial material in addition to the language definition. Kathy Jensen embarked on providing the tutorial part, and by 1973 the booklet was published by Springer-Verlag, first in their Lecture Notes Series, and, after selling too fast, as an issue on its own [Jensen, 1974]. It was soon to be accompanied by a growing number of introductory textbooks from authors from many countries. The User Manual itself was later to be translated into many different languages, and it became a bestseller. A dedicated group of Pascal fans was located at the University of Minnesota’s computation center. Under the leadership and with the enthusiasm of Andy Mickel, a Pascal Users’ Group (PUG) was formed, whose vehicle of communication was the Pascal Newsletter, at first edited by G. H. Richmond (U. of Colorado) and later by Mickel. The first issue appeared in January 1974. It served as a bulletin board for new Pascal implementations, for new experiences and – of course – for ideas for improving and extending the language. Its most important contribution consisted in tracking all the emerging implementations. This helped both consumers to find compilers for their computers and implementors to coordinate their efforts. At ETH Zurich, we had decided to move on towards other projects and to discontinue distribution of the compiler, and the Minnesota group was ready to take over its maintenance and distribution. Maintenance here refers to adaptation to continually changing operating system versions, as well as to the advent of the Pascal Standard. Around 1977, a committee had been formed to define a standard. At the Southampton conference on Pascal, A. M. Addyman asked for help in forming a standards committee under the British Standards Institute (BSI).
In 1978 representatives from industry met at a conference in San Diego hosted by K. Bowles to define a number of extensions to Pascal. This hastened the formation of a standards committee under the wings of IEEE and ANSI/X3. The formation of a Working Group within ISO followed in late 1979, and finally the IEEE and ANSI/X3 committees were merged into the single Joint Pascal Committee. Significant conflicts arose between the US committee and the British and ISO committees, particularly over the issue of conformant array parameters (dynamic arrays). The latter became the major difference between the original Pascal and the one adopted by ISO, the other being the requirement of complete parameter specifications for parametric procedures and functions. The conflict on the issue of dynamic arrays eventually led to a difference between the standards adopted by ANSI on the one, and BSI and ISO on the other side. The unextended standard was adopted by IEEE in 1981 and by ANSI in 1982 [Cooper, 1983], [Ledgard, 1984]. The differing standard was published by BSI in 1982 and approved by ISO in 1983 [ISO, 1983], [Jensen, 1991]. In the meantime, several companies had implemented Pascal and added their own, supposedly indispensable extensions. The Standard was to bring them back under a single umbrella. If anything would have had a chance to make this dream come true, it would have been the speedy action of declaring the original language as the standard, perhaps with the addition of a few clarifications about obscure points. Instead, several members of the group had fallen prey to the devil’s temptations: they extended the language with their own favourite features. Most of these features I had already contemplated in the original design, but dropped because either of difficulties in clear definition or efficient implementation, or because of questionable benefit to the programmer [Wirth, 1975]. As a result, long debates started, requiring many meetings.
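The conformant-array parameters at the centre of this conflict let the index bounds be declared with the parameter, so that one procedure applies to arrays of any size. Roughly, in the notation of the ISO (Level 1) standard, and with a procedure invented here for illustration:

    PROCEDURE Sum (VAR a: ARRAY [lo .. hi: INTEGER] OF REAL;
                   VAR s: REAL);
    VAR i: INTEGER;
    BEGIN s := 0;
      FOR i := lo TO hi DO s := s + a[i]
    END

The bound identifiers lo and hi are supplied implicitly by the actual array at each call, removing the defect of fixed array parameter types discussed earlier.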
When the committee finally submitted a document, the language had almost found its way back to the original Pascal. However, a decade had elapsed since publication of the report, during which individuals and companies had produced and distributed compilers; and they were not keen to modify them in order to comply with the late Standard, and even less keen to give up their own extensions. An implementation of the Standard was later published in [Welsh, 1986]. Even before publication of the Standard, however, a validation suite of programs was established and played a significant role in promoting compatibility across various implementations [Wichmann, 1983]. Its role even increased after the adoption of the Standard, and in the USA it made a Federal Information Processing Standard for Pascal possible. The early 70s were the time when, in the aftermath of spectacular failures of large projects, terms like Structured Programming and Software Engineering were coined. They acted as symbols of hope for the drowning, and too often were believed to be panaceas for all the troubles of the past. This trend further raised interest in Pascal which – after all – was exhibiting a lucid structure and had been strongly influenced by E. W. Dijkstra’s teachings on structured design. The 70s were also the years when in the same vein it was believed that formal development of correctness proofs for programs was the ultimate goal. C.A.R. Hoare had postulated axioms and rules of inference about programming notations (it later became known as Hoare-logic). He and I undertook the task of defining Pascal’s semantics formally using this logic. However, we had to concede that a number of features had to be omitted from the formal definition (e.g. pointers) [Hoare, 1973]. Pascal thereafter served as vehicle for the realization of program validators in at least two places, namely Stanford University and ETH Zurich. E.
Marmier had augmented the compiler to accept assertions (in the form of marked comments) of relations among a program’s variables holding after (or before) executable statements. The task of the assertion checker was to verify or refute the consistency of assertions and statements according to Hoare-logic [Marmier, 1975]. His was one of the earliest endeavours in this direction. Although it was able to establish correctness for various reasonably simple programs, its main contribution was to dispel the simple-minded belief that everything can be automated. Pascal exerted a strong influence on the field of language design. It acted as a catalyst for new ideas and as a vehicle to experiment with them, and in this capacity gave rise to several successor languages. Perhaps the first was P. Brinch Hansen’s Concurrent Pascal [Brinch Hansen, 1975]. It embedded the concepts of concurrent processes and synchronization primitives within the sequential language Pascal. A similar goal, but with emphasis on simulation of discrete event systems based on (quasi-) concurrent processes, led to Pascal-Plus, developed by J. Welsh and J. Elder at Belfast [Welsh, 1984]. A considerably larger language was the result of an ambitious project by Lampson et al., whose goal was to cover all the needs of modern, large scale software engineering. Although deviating in many details and also in syntax, this language Mesa had Pascal as ancestor [Mitchell, 1978]. It added the revolutionary concept of modules with import and export relationships, i.e. of information hiding. Its compiler introduced the notion of separate – as distinct from independent – compilation of modules or packages. This idea was adopted later in Modula-2 [Wirth, 1982], a language that in contrast to Mesa retained the principles of simplicity, economy of concepts, and compactness which had led Pascal to success. Another derivative of Pascal is the language Euclid [London, 1978].
The definition of its semantics is based on a formalism, just as the syntax of Algol 60 had been defined by the formalism BNF. Euclid carefully omits features that were not formally definable. Object Pascal is an extension of Pascal incorporating the notion of object-oriented programming, i.e. of the abstract data type binding data and operators together [Tesler, 1985]. And last but not least the language Ada [Barnes, 1980] must be mentioned. Its design was started in 1977 and was distinctly influenced by Pascal. It lacked, however, an economy of design without which definitions become cumbersome and implementations monstrous.

In Retrospect

I have been encouraged to state my assessment of the merits and weaknesses of Pascal, of the mistaken decisions in its design, and of its prospects and place in the future. I prefer not to do so explicitly, and instead to refer the reader to my own successive designs, the languages Modula-2 [Wirth, 1982] and Oberon [Wirth, 1988]. Had I named them Pascal-2 and Pascal-3 instead, the questions might not have been asked, because the evolutionary line of these languages would have been evident. It is also fruitless to question and debate early design decisions; better solutions are often quite obvious in hindsight. Perhaps the most important point was that someone did make decisions, in spite of uncertainties. Basically, the principle to include features that were well understood, in particular by implementors, and to leave out those that were still untried and unimplemented, proved to be the most successful single guideline. The second important principle was to publish the language definition after a complete implementation had been established. Publication of work done is always more valuable than publication of work planned. Although Pascal had no support from industry, professional societies, or governmental agencies, it became widely used.
The important reason for this success was that many people capable of recognizing its potential actively engaged themselves in its promotion. As crucial as the existence of good implementations is the availability of documentation. The conciseness of the original report made it attractive for many teachers to expand it into valuable textbooks. Innumerable books appeared in many languages between 1977 and 1985, effectively promoting Pascal to become the most widespread language used in introductory programming courses. Good course material and implementations are the indispensable prerequisites for such an evolution. Pascal is still heavily used in teaching at the time of this writing. It may appear that it is undergoing the same fate as Fortran, standing in the way of progress. A more benevolent view assigns Pascal the role of paving the way for successors.

Acknowledgements

I heartily thank the many contributors whose work played an indispensable role in making the Pascal effort a success, and who thereby directly or indirectly helped to advance the discipline of program design. Particular thanks go to C. A. R. Hoare for providing many enlightening ideas that flowed into Pascal's design, to U. Ammann, E. Marmier, and R. Schild for their valiant efforts to create an effective and robust compiler, to A. Mickel and his crew for their enthusiasm and untiring engagement in making Pascal widely known by establishing a User Group and a Newsletter, and to K. Bowles for recognizing that our Pascal compiler was also suitable for microcomputers and for acting on this insight. I also thank the innumerable authors of textbooks, without whose introductory texts Pascal could not have received the acceptance which it did.

References

[Ammann, 1974] Ammann, U., The method of structured programming applied to the development of a compiler. Int'l Computing Symposium 1973, 93–99, North-Holland, 1974.
[Ammann, 1977] Ammann, U., On Code Generation in a Pascal Compiler. Software – Practice and Experience, 7, 391–423 (1977).
[Barnes, 1980] Barnes, J. G. P., An Overview of Ada. Software – Practice and Experience, 10, 851–887 (1980).
[Bowles, 1980] Bowles, K. L., Problem Solving Using Pascal. Springer-Verlag, 1977.
[Brinch Hansen, 1975] Brinch Hansen, P., The Programming Language Concurrent Pascal. IEEE Trans. Software Eng., 1, 2, 199–207 (1975).
[Bron, 1976] Bron, C., and W. de Vries, A Pascal Compiler for the PDP-11 Minicomputers. Software – Practice and Experience, 6, 109–116 (1976).
[Clark, 1982] Clark, R., and S. Koehler, The UCSD Pascal Handbook. Prentice-Hall, 1982.
[Cooper, 1983] Cooper, D., Standard Pascal. User Reference Manual. Norton, 1983.
[Dijkstra, 1966] Dijkstra, E. W., Structured Programming. Tech. Report, Univ. of Eindhoven, 1966. Also in: Dahl, O.-J., et al., Structured Programming, London: Academic Press, 1972.
[Grosse-Lindemann, 1976] Grosse-Lindemann, C. O., and H. H. Nagel, Postlude to a Pascal-Compiler Bootstrap on a DECSystem-10. Software – Practice and Experience, 6, 29–42 (1976).
[Habermann, 1973] Habermann, A. N., Critical comments on the programming language Pascal. Acta Informatica, 3, 47–57 (1973).
[Hoare, 1972] Hoare, C. A. R., Notes on Data Structuring. In: Dahl, O.-J., et al., Structured Programming, London: Academic Press, 1972.
[Hoare, 1973] Hoare, C. A. R., and N. Wirth, An axiomatic definition of the programming language Pascal. Acta Informatica, 2, 335–355 (1973).
[Hoare, 1980] Hoare, C. A. R., The Emperor's Old Clothes. Comm. ACM, 24, 2, 75–83 (Feb. 1981).
[ISO, 1983] International Organization for Standardization, Specification for Computer Programming Language Pascal. ISO 7185-1982.
[Jensen, 1974] Jensen, K., and N. Wirth, Pascal – User Manual and Report. Springer-Verlag, 1974.
[Jensen, 1991] Jensen, K., and N. Wirth, revised by A. B. Mickel and J. F. Miner, Pascal – User Manual and Report, ISO Pascal Standard. Springer-Verlag, 1991.
[Lecarme, 1975] Lecarme, O., and P. Desjardins, More comments on the programming language Pascal. Acta Informatica, 4, 231–243 (1975).
[Ledgard, 1984] Ledgard, H., The American Pascal Standard. Springer-Verlag, 1984.
[London, 1978] London, R. L., J. V. Guttag, J. J. Horning, B. W. Lampson, J. G. Mitchell, and G. J. Popek, Proof Rules for the Programming Language Euclid. Acta Informatica, 10, 1–26 (1978).
[Marmier, 1975] Marmier, E., Automatic Verification of Pascal Programs. ETH Dissertation No. 5629, Zurich, 1975.
[Mitchell, 1978] Mitchell, J. G., W. Maybury, and R. Sweet, Mesa Language Manual. Xerox PARC Report CSL-78-1 (1978).
[Naur, 1963] Naur, P. (ed.), Revised report on the algorithmic language Algol 60. Comm. ACM, 3, 299–316 (1960), and Comm. ACM, 6, 1–17 (1963).
[Nori, 1981] Nori, K. V., et al., The Pascal P-code compiler: Implementation notes. In: D. W. Barron (ed.), Pascal – The Language and its Implementation. Wiley, 1981.
[Tesler, 1985] Tesler, L., Object Pascal Report. Structured Programming (formerly Structured Language World), 9, 3, 10–14 (1985).
[van Wijngaarden, 1969] van Wijngaarden, A. (ed.), Report on the algorithmic language Algol 68. Numer. Math., 14, 79–218 (1969).
[Welsh, 1972] Welsh, J., and C. Quinn, A Pascal Compiler for ICL 1900 Series Computers. Software – Practice and Experience, 2, 73–77 (1972).
[Welsh, 1977] Welsh, J., W. J. Sneeringer, and C. A. R. Hoare, Ambiguities and Insecurities in Pascal. Software – Practice and Experience, 7, 685–696 (1977). Also in: D. W. Barron (ed.), Pascal – The Language and its Implementation. Wiley, 1981.
[Welsh, 1984] Welsh, J., and D. Bustard, Sequential Program Structures. Prentice-Hall Int'l, 1984.
[Welsh, 1986] Welsh, J., and A. Hay, A Model Implementation of Standard Pascal. Prentice-Hall Int'l, 1986.
[Wichmann, 1983] Wichmann, B., and Ciechanowicz, Pascal Compiler Validation. Wiley, 1983.
[Wirth, 1966] Wirth, N., and C. A. R. Hoare, A Contribution to the Development of ALGOL. Comm. ACM, 9, 6, 413–432 (June 1966).
[Wirth, 1970] Wirth, N., The Programming Language Pascal. Tech. Rep. 1, Fachgruppe Computer-Wissenschaften, ETH, Nov. 1970, and Acta Informatica, 1, 35–63 (1971).
[Wirth, 1971a] Wirth, N., Program Development by Stepwise Refinement. Comm. ACM, 14, 4, 221–227 (April 1971).
[Wirth, 1971b] Wirth, N., The design of a Pascal compiler. Software – Practice and Experience, 1, 309–333 (1971).
[Wirth, 1975] Wirth, N., An assessment of the programming language Pascal. IEEE Trans. on Software Eng., 1, 2, 192–198 (June 1975).
[Wirth, 1981] Wirth, N., Pascal-S: A subset and its implementation. In: D. W. Barron (ed.), Pascal – The Language and its Implementation. Wiley, 1981.
[Wirth, 1982] Wirth, N., Programming in Modula-2. Springer-Verlag, 1982.
https://stackoverflow.com/questions/38286821/what-was-algol-compiler-originally-programmed-in
What was ALGOL compiler originally programmed in?
2016-07-09T22:19:16
I've been looking around and have found little info on what was used to create the ALGOL 58 compiler. Was the compiler created similarly to FORTRAN, in that they used physical cards to create it?
Stack Overflow
I am not sure about the context of the question. Different manufacturers supplied different dialects of Algol. My experience of Algol 60 was firstly with the English Electric KDF9, which had two different but compatible Algol compilers. Firstly the Whetstone compiler/controller system for program development, which had excellent diagnostics but ran rather slowly. Secondly the English Electric compiler, which compiled much more slowly but produced fast run-time code. I believe both these compilers were written in machine code.

In 1970, I joined Burroughs Machines Ltd and worked with the B6500 and its successors. Its Algol compiler was written in Algol and was used to compile itself! It was the nearest thing we had to machine code! Most of the component parts of the support software were written in Algol, e.g. all the other language compilers (Basic, PL/I, Cobol, Fortran). If you object that there is a chicken-and-egg problem here, you are quite correct. Before the hardware of the B6500 was running, there was a B6500 emulator running on the B5500. The Algol compiler for the B6500 could thus be written in B5500 Algol and the resulting code run in the B6500 emulator. The changes from the B5500 Algol language to the B6500 version were not that big, though the machine code was quite different. So changing the source code for the compiler from B5500 to B6500 was not a big job. If you are going to insist on an answer to the question about how the B5500 Algol compiler was produced, I'm afraid I don't have one! It was before my time.

Following up on another-dave's answer, there's a detailed account of the writing of the Algol compiler for the B205 at http://ed-thelen.org/comp-hist/B5000-AlgolRWaychoff.html For a flavor of this, here's an extract from that account about Donald Knuth. "In June we were introduced to this tall college kid that always signed his name with lowercase letters. He was don knuth.
He had contracted with Brad MacKenzie to write a 205 Algol 58 compiler that would be a subset of the 220 Algol 58 compiler that was being written by Jack Merner and Joel Erdwin. don claimed that he could write the compiler and a language manual all by himself during his three and a half month summer vacation. He said that he would do it for $5000. Our Fortran compiler required a card reader, card punch, line printer and automatic floating point. Don said that he would not need the card reader or card punch, but he wanted a magnetic tape unit and paper tape. I asked Gerard Guyod how Brad could have been suckered into paying this college kid $5000 to write something that had to be a piece of junk if he was only going to spend three and a half months on it. Gerard whispered his response to me. He said "We think that he already has it written. He probably did it in his spare time while working in the computer center at Case Institute." I still wasn't entirely satisfied with that answer because I was a college graduate whose first job was for $325 per month and I had just changed jobs and was making $525 per month. Besides that, it was taking mortal human beings 25 man-years to write compilers, not three and a half man-months. I thought that Brad had taken leave of his senses."
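The B5500-to-B6500 bootstrap described in this answer can be sketched in a few lines. This is a hedged illustration of the general cross-compile-then-self-host technique, not Burroughs' actual procedure; the class and function names are my own invention:

```python
# Illustrative sketch only (my own modelling, not from the account above):
# the B5500 -> B6500 chicken-and-egg bootstrap, reduced to the one rule
# that a compiler binary only runs on the machine it targets (real or,
# as with the early B6500, an emulator of that machine).

from dataclasses import dataclass

@dataclass(frozen=True)
class Binary:
    name: str
    target: str  # machine the code runs on

def compile_on(machine: str, compiler: Binary, program: str, out_target: str) -> Binary:
    """Compile `program` with `compiler` on `machine`, emitting code for `out_target`."""
    assert compiler.target == machine, "a compiler binary only runs on its own target machine"
    return Binary(program, out_target)

# Step 0: an existing B5500 Algol compiler runs natively on the B5500.
b5500_algol = Binary("B5500 Algol compiler", "B5500")

# Step 1: the new compiler is written in B5500 Algol but emits B6500 code;
# the old compiler can therefore build it on the B5500.
cross = compile_on("B5500", b5500_algol,
                   "B6500 Algol compiler (source in B5500 Algol)", "B6500")

# Step 2: the resulting B6500 binary runs in the B6500 emulator and can
# compile the lightly ported B6500 Algol source of itself.
self_hosting = compile_on("B6500", cross,
                          "B6500 Algol compiler (source ported to B6500 Algol)", "B6500")

print(self_hosting.target)  # the compiler is now self-hosting on the B6500
```

The pattern generalises: write the new compiler in a language the old machine already compiles, cross-compile it, then port the source and let it compile itself under emulation until the real hardware arrives.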
http://www.cs.man.ac.uk/CCS/res/res50.htm
Computer Resurrection Issue 50
[ "http://www.cs.man.ac.uk/CCS/res/images/res50a.jpg", "http://www.cs.man.ac.uk/CCS/res/images/res50b.jpg", "http://www.cs.man.ac.uk/CCS/res/images/res50c.jpg", "http://www.cs.man.ac.uk/CCS/res/images/res50d.jpg", "http://www.cs.man.ac.uk/CCS/res/images/res50e.jpg", "http://www.cs.man.ac.uk/CCS/res/images/res50f.jpg", "http://www.cs.man.ac.uk/CCS/res/images/res50g.jpg" ]
[]
[]
[ "" ]
null
[]
null
null
Computer RESURRECTION
The Bulletin of the Computer Conservation Society
ISSN 0958-7403
Number 50, Spring 2010

Contents

ALGOL 60 - A Binary Star (Dik Leatherdale)
Society Activity
News Round-Up
Pioneer Profiles - Michael Woodger (David Yates)
Reminiscences of Whetstone ALGOL (Brian Randell)
On the Occasion of the 50th Anniversary of the ALGOL 60 Report (Tony Hoare)
An ALGOL 60 Spiritual User (Luca Cardelli)
The Atlas ALGOL System (Bob Hopgood)
ALGOL 60 - Questions of Naming (Dik Leatherdale)
Forthcoming Events
Committee of the Society
Aims and Objectives

ALGOL 60 - a Binary Star
Dik Leatherdale

ALGOL 60 was, perhaps, the most influential development in programming languages of all time. Other languages have been more successful in themselves. Certainly there are many whose use has been more widespread. But none have had the influence on thinking, none have demonstrated the great leap forward, none have received so much attention as ALGOL 60. The vast majority of programming languages in the last half century have owed a huge debt to ALGOL 60. The ALGOL 60 Report stands as a colossus of its kind. It was, and still is, a thing of beauty - a work of art. So, in tribute to those clear-thinking pioneers of 50 years ago, we dedicate this 50th issue of Resurrection to ALGOL 60. Not, of course, that ALGOL 60 was without fault. Nor, indeed, did it escape criticism. The lack of any defined input/output facilities was the point most frequently made. That led to the creation of many incompatible I/O systems which, in practice, severely restricted program portability. But for me, and with the benefit of 50 years of hindsight, ALGOL's major shortcoming was that it was a paper tape-orientated language in what was soon to become a punched card world. And reserved word-based dialects such as Burroughs' ALGOL were arguably not "strict" ALGOL.
But the fierceness of the arguments, the multiplicity of "improved" ALGOLs - ALGOL-W, Pascal, CPL etc. - only serve to demonstrate the extent to which the basic concept was accepted as essentially correct. Today ALGOL is but rarely used. But it is far from forgotten. It lives on in its progeny - C, C++, Delphi, Ada, Java, C# and so many others. I can surely do no better than to leave the last word to Tony Hoare: "Here is a language so far ahead of its time that it was not only an improvement on its predecessors but also on nearly all its successors."

Society Activity

Changes to the Committee - David Hartley

In December 2009 the CCS Committee reviewed its own membership in the context of the new constitution agreed in October. The membership is largely unchanged, and will be subject to the scrutiny of the next AGM which will be held in September or October later this year. We did, however, lose two long-serving members, namely David Anderson of Portsmouth University, who has been a member since 2003, serving variously as London meetings coordinator and website editor, and David Glover, who has represented The National Archives since 2005. We thank both of them for their past services.

North West Group

After 17 years Ben Gunn has decided to retire as Secretary. Chairman David Hartley paid tribute to Ben's long service to the Society - Ben's record is exceptional. The Society is hugely grateful for his dedication and we all wish him well. Meanwhile Member Gordon Adshead has agreed to take over the running of the Group. We all welcome him and look forward to an equally long and useful association.

Exhibition to mark the Turing Centenary - Simon Lavington

The CCS is working with the Science Museum on an idea for a temporary exhibition in 2012 to mark the centenary of the birth of Alan Turing.
The project is in its early stages but the plan is for a small public outreach exhibition, lasting from June 2012 to June 2013, in the Maths and Computing Gallery at the Science Museum. The exhibition has received initial approval from the Science Museum Executive Group with an indicative budget of between £50,000 and £60,000 and the possibility of additional external sponsorship. All this will require further approval and sign-off from the Executive Group, Director and Trustees as the project develops. The exhibition could be themed round four chronological phases of Alan Turing's involvement with the theory and practice of digital computation. Briefly, these could be:

Theory at Cambridge and Princeton: the Universal Turing Machine.
Bletchley Park and cryptanalysis: a vital interlude.
NPL and the Pilot ACE: hardware and architecture.
The Manchester years: software and applications.

At its meeting on 18th February the CCS Committee set up a small working party under Simon Lavington to progress the project. Technical advice is being sought from a number of researchers whose work has rendered them expert in various aspects of Alan Turing's work towards the development of digital computers and their application. There is clearly a need to identify relevant artefacts. A meeting has been held with the Bletchley Park Trust and areas of collaboration have been identified. The timescale for the project is likely to be as follows. Phase 1 (to be completed by end-November 2010): the CCS and others will finalise the intellectual content of the exhibition, which may include an indication of main themes, list of desired artefacts, simulators, models, etc. Phase 2 (to start December 2010 and be completed before end-June 2011): the Museum's team will organise some form of audience evaluation, to mould the intellectual content into a presentation suitable for public outreach.
Phase 3 (to be completed in time for the opening in June 2012): the Museum's exhibition team undertakes the implementation of the whole project. Anyone who has ideas to contribute is invited to contact Simon Lavington at .

Elliott 803 Project - Peter Onion

Not much to report this time. The 803 itself is continuing to work well. The only problem has been that over Christmas the tape reader seemed to become less reliable and eventually became unusable. After much cleaning and adjustment it was still unreliable, so the decision was made to try and set up one of the several spare readers. The one chosen was in good condition mechanically and initial tests with a tape loop showed all five data channels were producing correct output signals. However the sprocket-hole channel was producing a very low amplitude output pulse. The fault was quickly diagnosed and the cause identified as the output transistor's load resistor, which had become disconnected from the PCB track. The "new" reader has been performing well for the last few weeks. A Calcomp 565 plotter has been retrieved from the TNMOC store and work will begin soon to restore this to operation. The 565 was supplied by Elliotts as an option for the 803. Sadly our 803 doesn't have the optional instruction decoding circuits installed in the processor. Nor do we have the original interface unit to connect the plotter to the processor. However, a successful test has been conducted with an alternative scheme that requires minimal connections to be made to the processor. This will allow the plotter to respond correctly when the 803 executes the appropriate output instructions.

Elliott 401 Project - Chris Burton

The moratorium on accessing the machine continues. Remediation of asbestos-containing materials will take place by Museum conservation staff after similar work to the Pegasus. An alternative to the "firebar" resistors in the power supply will be sought.
Procedures, roles and methods of working will have to be worked out with the Museum management before we will be allowed to operate the machine again.

Pegasus Project - Len Hewitt

After the discovery of asbestos in the Pegasus fuses and an alternator connection box, very little happened as far as the Pegasus Project is concerned. In December the official asbestos reports were released and arrangements were made by the Science Museum for meetings to discuss the return of Pegasus to active service. There were a number of items concerning Health and Safety, asbestos, and Project practices to be discussed. The good news from the CCS point of view is that it is being discussed and the Museum is leading the way forward. Chris Burton and Peter Holland attended the two meetings and I attended one for some time via a telephone conference call. The asbestos problem with the alternator gasket has been dealt with by sealing. The asbestos-bearing fuses will be replaced with modern asbestos-free versions of otherwise identical type. Various procedures have been put forward to the Museum on how the repairs should be completed. Before they can take place the Museum has a number of Health and Safety issues that have been considered and will be implemented. Until that is done, repairs to the damaged wiring cannot take place. From our photographs and circuit investigation and the discovery of three blown fuses, we are hoping that replacement of burnt wires and a damaged tag block should be all that is necessary. There may be other components damaged in the PSU but we are hopeful that the blown fuses should have protected the rest of the circuitry.

Harwell Dekatron Computer Project - Tony Frazer

There was considerable damage to the wiring on racks 4 and 5 (used for the Dekatron stores). The send/receive guide pulse and power connections run the length of the racks to all stores and across to the extension on rack 5.
On both racks, these wires span exposed areas between the bays like the strings of a harp. Many had thus been broken off and others had suffered weakening near the solder joints. All such wiring on rack 5 has been replaced with single-strand PVC-insulated plate wire using the same insulation colour scheme. Fortunately, the laced loom running up the centre U-points appeared undamaged and has been left in place. Similar repairs will be necessary to rack 4. Spare U-points have been located on equipment in D-block which will be used to replace at least two broken units on rack 4. In addition, 38 wires linking racks 4 and 5 which were hard-wired between the two racks had been cut at some point in order to separate racks 4 and 5. This wiring has been replaced on the rack 5 side and two Jones connectors have been attached using existing threaded holes on rack 4. Three of the five store units have been cleaned and repopulated with tested Dekatrons - one store has been populated with GC10B Dekatrons (much brighter than the original GC10As). This has almost exhausted our supply of spare GC10Bs, so we will be looking out for more of these. Three more stores to go and the rack will be fully populated. We have sufficient GC10Bs to populate the accumulator and one option may be to populate half the stores in each of two store boxes with GC10Bs to facilitate demonstration of the machine to visitors (i.e. bright Dekatrons in both send and receive stores). I have also modified a Russian OG-4 Dekatron (these appear to be readily available) as a possible substitute by changing the pinout using a spare octal base as a hard-wired adaptor. Eddie continues to clean and adjust relays - of particular note are relays where spring sets have been deliberately kinked rather than adjusting the travel to make contact. Eddie has completed work on the arithmetic rack and is now working through the store units on rack 5.
Johan has made a start on straightening the casing and internal frame of one of the stores (with all Dekatrons removed!). It had evidently been impacted, causing crumpling of the interior metalwork and possible damage to components. Similar repairs were necessary on two of the other stores.

ICT 1300 Project - Rod Brown

Work continues to build the high-speed interface. This is to allow us to capture the data from the magnetic tape data transfer unit, to complete the software recovery aims of this project. The project was very pleased to be invited to present its story and the history of the 1300 series to the CCS on the 15th of October 2009. This allowed us to continue spreading the word about the 1301 resurrection project, which is why the website at www.ict1301.co.uk was created. All of this activity is running against a backdrop of growing doubt about the longer-term future of the project. There is pressure from Ashford Borough Council to impose and collect non-domestic rates from the building which houses the project instead of it being included in the domestic rates of the farm house. This would place the running costs beyond our current budget! We are fighting this issue, as decisions which are being made at a national level by the Valuation Office Agency (an executive agency of H.M. Revenue and Customs) could affect not only the 1301 project, but set a precedent for any other restored or conserved computers held by any organisation or individual in the UK within garages, sheds, barns etc. The BCS Legal Department have provided us with a supportive statement which has been included in the letter to the Minister responsible for the Valuation Office Agency. Three representatives from the Valuation Office Agency visited the project on 17th February. The decision they reach will determine the final outcome and be used as a precedent for any other non-domestic equipment on which other Councils may seek to levy Rates.
The BBC has visited Flossie the ICT 1301 in its home in deepest Kent and interviewed its custodians. The result was broadcast on 18th February.

Bombe Rebuild Project - John Harper

In Resurrection 49 I said that we had completed all planned construction. With hindsight this statement has turned out to be premature. Soon after Resurrection 49 went to press the team decided that they wanted the extra three German surface fleet wheels in the Typex. This, as many will recall, is the British Typex set up to operate as a German three-wheel Enigma machine. This construction is now well under way and when complete will allow us to decrypt any three-wheel message and demonstrate a complete Intercept-to-Ultra process for any combination of eight wheels. Training of Bombe operators is now our major activity. Recruitment has got off to a good start but we still need more people interested in operating our Bombe at weekends. One thing that had not been anticipated is that many volunteers are only able to be at Bletchley about once a month. Therefore we need more volunteers and need more time from existing team members for training. Having said that, the recruits we have so far are of high calibre and they are learning faster than we had expected. This will reduce the training time for all involved. Training documentation has been produced and is being refined as we go along. This too will all help when it comes to training later volunteers. One thing that is satisfying is that the machine is standing up well to new potential operators. We thought that mistakes might be made by trainees and that the Bombe might suffer but this concern is so far unfounded. The machine continues to do all that is asked of it and virtually all the problems encountered can be put down to human error. We now have a new activity.
Letchworth - First Garden City Heritage Museum is staging a year-long exhibition entitled Letchworth Garden City and the Second World War from 20th March 2010 - 19th March 2011. The British Tabulating Machine Company and the Bombe design and manufacture will figure prominently. As part of this the Bletchley Park Trust is loaning display material and our team is involved in lending one of the Props Bombes produced specifically for the filming of Enigma. Many will recall that this was a full-length film featuring Kate Winslet and other well-known actors. The Props Bombe we have chosen is slightly the worse for wear and needs a good tidying up before it moves to Letchworth. This is already underway. When it arrives we have to refit parts left off during transit. Our website is still at www.jharper.demon.co.uk/bombe1.htm

News Round-Up

-101010101-

It is our sad duty to report the death at the age of 91 of Andrew Booth, one of the UK's computer pioneers. Starting in 1945 at Birkbeck College London, he built a series of small, low-cost machines. His (then) revolutionary ambition was to build a machine cheap enough that every university could afford one! This, as it turned out, satisfied a need which nobody else working in the field had identified, prefiguring the economic rationale of minicomputers by more than a decade. The APEC was adopted by the British Tabulating Machine Company as the basis of its earliest computers, the HEC series, later enhanced and renamed as the ICT 1200 which became the most popular British computer in the late 1950s. Andrew Colin, writing in Resurrection 5, recalled Booth's development of computers at Birkbeck. Former CCS Chairman Roger Johnson's Computer Science at Birkbeck College - A Short History also contains a great deal of interesting material. Tilly Blyth and Roger Johnson took part in a tribute to Booth on BBC Radio 4's Last Word programme on 22nd January.
-101010101-

The recession having hit newspaper advertising revenues, Technology Guardian ceased publication in December after some 27 years. One of the last articles to be published was Jack Schofield's excellent coverage of the visit of Sir Maurice Wilkes to The National Museum of Computing at Bletchley Park, organised by our ever-energetic chairman, David Hartley. See www.guardian.co.uk/technology/2009/nov/18/maurice-wilkes-computing-witch Happily Technology Guardian lives on in an online edition which is hidden away at www.guardian.co.uk/technology. Liberated from its weekly publication cycle, articles are now published as they become available. Schofield has promised to try to add an entry to his popular Ask Jack column every day. Recommended.

-101010101-

The National Museum of Computing and the National Physical Laboratory have combined to create a new gallery at Bletchley Park celebrating the creation of the Internet and recalling the invention of packet switching by Donald Davies. See Martin Campbell-Kelly's profile of Davies in Resurrection 44.

-101010101-

The UK's first ever Vintage Computer Festival will be held in The National Museum of Computing at Bletchley Park on the 19th and 20th June. Enthusiasts and collectors from across Europe are expected to enjoy the many exhibitions, demonstrations, flea market and auction and a number of special talks and presentations. CCS secretary Kevin Murrell describes the event as "The ultimate geek version of the Antiques Roadshow!"

-101010101-

Under Tilly Blyth's direction the Oral History of British Science programme for National Life Stories at the British Library has been conducting interviews with many of Britain's surviving computer pioneers. These life story interviews differ from others as they tend to focus on early experiences and education, as well as the significant advances during a career.
Interviews have started with Geoff Tootill and Raymond Bird, and have been agreed with many more significant pioneers. Unless interviewees request that they remain confidential, the interviews will be made available to the public through the Sound Archive at the British Library. More information about the programme is available at www.bl.uk/historyofscience.

Pioneer Profile: Michael Woodger
David Yates

Anyone working on the development of digital computers as early as 1946, as Mike Woodger was, clearly has impeccable credentials as one of the true pioneers of computing. At first an assistant to Alan Turing, no little distinction in itself, he went on to play important parts in the development and application of the Pilot ACE and ACE computers at the National Physical Laboratory, and later in the methodology, design and documentation of programming languages, especially ALGOL 60 and Ada. Mike was born on 28th March 1923 in Epsom, Surrey, England, the eldest of four children.
His father Joseph Henry Woodger was emeritus professor of biology in the University of London, and one of his particular ambitions was the construction of a foundation of mathematical logic for biology, based on Whitehead and Russell's work on the logical basis of mathematics. He taught the young Mike symbolic logic as soon as he was able to follow it. This unusual early grounding in rigorous scientific thinking, and particularly in formal methods, evidently planted a seed which was to grow and bear fruit a hundredfold.

After graduating in mathematics from University College London in 1943 and working on military applications for the remainder of World War II, Mike joined the newly-established Mathematics Division of the National Physical Laboratory at Teddington in south-west London in May 1946. After a short period of training in numerical analysis, the main subject of the Division's work, he was introduced to Turing's plans for the ACE computer and von Neumann's plans for EDVAC. He then assisted Turing with detailed logical design and experimental hardware, the latter including an abortive attempt to make an acoustic delay line store using the air column in a drainpipe. He remembers Turing as being kind, shy, and encouraging, and how they shared a common interest in logic.

Eventually, after Turing's departure in 1947 and NPL's belated recruitment of staff with experience of pulse electronics, the development of a computer to be known as the Pilot ACE was undertaken in earnest. Mike's main task was writing standard procedures for mathematical operations, which were successfully tested and put to use when Pilot ACE became operational in 1950. He also played a vital part in the application of Pilot ACE to practical problems. In an (unpublished) interview recorded in 1980 he says "I ended up developing an automatic system for handling matrices of reasonable size in the store ... and this was very, very successful".
Indeed it was; his system allowed a whole program of matrix manipulations to be specified for the machine by someone not familiar with the machine language. It included space-saving and error checking facilities, and it was rewritten for the DEUCE and in this form widely used well into the 1960s. This work was important in two ways: it made the machines available to more users more efficiently and so greatly extended the range of applications to which they were put; and also, from a historical perspective, it proved to be one of the earliest practical steps on the road to modern high-level computer languages. In similar vein, the development of the ALGOL 60 language is mentioned in the same interview, but Mike, modest chap that he is, does not stress the importance of his own contribution, which lay particularly in organising and capturing, in a precise written form, the ideas of Peter Naur and his other fellow members on the International ALGOL Committee. Mike is one of the joint authors of the ALGOL 60 report, a milestone in the history of programming languages. At the time the interview was held in July 1980, Mike had started his second period as chairman of IFIP Working Group 2.3, the international forum for discussion of programming methodology. In this work he had stressed the idea that, since a large program is best constructed as a set of modules in such a way that modules at one level use the services of modules at the next lower level and provide a service to modules at the next higher level, the design of programming languages should reflect these same structural concepts. Each level should have its own terminology and not be cluttered with either text or concepts appropriate to other levels. His ideas, which have stood the test of time very well, are explained in more detail in ref. 1. 
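Woodger's principle of semantic levels is easy to restate in modern terms: each layer of a program should speak its own vocabulary, using only the services of the layer below and exposing only its own to the layer above. The following sketch, in present-day Python with invented names (it is not Woodger's example), illustrates the idea:

```python
# Illustrative only: the layered-module idea in modern dress.
# Each level hides the one beneath it: callers of Inventory never
# see dicts or raw storage operations.

class Store:
    """Lowest level: raw key/value storage."""
    def __init__(self):
        self._cells = {}

    def put(self, key, value):
        self._cells[key] = value

    def get(self, key):
        return self._cells.get(key, 0)


class Inventory:
    """Middle level: speaks of items and quantities,
    using only the services of Store."""
    def __init__(self):
        self._store = Store()

    def add(self, item, n):
        self._store.put(item, self._store.get(item) + n)

    def count(self, item):
        return self._store.get(item)


# Top level: application code speaks only Inventory's language.
inv = Inventory()
inv.add("widget", 3)
inv.add("widget", 2)
print(inv.count("widget"))  # 5
```

Each level has its own terminology, as Woodger urged: "widget" and "count" at the top, keys and cells below, and neither vocabulary leaks into the other.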
His next contribution, acting as consultant to a consortium led by CII-Honeywell Bull, was to the design of the language at first called Green, one of the four contenders short-listed in a competition to establish a standard language for use in the US Department of Defense. The language was renamed Ada after its success in the competition in 1979. Mike continued his contribution via Alsys S.A., a French company developing Ada documentation, both before and after his retirement from NPL in 1983. Jean Ichbiah, the head of this company, described him fittingly as "an artist of technical writing". His part-time work for Alsys ended in 1991.

Fortunately for those interested in the history of computing, Mike has throughout his career kept documents and records systematically. When he retired from NPL the historical part of this considerable collection was moved to the library of the Science Museum in South Kensington, London, so that it could be made available to those with appropriate specialist interests.

The computing world has been fortunate that Mike saw the vacancy notice for mathematicians at NPL in 1946. It led him to a career which was to prove ideally matched to his talents: clarity of thought, ability to handle complex abstract concepts and meticulous attention to detail, qualities he has continued to show to the present day.

As published in Communications of the ACM

1. M. Woodger. On semantic levels in programming. In: C. V. Freiman (ed.), Information Processing 71: Proc. IFIP Congress 71, held in Ljubljana, August 1971, North-Holland, 1972, pp. 402-407.

Reminiscences of Whetstone Algol
Brian Randell

The co-author of ALGOL 60 Implementation, always known as Randell and Russell, muses on the origins of Whetstone ALGOL for the KDF9 and its influence.
Whetstone ALGOL was inspired by a week-long school at Brighton Technical College in 1961 at which Edsger Dijkstra and Peter Naur seem to have cast a spell over everybody present (see also Tony Hoare in Resurrection 48). I was hired from college in 1957 by English Electric to program nuclear reactor codes on DEUCE. This was at their Atomic Power Division in Whetstone, just outside Leicester. However, I first spent the summer with IBM at their main London offices, which were in Wigmore Street, behind Selfridges Department Store. At IBM I programmed and operated a 650 computer, at the time IBM’s only computer in the UK. The 650 was proudly installed behind large plate glass windows, where it attracted great attention from the passers-by. After the sleek lines of the 650, with its simple and straightforward programming facilities, and its glossy printed documentation, the DEUCE’s workmanlike cabinetry, its binary-level programming, and its duplicated and stapled typewritten manuals, came as rather a shock to me when I arrived at Whetstone in the autumn of 1957. The DEUCE, English Electric’s first computer, was largely based on Alan Turing’s 1945, i.e. post-Bletchley Park, plans at NPL for the ACE computer. It therefore differed greatly from just about all its contemporaries, both British and American. The ancestry of these other machines could all be traced back more or less directly to the work of von Neumann and his colleagues on the design for EDVAC at the Moore School of the University of Pennsylvania. In particular, DEUCE’s design emphasised computation speed and hardware economy. The assumption was that programmers would be willing and able to take full advantage of the machine’s very low-level form of programming. This involved working in binary, and directly controlling the timing of bringing operands to function units and of initiating the operation of these units - in effect a form of what later came to be known as microprogramming. 
The effect was that, in the hands of an expert programmer, DEUCE could considerably outperform other computers of basically similar hardware speed. I and a fellow maths graduate from Imperial College, Mike Kelly, who joined Whetstone at the same time as me, soon came to regard DEUCE with affection, even as a giant toy. So we did not confine our programming to nuclear reactor codes, but instead, on our own initiative, wrote numerous demonstration programs, and started investigating how we could get DEUCE to help with some of the more detailed clerical aspects of its own programming. DEUCE was in fact not only difficult to program, it was also very difficult to compile for. Most of the then-existing aids to programming DEUCE were interpreters rather than compilers. The effectiveness of these interpreters depended on the cost of the interpretation loop relative to the value of the individual interpreted operations. Thus NPL’s interpreter controlling execution of a set of (efficiently hand-coded) linear algebra routines was very cost effective, and being also convenient was heavily used. At the other end of the spectrum was Alphacode, an interpreter that had been developed for a simple language for floating point scalar arithmetic. This was normally used only for small one-off calculations since programs written in it were typically orders of magnitude slower than the equivalent hand-optimised machine code. The fact that DEUCE, like all the earliest machines, did not have any built-in floating-point arithmetic was very significant. There were subroutines for the various basic floating-point arithmetic operations. But in the interests of efficiency many calculations were laboriously programmed using fixed-point arithmetic. Indeed we programmers regarded the use of floating point almost as cheating, and the coming of hardware floating point as more of a challenge than an opportunity. 
Against this background Mike and I came up with a compromise solution that took advantage of some peculiar DEUCE hardware features in order to produce a very fast interpreter, albeit just for fixed-point computations. (As I recall it, the main loop of our interpreter had only five instructions, whereas more conventional interpreters typically employed loops involving 50 to 100 instructions.) Despite management opposition - and indeed threats of dismissal - we pressed on with our ideas, and mainly in our own time developed a system which we called EASICODE. EASICODE was so successful, particularly among the scientists and engineers who were the main computer users at Whetstone, that by the time the KDF9 came onto the horizon, I was in charge of a small “Automatic Programming Department”. Lawford Russell had joined me, replacing Mike Kelly who had left for pastures new at IBM (where he put his great coding skills to developing the microcode for what became the System 360/40). Moreover, there was full management support for us to exercise our skills and enthusiasms on this exciting new computer. One important aspect of our environment was that we were working in very close proximity to the users of our programs, and had extensive experience of operating our own and other people’s programs on the DEUCE. We therefore had developed strong (self-defensive) views on the need for programs to be robust in the face of input, operator and hardware errors, and for them to provide meaningful feedback to users whenever possible. (Little did I know that such experiences and views would strongly influence most of my subsequent career, not just our plans for KDF9.) KDF9 was a floating point machine, with a set of 16 high speed registers, organised as a small arithmetic stack or pushdown store, together with another set acting as a pushdown store for subroutine return addresses. So it was very different from DEUCE, and indeed all other computers. 
(Superficially it had similarities with the contemporary Burroughs B5000, but in fact it presented a very different and harder challenge to compiler writers than the B5000.) Indeed, as with DEUCE, carefully hand-coded KDF9 programs could be very efficient, but the problem of compiling comparably good code from a high-level language was far from straightforward. Faced with this challenge, and a market where IBM's scientific computers already had an effective optimising compiler for FORTRAN, English Electric's Computer Division at Kidsgrove was planning a very ambitious full optimising compiler for the recently defined ALGOL 60. It was evident to us that this compiler would not be ready until some considerable time after the first KDF9s were due to become operational. And our view was that, though this compiler might produce fast code, the compiler itself was likely to be far from fast. We therefore decided to concentrate on the early provision of a programming facility that would be suitable for repeated use during program development.

Brighton Technical College

Then Lawford Russell and I, and members of the Kidsgrove team, attended a workshop at the Brighton College of Technology in 1961, organised by their Automatic Programming Information Centre. At this meeting Edsger Dijkstra described the then very new ALGOL 60 compiler for the Electrologica X1 computer that he and his colleague Jaap A. Zonneveld had implemented. At this stage Lawford and I had not decided what programming language we should support, and were showing dangerous signs of wanting to create our own. Luckily, one of the Kidsgrove group suggested that we should approach Dijkstra to find out if he would support our basing our work on his compiler. He readily agreed, and - with some difficulty - we got English Electric to send us to Amsterdam for a week to confer with him.
Our week of discussions with Dijkstra was spent not just on learning how the X1 compiler worked, but also on the design of a high speed translator/interpreter of full ALGOL 60 for the KDF9. These discussions we documented in a lengthy report. For the next few years Dijkstra used our report to defend himself from the numerous further requests he was getting from people who wanted to visit him and find out about the X1 compiler. (Having lost my own copy of this report many years ago, it was only recently that I found another one after a long search through numerous libraries and archives, and made it available on the Internet. I have also only just been kindly alerted - by Doaitse Swierstra - to the existence of a report by F.E.J. Kruseman Aretz published in 2003 by the Centrum voor Wiskunde en Informatica, successor to the Mathematical Centre. This report contains the full assembly code text of the X1 compiler together with that of an equivalent Pascal version.)

The X1 compiler was I believe the world's first ALGOL compiler, certainly the first to cope with all the major new challenges of ALGOL 60, such as block structuring, call by name parameters, recursion, etc. (Indeed I recall that some of the German members of the ALGOL Committee, who were struggling to implement the new language that they had voted for, campaigned for the removal of some of these facilities from the language.)

There was at this time very little published literature on the subject of ALGOL compilation. I recall there being just one book on compilers, that by Halstead on compiling NELIAC, a dialect of the ALGOL 58 language. (ALGOL 58 introduced a number of the features that were extended and generalised in ALGOL 60 - for example it allowed nesting of procedures, with corresponding scoping of identifiers, but it did not have block structuring.)
There had, in addition, been a recent growth in publications on formal syntax and syntax analysis (in large part spurred by the ALGOL 60 Report's use of what became known as BNF), and on arithmetic expression translation. But we relied almost entirely on information and advice that we received from Dijkstra.

We set to, and started the detailed design of what became known as the Whetstone ALGOL Translator. We never used the term "WALGOL" ourselves, and avoided the word "compiler" because we were converting ALGOL 60 into an intermediate language, of the form that later became known as P-code, rather than into machine instructions. In effect our intermediate language defined the order code of the machine that we would have liked to have had as our target - an order code that was, I learned some years later, one of the sources of inspiration for the Burroughs 6500 computer, the successor to their B5000.

There was considerable, in general friendly, rivalry between Whetstone and Kidsgrove, but also - thanks largely to Fraser Duncan, to whom the Kidsgrove team reported - good cooperation in ensuring the compatibility of our respective systems. In any arguments over such issues we used the ALGOL 60 Report as a neutral referee, so to speak, and thus both projects stuck pretty closely to the letter of the Report.

We at Whetstone placed considerable emphasis on easing the program development task. One consequence was that we were dissatisfied by Dijkstra's strategy of having his compiler signal the first error that it found and then refuse to carry on any further - rather we worked hard on developing a strategy whereby our translator would process the entirety of even a very error-ridden program. It did this in such a way as to have a very good chance of producing an accurate list of many of the errors contained in the program, without getting confused and reporting lots of alleged errors in perfectly good sections of program text.
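A modern caricature of such an error-listing strategy is what compiler writers now call panic-mode recovery: on meeting a faulty statement, report it, skip ahead to a safe resynchronisation point such as the next semicolon, and carry on. The sketch below - present-day Python, a made-up toy language, and of course nothing like the actual Whetstone code - shows the idea:

```python
# Illustrative only: panic-mode error recovery for a toy language of
# "name := number" statements separated by ';'. One bad statement is
# reported and skipped, so it cannot trigger spurious reports later.

import re

def check_program(source):
    """Return a list of error messages, one per faulty statement."""
    errors = []
    statements = source.split(";")      # ';' is the resynchronisation point
    for i, stmt in enumerate(statements, start=1):
        stmt = stmt.strip()
        if not stmt:
            continue
        # accept only the form: identifier := integer
        if not re.fullmatch(r"[A-Za-z]\w*\s*:=\s*\d+", stmt):
            errors.append(f"statement {i}: cannot parse {stmt!r}")
            continue                    # resume cleanly at the next statement
    return errors

errs = check_program("a := 1; b = 2; c := 3; := 4; d := 5")
for e in errs:
    print(e)
# only statements 2 and 4 are reported; 1, 3 and 5 are accepted
```

The interesting property, as in the Whetstone translator, is that the error list stays accurate: the good statements on either side of a faulty one are still recognised as good.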
I still recall that when Dijkstra wrote to acknowledge the copy of the report that we had sent him describing this scheme he for the first time addressed me as "Brian". In all our extensive previous correspondence he had always addressed me as "Mr Randell". I felt as though I'd just been awarded a Masters degree!

We chose to invent a flow diagram notation to use for the detailed design and documentation of our translator and interpreter. (The idea of documenting our design in ALGOL and using bootstrapping as a means of implementing it, an approach used for NELIAC, was too strange and unfamiliar an idea for us to contemplate at all seriously.) We then hand-coded the flow diagrams in the KDF9's assembly language ("usercode") and used a painfully slow emulator of the KDF9 that had been written for DEUCE to start the process of checking them.

But then the delivery date for the first KDF9 started slipping - just a few weeks at a time, but over and over again. These delays were very disruptive, but in fact we ended up putting them to very good use. Someone suggested that we publish the details of Whetstone ALGOL in book form. We sought Dijkstra's views on this. He was very supportive, and gave me some very good advice, that I've since passed on to many graduate students. This was to the effect that a straightforward description of our system would be of interest to only a very limited readership. However, an account that documented all the possibilities we had considered for each design decision that we had taken, explained why we had made our particular design choices (admitting where necessary that a decision had been arbitrary), and reviewed the merits of these decisions in the light of our subsequent experience, would be of much greater value. So we set out to write such a book, while we awaited the arrival of the KDF9.
We worked on the book very intensively, since we had no idea how long the delays would continue, and we thought (in fact wrongly) that others elsewhere, in particular Peter Naur in Copenhagen and Gerrit van der Mey at the Dr. Neher Laboratorium in the Netherlands, were already busy preparing books on their ALGOL efforts. We completed the book in nine months, mainly in our spare time.

Our inexperience as authors resulted in our providing the publishers, Academic Press, with a much more polished manuscript than they were used to receiving. We were mindful of all the horror stories that Dijkstra had told us of errors that had been introduced into some of his publications by the manual typesetting process - indeed he urged us to insist on direct photographic reproduction of our typescript. We did not follow this advice, but instead had several typewriters modified so as to have the full ALGOL 60 character set, even including the small subscript "10" character. And we also had all numeric digits use an italic font - so that there could be no confusion between the letter "O" and the digit "0", or between the letter "l" and the digit "1". These precautions paid off handsomely.

Somewhere during this process, reacting to an announcement of yet another two-week delay, for fun we started to see what we could produce during this period in the way of a little ALGOL-like system for DEUCE, with one-character identifiers (associated with specific words in a 32-word mercury delay line memory). A small succession of further delays, and consequent extensions of our little toy system, ended up with our having - to our own surprise - produced a useful system. This implemented block structure, simple arithmetic expressions, assignment and for statements, procedures, simple parameters, and input/output. Quite a few people started using it in anger for small one-off calculations. But once the first KDF9 became operational at Kidsgrove we abandoned this diversion.
Meanwhile various colleagues, at Whetstone and elsewhere, started to implement our flow diagrams in order to provide full ALGOL 60 on various other machines, even including DEUCE. In so doing they provided valuable feedback on both the diagrams, and the early drafts of our book. (I recall that the chapter on blocks and display, at the time very novel and difficult to understand concepts, was rewritten at least five times. It ended up as perhaps the clearest part of the book!) As a result, when we at last got access to a more-or-less working KDF9, our implementation of Whetstone ALGOL went very smoothly. (For a while, if we ever had unexpected results when running our system on the first KDF9, it was our habit to try to reproduce them on Kidsgrove’s DEUCE emulator for KDF9. Any time the results differed we could be sure that the machine and/or the emulator were at fault, and we would gleefully hand both back to their developers while they found out which needed to be mended.) Our system was designed to work on a minimum configuration KDF9, with 8k word memory, and to read its input directly from paper tape. Given its intended use as a program development system, we put considerable effort into ensuring that the translator operated in a single pass. And Lawford did such a good job of overlapping translation with input that the translator kept the paper tape reader operating at full speed, and there was no perceptible delay after the end of tape was reached before the translated program started running. Thus users could regard our system as essentially cost-free. (In contrast, to our great scorn, KDF9’s assembler seemed to do most of its work only after reading had finished.) Another ALGOL compiler activity of which we were directly aware was that by Tony Hoare at Elliott Brothers. His compiler, for the Elliott 803, was in fact I believe operational before any of the compilers based on our design. However, it was for a somewhat limited version of ALGOL. 
Thus when Tony published his famous QuickSort algorithm in the Communications of the ACM, he included a version that avoided the use of recursion, since this was not supported by his compiler. I'm afraid that, as soon as our system was operational, we rather mischievously sent in a certification of the original recursive version of QuickSort, whose publication in effect signalled the completion of Whetstone ALGOL.

I have already alluded to the ALGOL compiler designed by Gerrit van der Mey - probably the most impressive of the early ALGOL compiler efforts. This was not just because it was the most complete implementation of the language of which I was aware - it even handled dynamic own arrays and various other unfortunate little complications of the language that just about everybody else avoided tackling. Rather it was because van der Mey, whom I met just once when I was taken to his house by Fraser Duncan, was both totally blind and deaf. Yet despite these handicaps, he had almost singlehandedly designed his compiler, though colleagues at the Dr. Neher Laboratorium helped with such debugging as was needed.

The one other ALGOL compiler I can recall knowing about in those days was an amazing effort by Brian Higman, for an incredibly small machine, an experimental process control computer being developed by G.E.C. whose total memory capacity was 512 20-bit words. Elsewhere in the UK the emphasis was on Autocode rather than ALGOL, but that was a subject with which we had very little contact.

By 1964 the Whetstone System was operational, our book had been published, more compiler projects had been undertaken based on it, and I had been head-hunted by IBM to join the T.J. Watson Research Center in Yorktown Heights. I had agreed to join IBM on condition that I was not asked to continue working on ALGOL compilers - I wanted a change.
They agreed, and moreover agreed to my accepting the invitation that I'd had a long time earlier to join the IFIP Working Group 2.1 on ALGOL, an invitation that English Electric had not let me accept because of the costs of the foreign travel involved. (The Working Group was formed in 1962 as a successor to the original ALGOL Committee.) So I attended the 1964 meeting of the Committee, and the associated IFIP Working Conference on "Formal Language Description Languages", held in Baden, near Vienna. (This conference attempted, not altogether successfully, to bring together people interested in language theory and ones interested in actual programming languages and their compilation.) And thus, over the next few years, I found myself in the midst of the growing controversy over plans for a successor to ALGOL 60, culminating in being one of the small group of committee members who in 1968 wrote a Minority Report on ALGOL 68, and resigned en masse from the Committee. But that is another story.

One final anecdote: when I returned to the UK in 1969, to the Chair of Computing Science at Newcastle University, I found that Whetstone ALGOL was still the mainstay of their first-year programming courses, and so was being used very extensively on their soon-to-be-replaced KDF9. I took care to make it clear that I wanted no part in dealing with any problems that the University might have with the system, only to be told: "We wouldn't let you touch it even if you wanted to!" Similarly, I also can claim no credit at all for the recent splendid effort by David Holdsworth and colleagues that has succeeded in getting the Whetstone ALGOL system working again, starting from a lineprinter listing of its code, on an emulator of the KDF9.

Editor's note: This is a transcript of a talk given by the author at the Science Museum on 14 January 2010.
Brian Randell is Emeritus Professor of Computer Science at the University of Newcastle and was recently made a fellow of the ACM for his outstanding contribution to computer history. He can be contacted at .

CCS Web Site Information

The Society has its own Web site, which is now located at www.computerconservationsociety.org. It contains news items, details of forthcoming events, and also electronic copies of all past issues of Resurrection, in both HTML and PDF formats, which can be downloaded for printing. We also have an FTP site at ftp.cs.man.ac.uk/pub/CCS-Archive, where there is other material for downloading including simulators for historic machines. Please note that the latter URL is case-sensitive.

On the Occasion of the 50th Anniversary of the ALGOL 60 Report
Tony Hoare

"Here is a language [ALGOL 60] so far ahead of its time that it was not only an improvement on its predecessors but also on nearly all its successors. Of particular interest are its introduction of all the main program-structuring concepts, and the simplicity and clarity of its description, rarely equalled and never surpassed."

The quotation is from the appendix to the published version of a keynote talk that I gave at the ACM SIGPLAN Conference in Boston, October 1973. The talk put forward the view that "the primary purpose of a programming language is to help the programmer in the practice of his art... it should give him the greatest assistance in the most difficult aspects of his art, namely program design, documentation, and debugging". The relevant criteria I gave for judging quality of a language included "simplicity, security², and readability". Most of my talk described how several of ALGOL 60's rivals and successor languages pursued alternative criteria.
“...I do not wish to deny that there are many other desirable properties of a programming language: for example, machine independence, stability of specification, use of familiar notations, a large and useful library, existing popularity, or sponsorship by a rich and powerful organisation.” Subsequent history has shown that these desirable properties have been dominant in the choice of programming language by the vast majority of programmers; it was the failure of the ALGOL family of languages to meet these criteria that contributed to its eclipse. Even so, the ideas and the ideals of the ALGOL family of programming languages live on. They are most apparent in newer language designs such as Java and C#, and in the continuing stream of features that are being added to those languages. Even beyond just language design the ALGOL influence stretches to the design of Application Program Interfaces for general-purpose and special-purpose library packages. It extends also to design patterns, which are widely copied (with adaptation) into programs written in existing languages. A design pattern is an embodiment of a software architecture, or an implementation of an abstract programming concept. Effectively, the user of a design pattern is coding by hand a new structure or feature of a more abstract programming language. There remains the objection that extensions to existing languages only build further complex superstructure on foundations that are already too complicated, and inherently insecure. So let us not abandon research to discover a language which combines the simplicity and security of ALGOL 60 with the full range of modern program-structuring principles used in libraries and design patterns: these include resource management, control of responsiveness, object orientation, distribution and concurrency. Such a language would play the same fundamental and unifying role in Computer Science as the framework of natural law plays in other branches of science. 
Even if the language never achieves wide usage, everyone who understands and appreciates it would thereafter be a better programmer, no matter what language they use for coding their designs. Because that, in essence, has been the really enduring achievement of ALGOL 60 itself.

² Security was defined as the property that the effect of running a program (even an incorrect one) is entirely predictable from the specification of the high level language, and does not require knowledge of the underlying machine or operating system or translator.

Editor's note: This is a paper submitted by the author to the seminar on ALGOL 60 at the Science Museum on 14 January 2010. Professor Sir Tony Hoare is Emeritus Professor of Computer Science at Oxford University and is now a principal researcher at Microsoft Research in Cambridge. He can be contacted at .

An ALGOL 60 Spiritual User
Luca Cardelli

Today most respectable programming languages inherit from ALGOL 60. Here we consider a small selection of them and show what ideas have been copied to best effect.

ALGOL 60 was one of the most talked-about programming languages when I was an undergraduate in Pisa in the early '70s, and it clearly had a great influence on my studies. I should make it clear right away, though, that a language with a grand total of three types (real, integer, Boolean) had no conceivable practical use for me! The avant-garde languages of the time were Simula 67 and ALGOL 68. I carefully read the grid-arranged chapters of the ALGOL 68 book in both orders, and the object-ridden Simula 67 book a few times. Although more attracted to ALGOL 68, I ended up using Simula 67 as my main programming language because it was available on my IBM mainframe. Considering the alternatives (Fortran, COBOL, PL/1, and Lisp 1.5) there was really little choice.
Simula 67 had ALGOL 60 as a subset, but I was attracted to it mostly by its rich type system, garbage collection, and string processing, none of which were features of ALGOL 60. ALGOL block structures were put to new uses, as objects, inventing object-oriented programming over 10 years before it became widely popular. There is no doubt that, through Simula, I was raised in the algorithmic programming style of ALGOL 60, and on the Structured Programming approach of Dijkstra and Hoare. To the point that later in life I could never bring myself to read assembly language algorithms (i.e. Knuth’s): that must surely count as a negative influence of ALGOL 60! The other main influence on my education was λ-calculus, and through it Lisp 1.5. I used Lisp extensively too, but I never enjoyed it because, compared to Simula, it had no types, and its supposedly advanced interactive environment was actually quite bizarre. The main connection between λ-calculus and ALGOL 60 goes through Landin: I was familiar with that work too as an undergraduate. ALGOL 60 also deeply influenced my studies of denotational semantics, because some of its more challenging applications were in the semantics of imperative languages (typically ALGOL-like) through Wadsworth’s continuations. After my graduation from Pisa, the influence of ALGOL 60 on me stopped for a while: I moved to Edinburgh and started studying various schools of formal semantics, and functional programming. I found the compilation of functional languages to be an interesting topic, and decided to implement an ML compiler. Ah, but implement it in what language? Surely not in error-prone Lisp, and ML itself was not up to the task yet. I no longer had access to Simula 67, but we had Pascal on the department’s VAX. Hence Pascal became the most ALGOL-like language I ever actually used: it had no objects or garbage collection, but was perfect for the systems programming tasks of the times, including implementing a compiler. 
A bit later, my first paper on ML type-checking was published with an appendix consisting of a Modula-2 version of my Pascal code for the Hindley-Milner algorithm: that was because ALGOL still had such a large influence that the reviewers rejected the “obscure” ML code that I had first written for that appendix. The purpose of the discussion so far is to set the context of my involvement in Modula-3, because that is the only (possibly) ALGOL-like language that I helped design. Functional programming, the main subject of my research, was not up to the task of writing real systems programs. During my first job at Bell Labs, in the original Unix group, I used C extensively. By the time I moved to Digital Equipment Corporation I was glad to return to a more ALGOL-like language. Modula-2+ was a Xerox-PARC inspired extension of Modula-2 that was widely used at DEC Systems Research Centre: it had a strong list of innovations, but was in need of a redesign. The general problem to be solved then (and now) was to carry out systems programming in a well-structured and type-safe language. My main contributions to Modula-3 were in reformulating the type system of Modula-2+, using techniques coming largely from functional programming. An explicit goal of the Modula-3 committee as a whole was to be inspired by both Simula 67 and ALGOL 68 (and Modula-2 of course). For about 10 years, between Simula 67 and Java, Modula-3 became my favourite language, and my favourite working language of all time. I still regard its module system as unsurpassed, and I say this as a user rather than a designer: it made it possible to cleanly structure the large programs I was writing in ways that I have not found in other languages, including modern or functional languages. Which brings us back to ALGOL 60. 
The original spirit of ALGOL 60 was the effort to structure algorithms so that they can be better understood, in contrast, for example, to structuring functions, or equations, or formulas, or machine architectures. That is what has always driven me, as a language user and designer. ALGOL 60 did not have much of a type system, but its direct successors ALGOL 68 and Simula 67 made radical and divergent contributions to structuring algorithms by types. It did not have Simula’s classes or Modula’s modules, but both evolved from program blocks as independent disciplines of programming. ALGOL 60 lacked all the most important features, and yet its spirit inspired them all. ALGOL 60 is my favourite language that I have never used, except in spirit. Editor’s note: This is a paper submitted by the author to the seminar on ALGOL 60 at the Science Museum on 14 January 2010. Luca Cardelli is a Principal Researcher at Microsoft Research Cambridge. He heads the Programming Principles and Tools Group and his main interests are in type theory and operational semantics, mostly for applications to language design and distributed computing. He can be contacted at . Cliff Jones In the absence of their respective authors, both the paper above and that of Tony Hoare were read at the Science Museum by Cliff Jones of Newcastle University. The meeting was held jointly with the Advanced Programming Group of the BCS, with speakers organised by the APG’s John Florentin. The CCS is grateful to everybody who contributed to a fascinating afternoon. Top Previous Next The Atlas ALGOL System Bob Hopgood To overcome the variation between dialects of ALGOL 60, the Chilton Atlas team implemented an ingenious and versatile pre-processor for the language. I joined the programming group at Harwell in 1959 and spent much of the next three years writing quantum chemistry programs in Fortran, initially for the IBM 704 at Aldermaston and later the 7090. 
Source code consisted of about five 2,000-card trays and the relocatable binary about 500 cards. A typical development run was the relocatable binary card set plus the source of any revised subroutines. Initially this was sent by car to Aldermaston or plane to Risley but later an IBM 1401 link was established. In 1962 the UKAEA had 10 times the computer power of the UK academic community as a whole. In 1963 Aldermaston replaced the 7090 with an IBM 7030 (Stretch) and the 7090 moved to Risley. Unfortunately, the Fortran compiler delivered by IBM was very poor and Alick Glennie at Aldermaston assembled a team to produce three Fortran compilers in succession (S1, S2 and S3) for the 7030. S1 was a simple fast-compilation system, S2 had basic-block optimisation and S3 included global optimisation. I worked on the lexical/syntax analysis parts of S2 for the first half of 1963 before joining the new Atlas Computer Laboratory (ACL) at Chilton in September 1963. In 1961 work started on a Fortran compilation system for the Ferranti Atlas called Hartran. This was written in Fortran and ran initially on the 7090 before being moved to Atlas. The role of ACL was to provide additional computer facilities for the universities, Harwell and the Rutherford High Energy Laboratory. It was clear early on that the interest in ALGOL in the universities was sufficiently high that an ALGOL Compiler was needed on the Chilton Atlas that could handle programs originating from the Elliott and English Electric families as well as from Ferranti machines. Alex Bell had joined ACL from Manchester. Alex and I were given the responsibility of producing an ALGOL system on Atlas that would be of use to the UK academic community. 
Initially, the idea was to write yet another ALGOL compiler but after discussions with Robin Kerr (Ferranti, later ICT) and John Clegg (Manchester University) we decided to use the ALGOL compiler that they were writing for Atlas using the Brooker-Morris Compiler-Compiler system and replace the first stage by a flexible pre-processor that would allow the different ALGOL dialects in use to be input and run with their own input/output system. The two computers for which ALGOL programs were most frequently written were the English Electric KDF9 and the Elliott 803. In each case both the input/output facilities provided and the punched form of the program were completely different from the Atlas ALGOL conventions. The 803 used five-hole paper tape, the KDF9 eight-hole, while Atlas used seven-hole! Luckily the tape reader on Atlas could read all three forms of tape. Reading University had an Elliott 803 and we were much indebted to Leonard Dresel, who allowed us to come over and run tests on a Friday afternoon. In return, when the pre-processor was up and running in late 1965, we were able to run an eight-hour shift of Elliott 803 ALGOL programs through Atlas in about five minutes. To get round the lack of separate compilation facilities in the ALGOL language, the Atlas system wrapped each ALGOL program inside an outer block which had its own variable declarations and a set of procedures. This allowed the individual I/O libraries, the ALGOL library and the GROATS graphics system to be selectively added to any ALGOL program. The pre-processor eventually had over 15 different dialects of ALGOL that it could handle including both French and Danish dialects. KDF9 ALGOL programs could run with Elliott I/O and vice versa if needed. We introduced a card-based dialect and attempted to overcome the sheer size of the source programs by having a pseudo-binary format that encoded each ALGOL basic symbol, identifier and number as a single column on a card together with a symbol table. 
This made the size manageable but made editing difficult. So we added an editor to allow the pseudo-binary cards to be changed. Even so, running large ALGOL programs was always more difficult than running Fortran programs. Despite all our efforts, the ALGOL usage on Atlas was never more than about 10% of the workload compared with 65% for Fortran and the remainder spread across machine code, Mercury and Atlas Autocodes etc. Each week, Atlas read a million cards and 30 miles of paper tape! The elegance of the ALGOL language never overcame its basic weaknesses: no defined markup, no standard I/O, no defined libraries, no sub-compilation and no relocatable binary loader. In the end, the bulk of the scientific community stayed with Fortran despite its language deficiencies. Fortran COMMON and EQUIVALENCE statements together with sub-compilation and relocatable binary made it the more attractive option. Editor’s note: Bob Hopgood is Emeritus Professor of Computer Science at Oxford Brookes University. He can be contacted at . Top Previous Next ALGOL 60 - Questions of Naming Dik Leatherdale Take pity upon your poor editor! Contributions to this, the special ALGOL 60 edition of Resurrection, arrive from hither and thither, yet with no sign of any agreement about how to write “ALGOL”. Some write “Algol”, others “ALGOL”. It’s a short form of “Algorithmic Language” of course, but shouldn’t that be “AlgoL”? But nobody writes “AlgoL”. So we refer, of course, to Wikipedia which tells us that The name of the family is sometimes given in mixed case (Algol 60) and sometimes in all uppercase (ALGOL 68). For simplicity, this article uses ALGOL. - a ruling Wikipedia then proceeds to ignore with what we used to call “gay abandon”. The report was originally published in the Communications of the ACM and in the Computer Journal. As you will see on page 13, Comm. ACM uses “ALGOL” which looks rather elegant. Other sources, however, do not. 
In the end I decided to do whatever the Computer Journal did. Our esteemed chairman looked it up for me. So “ALGOL” it is then. But the original name (of ALGOL 58) was the International Algebraic Language - IAL for short. In the hands of Jules Schwartz, this grew into the once-popular JOVIAL - Jules’ Own Version of the International Algebraic Language, reminding us of Donald Knuth’s wise words - The most important thing in the programming language is the name. A language will not succeed without a good name. I have recently invented a very good name and now I am looking for a suitable language. And then again, there is another Algol - the first example to be discovered of a “binary star” - two or more stars orbiting around one another which appear at first sight to be a single celestial object. Once again, we are indebted to Wikipedia to discover that: The name Algol derives from Arabic ra′s al-ghul : head (ra’s) of the ogre (al-ghul) (“the ghoul”) which was given from its position in the constellation Perseus. So there you have it - a programming language, a bon mot and a binary star. So take pity upon your poor editor. This is what he does all day!

Top Previous Next

Forthcoming Events

London Seminar Programme

15 Apr 2010  Software Development at Hursley: The First 35 Years and Beyond  Geoffrey Sharman & Andy Krasun
20 May 2010  Pegasus @ 50  Chris Burton, Len Hewitt & others
16 Sep 2010  Research Machines  John Leighfield
28 Oct 2010  Computers & Communications  John Naughton & Bill Thomson
18 Nov 2010  Konrad Zuse and the Origins of the Computer  Horst Zuse

London meetings take place in the Director’s Suite of the Science Museum, starting at 14:30. The Director’s Suite entrance is in Exhibition Road, next to the exit from the tunnel from South Kensington Station, on the left as you come up the steps. Queries about London meetings should be addressed to Roger Johnson at , or by post to Roger at Birkbeck College, Malet Street, London WC1E 7HX. 
Manchester Seminar Programme Meetings are currently suspended while refurbishment proceeds at the Museum. It is expected that resumption will be in the autumn. North West Group meetings take place in the Conference Room at the Manchester Museum of Science and Industry, usually starting at 17:30; tea is served from 17:00. Queries about Manchester meetings should go to Gordon Adshead at . Details are subject to change. Members wishing to attend any meeting are advised to check the events page on the Society website at www.computerconservationsociety.org for final details which will be published in advance of each event. Details will also be published on the BCS website (in the BCS events calendar) and in the Events Diary columns of Computing and Computer Weekly. Museums MOSI: Demonstrations of the replica Small-Scale Experimental Machine at the Museum of Science and Industry in Manchester have been suspended due to development work in the museum. Resumption is likely to be late next summer. Bletchley Park: daily. Guided tours and exhibitions, price £10.00, or £8.00 for concessions (children under 12, free). Exhibition of wartime code-breaking equipment and procedures, including the replica Bombe and replica Colossus, plus tours of the wartime buildings. Go to www.bletchleypark.org.uk to check details of times and special events. The National Museum of Computing: Thursdays and Saturdays from 13:00. Entry to the Museum is included in the admission price for Bletchley Park. The Museum covers the development of computing from the wartime Colossus computer to the present day and from ICL mainframes to hand-held computers. See www.tnmoc.org for more details. Science Museum: Pegasus “in steam” days have been suspended for the time being. Please refer to the society website for updates. North West Group contact details Chairman Tom Hinchliffe: Tel: 01663 765040. Email: Secretary Gordon Adshead Tel: 01625 549770. 
Email:

Top Previous Next

Committee of the Society

[The printed version carries contact details of committee members]

Chairman: Dr David Hartley FBCS CEng
Secretary & Leader, DEC Project: Kevin Murrell
Treasurer: Dan Hayton
Chairman, North West Group: Tom Hinchliffe
Editor, Resurrection: Dik Leatherdale MBCS
Web Site Editor: Alan Thomson
Archivist: Hamish Carmichael FBCS
Meetings Secretary: Dr Roger Johnson FBCS
Digital Archivist & Leader, Our Computer Heritage Project: Professor Simon Lavington FBCS FIEE CEng
Science Museum representative: Dr Tilly Blyth
MOSI representative: Catherine Rushmore
Codes and Ciphers Heritage Trust representative: Pete Chilvers
Leader, Colossus Project: Tony Sale Hon FBCS
Leader, Elliott 401 Project: Chris Burton CEng FIEE FBCS
Leader, Bombe Rebuild Project: John Harper Hon FBCS CEng MIEE
Leader, Elliott 803 Project: John Sinclair
Leader, Pegasus Project: Len Hewitt MBCS
Leader, Software Conservation Project: Dr Dave Holdsworth CEng Hon FBCS
Leader, 1301 Project: Rod Brown
Leader, Harwell Dekatron Computer Project: Tony Frazer
Professor Martin Campbell-Kelly
Peter Holland
Dr Doron Swade CEng FBCS MBE

Point of Contact

Readers who have general queries to put to the Society should address them to the Secretary: contact details are given elsewhere. Members who move house should notify Kevin Murrell of their new address to ensure that they continue to receive copies of Resurrection. Those who are also members of the BCS should note that the CCS membership is different from the BCS list and is therefore maintained separately.

Top Previous

Aims and objectives

The Computer Conservation Society (CCS) is a co-operative venture between the British Computer Society, the Science Museum of London and the Museum of Science and Industry (MOSI) in Manchester. The CCS was constituted in September 1989 as a Specialist Group of the British Computer Society (BCS). It thus is covered by the Royal Charter and charitable status of the BCS. 
The aims of the CCS are:

To promote the conservation of historic computers and to identify existing computers which may need to be archived in the future
To develop awareness of the importance of historic computers
To develop expertise in the conservation and restoration of historic computers
To represent the interests of Computer Conservation Society members with other bodies
To promote the study of historic computers, their use and the history of the computer industry
To publish information of relevance to these objectives for the information of Computer Conservation Society members and the wider public

Membership is open to anyone interested in computer conservation and the history of computing. The CCS is funded and supported by voluntary subscriptions from members, a grant from the BCS, fees from corporate membership, donations, and by the free use of the facilities of both museums. Some charges may be made for publications and attendance at seminars and conferences. There are a number of active Projects on specific computer restorations and early computer technologies and software. Younger people are especially encouraged to take part in order to achieve skills transfer.
Algol – The Craft of Coding (craftofcoding.wordpress.com) – posts about Algol written by spqr
Algol Quicksort – Algorithm 63, and 64 Quicksort is a well-known sorting algorithm developed by C. A. R. Hoare around 1960, while working for the small British scientific computer manufacturer Elliott Brothers Ltd., and published in 1961 [1, 2]. In his 1980 ACM Turing Award Lecture [3] he describes how a course on Algol 60, taken in early 1961, introduced him to the concept of recursive procedures. The presence of recursion in Algol 60 allowed Hoare to write Quicksort, a concept he had invented while trying to improve upon the Shellsort of Shell [4]. The entire algorithm was described in half a page, comprising two algorithms (written in Algol 60): Algorithm 63, which described the partition function, and Algorithm 64, which performed the actual Quicksort function. One of the greatest benefits of Algol 60 was the fact that it, unlike Fortran, allowed recursion. Algorithm 64 procedure quicksort (A,M,N); value M,N; array A; integer M,N; comment Quicksort is a very fast and convenient method of sorting an array in the random-access store of a computer. The entire contents of the store may be sorted, since no extra space is required. The average number of comparisons made is 2(M-N) ln (N-M), and the average number of exchanges is one sixth this amount. Suitable refinements of this method will be desirable for its implementation on any actual computer; begin integer I,J; if M < N then begin partition(A,M,N,I,J); quicksort(A,M,J); quicksort(A,I,N); end end quicksort Algorithm 63 The original Algol algorithm as published in Communications of the ACM used a series of goto statements to facilitate the partitioning of the array into those items less than the pivot, and those items greater than the pivot. procedure partition (A,M,N,I,J); value M,N; array A; integer M,N,I,J; comment I and J are output variables, and A is the array (with subscript bounds M:N) which is operated upon by this procedure. 
Partition takes the value X of a random element of the array A, and rearranges the values of the elements of the array in such a way that there exist integers I and J with the following properties:
M ≦ J < I ≦ N provided M < N
A[R] ≦ X for M ≦ R ≦ J
A[R] = X for J < R < I
A[R] ≧ X for I ≦ R ≦ N
The procedure uses an integer procedure random(M,N) which chooses equiprobably a random integer F between M and N, and also a procedure exchange, which exchanges the values of its two parameters; begin real X; integer F; F := random(M,N); X := A[F]; I := M; J := N; up: for I := I step 1 until N do if X < A[I] then go to down; I := N; down: for J := J step -1 until M do if A[J] < X then go to change; J := M; change: if I < J then begin exchange(A[I],A[J]); I := I + 1; J := J - 1; go to up end else if I < F then begin exchange(A[I],A[F]); I := I + 1; end else if F < J then begin exchange(A[F],A[J]); J := J - 1; end; end partition

Further reading

Hoare, C.A.R., “Algorithm 64 Quicksort”, Communications of the ACM, 4(7), p.321 (1961)
Hoare, C.A.R., “Algorithm 63 Partition”, Communications of the ACM, 4(7), p.321 (1961)
Hoare, C.A.R., “The Emperor’s Old Clothes”, Communications of the ACM, 24(2), pp.75-83 (1981)
Shell, D.L., “A high-speed sorting procedure”, Communications of the ACM, 2(7), pp.30-32 (1959)

Translating an Algol-60 program to Fortran This post deals with the simple task of translating an Algol-60 program to Fortran, and some simple tricks which can be used. The program in question calculates e to many digits. There are many algorithms for calculating numbers like π and e. Normally e is calculated using its infinite series. These series often offer fast convergence and are easy to calculate; however, the problem is that computers are generally unable to accommodate more than 10-20 significant digits. For generating an accurate value for e, one has to turn to an alternative algorithm. One such algorithm is provided by Sale [1] in Algol-60 (or thereabouts). 
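(Returning briefly to Quicksort: for readers without an Algol system to hand, Algorithms 63 and 64 translate readily into Python. The sketch below is not a literal transcription — it replaces Hoare's goto-driven partition with the equivalent symmetric scan-and-swap loop — but it keeps the randomly chosen pivot of Algorithm 63 and the two recursive calls of Algorithm 64.)

```python
import random

def quicksort(a, m=0, n=None):
    """In-place Quicksort in the spirit of Algorithms 63/64.

    The goto-based control flow of Algorithm 63 is replaced by the
    equivalent scan-and-swap loop; the pivot is still a randomly
    chosen element, as in Hoare's original.
    """
    if n is None:
        n = len(a) - 1
    if m >= n:
        return
    x = a[random.randint(m, n)]   # X := A[F], F random in [m, n]
    i, j = m, n
    while True:
        while a[i] < x:           # scan up for an element >= X
            i += 1
        while x < a[j]:           # scan down for an element <= X
            j -= 1
        if i <= j:                # exchange and step past the pair
            a[i], a[j] = a[j], a[i]
            i += 1
            j -= 1
        if i > j:
            break
    quicksort(a, m, j)            # Algorithm 64: recurse on both parts
    quicksort(a, i, n)
```

As with quicksort(A, M, N), calling quicksort(data) sorts the list in place.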
As there are no Algol-60 compilers out there, it is then a task of converting the program to another language. Here we have chosen Fortran, for reasons which will become clear as we progress through the discussion. The first thing someone will say is that they don’t know Algol-60. While I’m sure that’s true of most people, if you have a basic clue about control structures in programs, and can read a program, deciphering what goes on here should not be that hard. The first thing to do is some reverse engineering. You have to understand what the algorithm embedded in the Algol-60 code does. The first thing you might notice about this code is the fact that it allocates dynamic-type arrays within the code. This was not that unusual for Algol-type languages. Inputs to the procedure are n and d: where n is the number of decimal places to be calculated, and d is the array used to store them in. The word loop is not a keyword but a label for that particular line. The if statement on the next line, if true, uses a goto to “loop” back to loop. In the Fortran this has been replaced by a real do while loop. m := 4; test := (n + 1) × 2.30258509; loop: m := m + 1; if m × (ln(m)-1.0) + 0.5 × ln(6.2831852 × m) ≤ test then go to loop; m = 4 test = (n+1) * 2.30258509 do while (m * (log(m*1.0)-1.0) + 0.5 * log(6.2831852 * m) <= test) m = m + 1 end do You will notice that variables are created when they are needed in the original program. As Fortran does not work in the same manner, these variables will be moved to the declaration block at the top of the subroutine. Here the value of m calculated in the previous code is used to allocate storage for the array coef. Fortran will allow allocate statements anywhere in a program. The loop to set the elements of coef to 1 is replaced with a simple statement in Fortran. The two do loops in Fortran fairly well mimic the for loops in the Algol-60 program. 
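Before assembling the remaining pieces, it may help to see the whole spigot in one place. The following Python sketch mirrors the translation being built up here (same names m, test, coef, carry; the digits are returned as a string rather than stored in the array d). It is offered purely as a cross-check, not as part of the original post.

```python
import math

def e_digits(n):
    """Return e to n decimal places as a string, using Sale's
    factorial-base spigot: coef[j] holds the digit of weight 1/j!,
    each sweep multiplies the fraction by 10, and the carry out of
    place 2 is the next decimal digit."""
    m = 4
    test = (n + 1) * 2.30258509            # (n+1) * ln 10
    while m * (math.log(m) - 1.0) + 0.5 * math.log(6.2831852 * m) <= test:
        m += 1                             # Stirling estimate: stop once m! is big enough
    coef = [1] * (m + 1)                   # coef[2..m] = 1 (e - 2 = 0.111... in factorial base)
    out = []
    for _ in range(n):                     # one sweep per decimal digit
        carry = 0
        for j in range(m, 1, -1):          # do j = m, 2, -1
            temp = coef[j] * 10 + carry
            carry = temp // j
            coef[j] = temp - carry * j
        out.append(str(carry))             # d(i) = carry
    return "2." + "".join(out)
```

For example, e_digits(25) reproduces the 25-digit value shown in the test run at the end of the post.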
begin integer i,j,carry,temp; integer array coef[2:m]; for j:=2 step 1 until m do coef[j]:=1; d[0] := 2; sweep: for i:=1 step 1 until n do begin carry := 0; for j:=m step -1 until 2 do begin temp := coef[j] × 10 + carry; carry := temp ÷ j; coef[j] := temp - carry × j end d[i] := carry end end allocate(coef(2:m)) coef(2:m) = 1 d(0) = 2; do i = 1, n carry = 0 do j = m,2,-1 temp = coef(j) * 10 + carry; carry = temp / j coef(j) = temp - carry * j end do d(i) = carry end do Now we can wrap these code segments inside a subroutine: procedure ecalculation(n,d); value n; integer n; integer array d; begin integer m; real test; comment Add code segments 1 and 2 end of ecalculation; subroutine ecalculation(n,d) integer, intent(in) :: n integer, allocatable, dimension(:), intent(out):: d integer :: m, i, j, carry, temp integer, allocatable, dimension(:) :: coef real :: test allocate(d(0:n)) ! Add code segments 1 and 2 end subroutine ecalculation And finally add this subroutine into a Fortran main program wrapper: program ecalc implicit none integer :: i,num integer, allocatable, dimension(:) :: eArr print*, "Number of digits?" read (*,*) num call ecalculation(num,eArr) print*, "e (actual) = 2.7182818284590452353602874" write(*,fmt="(a15)", advance="no") "e (calc) = 2." do i = 1,num write(*,fmt="(i1)", advance="no") eArr(i) end do write(*,*) contains ! Add subroutine ecalculation end program ecalc Now we can compile the program and run a test: Number of digits? 25 e (actual) = 2.7182818284590452353602874 e (calc) = 2.7182818284590452353602874 Refs: Sale, A.H.J., “The calculation of e to many significant digits”, The Computer Journal, 11(2), pp.229-230 (1968) Wirth’s thoughts on the “new Algol” committee. Niklaus Wirth was a part of the working group that was looking at a successor to Algol-60 – IFIP, International Federation for Information Processing, Working Group 2.1. He would later resign from the committee. 
In 1968 he presented the closing word at the ALGOL Colloquium in Zurich. At the time he considered the world of programming to be in “disorder and disorientation”. He summed up the official IFIP meetings as following a pattern of “disarray and dispute”, followed by “a general feeling of discouragement and despair”. It doesn’t sound like a working group that functioned very well. The idea of specifying a “universal language” led to what Wirth termed a “monster language”; he concluded that the working group had failed and should “dissolve”. His final thought is a comment worth repeating: It has been sitting on an ivory tower, and when it was recognized that the tower was leaning, it was too inflexible to make a decision for further action. But this seems to be the fate of committees. Next week, we will carefully watch the tower of Pisa. Wirth, N., “Closing word at Zurich Colloquium”, ALGOL Bulletin, 29, pp.16-19 (1968) The Wirth Trinity – Design and examples The subject [i.e. computer languages] seemed to consist of 1 percent science and 99 percent sorcery, and this tilt had to be changed. Niklaus Wirth, Turing Award Lecture (1984) Wirth designed programming languages based on a number of design principles. Occam’s Razor – In the early 1960s Wirth was concerned that languages were becoming more complex than necessary. In his thesis “A Generalization of ALGOL”, he wrote of eliminating many of the non-fundamental concepts of programming languages. Lucidity First – Wirth’s language designs are models of lucidity, i.e. clarity. The notation of the syntax of the languages serves as a natural extension of the algorithmic thinking underpinning the language. Principle, not technique – Wirth was completely aware of the difference between elements of a language that are essential, and those that are ephemeral. Pascal was designed to emphasize the idea of structured programming, not obscure concepts that could be added to a language and serve little or no purpose. 
Pedagogy and practice – Pascal was designed for teaching, yet Wirth himself admitted that he does not distinguish between pedagogy and practice: “…I do not believe in using tools and formalisms in teaching that are inadequate for any practical task.”[1] What better way to illustrate the difference between the languages than actually provide some code? In this case the code for the classic Factorial. Modula-2 MODULE Factorial; FROM InOut IMPORT WriteString, WriteInt, WriteLn, ReadInt; VAR i, n, fact : INTEGER; BEGIN WriteString('Factorial n? '); ReadInt(n); fact := 1; FOR i := 2 TO n DO fact := fact * i; END; WriteInt(n, 1); WriteString('! = '); WriteInt(fact, 1); WriteLn; END Factorial. Oberon MODULE Factorial; IMPORT In, Out; VAR i, n, fact : INTEGER; BEGIN Out.String('Factorial n? '); In.Int(n); fact := 1; FOR i := 2 TO n DO fact := fact * i; END; Out.Int(n, 0); Out.String('! = '); Out.Int(fact, 0); Out.Ln; END Factorial. Pascal (FPC) program Factorial; var i, n, fact : integer; begin writeln('Factorial n? '); read(n); fact := 1; for i := 2 to n do fact := fact * i; write(n,'! = ',fact); end. Wirth, N., 1984 ACM A.M. Turing Award lecture. Euler : The beginnings of a journey The first experiment led to a dissertation and the language Euler – a trip with a bush knife through the jungle of language features and facilities. The result was academic elegance, but not much of practical utility – almost an antithesis of the later data-typed and structured programming languages. Niklaus Wirth [4] All journeys have a beginning, and for Niklaus Wirth his journey into the design of programming languages likely began with Euler. Euler was developed as part of Wirth’s PhD dissertation work under the supervision of Helmut Weber at Stanford University. It was a generalization of Algol-60: a language that was supposed to be simpler, yet more flexible, than Algol-60, defined with a rigorous formal definition [1,2]. Euler was experimental in nature but likely the impetus for Wirth to continue in language design. 
Influences? Algol-60 Why was it developed? As a simpler version of Algol-60. Language features The numeric data type number, which could be used for either integer or real. Other data types were boolean, and symbol (string of characters in quotes) One of the first languages to include reference (address of a variable). List processing using the list (a sequence of elements of any type). For control structures, other than procedure calls, there was only an if expression and a goto. A for loop was not provided in Euler, but could be easily programmed as a procedure. Later versions included various other control structures. Used ⟵ as the assignment operator (originally; later changed to :=). Example Programs [3] real procedure HORNER (real array A(*); integer value N; real value X); begin real S; S := 0; for I := N step -1 until 0 do S := S * X + A(I); S end procedure COPY (real array U, V(*,*); integer value A, B); for I := 1 until A do for J := 1 until B do U(I,J) := V(I,J) Refs: Wirth, N., Weber, H., “EULER: A generalization of ALGOL, and its formal definition: Part I”, CACM, 9(1), pp.13-25 (1966) Wirth, N., Weber, H., “EULER: A generalization of ALGOL, and its formal definition: Part II”, CACM, 9(2), pp.89-99 (1966) Algol W Reference Manual (1972) Wirth, N., “1984 ACM A.M. Turing Award Recipient Lecture”, CACM, 28(2), pp.159-164 (1985) Algol-W : A precursor to Pascal Before Pascal, there was a language designed by Niklaus Wirth which heavily influenced Pascal’s design. Algol-W is a language which was based on the proposal for Algol-X, the successor to Algol-60, submitted by Niklaus Wirth [1] to IFIP Working Group 2.1 in August 1965. However, at the committee meeting at Grenoble in October 1965, they decided it was not a great advancement over Algol-60, and dropped it as an official working document. The committee instead chose a more complicated design which was to become Algol-68. 
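(A quick aside on the Euler HORNER example above: it is simply Horner's rule, one multiplication and one addition per coefficient. A hypothetical Python equivalent, with the list a holding the coefficients A(0)..A(N):)

```python
def horner(a, x):
    """Evaluate a[0] + a[1]*x + ... + a[n]*x**n by Horner's rule,
    processing coefficients from highest degree downward, exactly as
    the Euler HORNER procedure does (S := S * X + A(I), I from N to 0)."""
    s = 0
    for coeff in reversed(a):
        s = s * x + coeff
    return s
```

For example, horner([1, 2, 3], 2) evaluates 1 + 2·2 + 3·4 = 17.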
Wirth then collaborated with Tony Hoare to revise the draft [2], and Algol-W was born. The proposal they published in 1966 [2] had a number of goals: it provided an overview of where the ALGOL effort was headed, information which could be used for experimental implementations, and potential further extensions. Algol-W represented a somewhat conservative evolution of Algol-60, in which some concepts were simplified and some extensions were added. The most important extension was likely the introduction of generalized data structures: to supplement the array construct, the record was added. This made Algol-W more powerful than Algol-60 in its ability to handle data structures. The language was first implemented on an IBM 360 using PL/360, Wirth's other early language.

In September 1966, Wirth published a follow-up relating to problems encountered whilst implementing Algol-W [4]. Some of the changes would now be considered syntactic sugar: for example, array subscripting was changed from a[i][j] to a[i,j], strings were made static, and the concatenation operator was eliminated. In addition, the for statement was given greater versatility, allowing for:

for x := A step B until C do
for x := A until C do
for x := A1, A2, A3, ..., An do

Algol-W was used mostly as an instructional language at universities like Stanford for a number of years, and formed the basis for Pascal.

Influences? Algol-60

Why was it developed? As a successor to Algol-60.

What did it actually do for programming?
Introduction of the case statement, first proposed by Hoare.
Introduction of the record as a data structure.
The value-result method of passing parameters.

Features simplified from Algol-60
The switch declaration and designator were abolished, replaced by the case statement.
Labels were simplified, so that they serve only as a link between a goto and its destination.
A goto cannot jump from outside into a conditional, case, or iterative statement.
The own concept was abandoned.
Integer labels were abolished.
The for statement was confined to simple statements (changed after implementation).
Notational simplification of arrays.

Extensions added
Addition of the record data structure.
Addition of complex, bits (bit sequence) and string types.
Operations for bit sequences.
For the types real and complex, a long variant is provided.
A “result parameter” was added to procedures.

Example Programs

Below are three example programs from [3]. The first program is a classic iterative Factorial, where goto's are used to construct the “for” loops.

BEGIN
  COMMENT CALCULATE N FACTORIAL;
  INTEGER N,M,FACT;
START:
  READ(N);
  FACT := 1;
  M := N;
  IF N >= 0 THEN
  BEGIN
LOOP:
    IF N > 0 THEN
    BEGIN
      FACT := FACT*N;
      N := N-1;
      GO TO LOOP
    END
    ELSE
    BEGIN
      WRITE(M,"FACTORIAL =", FACT);
      GO TO START
    END
  END
END.

Calculate the squares and cubes of integers:

BEGIN
  INTEGER SQ, CU;
  FOR X := 3 STEP 2 UNTIL 13 DO
  BEGIN
    SQ := X*X;
    CU := SQ*X;
    WRITE("X =",X, "X**2 =", SQ, "X**3 =", CU)
  END
END.

Read an array of N real numbers and sort them using Bubblesort:

BEGIN
  INTEGER N;
  READ(N);
  BEGIN
    REAL ARRAY X(1::N);
    REAL TEMP;
    LOGICAL INTERCHANGES;
    WRITE("ORIGINAL ARRAY");
    WRITE(" ");
    FOR I := 1 UNTIL N DO
    BEGIN
      READON(X(I));
      WRITE(X(I))
    END;
    COMMENT BEGIN SORT;
    FOR J := N-1 STEP -1 UNTIL 1 DO
    BEGIN
      INTERCHANGES := FALSE;
      FOR I := 1 UNTIL J DO
        IF X(I) > X(I+1) THEN
        BEGIN
          TEMP := X(I);
          X(I) := X(I+1);
          X(I+1) := TEMP;
          INTERCHANGES := TRUE;
        END;
      IF ¬ INTERCHANGES THEN GO TO XIT
    END;
XIT:
    WRITE(" ");
    WRITE("SORTED ARRAY");
    WRITE(" ");
    FOR I := 1 UNTIL N DO
      WRITE(X(I))
  END
END.

In the last program, notice the use of the symbol ¬, one of the mathematical symbols for negation (the others being ~ and !). Why did this never catch on (only a few languages, like PL/I, used it), instead of double symbols like != or ~=, or even Fortran's horrible /= (one of the few pieces of Fortran syntax that really bugs me)?
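The Bubblesort example above uses the LOGICAL flag INTERCHANGES and a GO TO to stop as soon as a pass makes no swaps. The same control structure, sketched in Python for illustration (not part of the original article):

```python
# Bubblesort with early exit, mirroring the Algol-W example:
# the 'interchanges' flag plays the role of the GO TO XIT jump.
def bubblesort(x):
    x = list(x)  # work on a copy
    for j in range(len(x) - 1, 0, -1):  # FOR J := N-1 STEP -1 UNTIL 1
        interchanges = False
        for i in range(j):              # FOR I := 1 UNTIL J
            if x[i] > x[i + 1]:
                x[i], x[i + 1] = x[i + 1], x[i]
                interchanges = True
        if not interchanges:            # IF ¬ INTERCHANGES THEN GO TO XIT
            break
    return x

print(bubblesort([3.2, 1.5, 4.8, 1.1]))
```

The early exit makes the sort linear on already-sorted input, which is the whole point of the flag in the Algol-W version.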
Refs:
1. Wirth, N., “Proposal for a Report on a Successor of ALGOL 60”, MR75, Stichting Mathematical Centre, Amsterdam (1965)
2. Wirth, N., Hoare, C.A.R., “A contribution to the development of ALGOL”, CACM, 9, pp.413-431 (1966)
3. Bauer, H.R., Introduction to ALGOL W Programming, Computer Science Dept., Stanford University (1969)
4. Wirth, N., “Additional Notes on – Contribution to the development of ALGOL”, ALGOL Bulletin, 24, pp.13-17 (1966)

The Wirth Trinity – Oberon

On another leave at Xerox PARC (1984/85), Wirth set himself the goal of developing a modern multitasking OS. Together with J. Gutknecht, he embarked on this project in 1987, intending to implement it in Modula-2. They needed a simpler language, and so Oberon was born – the last in the line of “Algol-like” languages. It appeared with a few additions (e.g. OO support) and a number of subtractions (discarded Modula-2 features). Oberon-2 appeared in 1991, and became the most commonly used version of the language.

Influences? Oberon: Modula-2. Oberon-2: Oberon, Pascal, Modula-2.

Why was it developed? Oberon was born of Modula-2, adding support for OO. The principal design guideline was to concentrate on features that are basic and essential, and to omit ephemeral issues [6]. Wirth said that the design of Oberon was driven by “the recognition of the cancerous growth of complexity in languages that had recently emerged, such as C, C++ and Ada, which appeared even less suitable for teaching than for engineering in industry.” [6] He lumped Modula-2 into this category as well, referring to it as “overly bulky, containing features that we had rarely used”.

What did it actually do for programming? Traditional programming is supported, as well as OO-like constructs.

Design considerations
The idea behind Oberon was a language simpler than its predecessors.

Language features “dumped” from Modula-2
It eliminated many superfluous Modula-2 features, e.g.
the FOR loop, enumeration types, the WITH statement, and non-integer array indices [1,4]. In more detail:
Variant records (mostly misused, and problematic for automated garbage collection).
Enumeration types (they defy extensibility across module boundaries).
Subrange types (added compiler complexity; used almost exclusively for indexing arrays).
SET types (replaced by a single predefined type SET, with elements 0-31).
The type CARDINAL (with 32-bit integers, the need for unsigned arithmetic was reduced).
Unqualified import (every occurrence of an imported identifier must now be preceded by the defining module's name).
Nested modules.
The WITH statement (explicit qualification of field identifiers is preferred).
The FOR loop (omitted from Oberon, but reintroduced in Oberon-2).
Low-level facilities: ADDRESS, WORD.

Language features added
Only a few new features were added, many of which had a hint of OO [4]:
Addition of SHORTINT, LONGINT, and LONGREAL.
Type extension (inheritance) – the construction of new types based on existing types.
Type inclusion, which binds all the arithmetic types together (in Modula-2, the numeric types are incompatible): LONGREAL ⊇ REAL ⊇ LONGINT ⊇ INTEGER ⊇ SHORTINT
POINTER types restricted to record or array types.

Language Genealogy

Refs:
1. Wirth, N., “The development of procedural programming languages – personal contributions and perspectives”, in Modular Programming Languages, JMLC 2000, LNCS, V.1897 (2000)
2. Wirth, N., “The Programming Language Oberon”, Software: Practice and Experience, 18(7), pp.671-690 (1988)
3. Mössenböck, H., Wirth, N., “The Programming Language Oberon-2” (1993)
4. Wirth, N., “From Modula to Oberon” (1990)
5. Wirth, N., “Programming in Oberon” (1992)
6. Wirth, N., “Modula-2 and Oberon”, in ACM Conf. on History of Programming Languages, pp.3-1-3-10 (2007)
7. Wirth, N., “Modula: A language for modular multiprogramming”, Software: Practice and Experience, 7, pp.37-65 (1977)

The Wirth Trinity – Modula-2

Pascal had become quite popular in the 1970s.
This was in part attributable to the fact that it was a relatively easy language to learn. It was not as fast a language as, say, Fortran, but it did incorporate the notion of structured programming, so that by the late 1970s programmers had learned to code without using the dreaded goto statement. Wirth realized that Pascal had its shortcomings [1], and considered Pascal's I/O to be “inadequate and inflexible”. A new language was to be designed; instead of being called Pascal-2, it was called Modula-2. To experiment with multiprogramming primitives, Wirth had earlier contrived a rudimentary language called Modula [7]; Modula was never intended to be a language like Pascal. In 1976 Wirth spent a year at the Xerox Palo Alto Research Center (PARC). Here he learned about hardware design, and on returning to Switzerland he began work on what was to become Lilith, a personal workstation [3]. Wirth had some fixed constraints for the system: single-user, single processor, and all software written in a single language [1]. As to the language, Pascal was not capable, nor was Modula, at least not by themselves. The solution? Modula-2, a conglomeration of the bloodlines of Pascal, Modula, and Mesa (a language being developed at Xerox PARC). The Lilith was marketed as “The Modula Computer”, and sold for US$22,750.

Modula-2 was based on the concept of the module, which allowed for both high-level abstraction and low-level facilities. The language was defined in 1978 and implemented on a PDP-11, and the first Modula-2 compiler was released in 1980. One of the reasons Modula-2 didn't succeed was the lack of a good free compiler (there is one now, but only for Windows).

Influences? Modula-2 was influenced by Pascal, Modula, and Mesa (a Pascal offspring).

Why was it developed? To alleviate the shortcomings of Pascal, and to design a language more apt for the time. Whereas Pascal was designed more for teaching, Modula-2 was designed more as a systems language.

What did it actually do for programming?
Modules. Derived from the notion of abstract data types, and incorporating information hiding, the module built on the concepts of Mesa. It allowed things like I/O to be removed from the language proper and encapsulated in modules forming a “standard library”. The module structure isolates its contents from the surrounding program, and modules can be separated into a definition part and an implementation part. All communication with other modules occurs through imported and exported identifiers. The module can also be regarded as a representation of the concept of an “abstract data type” postulated by Liskov in 1974 [10].

Language features “dumped” from Pascal
Variant records.
Built-in I/O – moved to library modules.
The goto statement.
Packing of data – Pascal allowed data in record and array structures to be packed.

Improvements over Pascal (i.e. what was added/modified)
Source code is case-sensitive; reserved words are in UPPERCASE.
Open arrays.
The PROCEDURE type.
Flexible declarations: types, variables, and procedures can be mixed together, as opposed to Pascal's strict const, type, var ordering.
CASE has an ELSE for matching unspecified values, and also permits subranges.
Boolean expressions are evaluated conditionally (short-circuit evaluation).
I/O is relegated to library modules to avoid system dependencies.
Readability was enhanced through the use of control-structure terminators: REPEAT–UNTIL, and IF/WHILE/FOR – END. This eliminates the begin-end block construct.
The FOR statement is augmented by the BY clause; Pascal's downto clause is missing.
The type CARDINAL allows for unsigned (positive) integers.
Two standard procedures for incrementing and decrementing: INC and DEC.
LONG identifiers.
Better control transfers: the statements RETURN and EXIT are used to transfer control out of procedures and looping structures, and HALT is used to terminate a program.
No syntactic ambiguities in decisions, e.g. no dangling ELSE; IF statements always require an END.
A new looping statement, LOOP, providing an infinite loop.
Standard I/O functions, dynamic storage allocation, and files via library modules.

Language deficiencies
No standard procedures for I/O and storage allocation (some found this quite onerous).
For a list of ambiguities and insecurities, the interested reader is referred to [8], [9], and [12].

Language Genealogy

Refs:
1. Wirth, N., “The development of procedural programming languages – personal contributions and perspectives”, in Modular Programming Languages, JMLC 2000, LNCS, V.1897 (2000)
2. Wirth, N., “History and Goals of Modula-2”, BYTE, pp.145-152 (Aug. 1984)
3. Ohran, R., “Lilith and Modula-2”, BYTE, pp.181-192 (Aug. 1984)
4. Paul, R.J., “An Introduction to Modula-2”, BYTE, pp.195-210 (Aug. 1984)
5. Coar, D., “Pascal, Ada, and Modula-2”, BYTE, pp.215-232 (Aug. 1984)
6. Gutknecht, J., “Tutorial on Modula-2”, BYTE, pp.157-176 (Aug. 1984)
7. Wirth, N., “Modula: A language for modular multiprogramming”, Software: Practice and Experience, 7, pp.37-65 (1977)
8. Spector, D., “Ambiguities and insecurities in Modula-2”, ACM SIGPLAN Notices, 17(8), pp.43-51 (1982)
9. Torbett, M.A., “More ambiguities and insecurities in Modula-2”, ACM SIGPLAN Notices, 22(5), pp.11-17 (1987)
10. Liskov, B., Zilles, S., “Programming with abstract data types”, ACM SIGPLAN Notices, 9(4), pp.50-59 (1974)
11. Collins, S., “Comparing Modula-2 with Pascal and Ada”, Data Processing, 26(10), pp.32-34 (1984)
12. Cornelius, B.J., “Problems with the language Modula-2”, Software: Practice and Experience, 18(6), pp.529-543 (1988)

The Wirth Trinity – Pascal

Algol was likely only minimally successful from the point of view of being used extensively in industry, but it was used in academic environments. There were many attempts to extend its applicability, which led to Algol 68 and Algol W. However, the complexity of Algol 68 ultimately led to that variant's demise.
PL/I was, in effect, an attempt to create a Fortran VI by combining features from Algol, Fortran, and COBOL, but the product was once again an extremely large language. Both Algol 68 and PL/I exemplify the “Swiss army knife” approach to language design – providing every conceivable feature. What was required was a smaller, more compact language – enter Pascal.

Wirth began the design of Pascal (named after the French mathematician Blaise Pascal) in 1968, targeting the CDC 6000 series of mainframes; the first compiler appeared in 1970, and was eventually written entirely in Pascal itself. The language was tweaked in 1972, and became an ISO standard in the early 1980s. In describing Pascal, Wirth remarked that “the guiding idea in the layout of the syntax of Pascal was simplicity, due to the recognition that structures difficult to process by compilers are also difficult to master by human readers and writers”.

The highlights of Pascal [1] were:
Simple control structures for decisions and repetition.
Scalar data types: boolean, integer, real, char, and enumerations.
The ability to construct complex data structures using records, arrays, and sets.
Strict static typing – every constant, variable, function, and parameter has a type.
Dynamic data structures built with the use of pointers.
Recursive procedures.

Influences? Pascal was heavily influenced by Algol, and is often called Algol-like.

Why was Pascal developed? Pascal was born out of what Wirth terms “an act of liberation”: liberation from the prospect of using Algol or Fortran as languages to teach programming, and liberation from the design constraints imposed by committee work.

What did it actually do for programming?
Pascal was one of the first languages built from the ground up around the notion of structured programming.
With no commercial backing, Pascal succeeded on its own merits, and was implemented on systems ranging from Cray supercomputers to personal computers.
Programmers who felt “straitjacketed” writing programs in BASIC flocked to Pascal.
It was an ideal language for teaching programming.
Pascal was drafted as the basis for the DOD's Ada project.
It introduced records into scientific languages (although Algol-W really did this first).
It introduced a usable case statement.

Design considerations
The general idea dominating the design of Pascal was to provide a language appealing to systematic thinking, mirroring conventional mathematical notation, satisfying the needs of practical programming, and encouraging a structured approach. It should be simple, able to handle non-numeric data, suitable for teaching programming, and have the compile-time and runtime efficiency of Fortran.

Language features
Records, and variant records.
Algol-60 had blocks (local declarations + statements) and compound statements (statements only), whereas Pascal eliminated the block.
More, yet simpler, control structures than Algol-60.
Use of a real assignment operator, :=
Strong type safety.
Case insensitivity.
Native set operators.

Language deficiencies
Keeping the goto statement.
Syntactic ambiguities inherited from Algol – the lack of explicit closing symbols for nestable constructs, e.g. the dangling else.
The inability to support separate compilation of modules, which hindered the development of large programs.
The flawed case statement, which lacked an else clause.
The fixed size of arrays, which precluded the use of general math and string libraries.
Fortran and COBOL programmers felt handcuffed by Pascal's compulsory declaration of variables.
No exponentiation operator.

Criticisms of the language
Lack of block structures.
No dynamic arrays.
Lack of a “second” form of conditional (an inline if).
Labels and the goto statement.
The “unnatural” unification of subranges, types, and structures.
The difference between procedure and function is marginal.
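Pascal's native set operators, listed above, were typically compiled down to bitwise operations on machine words. A rough sketch of that representation (in Python; the names here are illustrative, not from any Pascal runtime):

```python
# Pascal-style sets over a small ordinal range (0..31 here),
# stored as the bits of a single integer, as most compilers did.
def make_set(*elems):
    s = 0
    for e in elems:
        assert 0 <= e <= 31
        s |= 1 << e
    return s

def set_union(a, b):         # Pascal: a + b
    return a | b

def set_intersection(a, b):  # Pascal: a * b
    return a & b

def member(e, s):            # Pascal: e in s
    return bool(s & (1 << e))

digits = make_set(0, 1, 2, 3)
odds = make_set(1, 3, 5, 7)
print(member(3, set_intersection(digits, odds)))  # True
```

This is why Pascal set types were usually limited to a small base range: each set had to fit in one or a few machine words, so the operators cost a single instruction each.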
Refs:
1. Wirth, N., “The development of procedural programming languages – personal contributions and perspectives”, in Modular Programming Languages, JMLC 2000, LNCS, V.1897 (2000)

Algorithm 35: The Sieve (in Algol-68)

The Sieve of Eratosthenes likely first appeared in computing in 1961 as Algorithm 35 in Communications of the ACM (p.151). It, like many other algorithms of the time, was shown in Algol-60. To implement it as closely as possible to the original syntax, given that there are few if any Algol-60 compilers out there, I chose to reproduce the algorithm in its entirety in Algol-68 using the Algol 68 Genie (a68g) implementation. It wasn't a lot of fun.

PROC sieve = (INT nmax, REF INT j) []INT: (
  [nmax]INT primes;
  INT i;
  primes[1] := 1;
  primes[2] := 2;
  primes[3] := 3;
  j := 3;
  FOR n FROM 3 BY 2 TO nmax DO
    i := 3;
l1: IF primes[i] <= sqrt(n) THEN GOTO a1 ELSE GOTO a2 FI;
a1: IF n/primes[i] = n%primes[i] THEN GOTO b1 ELSE GOTO b2 FI;
b2: i := i + 1;
    GOTO l1;
a2: primes[j] := n;
    j := j + 1;
b1: print("")
  OD;
  j := j - 1;
  primes
);

main: (
  INT n, numpr;
  print("Upper limit to calculate sieve? ");
  read(n);
  [n]INT primes;
  primes := sieve(n,numpr);
  FOR i FROM 1 TO numpr DO
    print(whole(primes[i],5));
    IF i MOD 10 = 0 THEN print(new line) FI
  OD
)

It was an interesting experience, partly because, while the documentation for a68g is okay, it is filled with the same obtuse stuff as the original Algol-68 reports. There is a lot to this language, and finding simple things, like how to pass a variable back from a subprogram, is a challenge. There are things I inherently like, such as the ability to specify the size of an array created within a function (PROC looks like a procedure, but behaves more like a function). I also like that assignment is :=, and equality is =. And it's nice to use a language where it is easy to return an array from a function. Here is the program executed with the value of n set to 1000.

Upper limit to calculate sieve? 
1000 +1 +2 +3 +5 +7 +11 +13 +17 +19 +23 +29 +31 +37 +41 +43 +47 +53 +59 +61 +67 +71 +73 +79 +83 +89 +97 +101 +103 +107 +109 +113 +127 +131 +137 +139 +149 +151 +157 +163 +167 +173 +179 +181 +191 +193 +197 +199 +211 +223 +227 +229 +233 +239 +241 +251 +257 +263 +269 +271 +277 +281 +283 +293 +307 +311 +313 +317 +331 +337 +347 +349 +353 +359 +367 +373 +379 +383 +389 +397 +401 +409 +419 +421 +431 +433 +439 +443 +449 +457 +461 +463 +467 +479 +487 +491 +499 +503 +509 +521 +523 +541 +547 +557 +563 +569 +571 +577 +587 +593 +599 +601 +607 +613 +617 +619 +631 +641 +643 +647 +653 +659 +661 +673 +677 +683 +691 +701 +709 +719 +727 +733 +739 +743 +751 +757 +761 +769 +773 +787 +797 +809 +811 +821 +823 +827 +829 +839 +853 +857 +859 +863 +877 +881 +883 +887 +907 +911 +919 +929 +937 +941 +947 +953 +967 +971 +977 +983 +991 +997
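For contrast with the goto-laden Algol-68 version, here is a structured rendering of the same odd-only trial-division scheme in Python (a sketch of the idea, not a literal translation; note that it stores 1 in the table, as the original program does):

```python
import math

# Store 1, 2, 3, then test each odd n by dividing by the stored
# primes up to sqrt(n), as the Algol-68 program above does.
def sieve(nmax):
    primes = [1, 2, 3]
    for n in range(5, nmax + 1, 2):
        i = 2  # index of the first odd prime, 3
        while primes[i] <= math.isqrt(n):
            if n % primes[i] == 0:
                break  # composite: a stored prime divides n
            i += 1
        else:
            primes.append(n)  # no divisor found up to sqrt(n)
    return primes

print(sieve(50))
```

With the labels and GOTOs replaced by a while/else and a break, the control flow of the original (test, divide, advance, record) becomes much easier to follow.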
E.W.Dijkstra Archive: The Humble Programmer (EWD 340)
https://www.cs.utexas.edu/~EWD/transcriptions/EWD03xx/EWD340.html
The Humble Programmer by Edsger W. Dijkstra As a result of a long sequence of coincidences I entered the programming profession officially on the first spring morning of 1952 and as far as I have been able to trace, I was the first Dutchman to do so in my country. In retrospect the most amazing thing was the slowness with which, at least in my part of the world, the programming profession emerged, a slowness which is now hard to believe. But I am grateful for two vivid recollections from that period that establish that slowness beyond any doubt. After having programmed for some three years, I had a discussion with A. van Wijngaarden, who was then my boss at the Mathematical Centre in Amsterdam, a discussion for which I shall remain grateful to him as long as I live. The point was that I was supposed to study theoretical physics at the University of Leiden simultaneously, and as I found the two activities harder and harder to combine, I had to make up my mind, either to stop programming and become a real, respectable theoretical physicist, or to carry my study of physics to a formal completion only, with a minimum of effort, and to become....., yes what? A programmer? But was that a respectable profession? For after all, what was programming? Where was the sound body of knowledge that could support it as an intellectually respectable discipline? I remember quite vividly how I envied my hardware colleagues, who, when asked about their professional competence, could at least point out that they knew everything about vacuum tubes, amplifiers and the rest, whereas I felt that, when faced with that question, I would stand empty-handed. Full of misgivings I knocked on van Wijngaarden’s office door, asking him whether I could “speak to him for a moment”; when I left his office a number of hours later, I was another person. 
For after having listened to my problems patiently, he agreed that up till that moment there was not much of a programming discipline, but then he went on to explain quietly that automatic computers were here to stay, that we were just at the beginning and could not I be one of the persons called to make programming a respectable discipline in the years to come? This was a turning point in my life and I completed my study of physics formally as quickly as I could. One moral of the above story is, of course, that we must be very careful when we give advice to younger people; sometimes they follow it! Another two years later, in 1957, I married and Dutch marriage rites require you to state your profession and I stated that I was a programmer. But the municipal authorities of the town of Amsterdam did not accept it on the grounds that there was no such profession. And, believe it or not, but under the heading “profession” my marriage act shows the ridiculous entry “theoretical physicist”! So much for the slowness with which I saw the programming profession emerge in my own country. Since then I have seen more of the world, and it is my general impression that in other countries, apart from a possible shift of dates, the growth pattern has been very much the same. Let me try to capture the situation in those old days in a little bit more detail, in the hope of getting a better understanding of the situation today. While we pursue our analysis, we shall see how many common misunderstandings about the true nature of the programming task can be traced back to that now distant past. The first automatic electronic computers were all unique, single-copy machines and they were all to be found in an environment with the exciting flavour of an experimental laboratory. 
Once the vision of the automatic computer was there, its realisation was a tremendous challenge to the electronic technology then available, and one thing is certain: we cannot deny the courage of the groups that decided to try and build such a fantastic piece of equipment. For fantastic pieces of equipment they were: in retrospect one can only wonder that those first machines worked at all, at least sometimes. The overwhelming problem was to get and keep the machine in working order. The preoccupation with the physical aspects of automatic computing is still reflected in the names of the older scientific societies in the field, such as the Association for Computing Machinery or the British Computer Society, names in which explicit reference is made to the physical equipment. What about the poor programmer? Well, to tell the honest truth: he was hardly noticed. For one thing, the first machines were so bulky that you could hardly move them and besides that, they required such extensive maintenance that it was quite natural that the place where people tried to use the machine was the same laboratory where the machine had been developed. Secondly, his somewhat invisible work was without any glamour: you could show the machine to visitors and that was several orders of magnitude more spectacular than some sheets of coding. But most important of all, the programmer himself had a very modest view of his own work: his work derived all its significance from the existence of that wonderful machine. Because that was a unique machine, he knew only too well that his programs had only local significance and also, because it was patently obvious that this machine would have a limited lifetime, he knew that very little of his work would have a lasting value. 
Finally, there is yet another circumstance that had a profound influence on the programmer’s attitude to his work: on the one hand, besides being unreliable, his machine was usually too slow and its memory was usually too small, i.e. he was faced with a pinching shoe, while on the other hand its usually somewhat queer order code would cater for the most unexpected constructions. And in those days many a clever programmer derived an immense intellectual satisfaction from the cunning tricks by means of which he contrived to squeeze the impossible into the constraints of his equipment. Two opinions about programming date from those days. I mention them now, I shall return to them later. The one opinion was that a really competent programmer should be puzzle-minded and very fond of clever tricks; the other opinion was that programming was nothing more than optimizing the efficiency of the computational process, in one direction or the other. The latter opinion was the result of the frequent circumstance that, indeed, the available equipment was a painfully pinching shoe, and in those days one often encountered the naive expectation that, once more powerful machines were available, programming would no longer be a problem, for then the struggle to push the machine to its limits would no longer be necessary and that was all what programming was about, wasn’t it? But in the next decades something completely different happened: more powerful machines became available, not just an order of magnitude more powerful, even several orders of magnitude more powerful. But instead of finding ourselves in the state of eternal bliss of all programming problems solved, we found ourselves up to our necks in the software crisis! How come? There is a minor cause: in one or two respects modern machinery is basically more difficult to handle than the old machinery. 
Firstly, we have got the I/O interrupts, occurring at unpredictable and irreproducible moments; compared with the old sequential machine that pretended to be a fully deterministic automaton, this has been a dramatic change and many a systems programmer’s grey hair bears witness to the fact that we should not talk lightly about the logical problems created by that feature. Secondly, we have got machines equipped with multi-level stores, presenting us problems of management strategy that, in spite of the extensive literature on the subject, still remain rather elusive. So much for the added complication due to structural changes of the actual machines. But I called this a minor cause; the major cause is... that the machines have become several orders of magnitude more powerful! To put it quite bluntly: as long as there were no machines, programming was no problem at all; when we had a few weak computers, programming became a mild problem, and now we have gigantic computers, programming had become an equally gigantic problem. In this sense the electronic industry has not solved a single problem, it has only created them, it has created the problem of using its products. To put it in another way: as the power of available machines grew by a factor of more than a thousand, society’s ambition to apply these machines grew in proportion, and it was the poor programmer who found his job in this exploded field of tension between ends and means. The increased power of the hardware, together with the perhaps even more dramatic increase in its reliability, made solutions feasible that the programmer had not dared to dream about a few years before. And now, a few years later, he had to dream about them and, even worse, he had to transform such dreams into reality! Is it a wonder that we found ourselves in a software crisis? 
No, certainly not, and as you may guess, it was even predicted well in advance; but the trouble with minor prophets, of course, is that it is only five years later that you really know that they had been right. Then, in the mid-sixties, something terrible happened: the computers of the so-called third generation made their appearance. The official literature tells us that their price/performance ratio has been one of the major design objectives. But if you take as “performance” the duty cycle of the machine’s various components, little will prevent you from ending up with a design in which the major part of your performance goal is reached by internal housekeeping activities of doubtful necessity. And if your definition of price is the price to be paid for the hardware, little will prevent you from ending up with a design that is terribly hard to program for: for instance the order code might be such as to enforce, either upon the programmer or upon the system, early binding decisions presenting conflicts that really cannot be resolved. And to a large extent these unpleasant possibilities seem to have become reality. When these machines were announced and their functional specifications became known, quite a few among us must have become quite miserable; at least I was. It was only reasonable to expect that such machines would flood the computing community, and it was therefore all the more important that their design should be as sound as possible. But the design embodied such serious flaws that I felt that with a single stroke the progress of computing science had been retarded by at least ten years: it was then that I had the blackest week in the whole of my professional life. Perhaps the most saddening thing now is that, even after all those years of frustrating experience, still so many people honestly believe that some law of nature tells us that machines have to be that way. 
They silence their doubts by observing how many of these machines have been sold, and derive from that observation the false sense of security that, after all, the design cannot have been that bad. But upon closer inspection, that line of defense has the same convincing strength as the argument that cigarette smoking must be healthy because so many people do it. It is in this connection that I regret that it is not customary for scientific journals in the computing area to publish reviews of newly announced computers in much the same way as we review scientific publications: to review machines would be at least as important. And here I have a confession to make: in the early sixties I wrote such a review with the intention of submitting it to the CACM, but in spite of the fact that the few colleagues to whom the text was sent for their advice, urged me all to do so, I did not dare to do it, fearing that the difficulties either for myself or for the editorial board would prove to be too great. This suppression was an act of cowardice on my side for which I blame myself more and more. The difficulties I foresaw were a consequence of the absence of generally accepted criteria, and although I was convinced of the validity of the criteria I had chosen to apply, I feared that my review would be refused or discarded as “a matter of personal taste”. I still think that such reviews would be extremely useful and I am longing to see them appear, for their accepted appearance would be a sure sign of maturity of the computing community. The reason that I have paid the above attention to the hardware scene is because I have the feeling that one of the most important aspects of any computing tool is its influence on the thinking habits of those that try to use it, and because I have reasons to believe that that influence is many times stronger than is commonly assumed. Let us now switch our attention to the software scene. 
Here the diversity has been so large that I must confine myself to a few stepping stones. I am painfully aware of the arbitrariness of my choice and I beg you not to draw any conclusions with regard to my appreciation of the many efforts that will remain unmentioned. In the beginning there was the EDSAC in Cambridge, England, and I think it quite impressive that right from the start the notion of a subroutine library played a central role in the design of that machine and of the way in which it should be used. It is now nearly 25 years later and the computing scene has changed dramatically, but the notion of basic software is still with us, and the notion of the closed subroutine is still one of the key concepts in programming. We should recognise the closed subroutine as one of the greatest software inventions; it has survived three generations of computers and it will survive a few more, because it caters for the implementation of one of our basic patterns of abstraction. Regrettably enough, its importance has been underestimated in the design of the third generation computers, in which the great number of explicitly named registers of the arithmetic unit implies a large overhead on the subroutine mechanism. But even that did not kill the concept of the subroutine, and we can only pray that the mutation won’t prove to be hereditary. The second major development on the software scene that I would like to mention is the birth of FORTRAN. At that time this was a project of great temerity and the people responsible for it deserve our great admiration. It would be absolutely unfair to blame them for shortcomings that only became apparent after a decade or so of extensive usage: groups with a successful look-ahead of ten years are quite rare! In retrospect we must rate FORTRAN as a successful coding technique, but with very few effective aids to conception, aids which are now so urgently needed that time has come to consider it out of date. 
The sooner we can forget that FORTRAN has ever existed, the better, for as a vehicle of thought it is no longer adequate: it wastes our brainpower, is too risky and therefore too expensive to use. FORTRAN’s tragic fate has been its wide acceptance, mentally chaining thousands and thousands of programmers to our past mistakes. I pray daily that more of my fellow-programmers may find the means of freeing themselves from the curse of compatibility. The third project I would not like to leave unmentioned is LISP, a fascinating enterprise of a completely different nature. With a few very basic principles at its foundation, it has shown a remarkable stability. Besides that, LISP has been the carrier for a considerable number of in a sense our most sophisticated computer applications. LISP has jokingly been described as “the most intelligent way to misuse a computer”. I think that description a great compliment because it transmits the full flavour of liberation: it has assisted a number of our most gifted fellow humans in thinking previously impossible thoughts. The fourth project to be mentioned is ALGOL 60. While up to the present day FORTRAN programmers still tend to understand their programming language in terms of the specific implementation they are working with —hence the prevalence of octal and hexadecimal dumps—, while the definition of LISP is still a curious mixture of what the language means and how the mechanism works, the famous Report on the Algorithmic Language ALGOL 60 is the fruit of a genuine effort to carry abstraction a vital step further and to define a programming language in an implementation-independent way. One could argue that in this respect its authors have been so successful that they have created serious doubts as to whether it could be implemented at all! 
The report gloriously demonstrated the power of the formal method BNF, now fairly known as Backus-Naur-Form, and the power of carefully phrased English, at least when used by someone as brilliant as Peter Naur. I think that it is fair to say that only very few documents as short as this have had an equally profound influence on the computing community. The ease with which in later years the names ALGOL and ALGOL-like have been used, as an unprotected trade mark, to lend some of its glory to a number of sometimes hardly related younger projects, is a somewhat shocking compliment to its standing. The strength of BNF as a defining device is responsible for what I regard as one of the weaknesses of the language: an over-elaborate and not too systematic syntax could now be crammed into the confines of very few pages. With a device as powerful as BNF, the Report on the Algorithmic Language ALGOL 60 should have been much shorter. Besides that I am getting very doubtful about ALGOL 60’s parameter mechanism: it allows the programmer so much combinatorial freedom, that its confident use requires a strong discipline from the programmer. Besides expensive to implement it seems dangerous to use. Finally, although the subject is not a pleasant one, I must mention PL/1, a programming language for which the defining documentation is of a frightening size and complexity. Using PL/1 must be like flying a plane with 7000 buttons, switches and handles to manipulate in the cockpit. I absolutely fail to see how we can keep our growing programs firmly within our intellectual grip when by its sheer baroqueness the programming language —our basic tool, mind you!— already escapes our intellectual control. And if I have to describe the influence PL/1 can have on its users, the closest metaphor that comes to my mind is that of a drug. I remember from a symposium on higher level programming language a lecture given in defense of PL/1 by a man who described himself as one of its devoted users. 
But within a one-hour lecture in praise of PL/1, he managed to ask for the addition of about fifty new “features”, little supposing that the main source of his problems could very well be that it contained already far too many “features”. The speaker displayed all the depressing symptoms of addiction, reduced as he was to the state of mental stagnation in which he could only ask for more, more, more... When FORTRAN has been called an infantile disorder, full PL/1, with its growth characteristics of a dangerous tumor, could turn out to be a fatal disease. So much for the past. But there is no point in making mistakes unless thereafter we are able to learn from them. As a matter of fact, I think that we have learned so much, that within a few years programming can be an activity vastly different from what it has been up till now, so different that we had better prepare ourselves for the shock. Let me sketch for you one of the possible futures. At first sight, this vision of programming in perhaps already the near future may strike you as utterly fantastic. Let me therefore also add the considerations that might lead one to the conclusion that this vision could be a very real possibility. The vision is that, well before the seventies have run to completion, we shall be able to design and implement the kind of systems that are now straining our programming ability, at the expense of only a few percent in man-years of what they cost us now, and that besides that, these systems will be virtually free of bugs. These two improvements go hand in hand. In the latter respect software seems to be different from many other products, where as a rule a higher quality implies a higher price. Those who want really reliable software will discover that they must find means of avoiding the majority of bugs to start with, and as a result the programming process will become cheaper. 
If you want more effective programmers, you will discover that they should not waste their time debugging, they should not introduce the bugs to start with. In other words: both goals point to the same change. Such a drastic change in such a short period of time would be a revolution, and to all persons that base their expectations for the future on smooth extrapolation of the recent past —appealing to some unwritten laws of social and cultural inertia— the chance that this drastic change will take place must seem negligible. But we all know that sometimes revolutions do take place! And what are the chances for this one? There seem to be three major conditions that must be fulfilled. The world at large must recognize the need for the change; secondly the economic need for it must be sufficiently strong; and, thirdly, the change must be technically feasible. Let me discuss these three conditions in the above order. With respect to the recognition of the need for greater reliability of software, I expect no disagreement anymore. Only a few years ago this was different: to talk about a software crisis was blasphemy. The turning point was the Conference on Software Engineering in Garmisch, October 1968, a conference that created a sensation as there occurred the first open admission of the software crisis. And by now it is generally recognized that the design of any large sophisticated system is going to be a very difficult job, and whenever one meets people responsible for such undertakings, one finds them very much concerned about the reliability issue, and rightly so. In short, our first condition seems to be satisfied. Now for the economic need. Nowadays one often encounters the opinion that in the sixties programming has been an overpaid profession, and that in the coming years programmer salaries may be expected to go down. Usually this opinion is expressed in connection with the recession, but it could be a symptom of something different and quite healthy, viz. 
that perhaps the programmers of the past decade have not done so good a job as they should have done. Society is getting dissatisfied with the performance of programmers and of their products. But there is another factor of much greater weight. In the present situation it is quite usual that for a specific system, the price to be paid for the development of the software is of the same order of magnitude as the price of the hardware needed, and society more or less accepts that. But hardware manufacturers tell us that in the next decade hardware prices can be expected to drop with a factor of ten. If software development were to continue to be the same clumsy and expensive process as it is now, things would get completely out of balance. You cannot expect society to accept this, and therefore we must learn to program an order of magnitude more effectively. To put it in another way: as long as machines were the largest item on the budget, the programming profession could get away with its clumsy techniques, but that umbrella will fold rapidly. In short, also our second condition seems to be satisfied. And now the third condition: is it technically feasible? I think it might and I shall give you six arguments in support of that opinion. A study of program structure had revealed that programs —even alternative programs for the same task and with the same mathematical content— can differ tremendously in their intellectual manageability. A number of rules have been discovered, violation of which will either seriously impair or totally destroy the intellectual manageability of the program. These rules are of two kinds. Those of the first kind are easily imposed mechanically, viz. by a suitably chosen programming language. Examples are the exclusion of goto-statements and of procedures with more than one output parameter. 
For those of the second kind I at least —but that may be due to lack of competence on my side— see no way of imposing them mechanically, as it seems to need some sort of automatic theorem prover for which I have no existence proof. Therefore, for the time being and perhaps forever, the rules of the second kind present themselves as elements of discipline required from the programmer. Some of the rules I have in mind are so clear that they can be taught and that there never needs to be an argument as to whether a given program violates them or not. Examples are the requirements that no loop should be written down without providing a proof for termination nor without stating the relation whose invariance will not be destroyed by the execution of the repeatable statement. I now suggest that we confine ourselves to the design and implementation of intellectually manageable programs. If someone fears that this restriction is so severe that we cannot live with it, I can reassure him: the class of intellectually manageable programs is still sufficiently rich to contain many very realistic programs for any problem capable of algorithmic solution. We must not forget that it is not our business to make programs, it is our business to design classes of computations that will display a desired behaviour. The suggestion of confining ourselves to intellectually manageable programs is the basis for the first two of my announced six arguments. Argument one is that, as the programmer only needs to consider intellectually manageable programs, the alternatives he is choosing between are much, much easier to cope with. Argument two is that, as soon as we have decided to restrict ourselves to the subset of the intellectually manageable programs, we have achieved, once and for all, a drastic reduction of the solution space to be considered. And this argument is distinct from argument one. Argument three is based on the constructive approach to the problem of program correctness. 
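The rules of the second kind can be made concrete. The sketch below is my own illustration, not the lecture's; the function name and the particular task (an integer square root) are invented for the example. The discipline is simply that the loop is written down together with the relation whose invariance it preserves and with a termination argument:

```python
def integer_sqrt(n: int) -> int:
    """Return the largest r with r*r <= n, for n >= 0."""
    assert n >= 0
    r = 0
    # Invariant: r*r <= n  (holds initially, since 0*0 <= n).
    # Termination: n - r*r decreases on every iteration and the guard
    # must fail once (r+1)*(r+1) exceeds n.
    while (r + 1) * (r + 1) <= n:
        r = r + 1
        assert r * r <= n  # the repeatable statement preserves the invariant
    # On exit: r*r <= n < (r+1)*(r+1), which is exactly the specification.
    return r
```

The point is not the particular program but the habit: the invariant and the termination argument are part of what is "written down", so there never needs to be an argument about whether the loop is understood.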
Today a usual technique is to make a program and then to test it. But: program testing can be a very effective way to show the presence of bugs, but is hopelessly inadequate for showing their absence. The only effective way to raise the confidence level of a program significantly is to give a convincing proof of its correctness. But one should not first make the program and then prove its correctness, because then the requirement of providing the proof would only increase the poor programmer’s burden. On the contrary: the programmer should let correctness proof and program grow hand in hand. Argument three is essentially based on the following observation. If one first asks oneself what the structure of a convincing proof would be and, having found this, then constructs a program satisfying this proof’s requirements, then these correctness concerns turn out to be a very effective heuristic guidance. By definition this approach is only applicable when we restrict ourselves to intellectually manageable programs, but it provides us with effective means for finding a satisfactory one among these. Argument four has to do with the way in which the amount of intellectual effort needed to design a program depends on the program length. It has been suggested that there is some kind of law of nature telling us that the amount of intellectual effort needed grows with the square of program length. But, thank goodness, no one has been able to prove this law. And this is because it need not be true. We all know that the only mental tool by means of which a very finite piece of reasoning can cover a myriad cases is called “abstraction”; as a result the effective exploitation of his powers of abstraction must be regarded as one of the most vital activities of a competent programmer. In this connection it might be worth-while to point out that the purpose of abstracting is not to be vague, but to create a new semantic level in which one can be absolutely precise. 
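Letting correctness proof and program grow hand in hand can be illustrated by a small sketch of mine (the task, fast exponentiation, is my choice and not the lecture's): the invariant is chosen first, and the program then consists of exactly those statements that establish and maintain it:

```python
def power(x: int, n: int) -> int:
    """Return x**n for n >= 0, constructed from a chosen invariant."""
    assert n >= 0
    # Chosen proof obligation (invariant): acc * b**e == x**n.
    acc, b, e = 1, x, n          # establishes it: 1 * x**n == x**n
    while e > 0:                 # termination: e strictly decreases to 0
        if e % 2 == 1:
            acc, e = acc * b, e - 1   # leaves acc * b**e unchanged
        else:
            b, e = b * b, e // 2      # leaves b**e unchanged
    return acc                   # e == 0, so the invariant yields acc == x**n
```

Each branch is forced by the proof: given the invariant, these are the only cheap steps that decrease e while keeping the relation true.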
Of course I have tried to find a fundamental cause that would prevent our abstraction mechanisms from being sufficiently effective. But no matter how hard I tried, I did not find such a cause. As a result I tend to the assumption —up till now not disproved by experience— that by suitable application of our powers of abstraction, the intellectual effort needed to conceive or to understand a program need not grow more than proportional to program length. But a by-product of these investigations may be of much greater practical significance, and is, in fact, the basis of my fourth argument. The by-product was the identification of a number of patterns of abstraction that play a vital role in the whole process of composing programs. Enough is now known about these patterns of abstraction that you could devote a lecture to about each of them. What the familiarity and conscious knowledge of these patterns of abstraction imply dawned upon me when I realized that, had they been common knowledge fifteen years ago, the step from BNF to syntax-directed compilers, for instance, could have taken a few minutes instead of a few years. Therefore I present our recent knowledge of vital abstraction patterns as the fourth argument. Now for the fifth argument. It has to do with the influence of the tool we are trying to use upon our own thinking habits. I observe a cultural tradition, which in all probability has its roots in the Renaissance, to ignore this influence, to regard the human mind as the supreme and autonomous master of its artefacts. But if I start to analyse the thinking habits of myself and of my fellow human beings, I come, whether I like it or not, to a completely different conclusion, viz. that the tools we are trying to use and the language or notation we are using to express or record our thoughts, are the major factors determining what we can think or express at all! 
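As an aside on the step from BNF to syntax-directed compilers mentioned above: the connection can be sketched in a few lines (a toy of my own, not from the lecture). Each rule of a small BNF grammar becomes one procedure, so the structure of the program is dictated by the structure of the grammar:

```python
#   <expr>   ::= <term> { "+" <term> }
#   <term>   ::= <factor> { "*" <factor> }
#   <factor> ::= <digit> | "(" <expr> ")"

def evaluate(s: str) -> int:
    """Syntax-directed evaluation of expressions over single digits."""
    pos = 0

    def peek() -> str:
        return s[pos] if pos < len(s) else ""

    def expr() -> int:
        nonlocal pos
        v = term()
        while peek() == "+":
            pos += 1          # consume "+"
            v += term()
        return v

    def term() -> int:
        nonlocal pos
        v = factor()
        while peek() == "*":
            pos += 1          # consume "*"
            v *= factor()
        return v

    def factor() -> int:
        nonlocal pos
        if peek() == "(":
            pos += 1          # consume "("
            v = expr()
            pos += 1          # consume ")"
            return v
        v = int(peek())
        pos += 1              # consume the digit
        return v

    return expr()
```

Once the pattern of abstraction is recognized, the translation from grammar to program is mechanical, which is exactly why the step could have "taken a few minutes instead of a few years".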
The analysis of the influence that programming languages have on the thinking habits of their users, and the recognition that, by now, brainpower is by far our scarcest resource, they together give us a new collection of yardsticks for comparing the relative merits of various programming languages. The competent programmer is fully aware of the strictly limited size of his own skull; therefore he approaches the programming task in full humility, and among other things he avoids clever tricks like the plague. In the case of a well-known conversational programming language I have been told from various sides that as soon as a programming community is equipped with a terminal for it, a specific phenomenon occurs that even has a well-established name: it is called “the one-liners”. It takes one of two different forms: one programmer places a one-line program on the desk of another and either he proudly tells what it does and adds the question “Can you code this in less symbols?” —as if this were of any conceptual relevance!— or he just asks “Guess what it does!”. From this observation we must conclude that this language as a tool is an open invitation for clever tricks; and while exactly this may be the explanation for some of its appeal, viz. to those who like to show how clever they are, I am sorry, but I must regard this as one of the most damning things that can be said about a programming language. Another lesson we should have learned from the recent past is that the development of “richer” or “more powerful” programming languages was a mistake in the sense that these baroque monstrosities, these conglomerations of idiosyncrasies, are really unmanageable, both mechanically and mentally. I see a great future for very systematic and very modest programming languages. When I say “modest”, I mean that, for instance, not only ALGOL 60’s “for clause”, but even FORTRAN’s “DO loop” may find themselves thrown out as being too baroque. 
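How modest a repetition construct can be is shown by a classic example (my illustration, not the lecture's): Euclid's algorithm repeats under the control of a guard alone, with no controlled variable to be stepped up:

```python
def gcd(a: int, b: int) -> int:
    """Greatest common divisor, for a > 0 and b >= 0."""
    assert a > 0 and b >= 0
    # No loop counter: the guard alone drives the repetition, and the
    # strictly decreasing value of b guarantees termination.
    while b != 0:
        a, b = b, a % b
    return a
```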
I have run a little programming experiment with really experienced volunteers, but something quite unintended and quite unexpected turned up. None of my volunteers found the obvious and most elegant solution. Upon closer analysis this turned out to have a common source: their notion of repetition was so tightly connected to the idea of an associated controlled variable to be stepped up, that they were mentally blocked from seeing the obvious. Their solutions were less efficient, needlessly hard to understand, and it took them a very long time to find them. It was a revealing, but also shocking experience for me. Finally, in one respect one hopes that tomorrow’s programming languages will differ greatly from what we are used to now: to a much greater extent than hitherto they should invite us to reflect in the structure of what we write down all abstractions needed to cope conceptually with the complexity of what we are designing. So much for the greater adequacy of our future tools, which was the basis of the fifth argument. As an aside I would like to insert a warning to those who identify the difficulty of the programming task with the struggle against the inadequacies of our current tools, because they might conclude that, once our tools will be much more adequate, programming will no longer be a problem. Programming will remain very difficult, because once we have freed ourselves from the circumstantial cumbersomeness, we will find ourselves free to tackle the problems that are now well beyond our programming capacity. You can quarrel with my sixth argument, for it is not so easy to collect experimental evidence for its support, a fact that will not prevent me from believing in its validity. Up till now I have not mentioned the word “hierarchy”, but I think that it is fair to say that this is a key concept for all systems embodying a nicely factored solution. I could even go one step further and make an article of faith out of it, viz. 
that the only problems we can really solve in a satisfactory manner are those that finally admit a nicely factored solution. At first sight this view of human limitations may strike you as a rather depressing view of our predicament, but I don’t feel it that way, on the contrary! The best way to learn to live with our limitations is to know them. By the time that we are sufficiently modest to try factored solutions only, because the other efforts escape our intellectual grip, we shall do our utmost best to avoid all those interfaces impairing our ability to factor the system in a helpful way. And I cannot but expect that this will repeatedly lead to the discovery that an initially untractable problem can be factored after all. Anyone who has seen how the majority of the troubles of the compiling phase called “code generation” can be tracked down to funny properties of the order code, will know a simple example of the kind of things I have in mind. The wider applicability of nicely factored solutions is my sixth and last argument for the technical feasibility of the revolution that might take place in the current decade. In principle I leave it to you to decide for yourself how much weight you are going to give to my considerations, knowing only too well that I can force no one else to share my beliefs. As each serious revolution, it will provoke violent opposition and one can ask oneself where to expect the conservative forces trying to counteract such a development. I don’t expect them primarily in big business, not even in the computer business; I expect them rather in the educational institutions that provide today’s training and in those conservative groups of computer users that think their old programs so important that they don’t think it worth-while to rewrite and improve them. 
In this connection it is sad to observe that on many a university campus the choice of the central computing facility has too often been determined by the demands of a few established but expensive applications with a disregard of the question how many thousands of “small users” that are willing to write their own programs were going to suffer from this choice. Too often, for instance, high-energy physics seems to have blackmailed the scientific community with the price of its remaining experimental equipment. The easiest answer, of course, is a flat denial of the technical feasibility, but I am afraid that you need pretty strong arguments for that. No reassurance, alas, can be obtained from the remark that the intellectual ceiling of today’s average programmer will prevent the revolution from taking place: with others programming so much more effectively, he is liable to be edged out of the picture anyway. There may also be political impediments. Even if we know how to educate tomorrow’s professional programmer, it is not certain that the society we are living in will allow us to do so. The first effect of teaching a methodology —rather than disseminating knowledge— is that of enhancing the capacities of the already capable, thus magnifying the difference in intelligence. In a society in which the educational system is used as an instrument for the establishment of a homogenized culture, in which the cream is prevented from rising to the top, the education of competent programmers could be politically impalatable. Let me conclude. Automatic computers have now been with us for a quarter of a century. They have had a great impact on our society in their capacity of tools, but in that capacity their influence will be but a ripple on the surface of our culture, compared with the much more profound influence they will have in their capacity of intellectual challenge without precedent in the cultural history of mankind. 
Hierarchical systems seem to have the property that something considered as an undivided entity on one level, is considered as a composite object on the next lower level of greater detail; as a result the natural grain of space or time that is applicable at each level decreases by an order of magnitude when we shift our attention from one level to the next lower one. We understand walls in terms of bricks, bricks in terms of crystals, crystals in terms of molecules etc. As a result the number of levels that can be distinguished meaningfully in a hierarchical system is kind of proportional to the logarithm of the ratio between the largest and the smallest grain, and therefore, unless this ratio is very large, we cannot expect many levels. In computer programming our basic building block has an associated time grain of less than a microsecond, but our program may take hours of computation time. I do not know of any other technology covering a ratio of 10^10 or more: the computer, by virtue of its fantastic speed, seems to be the first to provide us with an environment where highly hierarchical artefacts are both possible and necessary. This challenge, viz. the confrontation with the programming task, is so unique that this novel experience can teach us a lot about ourselves. It should deepen our understanding of the processes of design and creation, it should give us better control over the task of organizing our thoughts. If it did not do so, to my taste we should not deserve the computer at all! It has already taught us a few lessons, and the one I have chosen to stress in this talk is the following. We shall do a much better programming job, provided that we approach the task with a full appreciation of its tremendous difficulty, provided that we stick to modest and elegant programming languages, provided that we respect the intrinsic limitations of the human mind and approach the task as Very Humble Programmers.
Concept of Programming Language 2 (Evolution of the Major Programming Language)
By michaeltansy, published 2013-03-18
https://michaeltansy.wordpress.com/2013/03/18/concept-of-programming-language-2-evolution-of-the-major-programming-language/
REVIEW QUESTIONS

1. In what year was Plankalkül designed? In what year was that design published?
Plankalkül was designed by Konrad Zuse between 1942 and 1945. He first published a paper on it in 1948; a more complete description appeared in 1972.

2. Mention an interesting feature of Zuse's programs.
One interesting feature of Zuse's programs is the inclusion of mathematical expressions showing the current relationships between program variables.

3. What does Plankalkül mean?
Plankalkül means "program calculus."

4. Speedcoding was invented to overcome two significant shortcomings of the computer hardware of the early 1950s. What were they?
The first shortcoming was the lack of floating-point hardware; the second was the lack of automatic incrementing of address registers (indexing).

5. What is the number of bits in a single word of the UNIVAC I's memory? How are the bits grouped?
A single word is 72 bits, grouped as 12 six-bit bytes.

6. What hardware capability that first appeared in the IBM 704 computer strongly affected the evolution of programming languages? Explain why.
The IBM 704 supported indexing and floating-point operations in hardware. Because floating point no longer had to be simulated in software, the case for interpretive systems weakened, which prompted the development of the compiled language Fortran.

7. Who developed the Speedcoding system for the IBM 701?
John Backus.

8. Who developed Short Code? Why is Short Code called automatic programming?
Short Code was developed by John Mauchly in 1949. It is called automatic programming because it was never translated to machine code; it was implemented with a pure interpreter.

10. What was the most significant feature added to Fortran I to get Fortran II?
Independent-compilation capability.

11. What control flow statements were added to Fortran IV to get Fortran 77?
Logical loop statements and an IF with an optional ELSE clause.

12. Which version of Fortran was the first to have any sort of dynamic variables?
Fortran 90.

13. Which version of Fortran was the first to have character string handling?
Fortran 77.

14. Why were linguists interested in artificial intelligence in the late 1950s?
Linguists were concerned with natural language processing.

15. What are the different data types and structures in Common LISP?
Common LISP has a large number of data types and structures, including records, arrays, complex numbers, and character strings. It also has a form of packages for modularizing collections of functions and data, providing access control.

16. In what way are Scheme and Common LISP opposites of each other?
Common LISP allows both static and dynamic scoping, while Scheme uses only static scoping. Scheme is also relatively small, while Common LISP is large and complex.

17. What dialect of LISP is used for introductory programming courses at some universities?
Scheme.

18. What two professional organizations together designed ALGOL 60?
ACM and GAMM.

19. What was the goal for developing C?
C was meant to be usable in many application areas: it has adequate control statements and data-structuring facilities, and a rich set of operators that provide a high degree of expressiveness.

20. What were the significant modifications to ALGOL 58 to produce ALGOL 60?
The concept of block structure was introduced, two different means of passing parameters to subprograms were allowed, procedures were allowed to be recursive, and stack-dynamic arrays were allowed.

21. What language was designed to describe the syntax of ALGOL 60?
BNF (Backus-Naur Form).

22. On what language was COBOL based?
FLOW-MATIC.

23. In what year did the COBOL design process begin?
1959.

24. What data structure that appeared in COBOL originated with Plankalkül?
Hierarchical data structures (records).

25. What organization was most responsible for the early success of COBOL (in terms of extent of use)?
The U.S. Department of Defense (DoD).

27. Why was BASIC an important language in the early 1980s?
Its smaller dialects could be implemented on computers with very small memories.

28. PL/I was designed to replace what two languages?
COBOL and Fortran.

29. For what new line of computers was PL/I designed?
The IBM System/360 line of computers.

30. What features of SIMULA 67 are now important parts of some object-oriented languages?
Data abstraction through the class construct.

31. What innovation of data structuring was introduced in ALGOL 68 but is often credited to Pascal?
User-defined data types.

32. What design criterion was used extensively in ALGOL 68?
Orthogonality.

33. What language introduced the case statement?
ALGOL-W, a successor to ALGOL 60 designed by Niklaus Wirth and C. A. R. (Tony) Hoare.

34. What operators in C were modeled on similar operators in ALGOL 68?
C's assigning operators and its treatment of pointers were modeled on ALGOL 68.

35. What are two characteristics of C that make it less safe than Pascal?
Its lack of complete type checking and its high degree of (sometimes unsafe) flexibility, for example in pointer use.

37. What are the two kinds of statements that populate a Prolog database?
Facts and rules.

38. What is the primary application area for which Ada was designed?
Embedded systems.

39. What are the concurrent program units of Ada called?
Tasks, which communicate via the rendezvous mechanism.

42. What three concepts are the basis for object-oriented programming?
Data abstraction, inheritance, and dynamic binding of method calls.

43. Why does C++ include the features of C that are known to be unsafe?
A design goal was that C++ could be used everywhere C could be used, so virtually none of C's features were removed, not even those considered unsafe.

44. From what language does Objective-C borrow its syntax for method calls?
Smalltalk.

45. What programming paradigm that nearly all recently designed languages support is not supported by Go?
Traditional object-oriented programming: Go has no form of inheritance.

PROBLEM SET

1. What features of Fortran IV do you think would have had the greatest influence on Java if the Java designers had been familiar with Fortran?
The logical data type (boolean), used mainly for the control expressions of control statements such as if and while; and its compile-and-link model, of which Java could have adopted a simplified version.

2. Determine the capabilities of Short Code, and compare them with those of a contemporary programmable hand calculator.
Short Code could multiply without an explicit multiplication code: placing two operands side by side implied their product. It was implemented with a pure interpreter, and its programs consisted of coded versions of the mathematical expressions to be evaluated. A contemporary programmable hand calculator can do the same arithmetic through its embedded programming language. Solving an arithmetic problem is easier on the calculator than in Short Code, but it is harder to extend or improve the calculator's fixed language than it was to improve Short Code.
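Problem 2 describes Short Code as interpreting coded expressions directly, with multiplication implied by writing two operands side by side. That behavior can be made concrete with a toy interpreter. This is my own illustrative sketch in Python; real Short Code programs were numeric codes on the UNIVAC I, so the `evaluate` function and its token syntax are an invented analogy:

```python
import re

# Toy interpreter in the spirit of Short Code: programs are coded
# expressions evaluated directly, never compiled to machine code.
# Juxtaposed operands (e.g. "a b") denote multiplication.

def evaluate(expr: str, env: dict) -> float:
    """Evaluate a tiny expression language: terms separated by + or -,
    where each term is a juxtaposition (product) of variables/numbers."""
    tokens = re.findall(r"[+-]|[A-Za-z]\w*|\d+(?:\.\d+)?", expr)
    total, sign, product = 0.0, 1, None
    for tok in tokens + ["+"]:          # trailing "+" flushes the last term
        if tok in "+-":
            if product is not None:
                total += sign * product
            sign, product = (1 if tok == "+" else -1), None
        else:
            val = float(env[tok]) if tok in env else float(tok)
            product = val if product is None else product * val
    return total

# "a b + 4" means a*b + 4, mimicking Short Code's implied multiplication
print(evaluate("a b + 4", {"a": 3, "b": 5}))  # 19.0
```

As with Short Code, nothing is translated to machine code: each run re-walks the coded expression in the interpreter.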
Source: The Register, 15 May 2020
ALGOL 60 at 60: The greatest computer language you've never used and grandaddy of the programming family tree
By Richard Speed
Back to the time when tape was king
https://www.theregister.com/2020/05/15/algol_60_at_60/
2020 marks 60 years since ALGOL 60 laid the groundwork for a multitude of computer languages. The Register spoke to The National Museum of Computing's Peter Onion and Andrew Herbert to learn a bit more about the good old days of punch tapes.

ALGOL 60 was the successor to ALGOL 58, which debuted in 1958. ALGOL 58 had introduced the concept of code blocks (replete with begin and end delimiting pairs), but ALGOL 60 took these starting points of structured programming and ran with them, giving rise to familiar faces such as Pascal and C, as well as the likes of B and Simula.

"In the 1950s most code was originally written in machine code or assembly code," said Herbert, former director of Microsoft Research in Cambridge, with every computer having its own particular twist on things.

A first generation of languages, called "Autocode", existed for coding problems like equations which could then be translated into machine code, but lacked the bells and whistles of today. Worse, some had features that others lacked, making hopping between systems tricky.

"There was an Autocode for the [Elliott] 803," said Onion, "but it only supported expressions like A + B = C, so if you've got a complex equation, you have to break it down into individual single binary operations. So there was still a lot of hard work to be done by the programmer."

"Fortran," said Herbert, "emerged as the first real programming language for scientific and numeric work. That convinced people that having higher-level languages (as they called them then – they were pretty primitive by modern standards) made programmers more productive."

The overhead of compiling, and inefficiencies in the compilers themselves, meant that machine code remained king of the performance hill, but for those doing science work, the ability to churn out some code to solve a problem and then simply move on to the next was appealing.

"Fortran," Herbert continued, "was more like an autocode," before laughing, "It still is in some ways!
"And a bunch of people thought you could do better."

Enter the International Federation for Information Processing (IFIP), which Herbert recalled "had a whole bunch of committees who looked at standards and problems in computing". One group started on the design of what was then called an "Algorithmic Language": a language for writing algorithms. The output, in 1958, described the language "ALGOL 58".

However, as engineers began to create compilers for the new system, they found "all kinds of things hadn't really been thought about or worked through properly," recalled Herbert. And so there were revisions and changes. A periodical called "The ALGOL Bulletin" detailed the travails of those involved as the problems and the weaknesses in the language were dealt with (or at least attempted). The process was not unlike an open-source mailing list today, but in paper form.

Eventually, Herbert told us, "they published the ALGOL 60 report, which is the baseline that everyone then worked to."

The committees were under pressure and also suffered a little from differing international approaches. The American side had a lot of experience in Fortran and were seeking something that could quickly be made to work on their computers, while the Europeans were a little more cerebral and had, Herbert laughed, "terrible notions like beauty and elegance in mind for the language".

"People were sorting out some of the things that we now take for granted like ideas in structured programming, data structures, data types," he added.

Seeking solutions to the problem of portability of programmers between systems and code between hardware generations as well as avoiding the pain of having to rewrite programs every time a new iteration of computer arrived, vendors embraced the language with variants cropping up over many manufacturers.

Alas, those seeking a handy-dandy "HELLO WORLD" example will be disappointed.
The Achilles' heel of the language that would go on to inspire so many others was that it lacked standard input/output capabilities.

"The defining committee couldn't agree on how to do input/output," said Herbert. "They decided that would be left to a library, and that library would be user dependent."

"In this case," added Onion, "the user being the compiler writer."

Oh dear. The omission pretty much did for vendor independence as manufacturers naturally went their own way, leaving large chunks of code incompatible between systems. There were also elements of ALGOL 60 that were open to interpretation, leaving it a little compromised from the start.

While ALGOL ploughed its furrow, Fortran continued to be developed in parallel. "People in the Fortran world," explained Herbert, "saw ideas in ALGOL they quite liked and brought them across."

As the decades passed, Fortran remained the centre of gravity for scientific computing while ALGOL became more of an academic language, used for teaching computer science ideas. "It was quite heavily used in the scientific community," Herbert said. "Most mainframe manufacturers supported it."

Some of the team behind ALGOL 60 stayed with the project and went on to come up with ALGOL 68, which, as far as Herbert is concerned, "nailed all the things that ALGOL 60 had left a bit vague". Indeed, it was hard to avoid in the 1970s for those taking computer science courses. This hack has fond memories of the successor language, while the grandfather of Reg sub-editor Richard Currie had a hand in the development of ALGOL 68-R and RS.

"It had the world's most exotic input output system," Herbert laughed.

It was also, sadly for its enthusiasts, a bit of a dead end. Despite ALGOL 68-R becoming widely used in (particularly British) military applications for a time, it would take until the 1970s for a full implementation of ALGOL 68 to become available.
The last edition of The ALGOL Bulletin was published in 1988, with its editor noting: "ALGOL 68 as a language is very stable. It is used and loved by those who understand its benefits, and ignored (or misquoted) by the rest."

The story of ALGOL 60 is not so much one of the language's eventual fate as of the languages it inspired. ALGOL W, based on a proposal for ALGOL X by Niklaus Wirth and QuickSort creator Tony Hoare, would go on to inspire Wirth's Pascal and Modula-2. Pascal's influence continues to be felt today.

ALGOL 60 also heavily influenced the Combined Programming Language (CPL), developed in the 1960s but not implemented until the following decade. CPL in turn led to Basic CPL (BCPL), from which B descended. The B language was further developed to become C.

Tony Hoare was responsible for the implementation of ALGOL 60 on the Elliott 803 computer, an example of which remains operational at The National Museum of Computing, although compiling and running a program on that hardware is a little different to the development environments to which coders are now accustomed. First, the compiler must be loaded from paper tape. The ALGOL program itself is then fed into the tape reader and "it sort of chunters away," remarked Onion, "for anything between 30 seconds to perhaps 15 or 20 minutes during the compilation."

[Embedded video in the original article, captioned "Behold the chuntering".]

Once compiled, a program would be free to use the space originally occupied by the compiler. Doing so would, however, not win the programmer any popularity awards since the next user would have to reload the compiler again. Leaving it in memory meant that multiple programs could be run.
"That made it very popular for teaching," said Herbert, "because you can have a line of students, each with their paper tape with their programme in their hand and you basically march up to the machine, the machine's got the ALGOL system loaded, you run your programme, it produces gibberish, you go away and think about it and the next student runs their programme."

With paper tape being king, Onion observed that the experience of programming taught a bit of focus: "When your edit, compile, edit, compile cycle starts to get above about 10 minutes, you start to pay an awful lot of attention to your source code…"

The National Museum of Computing has two Elliott machines in its collection, a 1962 803B (which was donated after spending 15 years lurking in a barn following its decommissioning) and a 903. Both are fully operational and can be seen running once the museum is able to open its doors once again. The 803B, which is maintained by Onion, also features a Calcomp drum plotter as well as some additional input/output features.

As for taking the ALGOL 60 itself out for a spin today, there are a few options for those not fortunate enough to have an Elliott 803 or 903 to hand. MARST will translate ALGOL 60 to C or one can get a feel for the whole 803 experience via a simulator.

Although as ALGOL 60 turns 60, you could just fire up a modern programming language. Lurking within will likely be the ideas of ALGOL's designers. ®
Source: Hillel Wayne, 25 March 2020
10 Most(ly dead) Influential Programming Languages
https://www.hillelwayne.com/post/influential-dead-languages/
The other day I read 20 most significant programming languages in history, a “preposterous table I just made up.” He certainly got preposterous right: he lists Go as “most significant” but not ALGOL, Smalltalk, or ML. He also leaves off Pascal because it’s “mostly dead”. Preposterous! That defeats the whole point of what “significant in history” means. So let’s talk about some “mostly dead” languages and why they matter so much.

Disclaimer: Yeah not all of these are dead and not all of these are forgotten. Like most people have heard of Smalltalk, right? Also there’s probably like a billion mistakes in this, because when you’re doing a survey of 60 years of computing history you’re gonna get some things wrong. Feel free to yell at me if you see anything!

Disclaimer 2: Yeah I know some of these are “first to invent” and others are “first to popularize”. History is complicated!

Detecting Influence

Before we start, a quick primer on finding influence. Just knowing that X was the first language with feature Z doesn’t mean that X actually influenced Z. While Absys was arguably the first logic programming language, almost all of logic programming actually stems from Prolog, which was developed independently. Ultimately there’s only one way to know for certain that X influenced Y: citation. This means one of:

- Y cites X in its reference manual
- Y cites a paper that cites X
- The author of Y says “we were influenced by X.”

Citations are transitive. Sometimes the language manual for Q lists motivating document R, which cites paper S as an inspiration, which mentions it got the ideas from language T. Then we know that T influenced Q, even if the chain is several steps long.

This means digging through many sources to find a signal. To speed this up we use heuristics to decide where to look. One effective heuristic is programming language cognates. It’s very rare for languages to independently come up with the same syntax.
So if two languages share some syntax, one likely influenced the other. For example: even without reading design decisions by Matz, we know that Ruby was influenced by Smalltalk, as they both filter a list with a select method. This isn’t conclusive evidence. Maybe Matz came up with it independently, or maybe Ruby and Smalltalk were both influenced by a common ancestor. But it gives us a place to start looking.

The Languages

COBOL

Background: CODASYL, 1960. COBOL is shaped by the business/science split in computing. At that time high-level industry languages were either used for engineering computations or managing data. The engineers had all gotten behind FORTRAN while the business world was a mess of COMTRAN, FLOW-MATIC, and others, so the Department of Defense got a committee together to make a single universal business language. That’s COBOL.

COBOL was one of the four “mother” languages, along with ALGOL, FORTRAN, and LISP. While we consider it a punchline today, it was once the most popular language in the world. It still runs a lot of our legacy business systems.

Significance: In terms of syntax and semantics we don’t see much of COBOL in modern computing. COBOL’s most important addition is the concept of record data. In FORTRAN and ALGOL, your only data structure was the static array. In COBOL, though, you could read in structured files with hierarchical data, and it would automatically destructure them into the representative variables. This was a precursor to modern-day structs.

Cause of Death: Two factors here. One: COBOL had no overlap with other PLT efforts. Very few people built on COBOL. This meant that second- or third-generation languages, which built on the lessons of their ancestors, had almost no COBOL DNA. This was less an intrinsic problem of COBOL and more a consequence of academia’s disdain for its creation process. CODASYL was a business group and obviously wasn’t worth paying attention to.
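The record idea above translates almost directly into the structs of later languages. Here is a hedged Python sketch of destructuring a fixed-width line into a hierarchical record; the `User`/`Name` layout and the `parse_record` helper are invented for illustration, not taken from COBOL:

```python
from dataclasses import dataclass

# A hypothetical hierarchical record, in the spirit of COBOL's
# level-numbered record layouts: a record contains named sub-fields.
@dataclass
class Name:
    first: str
    last: str

@dataclass
class User:
    name: Name
    balance: int

def parse_record(line: str) -> User:
    """Destructure one fixed-width line into a structured record."""
    first = line[:10].strip()
    last = line[10:20].strip()
    balance = int(line[20:])
    return User(Name(first, last), balance)

user = parse_record("Grace     Hopper    00042")
print(user.name.last, user.balance)  # fields on a record, not loose globals
```

The contrast with COBOL's own behavior, as the article notes for PL/I later on, is that COBOL produced separate variables rather than one value with fields.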
COBOL was also enormously complex, even for today’s languages. This meant that COBOL compilers lagged contemporaries on microcomputers and minicomputers, giving space for other languages to flourish and eventually outcompete it.

ALGOL

Background: The ALGOL committee, 1960. ALGOL-58 came out two years before but was quickly superseded, so I’m wrapping them into each other. The committee wanted to make a good language for researching algorithms. In other words, ALGOL was a formalized “pseudocode”.

Of the four mother languages, ALGOL is the most “dead”: everybody still knows about LISP, COBOL still powers tons of legacy systems, and most scientific packages still have some FORTRAN, but I’ve met plenty of programmers who haven’t even heard of ALGOL. You’d think it’d be the least important of the mother languages, but it’s the opposite. Of the four, only LISP comes anywhere close to the pervasive importance of ALGOL.

Significance: Let’s see: lexical scoping, structured programming, nested functions, formal language specifications, call-by-name semantics, BNF grammars, block comments… every modern language today is deeply influenced by ALGOL.

Cause of Death: ALGOL was a research language, not a commercial language. It was designed for studying algorithms. The spec didn’t define any I/O, which kinda made it impossible to use in practice. Sure, you could write a compiler extension, but then you might as well add other stuff too. And that’s exactly what people did. In the 1960s and ’70s people made a huge number of ALGOL-likes by extending ALGOL with I/O and extra data structures. This includes JOVIAL, SIMULA, CLU, and CPL. Later languages were then based off these extensions, not ALGOL directly. We call C an “ALGOL-like”, but it’s actually a BCPL-like, which was a CPL-like, which was an ALGOL-like. ALGOL’s children buried it. Eventually the ALGOL people tried to extend it into ALGOL-68, which radically departed from ALGOL-60 and hasn’t had close to the same influence.
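Two of the ALGOL contributions just listed, nested functions and lexical scoping, are easiest to see in a language that inherited them. A minimal Python sketch (the `make_counter` example is mine, not from the article):

```python
def make_counter(start):
    # 'count' lives in the lexically enclosing scope, an idea
    # ALGOL 60's nested procedures and blocks introduced.
    count = start

    def bump(step=1):
        nonlocal count  # resolved by lexical scope, not by call site
        count += step
        return count

    return bump

counter = make_counter(10)
counter()
print(counter())  # 12
```

Under dynamic scoping, `count` would instead be looked up in whatever function happened to be executing; lexical scoping makes the binding visible in the program text.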
The ALGOL-60 lineage continues with Niklaus Wirth’s Pascal.

APL

Background: Ken Iverson, 1962. Originally a hand-written notation for array math, until IBM picked it up and used it as a programming language. As a language, APL focused on array processing: being able to concisely manipulate large blocks of numbers. If you’ve heard of APL before, you probably know it as “that weird symbol language”. One of the most notorious code snippets is an implementation of the Game of Life [shown as an image in the original post], written with a specialized keyboard [also pictured in the original post]. Nonetheless, APL got popular on mainframes for running with very low memory requirements.

Significance: Array processing. At a time when adding two lists of numbers meant a map or a loop, APL introduced the idea of operating on the entire array at once [the original post illustrates this with an APL snippet]. This was a really big deal in scientific circles. So much applied math boils down to large-scale operations on large matrices. When you can just take the outer product with ∘.f, it’s really damn easy to take outer products! Through this innovation APL led to R, numpy, pandas, Matlab, etc. There’s also the direct descendants of APL: J, Dyalog, K, Q. They’ve been less successful but still see lots of use in the finance sector.

Cause of Death: Well, the obvious problem is the keyboards. If you can’t write it in ASCII, you’re not going to write very much of it. Iverson fixed this with J, which uses digraphs instead of different symbols. Instead of ≠, you write ~:. This was in 1990, though, which is a bit late to popularize a radically different programming style. The subtler problem is that APL and J only worked on homogeneous data. You can’t store both strings and numbers in the same data structure (unless you use boxes, which is a whole other can of worms) and working with strings is generally a nightmare. So no dataframes, which excludes a lot of modern data science.

Further Reading: Notation as a Tool of Thought

BASIC

Background: John Kemeny, 1964.
Originally a simplified FORTRAN-like, intended to help people outside engineering use computers. BASIC really took off in the microcomputer era. The first microcomputers didn’t have enough memory to compile “real” programming languages, whereas you could cram a pared-down BASIC compiler into like 2 kilobytes. BASIC became a lingua franca for early-stage programmers. If you were programming at home in the 1970s, you were probably writing BASIC on a microcomputer.

Significance: The biggest technical impact is runtime interpretation. BASIC was the first language with a real-time interpreter (the Dartmouth Time Sharing System), beating APL by a year. And that APL system was only available to IBM customers, so really it was BASIC or nothing for a long time.

BASIC had a bigger social impact. It brought programming to households, kids especially. Many of the influential programmers in the ’80s and ’90s first learned how to program on BASIC. Many enterprise programs were also written in BASIC, which probably helped accelerate the decline of COBOL.

BASIC has one more neat trick up its sleeve: Office tooling! Microsoft eventually turned BASIC into Visual Basic, which they used as the Office macro language. This then spread to OpenOffice and LibreOffice, entrenching BASIC in that particular niche. More recently it’s lost ground to JavaScript and is now a legacy macro language.

Cause of Death: People saw BASIC as a “lesser” language. You might use it if you were a kid or a small business owner, but real programmers used a real language. Once manufacturers could cheaply make microcomputers with more than 16k of RAM they started deprecating BASIC for languages like Pascal and C. BASIC lived on for a while as a legacy kids’ teaching language, but seems to have died out of that niche, too.

PL/I

Background: IBM, 1966. IBM’s business was split into two languages: FORTRAN for scientists and COMTRAN for business folk.
Facing competition from COBOL and wanting to streamline their systems, they tried to make a language that was useful for both engineering and business purposes. This ended up looking like a sort of superset of the two languages, with a bunch of additional features stapled on top. Now everybody could use the same language and IBM could make a lot more money! Yaaaaaaaay.

Significance: The authors of ALGOL-68 mockingly called PL/I an obsolete language. But everything ALGOL-68 did, PL/I did earlier and better. While COBOL got structured data first, PL/I was the first language to implement it as a type. In COBOL, reading in a user with a name would give you two global variables, user and name. In PL/I, you’d get one variable with a field, user.name. PL/I was also the first high-level language with pointers for direct memory manipulation, constants, and function overloading. Many of these ideas entered mainstream programming via C, which was a mix of both BCPL and PL/I. C even uses PL/I’s comment syntax.

Cause of Death: All the FORTRAN programmers thought it was too much like COBOL and all the COBOL programmers thought it was too much like FORTRAN. IBM had tried to take on two established languages with a much more complicated one. It didn’t help that they were the only group with the compiler, meaning everybody else was mistrustful of vendor lock-in. By the time IBM was able to make headway on both of these issues the wider computing world had already moved on to the microcomputer era, where PL/I was outcompeted by BASIC.

Further Reading: The Choice of PL/I

SIMULA 67

Background: Ole Dahl and Kristen Nygaard, 1967. They extended ALGOL for doing simulations. First they made SIMULA I, which had dedicated simulation and “activity” syntax. SIMULA I saw some early use, but the two were dissatisfied with how “specialized” the language felt and how much duplicate code they had in their simulations.
They wanted to make a more general framework for representing things in general, not simulations only. Their idea was to allow users to define new types called “classes” with polymorphic function resolution. Then users could build the simulation features as a special case of the object system, making it easy to customize how it all worked to their particular needs.

Significance: While SIMULA wasn’t the first “true” OOP language, it was the first language with proper objects and laid much of the groundwork that others would build on. This includes the class/object split, subclassing, virtual methods, and protected attributes. It inspired almost all of the academic research into objects after 1967. Both CLU and ML cited SIMULA as a major source of inspiration. Bjarne Stroustrup did his PhD on SIMULA, eventually incorporating a lot of its ideas into C++.

Cause of Death: In that same PhD Stroustrup claimed that SIMULA was waaaaaay too slow to use at scale. “Good luck getting anything done if you aren’t on a mainframe” slow. It’s worth noting that Smalltalk-80, which took the same ideas even further, had an extra 13 years of Moore’s law behind it. And even Smalltalk was often mocked as too slow. Everybody went and implemented the ideas in SIMULA that they could integrate into faster, simpler languages.

Further Reading: Compiling SIMULA: a historical study of technological genesis, The History of Simula

Pascal

Background: Niklaus Wirth, 1970. Made to capture the essence of ALGOL-60 after ALGOL-68 got waaaaaay too complicated for Wirth’s liking. It first got big as the “introduction to CS” language, and by the early ’80s was the second-most popular language on the Usenet job boards. Wirth considers the whole family (Pascal, Modula, and Oberon) as a single unified language concept.

Significance: Pascal didn’t introduce any completely new ideas.
It was an intentionally conservative language that tried to pick the best parts of the past decade and provide them in a unified package. Pascal brought ALGOL syntax outside academia, so much so that ALGOL’s assignment syntax, :=, got called “Pascal style” instead. From this point on most language features that look like ALGOL were more likely inspired by Pascal than directly by ALGOL itself.

While Pascal wasn’t very innovative, variants of it were. Wirth also pioneered the idea of “stepwise refinement” as a means of writing rigorous software. This eventually led to the Modulas, which popularized first-class software modules, and Euclid, the first formal verification language to see production use.

Cause of Death: I’m calling a mulligan on this one. Unlike most of the other languages on this list, Pascal didn’t have major structural barriers or a sharp competitor. Sure, it competed with C, but it was still doing fine for a very long time. People usually attribute its decline to the “Why Pascal is Not My Favorite Programming Language” essay, but that’s too neat of an answer and history is a lot messier. Also, Delphi is still pretty high-ranked in the TIOBE and PYPL measurements, so it’s not exactly dead in the same way SIMULA is. An accurate analysis of the fall of Pascal would be longer than the rest of this essay.

Further Reading: The Programming Language Pascal, Pascal and its Successors

CLU

Background: Barbara Liskov, 1975. Liskov wanted to mess around with abstract data types. That’s it. That’s the whole reason for CLU.

Significance: CLU might be the most influential language that nobody’s ever heard of. Iterators? CLU. Abstract data types? CLU. Generics? CLU. Checked exceptions? CLU. We didn’t adopt the same terminology, so it’s not 100% obvious it all comes from CLU, but still. Every language spec for the next decade would namedrop CLU. CLU did a lot.

Cause of Death: CLU was a showcase language; Liskov wanted to get people to adopt her ideas, not her specific language.
And they did: almost every language today owes something to CLU. As soon as she completed CLU she moved on to Argus, which was supposed to showcase her ideas on concurrency. That hasn’t seen nearly the same adoption, and there’s still a lot of stuff in it left to mine.

Further reading: A History of CLU

ML

Background: Robin Milner, 1976. Milner was building the LCF Prover, one of the first proof assistants. If you wrote a proof in the right format, LCF could check whether it was correct. To assist in writing the proofs, Milner created a metalanguage based on sound mathematical formalisms, which at the time meant strict static types and higher-order functions. Eventually ML was standardized as Standard ML.

Significance: ML is arguably the oldest “algebraic programming language”. There’s a lot of stuff we attribute to ML: algebraic data types, modules, typed functional programming. Surprisingly, it was not the first for a lot of these! The first ML was just designed to work with LCF and wasn’t a general-purpose language, so it lacked a lot of these features. As people started making it more general they pulled ideas from other research languages and incorporated them into ML. One very important idea did start in ML, though: type inference. ML was the first statically-typed language where you didn’t have to write the types out, as the compiler would figure out the types for you. This paved the way for typed FP to escape academia and enter production use. ML also greatly influenced modern theorem provers. The “program” languages for Isabelle, CVC3, and Coq are ML-based. And a lot of type theory was based on ML, though in more recent years the Haskell branch of FP has become more popular.

Cause of Death: ML had a lot of interesting features, but people paid attention to it for the type inference. At the time ML was still a special-purpose language for the theorem provers.
SML came out the same year as Haskell, which was a much “purer” example of a typed FP language.

Smalltalk

Background: Alan Kay, 1972, 1976, and 1980. It’s sort of a moving target. Smalltalk-72 was the first, Smalltalk-76 introduced the idea of “object-oriented programming” to the wider world, and Smalltalk-80 was the one that saw widespread adoption. Smalltalk wasn’t the first language with objects but it was the first “object-oriented” one. The difference was that Simula had objects in addition to primitives like numbers and booleans, while in Smalltalk, booleans were also objects. I wrote a bit about this here if you want to learn more.

Significance: We sometimes think that Smalltalk is “true” OOP and things like Java and Python aren’t “real” OOP, but that’s not true. OOP is a giant mess of many different influences, just like every other paradigm. But Smalltalk was certainly the thing that popularized the idea. If you crack open any general-theory OOP book from the mid-80’s or early 90’s, it’ll be in Smalltalk. Many will also translate their examples to C++, and a few will use another language, but everybody will use Smalltalk. Smalltalk also spread the idea of objects as shareable data, leading the way to CORBA, and it inspired the computational Actor model.

Cause of Death: The common belief is that Smalltalk lost because people used C++ instead. But that’s untrue. Smalltalk did have some issues, specifically its difficulty interoperating with other tools and its poor runtime performance. But even into the 1990’s Smalltalk was doing respectable business, and many people assumed it would be a dominant business language. Then Java happened. Smalltalk wasn’t the only casualty of the “Javapocalypse”: Java also marginalized Eiffel, Ada95, and pretty much everything else in the OOP world. The interesting question isn’t “Why did Smalltalk die”, it’s “Why did C++ survive”. I think it’s because C++ had better C interop, so it was easier to extend into legacy systems.
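That “booleans were also objects” distinction is easy to see in languages that inherited it. In Python, for instance (my illustration, not from the original essay), every value, booleans and integer literals included, is a full object with a class and methods:

```python
# Smalltalk's stance: no privileged primitives; every value is an object
# you can send messages to. Python adopted the same rule.
print(isinstance(True, object))   # True: booleans are objects
print(True.__class__.__name__)    # bool: with a real class behind them
print((255).to_bytes(2, "big"))   # b'\x00\xff': integers answer methods
print("simula".upper())           # SIMULA: strings do too
```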
This is just a small sample of the important dead languages. I didn’t cover ALPHARD, ALTRAN, Argus, Automath, BCPL, COMTRAN, CPL, Eiffel, FLOW-MATIC, HOPE, Hypercard, ISWIM, JOVIAL, MacSyma, Mesa, Miranda, Multics Shell, PLANNER, SMP, Sketchpad, or SNOBOL. All of them contributed in their own way to the modern programming world. History is complicated. Most influential languages never went mainstream. Few people used any one of them. But each one inspired people, who inspired other people, so the DNA of these forgotten languages appears decades after they’re gone. But there are also untold languages that didn’t get their ideas out. The Encyclopaedia of Programming Languages lists over 8,000 programming languages. Many of them had ideas that never left their bubble. Consider how much we’d have lost if nobody had heard of SIMULA, or if Liskov had never shared CLU. That’s one reason I love studying history: to learn what we’ve lost and find it again. The first draft of this was originally shared on my newsletter. If you found this interesting, why not subscribe?
Source: “Where do we go next from here with programming languages?” by ndx, Julia Community (forem.julialang.org), 2022-07-12.
URL: https://forem.julialang.org/ndx/where-do-we-go-next-from-here-with-programming-languages-3ec2
Once upon a time - exactly 74 years ago - a man born in Berlin, Prussia, at the time of the German Empire, published the first concept of what we today call a "high-level programming language". But wait, a lot happened before then. History has it that after he graduated, he went to work for one of the big motor companies of his time, using his artistic skills in the design of advertisements. He also worked as a design engineer at an aircraft factory near Berlin. This required performing many routine calculations by hand, leading him to theorise and plan a way of doing them by machine. This man, who in the twilight of his days would die from a heart attack, was self-taught, and he built and developed much of his work without knowledge of other mechanical computing machines that already existed. Although he would later build on existing discoveries and technologies, between 1936 and 1945 he was in near-total isolation; he completed his work entirely independently of the other leading computer scientists and mathematicians of his day. It was in 1945, after the Luisenstadt bombing, that he fled from Berlin to the rural Allgäu. In the extreme deprivation of post-war Germany he was unable to build computers - which was in any case forbidden by the Allied Powers - so he devoted his time to the development of a higher-level programming model and language. In 1948 he published a paper on it and presented it at an annual scientific meeting. His work failed to attract much attention. In a 1957 lecture, he expressed his hope that the higher-level programming language he had created, "after some time as a Sleeping Beauty, will yet come to life." He expressed disappointment that the designers of a later programming language, who used much of his work, never acknowledged the influence of his high-level programming language on their own.
The man we've been talking about is Konrad Zuse, and Plankalkül was the name of the higher-level programming language he built, which was essentially made for engineers. ALGOL 58 was the programming language designed with most of his work, but its developers never acknowledged the influence of Plankalkül on it. Plankalkül was more comprehensively published in 1972. The first compiler was implemented by Joachim Hohmann in his 1975 dissertation. Other independent implementations followed in 1998 and 2000 at the Free University of Berlin. History continues, of course, but we will end there for now. To know where we are going we somehow need to know where we are coming from - and that story was told as a way to take us back a little, to see where this "higher level programming language" thing started and where it's likely to go. There are currently more than 700 programming languages out there as of this writing. Some are low level, most are high level. Some are designed for speed, others for simplicity. Some are designed for security, others for low-level interactions, and the list goes on and on. Go out now and ask someone with no programming background: how many programming languages do we have? Let them make an educated guess based on how they see the computer world; chances are they would answer in the range of 0-10. Those who answer in the range of 20-50 would be waiting for you to tell them they were wrong. Chances are, as well, that if you made an educated guess now, you might get it completely wrong too. The vague answer is that there are over 700 programming languages. WOW!!!! I failed it as well; it was only by "googling" that I came to know of this unimaginable number. Now, the question is: why 700? Why not 1? Why not 5? Why not 10? Why not even 20? How come it's over 700? 😱 That's an open question, so discuss it with your friends.
My view is this: there are over 700, but are all 700 being massively used? Of course, that's a simple NO. If all 700 are not being used, why do they exist? They exist because there's an informal law in computer programming: "programming languages are only created, they cannot be destroyed." But why can't they be destroyed? Because the trend is: learn programming, then fall in love with a language. Then technology advances, and we can no longer use your beloved language efficiently anymore, as the current world demands something with a different philosophy - something your beloved language could do if tweaked, but wasn't designed for - yet you choose to continue using it, and only use a different language when your job requires it. So you see, we can only create but not destroy, because every creation gathers a fan base which won't let it die, no matter its age - Plankalkül is a case in point, as it's still being used. But that doesn't answer the question of this article: where do we go next from here with programming languages? Honestly speaking, technology is one sweet field where advances are great (and they will always come) - as long as they don't hurt, or demand that you learn something entirely new, or change your current perspectives and beliefs. The golden question now is this: if the man or woman who created your programming language so loved the language they learnt programming with that it blinded them to change, or to thinking in a new paradigm, do you think we would have seen any high-level programming language other than Plankalkül? Would there have been Python if Van Rossum had so loved BASIC, or C++ if Bjarne had so loved C, or Julia if Viral could only see Lisp? And the list goes on and on and on.
The point is that technology itself is built on ADVANCEMENT, so no matter how much you love a language, know it's only going to last a while - or, if you're lucky enough, your whole lifetime. But no matter how hard you try, your kids won't use what you use, or even if they use what you use, they won't use it in the same way you did - that's how technology works, so embrace it. But still the question is not answered yet: where do we go next from here with programming languages? Where we go next depends on two things: (1) whether a body is set up to regulate the creation of programming languages, and (2) whether new languages are created only when they come with a completely different philosophy. Now think for a moment: if advancement and discovery were regulated, would any "outside-the-box" discoveries be accepted? Of course the answer is NO. History shows that regulation only helps in maintaining moral conduct and order, but kills imagination and discovery in new areas. This is the reason our school system still thinks "grading students" is the norm and so is good, even when, mathematically and statistically speaking, grades are no projection of any child's future, as countless pieces of evidence have shown; rather than helping students, grading kills their morale and makes some feel like losers. The truth is that if there were regulation (which there won't be) in the technology space to control new discoveries, the world would have been a boring place, as we would only have seen ways to improve the post office system instead of envisioning an "email" along with it. C++ was designed with "classes" in mind. So it's built on the already existing ideas of "C", but with "classes" included. So can we say "classes" is a philosophy different enough to force the creation of a new language? If it's not, why didn't we just extend the language "C" so it accepts the "class" idea - then we wouldn't need a new language, but would simply have one with two design philosophies?
The answer to this question is itself a question. A car can drive at blazing speed and take us where we want to go in a short time and with comfort, compared to walking. An aeroplane can fly us to where we want to go in a short time and with comfort, compared to walking or driving a car. But if a car "can take us in a short time and with comfort" to where we want, compared to walking, why do we need an aeroplane at all? Why not just add to the design of a car so it can fly, and then we only need one device - just a car? Until you can answer this question with the car and aeroplane analogy, you can't answer the question with C and C++. The truth is that "design and philosophy" is very different from "features and abilities". "Features and abilities" build on a pre-existing "design and philosophy", but "design and philosophy" is built entirely on new ideas and thought processes. There is no difference between a Tesla and a Ferrari except their features and abilities. So we only needed a Ferrari when we needed new features; for this we do not need a new device, we simply have to build on the pre-existing design and philosophy of a car. But when it comes to an aeroplane, it brings a new idea - the idea of objects being able to fly in the air. That's a whole new thought process. So you see, it's best if we sit down and think of how to make "objects" fly instead of how to make a "car" fly. And then we finally did it; the name doesn't matter - we could have called an aeroplane a "flying car" - what matters is the different idea an aeroplane embodies. This is how it is with programming languages too.
Yes, there are languages with "features and abilities" whose developers, in the true sense, should have thought of contributing to the original languages they borrowed the idea from instead of creating new ones - but we humans know how we all want something called "our" own, instead of saying we built on "their" own. This is the reason there are many programming languages in the world today, not the design-and-philosophy thing. It's good to know that C++ was designed for classes and C for procedural programming; when a use case comes along for each design, I know where to go. The question then is: if we have very limited programming paradigms and concepts, how come we have over 700 languages - as if we were talking to humans - when in the true sense we're only talking to a computer? The reason is the "features and abilities" that lead to new programming languages instead of to the development of already existing ones, even when it's just "features and designs". Of course we all caused it, when we started attributing programming languages to "owners", "creators", etc. If there were no "creator" or "owner" of a programming language, then the idea of contribution would come easily to us humans. Look at the earth: imagine if only Mr. A owned planet Earth; we would all have seen it as a place to just dump and go. But because we see it as belonging to "humanity" as a whole, we all contribute to its good. So this is one area where open source can go in the future: an area where only "contributors" exist, and no "creators", "owners" or "founders". Still the question is not yet answered: where do we go next from here with programming languages? I think now I can give my view.
The number of programming languages will keep surging, at a rate of probably 100 languages for each new programming paradigm or concept, as the creation of any programming paradigm leads to at least 100 programming languages with the same "design and philosophy" but different "features and abilities" - i.e., programming languages to programming paradigms at a ratio of 100:1. Secondly, programming paradigms and new concepts will diminish, as most of the emphasis will be on languages instead of on new paradigms that would help the technical computing community. This of course is my view, and it can be wrong. Finally, I want to speak on Julia vs Python. Comparing these two languages is like comparing an aeroplane to a car. They both have one purpose: take us where we want to go more quickly and comfortably than if we walked. But they have different "designs and philosophies": one moves on land and is quick compared to a human on foot; the other moves through the air and is quickest compared to both the car and a human on foot. So you sit there and ask: should I switch to Julia from Python, or go learn C++, or even Ruby? C++ is a programming language used to talk at a lower level with the computer. It is still popular because of an idea I call "diminishing-increasing dependency". Diminishing-increasing dependency is a situation where we are moving away from something but haven't found alternatives to fully replace it, so the more we move away from it looking for alternatives, the more we rely on it. Check the last 100 programming languages designed, and you'd see what I'm talking about: the syntax is so different from C++ that you wouldn't even think any programming language should look the way C++ does. But the truth is that, yes, the generation we're slowly moving into does not want a program to be "dirty" like C++ even when it does the work; we want something we can read to somebody else and still give to the computer to execute.
But the truth is that the computer is still at a stage where it only understands 0's and 1's - and C++ is the language that interacts with it most closely at that level - hence the better the language we look for, the more C++ we end up using. So C++ will keep getting popular, as it has no competitor, because no languages are being developed in that "dirty" style anymore just so they can talk at a lower level with a computer. So you see modern languages having more competitors than old languages like FORTRAN, COBOL, C and C++, because no one is building languages in that style anymore - and the more we look for the perfect language, the more we have to depend on the only ones we have for now that can talk to a computer so closely: "diminishing-increasing dependency". Of course, 100 years from now, C++ and its counterparts will all be gone in a way and will only be used as legacy languages, because we will have developed the computer to a stage where it can take human-readable code and still be fast - even faster than C++ and its counterparts would have been. Is 100 years too much of an exaggeration? Well, give it any number, but one thing is sure: it's not happening in your time if you're old enough to read this at the time of writing. Now back to Python and Julia. If you're an avid Python lover, do you believe deep down that 20 years from now we will all still be depending on Python to run numerical simulations? Well, the future will tell, but surely we know the answer in a way. The truth is that we can't depend on Python for numerical simulations 10, 20, 30 years from now. The language wasn't designed for "speed" and "numerical computing". It was designed for "readability" and "general use". Every day, data floods out into the world, and we want something a bit faster, and one that makes our lives easier as well. Python is a great language, but it just got into a community that will come back to bite it.
Numerical computing is not "general use"; it needs to be "fast", and that's the first requirement. Then it needs to embrace the mathematical world and breathe it. Python itself has neither of these traits; it only works as an interface after the work is done inside by other core languages - making you worry about learning another language if you have an idea worth building from scratch that is very compute-intensive. Now when you come to Julia, you could, with some exaggeration, say that the 100 years I mentioned above are finally here. It gives you speed - even more than C++ would give you (this is still largely debatable) - along with readability. So in a sense, Julia is the 21st-century language, as it seeks to finally make C++ and its counterparts obsolete. So you see, Python is not comparable to Julia. Julia is competing with C++ and its counterparts, we all know that; as for Python, it is just too small for it - and the fact that the community hasn't fully accepted Julia yet doesn't put Python anywhere near it. Julia is just waiting for full industry acceptance. Python is only there because of its uncountable available libraries (libraries mostly developed outside the language when they do a lot of intensive work) and the community base it has over Julia, not the language itself. Julia beats C++'s speed in some instances and also breathes and lives the math world. You can use '√' to represent a square root, and it's valid code. 2x - 5y + 4x * 23y is valid Julia code. This is just a pinch of how it lives for the mathematical world - so your kids will grow up to see that just as we understand "2x - 5y", the computer understands it as well. Of course, Python can add that if it chooses, but the point is it wasn't designed for that - you want to be on an aeroplane flying, not in a car that was tweaked so it can fly.
Source: “History of Fortran (Fortran II source code)”, Fortran Discourse (fortran-lang.discourse.group), 2020-06-14.
URL: https://fortran-lang.discourse.group/t/history-of-fortran-fortran-ii-source-code/117
I noticed the following resources today: http://www.softwarepreservation.org/projects/FORTRAN/ and https://news.ycombinator.com/item?id=23497895. Among other things, they have the FORTRAN II compiler sources, and they claim they got it working in a simulator, as well as the programs that the compiler generates. I don’t have time to do it myself, but would somebody be interested in trying to get the FORTRAN II compiler working and posting step-by-step instructions? Also, I would be interested in having a page at fortran-lang.org about Fortran’s history, with a focus on the language changes, with code examples. We can do that from FORTRAN 66 onward relatively easily using the published standards, see e.g. here: https://www.fortran90.org/ But I would like to extend this all the way to FORTRAN I if possible. Based on https://en.wikipedia.org/wiki/Fortran#History, it seems there was simply:

FORTRAN I (1957)
FORTRAN II (1958)
FORTRAN III (1958, but never released as a product)
FORTRAN IV (1962)

FORTRAN IV was essentially equivalent to FORTRAN 66, of which the standard is available (see the link above). The Wikipedia page is a good start, but I would prefer to have more code examples for each Fortran version; I think they convey much better what the language allowed and what it looked like. The basic elements of Fortran didn’t really change from FORTRAN I, so having this page at fortran-lang.org would allow us to drive home what problem Fortran is trying to solve and that the original mission still applies today.

I am quite confused about the early history of the standardization of FORTRAN. The English Wikipedia page says about ANSI X3.9-1966 Fortran 66 (published on 7 March 1966): FORTRAN 66 effectively became the first industry-standard version of FORTRAN.
The ANSI X3.9-1966 Fortran 66 standard itself is available on the Internet Archive. But the page does not speak about the ECMA-9 Fortran standard, published in April 1965 (ECMA is the European Computer Manufacturers Association). I am also confused about this paragraph in its introduction: “The ECMA FORTRAN has been adopted by the International Organization for Standardization (ISO) as intermediate level (referred to as Intermediate FORTRAN) between ‘FORTRAN’ and ‘Basic FORTRAN’ as will be described in the ISO Recommendation on FORTRAN. The ECMA FORTRAN is a subset of the full ISO FORTRAN.” I thought Fortran became an ISO standard only with Fortran 90, so what is that ISO FORTRAN they are talking about? I have found two interesting documents about the early standards, including https://fortran.bcs.org/~fortrandev/1972/fortran_in_washington_1972.pdf, which says: “In July 1972 ISO published a recommended standard for FORTRAN under the document number R1539. This has three levels of the language, corresponding closely with American Standard FORTRAN, ECMA FORTRAN, and American Basic Standard FORTRAN.” That second document gives more details about the history of Fortran standardization. My understanding is that ECMA Fortran was the first Fortran standard, in 1965, but it was a subset of the future Fortran 66, which was itself (like Fortran 77) a double standard, with a full FORTRAN standard and a Basic FORTRAN standard; ECMA was somewhere between those full and basic levels. In 1972, ISO published an ISO Recommendation for Fortran (R1539), including these three levels of the language. I have found no clear definition of what an ISO Recommendation is, but my understanding is that it is not a true standard, though it may become one later (?). It is all the more surprising that it was not until Fortran 90 that Fortran became an ISO standard.
I’m afraid that getting the compiler working on an IBM 704 simulator will still leave a lot more to be done before the program can be run. The output from the compiler would be SAP assembler code (example of SAP code for a short Fortran program that inverts a 5 x 5 Hilbert matrix). That assembler code would need to be run through the assembler and linked with the Fortran II library to obtain the equivalent of an EXE/a.out. Add to that the IBM 704 OS, simulated peripherals such as card reader, line printer, channels, etc.! With a modern Fortran compiler one should have little trouble running Fortran II programs on a PC or Mac, except for a few features such as sense lights and sense switches that may not be included in the simulator.

Whoops, formatting is lost, try again:

C SOLUTION OF QUADRATIC EQUATION
C (P. 122 OF A FORTRAN PRIMER BY E. ORGANICK)
    1 READ INPUT TAPE 5, 51, ANAME, N
   51 FORMAT(A6,I2)
      WRITE OUTPUT TAPE 6,52, ANAME
   52 FORMAT(1H1,33HROOTS OF QUADRATIC EQUATIONS FROM A6)
      DO 21 I = 1, N
      READ INPUT TAPE 5, 53, A, B, C
   53 FORMAT(3F10.2)
      WRITE OUTPUT TAPE 6,54, I, A, B, C
   54 FORMAT(1H0,8HSET NO. I2/5H A = F8.2,12X,4HB = F8.2,12X,4HC = F8.2)
      IF(A) 10, 7, 10
    7 RLIN = -C/B
      WRITE OUTPUT TAPE 6, 55, RLIN
   55 FORMAT(7H LINEAR,25X,4HX = F10.3)
      GO TO 21
   10 D = B**2 - 4.*A*C
      IF(D) 12, 17, 17
   12 COMPR = -B/(2.*A)
      COMP1 = SQRTF(-D)/(2.*A)
      COMP2 = -COMP1
      WRITE OUTPUT TAPE 6, 56, COMPR, COMP1, COMPR, COMP2
   56 FORMAT(8H COMPLEX,21X,7HR(X1)= F10.3,11X,7HI(X1)= F10.3,/1H ,28X,
     17HR(X2)= F10.3,11X,7HI(X2)= F10.3)
   16 GO TO 21
   17 REAL1 = (-B + SQRTF(D))/(2.*A)
      REAL2 = (-B - SQRTF(D))/(2.*A)
   20 WRITE OUTPUT TAPE 6, 57, REAL1, REAL2
   57 FORMAT(6H REAL 25X,5HX1 = F10.3,13X,5HX2 = F10.3)
   21 CONTINUE
      WRITE OUTPUT TAPE 6, 58, ANAME
   58 FORMAT(8H0END OF A6)
      GO TO 1
      END
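For readers who don’t read fixed-form FORTRAN II, the program’s logic translates almost line for line into modern code. Here is an illustrative Python version (my translation, not part of the original thread), with the same three branches: linear, complex roots, and real roots:

```python
import math

def solve_quadratic(a, b, c):
    """Mirror the FORTRAN II program's branches for a*x**2 + b*x + c = 0."""
    if a == 0:
        # Degenerate linear case (statement 7 in the listing): b*x + c = 0.
        return ("LINEAR", -c / b)
    d = b ** 2 - 4.0 * a * c           # discriminant, as in statement 10
    if d < 0:
        # Complex conjugate roots (statement 12).
        re = -b / (2.0 * a)
        im = math.sqrt(-d) / (2.0 * a)
        return ("COMPLEX", (re, im), (re, -im))
    # Real roots (statement 17); d == 0 gives a repeated root.
    r1 = (-b + math.sqrt(d)) / (2.0 * a)
    r2 = (-b - math.sqrt(d)) / (2.0 * a)
    return ("REAL", r1, r2)

print(solve_quadratic(1.0, -3.0, 2.0))  # ('REAL', 2.0, 1.0)
print(solve_quadratic(0.0, 2.0, -4.0))  # ('LINEAR', 2.0)
```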
Retrocomputing Stack Exchange: What did the DoD think Fortran lacked? (2020-08-27)
https://retrocomputing.stackexchange.com/questions/15988/what-did-the-dod-think-fortran-lacked
According to https://en.wikipedia.org/wiki/COBOL the project to design COBOL began when On 28 and 29 May 1959 (exactly one year after the Zürich ALGOL 58 meeting), a meeting was held at the Pentag...
The meeting that defined the requirements of the new language took place on May 28–29, 1959. Charles Phillips prepared a memo several months later summarizing the decisions made at that meeting. Its listing of requirements is reprinted on page 201 of the ACM’s History of Programming Languages:

a. Majority of group supported maximum use of simple English language; even though some participants suggested there might be advantage from using mathematical symbolism.
b. A minority suggested we steer away from problem-oriented language because English is not a panacea as it cannot be manipulated as algebraic expressions can.
c. The need is for a programming language that is easier to use, even if somewhat less powerful.
d. We need to broaden the base of those who can state problems to computers.
e. The [Common Business Language] should not be biased by present compiler problems.

The committee did not consider FORTRAN as an alternative. According to Jean E. Sammet, who was chair (she describes herself as the “chairman”) of two of the committees that developed COBOL and served on a third, the major inspirations were FLOW-MATIC (developed by Grace Hopper and others for Remington-Rand Univac), AIMACO (developed by the Air Materiel Command based on Hopper’s work, and described by Sammet as “a minor modification of FLOW-MATIC”) and COMTRAN (Commercial Translator, which at that time existed as a manual at IBM, and had never been implemented). Sammet claims that FACT, developed at Honeywell, had far less influence on COBOL than some people believed.

The entire chapter I link to has extensive notes that Sammet took at the time from the committee that developed COBOL, and the decisions it made. She makes the particularly interesting admission on page 221:

I felt there was a strong anti-IBM bias in this committee from me, and from some (but certainly not all) of the others.
Since I was not working for IBM at the time, I can freely (although not with pride) admit that in some cases suggestions or decisions were made on the basis of doing things differently from how IBM did it. For example, we felt that the verb for loop control should not be called DO because that was how FORTRAN did it.

Sammet lists among the ideas COBOL took from FLOW-MATIC: “It used full data names rather than short symbolic names (as in FORTRAN)”, e.g. SOCIAL-SECUR instead of SOCSEC, and it used English words as commands. Less cosmetically, it allowed fields to be packed into a data word. She says, “Note that Fortran assumes that every number is in a single machine word.” It separated data definitions from instructions, which she says became so commonplace that it is difficult to appreciate what a conceptual breakthrough it was. Among the ideas she lists as coming from COMTRAN are nested data structures, expressions, and conditionals. It was controversial at that time to allow mathematical formulas and even Boolean expressions, as some committee members believed those were only needed in a few edge cases. She also states that IAL, which developed into ALGOL, had a significant influence, by convincing the committee not to follow its example, and instead to allow in its source code only characters that actually exist.

FORTRAN was, at the time (*1), lacking almost everything, from string handling to all I/O besides reading numbers from cards or tape. Heck, not even integer size was guaranteed across machines. There was no real way of structuring or flow control besides GOTO — even subroutines/functions were only integrated a year before, with FORTRAN II. For the most part, FORTRAN is a symbolic assembler with a focus on mathematics; that made it easy to write formulas, but not much else. But real-world computing is about data handling and I/O. This is especially true for a huge organization such as the US military, represented by its bureaucratic arm, the DoD.
Being able to easily write down some complex calculation is nice, but useless in an environment where it's about managing inventory, ordering supplies, calculating pay, and having all of that delivered on time. An army is like a huge corporation, not a scientific institute, and the task at hand is data processing, not number crunching.

Data processing is vastly different from number crunching — it's a completely different world. It's about the classic "read card, process item, write output" loop, the stuff that got automated with punch cards. It is the core reason why the /360 has survived until today as a successful architecture: its instruction set is perfectly tailored to shovel data, being built to support these principles. The fact that IBM tried to make it an all-round (360-degree) architecture by including floating point, and even trying to make it fit for process control, didn't really matter in the long run — others were much better at either.

And the DoD did need data processing to fight the wars of the time (Korea had just ended and Vietnam was coming up), and a language to support writing data-processing programs in a machine-independent way. That's why COBOL was developed as a result of the mentioned conference.

*1 - FORTRAN did improve a lot since back then; still, it didn't really change its basic nature.
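The "read card, process item, write output" loop described above is easy to sketch. The inventory "cards" and field layout below are invented purely for illustration:

```python
import csv
import io

def process(card_reader, printer):
    # The classic unit-record loop: read a record, process the item,
    # write one output line per record.
    for name, qty, unit_price in csv.reader(card_reader):
        total = int(qty) * float(unit_price)
        printer.write(f"{name:<12}{total:>10.2f}\n")

# Hypothetical punch-card images, here as CSV text: item, quantity, unit price.
cards = io.StringIO("RATIONS,120,1.25\nBOOTS,40,19.90\n")
out = io.StringIO()
process(cards, out)
print(out.getvalue(), end="")
```

Every record flows through the same transform, and the program never needs the whole dataset in memory at once — the essence of the batch data-processing style the answer describes.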
The Register: ALGOL 60 at 60: The greatest computer language you've never used and grandaddy of the programming family tree
Richard Speed, 2020-05-15
https://www.theregister.com/2020/05/15/algol_60_at_60/
Back to the time when tape was king
2020 marks 60 years since ALGOL 60 laid the groundwork for a multitude of computer languages. The Register spoke to The National Museum of Computing's Peter Onion and Andrew Herbert to learn a bit more about the good old days of punch tapes.

ALGOL 60 was the successor to ALGOL 58, which debuted in 1958. ALGOL 58 had introduced the concept of code blocks (replete with begin and end delimiting pairs), but ALGOL 60 took these starting points of structured programming and ran with them, giving rise to familiar faces such as Pascal and C, as well as the likes of B and Simula.

"In the 1950s most code was originally written in machine code or assembly code," said Herbert, former director of Microsoft Research in Cambridge, with every computer having its own particular twist on things. A first generation of languages, called "Autocode", existed for coding problems like equations, which could then be translated into machine code, but lacked the bells and whistles of today. Worse, some had features that others lacked, making hopping between systems tricky.

"There was an Autocode for the [Elliott] 803," said Onion, "but it only supported expressions like A + B = C, so if you've got a complex equation, you have to break it down into individual single binary operations. So there was still a lot of hard work to be done by the programmer."

"Fortran," said Herbert, "emerged as the first real programming language for scientific and numeric work. That convinced people that having higher-level languages (as they called them then – they were pretty primitive by modern standards) made programmers more productive."

The overhead of compiling, and inefficiencies in the compilers themselves, meant that machine code remained king of the performance hill, but for those doing science work, the ability to churn out some code to solve a problem and then simply move on to the next was appealing.

"Fortran," Herbert continued, "was more like an autocode," before laughing, "It still is in some ways!
"And a bunch of people thought you could do better."

Enter the International Federation for Information Processing (IFIP), which Herbert recalled "had a whole bunch of committees who looked at standards and problems in computing". One group started on the design of what was then called an "Algorithmic Language": a language for writing algorithms. The output, in 1958, described the language "ALGOL 58". However, as engineers began to create compilers for the new system, they found "all kinds of things hadn't really been thought about or worked through properly," recalled Herbert.

And so there were revisions and changes. A periodical called "The ALGOL Bulletin" detailed the travails of those involved as the problems and the weaknesses in the language were dealt with (or at least attempted). The process was not unlike an open-source mailing list today, but in paper form. Eventually, Herbert told us, "they published the ALGOL 60 report, which is the baseline that everyone then worked to."

The committees were under pressure and also suffered a little from differing international approaches. The American side had a lot of experience in Fortran and were seeking something that could quickly be made to work on their computers, while the Europeans were a little more cerebral and had, Herbert laughed, "terrible notions like beauty and elegance in mind for the language".

"People were sorting out some of the things that we now take for granted like ideas in structured programming, data structures, data types," he added.

Seeking solutions to the problem of portability of programmers between systems and of code between hardware generations, as well as avoiding the pain of having to rewrite programs every time a new iteration of computer arrived, vendors embraced the language, with variants cropping up over many manufacturers.

Alas, those seeking a handy-dandy "HELLO WORLD" example will be disappointed.
The Achilles' heel of the language that would go on to inspire so many others was that it lacked standard input/output capabilities. "The defining committee couldn't agree on how to do input/output," said Herbert. "They decided that would be left to a library, and that library would be user dependent."

"In this case," added Onion, "the user being the compiler writer."

Oh dear. The omission pretty much did for vendor independence, as manufacturers naturally went their own way, leaving large chunks of code incompatible between systems. There were also elements of ALGOL 60 that were open to interpretation, leaving it a little compromised from the start.

While ALGOL ploughed its furrow, Fortran continued to be developed in parallel. "People in the Fortran world," explained Herbert, "saw ideas in ALGOL they quite liked and brought them across." As the decades passed, Fortran remained the centre of gravity for scientific computing while ALGOL became more of an academic language, used for teaching computer science ideas. "It was quite heavily used in the scientific community," Herbert said. "Most mainframe manufacturers supported it."

Some of the team behind ALGOL 60 stayed with the project and went on to come up with ALGOL 68, which, as far as Herbert is concerned, "nailed all the things that ALGOL 60 had left a bit vague". Indeed, it was hard to avoid in the 1970s for those taking computer science courses. This hack has fond memories of the successor language, while the grandfather of Reg sub-editor Richard Currie had a hand in the development of ALGOL 68-R and RS.

"It had the world's most exotic input output system," Herbert laughed.

It was also, sadly for its enthusiasts, a bit of a dead end. Despite ALGOL 68-R becoming widely used in (particularly British) military applications for a time, it would take until the 1970s for a full implementation of ALGOL 68 to become available.
The last edition of The ALGOL Bulletin was published in 1988, with its editor noting: "ALGOL 68 as a language is very stable. It is used and loved by those who understand its benefits, and ignored (or misquoted) by the rest."

The story of ALGOL 60 is not just the story of the language's eventual fate, but also of those that it inspired. ALGOL W, based on a proposal for ALGOL X by Niklaus Wirth and QuickSort creator Tony Hoare, would go on to inspire Wirth's Pascal and Modula-2. Pascal's influence continues to be felt today.

ALGOL 60 also heavily influenced the Combined Programming Language (CPL), developed in the 1960s but not implemented until the following decade. CPL in turn led to Basic CPL (BCPL), from which B descended. The B language was further developed to become C.

Tony Hoare was responsible for the implementation of ALGOL 60 on the Elliott 803 computer, an example of which remains operational at The National Museum of Computing, although compiling and running a program on that hardware is a little different to the development environments to which coders are now accustomed.

First, the compiler must be loaded from paper tape. The ALGOL program itself is then fed into the tape reader and "it sort of chunters away," remarked Onion, "for anything between 30 seconds to perhaps 15 or 20 minutes during the compilation."

[Video caption: Behold the chuntering]

Once compiled, a program would be free to use the space originally occupied by the compiler. Doing so would, however, not win the programmer any popularity awards, since the next user would have to reload the compiler again. Leaving it in memory meant that multiple programs could be run.
"That made it very popular for teaching," said Herbert, "because you can have a line of students, each with their paper tape with their programme in their hand, and you basically march up to the machine, the machine's got the ALGOL system loaded, you run your programme, it produces gibberish, you go away and think about it, and the next student runs their programme."

With paper tape being king, Onion observed that the experience of programming taught a bit of focus: "When your edit, compile, edit, compile cycle starts to get above about 10 minutes, you start to pay an awful lot of attention to your source code…"

The National Museum of Computing has two Elliott machines in its collection, a 1962 803B (which was donated after spending 15 years lurking in a barn following its decommissioning) and a 903. Both are fully operational and can be seen running once the museum is able to open its doors once again. The 803B, which is maintained by Onion, also features a Calcomp drum plotter as well as some additional input/output features.

As for taking ALGOL 60 itself out for a spin today, there are a few options for those not fortunate enough to have an Elliott 803 or 903 to hand. MARST will translate ALGOL 60 to C, or one can get a feel for the whole 803 experience via a simulator. Although, as ALGOL 60 turns 60, you could just fire up a modern programming language. Lurking within will likely be the ideas of ALGOL's designers. ®
A perspective of Algol 68
Marcel van der Veer, 2024-07-03
https://jmvdveer.home.xs4all.nl/en.post.perspective-of-algol-68.html
The development of Algol played an important role in establishing computer science as an academic discipline. The former Mathematisch Centrum Amsterdam, now CWI, Centrum voor Wiskunde & Informatica, was a leading institute in the design of Algol 68, as it was in later years in the development of Python.

To better understand the position of Algol 68 among today’s plethora of programming languages, we should consider their development. First I will give a chronological overview, discussing the period 1950-1960 in which a number of programming languages evolved, then the period 1980-1990 that saw consolidation of imperative languages, and later the recombination and maturation of existing ideas in the 1990’s, and finally current trends. After that, the particular history of Algol 68 is discussed, and its current position. You will of course be aware that this post is an Algol 68 implementer's perspective.

In the period 1950-1960 a number of programming languages evolved, the descendants of which are still widely used. The most notable are Fortran by Backus et al., Lisp by McCarthy et al., Cobol by Hopper et al. and Algol 60 by a committee of European and American academics including Backus. Algol 60 was particularly influential in the design of later languages since it introduced nested block structure, lexical scope, and a syntax in Backus-Naur form (BNF). Nearly all subsequent programming languages have used a variant of BNF to describe context-free syntax.

At the time of the development of Algol 68, programming languages were required to serve two purposes. They should provide concepts and statements allowing a precise formal description of computing processes and facilitate communication between programmers, and they should provide a tool to solve small to medium-sized problems without specialist help.
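The nested block structure and lexical scope that Algol 60 introduced are by now so pervasive that a sketch in almost any modern language will do; a minimal illustration in Python (chosen here purely for brevity, with invented names):

```python
def outer():
    x = "outer"

    def inner():
        # Lexical scope: x resolves to the nearest enclosing block in the
        # program text, not to the caller's environment at run time.
        return x

    def shadowing():
        x = "inner"  # a declaration in a nested block shadows the outer x
        return x

    # The outer x is untouched by the shadowing block.
    return inner(), shadowing(), x

print(outer())
```

Before Algol 60, such nesting and shadowing rules had no standard formulation; after it, they became the default assumption of language design.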
The context of Algol 68’s development is perhaps adequately illustrated by a quote [1] from Edsger Dijkstra:

The intrinsic difficulty of the programming task has never been refuted … I vividly remember from the late 60’s the tendency to blame the programming languages in use and to believe in all naivety that, once the proper way of communicating with the machines had been found, all programming ills would have been cured.

The early procedural programming languages served the purposes required of them. However, the evolving need to build complex interactive systems asked for decomposition of a problem into "natural" components, resulting in object oriented programming languages starting as early as the 1960’s. The object oriented and procedural paradigms each have strengths and weaknesses, and it is not always clear which paradigm is best suited to certain problems. In numerical and scientific computing, for instance, the benefit of object oriented languages over procedural languages is controversial, since in heavy number crunching efficiency is a top priority. The latter is a likely reason that software packages coded in Fortran are operational today, despite the rise of, for instance, Python in this field.

The period 1960 to 1980 produced most of the major language paradigms now in use. Algol 68 was conceived as a successor to Algol 60. Its syntax and semantics became even more orthogonal and were defined by a Van Wijngaarden grammar, a formalism designed specifically for this purpose. Simula by Nygaard and Dahl was a superset of Algol 60 supporting object oriented programming, while Smalltalk by Kay, Ingalls and Kaehler was a newly designed object oriented language. C, the Unix system programming language, was developed by Ritchie and Thompson at Bell Laboratories between 1969 and 1973. Prolog by Colmerauer, Roussel, and Kowalski was the first logic programming language.
ML by Milner built a polymorphic type system on top of Lisp, pioneering statically typed functional programming languages. Each of these languages spawned a family of descendants, and most modern languages count at least one of them in their ancestry. Other important languages that were developed in this period include Pascal, Forth, Scheme and SQL.

During the 1980's imperative languages consolidated. Although major new paradigms for imperative languages did not appear, many researchers elaborated on existing ideas from the 1970’s, for example object oriented programming, adapting them to new contexts, for example to distributed systems. C++ combined object oriented programming and system programming. The United States government standardised Ada as a system programming language for defense contractors. Mainly in Japan, major efforts were spent investigating so-called fifth-generation programming languages that incorporated logic programming constructs. The functional languages community standardised ML and Lisp. Research in Miranda, a functional language with lazy evaluation, began in this decade.

An important trend in 1980’s language design was an increased focus on programming large-scale systems through the use of modules, reflected in the development of Modula, Ada and ML. Some other notable languages from the 1980’s are Objective-C and Perl.

In the 1990’s, recombination and maturation of existing ideas continued. An important motivation in this period was productivity. Many rapid application development (RAD) languages emerged, which usually were descendants of older, typically object oriented, languages that were equipped with an IDE and garbage collection. These languages included Object Pascal, Visual Basic, and Java. Java in particular received much attention. More radical and innovative were the new scripting languages. These did not directly descend from other languages and featured new syntax and liberal incorporation of features.
Many consider these scripting languages more productive than RAD languages, though others will counter that scripting languages may make small programs simpler, but make large programs more difficult to write and maintain. Nevertheless, scripting languages came to be the most prominent ones used in connection with the internet. Some important languages that were developed in the 1990’s are Haskell, Python and PHP.

Some current trends in programming languages are mechanisms for security and reliability verification, alternative mechanisms for modularity, component-oriented software development, constructs to support concurrent and distributed programming, metaprogramming, and integration with databases. This century has so far seen the introduction of, for example, C#, Visual Basic.NET and Go.

Algol, ALGOrithmic Language, is a family of imperative computer programming languages which greatly influenced many other languages and became the de facto way algorithms were described in textbooks and academic works for almost three decades. The two specifications relevant to this publication are Algol 60, revised in 1963, and Algol 68, revised in 1976. Algol 58, originally known as IAL (International Algebraic Language), was an early member of the Algol family soon superseded by Algol 60. Algol 58 introduced a compound statement, which was restricted to flow of control only and did not relate to lexical scope as do Algol 60’s blocks.

Ideally, a programming language supports systematic expression of algorithms by offering appropriate control structures and data structures, and a precise, consistent formal definition to avoid surprises and portability issues resulting from details that are implementation dependent. Members of the Algol family (Algol 60 and Algol 68, Simula, Pascal and also Ada, …) are considered reasonable approximations of such "ideal" languages, although all of them have strong points as well as disadvantages.
Algol 68 offers appropriate means of abstraction and exemplary control structures that lead to a good understanding of structured programming. Its orthogonality results in economic use of language constructs, making it a beautiful tool for programming.

The design of Algol was firmly rooted in the computing community, at the time a term for the small but growing group of computer professionals and scientists. It formed an international platform for discussing programming languages, compiler construction, program construction, et cetera, and thus Algol had an important part in establishing computer science as an academic discipline in its own right.

Algol 60 was designed by and for numerical mathematicians; in its day it was the lingua franca of computer science. The language introduced block structure with lexical scope and a concise BNF definition that were appreciated by people with a background in mathematics, but it lacked compilers and industrial support, which gave the advantage to languages such as Fortran and Cobol. To promote Algol, its application range had to be extended.

IFIP [2] Working Group 2.1 Algorithmic Languages and Calculi (WG 2.1), which to this day has continuing responsibility for Algol 60 and Algol 68, assumed the task of developing a successor to Algol 60. In the early 1960’s WG 2.1 discussed this successor, and in 1965 descriptions of a language Algol X based on these discussions were invited. This resulted in various language proposals by Wirth, Seegmüller and Van Wijngaarden [3] and other significant contributions by Hoare and Naur. Van Wijngaarden’s paper Orthogonal design and description of a formal language [4] featured a new technique for language design and definition and formed the basis for what would develop into Algol 68.
Many features found in Algol 68 were first proposed in the Algol Bulletin by the original authors of Algol 60 like Peter Naur, by new members of WG 2.1 like Tony Hoare and Niklaus Wirth, and by many others from the world-wide computing community. Reference [5] gives a first-hand account of the events leading to Algol 68. Algol 68 has had a large influence on the development of programming languages since it addressed many issues; for example orthogonality, a strong type system, procedures as types, memory management, treatment of arrays, a rigorous description of syntax, and parallel processing, but also ideas that caused debate over the years, such as context-sensitive coercions and quite complicated input-output formatting. After various, apparently lively, meetings WG 2.1 had not reached unanimous consent. Eventually, Algol 68 was produced by those members wanting a new milestone in language design. Dijkstra, Hoare, Seegmüller and others wrote a brief "Minority Report", more a letter than a report actually, outlining their view on what Algol 60's successor should have been; many years later it was commented that no programming language developed since would have satisfied that vision. Yet others, for example Wirth and Hoare, opted to shorten the development cycle by improving Algol 60, which eventually produced Algol W and later Pascal. Algol 60 syntax is in BNF form, whereas Algol 68 syntax is described by a two-level W-grammar (W for Van Wijngaarden) that can define a context-sensitive grammar. Simply put, in a W-grammar a high-level grammar produces a low-level grammar for all correct particular programs. The concept of context-sensitive grammar was introduced by Chomsky in the 1950s to describe the syntax of natural language, where a word may or may not be appropriate in a certain position, depending on context.
Analogously, Algol 68 syntax defines syntactic restrictions; for example demanding that applied-identifiers or operators be declared (but not necessarily before they are applied), or demanding that modes result in finite objects that require finite coercion, et cetera. To reject incorrect programs, a context-free syntax must be complemented with extra rules formulated in natural language. The rigorous Algol 68 specification had a few loose ends, while the C specification had a few hundred. Probably because of the formal character of the Revised Report, which takes some study to comprehend, the misconception got hold that Algol 68 is a complex language. In fact it is rather lean, and the alleged obscurity of description is toned down by most who studied it [5]. Since it is argued that the specification of many contemporary languages, including that of modern C, is more complex than that of Algol 68 [6], one can ask whether the latter just looked complex at the time, fifty years ago. Algol 68 was defined in a formal document, first published in January 1969, and later published in Acta Informatica and also printed in Sigplan Notices. A Revised Report was issued in 1976 [7]. Algol 68 was the first major language for which a full formal definition was made before it was implemented. Though known to be terse, the Revised Report does contain humour solis sacerdotibus — to quote Koster from [5]: The strict and sober syntax permits itself small puns, as well as a liberal use of portmanteau words. Transput is input or output. ’Stowed’ is the word for structured or rowed. Hipping is the coercion for the hop, skip and jump. MOID is MODE or void. All metanotions ending on ETY have an empty production. Just reading aloud certain lines of the syntax, slightly raising the voice for capitalized words, conveys a feeling of heroic and pagan fun (…) Such lines cannot be read or written with a straight face. 
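The restriction mentioned earlier, that every applied identifier must be declared somewhere in its range (though not necessarily before it is applied), is exactly the kind of rule a context-free grammar cannot express; a separate check over the parsed program is needed. The following toy sketch in Python is my own illustration of that idea, not Algol 68 syntax or any real compiler pass:

```python
# Toy illustration (not Algol 68 syntax): a context-free parser cannot
# enforce "every applied identifier is declared in the block", so a
# checker walks the parsed program with a symbol table.

def undeclared(statements):
    """statements: list of ('decl', name) or ('use', name) tuples.
    Declarations need not precede uses, as in Algol 68."""
    declared = {name for kind, name in statements if kind == 'decl'}
    return [name for kind, name in statements
            if kind == 'use' and name not in declared]

program = [('use', 'x'),    # applied before its declaration: fine
           ('decl', 'x'),
           ('use', 'y')]    # never declared: reject the program

print(undeclared(program))  # ['y']
```

A W-grammar folds such checks into the syntax itself; most practical compilers, including Algol 68 Genie, instead perform them in a separate pass, much as above.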
Algol 68 was designed for programmers, not for compiler writers, in a time when the field of compiler construction was not as advanced as it is today. Implementation efforts based on formal methods generally failed; Algol 68’s context-sensitive grammar required some invention to parse [8]. At the time, compilers usually were made available on mainframes by computing centres, which may explain why Algol 68 was popular in locations rather than areas, for instance Amsterdam, Berlin or Cambridge. It was relatively popular in the United Kingdom, where the ALGOL68R, ALGOL68RS and ALGOL68C compilers were developed. Commercial initiatives were relatively unsuccessful; for instance the FLACC compiler sold just twenty-two copies [9]. Industry did not adopt Algol 68, with a few exceptions. There may be as many opinions on this as there are experts you might ask. IBM would not support it (though CDC did, on demand of academia), it was not as fast as Fortran (but might have been), there was no community outside academia, implementations were lagging … As so often, the timing must be right, the environment must be open to acceptance and adequate marketing should be in place, for an innovation to have a chance. But this is my two cents. Once university mainframes were phased out, the compilers on them were decommissioned. For scientific work, many migrated in the 1990s from mainframes to workstations that did not have Algol 68. In fact, my leaving academia in 1991 led me to write my own Algol 68 implementation so I could continue programming in it. Algol 68 apparently influenced many who were exposed to it, and it may still do so. A few individuals wonder, even today, whether it would have a future. In my opinion, the world has moved on. Although the language was innovative in its day, development stopped, so it reflects the era it was developed in.
Probably there is a newer language like Python that better suits your modern application on modern infrastructure, and has a large community. Algol 68 will not catch up anymore, and further development would probably turn it into an anachronism. On the other hand, Algol 68 appears very familiar to young people, which is a testament to the influence it had on how we think about programming languages. Therefore the Algol 68 Genie project preserves Algol 68 out of educational as well as scientific-historical interest. Those interested in the language may study it and actually use it to understand the language and the influence it had, and experience what computer science and computing were like in its pioneering years, half a century ago. Considering the reactions to the project, I think it is fair to say that it succeeds in this purpose. Of course, people also use it for programming, for instance in numerical mathematics as I like to do. Despite its age, even today Algol 68 is a beautiful means for writing algorithms.

Notes

[1] Transcript from a keynote delivered at the ACM 1984 South Central Regional Conference. Source: E. W. Dijkstra Archive, the manuscripts of Edsger W. Dijkstra.
[2] IFIP, the International Federation for Information Processing, is an umbrella organisation for national information processing organisations. It was established in 1960 under the auspices of UNESCO.
[3] Adriaan van Wijngaarden (1916-1987) is considered by many to be the founding father of computer science in the Netherlands. He was co-founder of IFIP and one of the designers of Algol 60 and later Algol 68. As leader of the Algol 68 committee, he made a profound contribution to the field of programming language design, definition and description.
[4] A. van Wijngaarden, Orthogonal design and description of a formal language. Available online from CWI.
[5] Proceedings, Conference on the history of Algol 68 [1993]. Available online from CWI.
[6] K. Henney, Procedural Programming: It’s Back? It Never Went Away. Available for viewing on YouTube.
[7] A. van Wijngaarden et al., Revised report on the algorithmic language Algol 68 [1976]. Available online from CWI, or in HTML as a previous blog post, or as part of Learning Algol 68 Genie.
[8] B. Mailloux, On the implementation of Algol 68 [1968]. Available online from CWI. Algol 68 Genie employs a multi-pass scheme to parse Algol 68. Extracting declarations before parsing, as proposed by Mailloux, renders the W-grammar LALR.
[9] Source: Chris Thomson, formerly with Chion Corporation, on comp.lang.misc [1988].
https://thehistoryofcomputing.net/website/algol
The History of Computing: ALGOL
Today we’re going to cover a computer programming language many might not have heard of, ALGOL. ALGOL was written in 1958. It wasn’t like many of the other languages in that it was built by committee. The Association for Computing Machinery and the German Society of Applied Mathematics and Mechanics were floating around ideas for a universal computer programming language. Members from the ACM were a who’s who of people influential in the transition from custom computers that were the size of small homes to mainframes. John Backus of IBM had written a programming language called Speedcoding and then Fortran. Joseph Wegstein had been involved in the development of COBOL. Alan Perlis had been involved in Whirlwind and was with the Carnegie Institute of Technology. Charles Katz had worked with Grace Hopper on UNIVAC and FLOW-MATIC. The Germans were equally as influential. Friedrich Bauer had brought us the stack method while at the Technical University of Munich. Hermann Bottenbruch from the Institute for Applied Mathematics had written a paper on constructing languages. Klaus Samelson had worked on a computer called PERM that was similar to the MIT Whirlwind project. He’d come into computing while studying eigenvalues. Heinz Rutishauser had written a number of papers on programming techniques and had co-developed the language Superplan while at the Swiss Federal Institute of Technology. This is where the meeting would be hosted. They met from May 27th to June 2nd in 1958 and initially called the language they would develop IAL, or the International Algebraic Language, but would expand the name to ALGOL, short for Algorithmic Language. They brought us code blocks, the concept that you have a pair of words or symbols that begin and end a stanza of code, like begin and end. They introduced nested scoped functions. They wrote the whole language right there.
You would name a variable by simply saying integer, or set the variable as a := 1. You would instantiate a for and define the steps to perform until - the root of what we would now call a for loop. You could read a variable in from a punch card. It had built-in SIN and COSIN. It was line based and fairly simple functional programming by today’s standards. They defined how to handle special characters, built boolean operators, floating point notation. It even had portable types. And by the end they had a compiler that would run on the Z22 computer from Konrad Zuse. While it was some of Backus’ best work, it effectively competed with FORTRAN and never really gained traction at IBM. But it influenced almost everything that happened afterwards. Languages were popping up all over the place, and in order to bring in more programmers, they wanted a formalized way to allow languages to flourish, but with a standardized notation system so algorithms could be published and shared and developers could follow along with the logic. One outcome of the ALGOL project was the Backus–Naur form, which was the first such standardization. That would be expanded by the Danish computer scientist Peter Naur for ALGOL 60, thus the name. For ALGOL 60 they would meet in Paris, also adding Father John McCarthy, Julien Green, Bernard Vauquois, Adriaan van Wijngaarden, and Michael Woodger. It got refined, yet a bit more complicated. FORTRAN and COBOL use continued to rage on, but academics loved ALGOL. And the original implementation, now referred to as the ZMMD implementation, gave way to X1 ALGOL, Case ALGOL, ZAM in Poland, GOGOL, VALGOL, RegneCentralen ALGOL, Whetstone ALGOL for physics, Chinese ALGOL, ALGAMS, NU ALGOL out of Norway, ALGEK out of Russia, Dartmouth ALGOL, DG/L, USS 90 Algol, Elliot ALGOL, the ALGOL Translator, Kidsgrove Algol, JOVIAL, Burroughs ALGOL, Niklaus Wirth’s ALGOL W, which led to Pascal, MALGOL, and the last would be S-algol in 1979. But it got overly complicated and overly formal.
Individual developers wanted more flexibility here and there. Some wanted simpler languages. Some needed more complicated languages. ALGOL didn’t disappear as much as it evolved into other languages. Those were coming out fast, and with a committee to approve changes to ALGOL, it was much slower to iterate. You see, ALGOL profoundly shaped how we think of programming languages. That formalization was critical to paving the way for generations of developers who brought us future languages. ALGOL would end up being the parent of CPL and through CPL, BCPL, C, C++, and through that Objective-C. From ALGOL also sprang Simula and through Simula, Smalltalk. And Pascal and from there, Modula and Delphi. It was only used for a few years but it spawned so much of what developers use to build software today. In fact, other languages evolved as anti-ALGOL derivatives, looking at how you did something and deciding to do it totally differently. And so we owe this crew our thanks. They helped to legitimize a new doctrine, a new career, computer programmer. They inspired. They coded. And in so doing, they helped bring us into the world of functional programming and set structures that allowed the next generation of great thinkers to go even further, directly influencing people like Adele Goldberg and Alan Kay. And it’s okay that the name of this massive contribution is mostly lost to the annals of history. Because ultimately, the impact is not. So think about this - what can we do to help shape the world we live in? Whether it be through raw creation, iteration, standardization, or formalization - we all have a role to play in this world. I look forward to hearing more about yours as it evolves!
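The counted loop the episode describes, ALGOL's for ... step ... until, maps directly onto loops in later languages. A small sketch (my own addition, in Python as a stand-in for ALGOL, which has no mainstream toolchain today):

```python
# ALGOL 60's "for i := 1 step 2 until 9 do ..." is the ancestor of the
# counted loop in nearly every later language. In Python the same
# iteration is written with range(start, stop, step).

total = 0
for i in range(1, 10, 2):  # i = 1, 3, 5, 7, 9, as in "step 2 until 9"
    total += i

print(total)  # 25, the sum of the first five odd numbers
```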
https://craftofcoding.wordpress.com/tag/algol/
Algol – The Craft of Coding
Published: 2021-04-06
Posts about Algol written by spqr
The Sieve of Eratosthenes likely first appeared in computing in 1961 as Algorithm 35 in Communications of the ACM (p.151). It, like many other algorithms of the time, was shown in Algol-60. To try and implement it as close as possible to the original syntax, given that there are few if any Algol-60 compilers out there, I chose to reproduce the algorithm in its entirety in Algol-68 using the Algol-68 Genie (a68g) specification. It wasn’t a lot of fun.

PROC sieve = (INT nmax, REF INT j) []INT:
(
   [nmax]INT primes;
   INT i;
   primes[1] := 1;
   primes[2] := 2;
   primes[3] := 3;
   j := 3;
   FOR n FROM 3 BY 2 TO nmax DO
      i := 3;
      l1: IF primes[i] <= sqrt(n) THEN GOTO a1 ELSE GOTO a2 FI;
      a1: IF n/primes[i] = n%primes[i] THEN GOTO b1 ELSE GOTO b2 FI;
      b2: i := i + 1; GOTO l1;
      a2: primes[j] := n; j := j + 1;
      b1: print("")
   OD;
   j := j - 1;
   primes
);

main:
(
   INT n, numpr;
   print("Upper limit to calculate sieve? ");
   read(n);
   [n]INT primes;
   primes := sieve(n,numpr);
   FOR i FROM 1 TO numpr DO
      print(whole(primes[i],5));
      IF i MOD 10 = 0 THEN print(new line) FI
   OD
)

It was an interesting experience, partially because the documentation for a68g is okay, but it is filled with the same obtuse stuff as the original reports for Algol-68. There is a lot to this language, and trying to find simple things, like how to pass back a variable from a subprogram, is trying. There are things I inherently like, like the ability to specify the size of the array to be created within the function (PROC seems like a procedure, but behaves more like a function). And I like that assignment is :=, and equality is =. It’s also nice to use a language where it is easy to return an array from a function. Here is the program executed with the value of n set to 1000.

Upper limit to calculate sieve? 1000
   +1   +2   +3   +5   +7  +11  +13  +17  +19  +23
  +29  +31  +37  +41  +43  +47  +53  +59  +61  +67
  +71  +73  +79  +83  +89  +97 +101 +103 +107 +109
 +113 +127 +131 +137 +139 +149 +151 +157 +163 +167
 +173 +179 +181 +191 +193 +197 +199 +211 +223 +227
 +229 +233 +239 +241 +251 +257 +263 +269 +271 +277
 +281 +283 +293 +307 +311 +313 +317 +331 +337 +347
 +349 +353 +359 +367 +373 +379 +383 +389 +397 +401
 +409 +419 +421 +431 +433 +439 +443 +449 +457 +461
 +463 +467 +479 +487 +491 +499 +503 +509 +521 +523
 +541 +547 +557 +563 +569 +571 +577 +587 +593 +599
 +601 +607 +613 +617 +619 +631 +641 +643 +647 +653
 +659 +661 +673 +677 +683 +691 +701 +709 +719 +727
 +733 +739 +743 +751 +757 +761 +769 +773 +787 +797
 +809 +811 +821 +823 +827 +829 +839 +853 +857 +859
 +863 +877 +881 +883 +887 +907 +911 +919 +929 +937
 +941 +947 +953 +967 +971 +977 +983 +991 +997

Language design in the 1960s was dominated by attempts to improve upon Algol 60. Hoare noted in 1973 [1] that ALGOL-60 was “… a language so far ahead of its time, that it was not only an improvement on its predecessors, but also on nearly all its successors”. Niklaus Wirth was a graduate student at the University of California, Berkeley when he started playing around with languages. He joined a research group engaged in the implementation and improvement of a dialect of ALGOL-58, NELIAC. He described the compiler as “an intricate mess”, and the process one of “1% science, and 99% sorcery” [1]. The first language Wirth designed, leading to his dissertation, was Euler; as Wirth himself put it, “a trip with the bush knife through the jungle of language features and facilities” [1]. Euler had academic elegance, but no real practical value; however, it did catch the attention of the IFIP Working Group engaged in designing the successor to ALGOL-60. There seemed to be two camps here, one which wanted to push the boundaries of language design and another which wanted to extend ALGOL-60. Wirth belonged to the latter group.
In 1965, three reports by Wirth, Seegmüller, and Van Wijngaarden described three different quasi-complete languages. The complexity of the design process, involving far too many people, eventually led Wirth to become disheartened with it, and he went off to develop his own version of ALGOL-60's successor. Designed with contributions from Tony Hoare, this language would become ALGOL-W. Wirth’s first significant language was PL360, a byproduct of the ALGOL-W effort. The IBM 360 upon which ALGOL-W was implemented offered the choice of assembly language or Fortran compilers, neither of which was very attractive. PL360 [2] was a tool with which to implement ALGOL-W. ALGOL-W had a number of applications, but was deficient as a systems programming language. PL360 was to become more successful than ALGOL-W, largely because ALGOL-W was a complex language, and the target computers inadequate. In the fall of 1967, Wirth returned to Switzerland to begin work on the language that would become most closely associated with his language design efforts – Pascal. Wirth would go on to create a trinity of languages, none of which were that closely related, but all of which had one thing in common: they were “ALGOL-like” languages – Pascal, Modula-2 and Oberon.

Pascal – Based largely on Algol W, including the use of its while and case statements, and record structures. There were syntactic differences; however, Algol 60 was almost a subset of Pascal.
Modula-2 – Wirth noted that Modula-2 “includes all aspects of Pascal, and extends them with the module concept”.
Oberon – Evolved from Modula-2 by very few additions, and several subtractions.

Note that “ALGOL-like” really implied: a procedural paradigm, a rigorously defined syntax, traditional mathematical notation (without the nonsense of symbols like ++), block structure providing identifier scope, the availability of recursion, and a strict, static data typing system.
[3]

[1] Wirth, N., “From programming language design to computer construction”, CACM, 28(2), pp. 160-164 (1985)
[2] Wirth, N., “PL360, a programming language for the 360 computers”, Journal of the ACM, 15(1), pp. 34-74 (1968)
[3] Wirth, N., “Modula-2 and Oberon”, in ACM Conf. on History of Programming Languages, pp. 3-1-3-10 (2007)

The cool thing about some languages is that you don’t have to be a guru in order to interpret them. Take Algol 60 for example. Algol wasn’t the best language in the world, but it had some nice features. For example you could use an if statement within an assignment to assign values. Here’s an example:

postage := if weight ≤ 20 then 40
           else if weight ≤ 125 then 80
           else if weight ≤ 500 then 120
           else 200;

Okay, some people will say “But C can do that too!” Yes it can, but certainly not as elegantly. Here is the C expression, using the ternary operator.

postage = (weight <= 20) ? 40 :
          (weight <= 125) ? 80 :
          (weight <= 500) ? 120 : 200;

It basically embeds a ternary conditional each time the “else if” is invoked. It just isn’t very elegant to look at, let alone write, or understand.

After the freewheeling fifties, programming languages entered a more bohemian time, with many languages appearing, and those 1950s languages evolving, as new programming ideologies evolved. First out of the gate was ALGOL 60, which evolved from the fledgling Algol 58. It was common practice to “re-design” a language quite considerably, rather than make tweaks to an existing language. As such, languages such as Algol 58 often fell out of favour as people adopted newer versions of a language. Algol was to spawn what would eventually become quite an issue in the computing community: augmentations, extensions, and derivations, leading to numerous dialects of a language. ALGOL 68 appeared just before its contemporaries C and Pascal; however, due to inherent complexities, it never really hit it off.
(ALGOL 68 was used by European defense agencies; however, the US decided to hedge its bets with Ada.) There was some controversy over the design of a successor to Algol 60, with Wirth and Hoare going on to develop their own successor, Algol-W. Algol 68 was a major revision to Algol 60, whereas Algol-W included more subtle changes. The 1960s also saw a number of updates to both Fortran and Cobol. Languages had also begun to diversify into differing realms. In 1964, Kenneth E. Iverson introduced APL (A Programming Language), a concise symbol-based language adept at dealing with arrays. Early versions of the language contained no control structures, and whilst Cobol may have strayed too far into “English-language” syntax, APL may have gone too far with its mathematical syntax. APL was popular with those doing computer graphics, but its use declined in the 1980s, due in part to the advent of languages such as MATLAB and GNU Octave. The 1960s also saw the second generation of programming languages evolve, those influenced by the likes of Fortran or Algol. In 1964 PL/I (Programming Language One) made its appearance, supposedly for use in data processing, numerical computation, scientific computing, and systems programming. However it was a more complex language than either Fortran or Cobol, from which it had evolved, and was not especially successful. Algol also influenced the design of Simula, the first OO language, which appeared in 1967. In an educational context, Logo appeared in 1967, famous for its use of “turtle graphics”, influenced by Lisp. P.S. If you want to have some fun with a Logo interpreter, check this out.
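As a side note of my own (not from the original posts): Python's conditional expressions chain in much the same spirit as the Algol 60 postage example above, arguably more readably than the C ternary version:

```python
# The Algol 60 "if ... then ... else" assignment, rendered with Python's
# chained conditional expressions. Thresholds and rates follow the
# postage example in the text.

def postage(weight):
    return (40 if weight <= 20 else
            80 if weight <= 125 else
            120 if weight <= 500 else
            200)

print(postage(100))  # 80
```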
https://web.math.pmf.unizg.hr/~nela/prteme/FORTRAN.htm
Wikipedia, the free encyclopedia
From Wikipedia, the free encyclopedia

Fortran

The Fortran Automatic Coding System for the IBM 704 (October 15, 1956), the first Programmer's Reference Manual for Fortran

Paradigm: multi-paradigm: procedural, imperative, structured, object-oriented
Appeared in: 1957
Designed by: John Backus
Developer: John Backus & IBM
Typing discipline: strong, static
Major implementations: Absoft, GFortran, G95, Intel, Lahey/Fujitsu, Open Watcom, Pathscale, PGI, Silverfrost, Sun, XL Fortran, others
Influenced: ALGOL 58, PL/I

Fortran (previously FORTRAN[1]) is a general-purpose,[2] procedural,[3] imperative programming language that is especially suited to numeric computation and scientific computing. Originally developed by IBM in the 1950s for scientific and engineering applications, Fortran came to dominate this area of programming early on and has been in continual use for over half a century in computationally intensive areas such as numerical weather prediction, finite element analysis, computational fluid dynamics (CFD), computational physics, and computational chemistry. It is one of the most popular languages in the area of high-performance computing, and programs to benchmark and rank the world's fastest supercomputers are written in Fortran[4]. Fortran (a blend word derived from The IBM Mathematical Formula Translating System) encompasses a lineage of versions, each of which evolved to add extensions to the language while usually retaining compatibility with previous versions. Successive versions have added support for processing of character-based data (FORTRAN 77), array programming, module-based programming and object-based programming (Fortran 90 / 95), and object-oriented and generic programming (Fortran 2003).
History

In late 1953, John W. Backus submitted a proposal to his superiors at IBM to develop a more efficient alternative to assembly language for programming their IBM 704 mainframe computer. Backus' historic FORTRAN team consisted of programmers Richard Goldberg, Sheldon F. Best, Harlan Herrick, Peter Sheridan, Roy Nutt, Robert Nelson, Irving Ziller, Lois Haibt and David Sayre.[5] A draft specification for The IBM Mathematical Formula Translating System was completed by mid-1954. The first manual for FORTRAN appeared in October 1956, with the first FORTRAN compiler delivered in April 1957. This was an optimizing compiler, because customers were reluctant to use a high-level programming language unless its compiler could generate code whose performance was comparable to that of hand-coded assembly language.

While the community was skeptical that this new method could possibly outperform hand-coding, it reduced the number of programming statements necessary to operate a machine by a factor of 20, and quickly gained acceptance. Said creator John Backus during a 1979 interview with Think, the IBM employee magazine, "Much of my work has come from being lazy. I didn't like writing programs, and so, when I was working on the IBM 701 (an early computer), writing programs for computing missile trajectories, I started work on a programming system to make it easier to write programs."[6]

The language was widely adopted by scientists for writing numerically intensive programs, which encouraged compiler writers to produce compilers that could generate faster and more efficient code. The inclusion of a complex number data type in the language made Fortran especially suited to technical applications such as electrical engineering.

By 1960, versions of FORTRAN were available for the IBM 709, 650, 1620, and 7090 computers. Significantly, the increasing popularity of FORTRAN spurred competing computer manufacturers to provide FORTRAN compilers for their machines, so that by 1963 over 40 FORTRAN compilers existed. For these reasons, FORTRAN is considered to be the first widely used programming language supported across a variety of computer architectures.

The development of FORTRAN paralleled the early evolution of compiler technology; indeed, many advances in the theory and design of compilers were specifically motivated by the need to generate efficient code for FORTRAN programs.
FORTRAN

The initial release of FORTRAN for the IBM 704 contained 32 statements, including:

- DIMENSION and EQUIVALENCE statements
- Assignment statements
- Three-way arithmetic IF statement[7]
- IF statements for checking exceptions (ACCUMULATOR OVERFLOW, QUOTIENT OVERFLOW, and DIVIDE CHECK), and IF statements for manipulating sense switches and sense lights
- GOTO, computed GOTO, ASSIGN, and assigned GOTO
- DO loops
- Formatted I/O: FORMAT, READ, READ INPUT TAPE, WRITE, WRITE OUTPUT TAPE, PRINT, and PUNCH
- Unformatted I/O: READ TAPE, READ DRUM, WRITE TAPE, and WRITE DRUM
- Other I/O: END FILE, REWIND, and BACKSPACE
- PAUSE, STOP, and CONTINUE
- FREQUENCY statement (for providing optimization hints to the compiler)[8]

FORTRAN II

IBM's FORTRAN II appeared in 1958. The main enhancement was to support procedural programming by allowing user-written subroutines and functions. Six new statements were introduced:

- SUBROUTINE, FUNCTION, and END
- CALL and RETURN
- COMMON

Over the next few years, FORTRAN II would also add support for the DOUBLE PRECISION and COMPLEX data types.

Simple Fortran II program

This program, for Heron's formula, reads one data card, with a simple zero-data edit check. If one of the input values is zero or negative, the program ends with an error code of "STOP 1" in the job control card listing following the execution of the program. Normal output is one line printed with A, B, C, and AREA (no specific units are stated) on the "report" page following the compilation, with a normal "STOP" in the job card listing.
C AREA OF A TRIANGLE WITH A STANDARD SQUARE ROOT FUNCTION
C INPUT - CARD READER UNIT 5, INTEGER INPUT
C OUTPUT - LINE PRINTER UNIT 6, REAL OUTPUT
C INPUT ERROR DISPLAY ERROR OUTPUT CODE 1 IN JOB CONTROL LISTING
      READ INPUT TAPE 5, 501, IA, IB, IC
  501 FORMAT (3I5)
C IA, IB, AND IC MAY NOT BE NEGATIVE
C FURTHERMORE, THE SUM OF TWO SIDES OF A TRIANGLE
C IS GREATER THAN THE THIRD SIDE, SO WE CHECK FOR THAT, TOO
      IF (IA) 777, 777, 701
  701 IF (IB) 777, 777, 702
  702 IF (IC) 777, 777, 703
  703 IF (IA+IB-IC) 777, 777, 704
  704 IF (IA+IC-IB) 777, 777, 705
  705 IF (IB+IC-IA) 777, 777, 799
  777 STOP 1
C USING HERON'S FORMULA WE CALCULATE THE
C AREA OF THE TRIANGLE
  799 S = FLOATF (IA + IB + IC) / 2.0
      AREA = SQRT( S * (S - FLOATF(IA)) * (S - FLOATF(IB)) *
     +     (S - FLOATF(IC)))
      WRITE OUTPUT TAPE 6, 601, IA, IB, IC, AREA
  601 FORMAT (4H A= ,I5,5H B= ,I5,5H C= ,I5,8H AREA= ,F10.2,
     +        13H SQUARE UNITS)
      STOP
      END

FORTRAN III

IBM also developed a FORTRAN III in 1958 that allowed for inline assembler code among other features; however, this version was never released as a product. Like the 704 FORTRAN and FORTRAN II, FORTRAN III included machine-dependent features that made code written in it unportable from machine to machine. Early versions of FORTRAN provided by other vendors suffered from the same disadvantage.

FORTRAN IV

Starting in 1961, as a result of customer demands, IBM began development of a FORTRAN IV that removed the machine-dependent features of FORTRAN II (such as READ INPUT TAPE), while adding new features such as a LOGICAL data type, logical Boolean expressions, and the logical IF statement as an alternative to the arithmetic IF statement. FORTRAN IV was eventually released in 1962, first for the IBM 7030 ("Stretch") computer, followed by versions for the IBM 7090 and IBM 7094.
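The difference between the two IF forms mentioned above can be illustrated with a short fixed-form fragment (the variable and subroutine names here are hypothetical, chosen only for illustration):

```fortran
C     THREE-WAY ARITHMETIC IF (FORTRAN I/II STYLE): BRANCH TO ONE OF
C     THREE STATEMENT LABELS DEPENDING ON WHETHER THE EXPRESSION IS
C     NEGATIVE, ZERO, OR POSITIVE
      IF (X - Y) 10, 20, 30
C     ...
C     LOGICAL IF (INTRODUCED IN FORTRAN IV): EXECUTE A SINGLE
C     STATEMENT WHEN A LOGICAL EXPRESSION IS TRUE
      IF (X .LT. Y) Z = X
      IF (FLAG) CALL REPORT
```

The logical IF removed the need to invent labels for the common case of a simple two-way decision, although the arithmetic IF remained in the language for backward compatibility.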
By 1965, FORTRAN IV was expected to conform to the standard then being developed by the American Standards Association's X3.4.3 FORTRAN Working Group.[9]

FORTRAN 66

Perhaps the most significant development in the early history of FORTRAN was the decision by the American Standards Association (now ANSI) to form a committee to develop an "American Standard Fortran." The resulting two standards, approved in March 1966, defined two languages: FORTRAN (based on FORTRAN IV, which had served as a de facto standard) and Basic FORTRAN (based on FORTRAN II, but stripped of its machine-dependent features). The FORTRAN defined by the first standard became known as FORTRAN 66 (although many continued to refer to it as FORTRAN IV, the language upon which the standard was largely based). FORTRAN 66 effectively became the first "industry-standard" version of FORTRAN. FORTRAN 66 included:

- Main program, SUBROUTINE, FUNCTION, and BLOCK DATA program units
- INTEGER, REAL, DOUBLE PRECISION, COMPLEX, and LOGICAL data types
- COMMON, DIMENSION, and EQUIVALENCE statements
- DATA statement for specifying initial values
- Intrinsic and EXTERNAL (e.g., library) functions
- Assignment statement
- GOTO, assigned GOTO, and computed GOTO statements
- Logical IF and arithmetic (three-way) IF statements
- DO loops
- READ, WRITE, BACKSPACE, REWIND, and ENDFILE statements for sequential I/O
- FORMAT statement
- CALL, RETURN, PAUSE, and STOP statements
- Hollerith constants in DATA and FORMAT statements, and as actual arguments to procedures
- Identifiers of up to six characters in length
- Comment lines

FORTRAN 77

After the release of the FORTRAN 66 standard, compiler vendors introduced a number of extensions to "Standard Fortran", prompting ANSI in 1969 to begin work on revising the 1966 standard. Final drafts of this revised standard circulated in 1977, leading to formal approval of the new FORTRAN standard in April 1978.
The new standard, known as FORTRAN 77, added a number of significant features to address many of the shortcomings of FORTRAN 66:

- Block IF and END IF statements, with optional ELSE and ELSE IF clauses, to provide improved language support for structured programming
- DO loop extensions, including parameter expressions, negative increments, and zero trip counts
- OPEN, CLOSE, and INQUIRE statements for improved I/O capability
- Direct-access file I/O
- IMPLICIT statement
- CHARACTER data type, with vastly expanded facilities for character input and output and processing of character-based data
- PARAMETER statement for specifying constants
- SAVE statement for persistent local variables
- Generic names for intrinsic functions
- A set of intrinsics (LGE, LGT, LLE, LLT) for lexical comparison of strings, based upon the ASCII collating sequence. (ASCII functions were demanded by the U. S. Department of Defense, in their conditional approval vote.)

In this revision of the standard, a number of features were removed or altered in a manner that might invalidate previously standard-conforming programs. (Removal was the only allowable alternative to X3J3 at that time, since the concept of "deprecation" was not yet available for ANSI standards.) While most of the 24 items in the conflict list (see Appendix A2 of X3.9-1978) addressed loopholes or pathological cases permitted by the previous standard but rarely used, a small number of specific capabilities were deliberately removed, such as:

- Hollerith constants and Hollerith data, such as: GREET = 12HHELLO THERE!
- Reading into an H edit (Hollerith field) descriptor in a FORMAT specification
- Overindexing of array bounds by subscripts, e.g.:
      DIMENSION A(10,5)
      Y= A(11,1)
- Transfer of control into the range of a DO loop (also known as "Extended Range")

An important practical extension to FORTRAN 77 was the release of MIL-STD-1753 in 1978. This specification, developed by the U. S.
Department of Defense, standardized a number of features implemented by most FORTRAN 77 compilers but not included in the ANSI FORTRAN 77 standard. These features would eventually be incorporated into the Fortran 90 standard:

- DO WHILE and END DO statements
- INCLUDE statement
- IMPLICIT NONE variant of the IMPLICIT statement
- Bit manipulation intrinsic functions, based on similar functions included in Industrial Real-Time Fortran (ANSI/ISA S61.1 (1976))

The IEEE 1003.9 POSIX Standard, released in 1991, provided a simple means for FORTRAN 77 programmers to issue POSIX system calls. Over 100 calls were defined in the document, allowing access to POSIX-compatible process control, signal handling, file system control, device control, procedure pointing, and stream I/O in a portable manner.

The development of a revised standard to succeed FORTRAN 77 would be repeatedly delayed as the standardization process struggled to keep up with rapid changes in computing and programming practice. In the meantime, as the "Standard FORTRAN" for nearly fifteen years, FORTRAN 77 would become the historically most important dialect.

Control Data Corporation computers had another version of FORTRAN 77, called Minnesota FORTRAN, with variations in output constructs, special uses of COMMONs and DATA statements, optimization code levels for compiling, detailed error listings, extensive warning messages, and debugging facilities.[10]

Fortran 90

The much-delayed successor to FORTRAN 77, informally known as Fortran 90, was finally released as an ISO standard in 1991 and an ANSI standard in 1992. This major revision added many new features to reflect the significant changes in programming practice that had evolved since the 1978 standard:

- Free-form source input, also with lowercase Fortran keywords
- Identifiers up to 31 characters in length
- Inline comments
- Ability to operate on arrays (or array sections) as a whole, thus greatly simplifying math and engineering computations:
  - whole, partial, and masked array assignment statements and array expressions, such as X(1:N)=R(1:N)*COS(A(1:N))
  - WHERE statement for selective array assignment
  - array-valued constants and expressions, user-defined array-valued functions, and array constructors
- RECURSIVE procedures
- Modules, to group related procedures and data together and make them available to other program units, including the capability to limit the accessibility to only specific parts of the module
- A vastly improved argument-passing mechanism, allowing interfaces to be checked at compile time
- User-written interfaces for generic procedures
- Operator overloading
- Derived/abstract data types
- New data type declaration syntax, to specify the data type and other attributes of variables
- Dynamic memory allocation by means of the ALLOCATABLE attribute and the ALLOCATE and DEALLOCATE statements
- POINTER attribute, pointer assignment, and NULLIFY statement to facilitate the creation and manipulation of dynamic data structures
- Structured looping constructs, with an END DO statement for loop termination, and EXIT and CYCLE statements for "breaking out" of normal DO loop iterations in an orderly way
- SELECT . . . CASE construct for multi-way selection
- Portable specification of numerical precision under the user's control
- New and enhanced intrinsic procedures

Obsolescence and deletions

Unlike the previous revision, Fortran 90 did not delete any features. (Appendix B.1 says, "The list of deleted features in this standard is empty.") Any standard-conforming FORTRAN 77 program is also standard-conforming under Fortran 90, and either standard should be usable to define its behavior. A small set of features were identified as "obsolescent" and expected to be removed in a future standard.
The obsolescent features, with their status in Fortran 95:

- Arithmetic IF statement, e.g. IF (X) 10, 20, 30 — obsolescent
- Non-integer DO parameters or control variables, e.g. DO 9 X= 1.7, 1.6, -0.1 — deleted
- Shared DO-loop termination, or termination with a statement other than END DO or CONTINUE, e.g. DO 9 J= 1, 10 / DO 9 K= 1, 10 / 9 L= J + K — obsolescent
- Branching to END IF from outside a block, e.g. 66 GO TO 77 ... IF (E) THEN ... 77 END IF — deleted
- Alternate return, e.g. CALL SUBR( X, Y, *100, *200 ) — obsolescent
- PAUSE statement, e.g. PAUSE 600 — deleted
- ASSIGN statement and assigned GO TO statement, e.g. ASSIGN 100 TO H ... GO TO H — deleted
- Assigned FORMAT specifiers, e.g. ASSIGN 606 TO F — deleted
- H edit descriptors, e.g. 606 FORMAT ( 9H1GOODBYE. ) — deleted
- Computed GO TO statement, e.g. GO TO (10, 20, 30, 40), index — obsolescent
- Statement functions, e.g. FOIL( X, Y )= X**2 + 2*X*Y + Y**2 — obsolescent
- DATA statements among executable statements, e.g. X= 27.3 followed by DATA A, B, C / 5.0, 12.0, 13.0 / — obsolescent
- CHARACTER* form of CHARACTER declaration, e.g. CHARACTER*8 STRING (use CHARACTER(8) instead) — obsolescent
- Assumed character length functions — obsolescent
- Fixed form source code (column 1 contains *, !, or C for comments; column 6 for continuation) — obsolescent

Fortran 95

Fortran 95 was a minor revision, mostly to resolve some outstanding issues from the Fortran 90 standard. Nevertheless, Fortran 95 also added a number of extensions, notably from the High Performance Fortran specification:

- FORALL and nested WHERE constructs to aid vectorization
- User-defined PURE and ELEMENTAL procedures
- Pointer initialization and structure default initialization

A number of intrinsic functions were extended (for example, a dim argument was added to the maxloc intrinsic).

Several features noted in Fortran 90 to be deprecated were removed from Fortran 95:

- REAL and DOUBLE PRECISION DO variables
- Branching to an END IF statement from outside its block
- PAUSE statement
- ASSIGN and assigned GOTO statements, and assigned format specifiers
- H edit descriptor
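Several of the Fortran 90/95 features listed above (free-form source, modules, ELEMENTAL procedures, whole-array operations, and the WHERE construct) can be seen together in a minimal sketch; the module and variable names here are illustrative, not drawn from the article:

```fortran
! Free-form Fortran 90/95 source; "!" begins an inline comment.
module stats_mod
  implicit none
contains
  ! ELEMENTAL (Fortran 95): a scalar function that may be applied
  ! element-wise to an array actual argument.
  elemental real function square(x)
    real, intent(in) :: x
    square = x * x
  end function square
end module stats_mod

program demo
  use stats_mod
  implicit none
  real :: r(5) = (/ 1.0, -2.0, 3.0, -4.0, 5.0 /)   ! array constructor
  real :: s(5)

  s = square(r)        ! whole-array operation via the elemental function

  where (r < 0.0)      ! masked (selective) array assignment
    s = 0.0
  end where

  print *, s
end program demo
```

The same computation in FORTRAN 77 would require explicit DO loops and label-based control flow; the array syntax is what the standard's phrase "greatly simplifying math and engineering computations" refers to.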
An important supplement to Fortran 95 was the ISO technical report TR-15581: Enhanced Data Type Facilities, informally known as the Allocatable TR. This specification defined enhanced use of ALLOCATABLE arrays, prior to the availability of fully Fortran 2003-compliant Fortran compilers. Such uses include ALLOCATABLE arrays as derived type components, in procedure dummy argument lists, and as function return values. (ALLOCATABLE arrays are preferable to POINTER-based arrays because ALLOCATABLE arrays are guaranteed by Fortran 95 to be deallocated automatically when they go out of scope, eliminating the possibility of memory leakage. In addition, aliasing is not an issue for optimization of array references, allowing compilers to generate faster code than in the case of pointers.)

Another important supplement to Fortran 95 was the ISO technical report TR-15580: Floating-point exception handling, informally known as the IEEE TR. This specification defined support for IEEE floating-point arithmetic and floating-point exception handling.

Conditional compilation and varying length strings

In addition to the mandatory "Base language" (defined in ISO/IEC 1539-1 : 1997), the Fortran 95 language also includes two optional modules:

- Varying character strings (ISO/IEC 1539-2 : 2000)
- Conditional compilation (ISO/IEC 1539-3 : 1998)

which, together, comprise the multi-part International Standard (ISO/IEC 1539). According to the standards developers, "the optional parts describe self-contained features which have been requested by a substantial body of users and/or implementors, but which are not deemed to be of sufficient generality for them to be required in all standard-conforming Fortran compilers." Nevertheless, if a standard-conforming Fortran does provide such options, then they "must be provided in accordance with the description of those facilities in the appropriate Part of the Standard."
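The ALLOCATABLE enhancements described above (TR-15581, later folded into Fortran 2003) can be sketched as follows; the names are illustrative, and the assignment to an allocatable component relies on Fortran 2003 automatic-reallocation semantics, so under strict Fortran 95 + TR-15581 one would ALLOCATE the component explicitly first:

```fortran
module alloc_demo
  implicit none
  type :: buffer
    real, allocatable :: data(:)   ! allocatable derived-type component
  end type buffer
contains
  function linspace(n) result(v)
    integer, intent(in) :: n
    real, allocatable :: v(:)      ! allocatable function result
    integer :: i
    allocate(v(n))
    v = (/ (real(i), i = 1, n) /)  ! implied-DO array constructor
  end function linspace
end module alloc_demo

program main
  use alloc_demo
  implicit none
  type(buffer) :: b
  b%data = linspace(4)   ! component allocated on assignment (F2003)
  print *, b%data
end program main         ! b%data is deallocated automatically on exit
```

The automatic deallocation on scope exit is the guarantee the article contrasts with POINTER-based arrays, which must be freed by hand.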
Fortran 2003

The most recent standard, Fortran 2003, is a major revision introducing many new features. A comprehensive summary of the new features of Fortran 2003 is available at the Fortran Working Group (WG5) official Web site.[11] From that article, the major enhancements for this revision include:

- Derived type enhancements: parameterized derived types, improved control of accessibility, improved structure constructors, and finalizers
- Object-oriented programming support: type extension and inheritance, polymorphism, dynamic type allocation, and type-bound procedures
- Data manipulation enhancements: allocatable components (incorporating TR 15581), deferred type parameters, VOLATILE attribute, explicit type specification in array constructors and allocate statements, pointer enhancements, extended initialization expressions, and enhanced intrinsic procedures
- Input/output enhancements: asynchronous transfer, stream access, user-specified transfer operations for derived types, user-specified control of rounding during format conversions, named constants for preconnected units, the FLUSH statement, regularization of keywords, and access to error messages
- Procedure pointers
- Support for IEEE floating-point arithmetic and floating-point exception handling (incorporating TR 15580)
- Interoperability with the C programming language
- Support for international usage: access to ISO 10646 4-byte characters and choice of decimal or comma in numeric formatted input/output
- Enhanced integration with the host operating system: access to command line arguments, environment variables, and processor error messages

An important supplement to Fortran 2003 was the ISO technical report TR-19767: Enhanced module facilities in Fortran. This report provided submodules, which make Fortran modules more similar to Modula-2 modules. They are similar to Ada private child subunits.
This allows the specification and implementation of a module to be expressed in separate program units, which improves packaging of large libraries, allows preservation of trade secrets while publishing definitive interfaces, and prevents compilation cascades.

Fortran 2008

Efforts are underway to develop a revision to Fortran 2003, tentatively called Fortran 2008. As with Fortran 95, this is intended to be a minor upgrade, incorporating clarifications and corrections to Fortran 2003, as well as introducing a select few new capabilities. As of February 2007, the proposed new capabilities include:[12]

- Co-array Fortran – a parallel processing model
- BIT data type

In August 2007, the BIT data type was removed. In February 2008, coarrays were scaled back: parallel I/O and teams were removed. The complete original work plan is available at http://j3-fortran.org/doc/year/07/07-010.html.

The legacy of FORTRAN

Since Fortran has been in use for more than fifty years, there is a vast body of Fortran in daily use throughout the scientific and engineering communities. It is the primary language for some of the most intensive supercomputing tasks, such as weather and climate modeling, computational fluid dynamics, computational chemistry, computational economics, and computational physics. Even today, half a century later, many of the floating-point benchmarks used to gauge the performance of new computer processors are still written in Fortran (e.g., CFP2006, the floating-point component of the SPEC CPU2006 benchmarks).

Language features

The Fortran language features described here are intended to be a fairly comprehensive overview of the Fortran language; full details may be found in any of several Fortran textbooks. Only those features widely used in new programs are described, as few of the historic features are used in modern programs. Still, most have been retained in the language to maintain backward compatibility.
For more details on this topic, see Fortran language features.

Portability

Portability was a problem in the early days because there was no agreed standard—not even IBM's reference manual—and computer companies vied to differentiate their offerings from others by providing incompatible features. Standards have improved portability. The 1966 standard provided a reference syntax and semantics, but vendors continued to provide incompatible extensions. Although careful programmers were coming to realize that use of incompatible extensions caused expensive portability problems, and were therefore using programs such as The PFORT Verifier, it was not until after the 1977 standard, when the National Bureau of Standards (now NIST) published FIPS PUB 69, that processors purchased by the U.S. Government were required to diagnose extensions of the standard. Rather than offer two processors, essentially every compiler eventually had at least an option to diagnose extensions.

Incompatible extensions were not the only portability problem. For numerical calculations, it is important to take account of the characteristics of the arithmetic. This was addressed by Fox et al. in the context of the 1966 standard by the PORT library. The ideas therein became widely used, and were eventually incorporated into the 1990 standard by way of intrinsic inquiry functions. The widespread (now almost universal) adoption of the IEEE 754 standard for binary floating-point arithmetic has essentially removed this problem.

Access to the computing environment (e.g. the program's command line, environment variables, textual explanation of error conditions) remained a problem until it was addressed by the 2003 standard.

Large collections of "library" software that could be described as being loosely related to engineering and scientific calculations, such as graphics libraries, have been written in C, and therefore access to them presented a portability problem.
This has been addressed by incorporation of C interoperability into the 2003 standard. It is now possible (and relatively easy) to write an entirely portable program in Fortran, even without recourse to a preprocessor.

Variants of Fortran

Specific variants

Vendors of high-performance scientific computers (e.g., Burroughs, CDC, Cray, Honeywell, IBM, Texas Instruments, and UNIVAC) added extensions to Fortran to take advantage of special hardware features such as instruction caches, CPU pipelines, and vector arrays. For example, one of IBM's FORTRAN compilers (H Extended IUP) had a level of optimization which reordered the machine language instructions to keep multiple internal arithmetic units busy simultaneously. Another example is CFD, a special variant of Fortran designed specifically for the ILLIAC IV supercomputer, running at NASA's Ames Research Center. IBM Research Labs also developed an extended FORTRAN-based language called "VECTRAN" for processing of vectors and matrices.

Object-Oriented Fortran was an object-oriented extension of Fortran, in which data items can be grouped into objects, which can be instantiated and executed in parallel. It was available for Sun, Iris, iPSC, and nCUBE, but is no longer supported.

Such machine-specific extensions have either disappeared over time or have had elements incorporated into the main standards; the major remaining extension is OpenMP, which is a cross-platform extension for shared memory programming. One new extension, CoArray Fortran, is intended to support parallel programming.

FOR TRANSIT for the IBM 650

"FOR TRANSIT" was the name of a reduced version of the IBM 704 FORTRAN language, which was implemented for the IBM 650, using a translator program developed at Carnegie[13] in the late 1950s.
The following comment appears in the IBM Reference Manual ("FOR TRANSIT Automatic Coding System" C28-4038, Copyright 1957, 1959 by IBM):

The FORTRAN system was designed for a more complex machine than the 650, and consequently some of the 32 statements found in the FORTRAN Programmer's Reference Manual are not acceptable to the FOR TRANSIT system. In addition, certain restrictions to the FORTRAN language have been added. However, none of these restrictions make a source program written for FOR TRANSIT incompatible with the FORTRAN system for the 704.

The permissible statements were:

- Arithmetic assignment statements, e.g. a = b
- GO TO n
- GO TO (n1, n2, ..., nm), i
- IF (a) n1, n2, n3
- PAUSE
- STOP
- DO n i = m1, m2
- CONTINUE
- END
- READ n, list
- PUNCH n, list
- DIMENSION V, V, V, ...
- EQUIVALENCE (a,b,c), (d,c), ...

Up to ten subroutines could be used in one program. FOR TRANSIT statements were limited to columns 7 through 56 only. Punched cards were used for input and output on the IBM 650. Three passes were required to translate source code to the "IT" language, then to compile the IT statements into SOAP assembly language, and finally to produce the object program, which could then be loaded into the machine to run the program (using punched cards for data input, and outputting results onto punched cards).

Two versions existed for the 650s with a 2000-word memory drum: FOR TRANSIT I (S) and FOR TRANSIT II, the latter for machines equipped with indexing registers and automatic floating-point decimal (bi-quinary) arithmetic. Appendix A of the manual included wiring diagrams for the IBM 533 control panel.

Fortran-based languages

Prior to FORTRAN 77, a number of preprocessors were commonly used to provide a friendlier language, with the advantage that the preprocessed code could be compiled on any machine with a standard FORTRAN compiler. Popular preprocessors included FLECS, MORTRAN, Ratfor, and Ratfiv.
(Ratfor and Ratfiv, for example, implemented a remarkably C-like language, outputting preprocessed code in standard FORTRAN 66.[14]) The Fortran 95 standard includes an optional Part 3 which defines an optional conditional compilation capability, often referred to as "CoCo". Many Fortran compilers have integrated subsets of the C preprocessor into their systems.

SIMSCRIPT is an application-specific Fortran preprocessor for modeling and simulating large discrete systems.

The F programming language was designed to be a clean subset of Fortran 95 that attempted to remove the redundant, unstructured, and deprecated features of Fortran, such as the EQUIVALENCE statement. F retains the array features added in Fortran 90, and removes control statements that were obsoleted by structured programming constructs added in both FORTRAN 77 and Fortran 90. F is described by its creators as "a compiled, structured, array programming language especially well suited to education and scientific computing."

Code examples

The sample programs can be compiled and run with any standard Fortran compiler (see the end of this article for lists of compilers). Most modern Fortran compilers expect a file with a .f or .for extension (for FORTRAN 66 or FORTRAN 77 fixed-form source, although the FORTRAN 66 dialect may have to be selected specifically with a command-line option) or a .f90/.f95 extension (for Fortran 90/95 free-form source, respectively).

For more details on this topic, see Wikibooks: Fortran/Fortran examples.

FORTRAN quotations

For a programming language with a half-century legacy, FORTRAN not surprisingly has accumulated its share of jokes and folklore.

From the historical record

"God is Real, unless declared Integer." —J.
Allan Toogood, FORTRAN programmer[15]

"FORTRAN—the 'infantile disorder'—, by now nearly 20 years old, is hopelessly inadequate for whatever computer application you have in mind today: it is now too clumsy, too risky, and too expensive to use." —Edsger Dijkstra, 1975[16]

"The sooner the world forgets that FORTRAN ever existed, the better." (imputed to Joseph Weizenbaum)

"95 percent of the people who programmed in the early years would never have done it without Fortran." —Ken Thompson, circa 2005[17]

"The primary purpose of the DATA statement is to give names to constants; instead of referring to pi as 3.141592653589793 at every appearance, the variable PI can be given that value with a DATA statement and used instead of the longer form of the constant. This also simplifies modifying the program, should the value of pi change." —Early FORTRAN manual for Xerox Computers[18]

"Consistently separating words by spaces became a general custom about the tenth century A.D., and lasted until about 1957, when FORTRAN abandoned the practice." —Sun FORTRAN Reference Manual

"People are very flexible and learn to adjust to strange surroundings — they can become accustomed to read Lisp and Fortran programs, for example." —The Art of Prolog, Leon Sterling and Ehud Shapiro, MIT Press

"Warning: Go directly to Jail. Do not pass GO. Do not collect $200." —Easter egg in the SDS/Xerox Sigma 7 FORTRAN compiler, printed when the statement GO TO JAIL was encountered. (The message is from a Chance card in Monopoly.)

"A computer without COBOL and FORTRAN is like a piece of chocolate cake without ketchup or mustard." —a fortune cookie from the Unix program fortune

"The determined Real Programmer can write FORTRAN programs in any language."
—Real Programmers Don't Use Pascal, Ed Post, 1982

References in popular culture

- In the pilot episode of the Futurama series, the robot Bender drinks a bottle of Olde FORTRAN Malt Liquor (alluding to "Olde English" malt liquor).
- Computer folklore has incorrectly attributed the loss of the Mariner 1 space probe to a syntax error in a Fortran program. For example: "Recall the first American space probe to Venus, reportedly lost because Fortran cannot recognize a missing comma in a DO statement…"[19]
- In 1982, 10,000 Maniacs released a song named "Planned Obsolescence" that includes the repeated line "Science [is] truth for life, in Fortran tongue the answer".

Letter O considered harmful

During the same Fortran Standards Committee meeting at which the name "FORTRAN 77" was chosen, a technical proposal was somehow smuggled into the official distribution, bearing the title, "Letter O considered harmful". This deceptively simple proposal purported to address the confusion that sometimes arises between the letter "O" and the numeral zero, by eliminating the letter from allowable variable names. However, the method proposed was to eliminate the letter from the character set entirely (thereby retaining 48 as the number of lexical characters, which the colon had increased to 49). Among the "PRO" arguments was the assertion that this would also promote structured programming, by making it impossible to use the notorious GO TO statement as before. (Troublesome FORMAT statements would be eliminated, as well.) The sole "CON" argument conceded that "this might invalidate some existing programs" but noted that most of these "probably were non-conforming, anyway".[20]

References

Textbooks

- Akin, Ed (2003). Object Oriented Programming via Fortran 90/95, 1st ed., Cambridge University Press. ISBN 0-521-52408-3.
- Etter, D. M. (1990). Structured FORTRAN 77 for Engineers and Scientists, 3rd ed., The Benjamin/Cummings Publishing Company, Inc.
ISBN 0-8053-0051-1.
Chapman, Stephen J. (2007). Fortran 95/2003 for Scientists and Engineers, 3rd ed., McGraw-Hill. ISBN 978-0-07-319157-7.
Chapman, Stephen J. (2003). Fortran 90/95 for Scientists and Engineers, 2nd ed., McGraw-Hill. ISBN 0-07-282575-8.
Chivers, Ian; Jane Sleightholme (2006). Introduction to Programming with Fortran, 1st ed., Springer. ISBN 1-84628-053-2.
Ellis, T. M. R.; Ivor R. Phillips, Thomas M. Lahey (1994). Fortran 90 Programming, 1st ed., Addison Wesley. ISBN 0-201-54446-6.
Kupferschmid, Michael (2002). Classical Fortran: Programming for Engineering and Scientific Applications. Marcel Dekker (CRC Press). ISBN 0-8247-0802-4.
McCracken, Daniel D. (1961). A Guide to Fortran Programming. Wiley.
McCracken, Daniel D. (1965). A Guide to Fortran IV Programming. Wiley.
Metcalf, Michael; John Reid, Malcolm Cohen (2004). Fortran 95/2003 Explained. Oxford University Press. ISBN 0-19-852693-8.
Nyhoff, Larry; Sanford Leestma (1995). FORTRAN 77 for Engineers and Scientists with an Introduction to Fortran 90, 4th ed., Prentice Hall. ISBN 0-13-363003-X.
da Cunha, Rudnei Dias (2005). Introdução à Linguagem de Programação Fortran 90. Editora da Universidade Federal do Rio Grande do Sul. ISBN 85-7025-829-1.
Martínez Baena, Javier; Ignacio Requena Ramos, Nicolás Marín Ruiz (2006). Programación estructurada con Fortran 90/95. Universidad de Granada. ISBN 84-338-3923-3.

"Core" language standards

ANSI X3.9-1966. USA Standard FORTRAN, informally known as FORTRAN 66. American National Standards Institute.
ANSI X3.9-1978. American National Standard – Programming Language FORTRAN, also known as ISO 1539-1980, informally known as FORTRAN 77. American National Standards Institute.
ANSI X3.198-1992 (R1997). American National Standard – Programming Language Fortran Extended, informally known as Fortran 90. American National Standards Institute.
ISO/IEC 1539-1:1997. Information technology – Programming languages – Fortran – Part 1: Base language, informally known as Fortran 95. There are a further two parts to this standard; Part 1 has been formally adopted by ANSI.
ISO/IEC 1539-1:2004. Information technology – Programming languages – Fortran – Part 1: Base language, informally known as Fortran 2003.

Related standards

Wilfried Kneis (October 1981). "Draft standard Industrial Real-Time FORTRAN". ACM SIGPLAN Notices 16 (7): 45–60. ACM Press. doi:10.1145/947864.947868.
MIL-STD-1753. DoD Supplement to X3.9-1978. U.S. Government Printing Office.
POSIX 1003.9-1992. POSIX FORTRAN 77 Language Interface – Part 1: Binding for System Application Program Interface [API]. The Institute of Electrical and Electronics Engineers, Inc.
ISO 8651-1:1988. Information processing systems – Computer graphics – Graphical Kernel System (GKS) language bindings – Part 1: FORTRAN.

Notes

See also

Alphabetical list of programming languages

External links

History
"The FORTRAN Automatic Coding System" (1.39 MiB) — 1957 copy; describes the design and implementation of the first FORTRAN compiler by the IBM team
Early Fortran manuals and The very first Fortran manual, by John Backus (6.11 MiB), dated 1956-10-15
History of FORTRAN and Systems Manual for 704/709 FORTRAN (13.5 MiB)
FORTRAN at the HOPL site
"The IBM CE Manual for FORTRAN I, II, and 709" from 1959 (3.82 MiB)
"A History of Language Processor Technology in IBM" (1.45 MiB) — by F.E. Allen, IBM Journal of Research and Development, v.25, no.5, September 1981

Standards
Comprehensive Fortran Standards Documents by GFortran
JTC1/SC22/WG5 — The ISO/IEC Fortran Working Group
ANSI(R) X3.9-1978 Fortran 77 Standard
MIL-STD 1753 DoD Extensions to Fortran 77
ISO/IEC 1539:1991 Fortran 90 Standard
Final draft Fortran 95 Standard
WG5 (2003) ISO/IEC JTC1/SC22/WG5 N1578 Final Committee Draft of the Fortran 2003 standard

Tutorials
Professional Programmer's Guide to FORTRAN 77 (493 KiB) — Guide written by Clive G. Page of the University of Leicester
Unit 7.1 FORTRAN 77 and Unit 7.2 Fortran 90 — Part of ASPIRE's "Computational Science Textbook" project
User Notes on FORTRAN Programming (UNFP) — An open cooperative guide

References
The Professional Programmer's Guide to FORTRAN 77
Fortran 77, 90, 95, 2003 Information & Resources
Fortran 77 — FORTRAN 77 documentation
Fortran 77 4.0 Reference Manual (851 KiB)

Code repositories
Fortran 90 Software Repository — Numerical Algorithms Group
High-Precision Software Directory — Computational Research Division of Lawrence Berkeley National Laboratory
National HPCC Software Exchange (defunct) — Department of Computer Science at the University of Tennessee at Knoxville
Netlib Repository at the University of Tennessee at Knoxville and Oak Ridge National Laboratory
Software from Alan J. Miller — Logistic Regression, TOMS algorithms, Special code for F, Applied Statistics Algorithms and NAS compilers

Open source compilers
GFortran — The GNU Fortran compiler, distributed as part of GCC as of GCC 4.0. Replaced g77.
g95 — Free, open source Fortran 95 compiler and runtime libraries
Open64 — Open Research Compiler.
Suite of optimizing compiler development tools for Intel Itanium (Linux)
Open Watcom — A joint effort between SciTech Software Inc, Sybase and the Open Source development community to maintain Fortran cross compilers and tools

Non-open source compilers
Intel Fortran Compiler
Absoft — Fortran 95/90/77 and C/C++ compilers for Windows, Mac OS 9 and OS X, Linux IA32, and AMD Opteron and AMD Athlon 64-bit processors
IBM — Fortran 95 compilers for AIX, Blue Gene, and Linux
Lahey/Fujitsu — Fortran 95 compilers for Linux and Windows
NAGWare — Fortran 95 compiler with 2003 features for Linux, Windows, and Unix on many platforms
QLogic PathScale — Fortran 95, C, and C++ compilers for Linux on AMD Opteron and Intel 64-bit and 32-bit x86 CPUs
PGI — Fortran 95, C and C++ compilers for 32-bit and 64-bit AMD64 and IA32 processor-based Linux and Windows systems
Silverfrost (was Salford) — Personal edition (Windows)
Sun Studio Compiler Suite — From Sun Microsystems; compiles optimized and parallelized code for the Solaris OS on SPARC and x86/x64 platforms, and Linux on x86/x64 platforms

Integrated development environments
Photran — An IDE for Fortran 77, 90, and 95 based on Eclipse and the CDT

Graphical libraries/GUI
DISLIN — A high-level plotting library for displaying data as curves, polar plots, bar graphs, pie charts, 3D-color plots, surfaces, contours and maps
f90gl — Public domain implementation of the official Fortran 90 bindings for OpenGL (Linux, Mac OS X, Solaris, UNIX, Windows)
ftcl — A Fortran–Tcl/Tk interface
g2 — Portable 2D graphics library (Linux, Mac OS X, OpenVMS, Solaris, UNIX, Windows); GNU LGPL
GrWin Graphics Library — Free graphics routine library for Fortran (Windows)
pilib — Platform Independent Library for Fortran 90/95 (Linux, Mac OS X, UNIX, Windows); GNU LGPL
PLplot — A scientific plotting library (Linux, Mac OS X, MS-DOS, Unix, Windows); GNU LGPL
MATFOR — A scientific graphical numerical library (Linux, Mac OS X, MS-DOS, SuSE, Windows)

Testing frameworks
FUnit — A unit testing framework

Miscellaneous
Source: "If linux distros were programming languages", It's FOSS Community forum thread, posted 2022-04-01 by nevj (Neville Jackson), TerminalDweller (Terminal Dweller), and Fast.Edi (Edgar Hoffmann). https://itsfoss.community/t/if-linux-distros-were-programming-languages/8696
I see, I have to elaborate to make myself clear enough, so as not to be misunderstood. I’ll try to elaborate using examples.

Humans have a hard life. They shouldn’t be bothered with more problems than they need to be. I think everyone agrees with that. Programming languages deliver ways of talking to a computer. By using them, a human is able to instruct a computer to do certain actions and operations; deliver results.

Life was hard. When a couple of kilobytes of RAM and a 100 MB HDD were immensely huge and expensive, life was hard in the computer world. It was a wild and tough world. Small, too, but that made it even tougher, as fewer members would be able to support you with issues. Most of the time, you were alone reading a 500-page physical book about computers in a library.

Luckily, life got easier. We got more RAM. We got more CPU power. We got cheaper PCs. I think, if you look at the Raspberry Pi project, enough is said. You know, life has progressed. Life has improved.

Back then, there was ALGOL 68. Hell, most of you veterans even extensively programmed in Assembly! Tell that to someone who just started out with Python or whatever… It’s a different world, with easier ways for humans to interact with computers.

We got rid of FORTRAN. It has no significant impact among the most popular and most used languages around the world. It’s mostly used for legacy projects and for special purposes, but it’s not a general purpose language one picks for a new project. In fact, I haven’t seen a single new project written in FORTRAN and I also have never seen any project made with FORTRAN on GitHub by accident. I would need to explicitly search for one, to find one.

We got rid of COBOL. It’s pretty much only used for maintaining very old legacy projects, often in the finance sector. Nobody will use COBOL for a new project. For good reasons, there are much better and modern alternatives, fitting our current world.
Of course, we also got rid of ALGOL and all its different versions. Antiquated, complicated, old. It reeks of 80 year olds talking about post-war programming. I hope you get where I am going with this by now…

We are progressing. We are going further. We are improving the computer world. Every single day, the world uses more and more computers for any task. The world becomes computerised. We need modern and fitting technology for today, to be able to properly solve today’s tasks. Not the tasks from 50 years ago…

Now, we made progress everywhere. We got rid of shit. We got rid of huge monolithic projects. Pretty much all new projects I know of are built as microservices or similar architectures. We use Kubernetes. Docker. We use all the new shit. Well, at least for new projects. I admit, there are more than enough legacy projects out there, making the same mistakes for 20 years, just because it wasn’t a mistake yet, 20 years ago.

Now, in all this hype, all this luck, all this modernity, all this looking forward, we are stuck with C in the systems programming market. What the hell?

I ask you, do you want to program in Assembly?! HELL NO! I ask you, do you want to program in COBOL?! HELL NO! I ask you! DO YOU WANT TO PROGRAM IN ALGOL 68?! HEEEEELL NOOO!!! I ask you, do you want to program in C…

And there it is. Suddenly, some people from somewhere below wake up and start raising their bodies. Zombies? Sleeping elitists? Bearded men trapped in a mountain’s crevice? Or maybe just harmless oldies, who did not take their meds? Not sure if this mystery will ever be solved… To be continued… Do they come from deep underground caves, having slept for the past 30 years? Or did they literally rise from the ground, below a forest? We won’t ever know this…

Anyway. They start saying “yes, I mean, there are reasons… bla… bla… blabla…”. And I’m just standing there, shocked, and say to myself WHAT THE FUCK!!… “We have Rust, we have Nim, we have Go, and counting!
We have so many alternatives… For every flavour of systems programming, you have your take. YES!!! Even for speed. Rust and Nim do not show any notable impact on performance in all tests performed to this day.” I scream.

“But C has been there forever… bla… amount of libraries… bla…blabla… compatibility with existing… blalblalglblsdfljhadsfjlhfsdascvcc…” they mutter.

Like, with this attitude of these dumb or elitist pricks, we would still be stuck with COBOL and ALGOL 58!! We wouldn’t have gotten Java, Go, C#, etc… Heck, even Python. I hate Python, but it’s sure as hell better than any line of production C code I have ever read in my life.

There is a reason we got rid of COBOL and its friends. Back then, it was fine for the time and its context. Now, it’s just crap we don’t want to use. Despite all that, we still have people who think C is a valid programming language in 2022. It’s not. It hasn’t been for the past 10 years. It was fine back when it was created. In fact, it was a bit revolutionary. But (knocks on your head) wake up! This was decades ago!

C is so cancerous and so hated by myself, because it keeps popping up everywhere. It’s everywhere on GitHub. It’s everywhere in production, around the world. So many projects are polluted with this pest. It seems like you cannot get rid of this viral infection. Sure, even Linux has a huge impact on this. And this is also what I hate about Linux. It promotes the usage, at least indirectly, of some antiquated shitty ass language, which should’ve been buried at least 10 years ago. There is a reason the word cancer starts with a C!

Now, to put the cherry on top of this rant, here is a tiny example, representing the very tip of the iceberg. The 1 nanometer wide very tip of the iceberg; the whole iceberg being bigger than all mountains on earth combined…

int main() { int a; }

What is this? What the hell is that? What is a?
Source: "A perspective of Algol 68" by Marcel van der Veer, 2024-07-03. https://jmvdveer.home.xs4all.nl/en.post.perspective-of-algol-68.html
The development of Algol played an important role in establishing computer science as an academic discipline. The former Mathematisch Centrum Amsterdam, now CWI, the Centrum voor Wiskunde & Informatica, was a leading institute in the design of Algol 68, as it was in later years in the development of Python.

To better understand the position of Algol 68 among today’s plethora of programming languages, we should consider their development. First I will give a chronological overview: the period 1950-1960 in which a number of programming languages evolved, the period 1960-1980 that produced most of the major paradigms, the consolidation of imperative languages in 1980-1990, the recombination and maturation of existing ideas in the 1990’s, and finally current trends. After that, the particular history of Algol 68 is discussed, and its current position. You will of course be aware that this post is an Algol 68 implementer's perspective.

In the period 1950-1960 a number of programming languages evolved, the descendants of which are still widely used. The most notable are Fortran by Backus et al., Lisp by McCarthy et al., Cobol by Hopper et al. and Algol 60 by a committee of European and American academics including Backus. Algol 60 was particularly influential in the design of later languages since it introduced nested block structure, lexical scope, and a syntax in Backus-Naur form (BNF). Nearly all subsequent programming languages have used a variant of BNF to describe context-free syntax.

At the time of the development of Algol 68, programming languages were required to serve two purposes. They should provide concepts and statements allowing a precise formal description of computing processes and facilitate communication between programmers, and they should provide a tool to solve small to medium-sized problems without specialist help.
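As an aside on notation: BNF, which Algol 60 introduced, describes context-free syntax as rewrite rules. The fragment below is a simplified illustration in the spirit of the Algol 60 report, not a verbatim excerpt from it:

```
<block>        ::= begin <declarations> ; <statements> end
<declarations> ::= <declaration> | <declarations> ; <declaration>
<statements>   ::= <statement> | <statements> ; <statement>
<statement>    ::= <assignment> | <conditional> | <block>
```

The recursive occurrence of <block> inside <statement> is what allows blocks to nest, the structure on which Algol 60's lexical scoping is built.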
The context of Algol 68’s development is perhaps adequately illustrated by a quote [1] from Edsger Dijkstra:

The intrinsic difficulty of the programming task has never been refuted … I vividly remember from the late 60’s the tendency to blame the programming languages in use and to believe in all naivety that, once the proper way of communicating with the machines had been found, all programming ills would have been cured.

The early procedural programming languages served the purposes required of them. However, the evolving need to build complex interactive systems asked for decomposition of a problem into "natural" components, resulting in object oriented programming languages starting as early as the 1960’s. The object oriented and procedural paradigms each have strengths and weaknesses and it is not always clear which paradigm is best suited to certain problems. In numerical and scientific computing, for instance, the benefit of object oriented languages over procedural languages is controversial since in heavy number crunching, efficiency is a top priority. The latter is a likely reason that software packages coded in Fortran are operational today, despite the rise of, for instance, Python in this field.

The period 1960 to 1980 produced most of the major language paradigms now in use. Algol 68 was conceived as a successor to Algol 60. Its syntax and semantics became even more orthogonal and were defined by a Van Wijngaarden grammar, a formalism designed specifically for this purpose. Simula by Nygaard and Dahl was a superset of Algol 60 supporting object oriented programming, while Smalltalk, by Kay, Ingalls and Kaehler, was a newly designed object oriented language. C, the Unix system programming language, was developed by Ritchie and Thompson at Bell Laboratories between 1969 and 1973. Prolog by Colmerauer, Roussel, and Kowalski was the first logic programming language.
ML by Milner built a polymorphic type system on top of Lisp, pioneering statically typed functional programming languages. Each of these languages spawned a family of descendants, and most modern languages count at least one of them in their ancestry. Other important languages that were developed in this period include Pascal, Forth, Scheme and SQL. During the 1980's imperative languages consolidated. Although major new paradigms for imperative languages did not appear, many researchers elaborated on existing ideas from the 1970’s, for example object oriented programming, and adapting them to new contexts, for example to distributed systems. C++ combined object oriented programming and system programming. The United States government standardised Ada as a system programming language for defense contractors. Mainly in Japan major efforts were spent investigating so-called fifth-generation programming languages that incorporated logic programming constructs. The functional languages community standardised ML and Lisp. Research in Miranda, a functional language with lazy evaluation, began in this decade. An important trend in 1980’s language design was increased focus on programming large-scale systems through the use of modules, reflected in the development of Modula, Ada and ML. Some other notable languages from the 1980’s are Objective C and Perl. In the 1990’s recombination and maturation of existing ideas continued. An important motivation in this period was productivity. Many rapid application development (RAD) languages emerged, which usually were descendants of older, typically object oriented, languages that were equipped with an IDE and garbage collection. These languages included Object Pascal, Visual Basic, and Java. Java in particular received much attention. More radical and innovative were new scripting languages. These did not directly descend from other languages and featured new syntax and liberal incorporation of features. 
Many consider these scripting languages more productive than RAD languages, though others will put forward that scripting languages may make small programs simpler, but large programs more difficult to write and maintain. Nevertheless, scripting languages came to be the most prominent ones used in connection with the internet. Some important languages that were developed in the 1990’s are Haskell, Python and PHP.

Some current trends in programming languages are mechanisms for security and reliability verification, alternative mechanisms for modularity, component-oriented software development, constructs to support concurrent and distributed programming, metaprogramming, and integration with databases. This century has so far seen the introduction of, for example, C#, Visual Basic.NET and Go.

Algol, ALGOrithmic Language, is a family of imperative computer programming languages which greatly influenced many other languages and became the de facto way algorithms were described in textbooks and academic works for almost three decades.
Algol 68 offers appropriate means of abstraction and exemplary control structures that lead to a good understanding of structured programming. Its orthogonality results in economic use of language constructs making it a beautiful tool for programming. The design of Algol was firmly rooted in the computing community, at the time a term for the small but growing group of computer professionals and scientists. It formed an international platform for discussing programming languages, compiler construction, program construction, et cetera, and thus Algol had an important part in establishing computer science as an academic discipline in its own right. Algol 60 was designed by and for numerical mathematicians; in its day it was the Lingua Franca of computer science. The language introduced block structure with lexical scope and a concise BNF definition that were appreciated by people with a background in mathematics, but it lacked compilers and industrial support which gave the advantage to languages as Fortran and Cobol. To promote Algol, its application range had to be extended. IFIP [2] Working Group 2.1 Algorithmic Languages and Calculi (WG 2.1), that to this day has continuing responsibility for Algol 60 and Algol 68, assumed the task of developing a successor to Algol 60. In the early 1960’s WG 2.1 discussed this successor and in 1965 descriptions of a language Algol X based on these discussions were invited. This resulted in various language proposals by Wirth, Seegmüller and Van Wijngaarden [3] and other significant contributions by Hoare and Naur. Van Wijngaarden’s paper Orthogonal design and description of a formal language [4] featured a new technique for language design and definition and formed the basis for what would develop into Algol 68. 
Many features found in Algol 68 were first proposed in Algol Bulletin by the original authors of Algol 60 like Peter Naur, by new members of WG 2.1 like Tony Hoare and Niklaus Wirth, and by many others from the world-wide computing community. Reference [5] gives a first hand account of the events leading to Algol 68. Algol 68 has had a large influence on the development of programming languages since it addressed many issues; for example orthogonality, a strong type system, procedures as types, memory management, treatment of arrays, a rigorous description of syntax, and parallel processing, but also ideas that caused debate over the years such as context-sensitive coercions and quite complicated input-output formatting. After various, apparently lively, meetings WG 2.1 had not reached unanimous consent. Eventually, Algol 68 was produced by those members wanting a new milestone in language design. Dijkstra, Hoare, Seegmüller and others wrote a brief "Minority Report", more a letter than a report actually, outlining their view on what Algol 60's successor should have been; many years later it was commented that no programming language developed since, would have satisfied that vision. Yet others, for example Wirth and Hoare, opted to shorten the development cycle by improving Algol 60, which eventually produced Algol W and later Pascal. Algol 60 syntax is in BNF form whereas Algol 68 syntax is described by a two-level W-grammar (W for Van Wijngaarden) that can define a context-sensitive grammar. Simply put, in a W-grammar a high-level grammar produces a low-level grammar for all correct particular programs. The concept of context-sensitive grammar was introduced by Chomsky in the 1950’s to describe the syntax of natural language where a word may or may not be appropriate in a certain position, depending on context. 
Analogously, Algol 68 syntax defines syntactic restrictions; for example demanding that applied-identifiers or operators be declared (but not necessarily before they are applied), or demanding that modes result in finite objects that require finite coercion, et cetera. To reject incorrect programs, a context-free syntax must be complemented with extra rules formulated in natural language. The rigorous Algol 68 specification had a few loose ends, while the C specification had a few hundred. Probably because of the formal character of the Revised Report, which takes some study to comprehend, the misconception got hold that Algol 68 is a complex language. In fact it is rather lean, and the alleged obscurity of description is toned down by most who studied it [5]. Since it is argued that the specification of many contemporary languages, including that of modern C, is more complex than that of Algol 68 [6], one can ask whether the latter just looked complex at the time, fifty years ago. Algol 68 was defined in a formal document, first published in January 1969, and later published in Acta Informatica and also printed in Sigplan Notices. A Revised Report was issued in 1976 [7]. Algol 68 was the first major language for which a full formal definition was made before it was implemented. Though known to be terse, the Revised Report does contain humour solis sacerdotibus — to quote Koster from [5]: The strict and sober syntax permits itself small puns, as well as a liberal use of portmanteau words. Transput is input or output. ’Stowed’ is the word for structured or rowed. Hipping is the coercion for the hop, skip and jump. MOID is MODE or void. All metanotions ending on ETY have an empty production. Just reading aloud certain lines of the syntax, slightly raising the voice for capitalized words, conveys a feeling of heroic and pagan fun (…) Such lines cannot be read or written with a straight face. 
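The two-level mechanism described above can be made concrete with a deliberately simplified sketch, written in the spirit of the Revised Report's notation rather than excerpted from it. A metaproduction rule defines the values a metanotion may take; substituting those values consistently into a single hyper-rule generates a whole family of ordinary context-free rules:

```
metaproduction:  TYPE :: integral ; real ; boolean.

hyper-rule:      TYPE assignation : TYPE destination, becomes symbol, TYPE source.

Substituting each value of TYPE consistently yields three context-free rules:

  integral assignation : integral destination, becomes symbol, integral source.
  real assignation     : real destination, becomes symbol, real source.
  boolean assignation  : boolean destination, becomes symbol, boolean source.
```

Because a metanotion must be replaced by the same value throughout one rule, the grammar itself can express context conditions, such as matching modes on both sides of an assignation, that a context-free BNF grammar cannot.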
Algol 68 was designed for programmers, not for compiler writers, in a time when the field of compiler construction was not as advanced as it is today. Implementation efforts based on formal methods generally failed; Algol 68’s context-sensitive grammar required some invention to parse [8]. At the time compilers usually were made available on mainframes by computing centres, which may explain why Algol 68 was popular in locations rather than areas, for instance Amsterdam, Berlin or Cambridge. It was relatively popular in the United Kingdom, where the ALGOL68R , ALGOL68RS and ALGOL68C compilers were developed. Commercial initiatives were relatively unsuccessful; for instance the FLACC compiler sold just twenty-two copies [9]. Industry did not adopt Algol 68, with a few exceptions. There may be as many opinions on this as there are experts you might ask. IBM would not support it (though CDC did, on demand of academia), it was not as fast as Fortran (but might have been), there was no community outside academia, implementations were lagging … All too often, the timing must be right, the environment must be open to acceptance and adequate marketing should be in place, for an innovation to have a chance. But this is my two cents. Once university mainframes were phased out, the compilers on them were decommissioned. For scientific work, many migrated in the 1990's from mainframes to workstations that did not have Algol 68. In fact, me leaving academia in 1991 led me to write my own Algol 68 implementation so I could continue programming in it. Algol 68 apparently influenced many who were exposed to it, and it may still do so. A few individuals wonder, even today, whether it would have a future. In my opinion, the world has moved on. Although the language was innovative in its days, development stopped so it reflects the era it was developed in. 
Probably there is a newer language, like Python, that better suits your modern application on modern infrastructure, and that has a large community. Algol 68 will not catch up anymore, and further development would probably turn it into an anachronism. On the other hand, Algol 68 appears very familiar to young people, which is a testament to the influence it had on how we think about programming languages. Therefore the Algol 68 Genie project preserves Algol 68 out of educational as well as scientific-historical interest. Those interested in the language may study it and actually use it, to understand the language and the influence it had, and to experience what computer science and computing were like in its pioneering years, half a century ago. Considering the reactions to the project, I think it is fair to say that it succeeds in this purpose. Of course, people also use it for programming, for instance in numerical mathematics as I like to do. Despite its age, even today Algol 68 is a beautiful means for writing algorithms.

Notes

[1] Transcript from a keynote delivered at the ACM 1984 South Central Regional Conference. Source: E. W. Dijkstra Archive, the manuscripts of Edsger W. Dijkstra.

[2] IFIP, the International Federation for Information Processing, is an umbrella organisation for national information processing organisations. It was established in 1960 under the auspices of UNESCO.

[3] Adriaan van Wijngaarden (1916-1987) is considered by many to be the founding father of computer science in the Netherlands. He was co-founder of IFIP and one of the designers of Algol 60 and later Algol 68. As leader of the Algol 68 committee, he made a profound contribution to the field of programming language design, definition and description.

[4] A. van Wijngaarden, Orthogonal design and description of a formal language. Available online from CWI.

[5] Proceedings, Conference on the history of Algol 68 [1993]. Available online from CWI.

[6] K. Henney. Procedural Programming: It’s Back?
It Never Went Away. Available for viewing on YouTube.

[7] A. van Wijngaarden et al., Revised report on the algorithmic language Algol 68 [1976]. Available online from CWI, or in HTML as a previous blog post, or as part of Learning Algol 68 Genie.

[8] B. Mailloux, On the implementation of Algol 68 [1968]. Available online from CWI. Algol 68 Genie employs a multi-pass scheme to parse Algol 68. Extracting declarations before parsing, as proposed by Mailloux, renders the W-grammar LALR.

[9] Source: Chris Thomson, formerly with Chion Corporation, on comp.lang.misc [1988].
(Source: Wikipedia, https://en.wikipedia.org/wiki/ALGOL)
ALGOL

Paradigm: Procedural, imperative, structured
Family: ALGOL
Designed by: Bauer, Bottenbruch, Rutishauser, Samelson, Backus, Katz, Perlis, Wegstein, Naur, Vauquois, van Wijngaarden, Woodger, Green, McCarthy
First appeared: 1958
Typing discipline: Static, strong
Scope: Lexical
Influenced: Most subsequent imperative languages (including so-called ALGOL-like languages), e.g. PL/I, Simula, Pascal, C and Scheme

ALGOL (short for "Algorithmic Language")[1] is a family of imperative computer programming languages originally developed in 1958. ALGOL heavily influenced many other languages and was the standard method for algorithm description used by the Association for Computing Machinery (ACM) in textbooks and academic sources for more than thirty years.[2] In the sense that the syntax of most modern languages is "Algol-like",[3] it was arguably more influential than three other high-level programming languages with which it was roughly contemporary: FORTRAN, Lisp, and COBOL.[4] It was designed to avoid some of the perceived problems with FORTRAN and eventually gave rise to many other programming languages, including PL/I, Simula, BCPL, B, Pascal, Ada, and C.

ALGOL introduced code blocks and the begin...end pairs for delimiting them. It was also the first language implementing nested function definitions with lexical scope. Moreover, it was the first programming language which gave detailed attention to formal language definition and, through the Algol 60 Report, introduced Backus–Naur form, a principal formal grammar notation for language design.

There were three major specifications, named after the years they were first published:

ALGOL 58 – originally proposed to be called IAL, for International Algebraic Language.
ALGOL 60 – first implemented as X1 ALGOL 60 in 1961.
Revised 1963.[5][6][7]
ALGOL 68 – introduced new elements including flexible arrays, slices, parallelism, operator identification. Revised 1973.[8]

ALGOL 68 is substantially different from ALGOL 60 and was not well received, so reference to "Algol" is generally understood to mean ALGOL 60 and its dialects.

History

ALGOL was developed jointly by a committee of European and American computer scientists in a meeting in 1958 at the Swiss Federal Institute of Technology in Zurich (cf. ALGOL 58).[9] It specified three different syntaxes: a reference syntax, a publication syntax, and an implementation syntax; these permitted it to use different keyword names and conventions for decimal points (commas versus periods) for different languages.

ALGOL was used mostly by research computer scientists in the United States and in Europe; commercial applications were hindered by the absence of standard input/output facilities in its description, and by the lack of interest in the language from large computer vendors (other than Burroughs Corporation). ALGOL 60 did, however, become the standard for the publication of algorithms and had a profound effect on future language development.

John Backus developed the Backus normal form method of describing programming languages specifically for ALGOL 58. It was revised and expanded by Peter Naur for ALGOL 60, and at Donald Knuth's suggestion renamed Backus–Naur form.[10]

Peter Naur: "As editor of the ALGOL Bulletin I was drawn into the international discussions of the language and was selected to be member of the European language design group in November 1959.
In this capacity I was the editor of the ALGOL 60 report, produced as the result of the ALGOL 60 meeting in Paris in January 1960."[11]

The following people attended the meeting in Paris (from 1 to 16 January):

Friedrich Ludwig Bauer, Peter Naur, Heinz Rutishauser, Klaus Samelson, Bernard Vauquois, Adriaan van Wijngaarden, and Michael Woodger (from Europe);
John Warner Backus, Julien Green, Charles Katz, John McCarthy, Alan Jay Perlis, and Joseph Henry Wegstein (from the US).

Alan Perlis gave a vivid description of the meeting: "The meetings were exhausting, interminable, and exhilarating. One became aggravated when one's good ideas were discarded along with the bad ones of others. Nevertheless, diligence persisted during the entire period. The chemistry of the 13 was excellent."

ALGOL 60 inspired many languages that followed it. Tony Hoare remarked: "Here is a language so far ahead of its time that it was not only an improvement on its predecessors but also on nearly all its successors."[12] The Scheme programming language, a variant of Lisp that adopted the block structure and lexical scope of ALGOL, also adopted the wording "Revised Report on the Algorithmic Language Scheme" for its standards documents in homage to ALGOL.[13]

ALGOL and programming language research

As Peter Landin noted, ALGOL was the first language to combine imperative effects seamlessly with the (call-by-name) lambda calculus. Perhaps the most elegant formulation of the language is due to John C. Reynolds, and it best exhibits its syntactic and semantic purity.
Reynolds's idealized ALGOL also made a convincing methodological argument regarding the suitability of local effects in the context of call-by-name languages, in contrast with the global effects used by call-by-value languages such as ML. The conceptual integrity of the language made it one of the main objects of semantic research, along with Programming Computable Functions (PCF) and ML.

IAL implementations timeline

To date there have been at least 70 augmentations, extensions, derivations and sublanguages of Algol 60.[14]

Name | Year | Author | Country | Description | Target CPU
ZMMD-implementation | 1958 | Friedrich L. Bauer, Heinz Rutishauser, Klaus Samelson, Hermann Bottenbruch | Germany | Implementation of ALGOL 58 | Z22 (later Zuse's Z23 was delivered with an Algol 60 compiler)[15]
X1 ALGOL 60 | August 1960[16] | Edsger W. Dijkstra and Jaap A. Zonneveld | Netherlands | First implementation of ALGOL 60[17] | Electrologica X1
Elliott ALGOL | 1960s | C. A. R. Hoare | UK | Subject of the 1980 Turing Award Lecture[18] | Elliott 803, Elliott 503, Elliott 4100 series
JOVIAL | 1960 | Jules Schwartz | US | A DOD HOL prior to Ada | Various (see article)
Burroughs Algol (several variants) | 1961 | Burroughs Corporation (with participation by Hoare, Dijkstra, and others) | US | Basis of the Burroughs (and now Unisys MCP based) computers | Burroughs Large Systems, and their midrange also
Case ALGOL | 1961 | Case Institute of Technology[19] | US | Simula was originally contracted as a simulation extension of the Case ALGOL | UNIVAC 1107
GOGOL | 1961 | William M. McKeeman | US | For ODIN time-sharing system[20] | PDP-1
RegneCentralen ALGOL | 1961 | Peter Naur, Jørn Jensen | Denmark | Implementation of full Algol 60 | DASK at Regnecentralen
Dartmouth ALGOL 30 | 1962 | Thomas Eugene Kurtz et al. | US | | LGP-30
USS 90 Algol | 1962 | L. Petrone | Italy | |
ALGOL 60 | 1962 | Bernard Vauquois, Louis Bolliet[21] | France | Institut d'Informatique et Mathématiques Appliquées de Grenoble (IMAG) and Compagnie des Machines Bull | Bull Gamma 60
Algol Translator | 1962 | G. van der Mey and W. L. van der Poel | Netherlands | Staatsbedrijf der Posterijen, Telegrafie en Telefonie | ZEBRA
Kidsgrove Algol | 1963 | F. G. Duncan | UK | English Electric Company | KDF9
VALGOL | 1963 | Val Schorre | US | A test of the META II compiler compiler |
Whetstone | 1964 | Brian Randell and L. J. Russell | UK | Atomic Power Division of English Electric Company; precursor to Ferranti Pegasus, National Physical Laboratories ACE and English Electric DEUCE implementations | English Electric Company KDF9
NU ALGOL | 1965 | | Norway | | UNIVAC
ALGEK | 1965 | | Soviet Union | АЛГЭК, based on ALGOL-60 with COBOL support, for economic tasks | Minsk-22
ALGOL W | 1966 | Niklaus Wirth | US | Proposed successor to ALGOL 60 | IBM System/360
MALGOL | 1966 | publ. A. Viil, M. Kotli & M. Rakhendi | Estonian SSR | | Minsk-22
ALGAMS | 1967 | GAMS group (ГАМС, группа автоматизации программирования для машин среднего класса), cooperation of Comecon Academies of Science | Comecon | | Minsk-22, later ES EVM, BESM
ALGOL/ZAM | 1967 | | Poland | | Polish ZAM computer
Simula 67 | 1967 | Ole-Johan Dahl and Kristen Nygaard | Norway | Algol 60 with classes | UNIVAC 1107
Triplex-ALGOL Karlsruhe | 1967/1968 | | Karlsruhe, Germany | ALGOL 60 (1963) with triplex numbers for interval arithmetic[22] |
Chinese Algol | 1972 | | China | Chinese characters, expressed via the Symbol system |
DG/L | 1972 | | US | | DG Eclipse family of computers
S-algol | 1979 | Ron Morrison | UK | Addition of orthogonal datatypes, with intended use as a teaching language | PDP-11, with a subsequent implementation on the Java VM

The Burroughs dialects included special bootstrapping dialects such as ESPOL and NEWP. The latter is still used for Unisys MCP system software.

Properties

ALGOL 60 as officially defined had no I/O facilities; implementations defined their own in ways that were rarely compatible with each other.
In contrast, ALGOL 68 offered an extensive library of transput (input/output) facilities.

ALGOL 60 allowed two evaluation strategies for parameter passing: the common call-by-value, and call-by-name. Call-by-name has certain effects, in contrast to call-by-reference. For example, without specifying the parameters as value or reference, it is impossible to develop a procedure that will swap the values of two parameters if the actual parameters that are passed in are an integer variable and an array that is indexed by that same integer variable.[23] Think of a call swap(i, A[i]): each time a parameter is referenced inside swap, the corresponding actual expression is re-evaluated. Say i := 1 and A[i] := 2; then each exchange works on freshly evaluated arguments, so repeated calls flip the pair of values back and forth ([1,2], [2,1], [1,2] and so on). A similar situation occurs with a random function passed as an actual argument. Call-by-name is known by many compiler designers for the interesting "thunks" that are used to implement it. Donald Knuth devised the "man or boy test" to separate compilers that correctly implemented "recursion and non-local references". This test contains an example of call-by-name.

ALGOL 68 was defined using a two-level grammar formalism invented by Adriaan van Wijngaarden, which bears his name. Van Wijngaarden grammars use a context-free grammar to generate an infinite set of productions that will recognize a particular ALGOL 68 program; notably, they are able to express the kind of requirements that in many other programming language standards are labelled "semantics" and have to be expressed in ambiguity-prone natural language prose, and then implemented in compilers as ad hoc code attached to the formal language parser.

Examples and portability

Code sample comparisons

ALGOL 60

(The way the bold text has to be written depends on the implementation, e.g. 'INTEGER' (quotation marks included) for integer. This is known as stropping.)
procedure Absmax(a) Size:(n, m) Result:(y) Subscripts:(i, k);
    value n, m; array a; integer n, m, i, k; real y;
comment The absolute greatest element of the matrix a, of size n by m,
    is copied to y, and the subscripts of this element to i and k;
begin
    integer p, q;
    y := 0; i := k := 1;
    for p := 1 step 1 until n do
        for q := 1 step 1 until m do
            if abs(a[p, q]) > y then
                begin y := abs(a[p, q]);
                    i := p; k := q
                end
end Absmax

Here is an example of how to produce a table using Elliott 803 ALGOL.[24]

FLOATING POINT ALGOL TEST'
BEGIN REAL A,B,C,D'
READ D'
FOR A:= 0.0 STEP D UNTIL 6.3 DO
BEGIN
  PRINT PUNCH(3),££L??'
  B := SIN(A)'
  C := COS(A)'
  PRINT PUNCH(3),SAMELINE,ALIGNED(1,6),A,B,C'
END
END'

ALGOL 68

The following code samples are ALGOL 68 versions of the above ALGOL 60 code samples. ALGOL 68 implementations used ALGOL 60's approaches to stropping. In ALGOL 68's case, tokens with the bold typeface are reserved words, types (modes) or operators.

proc abs max = ([,]real a, ref real y, ref int i, k)real:
comment The absolute greatest element of the matrix a, of size ⌈a by 2⌈a,
    is transferred to y, and the subscripts of this element to i and k; comment
begin
    real y := 0; i := ⌊a; k := 2⌊a;
    for p from ⌊a to ⌈a do
        for q from 2⌊a to 2⌈a do
            if abs a[p, q] > y then
                y := abs a[p, q]; i := p; k := q
            fi
        od
    od;
    y
end # abs max #

Note: lower (⌊) and upper (⌈) bounds of an array, and array slicing, are directly available to the programmer.

floating point algol68 test: (
    real a, b, c, d;
    # printf – sends output to the file stand out. #
    # printf($p$); – selects a new page #
    printf(($pg$, "Enter d:"));
    read(d);
    for step from 0 while a := step*d; a <= 2*pi do
        printf($l$); # $l$ – selects a new line. #
        b := sin(a);
        c := cos(a);
        printf(($z-d.6d$, a, b, c)) # formats output with 1 digit before and 6 after the decimal point. #
    od
)

Timeline: Hello world

The variations and lack of portability of the programs from one implementation to another are easily demonstrated by the classic hello world program.

ALGOL 58 (IAL)

ALGOL 58 had no I/O facilities.

ALGOL 60 family

Since ALGOL 60 had no I/O facilities, there is no portable hello world program in ALGOL. The next three examples are in Burroughs Extended Algol. The first two direct output at the interactive terminal they are run on. The first uses a character array, similar to C. The language allows the array identifier to be used as a pointer to the array, and hence in a REPLACE statement.

BEGIN
    FILE F(KIND=REMOTE);
    EBCDIC ARRAY E[0:11];
    REPLACE E BY "HELLO WORLD!";
    WRITE(F, *, E);
END.

A simpler program using an inline format:

BEGIN
    FILE F(KIND=REMOTE);
    WRITE(F, <"HELLO WORLD!">);
END.

An even simpler program using the Display statement. Note that its output would end up at the system console ('SPO'):

BEGIN
    DISPLAY("HELLO WORLD!")
END.

An alternative example, using Elliott Algol I/O, is as follows. Elliott Algol used different characters for "open-string-quote" and "close-string-quote", represented here by ‘ and ’.

program HiFolks;
begin
    print ‘Hello world’
end;

Below is a version from Elliott 803 Algol (A104). The standard Elliott 803 used five-hole paper tape and thus only had upper case. The code lacked any quote characters, so £ (UK pound sign) was used for open quote and ? (question mark) for close quote. Special sequences were placed in double quotes (e.g. ££L?? produced a new line on the teleprinter).

HIFOLKS'
BEGIN
    PRINT £HELLO WORLD£L??'
END'

The ICT 1900 series Algol I/O version allowed input from paper tape or punched card. Paper tape 'full' mode allowed lower case. Output was to a line printer.
The open and close quote characters were represented using '(' and ')', and spaces by %.[25]

'BEGIN' WRITE TEXT('('HELLO%WORLD')'); 'END'

ALGOL 68

ALGOL 68 code was published with reserved words typically in lowercase, but bolded or underlined.

begin
    printf(($gl$, "Hello, world!"))
end

In the language of the "Algol 68 Report" the input/output facilities were collectively called the "Transput".

Timeline of ALGOL special characters

The ALGOLs were conceived at a time when character sets were diverse and evolving rapidly; also, the ALGOLs were defined so that only uppercase letters were required.

1960: IFIP – The Algol 60 language and report included several mathematical symbols which are available on modern computers and operating systems, but which, unfortunately, were unsupported on most computing systems at the time. For instance: ×, ÷, ≤, ≥, ≠, ¬, ∨, ∧, ⊂, ≡, ␣ and ⏨.

1961 September: ASCII – The ASCII character set, then in an early stage of development, had the \ (backslash) character added to it in order to support ALGOL's boolean operators /\ and \/.[26]

1962: ALCOR – This character set included the unusual "᛭" runic cross[27] character for multiplication and the "⏨" decimal exponent symbol[28] for floating point notation.[29][30][31]

1964: GOST – The 1964 Soviet standard GOST 10859 allowed the encoding of 4-bit, 5-bit, 6-bit and 7-bit characters in ALGOL.[32]

1968: The "Algol 68 Report" – used extant ALGOL characters, and further adopted the →, ↓, ↑, □, ⌊, ⌈, ⎩, ⎧, ○, ⊥, and ¢ characters, which can be found on the IBM 2741 keyboard with typeball (or golf ball) print heads inserted (such as the APL golf ball). These became available in the mid-1960s while ALGOL 68 was being drafted. The report was translated into Russian, German, French, and Bulgarian, and allowed programming in languages with larger character sets, e.g. the Cyrillic alphabet of the Soviet BESM-4.
All ALGOL's characters are also part of the Unicode standard, and most of them are available in several popular fonts.

2009 October: Unicode – The ⏨ (decimal exponent symbol) for floating point notation was added to Unicode 5.2 for backward compatibility with historic Buran programme ALGOL software.[33]

Legacy

A significant contribution of the ALGOL 58 Report was to provide standard terms for programming concepts: statement, declaration, type, label, primary, block, and others.[34]

Further reading

O'Hearn, Peter; Tennent, Robert D., eds. (1997). Algol-like Languages. Vol. 1. Cambridge, MA: Birkhauser. doi:10.1007/978-1-4612-4118-8. ISBN 9780817638801. The first volume of a two-volume set, which includes an introduction from Peter W. O'Hearn and Robert D. Tennent, a report on Algol 60 from Peter Naur and colleagues, four chapters from John C. Reynolds, and further chapters from Christopher Strachey, Matthias Felleisen, Stephen Weeks, Albert R. Meyer, Kurt Sieber, Vipin Swarup, Uday S. Reddy, and Evan Ireland.

Baumann, Richard; Feliciano, Manuel; Bauer, Friedrich Ludwig; Samelson, Klaus (1964). Introduction to ALGOL – A Primer for the Non-Specialist, Emphasizing the Practical Uses of the Algorithmic Language. Automatic Computation. Englewood Cliffs, NJ: Prentice-Hall. ISBN 0134778286.

Randell, Brian; Russell, L. J. (1964). ALGOL 60 Implementation: The Translation and Use of ALGOL 60 Programs on a Computer. Academic Press. On the design of the Whetstone compiler, and one of the early published descriptions of implementing a compiler. For related papers, see Brian Randell's Whetstone Algol Revisited and The Whetstone KDF9 Algol Translator.

Dijkstra, E. W. (1961). ALGOL 60 Translation: An ALGOL 60 Translator for the X1 and Making a Translator for ALGOL 60. Report MR 35/61. Amsterdam: Mathematisch Centrum.

Kruseman Aretz, Frans E. J. The Dijkstra–Zonneveld ALGOL 60 Compiler for the Electrologica X1. Historical note SEN, 2. Amsterdam: Centrum voor Wiskunde en Informatica.

Goos, Gerhard (2017). Geschichte der Deutschsprachigen Informatik – Programmiersprachen und Übersetzerbau [History of Informatics in German-Speaking Countries – Programming Languages and Compiler Design] (in German). Karlsruhe: Fakultät für Informatik, Karlsruhe Institute of Technology.
How to Program In Algol for MainFrame
(Source: ByteScout, https://bytescout.com/blog/how-to-program-in-algol-for-mainframe.html)
Are you looking to learn how to program in the ALGOL programming language for mainframe computers? If so, you have come to the right place. ALGOL is undoubtedly one of the most influential languages out there. It became popular in the era of other influential programming languages such as COBOL and Fortran.

What Makes ALGOL So Special?

So, what made ALGOL so special? It influenced many other languages, and it was also used to describe algorithms in textbooks until modern programming languages took over that role. ALGOL built on FORTRAN's shortcomings by offering a better design and implementation. It is also one of the first programming languages that defined code blocks. The first ALGOL version was ALGOL 58. In this article, we will explore ALGOL and how you can use it for mainframe programming. Let's get started.

How To Program In ALGOL For Mainframe Computers

To make sure that we are on the same page, we are going to use ALGOL 68.

Program Structure

The first thing that you need to learn is the program structure. Every ALGOL 68 program starts with "begin" and finishes with "end". The syntax reads like this:

begin
    insert code here
end

Apart from that, in an ALGOL 68 program:

Keywords can be written in boldface type so that they are distinguishable from identifiers.
Carriage returns and spaces can be used to improve readability.
Comments can be represented by four symbols: comment, co, #, and ¢.

Data and Declarations

ALGOL 68 uses "modes" instead of "types" to denote the kind of data. For example, there are boolean, character, real, and integer modes. Just like in modern programming languages, variables need to be declared before they can be used:

begin
    real a, b, c, d;
    bool nit;
    int x, y, z, employee
end

Statements

Next, we come to statements. In ALGOL 68, every statement yields a value. This is because ALGOL 68 is an expression-oriented language, so the value of a statement can appear on the right-hand side of an assignment.
Apart from that, it supports if statements, for statements, case statements, while statements, and goto statements. If you have ever programmed before, you will find all of these statements familiar:

if x > 5 then
    print ("impressive")
else
    print ("nope")
fi

We also have else-if, which is represented by the elif keyword. It can be used as below:

if word = "abc" then
    print ("new born")
elif word = "pizza" then
    print ("teen")
elif word = "money" then
    print ("adult")
else
    print ("alien")
fi

You can run conditions based on the if statement. Moreover, you can also use loops to do an operation multiple times. Let's see a for loop example:

for k from 1 by 1 to j + 2 do
    print (new line)
od;

In contrast, the while-do loop works as below:

while x < y ∨ a < 0 do
    x := x + 1;
    y := y + 1;
    print ((x, y))
od;

Modes in Algol 68

There are multiple modes that are part of the ALGOL 68 programming language. These modes are simply data types and can be divided into primitive modes, array modes, structured modes, united modes, reference-to modes, and procedure modes. These data types enable you to handle the different kinds of data in your ALGOL 68 program. As a programmer, you need to decide beforehand what type of data you want to use, depending on your approach.

What's Next?

This is just the beginning when it comes to programming in the ALGOL programming language for mainframe computers. With the above knowledge, you can get started with the language itself. So, what program are you going to build for the mainframe using ALGOL 68? Comment below and let us know.
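Putting the constructs above together, a complete toy program might look as follows. This is an illustrative sketch only: the identifier names are made up, and exact transput behaviour can differ between compilers (the upper-stropping style shown here is the one accepted by, for instance, Algol 68 Genie):

```algol68
BEGIN
   INT limit := 5;
   REAL total := 0.0;
   # accumulate 1 + 2 + ... + limit with a for loop #
   FOR k FROM 1 BY 1 TO limit DO
      total := total + k
   OD;
   # branch on the result with if/elif/else #
   IF total > 10.0 THEN
      print (("impressive", newline))
   ELIF total > 5.0 THEN
      print (("teen", newline))
   ELSE
      print (("nope", newline))
   FI
END
```

Note how the integer k is silently widened to real in the assignment to total; such implicit coercions are a characteristic feature of ALGOL 68.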
(Source: Encyclopedia.com, https://www.encyclopedia.com/science-and-technology/computers-and-electrical-engineering/computers-and-computing/algol)
Algol

Algol Acronym for algorithmic language. The generic name for a family of high-level languages of great significance in the development of computing.

In 1958 the Association for Computing Machinery (ACM) in the US and the Gesellschaft für Angewandte Mathematik und Mechanik (GAMM) in Europe set up a joint committee to define an international algorithmic language (IAL). The language that was designed became known as Algol, and was later called Algol 58 to distinguish it from later versions. Algol 58 was not intended to be a viable language, and in 1960 an augmented committee was convened to devise the second iteration, which was published as the language Algol 60. See also JOVIAL.

Algol 60 was much more popular in Europe than in the US, probably due to the dominance of IBM and Fortran in the North American market. It introduced many new concepts, notably block structure (see block-structured languages), nested scopes, and modes of parameter passing to procedures, and the definition of the language introduced the now classic BNF notation for describing syntax. The influence of Algol 60 can be seen in all succeeding languages, and it stands as a milestone in the development of programming languages.

In the years following the publication of the Algol 60 Report, a working group of the International Federation for Information Processing was set up to consider the definition of a successor to Algol 60. There were many dissensions within the group, and eventually a minority report was issued proposing the language Algol 68. The first implementation of Algol 68, named ALGOL 68R, was produced at the Royal Signals and Radar Establishment in the UK. ALGOL 68R demonstrated that Algol 68 was a viable language (not at the time a self-evident proposition). Although Algol 68 introduced many novel concepts of great theoretical interest and significance, its practical application was almost nil.
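For readers who have not seen it, the BNF notation introduced with the Algol 60 Report describes syntax as a set of production rules. The fragment below for signed integers is a simplified illustration, close in spirit (though not identical) to the report's own rules:

```bnf
<digit>            ::= 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9
<unsigned integer> ::= <digit> | <unsigned integer><digit>
<integer>          ::= <unsigned integer> | +<unsigned integer> | -<unsigned integer>
```

Each rule defines the nonterminal on the left as one of the alternatives on the right; the recursion in <unsigned integer> expresses "one or more digits".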
One of the most notable features of Algol 68 is its formal specification using a two-level grammar. Although a very precise definition, it is very difficult to understand, and this difficulty partly accounts for the low acceptance of the language. One of the most significant effects of the split in the Algol 68 working group is that it led indirectly to the development of Pascal.

JOHN DAINTITH, "Algol." A Dictionary of Computing. Encyclopedia.com, 30 Jul. 2024.

Algol (or ALGOL) (ˈælgɒl) Computing algorithmic language
Source: "The ALGOL Programming Language" — http://groups.umd.umich.edu/cis/course.des/cis400/algol/algol.html
Significant Language Features

ALGOL was the first second-generation programming language, and its characteristics are typical of the entire generation. First consider the data structures, which are very close to first-generation structures. ALGOL 60 introduced block structure: the ability to create blocks of statements that delimit the scope of variables and the extent of influence of control statements. Along with that came two different means of passing parameters to subprograms: call by value and call by name. Structured control statements (if - then - else, and the use of a general condition for iteration control) were also features, as was the concept of recursion: the ability of a procedure to call itself.

One of the greatest impacts ALGOL 60 had was a result of its description as found in Naur (1963). A major contribution of this report was the introduction of BNF notation for defining the syntax of the language.

Overall, ALGOL is considered to be perhaps the most orthogonal programming language, meaning it has a relatively small number of basic constructs and a set of rules for combining those constructs. Every construct has a type associated with it and there are no restrictions on those types. In addition, most constructs produce values. Several of ALGOL's other characteristics are listed below:
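Several of these features — block structure, the value specifier for call by value (call by name being the default), and recursion — can be seen in even a tiny ALGOL 60 fragment. The sketch below uses publication-style syntax; real compilers each had their own character set and I/O:

```algol60
begin
  integer n;
  integer procedure factorial(k);
    value k; integer k;
    comment "value k" selects call by value; omitting it would give call by name;
    factorial := if k < 2 then 1 else k * factorial(k - 1);
  n := factorial(5)
  comment n is now 120;
end
```

The procedure assigns its result to its own name, and the conditional expression plus the recursive call illustrate exactly the features the report made famous.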
Source: "The Evolution of Programming Languages", GeeksforGeeks, published 26 January 2021 — https://www.geeksforgeeks.org/the-evolution-of-programming-languages/
Last Updated: 11 Jul, 2022

Programming languages are the fundamental unit of today's tech world. A programming language is a set of commands and instructions that we give to a machine to perform a particular task. For example, if you give a machine a set of instructions to add two numbers, it will carry them out and return the correct answer. But did you know that programming languages have a long and rich history? With that in mind, in this article we'll take a look at the evolution of programming languages over time.

In the computer world, we have about 500+ programming languages, each with its own syntax and features. If you search for the father of the computer, the search engine will show you Charles Babbage — but the father of the computer didn't write the first code. It was Ada Lovelace who wrote the first-ever computer program, in 1843.

1843: The journey starts here

In the early days, Charles Babbage had designed the machine, but it was Ada Lovelace who wrote the instructions for it. The Analytical Engine was designed by Charles Babbage, and the program for computing Bernoulli numbers was written by Ada Lovelace. It was the first time in history that the capability of a computing device was demonstrated in code.

1949: Assembly Language

A type of low-level language. It mainly consists of symbolic instructions that translate directly into machine code. Even today, assembly language is used in real-time programs such as flight navigation systems and medical equipment, e.g. fly-by-wire (FBW) systems. It has also been used to create computer viruses.

1952: Autocode

Developed by Alick Glennie, Autocode was the first compiled computer programming language. Later languages such as COBOL and FORTRAN followed in the autocode tradition.

1957: FORTRAN

Developed by John Backus and IBM.
It was designed for numeric computation and scientific computing. Software for NASA's Voyager 1 and Voyager 2 space probes was originally written in FORTRAN 5.

1958: ALGOL

ALGOL stands for ALGOrithmic Language. It was the starting point for the most popular programming languages: C, C++, and Java. It was also one of the first languages to implement nested functions, and it had a simpler syntax than FORTRAN. It was the first programming language to have code blocks, with "begin" indicating that a block has started and "end" indicating that it has ended.

1959: COBOL

It stands for COmmon Business-Oriented Language. In 1997, 80% of the world's business ran on COBOL. The US Internal Revenue Service scrambled to patch its COBOL-based IMF (Individual Master File) in order to issue the tens of millions of payments mandated by the Coronavirus Aid, Relief, and Economic Security Act.

1964: BASIC

It stands for Beginner's All-purpose Symbolic Instruction Code. In 1991 Microsoft released Visual Basic, an updated version of BASIC. The first microcomputer version of BASIC was co-written by Bill Gates, Paul Allen, and Monte Davidoff for their newly formed company, Microsoft.

1972: C

A general-purpose, procedural programming language, and among the most popular programming languages to this day. Code that was previously written in assembly language — operating systems, kernels, and many other applications — was rewritten in C. It can be used to implement operating systems and embedded systems, and even on websites via the Common Gateway Interface (CGI). C is the mother of almost all higher-level programming languages, such as C#, D, Go, Java, JavaScript, Limbo, LPC, Perl, PHP, Python, and Unix's C shell.

Some other programming languages that are popular among programmers are listed below.